Date: 14th July 2011
A colleague drew my attention to this BBC news item today, which quotes an “online entrepreneur” saying “millions of pounds worth of business is probably being lost each week” in online retail due to the issue he raises (more on that later).
I hope that readers of that BBC piece aren’t tempted to call their online colleagues and suggest there’s a new trick to help increase online sales, because they will be disappointed. The article is very light on evidence, and as the sites it names seem to be connected to the company of the “online entrepreneur”, there’s a question as to whether the whole piece is just a successful bit of PR cooked up to promote the name of a new online store for women’s tights.
All too often, though, when talking to new clients about their web performance, aspirations and activity to date, I hear of time and money being spent on ideas that, whilst not wrong as such, aren’t supported by evidence from the actual website in question. The most common end result is that well-intentioned effort has produced a poor ROI in terms of any change to online sales.
The BBC article quotes Mr Dunstone saying that online retailers can double their sales by addressing ‘simple spelling mistakes’, based on the evidence of… just one website, and one that is linked to Mr Dunstone. Hmm. That is not evidence – and anyone rushing to spend time and money on the back of it risks seeing very little return for their efforts.
But it doesn’t have to be that way. Getting the right evidence for your specific site – and ensuring you maximise the return on the time and money spent on your online store – needn’t be tricky, for a couple of good reasons…
Firstly, at all but the most truly dysfunctional retailers, it’s not hard to get agreement round the table that activity should be supported by evidence first. Spending money involves getting over a few more hurdles, so a new purchase tends to require a higher evidence threshold for its ROI anyway, whereas projects that just take up the time of existing staff don’t get the same scrutiny. Hence it’s more common for IT teams and software coders to have spent time optimising things that had no evidence behind them and made no difference in reality.
But agreeing that evidence is needed isn’t the same as having it to hand.
And this is where the chasm opens. Too often, IT will be asked to find evidence, and after a trawl through a mass of logs from the internals of various systems will mix in some data gathered from monitoring a few web pages – maybe ‘a typical product page’ here and ‘a home page’ there – and produce a document with ‘evidence’ in the title.
This is starting entirely at the wrong end. If you want to improve the performance of your website – and I’m talking improvements here that are delivered by IT (not improvements delivered via marketing campaigns, or branding projects etc) – then you have to start where the money is: start with the users.
That simple realisation can transform your ongoing web performance management planning.
To get the evidence to know where to target your IT teams, to get the best ROI from their time, you have to start by measuring the user experience: and measuring it for multi-page journeys that actually reflect the real-world routes that real users take.
If you want to plan a website load test, so that IT can address bottlenecks and improve performance, start by planning the User Journeys you want to measure – that will give you the real evidence.
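As an illustration of what ‘measuring a User Journey’ means in practice, a multi-page journey can be scripted as an ordered sequence of timed page requests, visited as a real user would visit them. Here is a minimal sketch in Python – the step names and URLs are entirely hypothetical, and a real monitoring product would of course do far more (sessions, forms, object-level timing):

```python
import time
import urllib.request

# A hypothetical multi-page journey: each step is a (name, url) pair,
# visited in order, as a real shopper would.
JOURNEY = [
    ("home",     "https://shop.example.com/"),
    ("category", "https://shop.example.com/tights"),
    ("product",  "https://shop.example.com/tights/black-80-denier"),
    ("basket",   "https://shop.example.com/basket"),
]

def fetch_page(url, timeout=10):
    """Fetch one URL, returning its HTTP status (None on outright failure)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
            return resp.status
    except Exception:
        return None

def run_journey(steps, fetch=fetch_page):
    """Visit each step in order, recording per-step status and timing."""
    results = []
    for name, url in steps:
        start = time.monotonic()
        status = fetch(url)
        elapsed = time.monotonic() - start
        results.append({"step": name, "status": status, "seconds": elapsed})
    return results
```

The point of the sketch is the shape of the data it produces: one timed, status-checked record per step of the journey, rather than an isolated snapshot of ‘a typical page’.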
Armed with that data you know which user journeys have performance issues, when they occur, which pages in the route are the prime root cause, which objects within those pages, and which products in your catalogue are most prone to slowdown or error – and which never suffer. You know which systems, and which servers, are in the spotlight.
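Turning many journey runs into that kind of answer is a mechanical aggregation step. A sketch, assuming each step result is simply a dict carrying a hypothetical ‘step’ name and a ‘seconds’ timing:

```python
from collections import defaultdict

def slowest_steps(runs, top=3):
    """Given many journey runs (each a list of per-step result dicts
    with 'step' and 'seconds' keys), return the step names with the
    highest average response time, worst first."""
    totals = defaultdict(lambda: [0.0, 0])  # step -> [total seconds, count]
    for run in runs:
        for result in run:
            totals[result["step"]][0] += result["seconds"]
            totals[result["step"]][1] += 1
    averages = {step: t / n for step, (t, n) in totals.items()}
    return sorted(averages, key=averages.get, reverse=True)[:top]
```

The same grouping can be keyed on product, server or hour of day instead of step name – that is all ‘which pages are the prime root cause’ and ‘which products are most prone to slowdown’ amount to once the per-journey data exists.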
Now you can examine that real evidence and it will point you to look deeper in the right places. Like a patient who reaches the surgeon only after the GP has ruled out a myriad of other root causes for the symptoms in question, the evidence arrives pre-diagnosed. The IT team, like the surgeon, will know where to insert the knife: which logs on which systems are worth drilling down into, which blocks of software are worth examining.
That’s the beauty of an evidence-based approach: faster ROI, best use of precious internal resources.
And as icing on the cake, by making User Journey performance the KPI used across the eCommerce operation, IT and non-IT staff alike can understand the pages that make up the realistic journeys being measured. Everyone can understand the common language of Journey X going red when it’s in error, versus the other Journeys all being green and therefore OK.
A wallboard screen can display the real-time status of the measured journeys, and much less time is taken up in meetings with questions like ‘how is the site today?’ and ‘did we have a slowdown last night?’.
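That red/green wallboard status boils down to a simple threshold rule over a journey’s per-step results. A sketch, assuming the same hypothetical per-step dicts of ‘status’ and ‘seconds’, with the time budget purely illustrative (and an amber band added for slow-but-working, a common refinement):

```python
def journey_status(results, slow_seconds=8.0):
    """Roll one journey's per-step results up into a wallboard colour.

    red   - any step errored (no HTTP status, or a 4xx/5xx response)
    amber - every step succeeded but the whole journey overran its budget
    green - everything succeeded within the time budget
    """
    if any(r["status"] is None or r["status"] >= 400 for r in results):
        return "red"
    total = sum(r["seconds"] for r in results)
    return "amber" if total > slow_seconds else "green"
```

The value of the rule is precisely its crudeness: one colour per journey is something the whole room can argue about without anyone needing to read a server log.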
As Chris Howell, Director at Dixon Stores, said in his keynote speech at Internet World this year, User Journeys can be the ‘common language’ to ‘unite the tribes’ of IT and Business teams:
“It really brought home to us that our ability to understand what’s going on in real time makes it possible for us to better help our customers and deliver on the services…
“The transparency piece is that we remove the smoke and mirrors and you create opportunities for conversation. And the best way to build these journeys is to sit down with your user experience guys, your marketing guys, and actually you sit there and explain what you’re going to try and do, and you do talk about the mystery shop, the exit survey – they’re used to the kind of pop-up survey at the end of it. And if we asked them, the insight guys would love to be standing on the shoulders of the guys walking around our shops – you know we can’t do it because shoppers would probably find that a bit irritating – but SciVisum gives us that chance and it is embedded in all of our routines.”