Date: 9th March 2012
Some phrases and concepts cause a great deal of confusion, and consume a great deal of time, when planning website performance management projects.
Website ‘Last Mile’ monitoring is one – there are several limitations in practice, and we are finding that fewer clients get value from it within their overall end-user experience monitoring.
The phrase ‘Last Mile’ came originally from the Telco sector, where it refers to the physical wiring between a user’s premises and the local exchange.
In the realm of Website monitoring, Last Mile promises better real-world insight into the experience of customers visiting your site. That’s a good aim, and we are always striving for more Realism in our projects here.
There are a few ways to monitor the Last Mile. For a few clients we have provided dedicated test systems, with dedicated domestic-quality broadband lines installed at key locations around the UK and the same User Journeys running at each, so that comparison between them is possible. This is called CityWatch.
For clients with particularly focused geographic needs this can be effective: if, say, 30% of their user base connects from Manchester, it is vitally important to have specific data from that location.
But for most clients, it’s impossible to get a statistically significant picture of the Last Mile unless you’re measuring from all of the top five UK broadband providers, and from scores of UK locations. Clearly that’s an impractical task, and without that level of coverage the data points are simply too few to be statistically significant.
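To put a rough number on that coverage problem, here is a back-of-envelope sketch using the standard sample-size formula n = (z·σ/E)². All the figures are illustrative assumptions (the load-time variability, the margin of error, and the “scores of locations” count), not measurements:

```python
import math

# Illustrative assumptions, not measurements:
sigma = 2.0    # assumed standard deviation of last-mile load times, seconds
margin = 0.25  # desired margin of error on the mean load time, seconds
z = 1.96       # z-score for a 95% confidence level

# Samples needed per provider/location combination to estimate the
# mean load time to within +/- `margin` seconds: n = (z * sigma / E)^2
n_per_cell = math.ceil((z * sigma / margin) ** 2)

providers = 5    # "top five UK broadband providers"
locations = 50   # "scores of UK locations" (assumed)
total = n_per_cell * providers * locations

print(n_per_cell)  # samples per provider/location cell
print(total)       # total samples for one statistically useful snapshot
```

Under these assumptions you would need a few hundred measurements per provider/location combination, and tens of thousands overall, for a single statistically useful snapshot.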
Another issue is that even if a huge set of data points could be gathered, there is little actionable data to be had. Even if you know BT is having a problem at city X for an hour, there’s nothing you can do to change that. There’s no one at BT you can call to ask to fix it! The most you can hope to do is pass that information on to your customers as the reason for a slowdown.
An alternative and widely publicised approach to Last Mile promises a huge set of data points, but in reality the data it generates is flawed. It works by putting monitoring software onto real consumers’ devices, at home or on the move (in exchange for small payments): thousands of devices, connected to thousands of local ISPs, with monitoring scripts run from each.
Is the flaw obvious to you? I have teenagers at home, so it’s clear to me. The performance of any website from my laptop at home is determined to a great extent by just what my teenagers are doing online! If they are streaming audio, watching iPlayer, downloading or gaming, all of those will change the network performance I’m getting, with the effect of dropping my home broadband from fast down to old-style modem speed!
And you can imagine the variability you get on a wifi network in a cafe, where the user base contending for bandwidth varies hugely, so mobile Last Mile monitoring is particularly inaccurate.
So, statistically, there is just too much “noise” in the data for it to have any value in making evidence-based decisions from your website monitoring: Last Mile doesn’t help.
At the end of the day – and that is the most important thing – do not make decisions about your website’s 24/7 user experience unless they are evidence-based. The most important Realism is to have dynamic Journeys that Really Do what the Customer Does, whether they are mobile journeys, iPad/tablet Journeys or regular browser journeys.