The User Experience Management dilemma: Big Box or Best of Breed

Date: 21st September 2012
Author: Deri Jones

I was talking today to an interesting senior User Experience Manager, who handles 14 countries for a B2C vendor with very strong European roots, and at least as many local websites. They had some great advice, picked up whilst trying to pull a coordinated strategy together across the User Experience teams, whilst fitting in with the overall organisational goal of becoming a more e-orientated business on the inside.

The first challenge overcome was touring the region to meet all the user experience teams and find out what tools each was using: many said they had tools in place, but on the ground were sometimes getting little value for the budget spend.

Tip No 1: Feed back constantly to all the groups, not just at the end. Rather than write a big report that appears a couple of months after the first team was visited, a more blog-like approach was taken: after each country visit a mini-report would be sent, listing the hard facts from that country (down to the list of user experience monitoring tools in use), plus some more qualitative notes on the hows of that team's approach.

This ensured all teams felt involved in the process. Questions and discussions triggered even before all countries had been profiled kept everyone engaged, and let users of the different systems discuss their likes and dislikes of their own systems directly with each other.

Out of that came a plan for a standardised set of tools that each country would be encouraged to move to. With the market in each country at a different stage of maturity, there is a big spread between the websites in terms of annual sales revenue and the local team's ability to introduce change.

The first decision was central, and relatively straightforward: standardise on a common Web Analytics platform. There was no question that this was needed; pulling the facts together from across the countries on page hits etc. was easy, and running a streamlined RFP in accordance with company guidelines went fairly quickly. Within 3 months all countries had access to Google Analytics via a central group-wide contract, with each country costing the group nothing until it actually became a user.

A smaller, bite-sized project put in place a common choice of online User Survey technology.

The next focus for a joined-up, standard tool set was the need for more detailed data on actual customers: data that would allow drilling down to segmented types of real users and tracking usage patterns and changes in some detail, including the ability to actually see examples of the error pages that real customers had been exposed to.

Whilst Web Analytics provides the top-level view of user behaviour en masse, there was a need to drill down and see which types of users are failing at which specific places in which user journeys, and to see the actual error pages those users saw: something analytics doesn't set out to provide.
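A minimal sketch of the kind of drill-down this enables, counting which user segments fail at which journey steps. The field names and data here are hypothetical, standing in for session records exported from a session-capture tool:

```python
from collections import Counter

def failure_breakdown(sessions):
    """Count journey failures by (segment, step).

    Each session is a dict with hypothetical fields: 'segment' and
    'failed_step' (None if the journey completed successfully).
    """
    return Counter(
        (s["segment"], s["failed_step"])
        for s in sessions
        if s["failed_step"] is not None
    )

# Illustrative data: which segment/step combinations fail most often?
sessions = [
    {"segment": "new-visitor", "failed_step": "payment"},
    {"segment": "new-visitor", "failed_step": "payment"},
    {"segment": "returning",   "failed_step": None},
    {"segment": "returning",   "failed_step": "login"},
]
print(failure_breakdown(sessions).most_common(1))
# → [(('new-visitor', 'payment'), 2)]
```

The point is that the output is segment-specific ("new visitors fail most often at payment"), which is exactly the level of detail page-hit analytics doesn't give you.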

So after a second formal purchasing process, a final selection was made: TeaLeaf.

Tip No 2: When going out to the market for these wider user experience measuring tools, you have a key choice to make. Do you want a Big Box approach: a fully integrated, enterprise-wide system that promises to organise your entire operation, but which, to achieve that, will require deep involvement and buy-in from the IT teams outside the Customer Experience space, and will overlap with the smaller-scope internal tools beloved of IT teams that help them get their job done?

Depending upon your overall company agility, eCommerce maturity and desired timescales, this Big Box approach may be right for your organisation. And the big suppliers like IBM, Oracle, Compuware and HP will happily pitch the benefits to you (and to directors at board level without your knowledge, if you're unlucky! *). And the big costs.

But in this case an alternative, more agile approach was desired. The success of some earlier Big Box projects had not been good, so the company's attempt to become more e-enlightened internally was to encourage agile point solutions that could be implemented quickly, without needing the wide, joined-up projects that so often slow down and stall.

Tip No 3: Website Performance Monitoring. You need to plan a strategy for getting meaningful, realistic User Journey website monitoring in place. But it often won't sit on your User Experience department's budget, so it won't be you who is the decision maker (depending on the organisation, it may sit elsewhere in the eCommerce Business team's budget, or in the eCommerce IT budget).

Why is this important? Because whatever you do in the user experience space, it is being delivered to your customers via technology. All the UI usability improvements you make, all the increasing use of Ajax and Responsive Web approaches: the final user experience depends on the technology, and unless you have meaningful measurements of the speed and performance delivered, your Business Intelligence will always be flawed.
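As a sketch of what "meaningful measurements" might look like in practice: a synthetic monitoring agent runs a scripted user journey and records how long each step took, then flags any step that blew its performance budget. The journey steps, timings and budgets below are all hypothetical:

```python
def flag_slow_steps(step_timings, budgets_ms):
    """Return the journey steps that exceeded their performance budget.

    step_timings: dict of step name -> measured time in ms, as a
    synthetic User Journey monitoring run would record per step.
    budgets_ms:   dict of step name -> allowed time in ms.
    """
    return {
        step: ms
        for step, ms in step_timings.items()
        if ms > budgets_ms.get(step, float("inf"))
    }

# One run of a hypothetical checkout journey:
timings = {"home": 800, "search": 1200, "basket": 950, "payment": 4100}
budgets = {"home": 1500, "search": 1500, "basket": 1500, "payment": 3000}
print(flag_slow_steps(timings, budgets))  # → {'payment': 4100}
```

Run regularly, per-step numbers like these turn "the site feels slow" into "the payment step breached its 3-second budget at 2 p.m.", which is the kind of fact a Business Intelligence view can act on.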

Small tweaks to the technology or small code changes can be made with the best intentions, but the Law of Unintended Consequences means there will always be times when a change slows down certain pages unexpectedly, or limits capacity during a high-traffic marketing campaign.

So the challenge for you is to work out how you can help, support and champion a move to a better website performance solution.

Tip No 4: To help other teams buy into a more realistic User Journey Monitoring approach:

The challenge is that when non-IT teams come to IT wanting 'more measurements', it often feels to IT like the benefit will go not to the tech team but to other departments.

So to help address the IT team's natural reluctance to champion such changes, key issues to cover are:

  • we don’t want to replace the tools you already use and depend on
  • new metrics on user journeys will free you from blame when the business is looking for reasons for a recent sales drop-off!
  • if the new metrics do highlight some technical areas, the data will be specific and will point out where 3rd party systems are the root cause, taking internal IT out of the frame
    • as things become more cloud-based, this 3rd-party flag becomes more important
  • drill-down data will provide exact component-level and network-level details that point specifically at the root cause, saving your team trouble-shooting time whenever an issue is highlighted
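The last two points can be sketched together: given a page's component waterfall, split the slow components into first-party and third-party by hostname, so the data itself shows when an external system is the root cause. The domain names, threshold and timings below are all hypothetical:

```python
from urllib.parse import urlparse

# Hypothetical first-party hostnames for an example retailer.
FIRST_PARTY = {"www.example-shop.com", "cdn.example-shop.com"}

def slow_components_by_origin(components, threshold_ms=1000):
    """Split slow page components into internal vs 3rd-party.

    components: list of (url, load_time_ms) tuples, as a waterfall
    or drill-down report might provide for one page load.
    """
    internal, third_party = [], []
    for url, ms in components:
        if ms <= threshold_ms:
            continue  # within budget; not an offender
        host = urlparse(url).netloc
        (internal if host in FIRST_PARTY else third_party).append((host, ms))
    return {"internal": internal, "third_party": third_party}

components = [
    ("https://www.example-shop.com/js/app.js", 400),
    ("https://tags.example-analytics.net/pixel.js", 2300),
    ("https://cdn.example-shop.com/css/main.css", 1200),
]
print(slow_components_by_origin(components))
# → {'internal': [('cdn.example-shop.com', 1200)],
#    'third_party': [('tags.example-analytics.net', 2300)]}
```

When the report arrives pre-split like this, the conversation with IT starts from "the analytics tag cost 2.3 seconds" rather than "the site is slow, was it you?".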



*  When shopping for User Experience Management tools, avoid spending too much time looking at the many good-quality server monitoring tools out there (your IT team is likely already using some of them) whose marketing hints that they are good for User Experience teams. In reality they are not: they are essentially just great tools for IT teams at a server/systems level.

Marketing phrases like this hint at the orientation around servers and hardware:

“User experience monitoring requires administrators to track performance on numerous systems throughout the network to understand how that performance impacts end users. User experience management tools can minimize end user system failure by delivering real-time data about processor and memory usage, disk usage and availability, and other key metrics that can proactively point to potential problems before they arise.”