What Happens In A Load Test
Clients who are new to load testing, or who have not experienced testing delivered as a managed service or built around sophisticated Dynamic User Journeys that map different kinds of usage and different kinds of peaks, often ask us what to expect.
The example below is based on a typical Load Test project.
A world-famous retail brand with an enviable high-street pedigree now sees its online business serving tens of thousands of visitors per hour at peak times. Wanting to ensure that the customer service for which it has become famous offline carries through to its online presence, the company needs to be absolutely certain that its website can handle very heavy traffic and deliver a perfect customer experience to every visitor on every visit.
As the site became increasingly popular, the eCommerce platform was starting to struggle in busy periods, especially during big sales or as a result of successful campaigns and promotions. With an impeccable reputation for quality and customer service, it is imperative that the brand delivers the same level of service online as customers have come to expect on the high street.
With sites becoming increasingly complex and relying on the integration of many different systems, identifying bottlenecks requires in-depth analysis and the correlation of many different sources of data. This flexibility has also led to more sophisticated user behaviour, as there are many more ways for customers to interact with a site. That can make testing more challenging for an organisation determined to ensure that all these different types of users and usage patterns have been understood and optimised.
It was becoming increasingly clear to the team that, as eCommerce continues to rise in strategic importance, the data available from simpler, traditional load testing (the kind they could run themselves with self-service tools, concentrating on basic metrics such as concurrent user sessions, page views, or technology-driven performance measures alone) could no longer provide the level of insight needed.
SciVisum's innovative approach to “realistic testing” through the use of Dynamic User Journeys that really “Do What The Customer Does” can provide just the in-depth information needed by all the teams involved at all stages of the eCommerce strategy and execution.
SciVisum work closely with clients to generate as close to a real-life scenario as possible. We do this through building a representative set of user journeys by collating information from benchmarked 24x7 monitoring, web analytics usage stats, and other relevant systems to generate load that offers much more precise insights than can be achieved with simpler traditional or self service methods.
Session lengths, page weights, the speed of the processes involved, and the load generated by these various kinds of journeys and the systems they run on will most likely differ greatly. In addition, the proportions of the journey types will likely change at different times of the day or periods of the year. For an effective load test it is important not only to cover a mixture of different journeys, but to weight the number of each type of journey within the test by the proportional breakdown of those journeys in different time periods, reflecting the ways the site must be able to handle different kinds of traffic peaks.
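In spirit, that weighting can be sketched in a few lines of Python. The journey names, mixes, and periods below are invented for illustration, not client data or SciVisum's actual engine:

```python
import random

# Hypothetical journey mix: the proportion of each journey type per period.
# All names and numbers are illustrative assumptions.
JOURNEY_MIX = {
    "weekday_daytime": {"browse": 0.60, "search": 0.25, "checkout": 0.15},
    "sale_peak":       {"browse": 0.35, "search": 0.25, "checkout": 0.40},
}

def pick_journey(period, rng=random):
    """Pick a journey type for one virtual user, weighted by the period's mix."""
    mix = JOURNEY_MIX[period]
    types, weights = zip(*mix.items())
    return rng.choices(types, weights=weights, k=1)[0]

# Simulating 10,000 virtual users at sale peak yields roughly 40% checkouts,
# matching the weighting rather than a flat, unrealistic spread.
counts = {"browse": 0, "search": 0, "checkout": 0}
for _ in range(10_000):
    counts[pick_journey("sale_peak")] += 1
```

Each simulated period then drives the test with its own blend of journeys, so a sale-day peak stresses checkout far harder than an ordinary weekday would.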
To accomplish this client teams work closely with SciVisum's lead load test engineer to analyse both the data collected by SciVisum’s load engine and various results from different relevant areas of the rest of the infrastructure. Working with multiple data sources and correlating the information across all of them enables the SciVisum engineer to identify likely problem areas. In turn this means that tests can be devised to specifically explore those potential problem areas as a complement to the known mix of user journeys under investigation.
Working together in this way means that the client's staff can focus in on designing and implementing changes resulting from initial findings before re-running the test to see the results.
The tests themselves can involve complex user journeys of many separate dynamic steps, and the top-level measures include:
The number of users that can complete a given journey per minute within the expected mix of total journeys.
Realistic number of user journeys per hour based on projections from web analytics about how users might “bunch” during different periods.
Maximum pages per hour, with a further detailed breakdown to reflect the fact that not all pages are equal in the load they place on a system, or in the proportion of page visits they account for in a given time period and type of traffic peak.
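To make the relationship between these measures concrete, a small back-of-envelope calculation (every rate and page count here is an invented example, not measured data) shows how a journey mix translates into total page load:

```python
# Illustrative only: derive total page requests per hour from a journey mix.
# Both tables below are assumed example figures.
journeys_per_hour = {"browse": 6_000, "search": 2_500, "checkout": 1_500}
pages_per_journey = {"browse": 8, "search": 5, "checkout": 12}

pages_per_hour = sum(
    rate * pages_per_journey[jtype]
    for jtype, rate in journeys_per_hour.items()
)
# 6000*8 + 2500*5 + 1500*12 = 78,500 pages per hour
print(pages_per_hour)
```

Because checkout journeys touch more (and heavier) pages than browsing, shifting the mix towards checkout raises the page total and the per-page cost even when the journey count stays flat.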
Another difference with the SciVisum test engine is that it does not simply 'replay' a fixed sequence of URLs for each Journey, but dynamically looks into each page to determine the link for the next step. Coupled with an understanding of user behaviour based on web analytics data, this enables SciVisum engineers to ensure that the virtual users in the load test behave, and interact with content, in the same way as real users, and so emulate stress conditions accurately when testing for stability.
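The idea of stepping through live pages rather than replaying stored URLs can be sketched with Python's standard-library HTML parser. This is a simplified illustration of the general technique, not SciVisum's actual engine:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect (link text, href) pairs from a fetched HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

def next_step(page_html, link_text):
    """Return the href of the link whose text matches the next journey step."""
    finder = LinkFinder()
    finder.feed(page_html)
    for text, href in finder.links:
        if text == link_text:
            return href
    return None

# The "Add to basket" URL is discovered in the page just fetched,
# so it still works if the site changes its URLs between test runs.
page = '<a href="/basket/add?sku=123">Add to basket</a> <a href="/help">Help</a>'
```

Because each step is resolved against the live page, the virtual user follows whatever URL the site actually serves, including session tokens or SKUs that a fixed replay script would get wrong.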
The most successful projects involve many different stakeholders at different stages in the process. When this happens it usually indicates that eCommerce is held in high importance throughout the organisation and there is interest in the load test results both from practical, hands-on, users of the data and from participants in the strategic discussions arising from the activity.
The exact process varies from client to client as each load test is bespoke, but most follow a similar outline.
First, an approach is agreed in which various stakeholders take on different project roles and responsibilities at the client and at SciVisum. Often the Web Analytics Manager from the business team will spend time with the SciVisum engineer, either by phone or onsite, to work out user journeys and a sensible mix of those journeys to generate the load, based on current traffic or projections for the future. In order to benchmark performance, SciVisum also often maintains 24/7 monitoring of User Journeys, taking measurements of journey completion time every 5 minutes, which allows the engineers to make observations that can be passed back to the client team for explanation and context.
Next comes a series of conference calls and onsite meetings involving SciVisum's Load Test Engineers, Web Performance Lead, and Customer Liaison Manager with the client teams to prepare and agree pre- and post-test actions for the load test. This is the point at which we agree what metrics to use, and how they correlate to actual traffic on the site. This enables the SciVisum Test Engineer to generate load equivalent to busy periods, then incrementally increase the load to test the capacity of the site.
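The "start at a busy-period baseline, then step up" shape of such a test can be expressed as a simple ramp plan. The baseline, step size, and hold times below are example assumptions, not figures from any real engagement:

```python
def ramp_schedule(baseline, step_pct, steps, hold_minutes):
    """Illustrative ramp plan: a baseline load, then incremental increases.

    baseline:     journeys/hour equivalent to a known busy period (assumed)
    step_pct:     percentage increase applied at each step
    steps:        number of increments beyond the baseline
    hold_minutes: how long each load level is held before stepping up
    """
    plan = []
    for i in range(steps + 1):
        load = round(baseline * (1 + step_pct / 100) ** i)
        plan.append((i * hold_minutes, load))  # (start minute, journeys/hour)
    return plan

# e.g. start at a hypothetical busy-period 10,000 journeys/hour,
# then increase by 25% every 15 minutes for four steps
schedule = ramp_schedule(10_000, 25, 4, 15)
```

Holding each level long enough for the system to settle is what lets the engineer see at which step response times or error rates begin to degrade, i.e. the practical capacity of the site.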
The load test is then executed, and the client's IT team provide the SciVisum engineer with data from their various infrastructure monitoring tools for correlation against the SciVisum load testing data. The engineer reviews all this data, not only the SciVisum test results, and writes a bespoke report detailing their findings and observations, complete with screenshots and detailed descriptions of all errors thrown and their specific implications for the client's online goals.
The report, with sections appropriate for technical and business users, is then used as the basis for a workshop with all parties involved to discuss the findings and agree actions that the client IT team can take to remove potential bottlenecks. Subsequent load tests are then run, and the results again reviewed during conference calls to agree whether any further action is required.
After The Test
Going forward, many clients devise a plan of regular load testing to ensure that changes they make to the platform during the year don't significantly reduce the capacity of their sites. This is often coupled with use of the Site Release Management tool, which is part of the monitoring suite, and the Load Test Portal, which is designed for QA testing on Dev Environments.