NS, the Dutch rail operator running local and international trains, partnered with Xomnia to boost ticket sales and improve the user experience on its website. To let every visitor move through the NS International website as smoothly as possible, Xomnia’s data scientists Thijs Nieuwdorp and Nicky de Beer worked with NS' website development team to design an A/B-testing process grounded in both data and consumer psychology.
After an initial analysis of how traffic navigates the website, the team’s consumer psychologist drew up several hypotheses about how visitors interact with it, which the team then started to test. The booking module emerged as the main target, since its pages had the highest traffic and the most measurable opportunities for improvement. Thijs explains the collaboration in the case below.
Both juniors [data scientists] are smart, creative young professionals and very pleasant to work with. We use the insights we gather from their analyses and predictions to set up A/B tests and customer interviews. This leads to improvements to the website as well as to more bookings, and to relevant travel recommendations to our customers.
Tina Gremmen, online sales manager
Sustainability is more relevant today than ever. Flying produces at least 25x more CO2 than travelling by international train, which drives many people to look for other modes of transport to reduce their ecological impact. This shift in consumer priorities is an opportunity for international trains, which offer a more sustainable way to travel.
Against this backdrop, NS International seeks to grow its share of the international train travel market and to connect customers with the carriers that operate these trains. NS International faces a complex market, however, which it aims to simplify by unifying different sales platforms into a single, simple interface.
Every carrier (a party that operates its own trains) has its own pricing system, and NS International connects these different platforms and sells their services as one product. To do this, NS needs an interface that conveys all the necessary information without overwhelming the visitor. And to optimize that interface for all website visitors, you have to experiment.
We started with a tool called Visual Website Optimizer (VWO), which lets you make content changes on the website and then randomly splits traffic between the variants. Because the only variable in this setup is the difference between the variants, you can reliably test hypotheses and build on the results.
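The core mechanic of such a split test can be sketched in a few lines. This is not VWO's actual implementation, just an illustration of the principle: bucketing each visitor deterministically (here by hashing a hypothetical visitor ID) so a returning visitor always sees the same variant.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into a test variant.

    Hashing the visitor ID (instead of picking randomly per page view)
    keeps the assignment stable across visits, so each visitor has a
    consistent experience for the duration of the experiment.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

print(assign_variant("visitor-123"))
```

Because the hash is effectively uniform, traffic splits roughly evenly across the variants over many visitors.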
The more we tested, the more we ran into the tool's limitations. In our own interactive booking module in particular, VWO was not powerful enough to test changes. This is where our development team came in. Together, we designed our own A/B-testing code, which sent its results to Google Analytics. We then analyzed those results with Bayesian statistics in our own analysis scripts. This form of statistical analysis made experiment results easier to interpret and to communicate to management and other stakeholders.
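One common Bayesian approach to a conversion A/B test, and a plausible sketch of what such an analysis script might do (the actual NS scripts are not public), is the Beta-Binomial model: each variant's conversion rate gets a Beta posterior, and Monte-Carlo sampling gives the probability that B beats A. The visitor and conversion counts below are made up for illustration.

```python
import random

def prob_b_beats_a(conv_a: int, n_a: int, conv_b: int, n_b: int,
                   samples: int = 100_000, seed: int = 42) -> float:
    """Estimate P(rate_B > rate_A) under uniform Beta(1, 1) priors.

    With conv conversions out of n visitors, the posterior on the
    conversion rate is Beta(1 + conv, 1 + n - conv). We draw from both
    posteriors and count how often variant B comes out ahead.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        rate_a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / samples

# Illustrative: 120/4000 visitors convert on A, 150/4000 on B.
print(prob_b_beats_a(120, 4000, 150, 4000))
```

A statement like "there is a 97% probability that variant B is better" is exactly the kind of result that is easier to explain to management than a frequentist p-value.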
When we started weaving A/B-testing into the existing process, we ran into a big “moment of truth”. The development team had built a new feature that they wanted to release. We discussed putting it to the test first, and eventually did (“meten is weten”, Dutch for “measuring is knowing”).
According to our results, the variant the team had worked so hard on turned out to be a “loser”: a test variant that performed significantly worse than the old version of the website. This created friction between the development team, who were suddenly blocked from releasing their feature, and our new way of working backed by data and statistics.
After some discussion, and with management's support once it became clear that this is what data-driven working entails, we worked with the development team to tweak the feature based on our results. After retesting, the feature became a “winner”! The tweak? We swapped the color scheme used to mark search results… Small changes can make a big difference!
This experience turned around the local culture at NS International. The development team started to see the value of making small iterations and testing them as early as possible. A few years on, we’ve run dozens of tests, from small changes to big ones, from A/B tests to A/B/C/D tests. Testing the effect on customer behavior became a standard part of releasing any change.
By calculating the difference in conversion rate and combining it with the traffic on that page and the average order value, you can estimate the revenue impact of an A/B test. Sometimes a variant simply made no significant impact on revenue. But we’ve run tests where the winning variant improved conversion rates enough to yield an estimated €160.000 per month, and on another occasion a test saved us from implementing a booking-module variant that would have lost us €220.000 per month!
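The estimate described above is simple arithmetic: the change in conversion rate, times monthly visitors, times average order value. The figures below are purely illustrative, not NS data.

```python
def revenue_impact(rate_a: float, rate_b: float,
                   monthly_visitors: int, avg_order_value: float) -> float:
    """Estimated monthly revenue impact of replacing variant A with B.

    (rate_b - rate_a) is the absolute conversion-rate difference;
    multiplied by traffic it gives extra orders per month, and
    multiplied by the average order value it gives extra revenue.
    """
    extra_orders = (rate_b - rate_a) * monthly_visitors
    return extra_orders * avg_order_value

# Hypothetical example: conversion rises from 3.0% to 3.5% on a page
# with 200,000 monthly visitors and a €160 average order value.
print(round(revenue_impact(0.030, 0.035, 200_000, 160.0), 2))
```

A negative result of the same calculation is how a "loser" variant translates into avoided losses, as in the €220.000-per-month example above.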