Why Experiment? Because Small Changes Can Equal Big Results

Try, Try and Try Again

How do we get people’s attention, make them click on a link, or sign up for an Open Day? The answers to these questions can be the difference between prospective parents and paying parents.

In a busy market, with increased competition for visitor engagement, schools need to address these questions, and regular experimentation offers a great way to do that. It can not only help us attract more visitors, but also keep them for longer, showing us how our websites are performing and whether our visitors are getting the user experience they need and expect.

When it comes to journey optimisation, readability, or even the choice of imagery, being open to experimentation can eliminate guesswork and ensure that your online experience is the best it can be.

Blue or... Blue?

Every year, leading companies around the world (think Microsoft, Amazon and Google, to name a few) each conduct more than 10,000 A/B tests to help them make informed marketing decisions, using data gathered from millions of users. Often, they’re examining subtle changes – things we hardly notice on a web page or app – but they’re changes that can make all the difference.

Example:

When Google designed the advertisement links in their Gmail interface, they famously tested 41 different shades of blue to decide which colour would perform best. They showed each shade to 1% of their users and deduced that a slightly purple shade of blue led to more clicks – clicks that amounted to a $200m boost in ad revenue.

Oranges and Lemons

This kind of testing is not rocket science; it’s simply a form of randomised controlled experiment. In fact, the earliest known example was carried out as far back as 1747, when a Royal Navy physician, James Lind, ran a series of tests on Navy seamen to test his theory that citrus fruits could cure scurvy.

After eight weeks at sea, when scurvy had begun to take its toll on the crew, he divided 12 sick sailors into pairs and provided each pair with a different supplement in their diet: seawater, cider, diluted sulfuric acid, vinegar, two oranges and a lemon, or a laxative.

No prizes for guessing that the last pair didn’t fare too well, but, Lind wrote, “The most sudden and visible good effects were perceived from the use of oranges and lemons.” 

Unfortunately, it was nearly another 50 years before the British Admiralty finally made citrus fruit compulsory in the diet of sailors. Today, marketers can act on their findings far more quickly, with data-driven results quick to obtain and easy to interpret.

And it’s not just oranges and lemons. 

But We’ve Built Our Website

Yes, you have, at a moment in time. And the one pesky thing about time is that it keeps moving on. Headlines, copy styles, layout and formatting, menus, navigation, and application forms all can benefit from a review now and then. The trick is to remain agile, open-minded and not be afraid to try something new.

Google Analytics provides a useful starting point, showing you any hiccups in the user journey. Let’s say you want people to register for your Open Day; is your navigation as clear as it might be? Busy parents don’t have time to hunt around for crucial information, so make life easy for them! Try different versions of a call to action – with two different photos, for instance – or play around with your school colours. Different colours have been shown to elicit different responses – try them out!

Simply becoming your own mystery shopper and experimenting with the customer journey can help you see exactly where there is room for improvement. Having the right data at your fingertips will give you the confidence to know what is working and to try alternatives for things that aren’t.

As Simple as A/B ... Testing

Data-driven A/B testing is a simple way to experiment with low-risk modifications at very little cost. And, if you decide to go back to how things were, that’s quick and easy too.

Here is how it might work:

  • Data review: Look at your existing website analytics for suggestions of where improvements could be made, e.g. pages with a high drop-off rate.
  • Hypothesis selection: Make a list of changes that you think may perform better than the current version. You can prioritise them in terms of expected impact and ease of implementation.
  • Choice of metric: You’ll need to decide how you will measure whether a change is effective. That could be anything from the number of clicks through to your admissions page, to the number of social shares, or sign-ups for an Open Day.
  • Test build: A/B testing software will help you implement the changes. This could be something as simple as changing a call-to-action button, altering navigation, or swapping an image.
  • Test run: Kick off your experiment and wait. At this point, users of your website will be randomly assigned to either the control or the variation of your test experience. Their interaction with each is measured, counted and compared.
  • Analyse and conclude: It’s time to analyse the results. Examine the data to see which version performed best and whether the improvement is sufficient to warrant making the change permanent – a simple example of this comparison is sketched after this list.
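
To make that final step concrete, here is a minimal sketch in Python of how the control-versus-variation comparison might look once a test has run. The function name, the visitor counts and the sign-up numbers are all invented for illustration; the calculation is a standard two-proportion z-test, which is the kind of comparison most A/B testing tools perform for you behind the scenes.

    from math import sqrt, erfc

    def ab_test_summary(control_visitors, control_conversions,
                        variant_visitors, variant_conversions):
        """Compare conversion rates for a control and a variant page
        using a two-sided, two-proportion z-test."""
        rate_control = control_conversions / control_visitors
        rate_variant = variant_conversions / variant_visitors

        # Pooled rate, assuming both versions perform the same (the null hypothesis).
        pooled = (control_conversions + variant_conversions) / (
            control_visitors + variant_visitors)
        std_error = sqrt(pooled * (1 - pooled) *
                         (1 / control_visitors + 1 / variant_visitors))
        z_score = (rate_variant - rate_control) / std_error

        # Two-sided p-value from the normal approximation.
        p_value = erfc(abs(z_score) / sqrt(2))

        return {
            "control_rate": rate_control,
            "variant_rate": rate_variant,
            "uplift": rate_variant - rate_control,
            "p_value": p_value,
        }

    # Invented numbers: Open Day sign-ups from two versions of a call-to-action
    # button, shown to visitors who were randomly assigned to one or the other.
    result = ab_test_summary(control_visitors=2000, control_conversions=90,
                             variant_visitors=2000, variant_conversions=120)
    print(f"Control: {result['control_rate']:.1%}  "
          f"Variant: {result['variant_rate']:.1%}  "
          f"p-value: {result['p_value']:.3f}")

If the reported p-value is small (conventionally below 0.05), the difference is unlikely to be down to chance and the winning version is worth keeping. In practice, your A/B testing software will report these figures for you; a sketch like this is mainly useful for sanity-checking what the tool tells you.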

It’s a No-Brainer

One by one, tests like these will show you what is working for your audience. Put those ingredients together and voila! You’ll have the recipe for a website that works.

Experimenting with digital optimisation is central to effective digital marketing – ignore it at your peril!


 


