Google is, without a doubt, the Ruler of the Search Engines. With over 2.3 million searches processed every minute, it’s the first place people go when they need information.
Some of these users were recently shocked when all of their search results came up in black text, instead of the familiar blue. This strange modification was the result of an A/B color test that the company was conducting.
A/B testing (also known as split testing or bucket testing) is a method of comparing two versions of a website to discover which one performs better. You modify the site to create a second version with one or two specific changes, split your visitors between the original version (the control group) and the modified one (the variant group), and then use analytics to discover which version provides the better user experience and the more positive reactions.
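In practice, the split itself is often done by hashing a stable visitor ID (a cookie value, for example), so that a returning visitor always lands in the same bucket and sees the same version. Here's a minimal sketch in Python; the visitor ID and variant names are hypothetical:

```python
# Deterministic A/B bucket assignment: hash a stable visitor ID so the
# same visitor always falls into the same bucket across visits.
import hashlib

def assign_bucket(visitor_id: str, variants=("control", "variant")) -> str:
    """Map a visitor ID to one of the variants, consistently."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A returning visitor keeps seeing the same version of the site.
print(assign_bucket("visitor-42") == assign_bucket("visitor-42"))  # True
```

Because the assignment depends only on the ID, no server-side state is needed to remember who saw which version.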
The Google team itself is no stranger to A/B testing. On one occasion, they couldn’t decide which color blue they wanted to use for their results, so they ended up testing 41 different shades!
Public Reaction to Google’s Testing
In this case, the result of the A/B test was clear: users hated the new search colors. They flooded social media platforms like Twitter to share their outrage, threatening to leave Google for other search engines and even starting the hashtag #BringBackTheBlue.
Those who disliked the change searched (most likely through Google!) for ways to revert the results back to blue. Although there wasn’t a blanket way of doing it, users reported that the following had worked for them:
- Logging out of their Google account and logging back in again
- In Chrome, going to Google’s home page, selecting “My Account” in the top right-hand corner, and disabling “Your Searches and Browsing Activity”
- Using Chrome extensions (like Stylist) to restyle Google’s web page
A/B Testing is Vital
Google has since admitted that they’re “not quite sure that black is the new blue.” This underscores the importance of A/B testing: a full rollout could have been catastrophic, driving even more people to search for ways to change Google back to blue, and perhaps even to switch to other engines.
Imagine if Google search results had never been blue and had been black instead. Perhaps the favored search engine today would be Bing or Yahoo! But Google knows how to provide a good user experience, which is why 80% of people randomly surveyed by Search Engine Land consider Google their primary search engine, with Yahoo following at 8% and Bing at 6%.
Every company has huge amounts of data generated by its customers: performance statistics from websites, email campaigns, landing pages, live chat software, pay-per-click ads, and more. Instead of just collecting it, use it as a baseline and A/B test new improvements to your website and marketing campaigns. When you find changes that work, you’ll improve your customer satisfaction, sales, ad campaigns, and the quality of your business overall.
Success with A/B Testing
A/B testing even the smallest changes is important, and it sometimes leads to the biggest successes. Blind Five Year Old, a San Francisco internet marketing firm, saw a 53% increase in click-through rates just by capitalizing their ad in Google AdWords.
Sometimes A/B test results are surprising. A client of Afterclicks Interactive discovered that, oddly enough, their audience preferred the longer original landing page to a compact, structured one. They proved this by testing a new compacted landing page against the longer original: the new page attracted a higher bounce rate and lower conversions. So it’s not always best to go with what the experts say. A/B testing will help you find out what actually works for your company.
The Ability to A/B Test Live Chat Software
Live chat software has many different features, offering the opportunity for lots of A/B tests. Some of the more popular features to test are:
- Chat buttons
- Chat windows
- Proactive invitations
- Proactive promotions
A good place to start with live chat A/B testing is the type of chat button: embedded or sticky.
An embedded button stays in a fixed spot on the webpage, best placed near the company’s phone number. A sticky button is more commonly placed in the bottom right corner; it “sticks” in position so that, as the visitor scrolls, the button remains in sight.
Companies should first A/B test which type of button works best for them. Once they have selected their type of button, they can also test its design, shape, size, color, and text content, and whether to hide it when offline or display an unavailable button.
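Once each button variant has collected enough traffic, a two-proportion z-test is one standard way to check whether a difference in click-through rates is real or just noise. Here's a minimal sketch in Python, using entirely hypothetical counts for the two button types:

```python
# Two-proportion z-test for comparing click-through rates of two variants.
# The click and view counts below are hypothetical, for illustration only.
from math import sqrt

def z_score(clicks_a: int, views_a: int, clicks_b: int, views_b: int) -> float:
    """z-statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Embedded button: 200 clicks in 1,000 views; sticky: 260 clicks in 1,000.
z = z_score(200, 1000, 260, 1000)
# |z| > 1.96 means the gap is significant at the 95% confidence level.
print(abs(z) > 1.96)  # True
```

If the test doesn't reach significance, the honest conclusion is to keep collecting data rather than declare a winner.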
Chat Window Style
When a customer clicks a chat button, one of two types of window is displayed.
A pop-up window is a separate browser window that lets visitors stay in the chat even if they close the web page. It has more room for customization, and the visitor can move it around their screen and resize it if they wish.
An embedded window moves with the visitor as they scroll, so the chat stays in view as they continue to browse the website. This type of window is common on social media platforms like Facebook. Using A/B tests, you can discover which type of window your customers prefer.
Chat Window Type
Each style of window is then divided further into stages of the chat:
A pre-chat form is an optional chat software tool. It is usually the first window that displays when a visitor clicks on an available chat button. The pre-chat form is used to qualify visitors before the chat begins. This is done through data collection, which can be as simple as a name field or expand to include email address, company name (in B2B environments), nature of enquiry, reference numbers, etc. A/B test the pre-chat form to see whether it returns positive results and which fields customers use the most. Asking a lot of questions at this stage can result in chat abandonment, so it is best to ask only vital questions in the pre-chat form and follow up later for more information.
The dialogue window is where the chat takes place, and is the only mandatory window. The branding, color of the chat bubbles, useful links and the sizing of the email/print buttons are all examples of A/B testing material within the dialogue window.
Personalizing the chat windows to the company brand is important so that visitors are fully confident that they are speaking with the business directly, rather than a third party. If the color of the default chat bubbles clashes with the company brand, businesses should test other color options that are still readable for both the operator and visitor.
Test useful links to see whether they distract visitors by navigating them away to other pages, or whether they decrease chat handling times by letting visitors answer their own questions elsewhere. A common question raised by visitors is whether they can get a copy of the chat transcript. By testing the size of the email and print buttons, companies can find the right size to reduce the amount of time spent explaining how customers can receive a transcript.
The post-chat window is an optional window that pops up when a chat has concluded. It usually displays a survey about customer experience. Test the length and type of survey so that you can get more data from your customers. I spoke with members of the tech team at live chat software provider Click4Assistance. They said previous testing results showed that two pages of shorter questions work better than one long questionnaire.
Proactive invitations are messages that automatically invite your visitors to chat based on certain rules. Test these rules to see which ones provide the best results.
First, set the invitations to appear based on timing. To maximize the chat uptake, analyze the average amount of time a visitor spends on the website, and set the proactive invitation so it doesn’t appear too early or too late.
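One simple heuristic for that starting point is to fire the invitation at a fraction of the typical dwell time, then A/B test delays around that value. Here's a sketch in Python; the dwell times (in seconds) and the one-half fraction are hypothetical choices for illustration:

```python
# Pick a starting delay for a proactive chat invitation from observed
# time-on-page data. The dwell times used below are hypothetical.
from statistics import median

def invitation_delay(dwell_seconds: list, fraction: float = 0.5) -> float:
    """Fire at a fraction of the median dwell time: late enough for the
    visitor to settle in, early enough that most are still on the page."""
    return fraction * median(dwell_seconds)

dwell = [12, 45, 30, 8, 60, 25, 90, 18]  # seconds on page, per visit
print(invitation_delay(dwell))  # 13.75
```

The median is used rather than the mean so a few very long sessions don't push the invitation too late for typical visitors.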
Other aspects to test are the pages that the invitation appears on (all of them or specific pages), and the actual image that the invitation contains. Every test matters. Since proactive invitations are known to be a bit of a dark art, even the smallest tweak can have a big impact on their success.
Proactive promotions are very similar to invitations, except the image has a URL assigned to it. If a visitor clicks accept, instead of entering a chat, their browser is redirected to a certain page. This is very useful for relaying the latest information or sales pages. Like the invitations, you can test the timing and the image of these promotions.
Live Chat Software A/B Test Results
Click4Assistance has clients who use both sticky and embedded buttons. After analyzing results from customer accounts, they found that 79% of Click4Assistance customers prefer using the sticky button. Click4Assistance users have conducted further testing around their windows and buttons. Here are some of the results:
Bonmarché started using live chat in 2015. Since their customer base skews toward an older demographic, the company thought they would need a more prominent button. However, feedback from reports run by Bonmarché’s account manager revealed that customers disliked large buttons because they covered some of the products while browsing. A smaller expandable button ended up receiving a higher click-through rate to the chat.
Energy UK launched the Home Heat Helpline in 2005. The helpline advises people who have difficulty paying bills or keeping warm in the winter, helping low-income households in urgent need of heating advice. Energy UK split tested several proactive invitation images, including a middle-aged couple, a young family, and an elderly couple. After running reports, they found that the young family image achieved the best uptake.
Your company doesn’t have to be the size of Google to do A/B testing, but it certainly helps if you’re testing 41 shades of blue! Start taking advantage of your company’s data collection, and use A/B testing to achieve goals and improve every aspect of your business.
Do you have any stories or advice about A/B testing? Let us know in the comments below!