A/B testing (or split testing) is a classic direct mail fundraising tactic that online marketers have widely adopted to track constituent engagement and find out which version of a web page prospective donors (or event participants) connect with most. You can test layout, graphics, colors, copy, headings, and any other element of your page to see what effect the changes have on metrics like time on page or the completion rate of donation forms.
How does it work? When beginning an A/B test, you need to develop a call to action, a control, and one or more variations on the control. The fewer variations you test, the simpler the process and the more reliable your results will be. The basic approach is to measure the success rate of the control and of each variation, then compare them to find out which is more effective.
Let’s say you are considering a new design for your event’s online registration form. You want to test the effectiveness of the current form (the control) against a new form (the variation). In this case, your form’s success is determined by whether or not a visitor registers for your event. Before you test the new form, you need to establish your current conversion rate: the percentage of people viewing your form who actually register. A small tracking script embedded in the page typically keeps count of conversions, and free tools like Google Website Optimizer and Google Analytics can help you set up tracking mechanisms to determine success.
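The conversion rate described above is simple arithmetic: completed registrations divided by form views. A minimal sketch in Python (the numbers are hypothetical, for illustration only):

```python
# Conversion rate: the fraction of form viewers who complete the action.
def conversion_rate(conversions, visitors):
    """Return conversions / visitors as a fraction (0.0 to 1.0)."""
    if visitors == 0:
        return 0.0
    return conversions / visitors

# Example: 42 registrations out of 1,000 form views
rate = conversion_rate(42, 1000)
print(f"{rate:.1%}")  # 4.2%
```

This baseline figure is what each variation will be measured against once the test begins.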
When you’re ready to test, you can drop a simple script on the page that sends some visitors to the current form and others to the new form on a random basis. A script is basically a block of code attached to your web page that tells it to perform certain tasks; scripts are usually included with optimizer tools (see below). By comparing each form’s conversion rate, you’ll know which form is more effective at getting visitors to register. The bigger your sample size (the longer you run the test), the more reliable the test results will be.
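The split-and-compare logic that an optimizer tool handles for you can be sketched in a few lines. This is an illustrative server-side sketch, not any particular tool's implementation; the function and variable names are hypothetical:

```python
import random

# Each visitor is randomly assigned to the control or the variation,
# and views and registrations are tallied per group so the two
# conversion rates can be compared at the end of the test.
def assign_variant():
    """Randomly send a visitor to 'control' or 'variation' (50/50 split)."""
    return "control" if random.random() < 0.5 else "variation"

views = {"control": 0, "variation": 0}
registrations = {"control": 0, "variation": 0}

def record_view(variant):
    views[variant] += 1

def record_registration(variant):
    registrations[variant] += 1

def report():
    """Print each variant's conversion rate once the test has run."""
    for variant in views:
        rate = registrations[variant] / views[variant] if views[variant] else 0.0
        print(f"{variant}: {rate:.1%} ({registrations[variant]}/{views[variant]})")
```

In practice the random assignment is remembered per visitor (e.g. in a cookie) so returning visitors always see the same version, which the sketch above omits for brevity.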
President Obama’s campaign analytics team is a popular example of how a group of web marketers used website optimization and A/B testing to improve their conversion rates. The group tested several versions of its website splash page to find out which was most successful in getting people to sign up for the mailing list. The campaign staff was surprised to find that the images they preferred weren’t the ones that delivered the best results.
Obama’s campaign used Google’s free Website Optimizer, one of many A/B testing tools available, to perform this effective test. Optimizer makes testing much easier and more accessible, especially if you’re testing multiple variations. You can use it to test any combination of variables and eliminate potential errors in your calculations. Optimizer will create reports that analyze your results and take the guesswork out of optimizing your website for success.
The group tested two different aspects of the splash page: the media on display (a graphic or video) and the call to action button. They had six versions of the media (including an image of the Obama family and a video of Obama) and three versions of the call to action button (Learn More, Join Us Now, and Sign Up Now), and tested the splash page with various combinations of the two. They found that the combination of the family image with the “Learn More” button had the highest success rate. The switch from the original splash page to the winning combination led to a 40.6% increase in the sign-up rate. Typically, 10% of those who signed up for the list during the campaign became volunteers. So if the campaign staff hadn’t run the tests, they could have missed out on about 288,000 additional volunteers!
Have you actively begun testing your important conversion pages?