Even the most experienced advertising veteran can be surprised by the behavior of customers and end users.
Responding to a changing landscape and refining your approach is not a sign that you made the wrong decisions; it is a sign that you are listening to your customers and honing your message to meet their needs.
One of the promises of new media is a conversation between brands and their customers. This conversation can take place in transparent channels like Twitter or Facebook, or in a forum format like getsatisfaction.com. There is also a much less direct way to have this conversation: analytics and testing.
More a call-and-response technique than a true conversation, analytics provide insight into everything your users are responding to. I would love to paint a picture of a world in which analytics refine every aspect of your projects for the best possible ROI, but that is difficult, costly, and can lead to some very unexpected outcomes. What I can tell you is that analytics, in particular a method called A/B or comparative testing, can improve your campaigns and yield real insight into them.
A/B testing is a method that compares two or more approaches to the same design challenge to determine which generates the better response (or, in some cases, to confirm that there is no better response). The results are monitored and compared, and if the differences are significant, additional tests are run to prove out the better solution. The improved solution then becomes your best practice, or “control,” for further tests. If this sounds simple, it is. It does require some effort from the client and from everyone involved: the people creating the messaging and creative assets, your technology partners, and someone to review the analytics. In return, you have the beginnings of an evolving marketing strategy with quantifiable data behind it.
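To make “if the differences are significant” concrete, here is a minimal sketch of a two-proportion z-test, the standard way to compare conversion rates between two variants. The visitor counts and conversion numbers below are hypothetical, purely for illustration.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic comparing variant B's conversion rate to variant A's."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: A converts 200 of 10,000 visitors, B converts 260 of 10,000
z = two_proportion_z(200, 10_000, 260, 10_000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% confidence level
```

If |z| stays below 1.96, the honest conclusion is that the test has not (yet) shown a difference, which is why larger samples or follow-up tests are recommended before declaring a winner.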
The idea is intuitive. It has been in use for many years in the form of best practices and specialty groups who know and understand particular audiences. What makes it so powerful in the digital domain is the near real-time results that can be put to use.
This is best demonstrated with a scenario: you place a call to action on your website for users to buy a new product. The product offers two benefits to the purchaser: longevity and increased safety. Below are two examples of messaging that can be tested to see which generates the better response.
The first message expresses the rational benefits: the longevity of the product and the value of purchasing a higher-quality item. The second message is emotion-based and draws on a bit of fear (very popular with political ads and products aimed at parents).
This example uses different creative and messaging types to determine which results in more users clicking. Since you want no bias in the test, each visitor should have a true 50/50 chance of seeing either message. This can be managed several ways; the most popular is to use an analytics solution like Google Analytics or Adobe’s Omniture to manage the testing. Google Analytics (free, but lacking real-time results) provides a simple interface with a granular breakdown of which sample performed better, how long users were engaged, whether they made a purchase (and whether it was for that same item), how likely they are to return, and so on.
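If you roll your own split instead of using an analytics tool, the 50/50 assignment is typically done by hashing a stable visitor ID, so each visitor sees the same variant on every return visit while traffic still splits evenly overall. This is a sketch only; the function name and experiment label are illustrative, not any real product's API.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-message") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing a stable ID keeps returning visitors in the same bucket,
    while the hash's uniformity yields an unbiased ~50/50 split.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A given visitor always lands in the same bucket for this experiment
print(assign_variant("visitor-42"))
```

Keying the hash on both the experiment name and the user ID means a visitor's bucket in one test doesn't correlate with their bucket in the next, which keeps separate experiments independent.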
A simple test like this generates a good data set for shaping current and future campaigns. The same method can be used to test everything from pricing to the registration process to the position of a call to action within a visual design.
Don’t get carried away. It’s very easy for marketers new to this style of analytics to be swept up in the data, and careless testing can lead to bad decisions. First, be sure your test set is large enough to yield reliable statistics. If you have traffic comparable to Amazon.com, acting on near real-time data is feasible. If your site’s traffic is closer to Uncle Bill’s Pancake House during the off-season, you will need to be more judicious in making changes.
Even the smallest websites should look at A/B testing as a way to refine their sites and shape traffic. This can be done by looking at trends over time, comparing traffic across the same days of the week, or watching how competitors’ offers may be affecting your website’s traffic and business.
When you look at all of the effort and variables required to begin A/B testing your website or digital campaign, it can seem daunting. Its benefit is not in short-term wins but in what you gain over time: a very strong understanding of your community and of how users interact with your digital properties. This is especially valuable if you aren’t yet using social media to enable a true “conversation” with users.
These same analytics also give website owners and agencies data to make evolutionary decisions with. Few things are as powerful or as useful when discussing changes as real data and case studies from your own brands. Analytics and testing give you a deep well of experience to draw from and to make bold, informed moves with.
To learn more about using comparison testing to improve your digital campaign, take a look at the sites below, which provide more detailed explanations and added perspective. And if you have any questions, leave a comment.
How to analyze A/B testing using Google Analytics