EXPERT OPINION


HOW TO MAKE SURE YOUR E-SHOTS ARE WINNING BUSINESS

In the last of a series of articles on sales and marketing in the home improvement and DIY industry, Kate Newton from home enhancement marketing agency Brookes & Co looks at optimising email marketing.

It’s not that long ago that every other unsolicited email received was about falling foul of the GDPR laws (and how ironic is that!). Companies keen to do the right thing ditched their lists of prospects and started again to ensure they were fully permission-based. After the initial pain of that decision, starting over has proved a good way to ensure that e-shots can indeed be a high-reward, low-wastage route to opening dialogue with new customers, generating repeat business and maintaining positive relationships. So, if you’re not currently using them, then it’s time to try – or try again.

The key to e-shots is testing… and then measuring the results, refining and testing again. Even simple A/B testing, also known as split-testing, means testing the effect of one small difference against a control, with everything else remaining the same. For example, testing one subject line over another to see what brings in the most email opens.


This method allows you to test different variations within a single email campaign to determine what the recipients find most motivating. You can set up two (or more, depending on the platform you use) variations of the campaign. All you need to do is: identify the variable, decide on the variations you would like to test, determine the size of data you would like to test, choose your winning metric, and keep your testing data somewhere safe, so that you can refer to it.


A/B testing can be highly effective and increase email engagement and conversions.

Step 1: Identify the variable
When thinking of A/B testing, many people automatically assume subject lines… But there are many more variables to choose from, such as:
From name/address – Does the company name receive more opens, or does a personal approach work better? info@ or hello@?
Content/wording – The tone of voice and length of the content. Does your audience prefer a short and sweet email or a content-heavy newsletter?


Email design/layout – Short emails or long emails? Wide emails or thin emails? With or without a navigation bar? The options are endless.


Artwork – GIFs vs static images? Pastel colours or bold colours?
Send date and time – What day and time are your subscribers most likely to open the email?
Call to action – Test the colours, positioning and wording used. Do your subscribers respond better to “find out more” or “take a sneak peek”?


Subject line/summary line – What is the ideal length? What about emojis? Does personalisation increase engagement?


It’s important only to test one variable at a time so that you can track changes in engagement easily. Also, don’t base decisions on one email test. Run the same variant test in various email campaigns, so that you have plenty of data.


Step 2: Decide on the variations
Once you’ve decided the topic of your test, you’ll need to decide which variations to use. For example, let’s say you want to test the colours of your calls to action: Test A = red button with white text; Test B = blue button with white text. Depending on your email management platform, you might be able to test more than one variant at a time. The bigger your mailing list, the more tests you can run with a reliable sample size, but the more variants you add, the harder it becomes to track cause and effect.
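
As a rough illustration of how adding variants thins out each test group (the 10,000-address list and 20% test share below are invented for the example, not figures from the article), a few lines of Python make the arithmetic plain:

# Hypothetical figures: a 10,000-address list with 20% reserved for testing.
list_size = 10_000
test_share = 0.20
for variants in (2, 3, 4):
    per_variant = int(list_size * test_share / variants)
    print(f"{variants} variants -> {per_variant} recipients each")

With four variants, each version reaches only 500 people, which is why fewer, cleaner tests are usually easier to read.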


Step 3: Define email test settings


The tests should be sent to a small group of your data, for example 10-20% of the database, depending on its size. Half of this group will receive test A, while the other half receives test B. The remaining recipients will receive the winning version. Next, you need to decide on test duration. Some email management platforms will give you the option to automatically send the winning test to the remaining recipients after the test period, or you can choose to send the winning test manually once you’ve analysed the results. The longer the test period, the better. If you have the time, why not test over a 24-hour period? That way you even get to test how the time of send affects engagement rate.
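
As a sketch only (the 10% test share, the function name and the use of a plain Python list are assumptions for illustration, not features of any particular email platform), the split described above might look like this:

import random

def split_for_ab_test(recipients, test_share=0.10, seed=1):
    # Shuffle a copy so the original list order is untouched.
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)
    test_size = int(len(shuffled) * test_share)
    half = test_size // 2
    group_a = shuffled[:half]              # receives test A
    group_b = shuffled[half:test_size]     # receives test B
    remainder = shuffled[test_size:]       # receives the winning version later
    return group_a, group_b, remainder

Calling split_for_ab_test(mailing_list) then hands back the two equal test groups plus the untouched remainder that will get the winner once the test period ends.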


Step 4: Choose your winning metric
Your winning metric will depend on your test. For example, if you are testing the subject line, summary line, “from” name or the “from” address, you would select “open rate” as your winning metric. Or, if you are changing a design or content element, you’d select the “click to open rate” or “effective rate” as the winning metric, because the objective of the test is to drive more clicks. The “click to open rate” is the percentage of subscribers who clicked on a link in an email relative to the total number of people who opened it, so it gauges the overall effectiveness of an email campaign.
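
To make the two metrics concrete (the figures below are invented, and your email platform will report its own numbers), open rate and click to open rate are simple ratios:

def open_rate(unique_opens, delivered):
    # Share of delivered emails that were opened, as a percentage.
    return 100.0 * unique_opens / delivered if delivered else 0.0

def click_to_open_rate(unique_clicks, unique_opens):
    # Share of openers who went on to click a link, as a percentage.
    return 100.0 * unique_clicks / unique_opens if unique_opens else 0.0

print(open_rate(220, 1000))         # 22.0 - 220 opens from 1,000 delivered
print(click_to_open_rate(55, 220))  # 25.0 - 55 clicks from 220 opens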


Step 5: Create a report and save it


There’s no point doing the testing if you don’t document the results. Save the report with a clear name and put it somewhere safe. A/B testing like this is a continuous strategy that will help you understand your prospects and customers, refine their experience and influence your approach to your marketing and business decisions.
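
One low-tech way to keep that record (the file name and column headings here are purely illustrative assumptions) is a running CSV file that you append to after every test, for example:

import csv
from datetime import date
from pathlib import Path

FIELDS = ["date", "campaign", "variable", "winning_metric", "winner", "result"]

def log_test(path, row):
    # Append one test result, writing the header row only for a new file.
    is_new = not Path(path).exists() or Path(path).stat().st_size == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow(row)

log_test("ab_test_log.csv", {
    "date": date.today().isoformat(),
    "campaign": "January sale",
    "variable": "subject line",
    "winning_metric": "open rate",
    "winner": "B",
    "result": "24.1% vs 19.8%",
})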

