NOTE: This article is for customers with the New Admin Experience interface (vertical left-navigation bar).
If your interface is different, view the "classic" version of this article.
You can use the A/B Testing feature of campaigns to test the effectiveness of your messages. You do this by sending two or more variant messages to sample groups and then determining which message drew the better response.
NOTE: A/B Testing is available for standard campaigns and Landing Page campaigns.
The "testable" parts of messages are the:
- Message - send a different sample message to each test group.
- Subject line - specify a different subject line for each sample message.
- Pre-header - specify different pre-header text for each sample message.
- From field - specify a different from address for each sample message.
- Time of send - send the sample messages on a different date and time.
NOTE: These options and their purpose are described in more detail later in this article.
After you configure which aspect of the message to test, you define what makes one sample message the "winner," when the test will end, and any post-test actions.
The system will send the sample messages, track the results, and then determine the "winner" based on the criteria you specified during setup.
Create an A/B Testing campaign
The bulk of creating an A/B Testing campaign is done on the Setup Campaign page, which has several sections that are described below in Set up an A/B Testing campaign.
To create an A/B Testing campaign:
- Access the Admin interface.
- Click Campaigns > List.
- On the Campaign Management list page, enter a name for the new campaign in the field at the top and click Create New.
NOTE: A/B Testing campaign names can contain and can start with a number. Those that start with a number will be listed before those that start with a letter. For example, 2019 Renewal and 2020 Welcome would appear earlier in the list than Renewal 2019 and Welcome 2020.
- On the Create Campaign page, select the industry your organization belongs to. The information in the How do I know which campaign to choose? area will update based on the chosen industry.
- Click the right dropdown and select A/B Testing.
- Click Select.
- Continue with Set up an A/B Testing campaign, below.
Set up an A/B Testing campaign
The Setup page has three sections, each with multiple configuration options that can change as you work through the setup and make selections. The sections are:
- Select Test Members
- Setup Test Variable
- Determine Winner
There are also a variety of option types in each section; these are described below.
- Circled "i" icon - These are on-page information boxes. Click to access details that are specific to that section. TIP: "Grab" the title bar of the info box to reposition it.
- Blue text - These are prompts for you to choose something. Click to open an on-page dialog to configure the option. (Some of these dialogs might present additional dialogs.) After selecting an option, click it again to re-open the dialog to remove it and/or make other selections.
- Dotted-underline text - These are prompts for you to choose something. Click to open an on-page list of options. (Some of these options might change the clickable items that follow them, or even add options that weren't there but which are now required.) After selecting an option, click it again to re-open the list to change your selection.
- Text boxes - These have different behaviors. For example, click into one to specify a number, or click into an hh:mm box to select from a list of times.
- Plus symbol (+) - These are optional and enable you to further refine how you want the test to perform.
1 - Select Test Members
OBJECTIVE: Choose the recipients (by group) of the test, the number of recipients in each "sample," and how many samples to send.
TIP: Click the circled "i" icon for section-specific guidance.
The following table describes the options in this section.
Option | Description |
---|---|
Who would you like to include in the test? | Click Select Group(s) and choose one or more groups from the dialog. Click + condition and use the on-page selector to filter the recipients in your selected groups. Click + condition again to add more conditions. |
How many samples would you like to test? | Specify the number of samples (2 to 26) to use in your test. Click Submit. |
Setup samples | Specify the number of recipients in each sample and select the related options.
NOTE: Recipients are not enrolled until the test is deployed, so it's possible to assign more recipients to the samples than actually exist in your selected groups. If you want to use large sample sizes, you may want to first conduct a Recipient Search for the groups you're using for enrollment. This will give you an estimate of how many recipients you can expect. |
When the test campaign is deployed:
- All recipients in these groups are possible candidates for the test. Any members added to the groups after the test is deployed will not be added to the test.
- Members will be randomly added to fill each sample to your specified threshold. These members will be sent the appropriate message for their sample. The remaining, non-participant members will not be sent a test message.
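The random-fill behavior described above can be sketched as follows. This is a minimal illustration of the concept, not Real Magnet's actual implementation; the group members, sample sizes, and lettered sample labels are hypothetical:

```python
import random
import string

def fill_samples(members, sample_sizes, seed=None):
    """Randomly assign group members to lettered samples (A, B, ...).

    Members left over after every sample reaches its threshold are
    non-participants and receive no test message.
    """
    rng = random.Random(seed)
    pool = list(members)
    rng.shuffle(pool)  # random order, so each fill is a random draw
    samples = {}
    start = 0
    for label, size in zip(string.ascii_uppercase, sample_sizes):
        samples[label] = pool[start:start + size]
        start += size
    non_participants = pool[start:]
    return samples, non_participants

# Hypothetical group of 10 recipients, split into two samples of 3 each;
# the remaining 4 members receive no test message.
members = [f"recipient{i}@example.com" for i in range(10)]
samples, rest = fill_samples(members, [3, 3], seed=42)
```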
NOTE: When setting up your samples, make sure that they are large enough to yield a usable result. Choosing a very small percentage from a very small sample could leave you with a result that is due more to random chance than any meaningful differences.
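To see why very small samples are risky, you can compare two sample open rates with a standard two-proportion z-test. This is a generic statistical sketch with hypothetical numbers, not a product feature: the same 40% vs. 30% difference that is statistically meaningless at 10 recipients per sample becomes decisive at 1,000.

```python
import math

def two_proportion_p_value(opens_a, n_a, opens_b, n_b):
    """Two-sided p-value for the difference between two open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Same 40% vs 30% open rates, small vs large samples.
small = two_proportion_p_value(4, 10, 3, 10)          # p ~ 0.64: could be chance
large = two_proportion_p_value(400, 1000, 300, 1000)  # p well below 0.05
```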
2 - Setup Test Variable
OBJECTIVE: Choose which part of the message you want to test and then configure it; or choose a time of day.
TIP: Click the circled "i" icon for section-specific guidance.
The following table describes the options in this section.
Option | Description |
---|---|
What would you like to test? | Select message to send a different message to each sample. Then, select a date and time to send your messages. Use case: Test different messages for the same product. Use the results to determine which content or layout performed better/best at driving web traffic to your sales page. |
 | Select subject line to enter a different subject line for each sample. Use case: Test different subject lines, soliciting a response. Check the open rates to see which was more/most effective. |
 | Select pre-header to create different pre-header text (appears when the message is viewed as part of a list) for the same message. Use case: Test different pre-header text to see if it impacts open rates. |
 | Select from field to enter a different from address for each sample. If your account has a default from address option, it will appear. Use case: Test different from addresses to see if there is any difference in their effectiveness. |
 | Select time of send to have each sample message sent at a different date and time. Use case: Test how and whether different days (or weekends) and times of day affect open rates. |
Samples | Use the configuration options to select/specify the variables for each sample. |
Configure Send Options | Choose a date and time to start sending the sample messages (except for time of send) and click send options to set other parameters. |
3 - Determine Winner
OBJECTIVE: Specify what determines the "winner," when the test will end, and optional post-test actions.
TIP: Click the circled "i" icon for section-specific guidance.
The following table describes the options in this section.
Option | Description |
---|---|
How do you want to determine the winner? | Select either open rate (percentage of messages opened by the recipients in each sample) or click rate (number of clicks per message in each sample) to determine the "winner" and, optionally, choose whether a minimum rate is required in order to determine the winner. |
What time do you want to evaluate the winner? | Select an end date and time -- after the last send time for all sent messages -- for the test. Real Magnet will calculate which message "won." TIP: Give your samples plenty of time (two days is suggested) for reasonable participation and response. |
When the winner is determined: | Choose what to do, if anything, when the winner is determined. |
After you've specified all the conditions, click Next to review your campaign's configuration on the Review page.
NOTE: If you click Next and there are unsatisfied conditions, you will be prompted to "fix the errors" (displayed in red) before you can proceed to the Review page.
NOTE: The system does not "QA" the logic of your conditions. You should review your selections before proceeding.
Review the testing campaign
The Review page presents a "consumable" summary of your configuration options.
- Click Print Screen to open a "preview" of this page. You can print a copy or choose one of the save options.
- Some conditions might appear as links. When a link is clicked, that message opens in a new tab. This is a useful "check" before deploying your A/B Testing campaign.
- If you want to make changes, click Setup Campaign in the top bar.
- After reviewing your configuration options and conditions, click Next to access the Deploy page.
Deploy the testing campaign
The Deploy page is your last opportunity to review your testing campaign details. If you're satisfied and want to proceed:
- Choose when to start the campaign by either accepting the "immediately" option or selecting the "schedule" option and choosing a date and time.
- Choose when to end the campaign by either accepting the "manually" option or selecting the "schedule" option and choosing a date and time.
- Click Deploy Campaign.
- At the confirmation prompt, click Yes.
The page refreshes and updates the Schedule section with your start and end selections. It also now includes a Campaign Status section in which you can cancel the pending deployment.
TIP: Navigate back to the Campaign Management list page and confirm that your testing campaign is listed and that its details are correct.
Next step...
After your A/B Testing campaign is deployed, you can track its progress. See Track an A/B Testing Campaign to learn more.