Configuring A/B Testing for Bitrix24 Mailings
A/B testing for mailings in Bitrix24 is a built-in tool in the sender module that allows you to compare variants of the email subject line, sender name, or content on a subset of the audience, and then automatically send the winning variant to the remaining subscribers. It sounds straightforward, but configuration affects the statistical validity of the result — if the test is set up carelessly, the "winner" will be a matter of chance.
How A/B testing works in the sender module
When creating a mailing, navigate to the campaign settings and enable A/B test mode. Bitrix24 allows you to test:
- Email subject line (SUBJECT)
- Sender name (FROM_NAME)
- Email content (different templates)
The audience is divided into groups: you can set the share of each test group as a percentage of the total list. For example, 15% receive variant A, 15% receive variant B, and the remaining 70% wait for the "winner". The winner is determined by a metric: Open Rate or Click Rate. The waiting period before results are tallied is a configurable parameter in hours.
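The 15/15/70 split described above can be sketched in code. This is a minimal illustration of the grouping logic, not the Bitrix24 implementation; the function name and share parameters are hypothetical:

```python
import random

def split_audience(recipients, share_a=0.15, share_b=0.15, seed=42):
    """Randomly split a mailing list into variant A, variant B, and a
    holdout group that later receives the winning variant.
    Shares are fractions of the full list (hypothetical helper,
    not part of the Bitrix24 sender module API)."""
    rng = random.Random(seed)  # fixed seed for a reproducible split
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    n_a = int(len(shuffled) * share_a)
    n_b = int(len(shuffled) * share_b)
    return {
        "variant_a": shuffled[:n_a],
        "variant_b": shuffled[n_a:n_a + n_b],
        "holdout": shuffled[n_a + n_b:],
    }

groups = split_audience([f"user{i}@example.com" for i in range(1000)])
print(len(groups["variant_a"]), len(groups["variant_b"]), len(groups["holdout"]))
# 150 150 700
```

Random assignment matters: splitting the list alphabetically or by signup date biases both groups.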
Open and click data is collected via a tracking pixel and redirect links. The pixel is embedded in the template automatically — it is a one-pixel image whose request is recorded in b_sender_mailing_chain_table. Clicks are tracked via redirect with UTM parameters through the portal's domain.
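The open/click tracking mechanism can be illustrated abstractly. The portal domain and endpoint paths below are placeholders invented for the sketch — the actual URLs are generated by the sender module:

```python
from urllib.parse import urlencode

PORTAL = "https://portal.example.com"  # hypothetical portal domain

def tracking_pixel(mailing_id, recipient_id):
    # 1x1 image; the HTTP request for it records an "open" event
    q = urlencode({"id": mailing_id, "rcpt": recipient_id})
    return f'<img src="{PORTAL}/track/open?{q}" width="1" height="1" alt="">'

def redirect_link(mailing_id, recipient_id, target_url):
    # the click goes through the portal, which logs it and then redirects
    q = urlencode({"id": mailing_id, "rcpt": recipient_id, "url": target_url})
    return f"{PORTAL}/track/click?{q}"

print(tracking_pixel(7, 1024))
print(redirect_link(7, 1024, "https://example.com/landing"))
```

Note the practical implication: opens are undercounted for recipients whose mail clients block remote images, which is another reason to prefer Click Rate as the winning metric on image-hostile audiences.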
Statistical validity: what is often ignored
The most common mistake when configuring an A/B test is too small a test sample or too short a waiting window. If the list has 500 people and the total test group is 10%, each variant reaches only 25 recipients. At that size, a difference of a single open between variants translates into a 4% gap in Open Rate — statistically meaningless.
A practical minimum is around 100 recipients per variant before any trend can be observed; significance at p < 0.05 usually requires considerably more. Bitrix24's built-in tools do not show p-values — they simply pick the variant with the higher metric. For rigorous testing, export data from b_sender_mailing_chain_table and run a z-test manually or with an external tool.
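The manual z-test mentioned above needs nothing beyond the standard library — a standard two-proportion z-test on exported open counts (the example figures are illustrative):

```python
from math import sqrt, erf

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Two-proportion z-test on open counts exported from the
    sender module's statistics tables."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)  # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 20% vs 30% open rate on 150 recipients per variant:
z, p = two_proportion_z(30, 150, 45, 150)
print(round(z, 2), round(p, 3))
# -2.0 0.046
```

Here the difference just clears p < 0.05 — with 25 recipients per variant the same proportions would be far from significant, which is exactly the sample-size trap described above.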
Waiting window. Open Rate is largely determined within the first 24–48 hours. Setting a window of less than 4 hours means measuring only those who check their email in the morning. Optimal: 24 hours on weekdays, 48 hours if the mailing falls on a weekend.
UTM parameter and analytics configuration
By default, links in Bitrix24 emails receive UTM parameters of the form utm_source=bitrix24&utm_medium=email. For A/B testing, it is essential to split traffic by variant. Configure custom UTM parameters in the template manually or via system variables:
utm_source=email&utm_medium=newsletter&utm_campaign=promo_march&utm_content=variant_a
Variant B gets utm_content=variant_b. This allows comparison not only of opens and clicks within Bitrix24, but also of on-site behaviour in Google Analytics — conversions, page depth, goal completions.
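Tagging every link in a template by hand is error-prone; a small helper that appends the UTM set from the example above is safer. The campaign name and the function itself are illustrative, not part of any Bitrix24 API:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_link(url, variant, campaign="promo_march"):
    """Append the article's UTM set to a landing-page URL;
    utm_content distinguishes variant A from variant B."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # preserve existing params
    query.update({
        "utm_source": "email",
        "utm_medium": "newsletter",
        "utm_campaign": campaign,
        "utm_content": f"variant_{variant}",
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_link("https://example.com/landing", "a"))
# https://example.com/landing?utm_source=email&utm_medium=newsletter&utm_campaign=promo_march&utm_content=variant_a
```

With both variants tagged this way, Google Analytics can segment sessions by `utm_content` and compare downstream conversions, not just opens and clicks.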
Configuration steps
- Subscriber list audit: freshness, absence of duplicates, correct segmentation
- Define the test hypothesis and the winning metric (Open Rate vs Click Rate)
- Calculate the required sample size for the target effect size
- Prepare variants A and B (subject line or template)
- Configure the campaign in the sender module: group shares, waiting period, win criterion
- Verify tracking: test send with pixel and redirect checks
- Launch, monitor, analyse results
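The sample-size step in the checklist can be done with a standard two-proportion power formula (95% confidence, 80% power); the z-score constants are z₀.₉₇₅ = 1.96 and z₀.₈₀ = 0.84:

```python
from math import sqrt

def sample_size_per_variant(p_base, uplift, z_alpha=1.96, z_power=0.84):
    """Approximate recipients needed per variant to detect a given
    absolute uplift over a baseline rate at ~95% confidence / 80% power.
    Standard two-proportion formula; constants are normal z-scores."""
    p_test = p_base + uplift
    p_avg = (p_base + p_test) / 2
    n = ((z_alpha * sqrt(2 * p_avg * (1 - p_avg))
          + z_power * sqrt(p_base * (1 - p_base) + p_test * (1 - p_test))) ** 2
         / uplift ** 2)
    return int(n) + 1  # round up

# Detecting a 5-point lift over a 20% baseline Open Rate:
print(sample_size_per_variant(0.20, 0.05))
```

For a 20% baseline and a 5-point target uplift this comes out to roughly 1,100 recipients per variant — which is why the 500-person list from the earlier example cannot produce a trustworthy winner.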
| Task | Timeline |
|---|---|
| Configuring a single A/B test | 2–4 hours |
| Configuring a test series with analytics | 1–3 days |
Pricing is calculated individually.