Setting up A/B testing of Bitrix24 mailings


A/B testing for mailings in Bitrix24 is a built-in tool in the sender module that allows you to compare variants of the email subject line, sender name, or content on a subset of the audience, and then automatically send the winning variant to the remaining subscribers. It sounds straightforward, but configuration affects the statistical validity of the result — if the test is set up carelessly, the "winner" will be a matter of chance.

How A/B testing works in the sender module

When creating a mailing, navigate to the campaign settings and enable A/B test mode. Bitrix24 allows you to test:

  • Email subject line (SUBJECT)
  • Sender name (FROM_NAME)
  • Email content (different templates)

The audience is divided into groups: you can set the share of each test group as a percentage of the total list. For example, 15% receive variant A, 15% receive variant B, and the remaining 70% wait for the "winner". The winner is determined by a metric: Open Rate or Click Rate. The waiting period before results are tallied is a configurable parameter in hours.
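The percentage split can be sketched in Python. This is an illustrative sketch of what the sender module does internally when dividing the list, not Bitrix24 code; the function name and default shares are assumptions:

```python
import random

def split_audience(recipients, share_a=0.15, share_b=0.15, seed=42):
    """Randomly split a recipient list into test groups A and B plus a
    holdout that later receives the winning variant.

    Shuffling before slicing avoids bias from list ordering
    (e.g. subscribers sorted by signup date)."""
    shuffled = recipients[:]
    random.Random(seed).shuffle(shuffled)
    n_a = round(len(shuffled) * share_a)
    n_b = round(len(shuffled) * share_b)
    group_a = shuffled[:n_a]
    group_b = shuffled[n_a:n_a + n_b]
    holdout = shuffled[n_a + n_b:]
    return group_a, group_b, holdout

# 1000 subscribers -> 150 / 150 / 700 with the 15/15/70 split above
a, b, rest = split_audience(list(range(1000)))
```

The random shuffle matters: if the groups were cut from a sorted list, variant A could end up with systematically older subscribers and the comparison would be biased.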

Open and click data is collected via a tracking pixel and redirect links. The pixel is embedded in the template automatically — it is a one-pixel image whose request is recorded in b_sender_mailing_chain_table. Clicks are tracked via redirect with UTM parameters through the portal's domain.

Statistical validity: what is often ignored

The most common mistake when configuring an A/B test is too small a test sample or too short a waiting window. If the list has 500 people and the test group is 10%, each variant reaches 25 recipients. A difference of one open between variants equals 4% Open Rate difference — statistically meaningless.

A minimum of roughly 100 recipients per variant is needed before any trend can be observed; statistically significant results at p < 0.05 usually require considerably more. Bitrix24's built-in tools do not show p-values: they simply pick the variant with the higher metric. For rigorous testing, export data from b_sender_mailing_chain_table and run a two-proportion z-test manually or in an external tool.
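The manual check mentioned above is a standard two-proportion z-test. A minimal sketch (the function name and example counts are illustrative, not part of Bitrix24):

```python
import math

def ab_ztest(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test comparing open rates of variants A and B.
    Returns (z statistic, two-sided p-value)."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # pooled proportion under the null hypothesis (no difference)
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # two-sided p-value via the standard normal CDF (math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# the 25-recipient example from the text: one extra open is noise
z_small, p_small = ab_ztest(5, 25, 6, 25)    # p far above 0.05

# a larger test where a 15-point gap is actually significant
z_big, p_big = ab_ztest(30, 100, 45, 100)    # p below 0.05
```

With 25 recipients per variant even a 4-point gap in Open Rate is indistinguishable from chance, which is exactly why the small-list example in the text is meaningless.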

Waiting window. Open Rate is largely determined within the first 24–48 hours. Setting a window of less than 4 hours means measuring only those who check their email in the morning. Optimal: 24 hours on weekdays, 48 hours if the mailing falls on a weekend.

UTM parameter and analytics configuration

By default, links in Bitrix24 emails receive UTM parameters of the form utm_source=bitrix24&utm_medium=email. For A/B testing, it is essential to split traffic by variant. Configure custom UTM parameters in the template manually or via system variables:

utm_source=email&utm_medium=newsletter&utm_campaign=promo_march&utm_content=variant_a

Variant B gets utm_content=variant_b. This allows comparison not only of opens and clicks within Bitrix24, but also of on-site behaviour in Google Analytics — conversions, page depth, goal completions.
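If links are prepared outside the template editor, the variant tag can be appended programmatically. A sketch using only the standard library (the helper name is an assumption; it preserves any query parameters already on the URL):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def tag_link(url, variant, campaign):
    """Append UTM parameters identifying the campaign and A/B variant,
    keeping any existing query parameters intact."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": "email",
        "utm_medium": "newsletter",
        "utm_campaign": campaign,
        "utm_content": f"variant_{variant}",
    })
    return urlunsplit(parts._replace(query=urlencode(query)))

link = tag_link("https://example.com/promo", "a", "promo_march")
# → https://example.com/promo?utm_source=email&utm_medium=newsletter&utm_campaign=promo_march&utm_content=variant_a
```

In Google Analytics the two variants then show up as separate utm_content segments, so on-site conversions can be compared per variant, not just opens and clicks.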

Configuration steps

  1. Subscriber list audit: freshness, absence of duplicates, correct segmentation
  2. Define the test hypothesis and the winning metric (Open Rate vs Click Rate)
  3. Calculate the required sample size for the target effect size
  4. Prepare variants A and B (subject line or template)
  5. Configure the campaign in the sender module: group shares, waiting period, win criterion
  6. Verify tracking: test send with pixel and redirect checks
  7. Launch, monitor, analyse results
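Step 3, the sample-size calculation, can be sketched with the usual normal-approximation formula for two proportions. The function name is an assumption; the z constants correspond to a two-sided alpha of 0.05 and 80% power:

```python
import math

def sample_size_per_variant(p_base, mde):
    """Approximate recipients needed per variant to detect an absolute
    lift `mde` over baseline rate `p_base` (two-sided alpha = 0.05,
    power = 0.8, normal approximation)."""
    z_alpha = 1.96   # two-sided significance level 0.05
    z_beta = 0.84    # statistical power 0.8
    p1, p2 = p_base, p_base + mde
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / mde ** 2
    return math.ceil(n)

# detecting a 5-point lift over a 20% baseline Open Rate
# needs on the order of a thousand recipients per variant
n = sample_size_per_variant(0.20, 0.05)
```

This is why the 500-subscriber example earlier in the text cannot produce a valid result: the required sample per variant exceeds the entire list.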

Task Timeline

  • Configuring a single A/B test: 2–4 hours
  • Configuring a test series with analytics: 1–3 days

Pricing is calculated individually.