Integration of Bitrix24 with Power BI



The analytics team builds reports manually in Excel, downloading data from Bitrix24 once a week, so by presentation time the figures are already stale. Power BI with a direct connection to the Bitrix24 REST API refreshes dashboards automatically, hourly or on any schedule you choose.

Two Connection Approaches

Approach 1. Power BI Dataflow + REST API — configured in Power BI Service without coding. Uses the Web connector with authentication via a Bitrix24 webhook. Suitable for small data volumes (up to 50,000 records).
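Under the hood, the Web connector simply requests the inbound-webhook URL page by page. A minimal sketch of how such a URL is composed (the domain, token and helper name are placeholders, not real project code):

```python
# Base URL of a Bitrix24 inbound webhook (placeholder domain and token)
BASE = "https://your-domain.bitrix24.ru/rest/1/token/"

def connector_url(method: str, start: int = 0) -> str:
    """URL for one 50-record page, as the Web connector would fetch it."""
    return f"{BASE}{method}?start={start}"

print(connector_url("crm.deal.list", start=50))  # second page of deals
```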

Approach 2. ETL process → staging database → Power BI — a Python/Node.js script exports data from the Bitrix24 API into PostgreSQL/MySQL, and Power BI connects to the SQL database. The correct approach for production: more reliable, faster, and not affected by REST API rate limits at the time the report refreshes.

ETL Process Architecture

Bitrix24 REST API → Python ETL → PostgreSQL (analytics schema) → Power BI

The ETL script runs on a schedule (every hour via cron). The database schema uses denormalised tables suited to analytical queries:

-- Deal fact table
CREATE TABLE b24_deals (
    id            BIGINT PRIMARY KEY,
    title         TEXT,
    stage_id      VARCHAR(50),
    amount        NUMERIC(15,2),
    currency      VARCHAR(3),
    assigned_id   INT,
    contact_id    INT,
    company_id    INT,
    created_date  TIMESTAMP,
    closed_date   TIMESTAMP,
    pipeline_id   INT
);

-- Dimension tables
CREATE TABLE b24_users (
    id INT PRIMARY KEY, full_name TEXT, department TEXT, email TEXT
);
CREATE TABLE b24_stages (
    id VARCHAR(50) PRIMARY KEY, name TEXT, pipeline_id INT, sort INT, is_final BOOL
);
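The hourly cron run mentioned above can be a single crontab entry; the script path and log location here are illustrative:

```shell
# Run the ETL sync at the top of every hour, appending output to a log
0 * * * * /usr/bin/python3 /opt/etl/b24_sync.py >> /var/log/b24_etl.log 2>&1
```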

Extracting Data via REST API

The Bitrix24 REST API returns data with pagination (50 records per page). Iterative collection is required:

import requests
import psycopg2

WEBHOOK = "https://your-domain.bitrix24.ru/rest/1/token/"
PG_CONN = "postgresql://user:pass@localhost/analytics"

def fetch_all(method, params=None):
    """Paginated export from the Bitrix24 API (50 records per page)."""
    items, start = [], 0
    while True:
        r = requests.post(WEBHOOK + method, json={
            **(params or {}), "start": start
        }, timeout=30).json()
        items.extend(r.get("result", []))
        if r.get("next") is None:
            break
        start = r["next"]
    return items

def last_sync_timestamp():
    """Watermark of the previous successful run (file-based here;
    a state table in PostgreSQL works just as well).
    Falls back to a full export on the first run."""
    try:
        with open("last_sync.txt") as f:
            return f.read().strip()
    except FileNotFoundError:
        return "1970-01-01T00:00:00"

def sync_deals():
    deals = fetch_all("crm.deal.list", {
        "select": ["ID","TITLE","STAGE_ID","OPPORTUNITY","CURRENCY_ID",
                   "ASSIGNED_BY_ID","CONTACT_ID","COMPANY_ID",
                   "DATE_CREATE","CLOSEDATE","CATEGORY_ID"],
        "filter": {">=DATE_MODIFY": last_sync_timestamp()},
    })

    conn = psycopg2.connect(PG_CONN)
    cur = conn.cursor()
    for d in deals:
        # Field order mirrors the select list and the b24_deals schema
        cur.execute("""
            INSERT INTO b24_deals VALUES (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s)
            ON CONFLICT (id) DO UPDATE SET
                stage_id=EXCLUDED.stage_id, amount=EXCLUDED.amount,
                closed_date=EXCLUDED.closed_date
        """, (d["ID"], d["TITLE"], d["STAGE_ID"], d["OPPORTUNITY"],
              d["CURRENCY_ID"], d["ASSIGNED_BY_ID"], d["CONTACT_ID"],
              d["COMPANY_ID"], d["DATE_CREATE"],
              d.get("CLOSEDATE") or None,  # open deals have no close date
              d["CATEGORY_ID"]))
    conn.commit()
    cur.close()
    conn.close()
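Bitrix24 throttles intensive REST usage and answers with a QUERY_LIMIT_EXCEEDED error when the limit is hit. A generic retry wrapper (a sketch, not part of any official SDK) keeps long exports resilient:

```python
import time

def call_with_backoff(fn, *args, retries=5, base_delay=1.0, **kwargs):
    """Call fn, retrying with exponential backoff on any exception.

    Suitable for wrapping Bitrix24 API calls that may hit rate
    limits; the final attempt re-raises the original error.
    """
    for attempt in range(retries):
        try:
            return fn(*args, **kwargs)
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

A long export would then call the API through the wrapper, e.g. `call_with_backoff(fetch_all, "crm.deal.list")`.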

Key Dashboard Metrics

Power BI builds dashboards on top of the analytics tables. Typical visualisations:

Metric                               Source
Deal funnel by stage                 b24_deals GROUP BY stage_id
Revenue by manager                   b24_deals JOIN b24_users
Conversion: lead → deal → payment    b24_leads JOIN b24_deals
Average time per stage               calculated from stage transition timestamps
Manager workload (active deals)      filter on is_final = false
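As a sanity check outside Power BI, the funnel metric can be reproduced with a few lines of pandas on the same table; the sample rows below are illustrative stand-ins for a SELECT from b24_deals:

```python
import pandas as pd

# Illustrative rows standing in for "SELECT stage_id, amount FROM b24_deals"
deals = pd.DataFrame({
    "stage_id": ["NEW", "NEW", "NEGOTIATION", "WON", "WON", "LOSE"],
    "amount":   [1000,  500,   2000,          3000,  1500,  700],
})

# Deal funnel by stage: deal count and pipeline value per stage
funnel = (deals.groupby("stage_id")["amount"]
               .agg(deal_count="count", total_amount="sum")
               .reset_index())
print(funnel)
```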

Scheduled Refresh in Power BI Service

In Power BI Service, configure the dataset refresh schedule: connect to PostgreSQL via the On-premises data gateway (for local databases) or directly to a cloud PostgreSQL instance. Scheduled refresh allows up to 8 refreshes per day on a Pro license and up to 48 per day on Premium or Premium Per User.

For real-time sales monitoring, configure DirectQuery instead of Import Mode. Trade-off: queries hit the database on every report open, adding load to the database server.

Incremental Load

With large data volumes (100,000+ historical deals), a full reload every hour is wasteful. Configure Incremental Refresh (available with Pro as well as Premium licenses): Power BI partitions the dataset by a date field, using the reserved RangeStart/RangeEnd parameters, and refreshes only the recent partitions.

Task                                 Effort
ETL script (Python + psycopg2)       8–12 h
Analytics database schema            4–6 h
Power BI dashboard (5–7 reports)     8–16 h
Scheduling and monitoring setup      3–4 h