Configuring Data Export from Bitrix24 to BI Systems
Management asks for a quarterly sales funnel report. A manager opens the CRM, applies filters, exports a CSV, copies it into Excel, and builds a pivot table. An hour passes. The director asks: "Now break it down by sales manager." Another hour of manual work. The data exists in Bitrix24, but extracting it by hand every time is a dead end. You need continuous export to a BI system where dashboards update themselves.
Data Export Methods
Bitrix24 offers several mechanisms for data extraction:
| Method | Best For | Limitations |
|---|---|---|
| REST API | Any CRM entities, tasks, users | Rate limit 2 requests/sec, pagination by 50 records |
| BI Connector (Professional plan and above) | Ready datasets: deals, leads, activities | Fixed field set, updates hourly |
| Webhooks | Event-driven model — reaction to create/update | No historical data |
| CSV Export | One-time exports, hypothesis testing | Manual work, no automation |
For most tasks, the optimal combination is REST API + ETL pipeline. The BI Connector is simpler to start but limited in field composition and transformation flexibility.
ETL Pipeline: Extract, Transform, Load
An ETL pipeline is a sequence: fetch data from Bitrix24, bring it to the required structure, load it into a warehouse.
Extraction. A script calls the REST API methods crm.deal.list, crm.lead.list, crm.activity.list, and others. Filter requests by modification date so that only updated records are fetched instead of the entire database each time. For the initial load, run a full iteration with pagination.
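The paging loop can be sketched as a small generator. This is a minimal sketch, not a full client: the `call` argument stands in for whatever function actually performs the HTTP request to your inbound webhook, and the response shape (`result` / `next` keys, `start` offset, pages of 50) follows the Bitrix24 list-method convention.

```python
import time

def iterate_pages(call, method, params=None, delay=0.5):
    """Yield all records from a Bitrix24 list method, following the
    `start` offset pagination (50 records per page). The `delay`
    between requests keeps the script under the ~2 req/sec limit.
    `call(method, params)` must return the parsed JSON response."""
    params = dict(params or {})
    start = 0
    while True:
        params["start"] = start
        response = call(method, params)
        for record in response.get("result", []):
            yield record
        # the API returns a "next" offset while more pages remain
        if "next" not in response:
            break
        start = response["next"]
        time.sleep(delay)
```

Injecting `call` as a parameter also makes the loop testable without network access, which pays off once the pipeline grows error handling and retries.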
Transformation. Raw data from Bitrix24 contains IDs instead of names, dates in different formats, custom fields with technical names like UF_CRM_1678901234. At this stage:
- Replace stage IDs, responsible persons, types with readable names
- Convert dates to a single format
- Expand multiple fields into separate rows or columns
- Calculate derived metrics: deal duration, conversion between stages
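A transformation step for a single deal might look like the sketch below. The reference dictionaries (stage and user names) are assumed to have been fetched separately, e.g. via crm.status.list and user.get; the field names match what crm.deal.list returns, but the mapping values here are illustrative.

```python
from datetime import datetime

def transform_deal(raw, stage_names, user_names):
    """Turn a raw crm.deal.list record into a flat analytics row:
    readable stage/manager names, parsed dates, derived duration."""
    created = datetime.fromisoformat(raw["DATE_CREATE"])
    modified = datetime.fromisoformat(raw["DATE_MODIFY"])
    return {
        "deal_id": int(raw["ID"]),
        # fall back to the raw ID if the reference data is missing
        "stage": stage_names.get(raw["STAGE_ID"], raw["STAGE_ID"]),
        "manager": user_names.get(raw["ASSIGNED_BY_ID"], raw["ASSIGNED_BY_ID"]),
        "amount": float(raw["OPPORTUNITY"] or 0),
        "created_at": created.isoformat(),
        # derived metric: how long the deal has been worked on
        "duration_days": (modified - created).days,
    }
```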
Loading. Transformed data is written to a warehouse—PostgreSQL, ClickHouse, Google BigQuery. Table structure is designed for specific reports: fact tables (deals, activities) and reference tables (managers, stages, directions).
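The load step should be idempotent: re-running the pipeline over the same records must not create duplicates. A common pattern is an upsert keyed by the CRM record ID. The sketch below uses sqlite3 purely as a stand-in for the warehouse; PostgreSQL supports the same `ON CONFLICT ... DO UPDATE` syntax, while ClickHouse and BigQuery have their own deduplication mechanisms.

```python
import sqlite3

def load_deals(conn, rows):
    """Upsert transformed deal rows into a fact table, keyed by deal_id,
    so repeated syncs overwrite rather than duplicate."""
    conn.execute("""CREATE TABLE IF NOT EXISTS fact_deals (
        deal_id INTEGER PRIMARY KEY,
        stage TEXT, manager TEXT, amount REAL,
        created_at TEXT, duration_days INTEGER)""")
    conn.executemany("""INSERT INTO fact_deals
        (deal_id, stage, manager, amount, created_at, duration_days)
        VALUES (:deal_id, :stage, :manager, :amount, :created_at, :duration_days)
        ON CONFLICT(deal_id) DO UPDATE SET
            stage = excluded.stage, manager = excluded.manager,
            amount = excluded.amount, duration_days = excluded.duration_days""",
        rows)
    conn.commit()
```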
Schedule and Incremental Load
The pipeline runs on schedule—cron, Airflow, or a simple server timer. Typical frequency is every 30–60 minutes for operational dashboards, daily for analytical reports.
Incremental loading saves time and traffic: request only records whose DATE_MODIFY is greater than the timestamp of the last successful sync. Deleted records require a separate mechanism, such as periodic reconciliation of IDs between the CRM and the warehouse.
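A minimal checkpoint mechanism, assuming the sync state is kept in a small JSON file (the file location and key name are illustrative). The `>DATE_MODIFY` key follows the Bitrix24 list-method filter syntax for "greater than".

```python
import json
from pathlib import Path

def read_checkpoint(path: Path):
    """Return the last successful sync timestamp, or None,
    which signals that a full initial load is needed."""
    if path.exists():
        return json.loads(path.read_text())["last_sync"]
    return None

def write_checkpoint(path: Path, timestamp: str):
    """Persist the sync timestamp only after a successful run."""
    path.write_text(json.dumps({"last_sync": timestamp}))

def incremental_filter(last_sync):
    """Build the list-method filter: only records modified since last sync."""
    return {">DATE_MODIFY": last_sync} if last_sync else {}
```

Writing the checkpoint only after a fully successful load means a failed run is simply retried from the previous timestamp on the next schedule tick.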
Connecting the BI System
When data is in the warehouse, the BI system (Power BI, Metabase, Superset, Google Looker Studio) connects directly to the database. Dashboards are built once and update automatically.
Typical dashboards:
- Sales funnel — conversion by stage, average check, deal duration
- Manager activity — calls, emails, meetings per period
- Plan vs. Fact — revenue target achievement by department and manager
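The funnel metric from the first dashboard reduces to a simple aggregation over the fact table. The sketch below makes a simplifying assumption: each deal's `stage` field records the furthest stage it reached, so a deal at stage N is counted as having passed all earlier stages (a production version would use stage history instead).

```python
def funnel_conversion(deals, stage_order):
    """For each stage, count deals that reached it and the conversion
    percentage relative to the previous stage. `stage_order` lists
    stages from first to last; every deal's stage must appear in it."""
    rows, prev = [], None
    for i, stage in enumerate(stage_order):
        # a deal whose stage index is >= i has passed stage i
        count = sum(1 for d in deals if stage_order.index(d["stage"]) >= i)
        conv = round(count / prev * 100, 1) if prev else 100.0
        rows.append({"stage": stage, "deals": count, "conversion_pct": conv})
        prev = count
    return rows
```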
What We Configure
- Selection of export method for the task: REST API, BI Connector, or combination
- ETL pipeline with incremental loading and error handling
- Data warehouse structure for required reports
- Sync schedule with failure monitoring
- BI system connection and basic dashboard setup
- Documentation of data structure and field mapping