Custom Import/Export Module Development for 1C-Bitrix
The standard Bitrix data exchange (CommerceML for 1C, CSV import into infoblocks) covers typical tasks. When the data source is non-standard, the data structure is complex, or two-way real-time synchronisation is needed, a custom module is the answer.
Custom Module Architecture
A Bitrix module is a directory in /local/modules/{vendor}.{modulename}/, registered via RegisterModule() (or Bitrix\Main\ModuleManager::registerModule() in the D7 API). Structure:
local/modules/company.import/
├── install/
│ ├── index.php # InstallDB(), UnInstallDB(), DoInstall()
│ └── db/mysql/install.sql
├── lib/
│ ├── Importer.php # main logic
│ ├── Parser/
│ │ ├── XmlParser.php
│ │ └── CsvParser.php
│ └── Queue/
│ └── ImportQueue.php
├── admin/
│ └── import_settings.php # admin interface
├── include.php
└── .settings.php
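A sketch of what install/index.php could contain under the layout above. The class name must be the module directory name with the dot replaced by an underscore; version, date, and description values are placeholders, and the SQL path assumes the install/db/mysql/install.sql file from the tree:

```php
<?php
use Bitrix\Main\ModuleManager;

// Class name = module directory name with "." replaced by "_"
class company_import extends CModule
{
    public $MODULE_ID = 'company.import';
    public $MODULE_VERSION = '1.0.0';          // placeholder
    public $MODULE_VERSION_DATE = '2024-01-01'; // placeholder
    public $MODULE_NAME = 'Custom import';
    public $MODULE_DESCRIPTION = 'Import/export for external sources';

    public function DoInstall(): void
    {
        ModuleManager::registerModule($this->MODULE_ID);
        $this->InstallDB();
    }

    public function DoUninstall(): void
    {
        $this->UnInstallDB();
        ModuleManager::unRegisterModule($this->MODULE_ID);
    }

    public function InstallDB(): void
    {
        global $DB;
        $DB->RunSQLBatch(
            $_SERVER['DOCUMENT_ROOT'] . '/local/modules/company.import/install/db/mysql/install.sql'
        );
    }

    public function UnInstallDB(): void
    {
        // drop the module's tables here
    }
}
```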
Importing from Arbitrary Sources
Import from XML (custom schema, not CommerceML):
namespace Company\Import;
use Bitrix\Main\Loader;
use Bitrix\Catalog\ProductTable;
class Importer {
private \SimpleXMLElement $xml;
public function __construct(string $filePath) {
Loader::includeModule('iblock');
Loader::includeModule('catalog');
// simplexml_load_file() returns false on a malformed or missing file
$xml = simplexml_load_file($filePath);
if ($xml === false) {
throw new \RuntimeException("Cannot parse XML file: {$filePath}");
}
$this->xml = $xml;
}
public function run(): array {
$stats = ['created' => 0, 'updated' => 0, 'errors' => 0];
foreach ($this->xml->products->product as $product) {
try {
$this->processProduct($product, $stats);
} catch (\Throwable $e) {
\Bitrix\Main\Diag\Debug::writeToFile($e->getMessage(), 'IMPORT ERROR', '/local/modules/company.import/error.log');
$stats['errors']++;
}
}
return $stats;
}
private function processProduct(\SimpleXMLElement $p, array &$stats): void {
$externalId = (string)$p->id;
$existing = $this->findByExternalId($externalId);
$fields = [
'IBLOCK_ID' => IMPORT_IBLOCK_ID, // target iblock ID, a constant defined in module settings
'NAME' => (string)$p->name,
'CODE' => \CUtil::translit((string)$p->name, 'ru'),
'ACTIVE' => (string)$p->is_active === '1' ? 'Y' : 'N',
'PROPERTY_VALUES' => [
'EXTERNAL_ID' => $externalId,
'VENDOR_CODE' => (string)$p->sku,
'DESCRIPTION' => (string)$p->description,
],
];
$el = new \CIBlockElement();
if ($existing) {
// Update() returns bool; LAST_ERROR holds the reason on failure
if (!$el->Update($existing, $fields)) throw new \RuntimeException($el->LAST_ERROR);
$elementId = $existing;
$stats['updated']++;
} else {
$elementId = $el->Add($fields);
if (!$elementId) throw new \RuntimeException($el->LAST_ERROR);
$stats['created']++;
}
// Update price and stock (catalog data lives in separate tables from the iblock element)
\CPrice::SetBasePrice($elementId, (float)$p->price, 'RUB');
\CCatalogProduct::Update($elementId, ['QUANTITY' => (int)$p->stock]);
}
}
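The findByExternalId() helper referenced above is not shown. A plausible implementation with the old GetList API, assuming the EXTERNAL_ID property used in the import fields and the same IMPORT_IBLOCK_ID constant:

```php
<?php
// Sketch: look up an element by the EXTERNAL_ID property; returns the
// element ID or null if nothing was imported under this external ID yet.
private function findByExternalId(string $externalId): ?int
{
    $res = \CIBlockElement::GetList(
        [],                                      // no sorting needed
        [
            'IBLOCK_ID' => IMPORT_IBLOCK_ID,
            'PROPERTY_EXTERNAL_ID' => $externalId,
        ],
        false,                                   // no grouping
        ['nTopCount' => 1],                      // one row is enough
        ['ID']
    );
    $row = $res->Fetch();
    return $row ? (int)$row['ID'] : null;
}
```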
Batch Processing and Progress
With large data volumes (100,000+ records), processing in a single web request is impossible: the script hits max_execution_time. We use a Bitrix agent with saved progress:
// The table stores: file, current position, statistics
class ImportQueue {
public static function processChunk(int $jobId, int $offset, int $limit = 500): array {
$job = ImportJobTable::getById($jobId)->fetch();
// ... read $limit rows starting from $offset
// ... process
// ... update progress in DB
return ['processed' => $count, 'total' => $job['total_rows']];
}
}
// Agent called every minute, processes the next chunk
function ImportAgent(): string {
$activeJob = getActiveImportJob();
if (!$activeJob) return '';
$result = ImportQueue::processChunk($activeJob['id'], $activeJob['offset']);
if ($activeJob['offset'] + $result['processed'] >= $result['total']) {
markJobComplete($activeJob['id']);
return ''; // agent finished
}
return 'ImportAgent();'; // agent continues
}
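The "read $limit rows starting from $offset" step above can be sketched without the framework. One robust variant (illustrative helper, not Bitrix API) persists the byte position from ftell() instead of a row number, so each agent run resumes exactly where the previous one stopped without re-scanning the file from the top:

```php
<?php
// Read up to $limit CSV rows starting at byte position $pos.
// Returns the rows plus the new byte position to store in the job table.
function readChunk(string $file, int $pos, int $limit): array
{
    $fh = fopen($file, 'r');
    if ($fh === false) {
        throw new \RuntimeException("Cannot open {$file}");
    }
    fseek($fh, $pos);
    $rows = [];
    // all fgetcsv() parameters passed explicitly (the $escape default
    // is deprecated as of PHP 8.4)
    while (count($rows) < $limit && ($row = fgetcsv($fh, null, ',', '"', '\\')) !== false) {
        $rows[] = $row;
    }
    $newPos = ftell($fh); // resume point for the next agent run
    fclose($fh);
    return [$rows, $newPos];
}
```

An empty $rows result signals that the job is complete and the agent can stop rescheduling itself.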
Exporting Data
Export to an arbitrary format for an external system:
class Exporter {
public function exportOrders(\DateTime $from, \DateTime $to): string {
\Bitrix\Main\Loader::includeModule('sale');
// D7 ORM filters expect Bitrix DateTime objects, not formatted strings
$orders = \Bitrix\Sale\Internals\OrderTable::getList([
'filter' => [
'>=DATE_INSERT' => \Bitrix\Main\Type\DateTime::createFromPhp($from),
'<=DATE_INSERT' => \Bitrix\Main\Type\DateTime::createFromPhp($to),
'CANCELED' => 'N',
],
'select' => ['ID', 'ACCOUNT_NUMBER', 'PRICE', 'CURRENCY', 'DATE_INSERT', 'USER_ID'],
])->fetchAll();
$xml = new \XMLWriter();
$xml->openMemory();
$xml->startDocument('1.0', 'UTF-8');
$xml->startElement('orders');
foreach ($orders as $order) {
$xml->startElement('order');
$xml->writeElement('id', $order['ID']);
$xml->writeElement('number', $order['ACCOUNT_NUMBER']);
$xml->writeElement('amount', $order['PRICE']);
$xml->writeElement('date', $order['DATE_INSERT']->format(\DateTime::ATOM));
// ... order line items
$xml->endElement();
}
$xml->endElement();
return $xml->outputMemory();
}
}
Two-Way Synchronisation
The most complex task is synchronisation without data loss when both systems change the same records simultaneously.
Solution: timestamp-based sync with a SYNC_HASH field:
ALTER TABLE b_iblock_element ADD COLUMN sync_hash VARCHAR(32);
ALTER TABLE b_iblock_element ADD COLUMN synced_at DATETIME;
When exporting, we record the state hash. On the next import: if the hash changed in the source — update in Bitrix. If the hash changed in Bitrix (user edited) — send the changes back to the source. If both changed — conflict, log for manual review.
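The decision logic above reduces to a three-way comparison of hashes: the hash recorded at the last sync, the current hash on the source side, and the current hash in Bitrix. A minimal framework-free sketch (the md5-over-sorted-fields scheme and the outcome labels are illustrative):

```php
<?php
// Hash of an item's state; ksort makes the hash independent of key order.
function hashState(array $fields): string
{
    ksort($fields);
    return md5(json_encode($fields));
}

// Compare the stored sync hash against the current hashes on both sides.
function resolveSync(string $lastSynced, string $sourceHash, string $bitrixHash): string
{
    if ($sourceHash === $lastSynced && $bitrixHash === $lastSynced) {
        return 'in_sync';  // nothing changed since the last sync
    }
    if ($sourceHash !== $lastSynced && $bitrixHash === $lastSynced) {
        return 'pull';     // only the source changed: update Bitrix
    }
    if ($sourceHash === $lastSynced && $bitrixHash !== $lastSynced) {
        return 'push';     // only Bitrix changed: send back to the source
    }
    return 'conflict';     // both changed: log for manual review
}
```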
Module Admin Interface
// admin/import_settings.php
require_once $_SERVER['DOCUMENT_ROOT'] . '/bitrix/modules/main/include/prolog_admin_before.php';
$APPLICATION->SetTitle('Import Settings');
// Settings form: FTP path, agent schedule, field mapping
// Log of recent runs with results (created/updated/errors)
// "Run now" button
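A minimal skeleton of such a page; the ftp_path option name and the form layout are illustrative, only the admin prolog/epilog includes and the Option API are standard Bitrix:

```php
<?php
// admin/import_settings.php -- skeleton only
require_once $_SERVER['DOCUMENT_ROOT'] . '/bitrix/modules/main/include/prolog_admin_before.php';

use Bitrix\Main\Config\Option;

$moduleId = 'company.import';

// Persist settings on POST; check_bitrix_sessid() guards against CSRF
if ($_SERVER['REQUEST_METHOD'] === 'POST' && check_bitrix_sessid()) {
    Option::set($moduleId, 'ftp_path', (string)($_POST['ftp_path'] ?? ''));
}

$APPLICATION->SetTitle('Import Settings');
require $_SERVER['DOCUMENT_ROOT'] . '/bitrix/modules/main/include/prolog_admin_after.php';
?>
<form method="post" action="<?= $APPLICATION->GetCurPage() ?>">
    <?= bitrix_sessid_post() ?>
    FTP path:
    <input type="text" name="ftp_path"
           value="<?= htmlspecialcharsbx(Option::get($moduleId, 'ftp_path')) ?>">
    <input type="submit" value="Save">
</form>
<?php
require $_SERVER['DOCUMENT_ROOT'] . '/bitrix/modules/main/include/epilog_admin.php';
```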
Formats and Sources
| Format/Source | Tools | Notes |
|---|---|---|
| XML (custom) | SimpleXML, XMLReader | XMLReader for files > 100 MB |
| CSV/XLSX | PhpSpreadsheet, fgetcsv | XLSX is zipped XML, needs a library |
| JSON REST API | curl, Guzzle | Pagination, rate limiting |
| FTP/SFTP | phpseclib | Automatic file download |
| 1C CommerceML | Built-in Bitrix exchange | Customisation via events |
| Google Sheets | Google Sheets API v4 | For small volumes |
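The table recommends XMLReader for files over 100 MB. The usual idiom streams the file node by node and expands only one product subtree at a time into SimpleXML, so memory stays constant regardless of file size (the products/product schema matches the import example above):

```php
<?php
// Stream-parse a large XML file; collect product names without
// loading the whole document into memory.
function streamProductNames(string $filePath): array
{
    $names = [];
    $reader = new \XMLReader();
    if (!$reader->open($filePath)) {
        throw new \RuntimeException("Cannot open {$filePath}");
    }
    // advance the cursor to the first <product> element
    while ($reader->read() && $reader->localName !== 'product') {
        continue;
    }
    while ($reader->localName === 'product') {
        // materialise just this one subtree as SimpleXML
        $product = new \SimpleXMLElement($reader->readOuterXml());
        $names[] = (string)$product->name;
        // jump to the next sibling <product> without descending
        $reader->next('product');
    }
    $reader->close();
    return $names;
}
```

Inside the loop, each `$product` can be handed to the same processProduct() logic used for small files; only the parsing strategy changes.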
Timelines
| Stage | Timeline |
|---|---|
| Format analysis and field mapping | 1–2 days |
| Parser/exporter development | 3–5 days |
| Batch processing, Bitrix agent | 2–3 days |
| Admin interface | 2–3 days |
| Two-way synchronisation (if needed) | 3–5 days |
| Testing on real data | 2–3 days |
Total: 2–3 weeks for one-way import; 3–4 weeks for two-way synchronisation.