SEO Configuration for 1C-Bitrix
The seo module in Bitrix generates title tags from templates like `#ELEMENT_NAME# — buy at the online store`. Works fine for the first dozen pages. On a catalog of 20,000 products with faceted filters — you end up with thousands of identical titles, pagination duplicates in the index, and GET-parameter filter URLs that Yandex happily indexes. We bring technical SEO to a state where the crawl budget is spent on revenue-generating pages, not on junk.
The Main Problem: Duplicates from the Smart Filter
This accounts for 70% of all SEO work on Bitrix, so let's start here.
The `catalog.smart.filter` component generates URLs with GET parameters: `/catalog/?brand=nike&color=white&size=42`. Yandex treats this as a separate page. There are thousands of combinations. The crawl budget burns, the index fills with junk, and main category positions drop.
The solution — a smart filter SEO module (iblock.property.type + custom URLs):
- We define the promoted combinations: "Nike sneakers," "white men's sneakers," "sneakers under 5000." These pages get human-readable URLs (`/catalog/sneakers/nike/`), a unique title, description, H1, and SEO text
- The remaining combinations are closed via `noindex, follow` in meta robots + `Disallow` in robots.txt for parametric URLs
- Settings are stored in `b_iblock_section_property` and a custom SEO rules table — the content manager controls it from the admin panel without a developer
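A minimal sketch of how a promoted combination can be routed to a clean URL via `urlrewrite.php`. The specific path, rule string, and entry script below are illustrative assumptions; a real SEO-filter module generates such rules from its own table:

```php
<?php
// /urlrewrite.php — illustrative rule: the clean filter URL is routed
// to the catalog section script, which reads the slug and re-applies
// the corresponding smart-filter state internally.
$arUrlRewrite = [
    [
        "CONDITION" => "#^/catalog/sneakers/nike/#",            // promoted combination
        "RULE"      => "SECTION_CODE=sneakers&FILTER_CODE=nike", // hypothetical parameters
        "PATH"      => "/catalog/index.php",
        "SORT"      => 100,
    ],
];
```

The `CONDITION`/`RULE`/`PATH`/`SORT` keys are the standard Bitrix urlrewrite format; everything after the route (parsing the slug, setting meta tags) is the module's job.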
Meta Tags: Three Levels of Configuration
Level 1 — templates in the information block settings.
Settings → Information Blocks → Information Block Types → [block] → SEO. Variables: `{=this.Name}`, `{=parent.Name}`, `{=this.PreviewText}`, `{=this.Property.BRAND}`. Different formulas for each information block:
- Clothing: `{=this.Property.BRAND} {=this.Name} — buy, price from {=this.Property.MIN_PRICE}`
- Equipment: `{=this.Name} {=this.Property.ARTICLE} — specifications, price, delivery`
- Different intents — different templates. "Buy sneakers" and "server equipment specifications" are different search queries
Level 2 — manual work on key pages.
The homepage, main categories, top 30 products by traffic. Manual title and description via element properties or via `$APPLICATION->SetPageProperty()` in the component template. These pages generate 60-80% of organic traffic — a template won't cut it here.
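Setting meta tags manually typically lands in a template's `component_epilog.php`, which runs outside the component cache so the properties are set on every hit. A sketch (the title/description wording is illustrative):

```php
<?php
// component_epilog.php of the catalog.element template.
// Runs on every request, even when the component body is cached.
if (!defined("B_PROLOG_INCLUDED") || B_PROLOG_INCLUDED !== true) die();

global $APPLICATION;
$APPLICATION->SetPageProperty("title", $arResult["NAME"] . " — buy at a good price");
$APPLICATION->SetPageProperty(
    "description",
    mb_substr(strip_tags((string)$arResult["PREVIEW_TEXT"]), 0, 160)
);
```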
Level 3 — SEO filters.
Unique meta tags for smart filter pages. Configured via a custom rules table or a marketplace module like aspro.seo / sotbit.seometa. Each promoted filter combination gets its own title, description, H1, and text block.
Human-Readable URLs: Where Bitrix Stumbles
URL settings live in `urlrewrite.php` and in the `SEF_MODE` component parameters. Common problems:
- Excessive nesting — `/catalog/clothing/women/dresses/summer/product-123/`. Four catalog levels. Google recommends no more than three. We restructure: `/catalog/summer-dresses/product-123/`
- Trailing slash duplicates — `/catalog/shoes` and `/catalog/shoes/` are two different URLs with identical content. Solution: a 301 redirect to the canonical variant at the Nginx level or via `urlrewrite.php` / `.htaccess` (`merge_slashes on` in Nginx additionally collapses duplicate slashes)
- www and non-www — one canonical domain, 301 redirect for the other at the Nginx level
- Symbolic codes — `CIBlockElement::Add()` and `CIBlockSection::Add()` support auto-generation of `CODE` from `NAME` via transliteration. Configured in `b_iblock` → `FIELDS` → `CODE` → `TRANSLITERATION`. Standard — ISO 9 or GOST 7.79-2000
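The www and trailing-slash rules can be expressed at the Nginx level roughly like this (host names and the regex are placeholders for a real config):

```nginx
# Redirect www to the canonical non-www host with a single 301.
server {
    listen 443 ssl;
    server_name www.site.com;
    return 301 https://site.com$request_uri;
}

server {
    listen 443 ssl;
    server_name site.com;
    merge_slashes on;  # collapse accidental // into /

    # Append a trailing slash to extensionless URLs in one hop
    # (avoid chaining this with the www redirect above).
    location ~ ^([^.]*[^/])$ {
        return 301 $1/;
    }
}
```

Keeping all normalization (scheme, host, slash) in one redirect avoids the A → B → C chains discussed below.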
sitemap.xml via the SEO Module
The seo module generates a sitemap automatically (`/bitrix/admin/seo_sitemap.php`). But the default settings are weak:
- Pagination pages, search results, and the cart page are included. They need to be excluded via module settings
- `priority` and `changefreq` are the same for all URLs. We configure: homepage — `1.0 / daily`, categories — `0.8 / weekly`, products — `0.6 / weekly`, articles — `0.5 / monthly`
- With a catalog of > 50,000 URLs — a single sitemap.xml exceeds the limit. We create a sitemap index with splits: `sitemap-products.xml`, `sitemap-categories.xml`, `sitemap-articles.xml`
- Multilingual site — separate maps with `hreflang` via `xhtml:link` in each entry
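The sitemap index for the split maps is plain XML per the sitemaps.org protocol (the domain and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://site.com/sitemap-products.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://site.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://site.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child map is itself subject to the 50,000-URL / 50 MB (uncompressed) limit, so products usually get split further as the catalog grows.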
robots.txt — Protecting the Crawl Budget
```
User-agent: Yandex
Disallow: /bitrix/
Disallow: /auth/
Disallow: /personal/
Disallow: /search/
Disallow: /cart/
Disallow: /compare/
Clean-param: utm_source&utm_medium&utm_campaign&utm_content&utm_term
Clean-param: sort&order&PAGEN_1
Crawl-delay: 0.5

User-agent: Googlebot
Disallow: /bitrix/
Disallow: /auth/
Disallow: /personal/
Disallow: /search/
Disallow: /cart/
# Google doesn't understand Clean-param — use canonical instead

Sitemap: https://site.com/sitemap.xml
```
Key point: Clean-param only works for Yandex, and Yandex itself now ignores Crawl-delay (crawl rate is set in Yandex.Webmaster instead). For Google, filter parameters are handled via canonical URLs; the Search Console "URL Parameters" tool was retired in 2022, leaving canonical as the primary mechanism.
Schema.org — Structured Data for Snippets
JSON-LD in `<head>` — recommended by both Google and Yandex. Implemented via `component_epilog.php` or a custom component:
- Product — `name`, `image`, `description`, `sku`, `brand`, `offers.price`, `offers.priceCurrency`, `offers.availability`. Data from `CIBlockElement::GetByID()` + `CCatalogProduct::GetByID()`
- AggregateRating — average rating from an information block property or a reviews module. Stars in the snippet increase CTR by 15-30%
- BreadcrumbList — navigation chain. Bitrix generates breadcrumbs via `CBitrixComponent`, but without Schema.org markup. We add `@type: ListItem` for each level
- Organization — name, address, phone, `logo`, `sameAs` (social media). Displayed in Google's right panel
- FAQPage — question-and-answer blocks. They take up a lot of space in search results, pushing competitors down
- WebSite + SearchAction — a search bar right in the snippet. `potentialAction.target` points to `/search/?q={search_term_string}`
Validation: Google Rich Results Test + Yandex.Webmaster → "Structured Data Validator."
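A minimal Product markup of the kind listed above, as it would be emitted into `<head>` (all values are placeholders to be filled from the element and price data):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Nike Air Zoom Pegasus 40",
  "image": "https://site.com/upload/iblock/abc/pegasus40.jpg",
  "description": "Lightweight running sneakers with a responsive midsole.",
  "sku": "NK-12345",
  "brand": { "@type": "Brand", "name": "Nike" },
  "offers": {
    "@type": "Offer",
    "price": "9990",
    "priceCurrency": "RUB",
    "availability": "https://schema.org/InStock"
  }
}
```

`availability` must be one of the schema.org enumeration URLs (`InStock`, `OutOfStock`, `PreOrder`), which maps naturally onto the Bitrix product quantity fields.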
Canonical and Duplicate Prevention
`$APPLICATION->SetPageProperty("canonical", $url)` — in the template of every component that can generate duplicates:
- Pagination: canonical of the first page on all `?PAGEN_1=2`, `?PAGEN_1=3`, ...
- Sorting: `?sort=price&order=asc` → canonical to the page without sort parameters
- Filters: non-promoted combinations → canonical to the parent section
- `hreflang` for multilingual sites — each language version references all others + `x-default`
- `noindex, follow` for technical pages — the bot doesn't index but follows links
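The pagination rule can be sketched in the catalog section template. This assumes the standard `PAGEN_1` parameter and that the site template outputs the `canonical` page property as a `<link rel="canonical">` tag:

```php
<?php
// In the catalog.section template: point every ?PAGEN_1=N page
// at the first (parameter-free) page of the section.
global $APPLICATION;
$request = \Bitrix\Main\Context::getCurrent()->getRequest();
if ((int)$request->get("PAGEN_1") > 1) {
    // GetCurPage() returns the path without the query string.
    $APPLICATION->SetPageProperty(
        "canonical",
        "https://" . $request->getHttpHost() . $APPLICATION->GetCurPage()
    );
}
```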
301 Redirects During Migration
Migrating from another CMS or restructuring the catalog — without 301 redirects, all accumulated link equity is lost.
- Bulk redirects via the `b_urlrewrite` table or `.htaccess` / Nginx `map`. For 10,000+ URLs — only via Nginx `map`, otherwise Apache slows down on every request
- Auto-redirect on element `CODE` change — an `OnBeforeIBlockElementUpdate` handler saves the old URL to a custom table, `init.php` checks for 404 and performs the 301
- Eliminating chains: A → B → C becomes A → C. Each extra hop risks diluting link equity and costs unnecessary crawling
- Unified format: www/non-www, HTTP/HTTPS, trailing slash/no trailing slash — one canonical variant, the rest get 301
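A bulk-redirect `map` in Nginx looks roughly like this (the paths are illustrative; in practice the pairs are generated from the old → new URL list and included as a file):

```nginx
# http {} context: a hash-table lookup, O(1) regardless of list size —
# which is why map scales to 10,000+ URLs where .htaccess does not.
map $uri $redirect_target {
    default              "";
    /old-catalog/item-1/ /catalog/summer-dresses/item-1/;
    /old-catalog/item-2/ /catalog/summer-dresses/item-2/;
    # include /etc/nginx/redirects.map;  # generated pairs
}

server {
    if ($redirect_target) {
        return 301 $redirect_target;
    }
}
```

Because the new targets are final URLs, not other entries in the map, the table itself cannot produce redirect chains.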
Page Speed — A Ranking Factor
- Composite cache — `\Bitrix\Main\Composite\Engine`. Turns a dynamic page into static HTML. First hit — PHP rendering, subsequent hits — served from file/memcached cache in milliseconds. Configuration: Performance → Composite Site, exclusions for cart and personal account
- Images — conversion to WebP via `CFile::ResizeImageGet()` with the `BX_RESIZE_IMAGE_PROPORTIONAL` parameter + lazy loading via `loading="lazy"`. Specifying `width`/`height` in `<img>` to prevent CLS
- CSS/JS — concatenation and minification via core settings: Settings → Product Settings → CSS/JS Optimization. Critical CSS inline for FCP
- CDN — Settings → CDN in the Bitrix admin panel. Static assets are pushed to edge nodes
- Server-side — Brotli/gzip, HTTP/2, OPcache with `opcache.jit` on PHP 8.1+, caching headers `Cache-Control: public, max-age=31536000` for static assets
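The image bullet above, as it typically appears in a component template. `CFile::ResizeImageGet()` is the standard Bitrix resize API; whether the output is WebP depends on the core version and server support, so treat that part as an assumption:

```php
<?php
// Resize the detail picture server-side and render it with explicit
// dimensions + lazy loading so the browser reserves space (no CLS).
$img = CFile::ResizeImageGet(
    $arResult["DETAIL_PICTURE"]["ID"],
    ["width" => 800, "height" => 600],
    BX_RESIZE_IMAGE_PROPORTIONAL,
    true // bInitSizes: fill width/height in the result array
);
?>
<img src="<?= htmlspecialcharsbx($img["src"]) ?>"
     width="<?= (int)$img["width"] ?>"
     height="<?= (int)$img["height"] ?>"
     loading="lazy"
     alt="<?= htmlspecialcharsbx($arResult["NAME"]) ?>">
```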
Index Management
- Yandex.Webmaster + Google Search Console — verification, crawl error monitoring, index coverage tracking
- Crawl budget: if only 5,000 out of 50,000 URLs in the sitemap are indexed — the bot is spending time on junk pages. We clean up robots.txt, remove duplicates, apply noindex
- 404 and soft-404 errors — from webmaster reports. Each one wastes crawl budget. Fix: 301 to the relevant page or 410 (resource permanently removed)
- Monitoring drops: if the number of indexed pages drops sharply — robots.txt may have blocked something unintended or canonical points to the wrong place
Timelines
| Task | Timeline |
|---|---|
| Basic setup (meta tags, URLs, sitemap, robots) | 1-2 weeks |
| Schema.org for an online store | 1-2 weeks |
| Comprehensive technical SEO optimization | 3-5 weeks |
| Speed optimization (Composite, images, CDN) | 2-4 weeks |
| SEO filters with meta tags and clean URLs | 2-3 weeks |
| Migration with redirects | 1-3 weeks |
SEO is not a one-time task. Yandex and Google algorithms change, competitors improve their sites, the catalog grows. We provide both one-time optimization and subscription-based SEO support with monthly audits of positions, indexation, and technical errors.