# Cross-chain Bridge Security Audit
Cross-chain bridges are the largest category of DeFi losses by amount of stolen funds. Ronin ($625M, March 2022), Wormhole ($320M, February 2022), Nomad ($190M, August 2022), Harmony Horizon ($100M, June 2022) — four hacks in one year, totaling over $1.2 billion. The four attacks used fundamentally different vectors, which points to the systemic complexity of bridge security.
A bridge audit is not a standard smart contract audit. It covers the cryptographic protocols, the trust assumptions of the validator network, the off-chain infrastructure (relayers, oracles), the economic model, and operational procedures. None of these areas alone gives the full risk picture.
## Attack surface: what is unique about bridges
### Cross-chain message integrity
The central problem of any bridge: how does the destination chain verify that an event on the source chain actually happened? Different architectures solve this differently, and each solution has its own specific vulnerabilities.
Multisig validator model. A threshold of validator signatures confirms events. Attack vector: key compromise or validator collusion. Ronin: 5 of 9 validator keys were effectively under one operator's control, and the attacker obtained them through infrastructure compromise plus social engineering (a fake job offer).
Optimistic model. Messages are valid unless challenged within a dispute window. Nomad was hacked differently: an initialization bug caused any message to be treated as proven — the attacker copied an already-valid transaction, replaced the recipient with his own address, and the trick was repeated for every token in the bridge.
Light client verification. The most secure model, but expensive in gas. The audit checks the correctness of the consensus-proof verification implementation, especially edge cases in BFT finality.
### Finality assumptions
The bridge must correctly determine when a source-chain event is final. Processing it prematurely (before finality) creates a window for double-spends via chain reorganization.
| Network | Finality | Type |
|---|---|---|
| Ethereum PoS | ~13 minutes (2 epochs) | Probabilistic → Economic |
| Arbitrum | Instant soft confirmation; L1 finality ~1 hour after the batch is posted | Sequencer → L1 |
| Polygon PoS | 128 blocks (~5 min on Bor) | Checkpoint-based |
| Solana | ~400 ms per slot, optimistic confirmation; finalized after ~32 slots | Probabilistic |
| BNB Chain | 15 blocks (~45 seconds) | PoSA |
Audit check: the requiredConfirmations parameter in the relayer must match the actual finality of each supported network. Underestimating it risks a reorg attack; overestimating it degrades UX.
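One way to make this auditable is to store the per-chain confirmation depth on-chain, so governance rather than relayer operators controls the finality assumptions. A minimal sketch (contract and function names are hypothetical, not from a specific bridge):

```solidity
pragma solidity ^0.8.20;

// Hypothetical sketch: per-source-chain confirmation depth stored on-chain,
// so the relayer and the contract share one source of truth for finality.
contract ConfirmationConfig {
    address public immutable governance;

    // sourceChainId => number of confirmations the relayer must wait for
    mapping(uint256 => uint256) public requiredConfirmations;

    constructor(address _governance) {
        governance = _governance;
    }

    function setRequiredConfirmations(uint256 chainId, uint256 blocks) external {
        require(msg.sender == governance, "Not governance");
        require(blocks > 0, "Zero confirmations"); // never accept unconfirmed events
        requiredConfirmations[chainId] = blocks;
    }
}
```

The relayer then reads requiredConfirmations before submitting, and an auditor can compare the stored values against each chain's finality table in one place.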
## Smart contract analysis
### Signature verification vulnerabilities
Statistically the number-one class of vulnerabilities in bridges.
Signature malleability. Raw ecrecover in Solidity accepts both signature forms (low-S and high-S). An attacker can transform a valid signature into a second valid signature for the same message with different (s, v) values. If the bridge uses the signature hash as the unique identifier for replay protection, this bypasses the protection.
```solidity
// VULNERABLE: raw ecrecover accepts both the low-S and high-S signature forms
address signer = ecrecover(messageHash, v, r, s);

// SECURE: OpenZeppelin ECDSA rejects the malleable high-S form
// (reverts unless uint256(s) <= secp256k1n / 2) and a zero-address result
address signer = ECDSA.recover(messageHash, signature);
```
Missing chain ID in the domain separator. An EIP-712 domain separator must include chainId. Without it, a signature produced for the Ethereum deployment verifies equally well on the Polygon deployment of the same bridge, enabling cross-chain signature replay.
Missing deadline in the signed message. A signature without an expiry can be reused years later. Including a deadline field in the signed structure is mandatory.
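The two requirements above combine naturally in one EIP-712 scheme. A minimal sketch — the Release struct, names, and fields are hypothetical, not a specific bridge's message format:

```solidity
pragma solidity ^0.8.20;

// Hypothetical sketch: EIP-712 signing with chainId bound via the domain
// separator and an explicit deadline in the signed payload.
contract ReleaseVerifier {
    bytes32 public immutable DOMAIN_SEPARATOR;

    bytes32 private constant RELEASE_TYPEHASH =
        keccak256("Release(address recipient,uint256 amount,bytes32 nonce,uint256 deadline)");

    constructor() {
        DOMAIN_SEPARATOR = keccak256(abi.encode(
            keccak256("EIP712Domain(string name,string version,uint256 chainId,address verifyingContract)"),
            keccak256(bytes("Bridge")), // hypothetical app name
            keccak256(bytes("1")),
            block.chainid,              // binds signatures to this chain
            address(this)               // binds signatures to this deployment
        ));
    }

    function _hashRelease(address recipient, uint256 amount, bytes32 nonce, uint256 deadline)
        internal view returns (bytes32)
    {
        require(block.timestamp <= deadline, "Signature expired"); // reject stale signatures
        return keccak256(abi.encodePacked(
            "\x19\x01",
            DOMAIN_SEPARATOR,
            keccak256(abi.encode(RELEASE_TYPEHASH, recipient, amount, nonce, deadline))
        ));
    }
}
```

A signature over this hash is unusable on any other chain, on any other contract, and after the deadline passes.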
### Replay protection
```solidity
// Check everything in one place before execution
function _validateAndMarkNonce(
    uint256 sourceChainId,
    uint256 destChainId,
    bytes32 nonce
) internal {
    require(destChainId == block.chainid, "Wrong destination chain");
    require(!processedNonces[sourceChainId][nonce], "Nonce already used");
    processedNonces[sourceChainId][nonce] = true;
}
```
Audit check: the nonce must be namespaced by source chain ID — otherwise nonce N from Ethereum and nonce N from Polygon collide, and a message proven on one chain could be replayed as if it originated on the other.
### Reentrancy in token callbacks
Some tokens invoke callbacks during transfer/transferFrom (ERC-777 tokensReceived, ERC-677 transferAndCall, custom hooks). If the bridge transfers tokens before updating state (processedNonces, balances), the callback can reenter the bridge function.
```solidity
// VULNERABLE: transfer before state update
function release(address token, address recipient, uint256 amount, bytes32 nonce) external {
    // BUG: nonce is marked only after the external call
    IERC20(token).safeTransfer(recipient, amount); // token callback can reenter here
    processedNonces[nonce] = true;                 // too late
}

// SECURE: Checks-Effects-Interactions
function release(address token, address recipient, uint256 amount, bytes32 nonce) external {
    require(!processedNonces[nonce], "Used");
    processedNonces[nonce] = true;                 // state first
    IERC20(token).safeTransfer(recipient, amount); // external call last
}
```
### Wrapped token contract
The bridge mints wrapped tokens on the destination chain. When auditing the wrapped token contract, check:
- Who holds MINTER_ROLE? Only the bridge contract, or also the deployer?
- Is there a mint cap? Without one, a compromised minter role means unlimited minting
- DeFi compatibility: no fee-on-transfer or rebase logic that breaks the bridge's accounting
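The checklist above can be sketched as a minimal wrapped token. The token name, symbol, and the fixed supplyCap are hypothetical (OpenZeppelin ERC20 assumed):

```solidity
pragma solidity ^0.8.20;

import {ERC20} from "@openzeppelin/contracts/token/ERC20/ERC20.sol";

// Hypothetical sketch: wrapped token where only the bridge can mint and burn,
// and total supply can never exceed a hard cap.
contract WrappedToken is ERC20 {
    address public immutable bridge; // the only authorized minter
    uint256 public immutable supplyCap;

    constructor(address _bridge, uint256 _cap) ERC20("Wrapped Example", "wEXM") {
        bridge = _bridge;
        supplyCap = _cap;
    }

    function mint(address to, uint256 amount) external {
        require(msg.sender == bridge, "Only bridge");
        require(totalSupply() + amount <= supplyCap, "Cap exceeded"); // bounds minter damage
        _mint(to, amount);
    }

    function burn(address from, uint256 amount) external {
        require(msg.sender == bridge, "Only bridge");
        _burn(from, amount);
    }
}
```

In practice the cap might be adjustable by governance under a timelock; the point is that a compromised minter role can mint at most the cap, not an unbounded amount.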
### Upgrade mechanism
Upgradeable bridges are the norm (bugs must be fixable), but the upgrade path is itself an attack vector.
Questions for audit:
- Is there a timelock on upgrades? Without one, the owner can instantly upgrade to a malicious implementation
- Who holds upgrade rights? A multisig only? Governance?
- Storage layout: a new implementation can corrupt existing storage if layout rules are violated (append-only state variables, EIP-1967 proxy storage slots)
- Initialization: upgradeable contracts use initialize() instead of constructor(). Can anyone call initialize() a second time if the initializer modifier is missing?
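A minimal sketch of the correct initializer pattern (OpenZeppelin's upgradeable Initializable assumed; contract name hypothetical):

```solidity
pragma solidity ^0.8.20;

import {Initializable} from "@openzeppelin/contracts-upgradeable/proxy/utils/Initializable.sol";

// Hypothetical sketch: protecting initialize() in an upgradeable bridge.
contract BridgeV1 is Initializable {
    address public admin;

    /// @custom:oz-upgrades-unsafe-allow constructor
    constructor() {
        _disableInitializers(); // locks the implementation contract itself
    }

    function initialize(address _admin) external initializer {
        // without the initializer modifier, anyone could call this again
        // on the proxy and take over the admin role
        admin = _admin;
    }
}
```

The audit checks both halves: the initializer modifier on initialize() and _disableInitializers() in the constructor, so neither the proxy nor the bare implementation can be re-initialized.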
## Validator infrastructure audit
### Key management
For multisig-based bridges this is the most critical area. The audit covers:
Threshold adequacy. 3 of 5 is insufficient for a bridge with $100M+ TVL; a recommended minimum is 5 of 9 or 7 of 13. Wormhole runs 19 guardians with a threshold of 13 — compromising 13 keys would have been required, and that never happened (its exploit was a contract bug, not a key compromise).
Geographic and organizational diversity. All validator keys in one infrastructure (one cloud provider, one team) is a single point of failure. Ronin: Sky Mavis effectively controlled 5 of 9 keys — four of its own validators plus an Axie DAO validator that had whitelisted its signing.
HSM usage. Validator keys must live in a Hardware Security Module (AWS CloudHSM, Azure Dedicated HSM, a physical hardware wallet) — never in a hot environment.
Key rotation. Is there a documented key rotation procedure? A compromised key that is never rotated remains a threat forever.
### Relayer security
The relayer is a privileged off-chain component. Its compromise should not enable fund theft (on-chain signature verification prevents that), but it can block operations (DoS) or manipulate message ordering.
Relayer audit checklist:
- Replay protection: the relayer must not resend the same message twice (the bridge contract should reject it anyway, but the relayer should not try)
- Monitoring and alerting: does the relayer react to anomalies (sudden volume spikes, unusual addresses)?
- Failover: what happens during relayer downtime? Is there a fallback mechanism for users?
## Economic analysis
Bridges are vulnerable not only to technical attacks but also to economic ones.
Liquidity imbalance attacks. If the bridge uses a liquidity pool model, an attacker can drain the pool on one side, creating a state where the bridge accepts deposits but cannot execute withdrawals.
Price oracle manipulation. Bridges with percentage-based fees rely on price oracles for the calculation. Oracle manipulation (e.g., a flash loan against a short-window Uniswap TWAP) can distort the fee calculation.
Wrapped token depeg. A wrapped token should always trade 1:1 against the original. After a hack (mint without a corresponding lock) the wrapped token depegs. Is there a circuit breaker that pauses the bridge when a depeg is detected?
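One form such a circuit breaker can take is an on-chain invariant check. A minimal sketch — totalLocked here is assumed to be reported by a collateral oracle, which is itself a trust assumption the audit must examine:

```solidity
pragma solidity ^0.8.20;

// Hypothetical sketch: invariant-based circuit breaker. If minted supply ever
// exceeds the reported locked collateral (mint without lock), halt the bridge
// rather than keep honoring withdrawals.
contract CircuitBreaker {
    uint256 public totalLocked; // source-chain collateral, per the oracle
    uint256 public totalMinted; // tracked locally on each mint/burn
    bool public paused;

    modifier whenNotPaused() {
        require(!paused, "Bridge paused");
        _;
    }

    // called by the collateral oracle (a trust assumption of this design)
    function reportLocked(uint256 amount) external {
        totalLocked = amount;
        _checkInvariant();
    }

    function _checkInvariant() internal {
        if (totalMinted > totalLocked) {
            paused = true; // depeg condition: stop the bleeding automatically
        }
    }
}
```

Pausing automatically trades liveness for safety: a false positive freezes users, but a true positive limits a hack to the amount minted before detection.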
## Audit process and deliverables
A bridge audit includes:
- Manual code review (4–8 days): every line of Solidity contracts, special attention to signature verification, replay protection, access control, upgrade mechanism
- Automated analysis (1–2 days): Slither, Mythril, Semgrep for systematic search of known patterns
- Integration testing (2–3 days): Foundry fork tests replaying attack scenarios against mainnet state
- Validator infrastructure review (2–3 days): key management architecture, operational security
- Economic model analysis (1–2 days): liquidity mechanics, oracle dependencies, circuit breakers
Deliverable: a detailed report with findings classified Critical/High/Medium/Low/Informational, PoC exploits for Critical and High findings, remediation recommendations, and re-verification after fixes.
A mandatory condition for deployment: a bridge that has not passed an audit by a specialized team (not generic smart contract auditors, but one with bridge-project experience) should not hold TVL above test amounts. History has demonstrated this all too convincingly.
Audit timeline: 2–4 weeks depending on complexity and code volume. Cost is quoted after the codebase and architecture documentation are provided.