Sybil-Resistant Airdrop System Development
A Sybil attack on an airdrop is not exotic; it is the norm. Any airdrop announced in advance attracts a bot army before the snapshot moment. After Optimism's 2022 distribution, where a significant share of recipients turned out to be clusters of Sybil wallets, the standard for serious distributions changed: Sybil resistance is now a mandatory component, not an optional one.
Anatomy of a Sybil Attack on an Airdrop
An attacker creates N wallets, imitates the eligibility criteria on each, claims N shares of the airdrop, and sweeps everything to a main wallet. The attack's viability depends on the cost of imitating one wallet versus the expected payout per wallet.
Three levels of Sybil attack, by complexity:
Level 1: address farming. Create wallets and perform minimal actions. Cheap and massive; works against simple criteria like "at least one transaction on the network."
Level 2: activity simulation. Scripted protocol interactions: swaps, bridging, liquidity provision. Costs more (gas plus time); works against transaction-count criteria.
Level 3: social graph manipulation. Creating links between accounts and imitating organic behavior. Expensive, but used when airdrop sums are large.
Key observation: no method gives 100% protection. The goal is to raise the cost of a Sybil attack above its expected profit.
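The economics can be made concrete with a back-of-the-envelope model (the function and its parameters are illustrative, not from any protocol):

```python
def attack_ev(n_wallets: int, cost_per_wallet: float,
              payout_per_wallet: float, detection_rate: float) -> float:
    """Expected profit of a Sybil farm: only undetected wallets are paid,
    but every wallet costs gas and setup time."""
    surviving = n_wallets * (1 - detection_rate)
    return surviving * payout_per_wallet - n_wallets * cost_per_wallet

# 1000 wallets at $2 each, $50 expected payout per wallet:
print(attack_ev(1000, 2, 50, 0.30))  # weak filtering: clearly profitable
print(attack_ev(1000, 2, 50, 0.99))  # strong filtering: the farm loses money
```

Raising either the per-wallet cost (stricter criteria) or the detection rate (better clustering) flips the sign of the expected value, which is exactly the defender's goal.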
On-chain Analysis: Address Clustering
Transaction graph and common funder detection
The most basic and most effective method is analyzing the wallet's funding source. Sybil wallets are almost always funded from a single address or via a single CEX withdrawal.
```python
import networkx as nx
from collections import defaultdict

def build_funding_graph(addresses: list[str], provider) -> nx.DiGraph:
    """
    Build a graph of who funded whom.
    addresses: the list of eligible addresses.
    """
    G = nx.DiGraph()
    for addr in addresses:
        # Get the first incoming transaction (initial funding)
        first_tx = provider.get_first_incoming_tx(addr)
        if first_tx:
            funder = first_tx['from']
            G.add_edge(funder, addr, tx=first_tx['hash'])
    return G

def detect_sybil_clusters(G: nx.DiGraph, threshold: int = 5) -> list[list[str]]:
    """
    If one address funded >= threshold eligible addresses, flag them as a cluster.
    """
    clusters = []
    for node in G.nodes():
        successors = list(G.successors(node))
        if len(successors) >= threshold:
            clusters.append(successors)
    return clusters
```
In practice the threshold is tuned per airdrop. For a large airdrop with thousands of addresses, 5 is fairly conservative; for a small one, 3 is sufficient.
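The clustering rule itself is tiny and can be exercised on synthetic data without a live provider; a toy sketch with made-up addresses and a plain dict in place of the graph:

```python
from collections import defaultdict

# Synthetic funding edges: one farmer seeds five eligible wallets,
# one organic wallet is funded from a CEX hot wallet
edges = [("0xFARMER", f"0xwallet{i}") for i in range(5)] + [("0xCEX", "0xorganic")]

funded = defaultdict(list)
for funder, wallet in edges:
    funded[funder].append(wallet)

THRESHOLD = 5
clusters = [wallets for wallets in funded.values() if len(wallets) >= THRESHOLD]
print(clusters)  # one cluster: the five farmer-funded wallets; the CEX user is untouched
```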
Temporal Pattern Analysis
Sybil wallets are created and used in batches. The characteristic pattern: N wallets active in identical time windows, with identical pauses between transactions.
```python
def detect_temporal_clusters(
    addresses: list[str],
    txs_by_address: dict,
    time_window_seconds: int = 300,
    min_cluster_size: int = 3
) -> list[list[str]]:
    """
    Find addresses whose transactions fall into identical
    time windows suspiciously often.
    """
    # Group transactions into time slots
    time_slots = defaultdict(list)
    for addr, txs in txs_by_address.items():
        for tx in txs:
            slot = tx['timestamp'] // time_window_seconds
            time_slots[slot].append(addr)

    # Build a co-activity graph
    co_activity = defaultdict(lambda: defaultdict(int))
    for slot, addrs in time_slots.items():
        for i, a in enumerate(addrs):
            for b in addrs[i + 1:]:
                co_activity[a][b] += 1
                co_activity[b][a] += 1

    # Clusters with high co-activity
    clusters = []
    visited = set()
    for addr in addresses:
        if addr in visited:
            continue
        cluster = [addr]
        for other, count in co_activity[addr].items():
            if count >= 3 and other not in visited:  # co-occurred 3+ times
                cluster.append(other)
        if len(cluster) >= min_cluster_size:
            clusters.append(cluster)
            visited.update(cluster)
    return clusters
```
Gas price and nonce patterns
Scripts that create Sybil wallets often reuse identical parameters. Identical gasPrice / maxFeePerGas across transactions from different wallets in the same block is a strong signal. A nonce of 0 or 1 across many addresses indicates wallets created recently, specifically for the airdrop.
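Both signals are easy to check over pre-fetched transaction data; a hedged sketch (the field names 'block', 'gas_price', and 'from' are assumptions about the indexer's output format):

```python
from collections import defaultdict

def flag_gas_price_twins(txs: list[dict], min_group: int = 3) -> list[list[str]]:
    """Groups of distinct senders sharing the exact same gas price within
    one block: a strong sign of one script behind many wallets."""
    groups = defaultdict(set)
    for tx in txs:
        groups[(tx["block"], tx["gas_price"])].add(tx["from"])
    return [sorted(senders) for senders in groups.values() if len(senders) >= min_group]

def flag_fresh_wallets(nonces: dict[str, int], max_nonce: int = 1) -> list[str]:
    """Wallets whose nonce is still 0 or 1 were likely created for this airdrop."""
    return sorted(addr for addr, nonce in nonces.items() if nonce <= max_nonce)

txs = [
    {"block": 100, "gas_price": 31_000_000_000, "from": "0xa1"},
    {"block": 100, "gas_price": 31_000_000_000, "from": "0xa2"},
    {"block": 100, "gas_price": 31_000_000_000, "from": "0xa3"},
    {"block": 100, "gas_price": 28_500_000_000, "from": "0xorganic"},
]
print(flag_gas_price_twins(txs))  # [['0xa1', '0xa2', '0xa3']]
```

Neither flag is conclusive on its own (every wallet legitimately starts at nonce 0), which is why these signals feed the scoring model rather than trigger outright bans.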
Identity-based Verification
Proof of Humanity and Worldcoin
The most reliable form of Sybil resistance is proof of a person's uniqueness. Two main protocols:
Proof of Humanity (PoH): video registration plus social verification (a vouch from existing participants). The list of verified humans is stored on-chain (on Gnosis Chain). It scales slowly, but the quality is high.
Worldcoin: biometric verification via an iris scan on Orb devices. It issues a World ID, a ZK proof of a person's uniqueness that does not reveal the biometrics.
```solidity
// ByteHasher (from Worldcoin's world-id-contracts) provides hashToField()
import { ByteHasher } from "./helpers/ByteHasher.sol";

interface IWorldID {
    function verifyProof(
        uint256 root,
        uint256 groupId,
        uint256 signalHash,
        uint256 nullifierHash,
        uint256 externalNullifierHash,
        uint256[8] calldata proof
    ) external view;
}

contract SybilResistantAirdrop {
    using ByteHasher for bytes;

    IWorldID public immutable worldId;
    uint256 public immutable groupId = 1;
    uint256 public immutable externalNullifier;

    // nullifierHash is tied to a specific action (claiming this airdrop):
    // one person = one nullifierHash = one claim
    mapping(uint256 => bool) public usedNullifiers;

    constructor(IWorldID _worldId, string memory appId, string memory action) {
        worldId = _worldId;
        externalNullifier = abi.encodePacked(
            abi.encodePacked(appId).hashToField(),
            abi.encodePacked(action).hashToField()
        ).hashToField();
    }

    function claim(
        address recipient,
        uint256 root,
        uint256 nullifierHash,
        uint256[8] calldata proof
    ) external {
        require(!usedNullifiers[nullifierHash], "Already claimed");
        worldId.verifyProof(
            root,
            groupId,
            abi.encodePacked(recipient).hashToField(),
            nullifierHash,
            externalNullifier,
            proof
        );
        usedNullifiers[nullifierHash] = true;
        _distributeTokens(recipient); // project-specific transfer logic, defined elsewhere
    }
}
```
Key property: the nullifierHash is unique per (person, action) pair. One person cannot obtain two different nullifierHashes for the same action; the ZK proof guarantees this.
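Conceptually, the nullifier behaves like a deterministic hash of the identity secret and the action id. A simplified sketch (the real construction lives inside the Semaphore ZK circuit, uses Poseidon rather than SHA-256, and the secret never leaves the user's device):

```python
import hashlib

def nullifier_hash(identity_secret: bytes, external_nullifier: bytes) -> bytes:
    # Same person + same action always yields the same value, so a second
    # claim is rejected by the usedNullifiers mapping; a different action
    # yields a fresh, unlinkable value.
    return hashlib.sha256(identity_secret + external_nullifier).digest()

alice = b"alice-identity-secret"
assert nullifier_hash(alice, b"airdrop-1") == nullifier_hash(alice, b"airdrop-1")
assert nullifier_hash(alice, b"airdrop-1") != nullifier_hash(alice, b"airdrop-2")
```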
Gitcoin Passport
Gitcoin Passport aggregates signals from different sources: a GitHub account, Twitter, an ENS name, BrightID verification, Proof of Humanity, POAP collections. Each stamp contributes to a score; the higher the score, the higher the probability that a real person is behind the address.
```typescript
// Check a Gitcoin Passport score via the Scorer API
const scorerId = process.env.GITCOIN_SCORER_ID!;

async function getPassportScore(address: string): Promise<number> {
  const response = await fetch(
    `https://api.scorer.gitcoin.co/registry/score/${scorerId}/${address}`,
    { headers: { 'X-API-KEY': process.env.GITCOIN_API_KEY! } }
  );
  const data = await response.json();
  return parseFloat(data.score ?? '0');
}

// Score threshold for eligibility
const MINIMUM_PASSPORT_SCORE = 15.0;

async function buildEligibleList(candidates: string[]): Promise<string[]> {
  const scores = await Promise.all(
    candidates.map(async (addr) => ({
      address: addr,
      score: await getPassportScore(addr)
    }))
  );
  return scores
    .filter(({ score }) => score >= MINIMUM_PASSPORT_SCORE)
    .map(({ address }) => address);
}
```
Multi-Level Scoring System
In practice, no single method is used in isolation. The best approach is scoring: each address earns points for different signals, and final eligibility is determined by the sum.
| Criterion | Points | Rationale |
|---|---|---|
| World ID verification | +50 | Person uniqueness |
| Gitcoin Passport ≥ 20 | +30 | Aggregated identity |
| ENS name | +10 | Long-term on-chain identity |
| Wallet age > 1 year | +10 | Not created for airdrop |
| Activity in multiple protocols | +15 | Organic usage pattern |
| No funding graph matches | +5 | Not part of Sybil cluster |
| Sum for eligibility | ≥ 40 | Configured per project |
Addresses that land in a Sybil cluster during graph analysis receive a penalty or full disqualification regardless of their score.
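The scoring table translates directly into code; a sketch with illustrative signal names (the thresholds mirror the table above):

```python
ELIGIBILITY_THRESHOLD = 40

def score_address(signals: dict) -> int:
    """Sum points per the scoring table; a Sybil-cluster hit disqualifies outright."""
    if signals.get("in_sybil_cluster"):
        return 0  # full disqualification regardless of other signals
    score = 0
    if signals.get("world_id_verified"):
        score += 50
    if signals.get("passport_score", 0) >= 20:
        score += 30
    if signals.get("has_ens"):
        score += 10
    if signals.get("wallet_age_days", 0) > 365:
        score += 10
    if signals.get("protocols_used", 0) >= 2:
        score += 15
    if not signals.get("funding_graph_match"):
        score += 5
    return score

organic = {"passport_score": 25, "has_ens": True, "wallet_age_days": 900}
farmed = {"wallet_age_days": 10, "funding_graph_match": True, "in_sybil_cluster": True}
assert score_address(organic) == 55   # 30 + 10 + 10 + 5
assert score_address(organic) >= ELIGIBILITY_THRESHOLD
assert score_address(farmed) == 0
```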
Appeal Mechanism
Any automatic filtering produces false positives, so an appeal process is needed:
- On-chain appeal: the address sends a transaction with evidence (a Proof of Humanity proof, a signature from the PoH registration, an explanation of the wallet links)
- Timelock: a fixed appeal submission period (e.g., 14 days after the snapshot is published)
- Multisig review: the team reviews appeals via multisig, with decisions recorded transparently on-chain
- Public list: all filtered addresses and the reasons are published before the claim period starts
Transparency is critical: the community must see that the filtering isn't arbitrary.
Distribution Contract Architecture
```solidity
import { MerkleProof } from "@openzeppelin/contracts/utils/cryptography/MerkleProof.sol";
import { ECDSA } from "@openzeppelin/contracts/utils/cryptography/ECDSA.sol";
import { MessageHashUtils } from "@openzeppelin/contracts/utils/cryptography/MessageHashUtils.sol";
import { IERC20 } from "@openzeppelin/contracts/token/ERC20/IERC20.sol";
import { SafeERC20 } from "@openzeppelin/contracts/token/ERC20/utils/SafeERC20.sol";
import { ReentrancyGuard } from "@openzeppelin/contracts/utils/ReentrancyGuard.sol";

contract SybilResistantDistributor is ReentrancyGuard {
    using SafeERC20 for IERC20;
    using ECDSA for bytes32;
    using MessageHashUtils for bytes32;

    IERC20 public token;
    bytes32 public merkleRoot;
    bool public requirePassport;
    uint256 public minPassportScore;
    mapping(address => bool) public claimed;
    // Addresses blocked by Sybil analysis
    mapping(address => bool) public sybilBlacklist;
    // The Passport score is stored off-chain and attested by an oracle signature
    address public scoreOracle;

    event Claimed(address indexed account, uint256 amount);

    struct ClaimData {
        uint256 amount;
        bytes32[] merkleProof;
        // Optionally: an oracle signature confirming the Passport score
        bytes oracleSignature;
        uint256 passportScore;
    }

    function claim(ClaimData calldata data) external nonReentrant {
        require(!claimed[msg.sender], "Already claimed");
        require(!sybilBlacklist[msg.sender], "Address flagged as Sybil");

        // Verify the Passport score if required
        if (requirePassport) {
            _verifyPassportScore(msg.sender, data.passportScore, data.oracleSignature);
            require(data.passportScore >= minPassportScore, "Insufficient Passport score");
        }

        // Verify the Merkle proof (double-hashed leaf, OpenZeppelin convention)
        bytes32 leaf = keccak256(bytes.concat(
            keccak256(abi.encode(msg.sender, data.amount))
        ));
        require(
            MerkleProof.verify(data.merkleProof, merkleRoot, leaf),
            "Invalid proof"
        );

        claimed[msg.sender] = true;
        token.safeTransfer(msg.sender, data.amount);
        emit Claimed(msg.sender, data.amount);
    }

    function _verifyPassportScore(
        address user,
        uint256 score,
        bytes calldata sig
    ) internal view {
        bytes32 hash = keccak256(abi.encodePacked(user, score, block.chainid));
        bytes32 ethHash = hash.toEthSignedMessageHash();
        require(ethHash.recover(sig) == scoreOracle, "Invalid oracle signature");
    }
}
```
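Off-chain, the Merkle tree behind merkleRoot must follow the same conventions the contract checks: a double-hashed leaf and sorted pairs, as in OpenZeppelin's MerkleProof. A minimal sketch with SHA-256 standing in for keccak256 (production tooling, e.g. OpenZeppelin's merkle-tree library, uses keccak256 over ABI-encoded values):

```python
import hashlib

def _h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()  # stand-in for keccak256

def make_leaf(addr: str, amount: int) -> bytes:
    # mirrors the contract's double-hashed leaf
    return _h(_h(addr.lower().encode() + amount.to_bytes(32, "big")))

def _pair(a: bytes, b: bytes) -> bytes:
    return _h(min(a, b) + max(a, b))  # sorted pairs, as MerkleProof.verify expects

def build_tree(leaves: list[bytes]) -> list[list[bytes]]:
    levels = [sorted(leaves)]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        if len(cur) % 2:
            cur = cur + [cur[-1]]  # duplicate the last node on odd levels
        levels.append([_pair(cur[i], cur[i + 1]) for i in range(0, len(cur), 2)])
    return levels

def get_proof(levels: list[list[bytes]], leaf: bytes) -> list[bytes]:
    idx = levels[0].index(leaf)
    proof = []
    for level in levels[:-1]:
        padded = level + [level[-1]] if len(level) % 2 else level
        proof.append(padded[idx ^ 1])  # the sibling at this level
        idx //= 2
    return proof

def verify(root: bytes, leaf: bytes, proof: list[bytes]) -> bool:
    node = leaf
    for sibling in proof:
        node = _pair(node, sibling)
    return node == root

leaves = [make_leaf(f"0x{i:040x}", 100 * (i + 1)) for i in range(3)]
levels = build_tree(leaves)
root = levels[-1][0]
assert all(verify(root, lf, get_proof(levels, lf)) for lf in leaves)
```

The sorted-pair convention means the prover does not need to record left/right positions, which keeps the on-chain verification loop trivial.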
Temporal Delays as Defense
One underestimated tool is the time window between the snapshot and the eligibility announcement. If the airdrop is announced before the snapshot is taken, farming is already underway. Best practice:
- Take the snapshot without warning, or with minimal notice
- Announce the eligibility criteria after the snapshot, so attackers have no time to adapt
- Deploy the Merkle root via a timelock (24–72 hours), so the community can verify the list before claims start
Tools and Data
On-chain data analysis: Dune Analytics (ready-made queries for Sybil patterns), Nansen (wallet labeling), Arkham Intelligence (entity clustering).
Identity protocols: Worldcoin (World ID), Proof of Humanity, BrightID, Gitcoin Passport, Lens Protocol (social graph).
Technical tooling: Python + networkx for graph analysis, The Graph for indexing, Alchemy/QuickNode for bulk RPC requests.
Timeline and Scope
| Phase | Contents | Duration |
|---|---|---|
| Eligibility criteria design | Define metrics, thresholds, scoring model | 1–2 weeks |
| On-chain analysis | Collect data, clustering, filtering | 2–3 weeks |
| Smart contracts | Distributor + identity integration | 2–3 weeks |
| Off-chain infrastructure | API, Merkle tree, oracle signatures | 2–3 weeks |
| Appeal process | Contract + UI + procedure | 1–2 weeks |
| Audit and testing | Security review, test coverage | 2–3 weeks |
A minimum realistic timeline for a production-ready system is 10–16 weeks. Rushing Sybil filtering is costly reputationally: if the community spots obvious bots in the recipient list, trust in the project drops sharply.







