Custom Analytics Dashboards Development on Dune
Dune Analytics is a SQL engine on top of decoded blockchain data. If you can write SQL and understand the structure of Ethereum transactions, you can build analytics for any protocol without running your own infrastructure. The complexity isn't in the tool: it's in understanding how blockchain data maps onto a relational model, and in writing queries that return results in seconds rather than minutes.
Data Structure in Dune
Dune works with two data levels:
Raw tables — raw chain data:
- ethereum.transactions — all transactions (from, to, value, gas, data...)
- ethereum.logs — all events (address, topics[0..3], data)
- ethereum.traces — internal transactions (call traces)
- ethereum.blocks — blocks
Decoded tables — decoded events of specific protocols. If a contract's ABI is loaded into Dune, all of its events are accessible as normalized tables:
- uniswap_v3_ethereum.UniswapV3Pool_evt_Swap — all Uniswap V3 swaps
- aave_v3_ethereum.Pool_evt_LiquidationCall — Aave liquidations
- erc20_ethereum.evt_Transfer — all ERC-20 transfers
Always prefer decoded tables — they're faster and don't require manual data parsing.
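For instance, counting a day of USDC Transfer events from raw logs means filtering on the event signature's topic0 hash and decoding payloads by hand, while the decoded table exposes the same events with named, typed columns (a sketch in DuneSQL syntax):

-- Raw: Transfer events are logs whose topic0 is keccak256("Transfer(address,address,uint256)")
SELECT COUNT(*)
FROM ethereum.logs
WHERE contract_address = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48 -- USDC
  AND topic0 = 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef
  AND block_time >= now() - interval '1' day

-- Decoded: same events, no manual topic filtering or ABI decoding
SELECT COUNT(*)
FROM erc20_ethereum.evt_Transfer
WHERE contract_address = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48
  AND evt_block_time >= now() - interval '1' day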
Writing Efficient Queries
Main Antipatterns
Full scan without date partitioning — the most common cause of timeouts:
-- BAD: full scan of all history
SELECT date_trunc('day', block_time), count(*)
FROM ethereum.transactions
WHERE "to" = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48 -- USDC contract
GROUP BY 1
-- GOOD: always filter by block_time
SELECT date_trunc('day', block_time), count(*)
FROM ethereum.transactions
WHERE "to" = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48
  AND block_time >= date_trunc('day', now()) - interval '90' day
GROUP BY 1
Excessive JOINs on large tables — erc20_ethereum.evt_Transfer contains billions of rows. JOIN without strict filtering kills the query:
-- BAD: JOIN over the full transfer history
SELECT t."from", SUM(t.value)
FROM erc20_ethereum.evt_Transfer t
JOIN my_users u ON t."from" = u.address
WHERE t.contract_address = 0xdAC17F958D2ee523a2206206994597C13D831ec7 -- USDT
GROUP BY 1
-- GOOD: add a time range
SELECT t."from", SUM(t.value / 1e6) AS usdt_sent
FROM erc20_ethereum.evt_Transfer t
JOIN my_users u ON t."from" = u.address
WHERE t.contract_address = 0xdAC17F958D2ee523a2206206994597C13D831ec7
  AND t.evt_block_time >= now() - interval '30' day
GROUP BY 1
ORDER BY 2 DESC
Patterns for DeFi Analytics
TVL (Total Value Locked) — requires tracking token balances on the contract:
-- Protocol TVL via Deposit/Withdraw events
-- NB: with a 180-day window this is cumulative net flow;
-- for absolute TVL, scan from the protocol's launch
WITH deposits AS (
    SELECT
        date_trunc('day', d.evt_block_time) AS day,
        d.token,
        SUM(d.amount / POWER(10, t.decimals)) AS amount
    FROM protocol_ethereum.Pool_evt_Deposit d
    JOIN tokens.erc20 t
      ON d.token = t.contract_address AND t.blockchain = 'ethereum'
    WHERE d.evt_block_time >= now() - interval '180' day
    GROUP BY 1, 2
),
withdrawals AS (
    SELECT
        date_trunc('day', w.evt_block_time) AS day,
        w.token,
        -SUM(w.amount / POWER(10, t.decimals)) AS amount
    FROM protocol_ethereum.Pool_evt_Withdraw w
    JOIN tokens.erc20 t
      ON w.token = t.contract_address AND t.blockchain = 'ethereum'
    WHERE w.evt_block_time >= now() - interval '180' day
    GROUP BY 1, 2
),
daily_flows AS (
    SELECT day, token, SUM(amount) AS net_flow
    FROM (SELECT * FROM deposits UNION ALL SELECT * FROM withdrawals)
    GROUP BY 1, 2
)
SELECT
    day,
    token,
    SUM(net_flow) OVER (PARTITION BY token ORDER BY day) AS cumulative_tvl
FROM daily_flows
ORDER BY day DESC, token
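To express TVL in dollars, the per-token balances can be joined with the Spellbook price feed. A sketch, assuming the query above is wrapped in a CTE with the hypothetical name tvl_by_token:

-- Dollar-valued TVL: join per-token balances with prices.usd
-- (prices.usd is minute-granular; average it per day)
WITH daily_prices AS (
    SELECT
        date_trunc('day', minute) AS day,
        contract_address,
        AVG(price) AS price
    FROM prices.usd
    WHERE blockchain = 'ethereum'
      AND minute >= now() - interval '180' day
    GROUP BY 1, 2
)
SELECT
    b.day,
    SUM(b.cumulative_tvl * p.price) AS tvl_usd
FROM tvl_by_token b
JOIN daily_prices p
  ON b.token = p.contract_address
 AND b.day = p.day
GROUP BY 1
ORDER BY 1 DESC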
DAU (Daily Active Users) — count unique addresses:
SELECT
date_trunc('day', block_time) AS day,
COUNT(DISTINCT "from") AS dau,
COUNT(*) AS transactions,
CAST(COUNT(*) AS double) / COUNT(DISTINCT "from") AS avg_txs_per_user
FROM ethereum.transactions
WHERE "to" IN (
SELECT DISTINCT contract_address FROM protocol_ethereum.Pool_evt_Swap
)
AND block_time >= now() - interval '90' day
AND success = TRUE
GROUP BY 1
ORDER BY 1
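The same transaction set can split DAU into new vs. returning users by comparing each address's activity against its first-seen date. A sketch (note that computing first-seen requires scanning full history, so this query is noticeably heavier than plain DAU):

-- New vs. returning users per day
WITH user_txs AS (
    SELECT "from" AS addr, date_trunc('day', block_time) AS day
    FROM ethereum.transactions
    WHERE "to" IN (SELECT DISTINCT contract_address FROM protocol_ethereum.Pool_evt_Swap)
      AND success = TRUE
),
first_seen AS (
    SELECT addr, MIN(day) AS first_day
    FROM user_txs
    GROUP BY 1
)
SELECT
    u.day,
    COUNT(DISTINCT CASE WHEN f.first_day = u.day THEN u.addr END) AS new_users,
    COUNT(DISTINCT CASE WHEN f.first_day < u.day THEN u.addr END) AS returning_users
FROM user_txs u
JOIN first_seen f ON u.addr = f.addr
WHERE u.day >= now() - interval '90' day
GROUP BY 1
ORDER BY 1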
Revenue / Fee analytics:
-- Fee revenue from Swap events and each pool's fee tier
-- (fee is in hundredths of a bip, so fee / 1e6 is the fraction)
SELECT
    date_trunc('week', s.evt_block_time) AS week,
    SUM(
        CASE
            WHEN p.token0 = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48 -- USDC
            THEN ABS(s.amount0) / 1e6 * p.fee / 1e6
            ELSE ABS(s.amount1) / 1e6 * p.fee / 1e6
        END
    ) AS fees_usd
FROM uniswap_v3_ethereum.UniswapV3Pool_evt_Swap s
JOIN uniswap_v3_ethereum.Factory_evt_PoolCreated p
    ON s.contract_address = p.pool
WHERE s.evt_block_time >= now() - interval '52' week
  AND (p.token0 = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48
       OR p.token1 = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48)
GROUP BY 1
ORDER BY 1
Working with Spellbook (Dune V2)
Spellbook is a dbt project with reusable data models. Instead of rewriting decoding logic every time, use the curated tables:
-- prices.usd — normalized USD prices for all tokens by hour
-- dex.trades — all DEX swaps in unified format (Uniswap, Curve, Balancer...)
-- tokens.erc20 — symbols, decimals of all ERC-20 tokens
SELECT
    dt.block_date AS day,
    SUM(dt.amount_usd) AS volume_usd,
    COUNT(DISTINCT dt.taker) AS unique_traders,
    COUNT(*) AS trade_count
FROM dex.trades dt
WHERE dt.project = 'uniswap'
  AND dt.version = '3'
  AND dt.blockchain = 'ethereum'
  AND dt.block_date >= current_date - interval '30' day
GROUP BY 1
ORDER BY 1 DESC
The dex.trades table already contains normalized USD amounts — no need to join with price feeds manually.
Dashboard: Architecture and UX
A good Dune dashboard is a product, not a pile of charts. Practical principles:
Metrics hierarchy: top-level KPI (TVL, Volume, Users) → drill-down (by networks, tokens, time periods) → details (top addresses, specific transactions).
Parameters for interactivity: Dune supports parameters in queries ({{token_address}}, {{days_back}}). One dashboard can then serve different assets:
SELECT *
FROM dex.trades
WHERE token_bought_address = {{token_address}}
  AND block_date >= current_date - interval '{{days_back}}' day
Cacheable queries: heavy historical queries (more than ~30 days of history) are better split into:
- Historical snapshot (recalculated once a day via scheduled query)
- Delta for last 24h (fast, updates often)
Combine them via UNION ALL in the final query, so the user sees up-to-date data without waiting on a multi-minute recalculation.
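In DuneSQL, one query can read another's saved results as a view named query_<id>. A sketch of the split, with 1234567 as a hypothetical id of the scheduled historical query:

-- Historical part: a scheduled query (hypothetical id 1234567),
-- recalculated once a day, covering everything before today
SELECT day, volume_usd
FROM query_1234567
WHERE day < current_date

UNION ALL

-- Delta: only today's trades, cheap to recompute on every refresh
SELECT block_date AS day, SUM(amount_usd) AS volume_usd
FROM dex.trades
WHERE block_date >= current_date
GROUP BY 1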
What's Included
- Protocol analysis and identifying key metrics for dashboard
- SQL query writing with execution time optimization
- Visualization: chart type selection, axis configuration, colors, legends
- Query parameterization for interactivity
- Update schedule setup for heavy queries
- Documentation of calculation logic for team







