Dune Analytics API Integration
Dune stopped being just a tool for manual browser analysis long ago. With the release of Dune API, it became possible to embed on-chain analytics directly into products: dashboards, reports, alerts — without setting up your own indexer and without writing SQL from scratch.
What Dune API really does
The API covers two main scenarios: executing a query by its ID (POST /api/v1/query/{query_id}/execute) and reading the results of the most recent execution (GET /api/v1/query/{query_id}/results). The second option is significantly cheaper in credits — if the data is fresh enough, there is no need to trigger a new execution.
import requests, time

DUNE_API_KEY = "your_api_key"
QUERY_ID = 3540604  # example: Uniswap V3 pool stats

def get_query_results(query_id: int, params: dict | None = None) -> list[dict]:
    headers = {"X-Dune-API-Key": DUNE_API_KEY}
    # Trigger a new execution with the given parameters
    execute_resp = requests.post(
        f"https://api.dune.com/api/v1/query/{query_id}/execute",
        headers=headers,
        json={"query_parameters": params or {}},
    )
    execute_resp.raise_for_status()
    execution_id = execute_resp.json()["execution_id"]
    # Poll until the execution finishes
    while True:
        status_resp = requests.get(
            f"https://api.dune.com/api/v1/execution/{execution_id}/status",
            headers=headers,
        )
        state = status_resp.json()["state"]
        if state == "QUERY_STATE_COMPLETED":
            break
        if state in ("QUERY_STATE_FAILED", "QUERY_STATE_CANCELLED"):
            raise RuntimeError(f"Query failed: {status_resp.json()}")
        time.sleep(2)
    # Fetch the rows of the finished execution
    results = requests.get(
        f"https://api.dune.com/api/v1/execution/{execution_id}/results",
        headers=headers,
    )
    return results.json()["result"]["rows"]
Typical execution time ranges from about 5 seconds to several minutes. For production systems that is unacceptable as a synchronous call — you need either cached results or background refreshes.
Parameterized queries and caching
The Dune API's real strength is parameterization. A query can be made generic in the browser via the {{param}} syntax, with values supplied through the API. One SQL query can then cover, for example, any ERC-20 token:
SELECT
    date_trunc('day', evt_block_time) AS day,
    sum(value / 1e18) AS volume
FROM erc20_ethereum.evt_Transfer
WHERE contract_address = {{token_address}}
    AND evt_block_time >= now() - interval '{{days}}' day
GROUP BY 1
ORDER BY 1 DESC
Calling with parameters:
results = get_query_results(
    query_id=MY_QUERY_ID,
    params={"token_address": "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2", "days": "30"},
)
Caching is mandatory: credits are spent on every execution, not on reads. A typical refresh schedule for most dashboards:
- Last 24 hours data: update once per hour via cron
- Historical data (>7 days): update once per day
- Results stored in Redis or PostgreSQL with TTL
The results response also includes execution_started_at — use it as the cache timestamp to show users how fresh the data is.
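The schedule above can be sketched as a thin TTL-cache layer. The sketch below keeps the cache in memory for illustration — TTLCache and cached_query are hypothetical names, and in production the get/set pair would be backed by Redis (redis-py's get/setex) as the text suggests:

```python
import json
import time


class TTLCache:
    """Minimal in-memory TTL cache; a stand-in for Redis get/setex."""

    def __init__(self):
        self._store = {}  # key -> (expires_at, serialized value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.time() >= expires_at:
            del self._store[key]  # expired: drop and miss
            return None
        return value

    def set(self, key, value, ttl_seconds):
        self._store[key] = (time.time() + ttl_seconds, value)


def cached_query(cache, query_id, params, ttl_seconds, fetch):
    """Return cached rows if still fresh; otherwise call fetch() and cache the result."""
    key = f"dune:{query_id}:{json.dumps(params, sort_keys=True)}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    rows = fetch(query_id, params)  # e.g. get_query_results
    cache.set(key, json.dumps(rows), ttl_seconds)
    return rows
```

With this wrapper, the 24-hour view would use ttl_seconds=3600 and historical views ttl_seconds=86400, matching the schedule above.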
Limitations and workarounds
Rate limits: free plan — 40 requests per month, Plus — 2,000, Premium — 15,000+. For production with more than a handful of users, Plus is the minimum.
Response size: by default up to 25,000 rows are returned. For larger datasets, paginate with the limit and offset parameters on the results endpoint.
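The pagination loop can be sketched as follows. fetch_page here is a hypothetical wrapper around GET /execution/{execution_id}/results?limit=...&offset=... that returns the rows of one page; the loop stops on the first short page:

```python
def fetch_all_rows(fetch_page, page_size=25000):
    """Collect all rows of a large result set page by page.

    fetch_page(limit, offset) is assumed to return a list of row dicts.
    """
    rows, offset = [], 0
    while True:
        page = fetch_page(page_size, offset)
        rows.extend(page)
        if len(page) < page_size:  # short page => last page
            break
        offset += page_size
    return rows
```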
Latency: GET /query/{query_id}/results without a rerun returns cached results almost instantly. Use this endpoint for read-heavy integrations and trigger new executions only on a schedule.
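A minimal sketch of that read path, assuming the latest-results endpoint described above; the session parameter is an illustrative addition to make the function testable without network access:

```python
import requests


def get_latest_results(query_id: int, api_key: str, session=requests):
    """Read the most recent execution's rows without triggering a new run
    (and without spending execution credits)."""
    resp = session.get(
        f"https://api.dune.com/api/v1/query/{query_id}/results",
        headers={"X-Dune-API-Key": api_key},
    )
    resp.raise_for_status()
    body = resp.json()
    # execution_started_at tells you how stale the cached result is
    return body["result"]["rows"], body.get("execution_started_at")
```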
Going from zero to a working dashboard with caching takes 1–2 days of integration work.