Flipside Crypto Integration
Flipside is an analytics platform with curated blockchain data tables: decoded contract events, DEX trades, NFT transactions, and bridge activity across 30+ networks. Unlike direct node parsing, the data is already normalized and available through a SQL interface. The main integration scenario is connecting an analytics layer to a product without running your own indexing infrastructure.
Data Access Models
ShroomDK / Flipside API. The main programmatic interface. Queries are written in SQL (Snowflake/Trino dialect), submitted via a REST API, and results are returned as JSON. Free-plan limits: 5,000 queries per month, up to 1M rows per result, and a maximum query execution time of roughly 120 seconds.
```python
import requests

API_KEY = "your_api_key"

def query_flipside(sql: str) -> list[dict]:
    response = requests.post(
        "https://api-v2.flipsidecrypto.xyz/json-rpc",
        headers={"x-api-key": API_KEY},
        json={
            "jsonrpc": "2.0",
            "method": "createQueryRun",
            "params": [{"sql": sql, "maxAgeMinutes": 10, "timeoutMinutes": 2}],
            "id": 1,
        },
    )
    response.raise_for_status()
    query_run_id = response.json()["result"]["queryRun"]["id"]
    # polling until completion
    return poll_results(query_run_id)
```
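The `poll_results` helper is left as a stub above. A minimal sketch of it follows, assuming the JSON-RPC methods `getQueryRun` and `getQueryRunResults` with the parameter shapes shown (taken from Flipside's API docs; verify against the current documentation, as they may drift):

```python
import time
import requests

BASE_URL = "https://api-v2.flipsidecrypto.xyz/json-rpc"
API_KEY = "your_api_key"

def rpc_payload(method: str, params: list) -> dict:
    # JSON-RPC 2.0 envelope shared by all Flipside API calls
    return {"jsonrpc": "2.0", "method": method, "params": params, "id": 1}

def poll_results(query_run_id: str, interval_sec: float = 2.0) -> list[dict]:
    headers = {"x-api-key": API_KEY}
    # 1. poll getQueryRun until the run reaches a terminal state
    while True:
        resp = requests.post(
            BASE_URL, headers=headers,
            json=rpc_payload("getQueryRun", [{"queryRunId": query_run_id}]),
        )
        resp.raise_for_status()
        state = resp.json()["result"]["queryRun"]["state"]
        if state == "QUERY_STATE_SUCCESS":
            break
        if state in ("QUERY_STATE_FAILED", "QUERY_STATE_CANCELED"):
            raise RuntimeError(f"query run {query_run_id} ended in state {state}")
        time.sleep(interval_sec)
    # 2. fetch the first page of results as JSON rows
    resp = requests.post(
        BASE_URL, headers=headers,
        json=rpc_payload("getQueryRunResults", [{
            "queryRunId": query_run_id,
            "format": "json",
            "page": {"number": 1, "size": 1000},
        }]),
    )
    resp.raise_for_status()
    return resp.json()["result"]["rows"]
```

For results larger than one page, increment `page.number` and keep fetching until an empty page comes back.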
Key tables for EVM networks:
| Table | Content |
|---|---|
| ethereum.core.fact_transactions | All transactions with decoded input |
| ethereum.core.fact_event_logs | Decoded contract events |
| ethereum.defi.fact_dex_swaps | Swaps on Uniswap, Curve, Balancer |
| ethereum.nft.fact_nft_sales | NFT sales (OpenSea, Blur, LooksRare) |
| ethereum.price.fact_prices_ohlc_hourly | Token OHLC prices |
| ethereum.core.fact_token_transfers | ERC-20 transfers |
A similar structure exists for Solana (solana.core.*), Polygon, Arbitrum, Base, Avalanche, and other supported networks.
A typical query — DEX trading volume by protocol over the last 7 days:
```sql
SELECT
    platform,
    COUNT(*) AS swap_count,
    SUM(amount_in_usd) AS volume_usd
FROM ethereum.defi.fact_dex_swaps
WHERE block_timestamp >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    AND amount_in_usd IS NOT NULL
GROUP BY platform
ORDER BY volume_usd DESC
LIMIT 20
```
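Because the schemas are parallel across networks, the same query can be templated by schema prefix. A minimal sketch (the `chain` argument must match one of Flipside's schema names, e.g. `ethereum`, `polygon`, `arbitrum`):

```python
def dex_volume_sql(chain: str, days: int = 7, limit: int = 20) -> str:
    # builds the DEX-volume query for any EVM schema with a parallel layout
    return (
        f"SELECT platform, COUNT(*) AS swap_count, SUM(amount_in_usd) AS volume_usd\n"
        f"FROM {chain}.defi.fact_dex_swaps\n"
        f"WHERE block_timestamp >= DATEADD('day', -{days}, CURRENT_TIMESTAMP())\n"
        f"  AND amount_in_usd IS NOT NULL\n"
        f"GROUP BY platform ORDER BY volume_usd DESC LIMIT {limit}"
    )
```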
Integration Architecture
Caching is mandatory. Flipside is not designed for real-time queries. The architectural pattern: a scheduled job pulls data every N minutes → saves it to your own DB → the frontend reads from the cache. Direct queries to the Flipside API from the user interface are bad practice: latency is 5-30 seconds and the rate limits get exhausted quickly.
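To see why the scheduled-job pattern matters, compare the free-tier budget against a polling schedule. A rough back-of-the-envelope sketch (it ignores cache hits, which do not consume the limit):

```python
MONTHLY_LIMIT = 5000  # free-plan query runs per month

def runs_per_month(interval_minutes: int) -> int:
    # one scheduled query firing every interval_minutes over a 30-day month
    return (30 * 24 * 60) // interval_minutes

print(runs_per_month(15))  # 2880: a single 15-minute cron job uses most of the budget
```

Even one query on a 15-minute schedule consumes over half the monthly allowance, so per-user queries from a frontend are not viable on the free tier.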
```typescript
// Cron job every 15 minutes
import { Flipside } from '@flipsidecrypto/sdk';

const flipside = new Flipside(
  process.env.FLIPSIDE_API_KEY!,
  'https://api-v2.flipsidecrypto.xyz',
);

async function syncDexVolume() {
  const result = await flipside.query.run({
    sql: `SELECT ... FROM ethereum.defi.fact_dex_swaps WHERE ...`,
    maxAgeMinutes: 15, // use Flipside's cache if the data is fresh enough
    timeoutMinutes: 3,
  });
  // `db` is your own storage layer (Postgres, SQLite, Redis, ...)
  await db.upsert('dex_volume_snapshots', {
    timestamp: new Date(),
    data: result.rows,
  });
}
```
Official Python SDK:

```shell
pip install flipside
```

```python
from flipside import Flipside

sdk = Flipside("your_api_key", "https://api-v2.flipsidecrypto.xyz")
result = sdk.query("SELECT * FROM ethereum.core.fact_transactions LIMIT 100")
df = result.to_pandas()  # convenient for analytics
```
The maxAgeMinutes parameter matters: if you request the same data repeatedly within that window, Flipside returns the cached result without consuming your request limit.
Limitations and Alternatives
Flipside updates its data with a 30-90 minute delay behind the current block — it is not a tool for real-time monitoring. For data fresher than one hour, use other sources: The Graph (subgraphs), Dune Analytics (a similar SQL approach but a different licensing model), or direct RPC queries.
Coverage of some protocols may be incomplete or outdated — before relying on a table, verify its data against an on-chain source for at least a few transactions.
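Such a spot check can be a field-by-field comparison once the same transaction has been fetched from both sources. A sketch (the field names are illustrative, not Flipside's exact column names):

```python
def spot_check(
    flipside_row: dict,
    onchain_row: dict,
    fields: tuple = ("tx_hash", "from_address", "to_address", "raw_amount"),
) -> list[str]:
    # returns the fields whose values disagree between Flipside and the node
    return [f for f in fields if flipside_row.get(f) != onchain_row.get(f)]
```

An empty result for a handful of sampled transactions is a reasonable smoke test before wiring a table into production dashboards.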