# Transpose API Integration
Transpose is a SQL interface over blockchain data. Instead of writing and maintaining your own indexer, you write regular SQL queries and get historical transaction, token, NFT, and DeFi protocol data. Supported: Ethereum, Polygon, Optimism, Arbitrum, Base, BSC.
When this makes sense: analytics, MVPs, internal dashboards, research. When it doesn't: real-time pipelines needing sub-second latency, custom contracts emitting non-standard events, or query volumes at which Transpose pricing becomes uneconomical.
## Working with the API
Transpose provides two interfaces: SQL API (arbitrary queries) and REST API (predefined endpoints for typical queries).
SQL API is the primary tool for non-trivial tasks:
```typescript
const TRANSPOSE_KEY = process.env.TRANSPOSE_API_KEY ?? ''

async function queryTranspose<T>(sql: string): Promise<T[]> {
  const response = await fetch('https://api.transpose.io/sql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-KEY': TRANSPOSE_KEY,
    },
    body: JSON.stringify({ sql }),
  })

  if (!response.ok) {
    const error = await response.json()
    // Include the status code so callers can distinguish 429 rate limiting
    throw new Error(`Transpose error ${response.status}: ${error.message}`)
  }

  const data = await response.json()
  return data.results as T[]
}
```
```typescript
// Example: top-10 token holders
const topHolders = await queryTranspose<{ address: string; balance: string }>(`
  SELECT owner_address AS address, balance
  FROM ethereum.token_owners
  WHERE contract_address = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'
  ORDER BY balance DESC
  LIMIT 10
`)
```
## Main tables
Transpose organizes data into schemas by blockchain (ethereum, polygon, etc.). Key tables:
| Table | Content |
|---|---|
| `ethereum.transactions` | All transactions: hash, from, to, value, gas, status |
| `ethereum.logs` | Raw event logs: address, topics, data |
| `ethereum.token_transfers` | Decoded ERC-20 transfers |
| `ethereum.nft_transfers` | ERC-721/1155 transfers |
| `ethereum.token_owners` | Current ERC-20 balances |
| `ethereum.nft_owners` | Current NFT owners |
| `ethereum.accounts` | Addresses: type (EOA/contract), ETH balance, first transaction |
## Typical query patterns
Wallet activity over a period:
```sql
SELECT
  DATE_TRUNC('day', timestamp) AS day,
  COUNT(*) AS tx_count,
  SUM(value::numeric / 1e18) AS eth_sent
FROM ethereum.transactions
WHERE from_address = '0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045'
  AND timestamp >= NOW() - INTERVAL '30 days'
GROUP BY 1
ORDER BY 1
```
Transfer history of a specific token to an address:
```sql
SELECT
  t.timestamp,
  t.transaction_hash,
  t.from_address,
  t.to_address,
  (t.quantity::numeric / POW(10, tk.decimals)) AS amount,
  tk.symbol
FROM ethereum.token_transfers t
JOIN ethereum.tokens tk ON t.contract_address = tk.contract_address
WHERE t.to_address = '0xYourAddress'
  AND t.contract_address IN (
    '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48', -- USDC
    '0xdAC17F958D2ee523a2206206994597C13D831ec7'  -- USDT
  )
ORDER BY t.timestamp DESC
LIMIT 100
```
## Limits and rate limits
Free plan: 1 request/second, 10,000 rows/request, 100 requests/day. Paid plans remove most limits, but cost scales with volume.
Practical limits to account for:
- Data lags real-time by roughly 2 minutes (not suitable for real-time systems)
- Maximum result size is 10,000 rows; for larger sets, paginate with `OFFSET`/`LIMIT`
- Complex JOINs across schemas (cross-chain) aren't supported
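The pagination pattern can be sketched as a loop that appends `LIMIT`/`OFFSET` to a base query and stops at the first short page. This is a minimal sketch; `queryAllPages` and `pagedSql` are illustrative names, and the query runner is injected so any executor (such as a `queryTranspose`-style helper) can be plugged in:

```typescript
// Build one page of a query; keep pageSize under the 10,000-row response cap.
function pagedSql(baseSql: string, pageSize: number, page: number): string {
  return `${baseSql} LIMIT ${pageSize} OFFSET ${page * pageSize}`
}

// Fetch all pages sequentially until a page comes back shorter than pageSize.
async function queryAllPages<T>(
  baseSql: string,
  runQuery: (sql: string) => Promise<T[]>, // e.g. a queryTranspose-style helper
  pageSize = 5000,
): Promise<T[]> {
  const all: T[] = []
  for (let page = 0; ; page++) {
    const rows = await runQuery(pagedSql(baseSql, pageSize, page))
    all.push(...rows)
    if (rows.length < pageSize) break // last (possibly partial) page reached
  }
  return all
}
```

Sequential paging keeps each request under the row cap and within the 1 request/second free-tier rate limit; the base query must have a stable `ORDER BY` for the pages to be consistent.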
For production API integration, cache results — most analytical queries don't require data fresher than 5-10 minutes:
```typescript
import { createHash } from 'crypto'
import { createClient } from 'redis'

const redis = createClient({ url: process.env.REDIS_URL })

async function cachedTransposeQuery<T>(sql: string, ttlSeconds = 300): Promise<T[]> {
  if (!redis.isOpen) await redis.connect()

  // Hash the full SQL text: a truncated base64 prefix would collide
  // for queries sharing a long common prefix
  const key = `transpose:${createHash('sha256').update(sql).digest('hex')}`

  const cached = await redis.get(key)
  if (cached) return JSON.parse(cached)

  const results = await queryTranspose<T>(sql)
  await redis.setEx(key, ttlSeconds, JSON.stringify(results))
  return results
}
```
Integration takes 1-2 days: setting up the API client, determining needed queries, caching, error handling with retry for 429 responses.
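The 429 handling mentioned above can be sketched as a generic retry wrapper with exponential backoff. This is a minimal sketch; `withRetry` and its attempt/delay defaults are illustrative, not Transpose-documented values, and it assumes the query helper surfaces the HTTP status code in its error message:

```typescript
// Retry a request on rate limiting (HTTP 429) with exponential backoff.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 1000,
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      const isRateLimit = err instanceof Error && err.message.includes('429')
      // Rethrow immediately on non-429 errors or once attempts are exhausted
      if (!isRateLimit || attempt >= maxAttempts) throw err
      // Backoff doubles each attempt: baseDelayMs, 2x, 4x, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)))
    }
  }
}
```

Usage would look like `const rows = await withRetry(() => queryTranspose(sql))`, keeping the retry policy out of the query helper itself.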