Historical DEX trading data

The following datasets are available for historical DEX trading data. Sign up for a free API key to download the data.

Read the documentation on how to get started with the Trading Strategy Python library for algorithmic trading.
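If you prefer to access the data through the client library rather than downloading files by hand, a session typically starts by creating a client with your API key and fetching the exchange and pair universes. The sketch below is a minimal example, not a definitive recipe: it assumes the `trading-strategy` PyPI package and its `Client.create_jupyter_client()`, `fetch_exchange_universe()` and `fetch_pair_universe()` entry points, so check the linked documentation for the current API.

```python
# Minimal sketch of getting started with the Trading Strategy Python client.
# Assumes the `trading-strategy` package (pip install trading-strategy);
# method names reflect the client API at the time of writing - see the docs.
from tradingstrategy.client import Client

# In a notebook environment this prompts for and caches your free API key
client = Client.create_jupyter_client()

# Dataset "exchange_universe": all tracked decentralised exchanges (JSON)
exchange_universe = client.fetch_exchange_universe()
print(f"Exchanges: {exchange_universe.get_exchange_count()}")

# Dataset "pair_universe": all trading pairs (Parquet, returned as a PyArrow table)
pairs_table = client.fetch_pair_universe()
print(f"Trading pairs: {pairs_table.num_rows:,}")
```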

Available datasets

| Name | Tag | Entry count (k) | Size (MBytes) | Format | Last updated | Links |
|------|-----|----------------:|--------------:|--------|--------------|-------|
| Decentralised exchanges | exchange_universe | 4 | 3 | JSON | | Documentation, Download |
| Trading pairs | pair_universe | 173 | 22 | Parquet | | Documentation, Download |
| OHLCV candles, 1 minute | candles_1m | 488,350 | 28,046 | Parquet | | Documentation, Download |
| OHLCV candles, 5 minutes | candles_5m | 290,941 | 16,816 | Parquet | | Documentation, Download |
| OHLCV candles, 15 minutes | candles_15m | 184,132 | 10,750 | Parquet | | Documentation, Download |
| OHLCV candles, 1 hour | candles_1h | 92,291 | 5,446 | Parquet | | Documentation, Download |
| OHLCV candles, 4 hours | candles_4h | 40,965 | 2,447 | Parquet | | Documentation, Download |
| OHLCV candles, daily | candles_1d | 12,280 | 756 | Parquet | | Documentation, Download |
| OHLCV candles, weekly | candles_7d | 2,885 | 186 | Parquet | | Documentation, Download |
| OHLCV candles, monthly | candles_30d | 1,008 | 67 | Parquet | | Documentation, Download |
| XY Liquidity, 1 minute | liquidity_1m | 509,068 | 20,030 | Parquet | | Documentation, Download |
| XY Liquidity, 5 minutes | liquidity_5m | 300,371 | 11,787 | Parquet | | Documentation, Download |
| XY Liquidity, 15 minutes | liquidity_15m | 189,450 | 7,488 | Parquet | | Documentation, Download |
| XY Liquidity, 1 hour | liquidity_1h | 95,013 | 3,784 | Parquet | | Documentation, Download |
| XY Liquidity, 4 hours | liquidity_4h | 42,523 | 1,715 | Parquet | | Documentation, Download |
| XY Liquidity, daily | liquidity_1d | 13,009 | 550 | Parquet | | Documentation, Download |
| XY Liquidity, weekly | liquidity_7d | 3,132 | 145 | Parquet | | Documentation, Download |
| XY Liquidity, monthly | liquidity_30d | 1,096 | 55 | Parquet | | Documentation, Download |
| Top momentum, daily | top_movers_24h | 0.446 | 0.525 | JSON | | Documentation, Download |
| AAVE v3 supply and borrow rates | aave_v3 | 3,904 | 343 | Parquet | | Documentation, Download |
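The candle and liquidity datasets above map to time bucket values in the Python client, so they can also be fetched programmatically instead of downloading the Parquet files manually. The snippet below is a sketch under that assumption; `fetch_all_candles()` and `TimeBucket` are part of the client library, but verify the exact names and return types against the linked documentation.

```python
# Sketch: fetching one of the OHLCV candle datasets listed above through the
# Python client. The TimeBucket values mirror the dataset tags
# (candles_1m -> TimeBucket.m1, candles_4h -> TimeBucket.h4, and so on).
from tradingstrategy.client import Client
from tradingstrategy.timebucket import TimeBucket

client = Client.create_jupyter_client()

# Downloads and caches the full "candles_4h" dataset as a PyArrow table
candles = client.fetch_all_candles(TimeBucket.h4)
print(f"Loaded {candles.num_rows:,} 4-hour candles")
```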

Data logistics

Datasets are distributed in the Parquet file format, which is designed for data research. Parquet is a columnar data format for high-performance in-memory analytics, supported by the Apache Arrow project.
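Because Parquet is columnar and carries its own metadata, you can inspect a downloaded dataset's schema and row count without loading it into memory. A small sketch using PyArrow; the filename is a placeholder for whichever dataset you downloaded.

```python
# Inspect a downloaded Parquet dataset without reading it into RAM.
# "candles-1h.parquet" is a placeholder filename for a downloaded dataset.
import pyarrow.parquet as pq

pf = pq.ParquetFile("candles-1h.parquet")
print(pf.schema_arrow)                       # column names and types
print(f"Rows: {pf.metadata.num_rows:,}")
print(f"Row groups: {pf.metadata.num_row_groups}")
```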

Datasets are large. They are compressed with Parquet's built-in Snappy compression and may be considerably larger when expanded into RAM. We expect you to download a dataset, cache the resulting file on local disk, and perform your own strategy-specific trading pair filtering before using the data, as sketched below. Uncompressed one-minute candle data takes several gigabytes of memory.
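For example, instead of loading the full one-minute candle dataset, you can push the trading pair filter and column selection down to the Parquet reader so only the needed rows are expanded into RAM. A sketch with PyArrow; the filename, pair ids and column names are illustrative placeholders, so match them to the candle schema described in the documentation.

```python
# Sketch: strategy-specific filtering before use, so only the needed pairs
# and columns are expanded into RAM. Filename and pair ids are placeholders.
import pyarrow.parquet as pq

MY_PAIR_IDS = [1, 2, 3]  # internal pair ids your strategy trades

table = pq.read_table(
    "candles-1m.parquet",
    columns=["pair_id", "timestamp", "open", "high", "low", "close", "volume"],
    filters=[("pair_id", "in", MY_PAIR_IDS)],
)
df = table.to_pandas()
print(f"Filtered candles: {len(df):,} rows")
```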