# Shadowell/Kairos-base-crypto
A fine-tune of Kronos-base on BTC/USDT and ETH/USDT 1-min K-lines (2024-01 ~ 2026-04) using Kairos. Architecture = Kronos + a 32-d exogenous bypass channel + a quantile return head.

This run keeps the original tokenizer, NeoQuasar/Kronos-Tokenizer-base, matching the original Kairos-small-crypto training flow. Training data comes from the public Binance Vision spot mirror, so the 5 crypto-native exogenous features (funding_rate / funding_rate_z / oi_change / basis / btc_dominance) remain zero-padded; the other 27 dimensions carry real values.
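The zero-padding scheme above can be sketched as follows. This is an illustrative layout only: the slot ordering, feature names, and `build_exog_row` helper are assumptions for the example, not the repo's actual packing code.

```python
import numpy as np

N_EXOG = 32
# The 5 crypto-native slots that stay zero-padded in this run
# (no futures/derivatives data in the Binance Vision spot mirror):
CRYPTO_SLOTS = ["funding_rate", "funding_rate_z", "oi_change", "basis", "btc_dominance"]

def build_exog_row(real_features: dict) -> np.ndarray:
    """Pack the 27 real feature values first, leave the crypto slots at zero."""
    vec = np.zeros(N_EXOG, dtype=np.float32)
    vals = list(real_features.values())
    assert len(vals) == N_EXOG - len(CRYPTO_SLOTS)  # 27 real dimensions
    vec[: len(vals)] = vals
    return vec  # trailing 5 entries remain 0.0

row = build_exog_row({f"feat_{i}": 1.0 for i in range(27)})
```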
## Results

Test set: 2026-01-01 04:16:00 ~ 2026-04-16 23:30:00 (304,710 1-min bars).
| horizon | model | hit_rate | rank_ic | ICIR |
|---|---|---|---|---|
| h1 | baseline | 50.78% | +0.025 | +0.630 |
| h1 | finetuned | 50.37% | +0.011 | +0.051 |
| h5 | baseline | 51.61% | +0.029 | +0.385 |
| h5 | finetuned | 50.95% | +0.029 | +0.351 |
| h30 | baseline | 52.49% | +0.055 | +0.325 |
| h30 | finetuned | 52.92% | +0.076 | +0.484 |
Baseline = original Kronos-base weights + randomly initialised exog / return head.
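For reference, the three metrics in the table can be computed along the following lines. This is an illustrative numpy sketch (Spearman via ranks, no tie handling, and a simple equal-bucket ICIR), not the repo's evaluation code:

```python
import numpy as np

def _rank(x: np.ndarray) -> np.ndarray:
    """Ordinal ranks; fine for continuous returns (no tie handling)."""
    return np.argsort(np.argsort(x)).astype(float)

def rank_ic(pred: np.ndarray, real: np.ndarray) -> float:
    """Spearman correlation between predicted and realised returns."""
    return float(np.corrcoef(_rank(pred), _rank(real))[0, 1])

def hit_rate(pred: np.ndarray, real: np.ndarray) -> float:
    """Share of bars where predicted and realised return signs agree."""
    return float(np.mean(np.sign(pred) == np.sign(real)))

def icir(pred: np.ndarray, real: np.ndarray, n_buckets: int = 10) -> float:
    """ICIR = mean(IC) / std(IC) over equal-sized time buckets."""
    ics = [rank_ic(p, r) for p, r in zip(np.array_split(pred, n_buckets),
                                         np.array_split(real, n_buckets))]
    return float(np.mean(ics) / np.std(ics))

# Toy data: weakly informative forecasts of a random return series
rng = np.random.default_rng(0)
real = rng.normal(size=2000)
pred = real + 2.0 * rng.normal(size=2000)
hr, ic = hit_rate(pred, real), rank_ic(pred, real)
```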
Training stopped at epoch 4; best val_ce = 2.4842.
## Usage

```python
from kairos import KronosTokenizer, KronosWithExogenous

tok = KronosTokenizer.from_pretrained("NeoQuasar/Kronos-Tokenizer-base")
model = KronosWithExogenous.from_pretrained("Shadowell/Kairos-base-crypto")
```
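The model consumes a 256-bar 1-min lookback and forecasts 30 bars ahead (see the training config below). A minimal sketch of shaping such windows with pandas; the OHLCV column names and plain-DataFrame layout here are assumptions for illustration, not the repo's exact input schema:

```python
import pandas as pd

LOOKBACK, HORIZON = 256, 30  # 256-min context, 30-min forecast

# Toy 1-min OHLCV frame standing in for real BTC/USDT bars.
idx = pd.date_range("2026-01-01", periods=LOOKBACK + HORIZON, freq="min")
df = pd.DataFrame(
    {"open": 1.0, "high": 1.0, "low": 1.0, "close": 1.0, "volume": 1.0},
    index=idx,
)

x_ctx = df.iloc[:LOOKBACK]    # 256 context bars fed to the model
y_true = df.iloc[LOOKBACK:]   # the 30 bars the model is asked to predict
```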
## Training config (preset crypto-1min)
- lookback 256 min, predict 30 min
- batch 24, OneCycleLR, early-stop patience 3
- progressive unfreeze: only last transformer block + exog bypass + return head
- tokenizer source = NeoQuasar/Kronos-Tokenizer-base
- 32-d EXOG = 24 common + 8 crypto-market features
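The early-stop rule above (patience 3 on validation loss) can be sketched as below. This is a generic implementation of the mechanic, not the repo's exact trainer code, and the val-CE history is a toy curve (only the best value mirrors this run's 2.4842):

```python
class EarlyStopper:
    """Stop after `patience` consecutive epochs without val-loss improvement."""
    def __init__(self, patience: int = 3):
        self.patience = patience
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss: float) -> bool:
        """Record one epoch's validation loss; return True to stop training."""
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

stopper = EarlyStopper(patience=3)
history = [2.60, 2.52, 2.4842, 2.50, 2.49, 2.51]  # toy per-epoch val CE
stop_epoch = next(i for i, v in enumerate(history, 1) if stopper.step(v))
# Stops 3 epochs after the best value, keeping best = 2.4842
```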
## Training recipe

The full command log, backtest commands, known pitfalls, and the reproduction checklist are in docs/CRYPTO_BTC_ETH_RUN.md.