# IMU-1 Stage 2 Training Corpus (Decay Phase)

Pre-tokenized training data for Stage 2 (the decay phase) of IMU-1, a sample-efficient 430M-parameter language model.

## Dataset Details

| Property | Value |
|---|---|
| Tokens | ~28B |
| Format | Memory-mapped NumPy (`.npy`) |
| Tokenizer | SmolLM2-360M |
| Vocab size | 49,152 |
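The memory-mapped `.npy` format lets training code read token shards without loading them fully into RAM. A minimal sketch of the access pattern; the shard filename and `uint16` dtype here are assumptions (a vocab of 49,152 fits in 16 bits, but check the actual files):

```python
import numpy as np

# Create a tiny stand-in shard to demonstrate the access pattern;
# real shard names and dtypes in the corpus may differ.
np.save("demo_shard.npy", np.arange(1024, dtype=np.uint16))

# mmap_mode="r" maps the file read-only instead of copying it into RAM,
# so slicing only touches the pages actually accessed.
tokens = np.load("demo_shard.npy", mmap_mode="r")

# Slice out one training example without reading the rest of the file.
context = tokens[0:896]  # 896 matches the Stage 2 context length
```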

## Data Sources

Stage 2 applies tighter quality filters than Stage 1:

- DCLM-edu (filtered at a higher quality threshold)
- FineWeb-edu
- FineMath
- Curated high-quality sources

## Download

```shell
huggingface-cli download thepowerfuldeez/1226_imu1_base_decay_corpus --repo-type=dataset
```

## Usage with sample_efficient_gpt

```shell
# Clone the training framework
git clone https://github.com/thepowerfuldeez/sample_efficient_gpt
cd sample_efficient_gpt

# Install dependencies
export UV_TORCH_BACKEND=auto
uv pip install setuptools uv_build maturin
uv sync

# Train Stage 2 (requires a Stage 1 checkpoint)
uv run torchrun --nproc_per_node 8 train.py \
    --config configs/imu1_base.yaml \
    --config-key decay
```

## Training Configuration (Stage 2)

| Parameter | Value |
|---|---|
| Schedule | WSD (decay phase) |
| Iterations | 100,000 (200k total) |
| Batch size | 312 |
| Context length | 896 |
| Muon LR | 1.15e-2 → 25% min |
| Decay start | 100k steps |
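The schedule rows above can be sketched as a function of training step: the Muon LR holds at its peak until the decay start, then anneals to a 25% floor by the final step. The linear decay shape below is an assumption; the actual curve is defined in `configs/imu1_base.yaml` and may differ:

```python
def wsd_lr(step, peak_lr=1.15e-2, decay_start=100_000,
           total_steps=200_000, min_ratio=0.25):
    """Warmup-Stable-Decay schedule, decay phase only (warmup omitted).

    Peak LR, decay start, total steps, and the 25% floor come from the
    configuration table; linear interpolation is an assumed decay shape.
    """
    if step < decay_start:
        return peak_lr  # stable phase
    frac = min((step - decay_start) / (total_steps - decay_start), 1.0)
    return peak_lr * (1.0 - (1.0 - min_ratio) * frac)
```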


## Citation

```bibtex
@misc{grigorev2026imu1sampleefficientpretrainingsmall,
      title={IMU-1: Sample-Efficient Pre-training of Small Language Models},
      author={George Grigorev},
      year={2026},
      eprint={2602.02522},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2602.02522},
}
```