---
license: mit
base_model:
- deepseek-ai/DeepSeek-R1-Zero
---

# DeepSeek-R1-Zero-256x21B-BF16

## Files Available

**Imatrix:** [DeepSeek-R1-Zero-256x21B-BF16.imatrix](https://huggingface.co/gghfez/DeepSeek-R1-Zero-256x21B-BF16/blob/main/DeepSeek-R1-Zero.imatrix)

**GGUF files:** Moved to ModelScope (see below)

## Why ModelScope?

Due to the new storage limits introduced by HuggingFace, the GGUF files (30 × 46 GB ≈ 1.38 TB) have been moved to ModelScope.

## Download

### Python SDK

```bash
pip install modelscope
```

```python
from modelscope import snapshot_download

model_dir = snapshot_download('quantzor/DeepSeek-R1-Zero-256x21B-BF16')
```

### Direct Link

🔗 https://modelscope.cn/models/quantzor/DeepSeek-R1-Zero-256x21B-BF16
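
### Download to a specific directory

Since the full set of GGUF shards needs roughly 1.38 TB of space, you may want to place them on a particular disk rather than the default ModelScope cache. A minimal sketch, assuming the `cache_dir` parameter is supported by your installed `modelscope` version (the target path below is hypothetical):

```python
from modelscope import snapshot_download

# Download into a custom cache directory instead of the default
# ModelScope cache location. cache_dir support is assumed here;
# check your installed modelscope version.
model_dir = snapshot_download(
    'quantzor/DeepSeek-R1-Zero-256x21B-BF16',
    cache_dir='/path/to/large/disk',  # hypothetical path; ~1.38 TB free space needed
)

print(model_dir)  # local path containing the downloaded GGUF shards
```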