A collection of quantized Qwen3-VL models
AI & ML interests
Model Compression
Organization Card
Welcome to AngelSlim 👋
AngelSlim is Tencent's large language model compression toolkit, built to be easy to use, comprehensive, and efficient. We will open-source compression algorithms including quantization, speculative decoding, sparsification/pruning, and distillation. The toolkit supports mainstream state-of-the-art LLMs and streamlines the complete end-to-end workflow from compression to deployment.
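As a quick illustration of how the FP8-Static checkpoints listed under "models" are typically consumed, here is a minimal sketch using vLLM. The model ID comes from the list below; the serving options and the text-only prompt (image inputs are omitted) are assumptions and may need adjusting for your vLLM version.

```python
# Minimal sketch: loading an FP8-Static checkpoint with vLLM (text-only prompt).
# Assumes a vLLM build with FP8 and Qwen3-VL support; defaults may differ by version.
from vllm import LLM, SamplingParams

llm = LLM(model="AngelSlim/Qwen3-VL-8B-Instruct-FP8-Static")  # FP8 weights load directly

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Describe what a vision-language model can do."], params)
print(outputs[0].outputs[0].text)
```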
models (78)
AngelSlim/Hunyuan-1.8B-Instruct_eagle3 • 26 downloads
AngelSlim/Hunyuan-4B-Instruct_eagle3 • 20 downloads
AngelSlim/Hunyuan-7B-Instruct_eagle3 • 17 downloads
AngelSlim/Qwen3-14B_eagle3 • 174 downloads • 2 likes
AngelSlim/Qwen3-VL-4B-Instruct-FP8-Static • 4B params • 74 downloads
AngelSlim/Qwen3-VL-32B-Instruct-FP8-Static • 103 downloads
AngelSlim/Qwen3-VL-8B-Instruct-FP8-Static • 9B params • 92 downloads
AngelSlim/Qwen3-VL-2B-Instruct-FP8-Static • 2B params • 77 downloads
AngelSlim/Qwen3-4B_eagle3 • 0.2B params • 492 downloads • 1 like
AngelSlim/Glm4_6-fp8_static • 353B params • 536 downloads
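The *_eagle3 entries above are EAGLE-3 draft models meant to be paired with their base model for speculative decoding. Below is a hedged sketch of how such a pair might be wired up in vLLM; the base-model ID and the `speculative_config` keys are assumptions based on recent vLLM releases and can differ across versions.

```python
# Sketch only: pairing a base model with an EAGLE-3 draft model for speculative decoding.
# The speculative_config schema is version-dependent in vLLM; treat these keys as assumptions.
from vllm import LLM, SamplingParams

llm = LLM(
    model="Qwen/Qwen3-14B",  # assumed target model for the draft checkpoint below
    speculative_config={
        "method": "eagle3",                     # EAGLE-3 style drafting
        "model": "AngelSlim/Qwen3-14B_eagle3",  # draft model from the list above
        "num_speculative_tokens": 3,            # tokens proposed per step (tunable)
    },
)
out = llm.generate(["Explain speculative decoding in one sentence."],
                   SamplingParams(max_tokens=64))
print(out[0].outputs[0].text)
```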
datasets (0): None public yet