---
inference: false
base_model: Qwen/Qwen3-Next-80B-A3B-Instruct
base_model_relation: quantized
tags:
- exl3
library_name: exllamav3
pipeline_tag: text-generation
---
Some pointlessly bigger exllamav3 quants of Qwen3-Next-80B-A3B-Instruct to complement Turboderp's optimized quants.
Available quants:

- 6.00bpw_H6 (56.561 GiB)
- 8.00bpw_H8 (75.026 GiB)
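
For reference, a minimal loading sketch with exllamav3's Python API. The class names (`Config`, `Model`, `Cache`, `Tokenizer`, `Generator`) and the `generate()` call follow the library's example scripts and may change between versions, and the local path is a placeholder; check the exllamav3 repository for the current API.

```python
# Minimal sketch, assuming exllamav3's Config/Model/Cache/Tokenizer/Generator
# interface as shown in its example scripts; adjust for the installed version.
from exllamav3 import Config, Model, Cache, Tokenizer, Generator

model_dir = "/path/to/Qwen3-Next-80B-A3B-Instruct-exl3"  # placeholder local path

config = Config.from_directory(model_dir)
model = Model.from_config(config)
cache = Cache(model, max_num_tokens = 16384)
model.load()

tokenizer = Tokenizer.from_config(config)
generator = Generator(model = model, cache = cache, tokenizer = tokenizer)

output = generator.generate(prompt = "Hello, my name is", max_new_tokens = 128)
print(output)
```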