W4A16 GPTQ quantized version of CohereLabs/command-a-translate-08-2025
Quantized with intel/auto-round (version: git+7b8e280).
Generation command line:

```bash
auto-round --model command-a-translate --scheme "W4A16" --format "auto_gptq" --dataset 'NeelNanda/pile-10k:apply_chat_template:system_prompt="Translate following sentences into Korean and Japanese:"' --output_dir "./cat_gptq"
```
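For reference, below is a rough Python sketch of the same quantization using auto-round's `AutoRound` API. It is an approximation of the command above, not the exact recipe: `group_size` and `sym` are assumed defaults rather than values taken from the command line, the chat-template dataset options from the CLI are not replicated, and the exact `AutoRound` signature may differ between auto-round versions.

```python
# Approximate Python equivalent of the auto-round CLI call above (a sketch,
# not the exact recipe used for this checkpoint).
from transformers import AutoModelForCausalLM, AutoTokenizer
from auto_round import AutoRound

model_name = "CohereLabs/command-a-translate-08-2025"
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# W4A16: 4-bit weights, 16-bit activations. group_size/sym are common
# defaults shown as assumptions, not values from the command line.
autoround = AutoRound(
    model,
    tokenizer,
    bits=4,
    group_size=128,
    sym=True,
    dataset="NeelNanda/pile-10k",  # the CLI additionally applied a chat template with a system prompt
)
autoround.quantize()
autoround.save_quantized("./cat_gptq", format="auto_gptq", inplace=True)
```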
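A minimal inference sketch for the resulting checkpoint, assuming transformers plus a GPTQ-compatible backend (e.g. gptqmodel or auto-gptq) are installed; the translation prompt is only an illustration:

```python
# Load the 4-bit GPTQ checkpoint and run a short translation prompt.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hell0ks/command-a-translate-08-2025-AutoRound-GPTQ-4bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [
    {"role": "system", "content": "Translate the following sentences into Korean and Japanese:"},
    {"role": "user", "content": "The weather is nice today."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```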
Model tree for hell0ks/command-a-translate-08-2025-AutoRound-GPTQ-4bit
- Base model: CohereLabs/c4ai-command-a-03-2025
- Finetuned: CohereLabs/command-a-translate-08-2025