Active filters: GPTQ
JunHowie/Qwen3-0.6B-GPTQ-Int8 • Text Generation • 0.6B • 35 downloads
JunHowie/Qwen3-1.7B-GPTQ-Int4 • Text Generation • 2B • 601 downloads • 1 like
JunHowie/Qwen3-1.7B-GPTQ-Int8 • Text Generation • 2B • 5 downloads
JunHowie/Qwen3-32B-GPTQ-Int4 • Text Generation • 33B • 10.7k downloads • 4 likes
JunHowie/Qwen3-32B-GPTQ-Int8 • Text Generation • 33B • 1.41k downloads • 4 likes
JunHowie/Qwen3-30B-A3B-GPTQ-Int4 • Text Generation • 5B • 8 downloads • 1 like
JunHowie/Qwen3-14B-GPTQ-Int8 • Text Generation • 15B • 402 downloads • 1 like
JunHowie/Qwen3-14B-GPTQ-Int4 • Text Generation • 15B • 1.37k downloads • 4 likes
JunHowie/Qwen3-8B-GPTQ-Int8 • Text Generation • 8B • 105 downloads
JunHowie/Qwen3-8B-GPTQ-Int4 • Text Generation • 8B • 1.15k downloads • 4 likes
JunHowie/Qwen3-4B-GPTQ-Int4 • Text Generation • 4B • 3.76k downloads • 1 like
JunHowie/Qwen3-4B-GPTQ-Int8 • Text Generation • 4B • 34 downloads
JunHowie/Qwen3-30B-A3B-GPTQ-Int8 • Text Generation • 8B • 144 downloads
iqbalamo93/Phi-4-mini-instruct-GPTQ-4bit • Text Generation • 4B • 688 downloads
iqbalamo93/Phi-4-mini-instruct-GPTQ-8bit • Text Generation • 4B • 23 downloads • 2 likes
GusPuffy/Legion-V2.1-LLaMa-70B-GPTQ • Text Generation • 11B • 1 download
QuantTrio/DeepSeek-R1-0528-Qwen3-8B-GPTQ-Int4-Int8Mix • Text Generation • 11B • 93 downloads • 4 likes
RedHatAI/DeepSeek-R1-0528-quantized.w4a16 • Text Generation • 104B • 31 downloads • 13 likes
QuantTrio/DeepSeek-R1-0528-GPTQ-Int4-Int8Mix-Lite • Text Generation • 721B • 5 downloads • 2 likes
QuantTrio/DeepSeek-R1-0528-GPTQ-Int4-Int8Mix-Compact • Text Generation • 847B • 7 downloads • 5 likes
AXERA-TECH/Qwen2.5-0.5B-Instruct-CTX-Int8
QuantTrio/DeepSeek-R1-0528-GPTQ-Int4-Int8Mix-Medium • Text Generation • 912B • 12 downloads • 1 like
kxdw2580/DeepSeek-R1-0528-Qwen3-8B-catgirl-v2.5-gptqv2-8bit • Text Generation • 8B • 3 downloads
kxdw2580/DeepSeek-R1-0528-Qwen3-8B-catgirl-v2.5-gptqv2-4bit • Text Generation • 8B • 3 downloads
dengcao/GLM-4.1V-9B-Thinking-GPTQ-Int4-Int8Mix • Image-Text-to-Text • 15B • 3 downloads • 2 likes
RedHatAI/Kimi-K2-Instruct-quantized.w4a16 • Text Generation • 146B • 289 downloads • 12 likes
GusPuffy/BlackSheep-24B-GPTQ • Text Generation • 4B • 2 downloads
QuantTrio/Qwen3-235B-A22B-Instruct-2507-GPTQ-Int4-Int8Mix • Text Generation • 248B • 531 downloads • 3 likes
QuantTrio/GLM-4.1V-9B-Thinking-GPTQ-Int4-Int8Mix • Text Generation • 15B • 4 downloads • 1 like
QuantTrio/Qwen3-Coder-480B-A35B-Instruct-GPTQ-Int4-Int8Mix • Text Generation • 534B • 54 downloads • 7 likes
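The Int4/Int8 suffixes above indicate bits per weight, which dominates the memory footprint of each checkpoint. A minimal back-of-the-envelope sketch, assuming weight memory is simply parameter count times bits per weight (it ignores GPTQ group-wise scale/zero-point overhead, embeddings kept at higher precision, activations, and KV cache, so real usage is somewhat higher); the model names and parameter counts are taken from the listing:

```python
def approx_weight_gb(params_billions: float, bits: int) -> float:
    """Rough weight-storage estimate in GB (1 GB = 1e9 bytes):
    parameters * bits-per-weight / 8 bits-per-byte."""
    return params_billions * bits / 8

# A few entries from the listing above (params as reported on the page).
for name, params_b, bits in [
    ("JunHowie/Qwen3-32B-GPTQ-Int4", 33, 4),
    ("JunHowie/Qwen3-32B-GPTQ-Int8", 33, 8),
    ("JunHowie/Qwen3-8B-GPTQ-Int4", 8, 4),
]:
    print(f"{name}: ~{approx_weight_gb(params_b, bits):.1f} GB")
```

By this estimate the Int4 variant of a 33B-parameter model needs roughly half the weight memory of its Int8 sibling (~16.5 GB vs ~33 GB), which is the practical trade-off behind publishing both quantizations.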