EXL3 quantization of rnj-1-instruct, 6 bits per weight (repo: isogen/rnj-1-instruct-exl3-6bpw).
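As a rough guide to what "6 bits per weight" means for disk and memory footprint, the sketch below estimates weight storage from a parameter count. The parameter count used in the example is a placeholder; rnj-1's true size is not stated in this card.

```python
def quant_size_gb(n_params: float, bpw: float) -> float:
    """Approximate weight storage in gigabytes at `bpw` bits per weight.

    Ignores per-tensor overhead (scales, metadata), so real files are
    slightly larger.
    """
    return n_params * bpw / 8 / 1e9

# Hypothetical 8B-parameter model at 6.0 bpw:
print(round(quant_size_gb(8e9, 6.0), 1))  # -> 6.0 (GB)
```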

HumanEval (argmax)

| Model | Q4 | Q6 | Q8 | FP16 |
|---|---|---|---|---|
| rnj-1-instruct-exl3-6bpw | 87.2 | 86.6 | 86.6 | 86.0 |
| rnj-1-instruct-exl3-8bpw-h8 | 86.6 | 87.2 | 86.6 | 86.6 |