osgrep-coderank-q8

This model is a quantized (Int8 / Q8) export of nomic-ai/CodeRankEmbed for use with transformers.js.

It is used as the primary dense retrieval model in osgrep.
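A minimal usage sketch with transformers.js is shown below. It assumes the package name and options follow current transformers.js conventions (`@huggingface/transformers` with `dtype: "q8"`), that mean pooling with normalization is appropriate for this embedding model, and that the query prefix matches the one documented on the upstream CodeRankEmbed model card; verify these against the original card before relying on them.

```ts
// Sketch: embedding a natural-language query and a code snippet for dense retrieval.
// Assumptions are noted inline; adjust to match the upstream CodeRankEmbed card.
import { pipeline } from "@huggingface/transformers";

// Load the quantized (q8) weights of this model as a feature-extraction pipeline.
const embed = await pipeline("feature-extraction", "ryandono/osgrep-coderank-q8", {
  dtype: "q8",
});

// CodeRankEmbed's card documents a task prefix for queries; code passages are
// embedded without a prefix (assumption: prefix text copied from the upstream card).
const queryPrefix = "Represent this query for searching relevant code: ";

const query = await embed(queryPrefix + "read a file line by line", {
  pooling: "mean", // assumption: mean pooling + normalization for retrieval
  normalize: true,
});

const passage = await embed(
  "const lines = fs.readFileSync(path, 'utf8').split('\\n');",
  { pooling: "mean", normalize: true },
);

// With normalized vectors, the dot product gives the cosine similarity used to
// rank code passages against the query.
```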

Original Model License

This model is a derivative work of nomic-ai/CodeRankEmbed, licensed under the MIT License. Please refer to the original model card for citation and full license details.
