l3utterfly/mistral-7b-v0.1-layla-v4-gguf
Likes: 14
Format: GGUF
License: apache-2.0
Files and versions (branch: main)
Total size: 43.9 GB · 1 contributor · 5 commits
Latest commit: Update README.md (291bc4a, verified, by l3utterfly) · almost 2 years ago
File                                   Size        Last commit        Updated
.gitattributes                         1.56 kB     q2_k               almost 2 years ago
README.md                              106 Bytes   Update README.md   almost 2 years ago
mistral-7b-v0.1-layla-v4-Q2_K.gguf     2.72 GB     q2_k               almost 2 years ago
mistral-7b-v0.1-layla-v4-Q3_K_M.gguf   3.52 GB     all quants         almost 2 years ago
mistral-7b-v0.1-layla-v4-Q4_K_M.gguf   4.37 GB     all quants         almost 2 years ago
mistral-7b-v0.1-layla-v4-Q5_K_M.gguf   5.13 GB     all quants         almost 2 years ago
mistral-7b-v0.1-layla-v4-Q6_K.gguf     5.94 GB     all quants         almost 2 years ago
mistral-7b-v0.1-layla-v4-Q8_0.gguf     7.7 GB      all quants         almost 2 years ago
mistral-7b-v0.1-layla-v4-f16.gguf      14.5 GB     all quants         almost 2 years ago
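
The repository ships the same model at several GGUF quantization levels (Q2_K through Q8_0, plus an f16 copy), so one file can be picked to match available memory. The snippet below is a minimal sketch of how a file from this listing could be fetched and run locally; it assumes the huggingface_hub and llama-cpp-python packages, and the chosen quant, context size, prompt, and sampling settings are illustrative, not taken from the model card.

```python
# Sketch: download one GGUF quant from this repo and run a completion with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch the Q4_K_M file listed above into the local Hub cache.
model_path = hf_hub_download(
    repo_id="l3utterfly/mistral-7b-v0.1-layla-v4-gguf",
    filename="mistral-7b-v0.1-layla-v4-Q4_K_M.gguf",
)

# Load the quantized model; n_ctx=4096 is an assumed context size, not a repo setting.
llm = Llama(model_path=model_path, n_ctx=4096)

# Run a simple completion with an illustrative prompt.
output = llm("Write a short greeting.", max_tokens=64)
print(output["choices"][0]["text"])
```

A smaller quant such as Q2_K trades quality for memory, while Q8_0 or the f16 file keeps more fidelity at a larger footprint; swapping the `filename` argument is the only change needed to try a different level.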