- Torrent File for AI Model Download (#11, opened about 2 years ago by Nondzu)
- Fine-tuning toolkit for Mixtral 8x7B MoE model (#10, opened about 2 years ago by hiyouga)
- Sagemaker deployment config for sub-second real-time inference (#9, opened about 2 years ago by vibranium)
- AutoModelForCausalLM does not seem to work for Mixtral (#8, opened about 2 years ago by Mauceric)
- A question (#6, opened about 2 years ago by Hoioi)
- No multi-GPU inference support? (#4, opened about 2 years ago by dataautogpt3)
- Is it possible to get GPTQ quants in 4 bpw? (#2, opened about 2 years ago by MrHillsss)