ValueError: Model type lfm2_moe not supported.
#1 opened by kadirnar
I'm running this command and it's giving an error:
mlx_lm.chat --model "mlx-community/LFM2-8B-A1B-4bit"
Same here. I'm on mlx-lm version 0.28.2 as well.
Install from source!
It will be available in the next release.
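For anyone hitting the same error before the next release, here is a minimal sketch of installing from source, assuming the package lives at github.com/ml-explore/mlx-lm and that pip can install directly from git:

# install mlx-lm from the main branch instead of the PyPI release (assumed repo URL)
pip install -U git+https://github.com/ml-explore/mlx-lm.git
# then rerun the original command, which should now recognize the lfm2_moe model type
mlx_lm.chat --model "mlx-community/LFM2-8B-A1B-4bit"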
I tested it and it's working. Thanks.
kadirnar changed discussion status to closed
Can any of you recommend vllm-mlx or mlx-openai-server? https://huggingface.co/mlx-community/LFM2-8B-A1B-8bit-MLX/discussions/1#699bccf338e90b215dd16e2a