ValueError: Model type lfm2_moe not supported.

#1
by kadirnar - opened

I'm running this command and it's giving an error:
mlx_lm.chat --model "mlx-community/LFM2-8B-A1B-4bit"

Same here. I am on mlx-lm version 0.28.2 as well.

MLX Community org

install from source!
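
In case it's unclear what "install from source" means here: assuming the mlx-lm source lives in the ml-explore/mlx-lm repo on GitHub, it's roughly:

pip install -U git+https://github.com/ml-explore/mlx-lm.git

After that, the original command should work:

mlx_lm.chat --model "mlx-community/LFM2-8B-A1B-4bit"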

MLX Community org

It will be available in the next release.
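
Once that release is out, upgrading the published package should be enough, something like:

pip install -U mlx-lm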

install from source!

I tested it and it's working. Thanks.

kadirnar changed discussion status to closed
MLX Community org

Can some of you give recommendations for vllm-mlx or mlx-openai-server? https://huggingface.co/mlx-community/LFM2-8B-A1B-8bit-MLX/discussions/1#699bccf338e90b215dd16e2a
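
For what it's worth, mlx-lm itself also ships an OpenAI-compatible server that can host this model; a rough sketch (flags may differ between versions):

mlx_lm.server --model "mlx-community/LFM2-8B-A1B-4bit" --port 8080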
