MiniMax-M2-max-8bit-gs32 not supported

#1
by Elonqq - opened

I have updated mlx-lm to version 0.28.1, but it still does not support the model. Could you please advise how to resolve this, or do I need to wait for the 0.28.4 release?

Device: M3 Ultra with 512GB of memory

MLX Community org

Hi @Elonqq

All you really need is the newly added model definition file for MiniMax M2, which lets mlx-lm recognize this new architecture. You can download the file from https://github.com/ml-explore/mlx-lm/blob/main/mlx_lm/models/minimax.py

Then copy this file into mlx-lm's models folder. I installed MLX via Homebrew with Python 3.11, so my folder is at /opt/homebrew/lib/python3.11/site-packages/mlx_lm/models. I just copied the model file into that folder.
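If you'd rather not hunt for the site-packages path by hand, a small script can locate the installed `mlx_lm.models` package and drop the file in for you. This is just a sketch: it assumes mlx-lm is pip-installed in the current Python environment, and the URL below is the raw-file form of the GitHub link above.

```python
"""Sketch: install the MiniMax M2 model definition into an existing
mlx-lm installation. Assumes mlx-lm is already pip-installed."""
import pathlib
import urllib.request

# Raw-file URL corresponding to the GitHub page linked above (assumption:
# the file lives on the main branch at this path).
RAW_URL = ("https://raw.githubusercontent.com/ml-explore/mlx-lm/"
           "main/mlx_lm/models/minimax.py")


def models_dir() -> pathlib.Path:
    """Locate mlx_lm's models folder without hard-coding the path,
    so this works for Homebrew, venv, or conda installs alike."""
    import mlx_lm.models
    return pathlib.Path(mlx_lm.models.__file__).parent


def install_minimax() -> pathlib.Path:
    """Download minimax.py into the models folder and return its path."""
    dest = models_dir() / "minimax.py"
    urllib.request.urlretrieve(RAW_URL, dest)
    return dest


if __name__ == "__main__":
    print(f"Installed: {install_minimax()}")
```

After running it, restart whatever is serving the model so mlx-lm re-imports its model registry. Note that upgrading mlx-lm later will overwrite or supersede this manually copied file, which is fine once an official release ships with MiniMax support.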

bibproj changed discussion status to closed
