Arch Error: minimax_m2

#3
by FelipeSerrano - opened

Every other model has `minimax-m2` instead of `minimax_m2`, so LM Studio can't load this model.

Yeah, I'm getting the same issue. LM Studio won't load this model; it shows this error:

```
🥲 Failed to load the model

Failed to load model

Error when loading model: ValueError: Model type minimax_m2 not supported.
```

I tried this 3-bit DWQ quant with the Inferencer app and with mlx_lm.chat in the terminal.
All of them show the same error: `ValueError: Model type minimax_m2 not supported.`
Please look into how to fix this, Catalystsec.
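Until the repo is fixed, one possible local workaround is to edit the `model_type` field in the downloaded model's `config.json` so it matches a type the loader recognizes. This is only a sketch: the path and the replacement type `"minimax"` are assumptions, not confirmed values from this thread.

```python
import json
from pathlib import Path

def patch_model_type(config_path: str, new_type: str = "minimax") -> str:
    """Rewrite the model_type field in a config.json, returning the old value.

    Hypothetical workaround: new_type must match whatever the loading
    library actually supports for this architecture.
    """
    path = Path(config_path)
    config = json.loads(path.read_text())
    old_type = config.get("model_type")
    config["model_type"] = new_type
    # Write the config back with readable indentation.
    path.write_text(json.dumps(config, indent=2))
    return old_type

# Example (path is illustrative only):
# patch_model_type("~/models/MiniMax-M2-3bit-DWQ/config.json")
```

Note that newer library versions may reject a renamed type for other reasons (missing architecture code), so updating mlx-lm is the cleaner fix where available.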

Catalyst Security org
edited 1 day ago

I've re-uploaded the config.json using the original model type (note that MLX-LM made a fix for this issue here: https://github.com/ml-explore/mlx-lm/commit/08c8c0a5ea85721ddd9ae3caa27ebab6dc328e8a). Should work now. Thanks!
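For anyone who already downloaded the model, a quick sanity check before retrying in LM Studio is to read the `model_type` field out of the freshly pulled `config.json`. The path below is just an example, not the actual repo layout:

```python
import json

def read_model_type(config_path: str) -> str:
    """Return the model_type declared in a model's config.json."""
    with open(config_path) as f:
        return json.load(f)["model_type"]

# Example (path is illustrative only):
# read_model_type("~/models/MiniMax-M2-3bit-DWQ/config.json")
```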
