facebook/MobileLLM-Pro
#1494
by Abhi99999 - opened
I'm almost certain the MobileLLMP1ForCausalLM architecture is not supported by llama.cpp, so this model can't be converted to GGUF.
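For context, llama.cpp's conversion script dispatches on the `architectures` field in the model's config.json, and conversion fails when no handler is registered for that name. A minimal sketch of that dispatch idea (the supported set below is illustrative, not llama.cpp's actual registry):

```python
# Sketch of why conversion fails: the converter maps the config.json
# "architectures" entry to a handler; unknown names are rejected.
# SUPPORTED is an illustrative placeholder, not llama.cpp's real registry.
SUPPORTED = {"LlamaForCausalLM", "Qwen2ForCausalLM", "GemmaForCausalLM"}

def can_convert(config: dict) -> bool:
    """Return True if the model's declared architecture has a handler."""
    archs = config.get("architectures") or []
    return bool(archs) and archs[0] in SUPPORTED

print(can_convert({"architectures": ["LlamaForCausalLM"]}))        # True
print(can_convert({"architectures": ["MobileLLMP1ForCausalLM"]}))  # False
```

Until llama.cpp registers a handler for MobileLLMP1ForCausalLM, the converter will reject the model at this step.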