Warning: This model was not verified with the MNN Chat app; it wouldn't load on my 12 GB RAM phone.

This model, DeProgrammer/shisa-v2.1-unphi4-14b-MNN, was converted to the MNN format from shisa-ai/shisa-v2.1-unphi4-14b using llmexport.py in MNN version 3.4.0 with default settings (4-bit quantization).
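The conversion step above can be sketched as a command line. This is a sketch only: the flag names (`--path`, `--export`, `--quant_bit`) are assumed from MNN's llmexport.py and should be verified against the MNN 3.4.0 source before running.

```shell
# Sketch of the conversion (flag names assumed; check them against
# MNN's transformers/llm/export/llmexport.py in the 3.4.0 release):
# --path      source Hugging Face model to convert
# --export    target format (mnn)
# --quant_bit 4-bit weight quantization, the default used for this card
python llmexport.py --path shisa-ai/shisa-v2.1-unphi4-14b --export mnn --quant_bit 4
```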

Inference can be run via MNN, e.g., MNN Chat on Android.


Model tree for DeProgrammer/shisa-v2.1-unphi4-14b-MNN:
- Base model: microsoft/phi-4
- Finetuned: unsloth/phi-4
- Quantized (one of 7 quantized versions): this model