---
license: apache-2.0
language:
  - en
base_model: janhq/Jan-v3-4B-base-instruct
base_model_relation: quantized
pipeline_tag: text-generation
library_name: mnn
tags:
  - code
  - mnn
---

This model, [DeProgrammer/Jan-v3-4B-base-instruct-MNN-Q8](https://huggingface.co/DeProgrammer/Jan-v3-4B-base-instruct-MNN-Q8), was converted to MNN format from [janhq/Jan-v3-4B-base-instruct](https://huggingface.co/janhq/Jan-v3-4B-base-instruct) using llmexport.py from MNN 3.4.0 with `--quant_bit 8` (8-bit quantization) and otherwise default settings.
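
As a rough sketch, the conversion command looks like the following. Only `--quant_bit 8` is stated above; the other flag names (`--path`, `--export`, `--dst_path`) are assumptions about MNN's llmexport.py and may differ between MNN releases.

```bash
# Rough sketch of the conversion; only --quant_bit 8 is confirmed by this card,
# the remaining flag names are assumptions and may vary by MNN version.
python llmexport.py \
  --path /path/to/janhq/Jan-v3-4B-base-instruct \
  --export mnn \
  --quant_bit 8 \
  --dst_path ./Jan-v3-4B-base-instruct-MNN-Q8
```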

Inference can be run with MNN's LLM runtime, for example via the MNN Chat app on Android.
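
On desktop, one way to try the exported model is MNN's bundled command-line LLM demo, built from the MNN sources with LLM support enabled; the CMake option, binary name, and arguments below are assumptions and may differ between MNN releases.

```bash
# Hypothetical desktop run using MNN's LLM demo (option and binary names are assumptions):
# 1. Build MNN with LLM support enabled.
cmake -DMNN_BUILD_LLM=ON .. && make -j
# 2. Point the demo at the exported model's config.json for an interactive session.
./llm_demo /path/to/Jan-v3-4B-base-instruct-MNN-Q8/config.json
```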