# Model Card for SkyAsl/Llama-3.2-3B-Medical_chatbot

## How to Get Started with the Model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "SkyAsl/Llama-3.2-3B-Medical_chatbot"

# Load the tokenizer and model; device_map="auto" places weights on available devices.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Example prompt -- replace with your own question.
prompt = "What are the common symptoms of iron-deficiency anemia?"
result = pipe(prompt, max_new_tokens=200, temperature=0.7, do_sample=True)
print(result[0]["generated_text"])
```
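Since the model is fine-tuned as a chatbot, it may respond better to chat-formatted input. A minimal sketch, continuing from the snippet above and assuming the tokenizer ships a Llama-3.2-style chat template; the system prompt is illustrative, not taken from the card:

```python
# Sketch: chat-style inference via the tokenizer's chat template.
# The system prompt below is an assumption for illustration only.
messages = [
    {"role": "system", "content": "You are a helpful medical assistant."},
    {"role": "user", "content": "What are the common symptoms of iron-deficiency anemia?"},
]
chat_prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
result = pipe(chat_prompt, max_new_tokens=200, temperature=0.7, do_sample=True)
print(result[0]["generated_text"])
```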
## Training Hyperparameters

- learning_rate = 2e-5
- LoRA config (see the sketch below):
  - r = 16
  - alpha = 16
  - dropout = 0.05
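These LoRA values map directly onto a PEFT `LoraConfig`. A minimal sketch, assuming the `peft` library; the `target_modules` list is an assumption (the typical Llama projection layers), since the card only specifies r, alpha, and dropout:

```python
from peft import LoraConfig

# LoRA setup implied by the hyperparameters above.
# target_modules is an assumption, not stated in the card.
lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)
```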
## Model tree for SkyAsl/Llama-3.2-3B-Medical_chatbot

- Base model: meta-llama/Llama-3.2-3B
- Quantized variant (bitsandbytes 4-bit): unsloth/Llama-3.2-3B-bnb-4bit
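Given the 4-bit lineage above, the model can also be loaded in 4-bit for memory-constrained setups. A minimal sketch, assuming `bitsandbytes` is installed and a CUDA GPU is available; the quantization settings shown are common NF4 defaults, not values taken from the card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "SkyAsl/Llama-3.2-3B-Medical_chatbot"

# Typical NF4 settings; these values are illustrative, not from the card.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```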