# Talking to the Past: Historical Character AI (Mistral 7B)
This model is a fine-tuned version of Mistral 7B Instruct v0.3, trained with supervised fine-tuning (SFT) followed by Group Relative Policy Optimization (GRPO) to roleplay as historical figures from Tunisian and Mediterranean history.
## Supported Personas
- Hannibal Barca: Strategic, military-focused, and reflective.
- Queen Dido (Elissa): Wise, authoritative, and focused on founding and leadership.
- Ibn Khaldun: Sociological, analytical, and scholarly.
- Habib Bourguiba: Modernizing, revolutionary, and charismatic.
## Training Details
- Architecture: Mistral 7B Instruct v0.3 (Quantized 4-bit)
- Method: SFT followed by GRPO (reinforcement learning)
- LoRA Rank: 32
- Framework: Unsloth & TRL
- Max Sequence Length: 1024
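The details above can be sketched as a minimal Unsloth + TRL training setup. This is an illustrative reconstruction, not the authors' actual script: the dataset, LoRA target modules, alpha, and all other hyperparameters not listed in the card are assumptions.

```python
from unsloth import FastLanguageModel
from trl import SFTConfig, SFTTrainer

# Load the base model in 4-bit, per the card's training details.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="mistralai/Mistral-7B-Instruct-v0.3",
    max_seq_length=1024,  # matches the card's stated training sequence length
    load_in_4bit=True,
)

# Attach LoRA adapters. Rank 32 comes from the card; alpha and the
# target modules are common defaults, not stated in the card.
model = FastLanguageModel.get_peft_model(
    model,
    r=32,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Stage 1: supervised fine-tuning on persona dialogues.
# `persona_dataset` is hypothetical; the card does not name the data.
trainer = SFTTrainer(
    model=model,
    processing_class=tokenizer,
    train_dataset=persona_dataset,
    args=SFTConfig(max_seq_length=1024, output_dir="sft-out"),
)
trainer.train()

# Stage 2: GRPO would follow, optimizing persona fidelity against a
# reward function via trl.GRPOTrainer / trl.GRPOConfig.
```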
## How to Use
```python
from unsloth import FastLanguageModel

# Load the fine-tuned model in 4-bit for memory-efficient inference.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="ragtag1/mistral7b-historical-final",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Switch Unsloth's optimized kernels to inference mode.
FastLanguageModel.for_inference(model)
```
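A persona is typically selected through the prompt. The helper below is a hedged sketch (the function name and the exact persona instruction are assumptions, not part of the released card) showing how one might build a chat for a persona before calling `tokenizer.apply_chat_template` and `model.generate`.

```python
def build_persona_messages(persona: str, question: str) -> list[dict]:
    """Build a chat in the messages format expected by apply_chat_template.

    Mistral Instruct chat templates may not accept a separate "system"
    role, so the persona instruction is folded into the user turn
    (an assumption; the model may expect different wording).
    """
    instruction = f"You are {persona}. Stay in character and answer as they would."
    return [{"role": "user", "content": f"{instruction}\n\n{question}"}]


messages = build_persona_messages("Hannibal Barca", "How would you cross the Alps?")

# With the model loaded as above, generation would look like:
# inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
# outputs = model.generate(inputs, max_new_tokens=256)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```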
## Model Tree
- Base model: mistralai/Mistral-7B-v0.3
- Fine-tuned from: mistralai/Mistral-7B-Instruct-v0.3