---
license: mit
datasets:
- millat/StudyAbroadGPT-Dataset
language:
- en
base_model:
- unsloth/mistral-7b-instruct-v0.3-bnb-4bit
tags:
- unsloth
- trl
- sft
---
# StudyAbroadGPT-7B

A fine-tuned version of Mistral-7B-Instruct optimized for study-abroad assistance, trained on the millat/StudyAbroadGPT-Dataset.

## Model Details
- Base Model: `unsloth/mistral-7b-instruct-v0.3-bnb-4bit` (Mistral-7B-Instruct-v0.3)
- Training: LoRA fine-tuning (r=16, alpha=32)
- Quantization: 4-bit (bitsandbytes)
- Max Sequence Length: 2048 tokens
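
The LoRA hyperparameters above correspond to a `peft` configuration roughly like the following (a sketch: the target modules and dropout value are assumptions, not stated in this card):

```python
from peft import LoraConfig

# Sketch of an adapter configuration matching the card's stated values.
lora_config = LoraConfig(
    r=16,               # LoRA rank, as stated above
    lora_alpha=32,      # scaling alpha, as stated above
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projections
    lora_dropout=0.05,  # assumed value; not documented in this card
    bias="none",
    task_type="CAUSAL_LM",
)
```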
    
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("millat/StudyAbroadGPT-7B")
tokenizer = AutoTokenizer.from_pretrained("millat/StudyAbroadGPT-7B")

# Example query; the prompt and generation settings below are illustrative.
messages = [{"role": "user", "content": "What documents do I need to apply for a master's program abroad?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```