# GPT-First Person Experience
A 6.83-million-parameter LLM using GPT-2 encodings, trained on 80 MB of text-adventure game transcripts.
Use this model to provide another GPT model with a first-person experience.
Supervised fine-tuning should be performed before use.
## Technical Information
| Parameter | Value |
| --- | --- |
| Layers | 2 |
| Heads | 2 |
| Embedding dimension | 128 |
| Context window | 16384 tokens |
| Tokenizer | GPT-2 BPE |
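The 6.83M figure is consistent with the table above for a GPT-2-style transformer, assuming tied input/output embeddings and no learned positional-embedding table (e.g. rotary positions). These are assumptions for illustration, not confirmed details of this model's architecture; the sketch below just tallies the parameter budget from the config.

```python
# Hypothetical parameter-count sketch for a GPT-2-style model with the
# configuration listed above. Assumes the LM head is tied to the token
# embedding and positions are not a learned table (assumptions, not
# confirmed details of this model).
n_layer, n_embd = 2, 128
vocab_size = 50257  # GPT-2 BPE vocabulary

wte = vocab_size * n_embd  # token embedding, tied with the LM head

def layer_params(d):
    ln = 2 * d                                    # LayerNorm weight + bias
    attn = (d * 3 * d + 3 * d) + (d * d + d)      # QKV projection + output proj
    mlp = (d * 4 * d + 4 * d) + (4 * d * d + d)   # 4x-expansion feed-forward
    return 2 * ln + attn + mlp                    # two LayerNorms per block

total = wte + n_layer * layer_params(n_embd) + 2 * n_embd  # + final LayerNorm
print(f"{total:,} parameters (~{total / 1e6:.2f}M)")
# → 6,829,696 parameters (~6.83M)
```

Note that a learned positional embedding over the full 16384-token window would add another 16384 × 128 ≈ 2.1M parameters, which is why the tally above assumes it is absent.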