This is a Llama-style model with ~50M parameters. Its weights are randomly initialized, so it is intended for experimental use only, not for generating meaningful output. The modeling files are available in this GitHub repo.
- Model Size: 52,177,152
- Vocab Size: 32,768
- Context Length: 512
- Embedding Dimension: 256
- Attention Heads: 128
- KV Groups: 64
- Hidden Dimension: 2048
- Number of Layers: 20
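As a sanity check, the listed model size can be recomputed from the hyperparameters above. This sketch assumes a standard Llama architecture with grouped-query attention ("KV Groups" read as the number of key/value heads), a SwiGLU MLP, RMSNorm, and untied input/output embeddings:

```python
# Hedged parameter-count sketch from the hyperparameters listed above.
vocab = 32_768      # Vocab Size
dim = 256           # Embedding Dimension
layers = 20         # Number of Layers
hidden = 2_048      # Hidden Dimension
heads = 128         # Attention Heads
kv_heads = 64       # KV Groups (assumed to mean key/value heads, i.e. GQA)
head_dim = dim // heads  # 2

embed = vocab * dim      # input embedding table
lm_head = vocab * dim    # output projection (assumed untied from the embedding)

# Per-layer attention: q and o are dim x dim; k and v are dim x (kv_heads * head_dim).
attn = 2 * dim * dim + 2 * dim * (kv_heads * head_dim)
# Per-layer SwiGLU MLP: gate, up, and down projections.
mlp = 3 * dim * hidden
# Two RMSNorm weight vectors per layer (pre-attention and pre-MLP).
norms = 2 * dim

total = embed + lm_head + layers * (attn + mlp + norms) + dim  # + final RMSNorm
print(total)  # 52177152 — matches the Model Size above
```

Only with untied embeddings does the total land exactly on 52,177,152; with tied embeddings it would be ~43.8M.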