This is a Llama model with ~50M parameters (aliarda/llama-50M-randParams).

This model has randomly initialized (untrained) parameters and is intended for experimental use. The modeling files are available in the accompanying GitHub repo.

  • Model Size: 52,177,152 parameters
  • Vocab Size: 32,768
  • Context Length: 512
  • Embedding Dimension: 256
  • Attention Heads: 128
  • KV Groups: 64
  • Hidden Dimension: 2048
  • Number of Layers: 20
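The stated model size can be checked against the configuration above. The sketch below assumes a standard Llama layout (SwiGLU MLP with gate/up/down projections, two RMSNorms per layer, untied input/output embeddings) and interprets "KV Groups: 64" as 64 key/value heads; under those assumptions the arithmetic reproduces the listed parameter count exactly.

```python
# Parameter-count check (a sketch under the assumptions stated above).
vocab, d_model, n_heads, n_kv, d_ff, n_layers = 32768, 256, 128, 64, 2048, 20
head_dim = d_model // n_heads            # 256 / 128 = 2
kv_dim = n_kv * head_dim                 # 64 * 2 = 128

embed = vocab * d_model                  # token embedding table
lm_head = vocab * d_model                # output projection (assumed untied)
attn = 2 * d_model * d_model + 2 * d_model * kv_dim  # q,o + k,v projections
mlp = 3 * d_model * d_ff                 # gate, up, down (SwiGLU)
norms = 2 * d_model                      # two RMSNorm weight vectors per layer
per_layer = attn + mlp + norms
total = embed + lm_head + n_layers * per_layer + d_model  # + final norm

print(total)  # → 52177152, matching the stated model size
```

Note the unusually small head dimension (256 / 128 = 2), which is consistent with the experimental nature of the model.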
