# Fractal-8b-v1 - Fresh Initialization

Status: Config-only (no pretrained weights)
This repo contains:
- ✅ Qwen3 8B architecture config
- ✅ Tokenizer
- ❌ No pretrained weights (deleted for fresh start)
## Purpose
This is a fresh initialization checkpoint for the Integrated Cognitive Development Training Pipeline.
When loaded from this config, the model initializes with random weights: a clean slate for developmental training through 9 phases:
- Phase 0: Relational Foundation
- Phase 1A: Pattern Recognition
- Phase 1B: Compositional Structure
- Phase 2A: Perspective Taking (Theory of Mind)
- Phase 2B: Abstract Reasoning
- Phase 3A: Empirical Grounding
- Phase 3B: Value Calibration
- Phase 4A: Autonomous Agency
- Phase 4B: Integrated Navigation
## Training Goal
Drive manifold coordinates to:
- T (Topology) → 0.75
- G (Geometry) → 0.85
These targets are pursued through curriculum ordering alone, not architectural changes.
## Usage
Use with the Lark Entropy Trainer:

```python
from transformers import AutoModelForCausalLM, AutoConfig

# Loading via from_config (not from_pretrained) initializes fresh random weights
config = AutoConfig.from_pretrained("ahmiershadowman/Fractal-8b-v1")
model = AutoModelForCausalLM.from_config(config)
```
Start training with the Phase 0 dataset: `ahmiershadowman/claude-ash-conversations`
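The fresh-initialization behavior can be verified without downloading the full 8B config. Below is a minimal sketch using a tiny `LlamaConfig` as a stand-in architecture (the dimensions are hypothetical, chosen only to keep the model small; the real repo uses the Qwen3 8B architecture):

```python
import torch
from transformers import AutoModelForCausalLM, LlamaConfig

# Hypothetical tiny config standing in for the repo's Qwen3 8B config;
# from_config() never downloads weights, it only builds the architecture.
config = LlamaConfig(
    hidden_size=32,
    intermediate_size=64,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=4,
    vocab_size=128,
)

# Each from_config() call draws new random weights
model_a = AutoModelForCausalLM.from_config(config)
model_b = AutoModelForCausalLM.from_config(config)

# Two fresh initializations differ: there are no pretrained weights to share
same = torch.equal(model_a.model.embed_tokens.weight,
                   model_b.model.embed_tokens.weight)
print(same)  # prints: False
```

Set a seed with `torch.manual_seed(...)` before building the model if you need a reproducible starting point for the Phase 0 run.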
## Dataset Collections
All 9 phases available at: https://huggingface.co/collections/ahmiershadowman