The MoFE agentic AI collection

A collection of AI models trained on the MoFE concept by Kiy (note: some models may not work as expected).
A Mixture-of-Experts (MoE) enhanced version of Qwen2.5-Coder-3B-Instruct, optimized for agentic AI workflows and function calling. Basic usage with the transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load model and tokenizer
model = AutoModelForCausalLM.from_pretrained(
    "Kiy-K/fyodor-agentic-v1.1",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(
    "Kiy-K/fyodor-agentic-v1.1",
    trust_remote_code=True
)

# Generate
prompt = "Write a Python function to calculate Fibonacci numbers:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    temperature=0.7,
    top_p=0.9,
    do_sample=True
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
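Since the model is described as optimized for function calling, the sketch below shows one way to exercise tool use through the chat template, reusing the model and tokenizer loaded above. It assumes this fine-tune keeps the Qwen2.5 tool-calling chat template; the `get_weather` schema is a hypothetical example, not part of the model card.

```python
# Hedged tool-calling sketch (assumes the Qwen2.5 tool-use chat template is preserved)
messages = [
    {"role": "user", "content": "What's the weather like in Berlin right now?"}
]

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

input_ids = tokenizer.apply_chat_template(
    messages,
    tools=tools,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens (the expected tool-call output)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```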
Training Data:
Sparse MoE implementation:
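The card does not include the routing code itself, so the following is only a minimal, generic sketch of a top-k sparse MoE feed-forward block in PyTorch to illustrate the idea. The expert count, hidden sizes, and top-2 routing are assumptions, not the model's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoEBlock(nn.Module):
    """Illustrative top-k sparse MoE feed-forward block (not the actual model code)."""

    def __init__(self, d_model=2048, d_ff=5504, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                       # x: (batch, seq, d_model)
        tokens = x.reshape(-1, x.shape[-1])     # flatten to (num_tokens, d_model)
        logits = self.router(tokens)            # (num_tokens, num_experts)
        weights, idx = torch.topk(F.softmax(logits, dim=-1), self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize over selected experts

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = idx == e                     # which tokens routed to expert e, and in which slot
            if mask.any():
                token_ids, slot = mask.nonzero(as_tuple=True)
                out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape_as(x)
```

Only `top_k` experts run per token, so per-token compute stays close to a dense feed-forward pass while the total parameter count grows with the number of experts.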
Generation parameters can also be tuned for longer or more varied outputs:

```python
# With custom generation config
outputs = model.generate(
    **inputs,
    max_new_tokens=1024,
    temperature=0.8,
    top_p=0.95,
    top_k=50,
    repetition_penalty=1.1,
    do_sample=True
)
```
License: Apache 2.0 (inherited from the base model)
Built with love for the agentic AI community