One of the best models that you can run locally

This is a new model from Alibaba (the company behind Qwen) that can compete with the best in the world, and may even be better than OpenAI, Anthropic, xAI, and Gemini models (see the comparison chart in the attached files).

Make sure you have enough RAM/VRAM to run it.
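For a rough sense of what "enough" means: the card lists about 1000B parameters at 3-bit quantization, so the weights alone come to roughly 375 GB before KV cache and runtime overhead. A back-of-the-envelope sketch (the numbers are estimates, not measured figures):

```python
# Rough memory estimate for loading the 3-bit GGUF weights.
# Assumes ~1000B parameters (from the model card) and ignores
# KV cache and runtime overhead, which add more on top.
params = 1000e9          # total parameters
bits_per_weight = 3.0    # 3-bit quantization (GGUF adds some per-block overhead in practice)
weight_bytes = params * bits_per_weight / 8
print(f"Weights alone: ~{weight_bytes / 1e9:.0f} GB")  # ~375 GB
```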

It is not a thinking model, so it can be used in workflows that rely on MCP, LangChain, and other tooling.
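For example, here is a minimal sketch of loading the GGUF file through LangChain's llama.cpp wrapper (requires the llama-cpp-python and langchain-community packages); the file name, context size, and GPU offload settings are placeholders, not values taken from this repo:

```python
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="Ling-1T-Q3_K_M.gguf",  # hypothetical local path to the downloaded GGUF file
    n_gpu_layers=-1,                   # offload as many layers as the GPU can hold
    n_ctx=8192,                        # context window to allocate
    temperature=0.7,
)

print(llm.invoke("Summarize the trade-offs of 3-bit quantization in two sentences."))
```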

Just run it side by side with other paid LLMs and see for yourself.
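One way to do that is to serve the GGUF file locally (e.g. with llama.cpp's `llama-server`) and send the same prompt to the local endpoint and a hosted model through the OpenAI-compatible client. The endpoint, port, and model names below are assumptions; adjust them to your own setup:

```python
# Same prompt sent to a local llama.cpp server (started separately with
# `llama-server -m <gguf>`) and to a hosted model via the OpenAI client.
from openai import OpenAI

prompt = "Explain mixture-of-experts routing in three sentences."

local = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")
hosted = OpenAI()  # reads OPENAI_API_KEY from the environment

for name, client, model in [("local", local, "ling-1t"), ("hosted", hosted, "gpt-4o")]:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {name} ---\n{reply.choices[0].message.content}\n")
```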

Format: GGUF
Model size: 1000B params
Architecture: bailingmoe2
Quantization: 3-bit