Alibaba’s Qwen team releases an Apache-licensed, open-weight 80B coding model that runs at 3B active cost, processes entire repositories, and rivals or beats closed systems on speed, security, and real-world engineering tasks.
Alibaba’s Qwen team has released Qwen3-Coder-Next, an open-source, Apache 2.0-licensed coding LLM, positioning it as one of the strongest open-weight challengers yet to proprietary systems from OpenAI, Anthropic, Google and xAI. Model weights are publicly available on Hugging Face in four variants, enabling both enterprise and independent commercial use.
Built specifically for repository-level, agentic software engineering, the model combines 80 billion total parameters with an ultra-sparse Mixture-of-Experts design that activates only 3 billion per forward pass. This delivers near-80B reasoning capability at roughly the inference cost of a 3B model, cutting deployment expenses, and the team claims 10× higher throughput than dense rivals.
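How sparse activation keeps compute low is easiest to see in code. The sketch below is a generic top-k routing layer in PyTorch, not Qwen’s implementation; the expert count, hidden sizes and top_k value are illustrative stand-ins, and the real model’s shared experts and load-balancing losses are omitted.

```python
# Illustrative only: generic top-k sparse MoE routing, not Qwen's actual layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model=256, d_ff=512, num_experts=64, top_k=8):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                        # x: (tokens, d_model)
        scores = self.router(x)                  # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # normalise over the chosen experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):           # each token's slot-th chosen expert
            for e in idx[:, slot].unique().tolist():
                mask = idx[:, slot] == e         # tokens routed to expert e in this slot
                w = weights[mask, slot].unsqueeze(-1)
                out[mask] += w * self.experts[e](x[mask])
        return out                               # only top_k of num_experts experts ran per token

x = torch.randn(4, 256)
print(SparseMoE()(x).shape)  # torch.Size([4, 256])
```

Total parameter count grows with the number of experts, but per-token compute grows only with top_k, which is the trade the 80B-total / 3B-active design exploits.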
A 262,144-token context window allows the system to ingest entire Python libraries or large JavaScript frameworks in a single pass. A hybrid stack combining Gated DeltaNet and Gated Attention mitigates the quadratic scaling of standard attention at these lengths, while Best-Fit Packing reduces context hallucination.
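Best-Fit Packing is a data-preparation technique: instead of concatenating documents and chopping them at the sequence boundary, it bin-packs whole documents into fixed-length training sequences so fewer of them are truncated mid-file. A minimal best-fit-decreasing sketch, generic rather than the team’s pipeline, assuming each document already fits within one sequence:

```python
# Illustrative best-fit-decreasing packing: place each document (given by its
# token length) into the fullest sequence that still fits it, else open a new one.
def best_fit_packing(doc_lengths, seq_len=262_144):
    bins = []  # each bin: [remaining_space, [doc lengths packed into it]]
    for length in sorted(doc_lengths, reverse=True):
        candidates = [b for b in bins if b[0] >= length]
        if candidates:
            best = min(candidates, key=lambda b: b[0])  # tightest remaining fit
            best[0] -= length
            best[1].append(length)
        else:
            bins.append([seq_len - length, [length]])
    return [docs for _, docs in bins]

# Example: pack a handful of pre-chunked documents into 262,144-token sequences.
print(best_fit_packing([200_000, 100_000, 90_000, 60_000, 30_000]))
# -> [[200000, 60000], [100000, 90000, 30000]]
```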
Training shifts from static code pairs to a full agentic pipeline: 800,000 verifiable tasks mined from real GitHub pull requests, executed inside live containers with reinforcement learning and unit-test feedback via Alibaba’s MegaFlow Kubernetes infrastructure. The approach teaches self-correction and real-time debugging.
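The reward code itself is not published, but the shape of a unit-test-driven signal is straightforward. The sketch below is an assumption-laden stand-in: it presumes a sandboxed checkout where the agent’s patch has already been applied and pytest is the test runner, and it returns a binary pass/fail reward that a reinforcement-learning loop could consume.

```python
# Illustrative only: turning unit-test outcomes into a scalar reward for RL.
# The container/test-harness details here are assumptions, not Qwen's published code.
import subprocess

def unit_test_reward(repo_dir: str, timeout: int = 600) -> float:
    """Run the repository's test suite after the agent's patch has been applied.

    Returns 1.0 only if every test passes; failures, errors and timeouts all
    score 0.0, so the policy is rewarded only for patches that fix the task.
    """
    try:
        proc = subprocess.run(
            ["python", "-m", "pytest", "-q"],
            cwd=repo_dir, capture_output=True, text=True, timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return 0.0
    return 1.0 if proc.returncode == 0 else 0.0
```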
Language coverage expands to 370 programming languages, backed by roughly 600B tokens of repository-level training data and XML-style tool calling optimised for long code generation. Specialist web and UX expert models were distilled back into the core model.
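The exact tags the model emits are not spelled out here, so the format below is an assumption; it simply illustrates why XML-style calls suit long code arguments, which survive as raw element text rather than being escaped inside a JSON string.

```python
# Illustrative only: parsing an XML-style tool call; the tag names are assumed,
# not Qwen3-Coder-Next's documented schema.
import xml.etree.ElementTree as ET

model_output = """
<tool_call>
  <name>write_file</name>
  <parameter name="path">src/app.py</parameter>
  <parameter name="content">def main():
    print("hello, world")
</parameter>
</tool_call>
"""

call = ET.fromstring(model_output.strip())
tool = call.findtext("name")
args = {p.get("name"): p.text for p in call.findall("parameter")}
print(tool, list(args))  # write_file ['path', 'content']
```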
Benchmarks show the model is competitive with closed leaders: 70.6% on SWE-Bench Verified; 61.2% on SecCodeBench, where it outperforms Claude-Opus-4.5; and 56.32% on CWEval.
As the team notes, “Scaling agentic training, rather than model size alone, is a key driver for advancing real-world coding agent capability”. Open source is no longer catching up; it is setting the pace.