THE FUTURE IS HERE

Qwen3-Max vs OpenAI: The Rivalry Just Got Real 🔥

Qwen just put itself on the map as a true OpenAI competitor.
Their new Qwen3-Max series is showing insane progress:

Qwen3-Max-Instruct (already in preview!) rivals top models on SWE-Bench, Tau2-Bench, SuperGPQA, LiveCodeBench, and AIME25.

Qwen3-Max-Thinking (still training) adds tool use + parallel reasoning. In heavy mode, it's close to saturating several of these benchmarks.

What makes it stand out?

Over 1T parameters trained on 36T tokens.

Smart training stack: ChunkFlow for 3x faster long-context training (up to 1M tokens!), PAI-FlashMoE for a ~30% gain in training efficiency, and fault-tolerant infra that keeps trillion-parameter-scale training stable.

API is OpenAI-compatible → you can plug it into existing OpenAI SDK code directly (quick sketch below).

Pricing is competitive: tiers start at $1.20 (input) / $6 (output) per 1M tokens for 0–32K context, with higher rates for context windows up to 252K.
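
Since it speaks the OpenAI API, here's a minimal sketch of dropping it into existing code. The endpoint URL and model id are my assumptions based on Alibaba Cloud's DashScope compatible mode, so check the docs for your region before copying.

# Minimal sketch; the base_url and model id below are assumptions, not official values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DASHSCOPE_API_KEY",  # placeholder key
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",  # assumed endpoint
)

reply = client.chat.completions.create(
    model="qwen3-max-preview",  # assumed model id
    messages=[{"role": "user", "content": "Give me a one-line AI news summary."}],
)
print(reply.choices[0].message.content)

At the entry tier quoted above, a call with ~10K input and ~1K output tokens works out to roughly $0.012 + $0.006 ≈ $0.018.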

This feels like the start of a new heavyweight rivalry in AI.
Which news should I cover next? Drop your thoughts and I’ll tag you in the next roundup.

I’m Louis-François, PhD dropout, now CTO & co-founder at Towards AI. Follow me for tomorrow’s no-BS AI roundup 🚀

#AInews #Qwen #OpenAI #ArtificialIntelligence #AIcommunity #TechUpdate #MachineLearning #AItrends #LLMs #FutureOfAI #DeepLearning #short