
DeepSeek V3.2 Goes Open Source Under MIT License, Matches GPT-5 Performance
DeepSeek releases V3.2 under the MIT license: a 671B-parameter Mixture-of-Experts (MoE) model that matches GPT-5 at one-tenth the cost and achieves gold-medal performance on the IMO and IOI competitions.

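For context on the architecture the summary mentions, here is a minimal sketch of top-k expert routing, the core mechanism of Mixture-of-Experts models. It assumes nothing about DeepSeek's actual implementation: the expert count, hidden sizes, and number of active experts per token below are illustrative placeholders, not V3.2's published configuration.

```python
# Minimal sketch of top-k expert routing in a Mixture-of-Experts layer.
# Illustrative only: dimensions and expert counts are placeholders,
# not DeepSeek V3.2's actual configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int, k: int):
        super().__init__()
        self.k = k
        # Router scores every token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Each token is routed to its k best experts.
        scores = self.router(x)                      # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)   # keep k best experts per token
        weights = F.softmax(weights, dim=-1)         # normalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e             # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot+1] * self.experts[e](x[mask])
        return out

# Example: 8 experts, 2 active per token.
layer = TopKMoE(d_model=64, n_experts=8, k=2)
print(layer(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

Because only k experts run for each token, a sparse model of this kind can hold hundreds of billions of parameters while activating only a small fraction of them per forward pass, which is the general reason MoE models can be far cheaper to serve than dense models of comparable capability.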