GLM-5: Open-Source Mixture-of-Experts Closes the Gap with Frontier Models

Z.ai’s GLM-5 is a 744B-parameter mixture-of-experts model with only 40B parameters active per token. Released under the MIT license, it achieves state-of-the-art open-source performance on reasoning, coding, and agentic benchmarks, rivaling Claude Opus 4.5 and Gemini 3 Pro.
artificial-intelligence
Author

Kabui, Charles

Published

2026-02-16

Keywords

glm-5, mixture-of-experts, open-source-llm, z-ai, agentic-ai, reinforcement-learning