📦 MLX-LM – v0.26.2

📢 AI Enthusiasts! MLX LM v0.26.2 is Here! 📢

Hey everyone! A new update (v0.26.2) has just dropped for the MLX LM project! 🎉

Here’s what’s new and exciting:

  • GLM-4.5 Support: You can now experiment with GLM-4.5 models in MLX LM (see the sketch after this list).
  • Enhanced GLM-4 MoE Support: Improvements to support DWQ quantization, boosting performance.
  • System Prompt Integration: Added a system prompt to the chat script for more controlled interactions.
  • Gemma-3n Bug Fixes: Resolved an issue with the intermediate size configuration and corrected how empty cache initialization is handled.
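
If you want to try the new GLM-4.5 support together with a system prompt, here's a minimal sketch using the standard `load`/`generate` API and the tokenizer's chat template. The 4-bit repo id below is an assumption, so substitute whatever converted checkpoint you're actually using:

```python
# Minimal sketch: load a converted GLM-4.5 checkpoint and generate a reply
# with a system prompt applied through the tokenizer's chat template.
from mlx_lm import load, generate

# NOTE: this repo id is an assumption; point it at any converted model you have.
model, tokenizer = load("mlx-community/GLM-4.5-Air-4bit")

messages = [
    {"role": "system", "content": "You are a concise, helpful assistant."},
    {"role": "user", "content": "Explain DWQ quantization in one paragraph."},
]

# Build the prompt in the model's expected chat format.
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Generate a response (the token budget here is arbitrary).
text = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(text)
```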

A huge thanks to @mzbac, @jussikuosa, and @brchristian for their valuable contributions!

Check out the full changelog for all the details: [v0.26.1…v0.26.2](https://github.com/ml-explore/mlx-lm/compare/v0.26.1...v0.26.2)

Get the latest release here: https://github.com/ml-explore/mlx-lm/releases/tag/v0.26.2

Happy experimenting! 🤖✨

August 6, 2025