Zhipu AI

Available Models

GLM-4.7 — Open-source MoE model with 355B total / 32B active parameters. 200K context window with up to 131K output tokens. Supports three thinking modes (Interleaved, Preserved, Turn-level). Strong coding performance: 73.8% on SWE-bench, with solid multilingual coding support.
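As a rough sketch of how such a model is typically consumed, the snippet below builds a chat-completion request payload. The model identifier `glm-4.7` and the parameter names are assumptions based on common OpenAI-style chat APIs, not confirmed by this page; consult Zhipu AI's official API reference for the actual endpoint, authentication, and any thinking-mode parameters.

```python
# Hedged sketch: constructing an OpenAI-style chat-completion payload
# for GLM-4.7. Model id and field names are assumptions, not taken
# from official Zhipu AI documentation.
import json

def build_chat_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Build a chat-completion payload (model id 'glm-4.7' is assumed)."""
    return {
        "model": "glm-4.7",  # assumed identifier
        "messages": [{"role": "user", "content": prompt}],
        # The page states up to 131K output tokens; cap well below that here.
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Write a function that reverses a string.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint with an API key in the request headers; the exact URL and header format are provider-specific.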