Z.ai: GLM 4.5
z-ai/glm-4.5 · Jul 25, 2025 · 131.1K context · 98.3K max output · $0.60/M in · $2.20/M out · Reasoning
Description
GLM-4.5 is our latest flagship foundation model, purpose-built for agent-based applications. It uses a Mixture-of-Experts (MoE) architecture and supports a context length of up to 128k tokens. GLM-4.5 delivers significantly enhanced capabilities in reasoning, code generation, and agent alignment. It supports a hybrid inference mode with two options: a "thinking mode" designed for complex reasoning and tool use, and a "non-thinking mode" optimized for instant responses. Users can control this behaviour with the reasoning `enabled` boolean. Learn more in our docs.
Specifications
Provider
z-ai
Context Length
131.1K
Max Output
98.3K
Modality
Input: text
Output: text
Pricing
| Type | Price / 1M tokens |
|---|---|
| Input | $0.60 |
| Output | $2.20 |
| Cache Read | $0.11 |
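The per-million-token rates above can be turned into a simple cost estimate. This is a minimal sketch: the assumption that cached tokens are billed at the cache-read rate instead of the full input rate matches common provider conventions but is not stated explicitly here.

```python
# Rates from the pricing table, converted to $ per single token.
RATE_IN = 0.60 / 1_000_000     # input
RATE_OUT = 2.20 / 1_000_000    # output
RATE_CACHE = 0.11 / 1_000_000  # cache read

def estimate_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Estimated USD cost for one request.

    Assumes cached tokens are a subset of input_tokens and are billed
    at the cache-read rate instead of the full input rate.
    """
    billed_input = input_tokens - cached_tokens
    return billed_input * RATE_IN + output_tokens * RATE_OUT + cached_tokens * RATE_CACHE

# e.g. 10,000 input tokens (2,000 of them cached) and 1,000 output tokens
cost = estimate_cost(10_000, 1_000, cached_tokens=2_000)  # → $0.00722
```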
Quick Start
```shell
curl https://api.ominigate.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-omg-your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "z-ai/glm-4.5",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
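The same request body can be built in Python, here with the reasoning toggle from the description included. The `{"enabled": ...}` shape of the `reasoning` field is an assumption based on the description above; check the docs for the exact parameter name.

```python
import json

# Mirror of the curl example's request body, with reasoning disabled
# for instant responses ("non-thinking mode"). Setting "enabled" to
# True would select the "thinking mode" instead.
payload = {
    "model": "z-ai/glm-4.5",
    "messages": [{"role": "user", "content": "Hello!"}],
    "reasoning": {"enabled": False},  # assumed field shape
}

body = json.dumps(payload)  # JSON string to send as the POST body
```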