Mistral: Mixtral 8x22B Instruct
`mistralai/mixtral-8x22b-instruct` · Released Apr 17, 2024 · 65.5K context · $2.00/M input · $6.00/M output
Description
Mistral's official instruct fine-tuned version of Mixtral 8x22B, a sparse mixture-of-experts model. It activates 39B of its 141B parameters per token, offering strong cost efficiency for its size. Its strengths include:
- strong math, coding, and reasoning
- large context length (64k tokens)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in Mistral's launch announcement.
Specifications
Provider
mistralai
Context Length
65.5K
Max Output
—
Modality
Input: text
Output: text
Pricing
| Type | Price / 1M tokens |
|---|---|
| Input | $2.00 |
| Output | $6.00 |
| Cache Read | $0.20 |
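As a rough illustration of how the per-million-token rates above combine, the following helper (a hypothetical sketch, not part of any official SDK) estimates the cost of a single request:

```python
# Per-million-token rates in USD, taken from the pricing table above.
RATES = {"input": 2.00, "output": 6.00, "cache_read": 0.20}

def estimate_cost(input_tokens: int, output_tokens: int, cache_read_tokens: int = 0) -> float:
    """Estimate the USD cost of one request at this model's rates."""
    total = (
        input_tokens * RATES["input"]
        + output_tokens * RATES["output"]
        + cache_read_tokens * RATES["cache_read"]
    ) / 1_000_000
    return round(total, 6)

# Example: a 10k-token prompt producing a 1k-token completion.
print(estimate_cost(10_000, 1_000))  # 0.026
```

Note that cached input tokens are billed at a tenth of the regular input rate, so prompt caching dominates savings for long, repeated prompts.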
Quick Start
```shell
curl https://api.ominigate.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-omg-your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/mixtral-8x22b-instruct",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
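The same request can be issued from Python using only the standard library. A minimal sketch, assuming the endpoint, key format, and JSON body shown in the curl example above:

```python
import json
import urllib.request

API_URL = "https://api.ominigate.ai/v1/chat/completions"

def build_chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build a POST request mirroring the curl example above."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "sk-omg-your-api-key",
    "mistralai/mixtral-8x22b-instruct",
    [{"role": "user", "content": "Hello!"}],
)
# With a valid key, send the request and read the assistant's reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The response follows the standard chat-completions shape, so the reply text sits under `choices[0].message.content`.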