
Mistral: Mixtral 8x7B Instruct

mistralai/mixtral-8x7b-instruct
Dec 10, 2023 · 32.8K context · 16.4K max output · $0.54/M in · $0.54/M out · Deprecated

This model has been deprecated

This model was deprecated on 2026-05-07 and is no longer available for API calls. Please migrate to an alternative model.

Description

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI for chat and instruction use. Each layer incorporates 8 expert feed-forward networks, of which 2 are routed to each token, giving roughly 47 billion total parameters (about 13 billion active per token).

Instruct model fine-tuned by Mistral. #moe
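To make the sparse-routing idea concrete, here is a toy sketch of top-2 expert routing: a router scores every expert for a token, only the two highest-scoring experts run, and their outputs are mixed by softmax weights. This is an illustrative simplification with made-up scalar "experts" and router scores, not Mistral's implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_layer(token, experts, router_logits, k=2):
    """Route a token through the top-k experts, mixing their outputs
    by softmax weights over the selected experts' router logits."""
    topk = sorted(range(len(experts)),
                  key=lambda i: router_logits[i], reverse=True)[:k]
    weights = softmax([router_logits[i] for i in topk])
    return sum(w * experts[i](token) for w, i in zip(weights, topk))

# Toy setup: 8 scalar "experts" (stand-ins for feed-forward networks)
# and hypothetical router logits for one token.
experts = [lambda x, m=m: m * x for m in range(1, 9)]
logits = [0.1, 2.0, 0.3, 1.5, 0.0, 0.2, 0.4, 0.1]
out = moe_layer(10.0, experts, logits, k=2)
```

Only experts 1 and 3 (logits 2.0 and 1.5) execute here; the other six are skipped entirely, which is why the model's active parameter count per token is far below its total.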

Specifications

Provider
mistralai
Context Length
32.8K
Max Output
16.4K
Modality
In: text
Out: text

Pricing

Type | Price / 1M tokens
Input | $0.54
Output | $0.54

Quick Start

curl https://api.ominigate.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-omg-your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/mixtral-8x7b-instruct",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
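The same request can be built from Python using only the standard library. This sketch assumes the OpenAI-compatible endpoint and `sk-omg-` key format shown in the curl example above; the key is a placeholder, and since the model is deprecated a live call would be rejected, so the example constructs the request without sending it.

```python
import json
import urllib.request

# Assumed from the curl Quick Start above; the key is a placeholder.
API_URL = "https://api.ominigate.ai/v1/chat/completions"
API_KEY = "sk-omg-your-api-key"

def build_request(prompt: str) -> urllib.request.Request:
    """Build the chat-completions request the curl example sends."""
    payload = {
        "model": "mistralai/mixtral-8x7b-instruct",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Hello!")
body = json.loads(req.data)
# To actually send it (against a non-deprecated model):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```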