SecureMistral

Mistral Large

Mistral's most capable model for complex enterprise workloads.

Context Window: 128K
Input / 1M tokens: $2.00
Output / 1M tokens: $6.00
Tier: Secure

Capabilities

Chat: Supported
Code: Supported
Vision: Not supported
Reasoning: Supported
Embedding: Not supported
Image Gen: Not supported

Pricing

Input / 1M tokens: $2.00
Output / 1M tokens: $6.00
Mistral models: input price per 1M tokens

Mistral Large: $2.00
Mistral Small: $0.10
Codestral: $0.30

Provider costs are passed through at the provider's rate, with zero markup. All inference is encrypted in hardware.
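As a sketch of the pricing arithmetic above, here is a small helper (hypothetical, not part of any Zima API) that estimates per-request cost from token counts at the Mistral Large rates:

```python
def estimate_cost_usd(
    input_tokens: int,
    output_tokens: int,
    input_per_m: float = 2.00,   # $ per 1M input tokens (Mistral Large)
    output_per_m: float = 6.00,  # $ per 1M output tokens (Mistral Large)
) -> float:
    """Estimate request cost in USD from token counts and per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_per_m + (
        output_tokens / 1_000_000
    ) * output_per_m


# A 10K-token prompt with a 2K-token completion on Mistral Large:
# 10,000 / 1M * $2.00 + 2,000 / 1M * $6.00 = $0.02 + $0.012 = $0.032
cost = estimate_cost_usd(10_000, 2_000)
```

Swap the rate arguments to price the same request against Mistral Small or Codestral.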

Integration

Drop-in replacement for the OpenAI SDK. Point your base URL to Zima and every request is hardware-encrypted automatically.

TypeScript
import OpenAI from "openai"

const client = new OpenAI({
  apiKey: process.env.ZIMA_KEY,
  baseURL: "https://api.zima.ai/v1",
})

const response = await client.chat.completions.create({
  model: "mistral-large",
  messages: [{ role: "user", content: "Hello" }],
})
Python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["ZIMA_KEY"],
    base_url="https://api.zima.ai/v1",
)

response = client.chat.completions.create(
    model="mistral-large",
    messages=[{"role": "user", "content": "Hello"}],
)