Mistral AI’s le Chat is a multilingual, multimodal AI assistant built for both personal productivity and enterprise-grade workflows.
Le Chat responds at up to ~1000 words per second, enabling fluid conversations and rapid iteration. This speed, powered by Mistral’s low‑latency inference, makes brainstorming, Q&A, and coding assistance feel instantaneous.
Le Chat uses Mistral OCR to parse complex PDFs, tables, math, and multilingual text with leading accuracy. Upload manuals, research papers, scans, and spreadsheets to extract structured, actionable insights.
Run sandboxed code for data analysis, visualization, simulations, and algorithm validation directly in chat. Ideal for analysts and engineers who need quick, reproducible computations without switching tools.
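To illustrate the kind of quick, reproducible computation described above, here is a minimal Python snippet of the sort one might run in a sandboxed interpreter (the dataset and variable names are hypothetical, purely for illustration):

```python
# Hypothetical example: summary statistics for a small sample,
# the kind of quick analysis a sandboxed code interpreter handles.
from statistics import mean, median, pstdev

response_times_ms = [120, 95, 180, 140, 110, 160, 130]  # made-up sample data

summary = {
    "mean": round(mean(response_times_ms), 1),
    "median": median(response_times_ms),
    "stdev": round(pstdev(response_times_ms), 1),
}
print(summary)  # {'mean': 133.6, 'median': 130, 'stdev': 27.1}
```

Because the code runs in chat, the result can be inspected, tweaked, and re-run in the same conversation.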
Combine model knowledge with balanced web search and AFP journalism to get evidence‑based, cited answers. This reduces hallucinations and increases trust for research, planning, and decision‑making.
Opt‑in Memories help le Chat personalize and recall context. Group conversations into Projects to stay organized, and access le Chat anywhere with native iOS and Android apps.
Prices shown are monthly and exclude taxes. Student discounts are available for Pro. Enterprise tier is in private preview for some deployments. Le Chat offers many core features for free with usage limits; higher tiers increase limits and add admin/security features. Mistral OCR API is listed at ~1000 pages per $ (with ~2× more pages per $ via batch inference).
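The listed OCR rate translates to simple cost arithmetic. A sketch, assuming the ~1000 pages per $ figure above (and ~2× with batch inference); actual pricing may differ:

```python
# Illustrative cost arithmetic from the listed rate of ~1000 pages per $
# and ~2x more pages per $ with batch inference. Rates are assumptions
# taken from the pricing note above, not authoritative figures.
PAGES_PER_DOLLAR = 1000
BATCH_MULTIPLIER = 2

def estimated_cost(pages: int, batch: bool = False) -> float:
    """Rough dollar cost of OCR-processing `pages` pages."""
    rate = PAGES_PER_DOLLAR * (BATCH_MULTIPLIER if batch else 1)
    return pages / rate

print(estimated_cost(10_000))              # 10.0
print(estimated_cost(10_000, batch=True))  # 5.0
```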
Mistral AI is a critical partner as we build towards an Agentic‑AI‑Led future. The AI Renewals Agent is just the start of what we can build together with Mistral’s LLMs.
We’re particularly happy to partner with Mistral AI for its strong ability to adapt quickly and drive meaningful results in a highly collaborative way.
With Mistral Large and Codestral, Snowflake is delivering cutting‑edge text‑to‑SQL capabilities for complex and nuanced enterprise data.
Deploying Gen AI models within our infrastructure will ally the latest technology with our strong commitment to security.
Yes. Le Chat Free includes access to Mistral’s latest models, Memories (up to 500), Projects, Connectors directory, Flash Answers with daily limits, image generation, document uploads, and Code Interpreter with limited runs.
Pro raises usage limits and adds capabilities: more messages and web searches, extended Think mode (30×), 5× more Deep Research reports, up to 15 GB document storage, 1,000 Memories, Flash Answers up to ~200/day, unlimited projects, and chat support.
Yes. Free and Pro have daily/usage caps for Flash Answers, image generation, document uploads, and Code Interpreter. Pro significantly increases these limits; Team adds per‑user quotas with admin controls.
Yes. A discounted Pro plan for students is available via the le Chat upgrade page (student verification applies).
Yes. Le Chat Enterprise supports private deployments (on‑prem, VPC, cloud partners), SSO, audit logs, white‑labeling, SCIM, data export, and bespoke integrations—available via custom pricing and expert engagement.
Select capabilities, such as Mistral OCR, are available for self‑hosting on a case‑by‑case basis. Mistral also supports bespoke, privacy‑first deployments across on‑prem, edge, and VPC environments.
For Mistral OCR, the API is offered at roughly 1000 pages per $ (approximately double the pages per dollar with batch inference). Access via la Plateforme; additional API pricing is available in the console.
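A minimal sketch of what an OCR request via la Plateforme might look like. The endpoint path, model identifier, and payload shape used here are assumptions for illustration; verify all of them against the API documentation in the console before use:

```python
# Hypothetical sketch of an OCR request to la Plateforme.
# The base URL, endpoint path, model name, and payload shape are
# assumptions -- confirm them in the API console before relying on this.
import json

API_BASE = "https://api.mistral.ai/v1"  # assumed base URL

def build_ocr_request(api_key: str, document_url: str) -> dict:
    """Assemble the pieces of a hypothetical OCR API call."""
    return {
        "url": f"{API_BASE}/ocr",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "mistral-ocr-latest",  # assumed model identifier
            "document": {"type": "document_url", "document_url": document_url},
        }),
    }

# To actually send it (requires the `requests` package and a real API key):
# import requests
# req = build_ocr_request("YOUR_KEY", "https://example.com/paper.pdf")
# resp = requests.post(req["url"], headers=req["headers"], data=req["body"])
```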
Use le Chat for everyday assistance and uploads; it leverages Mistral OCR for complex documents. For developer workflows, use Code Interpreter within le Chat or explore AI Studio and Mistral Code for enterprise coding assistance.
Join thousands of developers who are already using Mistral to enhance their workflow and productivity.