
---
summary: Model providers (LLMs) supported by OpenClaw
read_when:
  - You want to choose a model provider
  - You need a quick overview of supported LLM backends
title: Model Providers
---

# Model Providers

OpenClaw can use many LLM providers. Pick a provider, authenticate, then set the default model in `provider/model` form.

Looking for chat channel docs (WhatsApp, Telegram, Discord, Slack, Mattermost (plugin), etc.)? See Channels.

## Highlight: Venice (Venice AI)

Venice is our recommended Venice AI setup for privacy-first inference, with the option to switch to Opus for hard tasks.

- Default: `venice/llama-3.3-70b`
- Best overall: `venice/claude-opus-45` (Opus remains the strongest)

See Venice AI.

## Quick start

1. Authenticate with the provider (usually via `openclaw onboard`).
2. Set the default model:

   ```json5
   {
     agents: { defaults: { model: { primary: "anthropic/claude-opus-4-6" } } },
   }
   ```
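The same config shape selects any onboarded provider, not just Anthropic. A minimal sketch, assuming Mistral follows the usual `provider/model` convention (the exact model ID is an assumption here; use whatever ID onboarding offers):

```json5
{
  agents: {
    defaults: {
      model: {
        // any authenticated provider/model pair works as the primary
        primary: "mistral/mistral-large-latest",
      },
    },
  },
}
```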

## Provider docs

## Transcription providers

## Community tools

For the full provider catalog (xAI, Groq, Mistral, etc.) and advanced configuration, see Model providers.