mirror of
https://github.com/moltbot/moltbot.git
synced 2026-04-28 08:52:45 +00:00
docs: prefer setup wizard command
@@ -36,7 +36,7 @@ inside a sandbox workspace under `~/.openclaw/sandboxes`, not your host workspac
 }
 ```

-`openclaw onboard`, `openclaw configure`, or `openclaw setup` will create the
+`openclaw setup --wizard`, `openclaw configure`, or `openclaw setup` will create the
 workspace and seed the bootstrap files if they are missing.
 Sandbox seed copies only accept regular in-workspace files; symlink/hardlink
 aliases that resolve outside the source workspace are ignored.

@@ -15,7 +15,7 @@ For model selection rules, see [/concepts/models](/concepts/models).

 - Model refs use `provider/model` (example: `opencode/claude-opus-4-6`).
 - If you set `agents.defaults.models`, it becomes the allowlist.
-- CLI helpers: `openclaw onboard`, `openclaw models list`, `openclaw models set <provider/model>`.
+- CLI helpers: `openclaw setup --wizard`, `openclaw models list`, `openclaw models set <provider/model>`.
 - Provider plugins can inject model catalogs via `registerProvider({ catalog })`;
   OpenClaw merges that output into `models.providers` before writing
   `models.json`.
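
The allowlist rule described in this hunk can be sketched as a config fragment. This is a hypothetical excerpt, not a verbatim OpenClaw config; it only assumes the `agents.defaults.models` key and the `provider/model` refs shown on this page:

```json5
{
  agents: {
    defaults: {
      // When this map is present, it becomes the allowlist:
      // only the refs listed here remain selectable.
      models: {
        "opencode/claude-opus-4-6": {},
        "openai/gpt-5.4": {},
      },
    },
  },
}
```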

@@ -130,7 +130,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Auth: `OPENAI_API_KEY`
 - Optional rotation: `OPENAI_API_KEYS`, `OPENAI_API_KEY_1`, `OPENAI_API_KEY_2`, plus `OPENCLAW_LIVE_OPENAI_KEY` (single override)
 - Example models: `openai/gpt-5.4`, `openai/gpt-5.4-pro`
-- CLI: `openclaw onboard --auth-choice openai-api-key`
+- CLI: `openclaw setup --wizard --auth-choice openai-api-key`
 - Default transport is `auto` (WebSocket-first, SSE fallback)
 - Override per model via `agents.defaults.models["openai/<model>"].params.transport` (`"sse"`, `"websocket"`, or `"auto"`)
 - OpenAI Responses WebSocket warm-up defaults to enabled via `params.openaiWsWarmup` (`true`/`false`)
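
The per-model transport and warm-up overrides named in this hunk can be sketched as follows. A minimal sketch under the assumption that `params` nests directly under the model entry, as the paths in the bullets suggest; the model id is the example from this page:

```json5
{
  agents: {
    defaults: {
      models: {
        "openai/gpt-5.4": {
          params: {
            // "sse", "websocket", or "auto" (the default: WebSocket-first, SSE fallback)
            transport: "websocket",
            // Responses WebSocket warm-up; documented default is enabled
            openaiWsWarmup: true,
          },
        },
      },
    },
  },
}
```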

@@ -150,7 +150,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Auth: `ANTHROPIC_API_KEY` or `claude setup-token`
 - Optional rotation: `ANTHROPIC_API_KEYS`, `ANTHROPIC_API_KEY_1`, `ANTHROPIC_API_KEY_2`, plus `OPENCLAW_LIVE_ANTHROPIC_KEY` (single override)
 - Example model: `anthropic/claude-opus-4-6`
-- CLI: `openclaw onboard --auth-choice token` (paste setup-token) or `openclaw models auth paste-token --provider anthropic`
+- CLI: `openclaw setup --wizard --auth-choice token` (paste setup-token) or `openclaw models auth paste-token --provider anthropic`
 - Direct API-key models support the shared `/fast` toggle and `params.fastMode`; OpenClaw maps that to Anthropic `service_tier` (`auto` vs `standard_only`)
 - Policy note: setup-token support is technical compatibility; Anthropic has blocked some subscription usage outside Claude Code in the past. Verify current Anthropic terms and decide based on your risk tolerance.
 - Recommendation: Anthropic API key auth is the safer, recommended path over subscription setup-token auth.
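
The `params.fastMode` toggle from this hunk could look like the fragment below. This is a sketch: the nesting mirrors the transport override above, and the direction of the `service_tier` mapping (which boolean yields `auto` vs `standard_only`) is an assumption not stated on this page:

```json5
{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-opus-4-6": {
          params: {
            // Shared /fast toggle; OpenClaw maps this to Anthropic service_tier
            // (assumed here: true → "auto", false → "standard_only")
            fastMode: true,
          },
        },
      },
    },
  },
}
```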

@@ -166,7 +166,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Provider: `openai-codex`
 - Auth: OAuth (ChatGPT)
 - Example model: `openai-codex/gpt-5.4`
-- CLI: `openclaw onboard --auth-choice openai-codex` or `openclaw models auth login --provider openai-codex`
+- CLI: `openclaw setup --wizard --auth-choice openai-codex` or `openclaw models auth login --provider openai-codex`
 - Default transport is `auto` (WebSocket-first, SSE fallback)
 - Override per model via `agents.defaults.models["openai-codex/<model>"].params.transport` (`"sse"`, `"websocket"`, or `"auto"`)
 - Shares the same `/fast` toggle and `params.fastMode` config as direct `openai/*`

@@ -185,7 +185,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Zen runtime provider: `opencode`
 - Go runtime provider: `opencode-go`
 - Example models: `opencode/claude-opus-4-6`, `opencode-go/kimi-k2.5`
-- CLI: `openclaw onboard --auth-choice opencode-zen` or `openclaw onboard --auth-choice opencode-go`
+- CLI: `openclaw setup --wizard --auth-choice opencode-zen` or `openclaw setup --wizard --auth-choice opencode-go`

 ```json5
 {

@@ -200,7 +200,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Optional rotation: `GEMINI_API_KEYS`, `GEMINI_API_KEY_1`, `GEMINI_API_KEY_2`, `GOOGLE_API_KEY` fallback, and `OPENCLAW_LIVE_GEMINI_KEY` (single override)
 - Example models: `google/gemini-3.1-pro-preview`, `google/gemini-3-flash-preview`
 - Compatibility: legacy OpenClaw config using `google/gemini-3.1-flash-preview` is normalized to `google/gemini-3-flash-preview`
-- CLI: `openclaw onboard --auth-choice gemini-api-key`
+- CLI: `openclaw setup --wizard --auth-choice gemini-api-key`

 ### Google Vertex and Gemini CLI

@@ -218,7 +218,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Provider: `zai`
 - Auth: `ZAI_API_KEY`
 - Example model: `zai/glm-5`
-- CLI: `openclaw onboard --auth-choice zai-api-key`
+- CLI: `openclaw setup --wizard --auth-choice zai-api-key`
 - Aliases: `z.ai/*` and `z-ai/*` normalize to `zai/*`

 ### Vercel AI Gateway

@@ -226,14 +226,14 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - Provider: `vercel-ai-gateway`
 - Auth: `AI_GATEWAY_API_KEY`
 - Example model: `vercel-ai-gateway/anthropic/claude-opus-4.6`
-- CLI: `openclaw onboard --auth-choice ai-gateway-api-key`
+- CLI: `openclaw setup --wizard --auth-choice ai-gateway-api-key`

 ### Kilo Gateway

 - Provider: `kilocode`
 - Auth: `KILOCODE_API_KEY`
 - Example model: `kilocode/anthropic/claude-opus-4.6`
-- CLI: `openclaw onboard --kilocode-api-key <key>`
+- CLI: `openclaw setup --wizard --kilocode-api-key <key>`
 - Base URL: `https://api.kilo.ai/api/gateway/`
 - Expanded built-in catalog includes GLM-5 Free, MiniMax M2.5 Free, GPT-5.2, Gemini 3 Pro Preview, Gemini 3 Flash Preview, Grok Code Fast 1, and Kimi K2.5.

@@ -262,13 +262,13 @@ See [/providers/kilocode](/providers/kilocode) for setup details.
 - xAI: `xai` (`XAI_API_KEY`)
 - Mistral: `mistral` (`MISTRAL_API_KEY`)
 - Example model: `mistral/mistral-large-latest`
-- CLI: `openclaw onboard --auth-choice mistral-api-key`
+- CLI: `openclaw setup --wizard --auth-choice mistral-api-key`
 - Groq: `groq` (`GROQ_API_KEY`)
 - Cerebras: `cerebras` (`CEREBRAS_API_KEY`)
 - GLM models on Cerebras use ids `zai-glm-4.7` and `zai-glm-4.6`.
 - OpenAI-compatible base URL: `https://api.cerebras.ai/v1`.
 - GitHub Copilot: `github-copilot` (`COPILOT_GITHUB_TOKEN` / `GH_TOKEN` / `GITHUB_TOKEN`)
-- Hugging Face Inference example model: `huggingface/deepseek-ai/DeepSeek-R1`; CLI: `openclaw onboard --auth-choice huggingface-api-key`. See [Hugging Face (Inference)](/providers/huggingface).
+- Hugging Face Inference example model: `huggingface/deepseek-ai/DeepSeek-R1`; CLI: `openclaw setup --wizard --auth-choice huggingface-api-key`. See [Hugging Face (Inference)](/providers/huggingface).

 ## Providers via `models.providers` (custom/base URL)
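
A custom `models.providers` entry of the kind this heading introduces might look like the fragment below, using the Cerebras OpenAI-compatible base URL and model ids quoted above. The field names (`baseUrl`, `apiKeyEnv`, `models`) are illustrative assumptions, not confirmed OpenClaw schema:

```json5
{
  models: {
    providers: {
      cerebras: {
        // OpenAI-compatible endpoint from the Cerebras notes above
        baseUrl: "https://api.cerebras.ai/v1",
        // Environment variable holding the key (field name assumed)
        apiKeyEnv: "CEREBRAS_API_KEY",
        models: ["zai-glm-4.7", "zai-glm-4.6"],
      },
    },
  },
}
```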

@@ -358,7 +358,7 @@ Volcano Engine (火山引擎) provides access to Doubao and other models in Chin
 - Provider: `volcengine` (coding: `volcengine-plan`)
 - Auth: `VOLCANO_ENGINE_API_KEY`
 - Example model: `volcengine/doubao-seed-1-8-251228`
-- CLI: `openclaw onboard --auth-choice volcengine-api-key`
+- CLI: `openclaw setup --wizard --auth-choice volcengine-api-key`

 ```json5
 {

@@ -391,7 +391,7 @@ BytePlus ARK provides access to the same models as Volcano Engine for internatio
 - Provider: `byteplus` (coding: `byteplus-plan`)
 - Auth: `BYTEPLUS_API_KEY`
 - Example model: `byteplus/seed-1-8-251228`
-- CLI: `openclaw onboard --auth-choice byteplus-api-key`
+- CLI: `openclaw setup --wizard --auth-choice byteplus-api-key`

 ```json5
 {

@@ -422,7 +422,7 @@ Synthetic provides Anthropic-compatible models behind the `synthetic` provider:
 - Provider: `synthetic`
 - Auth: `SYNTHETIC_API_KEY`
 - Example model: `synthetic/hf:MiniMaxAI/MiniMax-M2.5`
-- CLI: `openclaw onboard --auth-choice synthetic-api-key`
+- CLI: `openclaw setup --wizard --auth-choice synthetic-api-key`

 ```json5
 {

@@ -476,7 +476,7 @@ ollama pull llama3.3

 Ollama is detected locally at `http://127.0.0.1:11434` when you opt in with
 `OLLAMA_API_KEY`, and the bundled provider plugin adds Ollama directly to
-`openclaw onboard` and the model picker. See [/providers/ollama](/providers/ollama)
+`openclaw setup --wizard` and the model picker. See [/providers/ollama](/providers/ollama)
 for onboarding, cloud/local mode, and custom configuration.

 ### vLLM

@@ -586,7 +586,7 @@ Notes:
 ## CLI examples

 ```bash
-openclaw onboard --auth-choice opencode-zen
+openclaw setup --wizard --auth-choice opencode-zen
 openclaw models set opencode/claude-opus-4-6
 openclaw models list
 ```

@@ -39,7 +39,7 @@ Related:
 If you don’t want to hand-edit config, run the setup wizard:

 ```bash
-openclaw onboard
+openclaw setup --wizard
 ```

 It can set up model + auth for common providers, including **OpenAI Code (Codex)

@@ -92,7 +92,7 @@ Flow shape:
 2. paste the token into OpenClaw
 3. store as a token auth profile (no refresh)

-The wizard path is `openclaw onboard` → auth choice `setup-token` (Anthropic).
+The wizard path is `openclaw setup --wizard` → auth choice `setup-token` (Anthropic).

 ### OpenAI Codex (ChatGPT OAuth)

@@ -107,7 +107,7 @@ Flow shape (PKCE):
 5. exchange at `https://auth.openai.com/oauth/token`
 6. extract `accountId` from the access token and store `{ access, refresh, expires, accountId }`

-Wizard path is `openclaw onboard` → auth choice `openai-codex`.
+Wizard path is `openclaw setup --wizard` → auth choice `openai-codex`.

 ## Refresh + expiry