docs: restore onboard docs references

Peter Steinberger
2026-03-16 05:50:48 +00:00
parent 2acbea0da7
commit f9e185887f
64 changed files with 328 additions and 219 deletions

View File

@@ -19,11 +19,11 @@ Create your API key in the Anthropic Console.
### CLI setup
```bash
-openclaw setup --wizard
+openclaw onboard
# choose: Anthropic API key
# or non-interactive
-openclaw setup --wizard --anthropic-api-key "$ANTHROPIC_API_KEY"
+openclaw onboard --anthropic-api-key "$ANTHROPIC_API_KEY"
```
### Config snippet
@@ -214,7 +214,7 @@ openclaw models auth paste-token --provider anthropic
```bash
# Paste a setup-token during setup
-openclaw setup --wizard --auth-choice setup-token
+openclaw onboard --auth-choice setup-token
```
### Config snippet (setup-token)

View File

@@ -22,7 +22,7 @@ For Anthropic models, use your Anthropic API key.
1. Set the provider API key and Gateway details:
```bash
-openclaw setup --wizard --auth-choice cloudflare-ai-gateway-api-key
+openclaw onboard --auth-choice cloudflare-ai-gateway-api-key
```
2. Set a default model:
@@ -40,7 +40,7 @@ openclaw setup --wizard --auth-choice cloudflare-ai-gateway-api-key
## Non-interactive example
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--mode local \
--auth-choice cloudflare-ai-gateway-api-key \
--cloudflare-ai-gateway-account-id "your-account-id" \

View File

@@ -15,16 +15,16 @@ models are accessed via the `zai` provider and model IDs like `zai/glm-5`.
```bash
# Coding Plan Global, recommended for Coding Plan users
-openclaw setup --wizard --auth-choice zai-coding-global
+openclaw onboard --auth-choice zai-coding-global
# Coding Plan CN (China region), recommended for Coding Plan users
-openclaw setup --wizard --auth-choice zai-coding-cn
+openclaw onboard --auth-choice zai-coding-cn
# General API
-openclaw setup --wizard --auth-choice zai-global
+openclaw onboard --auth-choice zai-global
# General API CN (China region)
-openclaw setup --wizard --auth-choice zai-cn
+openclaw onboard --auth-choice zai-cn
```
## Config snippet

View File

@@ -21,7 +21,7 @@ title: "Hugging Face (Inference)"
2. Run onboarding and choose **Hugging Face** in the provider dropdown, then enter your API key when prompted:
```bash
-openclaw setup --wizard --auth-choice huggingface-api-key
+openclaw onboard --auth-choice huggingface-api-key
```
3. In the **Default Hugging Face model** dropdown, pick the model you want (the list is loaded from the Inference API when you have a valid token; otherwise a built-in list is shown). Your choice is saved as the default model.
@@ -40,7 +40,7 @@ openclaw setup --wizard --auth-choice huggingface-api-key
## Non-interactive example
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--mode local \
--auth-choice huggingface-api-key \
--huggingface-api-key "$HF_TOKEN"

View File

@@ -15,7 +15,7 @@ Looking for chat channel docs (WhatsApp/Telegram/Discord/Slack/Mattermost (plugi
## Quick start
-1. Authenticate with the provider (usually via `openclaw setup --wizard`).
+1. Authenticate with the provider (usually via `openclaw onboard`).
2. Set the default model:
```json5

View File

@@ -19,7 +19,7 @@ endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work by switc
## CLI setup
```bash
-openclaw setup --wizard --kilocode-api-key <key>
+openclaw onboard --kilocode-api-key <key>
```
Or set the environment variable:

View File

@@ -22,7 +22,7 @@ read_when:
### Via onboarding
```bash
-openclaw setup --wizard --auth-choice litellm-api-key
+openclaw onboard --auth-choice litellm-api-key
```
### Manual setup

View File

@@ -44,7 +44,7 @@ Enable the bundled OAuth plugin and authenticate:
```bash
openclaw plugins enable minimax # skip if already loaded.
openclaw gateway restart # restart if gateway is already running
-openclaw setup --wizard --auth-choice minimax-portal
+openclaw onboard --auth-choice minimax-portal
```
You will be prompted to select an endpoint:

View File

@@ -15,9 +15,9 @@ Mistral can also be used for memory embeddings (`memorySearch.provider = "mistra
## CLI setup
```bash
-openclaw setup --wizard --auth-choice mistral-api-key
+openclaw onboard --auth-choice mistral-api-key
# or non-interactive
-openclaw setup --wizard --mistral-api-key "$MISTRAL_API_KEY"
+openclaw onboard --mistral-api-key "$MISTRAL_API_KEY"
```
## Config snippet (LLM provider)

View File

@@ -13,7 +13,7 @@ model as `provider/model`.
## Quick start (two steps)
-1. Authenticate with the provider (usually via `openclaw setup --wizard`).
+1. Authenticate with the provider (usually via `openclaw onboard`).
2. Set the default model:
```json5

View File

@@ -26,13 +26,13 @@ Current Kimi K2 model IDs:
[//]: # "moonshot-kimi-k2-ids:end"
```bash
-openclaw setup --wizard --auth-choice moonshot-api-key
+openclaw onboard --auth-choice moonshot-api-key
```
Kimi Coding:
```bash
-openclaw setup --wizard --auth-choice kimi-code-api-key
+openclaw onboard --auth-choice kimi-code-api-key
```
Note: Moonshot and Kimi Coding are separate providers. Keys are not interchangeable, endpoints differ, and model refs differ (Moonshot uses `moonshot/...`, Kimi Coding uses `kimi-coding/...`).

View File

@@ -16,7 +16,7 @@ Export the key once, then run onboarding and set an NVIDIA model:
```bash
export NVIDIA_API_KEY="nvapi-..."
-openclaw setup --wizard --auth-choice skip
+openclaw onboard --auth-choice skip
openclaw models set nvidia/nvidia/llama-3.1-nemotron-70b-instruct
```

View File

@@ -21,7 +21,7 @@ Ollama is a local LLM runtime that makes it easy to run open-source models on yo
The fastest way to set up Ollama is through the setup wizard:
```bash
-openclaw setup --wizard
+openclaw onboard
```
Select **Ollama** from the provider list. The wizard will:
@@ -35,7 +35,7 @@ Select **Ollama** from the provider list. The wizard will:
Non-interactive mode is also supported:
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--auth-choice ollama \
--accept-risk
```
@@ -43,7 +43,7 @@ openclaw setup --wizard --non-interactive \
Optionally specify a custom base URL or model:
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--auth-choice ollama \
--custom-base-url "http://ollama-host:11434" \
--custom-model-id "qwen3.5:27b" \
@@ -73,7 +73,7 @@ ollama signin
4. Run onboarding and choose `Ollama`:
```bash
-openclaw setup --wizard
+openclaw onboard
```
- `Local`: local models only

View File

@@ -20,9 +20,9 @@ Get your API key from the OpenAI dashboard.
### CLI setup
```bash
-openclaw setup --wizard --auth-choice openai-api-key
+openclaw onboard --auth-choice openai-api-key
# or non-interactive
-openclaw setup --wizard --openai-api-key "$OPENAI_API_KEY"
+openclaw onboard --openai-api-key "$OPENAI_API_KEY"
```
### Config snippet
@@ -52,7 +52,7 @@ Codex cloud requires ChatGPT sign-in, while the Codex CLI supports ChatGPT or AP
```bash
# Run Codex OAuth in the wizard
-openclaw setup --wizard --auth-choice openai-codex
+openclaw onboard --auth-choice openai-codex
# Or run OAuth directly
openclaw models auth login --provider openai-codex

View File

@@ -21,9 +21,9 @@ provider id `opencode-go` so upstream per-model routing stays correct.
## CLI setup
```bash
-openclaw setup --wizard --auth-choice opencode-go
+openclaw onboard --auth-choice opencode-go
# or non-interactive
-openclaw setup --wizard --opencode-go-api-key "$OPENCODE_API_KEY"
+openclaw onboard --opencode-go-api-key "$OPENCODE_API_KEY"
```
## Config snippet

View File

@@ -22,15 +22,15 @@ as one OpenCode setup.
### Zen catalog
```bash
-openclaw setup --wizard --auth-choice opencode-zen
-openclaw setup --wizard --opencode-zen-api-key "$OPENCODE_API_KEY"
+openclaw onboard --auth-choice opencode-zen
+openclaw onboard --opencode-zen-api-key "$OPENCODE_API_KEY"
```
### Go catalog
```bash
-openclaw setup --wizard --auth-choice opencode-go
-openclaw setup --wizard --opencode-go-api-key "$OPENCODE_API_KEY"
+openclaw onboard --auth-choice opencode-go
+openclaw onboard --opencode-go-api-key "$OPENCODE_API_KEY"
```
## Config snippet

View File

@@ -14,7 +14,7 @@ endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work by switc
## CLI setup
```bash
-openclaw setup --wizard --auth-choice apiKey --token-provider openrouter --token "$OPENROUTER_API_KEY"
+openclaw onboard --auth-choice apiKey --token-provider openrouter --token "$OPENROUTER_API_KEY"
```
## Config snippet

View File

@@ -27,7 +27,7 @@ endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work by switc
## CLI setup
```bash
-openclaw setup --wizard --auth-choice qianfan-api-key
+openclaw onboard --auth-choice qianfan-api-key
```
## Related Documentation

View File

@@ -33,7 +33,7 @@ export SGLANG_API_KEY="sglang-local"
3. Run onboarding and choose `SGLang`, or set a model directly:
```bash
-openclaw setup --wizard
+openclaw onboard
```
```json5

View File

@@ -17,7 +17,7 @@ Synthetic exposes Anthropic-compatible endpoints. OpenClaw registers it as the
2. Run onboarding:
```bash
-openclaw setup --wizard --auth-choice synthetic-api-key
+openclaw onboard --auth-choice synthetic-api-key
```
The default model is set to:

View File

@@ -18,7 +18,7 @@ The [Together AI](https://together.ai) provides access to leading open-source mo
1. Set the API key (recommended: store it for the Gateway):
```bash
-openclaw setup --wizard --auth-choice together-api-key
+openclaw onboard --auth-choice together-api-key
```
2. Set a default model:
@@ -36,7 +36,7 @@ openclaw setup --wizard --auth-choice together-api-key
## Non-interactive example
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--mode local \
--auth-choice together-api-key \
--together-api-key "$TOGETHER_API_KEY"

View File

@@ -58,7 +58,7 @@ export VENICE_API_KEY="vapi_xxxxxxxxxxxx"
**Option B: Interactive Setup (Recommended)**
```bash
-openclaw setup --wizard --auth-choice venice-api-key
+openclaw onboard --auth-choice venice-api-key
```
This will:
@@ -71,7 +71,7 @@ This will:
**Option C: Non-interactive**
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--auth-choice venice-api-key \
--venice-api-key "vapi_xxxxxxxxxxxx"
```

View File

@@ -21,7 +21,7 @@ The [Vercel AI Gateway](https://vercel.com/ai-gateway) provides a unified API to
1. Set the API key (recommended: store it for the Gateway):
```bash
-openclaw setup --wizard --auth-choice ai-gateway-api-key
+openclaw onboard --auth-choice ai-gateway-api-key
```
2. Set a default model:
@@ -39,7 +39,7 @@ openclaw setup --wizard --auth-choice ai-gateway-api-key
## Non-interactive example
```bash
-openclaw setup --wizard --non-interactive \
+openclaw onboard --non-interactive \
--mode local \
--auth-choice ai-gateway-api-key \
--ai-gateway-api-key "$AI_GATEWAY_API_KEY"

View File

@@ -22,9 +22,9 @@ the `xiaomi` provider with a Xiaomi MiMo API key.
## CLI setup
```bash
-openclaw setup --wizard --auth-choice xiaomi-api-key
+openclaw onboard --auth-choice xiaomi-api-key
# or non-interactive
-openclaw setup --wizard --auth-choice xiaomi-api-key --xiaomi-api-key "$XIAOMI_API_KEY"
+openclaw onboard --auth-choice xiaomi-api-key --xiaomi-api-key "$XIAOMI_API_KEY"
```
## Config snippet

View File

@@ -16,16 +16,16 @@ with a Z.AI API key.
```bash
# Coding Plan Global, recommended for Coding Plan users
-openclaw setup --wizard --auth-choice zai-coding-global
+openclaw onboard --auth-choice zai-coding-global
# Coding Plan CN (China region), recommended for Coding Plan users
-openclaw setup --wizard --auth-choice zai-coding-cn
+openclaw onboard --auth-choice zai-coding-cn
# General API
-openclaw setup --wizard --auth-choice zai-global
+openclaw onboard --auth-choice zai-global
# General API CN (China region)
-openclaw setup --wizard --auth-choice zai-cn
+openclaw onboard --auth-choice zai-cn
```
## Config snippet