---
summary: "Run OpenClaw embedded agent turns through the bundled Codex app-server harness"
title: "Codex harness"
read_when:
- You want to use the bundled Codex app-server harness
- You need Codex harness config examples
- You want Codex-only deployments to fail instead of falling back to PI
---

The bundled `codex` plugin lets OpenClaw run embedded OpenAI agent turns
through Codex app-server instead of the built-in PI harness.

Use the Codex harness when you want Codex to own the low-level agent session:
native thread resume, native tool continuation, native compaction, and
app-server execution. OpenClaw still owns chat channels, session files, model
selection, OpenClaw dynamic tools, approvals, media delivery, and the visible
transcript mirror.

The normal setup uses canonical OpenAI model refs such as `openai/gpt-5.5`.
Do not configure `openai-codex/gpt-*` model refs. `openai-codex` is the auth
profile provider for Codex OAuth or Codex API-key profiles, not the model
provider prefix for new agent config.

For the broader model/provider/runtime split, start with
[Agent runtimes](/concepts/agent-runtimes). The short version is:
`openai/gpt-5.5` is the model ref, `codex` is the runtime, and Telegram,
Discord, Slack, or another channel remains the communication surface.

## Requirements

- OpenClaw with the bundled `codex` plugin available.
- If your config uses `plugins.allow`, include `codex`.
- Codex app-server `0.125.0` or newer. The bundled plugin manages a compatible
  Codex app-server binary by default, so local `codex` commands on `PATH` do
  not affect normal harness startup.
- Codex auth available through `openclaw models auth login --provider openai-codex`,
  an app-server account in the agent's Codex home, or an explicit Codex API-key
  auth profile.

For auth precedence, environment isolation, custom app-server commands, model
discovery, and all config fields, see
[Codex harness reference](/plugins/codex-harness-reference).

## Quickstart

Most users who want Codex in OpenClaw should take this path: sign in with a
ChatGPT/Codex subscription, enable the bundled `codex` plugin, and use a
canonical `openai/gpt-*` model ref.

Sign in with Codex OAuth:

```bash
openclaw models auth login --provider openai-codex
```

Enable the bundled `codex` plugin and select an OpenAI agent model:

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
      },
    },
  },
  agents: {
    defaults: {
      model: "openai/gpt-5.5",
    },
  },
}
```

If your config uses `plugins.allow`, add `codex` there too:

```json5
{
  plugins: {
    allow: ["codex"],
    entries: {
      codex: {
        enabled: true,
      },
    },
  },
}
```

Restart the gateway after changing plugin config. If an existing chat already
has a session, use `/new` or `/reset` before testing runtime changes so the
next turn resolves the harness from current config.

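For example, after a config change and a gateway restart, a quick check in the
affected chat looks like this (both commands are the chat commands described on
this page):

```text
/new
/status
```

`/status` should then report the runtime you expect for the next turn.
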
## Configuration

The quickstart config is the minimum viable Codex harness config. Set Codex
harness options in OpenClaw config, and use the CLI only for Codex auth:

| Need | Set | Where |
| --- | --- | --- |
| Enable the harness | `plugins.entries.codex.enabled: true` | OpenClaw config |
| Keep an allowlisted plugin install | Include `codex` in `plugins.allow` | OpenClaw config |
| Route OpenAI agent turns through Codex | `agents.defaults.model` or `agents.list[].model` as `openai/gpt-*` | OpenClaw agent config |
| Sign in with Codex OAuth | `openclaw models auth login --provider openai-codex` | CLI auth profile |
| Fail closed when Codex is unavailable | Provider or model `agentRuntime.id: "codex"` | OpenClaw model/provider config |
| Use direct OpenAI API traffic | Provider or model `agentRuntime.id: "pi"` with normal OpenAI auth | OpenClaw model/provider config |
| Tune app-server behavior | `plugins.entries.codex.config.appServer.*` | Codex plugin config |
| Enable native Codex plugin apps | `plugins.entries.codex.config.codexPlugins.*` | Codex plugin config |
| Enable Codex Computer Use | `plugins.entries.codex.config.computerUse.*` | Codex plugin config |

Use `openai/gpt-*` model refs for Codex-backed OpenAI agent turns.
`openai-codex` is only the auth-profile provider name for Codex OAuth and
Codex API-key profiles. Do not write new `openai-codex/gpt-*` model refs.

The rest of this page covers common variants users must choose between:
deployment shape, fail-closed routing, guardian approval policy, native Codex
plugins, and Computer Use. For full option lists, defaults, enums, discovery,
environment isolation, timeouts, and app-server transport fields, see
[Codex harness reference](/plugins/codex-harness-reference).

## Verify Codex runtime

Use `/status` in the chat where you expect Codex. A Codex-backed OpenAI agent
turn shows:

```text
Runtime: OpenAI Codex
```

Then check Codex app-server state:

```text
/codex status
/codex models
```

`/codex status` reports app-server connectivity, account, rate limits, MCP
servers, and skills. `/codex models` lists the live Codex app-server catalog
for the harness and account. If `/status` is surprising, see
[Troubleshooting](#troubleshooting).

## Routing and model selection

Keep provider refs and runtime policy separate:

- Use `openai/gpt-*` for OpenAI agent turns through Codex.
- Do not use `openai-codex/gpt-*` in config. Run `openclaw doctor --fix` to
  repair legacy refs and stale session route pins.
- `agentRuntime.id: "codex"` is optional for normal OpenAI auto mode, but
  useful when a deployment should fail closed if Codex is unavailable.
- `agentRuntime.id: "pi"` opts a provider or model into direct PI behavior
  when that is intentional.
- `/codex ...` controls native Codex app-server conversations from chat.
- ACP/acpx is a separate external harness path. Use it only when the user asks
  for ACP/acpx or an external harness adapter.

Common command routing:

| User intent | Use |
| --- | --- |
| Attach the current chat | `/codex bind [--cwd <path>]` |
| Resume an existing Codex thread | `/codex resume <thread-id>` |
| List or filter Codex threads | `/codex threads [filter]` |
| Send Codex feedback only | `/codex diagnostics [note]` |
| Start an ACP/acpx task | ACP/acpx session commands, not `/codex` |

| Use case | Configure | Verify | Notes |
| --- | --- | --- | --- |
| ChatGPT/Codex subscription with native Codex runtime | `openai/gpt-*` plus enabled `codex` plugin | `/status` shows `Runtime: OpenAI Codex` | Recommended path |
| Fail closed if Codex is unavailable | Provider or model `agentRuntime.id: "codex"` | Turn fails instead of PI fallback | Use for Codex-only deployments |
| Direct OpenAI API-key traffic through PI | Provider or model `agentRuntime.id: "pi"` and normal OpenAI auth | `/status` shows PI runtime | Use only when PI is intentional |
| Legacy config | `openai-codex/gpt-*` | `openclaw doctor --fix` rewrites it | Do not write new config this way |
| ACP/acpx Codex adapter | ACP `sessions_spawn({ runtime: "acp" })` | ACP task/session status | Separate from native Codex harness |

`agents.defaults.imageModel` follows the same prefix split. Use `openai/gpt-*`
for the normal OpenAI route and `codex/gpt-*` only when image understanding
should run through a bounded Codex app-server turn. Do not use
`openai-codex/gpt-*`; doctor rewrites that legacy prefix to `openai/gpt-*`.

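As a minimal sketch of that split (assuming `gpt-5.5` is the image-capable
model in your deployment):

```json5
{
  agents: {
    defaults: {
      model: "openai/gpt-5.5",
      // Normal OpenAI route for image understanding:
      imageModel: "openai/gpt-5.5",
      // Only when image understanding should run through a bounded
      // Codex app-server turn:
      // imageModel: "codex/gpt-5.5",
    },
  },
}
```
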
## Deployment patterns

### Basic Codex deployment

Use the quickstart config when all OpenAI agent turns should use Codex by
default.

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
      },
    },
  },
  agents: {
    defaults: {
      model: "openai/gpt-5.5",
    },
  },
}
```

### Mixed provider deployment

This shape keeps Claude as the default agent and adds a named Codex agent:

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
      },
    },
  },
  agents: {
    defaults: {
      model: "anthropic/claude-opus-4-6",
    },
    list: [
      {
        id: "main",
        default: true,
        model: "anthropic/claude-opus-4-6",
      },
      {
        id: "codex",
        name: "Codex",
        model: "openai/gpt-5.5",
      },
    ],
  },
}
```

With this config, the `main` agent uses its normal provider path and the
`codex` agent uses Codex app-server.

### Fail-closed Codex deployment

For OpenAI agent turns, `openai/gpt-*` already resolves to Codex when the
bundled plugin is available. Add explicit runtime policy when you want the
fail-closed rule written down in config:

```json5
{
  models: {
    providers: {
      openai: {
        agentRuntime: {
          id: "codex",
        },
      },
    },
  },
  agents: {
    defaults: {
      model: "openai/gpt-5.5",
    },
  },
  plugins: {
    entries: {
      codex: {
        enabled: true,
      },
    },
  },
}
```

With Codex forced, OpenClaw fails early if the Codex plugin is disabled, the
app-server is too old, or the app-server cannot start.

## App-server policy

By default, the plugin starts OpenClaw's managed Codex binary locally with
stdio transport. Set `appServer.command` only when you intentionally want to
run a different executable. Use WebSocket transport only when an app-server is
already running elsewhere:

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
        config: {
          appServer: {
            transport: "websocket",
            url: "ws://gateway-host:39175",
            authToken: "${CODEX_APP_SERVER_TOKEN}",
          },
        },
      },
    },
  },
}
```

Local stdio app-server sessions default to the trusted local operator posture:
`approvalPolicy: "never"`, `approvalsReviewer: "user"`, and
`sandbox: "danger-full-access"`. If local Codex requirements disallow that
implicit YOLO posture, OpenClaw selects allowed guardian permissions instead.

Use guardian mode when you want Codex native auto-review before sandbox
escapes or extra permissions:

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
        config: {
          appServer: {
            mode: "guardian",
            serviceTier: "priority",
          },
        },
      },
    },
  },
}
```

Guardian mode expands to Codex app-server approvals, usually
`approvalPolicy: "on-request"`, `approvalsReviewer: "auto_review"`, and
`sandbox: "workspace-write"` when the local requirements allow those values.

For every app-server field, auth order, environment isolation, discovery, and
timeout behavior, see
[Codex harness reference](/plugins/codex-harness-reference).

## Commands and diagnostics

The bundled plugin registers `/codex` as a slash command on any channel that
supports OpenClaw text commands.

Common forms:

- `/codex status` checks app-server connectivity, models, account, rate
  limits, MCP servers, and skills.
- `/codex models` lists live Codex app-server models.
- `/codex threads [filter]` lists recent Codex app-server threads.
- `/codex resume <thread-id>` attaches the current OpenClaw session to an
  existing Codex thread.
- `/codex compact` asks Codex app-server to compact the attached thread.
- `/codex review` starts Codex native review for the attached thread.
- `/codex diagnostics [note]` asks before sending Codex feedback for the
  attached thread.
- `/codex account` shows account and rate-limit status.
- `/codex mcp` lists Codex app-server MCP server status.
- `/codex skills` lists Codex app-server skills.

For most support reports, start with `/diagnostics [note]` in the conversation
where the bug happened. It creates one Gateway diagnostics report and, for
Codex harness sessions, asks for approval to send the relevant Codex feedback
bundle. See [Diagnostics export](/gateway/diagnostics) for the privacy model
and group chat behavior.

Use `/codex diagnostics [note]` only when you specifically want the Codex
feedback upload for the currently attached thread without the full Gateway
diagnostics bundle.

### Inspect Codex threads locally

The fastest way to inspect a bad Codex run is often to open the native Codex
thread directly:

```bash
codex resume <thread-id>
```

Get the thread id from the completed `/diagnostics` reply, `/codex binding`,
or `/codex threads [filter]`.

For upload mechanics and runtime-level diagnostics boundaries, see
[Codex harness runtime](/plugins/codex-harness-runtime#codex-feedback-upload).

## Native Codex plugins

Native Codex plugin support uses Codex app-server's own app and plugin
capabilities in the same Codex thread as the OpenClaw harness turn. OpenClaw
does not translate Codex plugins into synthetic `codex_plugin_*` OpenClaw
dynamic tools.

`codexPlugins` affects only sessions that select the native Codex harness. It
has no effect on PI runs, normal OpenAI provider runs, ACP conversation
bindings, or other harnesses.

Minimal migrated config:

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
        config: {
          codexPlugins: {
            enabled: true,
            allow_destructive_actions: false,
            plugins: {
              "google-calendar": {
                enabled: true,
                marketplaceName: "openai-curated",
                pluginName: "google-calendar",
              },
            },
          },
        },
      },
    },
  },
}
```

Thread app config is computed when OpenClaw establishes a Codex harness
session or replaces a stale Codex thread binding. It is not recomputed on
every turn. After changing `codexPlugins`, use `/new`, `/reset`, or restart
the gateway so future Codex harness sessions start with the updated app set.

For migration eligibility, app inventory, destructive action policy,
elicitations, and native plugin diagnostics, see
[Native Codex plugins](/plugins/codex-native-plugins).

## Computer Use

Computer Use is covered in its own setup guide:
[Codex Computer Use](/plugins/codex-computer-use).

The short version: OpenClaw does not vendor the desktop-control app or execute
desktop actions itself. It prepares Codex app-server, verifies that the
`computer-use` MCP server is available, and then lets Codex own the native MCP
tool calls during Codex-mode turns.

## Runtime boundaries

The Codex harness changes the low-level embedded agent executor only.

- OpenClaw dynamic tools are supported. Codex asks OpenClaw to execute those
  tools, so OpenClaw remains in the execution path.
- Codex-native shell, patch, MCP, and native app tools are owned by Codex.
  OpenClaw can observe or block selected native events through the supported
  relay, but it does not rewrite native tool arguments.
- Codex owns native compaction. OpenClaw keeps a transcript mirror for channel
  history, search, `/new`, `/reset`, and future model or harness switching.
- Media generation, media understanding, TTS, approvals, and messaging-tool
  output continue through the matching OpenClaw provider/model settings.
- `tool_result_persist` applies to OpenClaw-owned transcript tool results, not
  Codex-native tool result records.

For hook layers, supported V1 surfaces, native permission handling, queue
steering, Codex feedback upload mechanics, and compaction details, see
[Codex harness runtime](/plugins/codex-harness-runtime).

## Troubleshooting

**Codex does not appear as a normal `/model` provider:** that is expected for
new configs. Select an `openai/gpt-*` model, enable
`plugins.entries.codex.enabled`, and check whether `plugins.allow` excludes
`codex`.

**OpenClaw uses PI instead of Codex:** make sure the model ref is
`openai/gpt-*` on the official OpenAI provider and that the Codex plugin is
installed and enabled. If you need strict proof while testing, set provider or
model `agentRuntime.id: "codex"`. A forced Codex runtime fails instead of
falling back to PI.

**Legacy `openai-codex/*` config remains:** run `openclaw doctor --fix`.
Doctor rewrites legacy model refs to `openai/*`, removes stale session and
whole-agent runtime pins, and preserves existing auth-profile overrides.

**The app-server is rejected:** use Codex app-server `0.125.0` or newer.
Same-version prereleases or build-suffixed versions such as `0.125.0-alpha.2`
or `0.125.0+custom` are rejected because OpenClaw tests the stable `0.125.0`
protocol floor.

**`/codex status` cannot connect:** check that the bundled `codex` plugin is
enabled, that `plugins.allow` includes it when an allowlist is configured, and
that any custom `appServer.command`, `url`, `authToken`, or headers are valid.

**Model discovery is slow:** lower
`plugins.entries.codex.config.discovery.timeoutMs` or disable discovery. See
[Codex harness reference](/plugins/codex-harness-reference#model-discovery).

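A sketch of a lowered discovery timeout (the `2000` value is an illustration,
not a recommended default; see the reference for actual defaults and the
disable switch):

```json5
{
  plugins: {
    entries: {
      codex: {
        enabled: true,
        config: {
          discovery: {
            // Assumed value: bound discovery at 2 seconds.
            timeoutMs: 2000,
          },
        },
      },
    },
  },
}
```
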
**WebSocket transport fails immediately:** check `appServer.url`,
`authToken`, headers, and that the remote app-server speaks the same Codex
app-server protocol version.

**A non-Codex model uses PI:** that is expected unless provider or model
runtime policy routes it to another harness. Plain non-OpenAI provider refs
stay on their normal provider path in `auto` mode.

**Computer Use is installed but tools do not run:** check
`/codex computer-use status` from a fresh session. If a tool reports
`Native hook relay unavailable`, use `/new` or `/reset`; if it persists,
restart the gateway to clear stale native hook registrations. See
[Codex Computer Use](/plugins/codex-computer-use#troubleshooting).

## Related

- [Codex harness reference](/plugins/codex-harness-reference)
- [Codex harness runtime](/plugins/codex-harness-runtime)
- [Native Codex plugins](/plugins/codex-native-plugins)
- [Codex Computer Use](/plugins/codex-computer-use)
- [Agent runtimes](/concepts/agent-runtimes)
- [Model providers](/concepts/model-providers)
- [OpenAI provider](/providers/openai)
- [Agent harness plugins](/plugins/sdk-agent-harness)
- [Plugin hooks](/plugins/hooks)
- [Diagnostics export](/gateway/diagnostics)
- [Status](/cli/status)
- [Testing](/help/testing-live#live-codex-app-server-harness-smoke)