---
summary: "Sign in to GitHub Copilot from OpenClaw using the device flow"
read_when:
- You want to use GitHub Copilot as a model provider
- You need the `openclaw models auth login-github-copilot` flow
title: "GitHub Copilot"
---
# GitHub Copilot
GitHub Copilot is GitHub's AI coding assistant. It provides access to Copilot
models for your GitHub account and plan. OpenClaw can use Copilot as a model
provider in two different ways.
## Two ways to use Copilot in OpenClaw
### Device login (default)
Use the native device-login flow to obtain a GitHub token; OpenClaw exchanges it
for Copilot API tokens at runtime. This is the **default** and simplest path
because it does not require VS Code.
```bash
openclaw models auth login-github-copilot
```
You will be prompted to visit a URL and enter a one-time code. Keep the
terminal open until the flow completes. Then select a Copilot model:
```bash
openclaw models set github-copilot/gpt-4o
```
Or in config:
```json5
{
  agents: { defaults: { model: { primary: "github-copilot/gpt-4o" } } },
}
```
### Copilot Proxy (VS Code bridge)
Use the **Copilot Proxy** VS Code extension as a local bridge. OpenClaw talks to
the proxy's `/v1` endpoint and uses the model list you configure there.
Choose this when you already run Copilot Proxy in VS Code or need to route
through it. You must enable the plugin and keep the VS Code extension running.
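For the proxy route, OpenClaw needs to know where the proxy's `/v1` endpoint lives. A hypothetical config sketch (the `models.providers` shape, the `baseUrl` key, and the port are assumptions for illustration, not confirmed OpenClaw config keys; check your Copilot Proxy settings for the actual address):

```json5
{
  // Hypothetical provider entry: key names and port are illustrative only
  models: {
    providers: {
      "github-copilot-proxy": {
        baseUrl: "http://localhost:3000/v1", // Copilot Proxy's local /v1 endpoint
      },
    },
  },
}
```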
## Optional flags
| Flag | Description |
| --------------- | --------------------------------------------------- |
| `--yes` | Skip the confirmation prompt |
| `--set-default` | Also apply the provider's recommended default model |
```bash
# Skip confirmation
openclaw models auth login-github-copilot --yes
# Login and set the default model in one step
openclaw models auth login --provider github-copilot --method device --set-default
```
The device-login flow requires an interactive TTY. Run it directly in a
terminal, not in a non-interactive script or CI pipeline.
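If you wrap the login in a script, you can guard against accidentally running it without a TTY. A small POSIX-sh sketch (the `run_device_login` function name is ours, not part of the CLI):

```shell
# Run the device login only when stdin is an interactive terminal.
run_device_login() {
  if [ -t 0 ]; then
    openclaw models auth login-github-copilot "$@"
  else
    echo "login-github-copilot needs an interactive terminal" >&2
    return 1
  fi
}
```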
Copilot model availability depends on your GitHub plan. If a model is
rejected, try another ID (for example `github-copilot/gpt-4.1`).
Claude model IDs use the Anthropic Messages transport automatically. GPT,
o-series, and Gemini models keep the OpenAI Responses transport. OpenClaw
selects the correct transport based on the model ref.
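The selection rule described above can be sketched as a simple match on the model ref. This is illustrative only, not OpenClaw's actual implementation:

```shell
# Illustrative sketch: pick a transport from the model ref.
# Claude refs get Anthropic Messages; everything else (GPT, o-series,
# Gemini) keeps OpenAI Responses.
transport_for() {
  case "$1" in
    github-copilot/claude-*) echo "anthropic-messages" ;;
    *)                       echo "openai-responses" ;;
  esac
}
```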
## Environment variables
OpenClaw resolves Copilot auth from environment variables in the following
priority order:
| Priority | Variable               | Notes                              |
| -------- | ---------------------- | ---------------------------------- |
| 1        | `COPILOT_GITHUB_TOKEN` | Highest priority, Copilot-specific |
| 2        | `GH_TOKEN`             | GitHub CLI token (fallback)        |
| 3        | `GITHUB_TOKEN`         | Standard GitHub token (lowest)     |
When multiple variables are set, OpenClaw uses the highest-priority one.
The device-login flow (`openclaw models auth login-github-copilot`) stores
its token in the auth profile store and takes precedence over all environment
variables.
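The priority order above can be sketched as a POSIX-sh lookup. The `resolve_copilot_token` helper is our illustration, not a real OpenClaw command:

```shell
# Sketch of the environment-variable fallback chain:
# COPILOT_GITHUB_TOKEN, then GH_TOKEN, then GITHUB_TOKEN.
resolve_copilot_token() {
  if [ -n "${COPILOT_GITHUB_TOKEN:-}" ]; then
    echo "$COPILOT_GITHUB_TOKEN"
  elif [ -n "${GH_TOKEN:-}" ]; then
    echo "$GH_TOKEN"
  else
    echo "${GITHUB_TOKEN:-}"
  fi
}
```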
The login stores a GitHub token in the auth profile store and exchanges it
for a Copilot API token when OpenClaw runs. You do not need to manage the
token manually.
## Related
- Choosing providers, model refs, and failover behavior.
- Auth details and credential reuse rules.