# How to Install OpenAI Codex CLI on a Headless VPS
TL;DR: Codex CLI uses browser-based OAuth login, which doesn’t work on headless servers. The fix? SSH port forwarding. Here’s exactly how to do it.
## The Problem

OpenAI’s Codex CLI is a powerful terminal-based coding agent. But when you try to log in on a headless VPS, you hit a wall:

```bash
codex login
```

It spins up a local OAuth server on `localhost:1455` and asks you to open a URL in your browser. Except your server doesn’t have a browser. And the OAuth callback needs to reach `localhost:1455` — on a machine you can’t browse on.
## Prerequisites
- A VPS with Node.js 22+ installed
- SSH access to your server
- A browser on your local machine
- A ChatGPT Plus subscription (or OpenAI API key)
## Step 1: Install Codex CLI

SSH into your server and install globally:

```bash
ssh root@YOUR_SERVER_IP
npm install -g @openai/codex
```

Verify the installation:

```bash
codex --version
# @openai/codex v0.98.0
```
## Step 2: The SSH Port Forwarding Trick
This is the key. Codex CLI’s OAuth flow starts a local HTTP server on port 1455 to receive the authentication callback. We tunnel that port from your local machine to the server, so the browser redirect actually reaches the CLI.
From your local machine (not the server), open a new terminal:

```bash
ssh -L 1455:localhost:1455 root@YOUR_SERVER_IP
```

This creates a tunnel: port 1455 on your laptop → port 1455 on the server.
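If you do this often, the forward can be baked into `~/.ssh/config` so plain `ssh codex-vps` always brings the tunnel up (the host alias and IP here are placeholders):

```text
Host codex-vps
    HostName YOUR_SERVER_IP
    User root
    LocalForward 1455 localhost:1455
```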
### What’s happening

```text
[Your Browser] → localhost:1455 → [SSH Tunnel] → server:1455 → [Codex OAuth Handler]
```
Your browser thinks it’s talking to a local server. It’s actually talking to your VPS through the tunnel.
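The pattern on the server side is a one-shot loopback HTTP server that waits for the OAuth redirect and captures the `?code=...` parameter. Codex CLI’s actual handler is internal to the tool; this is just an illustrative sketch of the technique:

```python
# Sketch of the loopback OAuth callback pattern: a tiny HTTP server that
# serves exactly one request and captures the authorization code.
# Illustrative only — not Codex CLI's actual implementation.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def wait_for_callback(port=1455):
    captured = {}

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            qs = parse_qs(urlparse(self.path).query)
            captured["code"] = qs.get("code", [None])[0]
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"You can close this tab.")

        def log_message(self, *args):  # keep the terminal quiet
            pass

    with HTTPServer(("127.0.0.1", port), Handler) as server:
        server.handle_request()  # handle exactly one request, then return
    return captured["code"]
```

This is why the tunnel matters: the server above only listens on the VPS’s loopback interface, so the browser redirect can only reach it through the forwarded port.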
## Step 3: Start the Login Flow

In the SSH session (with the tunnel active), run:

```bash
codex login
```

You’ll see something like:

```text
Visit this URL to authenticate:
https://auth.openai.com/authorize?client_id=app_XXXXXXXXXXXX&...
```
Copy that URL and open it in your local browser.
## Step 4: Complete Authentication

- The URL opens OpenAI’s login page
- Sign in with your account (Google, email, etc.)
- Authorize the Codex CLI application
- The browser redirects to `localhost:1455/callback`
- The SSH tunnel forwards the callback to your server
- Codex CLI receives it and saves your tokens
You should see:

```text
Authentication successful!
```
## Step 5: Verify It Works

```bash
codex "say hello and confirm you're working"
```
You should get a response from the model, confirming everything is connected.
## Where Are the Tokens Stored?

Auth credentials are saved at `~/.codex/auth.json`:

```json
{
  "auth_mode": "chatgpt",
  "OPENAI_API_KEY": null,
  "tokens": {
    "id_token": "eyJhbGciOiJSUzI1...",
    "access_token": "eyJhbGciOiJSUzI1...",
    "refresh_token": "rt_XXXXXXXXXXXXXXX...",
    "account_id": "YOUR_ACCOUNT_ID"
  },
  "last_refresh": "2026-02-10T23:26:40Z"
}
```
⚠️ Treat these like passwords. The access token expires (~10 days), but the refresh token can generate new ones.
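Since the access token is a JWT, you can read its expiry directly from the payload segment (standard base64url decoding, no signature check — this inspects the token, it can’t validate it):

```python
# Decode a JWT's payload segment to inspect claims such as "exp".
# This does NOT verify the signature; it is for inspection only.
import base64
import json

def jwt_claims(token: str) -> dict:
    payload = token.split(".")[1]
    # base64url strips padding; restore it before decoding
    payload += "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(payload))
```

`jwt_claims(access_token)["exp"]` gives the Unix timestamp after which the token stops working, so you can see how much of the ~10-day window is left.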
## Using the Access Token

If you need it as an environment variable:

```bash
# With jq
export OPENAI_API_KEY=$(jq -r '.tokens.access_token' ~/.codex/auth.json)

# With Python (if jq isn't installed)
export OPENAI_API_KEY=$(python3 -c "
import json
data = json.load(open('$HOME/.codex/auth.json'))
print(data['tokens']['access_token'])
")
```
To persist across sessions:

```bash
echo 'export OPENAI_API_KEY=$(python3 -c "\
import json; \
data=json.load(open(\"$HOME/.codex/auth.json\")); \
print(data[\"tokens\"][\"access_token\"])")' >> ~/.bashrc
```
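In Python scripts, the same lookup is cleaner as a small helper (this assumes the `auth.json` layout shown above; adjust the keys if OpenAI changes the format):

```python
# Read the Codex access token from auth.json.
# Keys match the file layout shown above.
import json
from pathlib import Path

def codex_access_token(path: str = "~/.codex/auth.json") -> str:
    data = json.loads(Path(path).expanduser().read_text())
    return data["tokens"]["access_token"]
```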
## Non-Interactive Mode

For scripts and automation, use `--full-auto`:

```bash
codex --full-auto "refactor the auth module to use JWT"
```
This runs sandboxed with automatic approval for safe filesystem operations.
## The Big Gotcha: OAuth Token ≠ API Key
This is where we burned an hour so you don’t have to.
The ChatGPT Plus OAuth token from codex login is not the same as an OpenAI API key (sk-...). They’re completely different auth systems.
| | OAuth Token (Codex) | API Key |
|---|---|---|
| Format | JWT (long, `eyJ...`) | `sk-...` (short) |
| Source | `codex login` | platform.openai.com |
| Scope | Codex CLI only | All OpenAI APIs |
| Billing | ChatGPT Plus subscription | Separate pay-as-you-go |
| Expires | ~10 days (auto-refresh) | Never (until revoked) |
| Models | `gpt-5.3-codex` + others via CLI | All API-available models |
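The format difference makes the two easy to tell apart programmatically. The prefixes come from the table above: `eyJ` is simply the base64url encoding of `{"`, so every JWT starts with it, while OpenAI API keys start with `sk-`:

```python
# Rough credential classifier based on the prefixes in the table above.
# "eyJ" is base64url for '{"', the start of every JWT header;
# OpenAI API keys start with "sk-".
def credential_kind(value: str) -> str:
    if value.startswith("eyJ"):
        return "oauth-jwt"
    if value.startswith("sk-"):
        return "api-key"
    return "unknown"
```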
## We Tested Every Endpoint
We tried using the OAuth access token as a Bearer token against the standard OpenAI API. Every single endpoint rejected it:
```bash
# List models
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $CODEX_ACCESS_TOKEN"
# → "Missing scopes: api.model.read"

# Chat completions
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $CODEX_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4.1-mini","messages":[{"role":"user","content":"hello"}]}'
# → "Missing scopes: model.request"

# Responses API
curl https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $CODEX_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4.1-mini","input":"hello"}'
# → "Missing scopes: api.responses.write"
```
The token’s JWT scopes are locked to the Codex CLI client app (app_EMoamEEZ73f0CkXaXp7hrann). OpenAI intentionally restricts it — your ChatGPT Plus subscription covers Codex CLI usage, but API access is a separate billing product.
## What This Means
- Codex CLI works great with the OAuth token — that’s what it’s for
- You cannot reuse this token to call the OpenAI API from your own apps, bots, or scripts
- For API access, you need a real API key from platform.openai.com/api-keys with its own credits
- Models like `gpt-5.3-codex` are only available through Codex CLI, not the standard API
## Workaround: Shell Out to Codex

If you want to use the Codex model from your own app without an API key, you can shell out to the CLI:

```bash
codex --full-auto -q "your prompt here" 2>/dev/null
```
It’s not elegant, but it works with your existing ChatGPT Plus auth. Just be aware of the overhead — each call spawns a new process and establishes a fresh connection.
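In Python, that shell-out is a thin `subprocess` wrapper. A sketch, with the command made injectable so the wrapper itself can be exercised without Codex installed (the default flags are the ones shown above):

```python
# Minimal wrapper around the Codex CLI for programmatic use.
# `cmd` is injectable so the wrapper can be tested with any echo-like binary.
import subprocess

def ask_codex(prompt: str, cmd=("codex", "--full-auto", "-q")) -> str:
    result = subprocess.run(
        [*cmd, prompt],
        capture_output=True,
        text=True,
        check=True,  # raise CalledProcessError if the CLI exits non-zero
    )
    return result.stdout.strip()
```

Remember the overhead noted above: every call spawns a fresh process, so batch your prompts rather than looping over many tiny ones.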
## Troubleshooting

| Issue | Fix |
|---|---|
| `codex login` hangs after showing URL | Check port 1455 isn’t in use: `lsof -i :1455` |
| Browser shows “connection refused” | Verify the SSH tunnel is active with `-L 1455:localhost:1455` |
| “State mismatch” error | Don’t reuse old auth URLs — start a fresh `codex login` |
| Token expired | Codex CLI auto-refreshes via the refresh token; if that fails, run `codex login` again |
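If `lsof` isn’t available on your machine, a quick bind test tells you whether port 1455 is already taken:

```python
# Check whether a TCP port on localhost is free by trying to bind it.
# If bind() raises OSError, something is already listening there.
import socket

def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False
```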
## Summary

1. `npm install -g @openai/codex`
2. `ssh -L 1455:localhost:1455 root@YOUR_SERVER_IP` (from local machine)
3. `codex login` (on server, through the tunnel)
4. Open the auth URL in your local browser
5. Done — tokens saved at `~/.codex/auth.json`
The SSH port forwarding trick works for any CLI tool that needs browser-based OAuth on a headless machine. And remember: the Codex OAuth token stays in its lane — if you need the OpenAI API, get a proper API key.
Tested on Ubuntu 24.04 VPS with Node.js 22 and Codex CLI v0.98.0.