Codex CLI supports custom model providers through ~/.codex/config.toml, so you can route requests to AI Stats without changing your local workflow.
Prerequisites
- An AI Stats API key.
- Codex CLI installed locally.
1) Set your API key

Export the key in the shell you will run Codex from.

macOS/Linux:

```bash
export AI_STATS_API_KEY="your_ai_stats_api_key"
```

Windows (PowerShell):

```powershell
$env:AI_STATS_API_KEY = "your_ai_stats_api_key"
```
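To confirm the key is actually visible to the shell that will launch Codex, a quick POSIX-shell check can help (the variable name comes from the config below; the message strings are my own):

```shell
# Report whether AI_STATS_API_KEY is exported in the current shell.
if [ -z "${AI_STATS_API_KEY:-}" ]; then
  echo "AI_STATS_API_KEY is not set in this shell"
else
  echo "AI_STATS_API_KEY is set"
fi
```

If this prints the "not set" message, re-run the export in the same terminal session before starting Codex.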
2) Configure Codex

Create or update ~/.codex/config.toml:

```toml
model = "openai/gpt-5-3-codex-2026-02-05"
model_provider = "ai_stats"

[model_providers.ai_stats]
name = "AI Stats"
base_url = "https://api.phaseo.app/v1"
env_key = "AI_STATS_API_KEY"
wire_api = "responses"
```
3) Start Codex

Run codex from the same shell where AI_STATS_API_KEY is exported.
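A minimal launch sketch; the guard only exists to give a clearer message when the CLI is not on PATH:

```shell
# Launch Codex from the shell where AI_STATS_API_KEY was exported.
if command -v codex >/dev/null 2>&1; then
  codex
else
  echo "codex not found on PATH; install Codex CLI first" >&2
fi
```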
Verification

- Confirm your model id exists in Models before launching.
- Use the Responses API (wire_api = "responses") for the default Codex-style flow.
- If needed for compatibility, switch wire_api to "chat" and use Chat Completions.
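For reference, the chat-based variant of the provider entry might look like the following; it reuses the same values as the configuration above, with only wire_api changed, and is an illustrative sketch rather than an exhaustive list of options:

```toml
[model_providers.ai_stats]
name = "AI Stats"
base_url = "https://api.phaseo.app/v1"
env_key = "AI_STATS_API_KEY"
wire_api = "chat"
```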
Troubleshooting

- 401 Unauthorized: check that AI_STATS_API_KEY is set in the same shell running codex.
- model_not_found: fetch a current id from Models and update model.
- Provider-specific issues: follow Routing and fallbacks for failover behavior.
Last modified on March 11, 2026