To switch models, change ONLY ~/.codex/config.toml:

```toml
[profiles.glm_proxy]
model = "glm-5"  # or "glm-5.1", "glm-4.7"
```

No proxy restart is needed; just start a new Codex session.

The full ~/.codex/config.toml wiring:

```toml
profile = "glm_proxy"

[model_providers.z_ai_proxy]
name = "z.ai via Local Proxy"
base_url = "http://127.0.0.1:4891/v1"
env_key = "Z_AI_API_KEY"
wire_api = "responses"

[profiles.glm_proxy]
model = "glm-5"
model_provider = "z_ai_proxy"
```

Proxy environment variables:

- `ZAI_API_KEY` - Z.ai API key (required)
- `ZAI_BASE_URL` - upstream API URL (default: `https://api.z.ai/api/coding/paas/v4`)
- `PROXY_PORT` - local port (default: `4891`)
- `LOG_LEVEL` - logging level (default: `INFO`)
- Install path: `/var/home/preston/podman/codex-zai-proxy/`
- Source: https://github.com/erubescent/codex-zai-proxy
- Systemd user service: `systemctl --user restart codex-zai-proxy`
- Container name: `codex-zai-proxy`
- Port: `127.0.0.1:4891`
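Since the proxy only listens on `127.0.0.1:4891`, a quick local liveness probe is often handy after a restart. A minimal sketch, assuming the proxy exposes a `/v1/models` endpoint (an assumption; adjust the path to whatever route the proxy actually serves):

```python
import urllib.request
import urllib.error

def proxy_alive(base_url: str = "http://127.0.0.1:4891/v1",
                timeout: float = 2.0) -> bool:
    """Return True if the local proxy answers HTTP 200 on /models."""
    try:
        # /v1/models is assumed here, not confirmed by the proxy's docs.
        with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, or DNS failure all mean "not alive".
        return False

print(proxy_alive())
```

If this returns `False` while the service claims to be running, check the container logs with `journalctl --user -u codex-zai-proxy`.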
Maintainer: erubescent <hello@noceurmedia.com>