[BUG] Can't log in to instance behind trusted proxy using header auth #418

@Amolith

Description

App Version

2.6.4

Platform

  • Android
  • iOS

How you installed it

GitHub Releases

What happened?

I set an instance up on Exe.dev with the attached prompt. tl;dr it's a basic offline OWUI setup with WEBUI_AUTH_TRUSTED_EMAIL_HEADER=X-ExeDev-Email and ENABLE_SIGNUP=False.

The prompt:

Install Open WebUI (https://github.com/open-webui/open-webui) and run it via Docker Compose with PostgreSQL.
  • ghcr.io/open-webui/open-webui:main with network_mode: host
  • postgres:17 with healthcheck (pg_isready), named volume, on 127.0.0.1:5432:5432

Open WebUI needs host networking to reach the Exe LLM gateway at http://169.254.169.254/gateway/llm/openai/v1 (link-local, only reachable from the host network namespace). PORT=8000 makes it listen directly on the exe.dev proxy port.
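A minimal docker-compose.yml matching the description above might look like this (a sketch, not the exact file from my VM; service names, the volume name, and the password placeholder are illustrative, and the remaining Open WebUI environment variables listed below go under `environment:` as well):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    network_mode: host            # needed to reach the link-local LLM gateway
    environment:
      PORT: "8000"                # listen directly on the exe.dev proxy port
      DATABASE_URL: postgresql://openwebui:<db-password>@127.0.0.1:5432/openwebui
    depends_on:
      postgres:
        condition: service_healthy
  postgres:
    image: postgres:17
    ports:
      - "127.0.0.1:5432:5432"     # bound to loopback only
    environment:
      POSTGRES_USER: openwebui
      POSTGRES_PASSWORD: <db-password>
      POSTGRES_DB: openwebui
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U openwebui"]
      interval: 5s
      timeout: 3s
      retries: 12
volumes:
  pgdata:
```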

Generate a random database password and WEBUI_SECRET_KEY with: openssl rand -hex 30
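For example (a sketch; the env file path is illustrative):

```shell
# Generate 30 random bytes each, hex-encoded to 60 characters.
DB_PASSWORD=$(openssl rand -hex 30)
WEBUI_SECRET_KEY=$(openssl rand -hex 30)

# Write them to an env file for docker compose (illustrative path).
printf 'DB_PASSWORD=%s\nWEBUI_SECRET_KEY=%s\n' \
  "$DB_PASSWORD" "$WEBUI_SECRET_KEY" > /tmp/open-webui.env
```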

Open WebUI environment:

PORT=8000
WEBUI_URL=https://<vm-name>.exe.xyz
WEBUI_SECRET_KEY=<generated>
WEBUI_AUTH_TRUSTED_EMAIL_HEADER=X-ExeDev-Email
ENABLE_SIGNUP=False
ENABLE_EASTER_EGGS=True
HF_HUB_OFFLINE=1
DATABASE_URL=postgresql://openwebui:<db-password>@127.0.0.1:5432/openwebui

Authentication: Use Open WebUI's built-in trusted header auth. exe.dev sends X-ExeDev-Email on every request and always strips client-supplied X-ExeDev headers, so the header is safe to trust. No OIDC proxy or nginx needed. The first user to visit becomes admin. See https://exe.dev/docs/login-with-exe.md for background.
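Why the header is safe to trust can be sketched as the proxy's header handling (a hypothetical illustration, not exe.dev's actual code): drop anything the client sent under X-ExeDev-*, then inject the authenticated identity.

```shell
# Hypothetical sketch: read raw client headers on stdin (one "Name: value"
# per line), drop any client-supplied X-ExeDev-* headers, then append the
# proxy's own authenticated identity.
sanitize_headers() {
  grep -iv '^x-exedev' || true    # strip spoofed headers; tolerate empty result
  printf 'X-ExeDev-Email: %s\n' "$1"
}

# A spoofed X-ExeDev-Email never reaches Open WebUI:
printf 'Host: vm.exe.xyz\nX-ExeDev-Email: attacker@evil.example\n' \
  | sanitize_headers owner@example.com
```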

LLM providers: HF_HUB_OFFLINE=1 disables automatic model downloads, because Exe VMs share resources and running local inference on this VM would waste them. We expect the user to connect OWUI to their own endpoints for features like embedding and TTS that OWUI would otherwise provide with models downloaded from HF. After setup, tell the user how to add the Exe LLM gateway as an OpenAI-compatible connection in Admin Settings → Connections (URL: http://169.254.169.254/gateway/llm/openai/v1, API key: any non-empty value). See https://exe.dev/docs/shelley/llm-gateway.md for available providers (including an Anthropic endpoint). Curl https://github.com/boldsoftware/shelley/raw/refs/heads/main/models/models.go to see which models are available through the gateway and let the user know. Users can also add their own API keys for their own endpoints.

Create a systemd service (open-webui.service) that runs "docker compose -f /opt/open-webui/docker-compose.yml up" (foreground, not -d) so it starts on boot.
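The unit could look like this (a sketch; the Restart policy, ExecStop, and docker binary path are illustrative assumptions):

```ini
# /etc/systemd/system/open-webui.service
[Unit]
Description=Open WebUI (Docker Compose)
After=docker.service network-online.target
Requires=docker.service

[Service]
# Foreground "up" (no -d) so systemd tracks the compose process directly.
ExecStart=/usr/bin/docker compose -f /opt/open-webui/docker-compose.yml up
ExecStop=/usr/bin/docker compose -f /opt/open-webui/docker-compose.yml down
Restart=on-failure

[Install]
WantedBy=multi-user.target
```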

Retry the healthcheck in a loop — postgres init and image pulls can take 60s:

for i in $(seq 1 30); do curl -sf http://localhost:8000/ >/dev/null && break; sleep 2; done
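The same loop, generalized into a reusable helper that reports failure via exit status (a sketch; the function name is mine, not from the prompt):

```shell
# retry TRIES DELAY CMD...: run CMD up to TRIES times, sleeping DELAY
# seconds between attempts; return 0 on first success, 1 if all fail.
retry() {
  tries=$1; delay=$2; shift 2
  i=1
  while [ "$i" -le "$tries" ]; do
    "$@" && return 0
    sleep "$delay"
    i=$((i + 1))
  done
  return 1
}

# e.g.: retry 30 2 curl -sf http://localhost:8000/ >/dev/null
```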

After setup, append the following to /home/exedev/.config/shelley/AGENTS.md, substituting in vm-name:

Open WebUI

Open WebUI (https://github.com/open-webui/open-webui) is running at https://<vm-name>.exe.xyz (http://localhost:8000). The compose file is at /opt/open-webui/docker-compose.yml.

exe.dev trusted header auth logs the VM owner in automatically (first user = admin). ENABLE_SIGNUP=False, so only users with exe.dev access get accounts. The user can share the VM with friends/coworkers via the share feature on the https://exe.dev dashboard or the share CLI command (https://exe.dev/docs/cli-share.md).

LLM Providers

No models are bundled (HF_HUB_OFFLINE=1). Add providers in Admin Settings → Connections:

With that setup in place, logging in to the instance requires first logging in to the Exe proxy. In a browser, this works perfectly so far.

Conduit doesn't like it, however. It correctly follows the redirect to Exe's login page, but never gives me a chance to actually log in before saying "you didn't log in". The screen recording further down demonstrates the behaviour.

What should have happened?

I should stay on the login page until I've logged in.

How to reproduce

  • Set OWUI up on Exe.dev using their proxy for auth and these OWUI settings (or paste the prompt above into the big text area when creating a new VM):
    PORT=8000
    WEBUI_URL=https://<vm-name>.exe.xyz
    WEBUI_AUTH_TRUSTED_EMAIL_HEADER=X-ExeDev-Email
    ENABLE_SIGNUP=False
    
  • Navigate to https://<vm-name>.exe.xyz in a browser, log in to Exe, and notice being automatically logged in to OWUI
  • Paste https://<vm-name>.exe.xyz into Conduit, no messing with advanced settings, and see the login page open then immediately go away

Does this happen every time?

Yes, every time

When did this start?

2.6.4 and the version immediately prior. I first saw the behaviour in the previous version, noticed 2.6.4 was released today, and reproduced it there as well.

Screenshots

I paste the URL to my instance, the login page for the proxy loads, then Conduit navigates back to the initial screen without actually giving me a chance to log in.

screen-20260328-144722-1774730830245.mp4

Logs

From the video

Could not verify OpenWebUI server. The proxy cookies may have expired or be invalid. Please try again.
