
Fix Ollama model resolution without API key#390

Open
ClaireGz wants to merge 2 commits into main from codex/fix-ollama-model-resolution

Conversation

@ClaireGz (Contributor) commented Mar 2, 2026

Summary

  • allow explicit Ollama model selections to resolve without requiring project config or OLLAMA_API_KEY
  • treat OLLAMA_BASE_URL as sufficient to expose Ollama as an env-configured provider
  • add backend tests covering Ollama resolution and env provider detection
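The summary above can be sketched as a simplified, self-contained version of the env-provider detection. This is a hypothetical illustration, not the real `apps/backend/src/utils/llm.ts`: the `ENV` stub, the provider union, and the helper bodies are assumptions based on the names visible in the diff (`hasEnvApiKey`, `getEnvBaseUrl`, `getEnvProviders`).

```typescript
// Hypothetical, simplified sketch of env-based provider detection.
// The real helpers live in apps/backend/src/utils/llm.ts.
type LlmProvider = "openai" | "anthropic" | "ollama";

// Stand-in for process.env: only a base URL is set, no API keys.
const ENV: Record<string, string | undefined> = {
  OLLAMA_BASE_URL: "http://localhost:11434",
};

function hasEnvApiKey(provider: LlmProvider): boolean {
  return !!ENV[`${provider.toUpperCase()}_API_KEY`];
}

function getEnvBaseUrl(provider: LlmProvider): string | undefined {
  return ENV[`${provider.toUpperCase()}_BASE_URL`];
}

function getEnvProviders(providers: LlmProvider[]): LlmProvider[] {
  return providers.filter((provider) => {
    // Ollama typically runs locally and needs no API key, so a
    // configured base URL alone exposes it as an env provider.
    if (provider === "ollama") {
      return !!getEnvBaseUrl(provider);
    }
    return hasEnvApiKey(provider);
  });
}

// With only OLLAMA_BASE_URL set, only "ollama" is detected.
console.log(getEnvProviders(["openai", "anthropic", "ollama"]));
```

Under this sketch, setting `OLLAMA_BASE_URL` without any key is enough for Ollama to appear in the provider list, which is the behavior the PR describes.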

Testing

  • bun test apps/backend/tests/llm.utils.test.ts
  • bun test apps/backend/tests/utils.test.ts

@cubic-dev-ai (bot) left a comment


1 issue found across 2 files

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="apps/backend/src/utils/llm.ts">

<violation number="1" location="apps/backend/src/utils/llm.ts:26">
P2: The Ollama branch in `getEnvProviders()` replaces the `hasEnvApiKey` check entirely with `getEnvBaseUrl`, dropping support for configurations where only `OLLAMA_API_KEY` is set (without `OLLAMA_BASE_URL`). The condition should accept either signal.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

Comment on lines +26 to +27:

```typescript
if (provider === 'ollama') {
	return !!getEnvBaseUrl(provider);
```

@cubic-dev-ai (bot) commented Mar 2, 2026


P2: The Ollama branch in `getEnvProviders()` replaces the `hasEnvApiKey` check entirely with `getEnvBaseUrl`, dropping support for configurations where only `OLLAMA_API_KEY` is set (without `OLLAMA_BASE_URL`). The condition should accept either signal.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At apps/backend/src/utils/llm.ts, line 26:

<comment>The Ollama branch in `getEnvProviders()` replaces the `hasEnvApiKey` check entirely with `getEnvBaseUrl`, dropping support for configurations where only `OLLAMA_API_KEY` is set (without `OLLAMA_BASE_URL`). The condition should accept either signal.</comment>

<file context>

```diff
@@ -22,7 +22,12 @@ export function hasEnvApiKey(provider: LlmProvider): boolean {
 export function getEnvProviders(): LlmProvider[] {
-	return (Object.keys(LLM_PROVIDERS) as LlmProvider[]).filter(hasEnvApiKey);
+	return (Object.keys(LLM_PROVIDERS) as LlmProvider[]).filter((provider) => {
+		if (provider === 'ollama') {
+			return !!getEnvBaseUrl(provider);
+		}
```

</file context>
Suggested change:

```diff
 if (provider === 'ollama') {
-	return !!getEnvBaseUrl(provider);
+	return !!getEnvBaseUrl(provider) || hasEnvApiKey(provider);
```
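The bot's "either signal" suggestion can be isolated into a tiny predicate for illustration. This is a hypothetical sketch, not code from the PR: `isOllamaEnvConfigured` and its `env` parameter are invented names standing in for the real check inside `getEnvProviders()`.

```typescript
// Hypothetical sketch of the suggested "either signal" check:
// Ollama counts as env-configured if OLLAMA_BASE_URL *or*
// OLLAMA_API_KEY is set, covering key-only setups too.
function isOllamaEnvConfigured(
  env: Record<string, string | undefined>,
): boolean {
  return !!env.OLLAMA_BASE_URL || !!env.OLLAMA_API_KEY;
}

console.log(isOllamaEnvConfigured({ OLLAMA_API_KEY: "secret" })); // true
console.log(isOllamaEnvConfigured({ OLLAMA_BASE_URL: "http://localhost:11434" })); // true
console.log(isOllamaEnvConfigured({})); // false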

@Bl3f (Contributor) commented Mar 2, 2026

@socallmebertille why do we need to do this? didn't we already support this in your PR #267?

@socallmebertille (Contributor) commented:

Indeed @Bl3f, #267 covered it!

@MatLBS (Contributor) commented Mar 3, 2026

OK, so we can close this PR?
