Pull Request
This PR introduces full support for local AI models via Ollama, resolving previous connection issues to ensure a seamless offline experience. It also enhances the overall UX by refining the AI settings interface, enforcing correct model configurations for cloud providers, and adding a clear-history button for better chat management.
Description
This PR adds support for Ollama and resolves connection issues to ensure a seamless local AI experience. It also refines the AI settings interface and improves the chat history UX.
How to Use
Install Ollama, then install any model with `ollama run <model_name>`. The app now automatically detects all models installed on your system; select one in the AI settings to start using it.
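For reference, Ollama's local HTTP API lists installed models via `GET http://localhost:11434/api/tags`, which is one way the dynamic detection described above can work. Below is a minimal, hedged sketch that parses a response of that shape; the sample payload and helper name are illustrative, not the PR's actual implementation:

```python
import json

# Sample payload in the shape returned by Ollama's local API
# (GET http://localhost:11434/api/tags lists installed models).
# Model names and sizes here are made up for illustration.
sample_response = json.dumps({
    "models": [
        {"name": "llama3:latest", "size": 4661224676},
        {"name": "mistral:latest", "size": 4113301824},
    ]
})

def installed_model_names(payload: str) -> list[str]:
    """Extract model names from an /api/tags-style response body."""
    return [m["name"] for m in json.loads(payload).get("models", [])]

print(installed_model_names(sample_response))
# → ['llama3:latest', 'mistral:latest']
```

In a real client you would fetch the payload from the running Ollama daemon instead of using a hardcoded sample, then present the returned names in the model picker.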
Screenshots/Videos
Added Support for Ollama:
Dynamic Model Fetching:
Enhanced UX:
Showcase:

Demonstration:
Screen.Recording.2025-12-16.at.4.05.45.PM.mp4
Related Issues
Closes #372