Add support for local Ollama models and unify commit message generation #39
Conversation
This commit adds support for using local Ollama models as an alternative to Anthropic's Claude models. Key changes include:

- Added configuration option to select between Anthropic and Ollama providers
- Implemented new commands for selecting and changing Ollama models
- Created `OllamaManager` class to handle model selection and server communication
- Added `OllamaCommitMessageGenerator` class for generating commit messages using Ollama
- Updated `extension.ts` to support both providers
- Added configuration options for Ollama hostname and model selection
- Updated `package.json` with new commands and configuration properties
… unified class

- Merged separate Ollama generator into main `CommitMessageGenerator` class
- Added constructor overloads to support both Anthropic and Ollama providers
- Implemented provider-specific message generation methods
- Improved error handling for both providers
- Enhanced prompt building with shared logic between providers
- Added response text normalisation to ensure consistent output formatting
- Removed redundant `ollamaCommitMessageGenerator.ts` file
- Updated logging to provide more detailed token usage information
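The response text normalisation mentioned above could be sketched roughly as follows. This is a hypothetical helper, not the extension's actual implementation; it covers the common case where a model wraps the commit message in Markdown code fences or adds stray whitespace:

```typescript
// Hypothetical sketch of response normalisation for generated commit
// messages. Function name and exact rules are assumptions.
export function normalizeCommitMessage(raw: string): string {
  let text = raw.trim()
  // Strip a wrapping code fence such as ```\n...\n``` or ```git\n...\n```
  const fenced = text.match(/^```[a-zA-Z]*\n([\s\S]*?)\n?```$/)
  if (fenced) {
    text = fenced[1].trim()
  }
  // Collapse Windows line endings so output is consistent across providers
  return text.replace(/\r\n/g, "\n")
}
```

Normalising here, rather than in each provider path, keeps the output format identical whether the message came from Anthropic or Ollama.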
… class

- Remove `OllamaCommitMessageGenerator` import and class usage
- Update `OllamaManager` constructor to no longer require context parameter
- Modify `generateCommitMessage` function to use the main `CommitMessageGenerator` for both providers
- Rename command reference from `selectOllamaModel` to `changeOllamaModel` for consistency
- Replace custom fetch implementation with official Ollama client library
- Consolidate model selection logic into a single configurable method
- Improve error handling with more specific error messages
- Enhance user feedback with status bar messages
- Simplify class by removing unnecessary context dependency
- Add convenience methods for initial setup and model changes
- Improve hostname validation with the URL constructor
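Hostname validation via the URL constructor might look something like this sketch (the default host, function name, and error messages here are assumptions; Ollama's conventional default port is 11434):

```typescript
// Hypothetical sketch of hostname validation using the built-in WHATWG
// URL constructor, as described in the commit above.
export function validateOllamaHostname(input: string): string {
  // Fall back to the conventional local Ollama address when empty
  const candidate = input.trim() || "http://localhost:11434"
  let url: URL
  try {
    url = new URL(candidate)
  } catch {
    throw new Error(`Invalid Ollama hostname: ${input}`)
  }
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error(`Unsupported protocol: ${url.protocol}`)
  }
  // url.origin drops any path and trailing slash, e.g.
  // "http://localhost:11434/" -> "http://localhost:11434"
  return url.origin
}
```

Returning `url.origin` also gives a canonical form, which matters for the trailing-slash consistency issue addressed later in this PR.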
This commit adds comprehensive tests to support the Ollama feature addition as an alternative to Anthropic's Claude API:

- Add new `CommitMessageGenerator` class that supports both Anthropic and Ollama providers
- Implement `OllamaManager` for managing Ollama model selection and configuration
- Add extensive test coverage for Ollama integration
- Update configuration handling to support provider selection
- Improve error handling for both Anthropic and Ollama API calls
- Update token usage logging to be more detailed and consistent
- Update model name from `claude-3-5-sonnet-latest` to `claude-sonnet-4-0`
- Reduce default temperature from 0.4 to 0.2 for more consistent results
This commit adds support for using local Ollama models as an alternative to cloud-based Anthropic models:

- Updated package description to mention Ollama support for offline usage
- Added keywords related to Ollama and local/offline AI capabilities
- Renamed command from `diffCommit.selectOllamaModel` to `diffCommit.configureOllamaModel`
- Added Ollama dependency (version 0.5.16) to `package.json`
…for clarity

The commit renames the command from `selectOllamaModel` to `configureOllamaModel` to better reflect its purpose, updating both the command registration and its reference in the subscriptions list.
Renames the command from `selectOllamaModel` to `configureOllamaModel` and adds the corresponding mock function. Updates tests to properly verify that each function is called exactly once instead of assuming both commands use the same underlying function.
- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
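New settings and commands like these land in the extension manifest. A minimal sketch of how the `contributes.configuration` block might look; the defaults and descriptions here are assumptions, not the extension's actual manifest:

```json
{
  "contributes": {
    "configuration": {
      "title": "DiffCommit",
      "properties": {
        "diffCommit.provider": {
          "type": "string",
          "enum": ["anthropic", "ollama"],
          "default": "anthropic",
          "description": "AI provider used to generate commit messages"
        },
        "diffCommit.ollamaHostname": {
          "type": "string",
          "default": "http://localhost:11434",
          "description": "Address of the local Ollama server"
        },
        "diffCommit.ollamaModel": {
          "type": "string",
          "default": "",
          "description": "Ollama model used for commit message generation"
        }
      }
    }
  }
}
```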
Pull Request Overview
This PR adds support for using local Ollama models alongside Anthropic, unifies commit message generation logic, and updates tests and docs accordingly.
- Introduces `diffCommit.provider` for selecting Anthropic or Ollama
- Implements `OllamaManager` with setup/change commands and error handling
- Refactors `CommitMessageGenerator` to handle both providers and updates related tests/docs
Reviewed Changes
Copilot reviewed 17 out of 17 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| test/withProgressAPI.test.ts | Updated progress message from API key validation to configuration |
| test/ollamaManager.test.ts | Added tests for Ollama model config and error scenarios |
| test/messageHandling.test.ts | Mocked default Ollama config in message handling tests |
| test/gitIntegration.test.ts | Aligned default config values (provider/hostname/model) |
| test/gitAndCommands.test.ts | Updated command registration tests for new Ollama commands |
| test/errorHandling.test.ts | Mocked Ollama API errors and added handling tests |
| test/configurationHandling.test.ts | Included provider and Ollama config in configuration tests |
| test/anthropicResponseHandling.test.ts | Adjusted console log assertions for separated token logs |
| src/ollamaManager.ts | New OllamaManager implementation for model setup and error handling |
| src/extension.ts | Integrated Ollama commands and extended commit flow for providers |
| src/configManager.ts | Extended config manager to include provider, hostname, and model |
| src/commitMessageGenerator.ts | Unified generator class for Anthropic and Ollama with prompt builder |
| package.json | Added Ollama dependency, provider config, and updated extension desc |
| README.md | Documented Ollama support, commands, and updated usage instructions |
Comments suppressed due to low confidence (2)
`src/ollamaManager.ts:2`

- Importing `console` is unnecessary since it's a global in Node/VSCode; remove this import and use the global `console` directly.

  `import console from "console"`
`src/commitMessageGenerator.ts:20`

- The overload signatures declare parameters but the implementation signature takes none. Update the implementation to `constructor(...args: any[])` so it matches the overloads.

  `constructor() {`
tsdevau left a comment
OK to merge. This feature adds a new alternate provider, local Ollama models, giving an offline and cost-free option and resolving issue #37.
Refactors the CommitMessageGenerator constructor to use TypeScript's rest parameters with proper type annotations instead of accessing the arguments object directly. This improves type safety and code readability while maintaining the same functionality for both Anthropic and Ollama constructor overloads.
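A minimal sketch of what such a refactor could look like, with constructor overloads whose implementation uses typed rest parameters instead of reading the `arguments` object. The class shape and parameter names are assumptions, not the extension's actual code:

```typescript
type Provider = "anthropic" | "ollama"

// Hypothetical sketch: one class, two constructor overloads, and a
// single typed implementation signature covering both providers.
export class CommitMessageGenerator {
  provider: Provider
  apiKey?: string
  hostname?: string
  model?: string

  // Anthropic provider: needs an API key
  constructor(provider: "anthropic", apiKey: string)
  // Ollama provider: needs a server hostname and a model name
  constructor(provider: "ollama", hostname: string, model: string)
  // Typed rest parameters replace `arguments` while staying compatible
  // with both overloads above
  constructor(provider: Provider, ...args: [string] | [string, string]) {
    this.provider = provider
    if (provider === "anthropic") {
      this.apiKey = args[0]
    } else {
      const [hostname, model] = args as [string, string]
      this.hostname = hostname
      this.model = model
    }
  }
}
```

The union tuple type on `...args` keeps call sites fully checked, whereas `...args: any[]` (the earlier suggestion) would compile but lose type safety inside the constructor body.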
The commit removes the trailing slash from the Ollama server URL in test expectations to ensure consistency in how the server address is referenced. This fixes potential issues with URL handling and ensures that error messages and configuration updates use the same URL format.
… to fix format for RP

This reverts commit 6abd8d1 so it can be resubmitted with the correct conventional commit format for Release Please to parse the changes and generate a new release.
Resolves issue #37

- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
[0.4.0](diff-commit-v0.3.9...diff-commit-v0.4.0) (2025-06-05)

### Features, Additions & Updates

* **models:** add support for local Ollama models ([#41](#41)) ([8d0e942](8d0e942))

### Work in Progress

* (rp) revert PR for "Add support for local Ollama models ([#39](#39))" to fix format for RP ([7528a09](7528a09))

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).