Conversation

@tsdevau tsdevau commented Jun 5, 2025

Resolves issue #37

- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
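
The settings above would be declared under `contributes.configuration` in `package.json`. A minimal sketch of what that contribution could look like (the property names come from this PR; the defaults and descriptions are assumptions, not the extension's actual manifest):

```json
{
  "contributes": {
    "configuration": {
      "properties": {
        "diffCommit.provider": {
          "type": "string",
          "enum": ["anthropic", "ollama"],
          "default": "anthropic",
          "description": "AI provider used to generate commit messages."
        },
        "diffCommit.ollamaHostname": {
          "type": "string",
          "default": "http://localhost:11434",
          "description": "URL of the Ollama server."
        },
        "diffCommit.ollamaModel": {
          "type": "string",
          "description": "Ollama model used to generate commit messages."
        }
      }
    }
  }
}
```

`http://localhost:11434` is Ollama's standard local default, which makes it a plausible fallback for `ollamaHostname`.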

tsdevau added 12 commits June 5, 2025 10:22
This commit adds support for using local Ollama models as an alternative to Anthropic's Claude models. Key changes include:

- Added configuration option to select between Anthropic and Ollama providers
- Implemented new commands for selecting and changing Ollama models
- Created OllamaManager class to handle model selection and server communication
- Added OllamaCommitMessageGenerator class for generating commit messages using Ollama
- Updated extension.ts to support both providers
- Added configuration options for Ollama hostname and model selection
- Updated package.json with new commands and configuration properties
… unified class

- Merged separate Ollama generator into main CommitMessageGenerator class
- Added constructor overloads to support both Anthropic and Ollama providers
- Implemented provider-specific message generation methods
- Improved error handling for both providers
- Enhanced prompt building with shared logic between providers
- Added response text normalisation to ensure consistent output formatting
- Removed redundant ollamaCommitMessageGenerator.ts file
- Updated logging to provide more detailed token usage information
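
The response-text normalisation step mentioned above might look something like this sketch (the helper name and the exact rules are illustrative assumptions, not the extension's actual code):

```typescript
// Hypothetical sketch of the response-text normalisation step.
// The function name and the specific rules are assumptions for illustration.
function normalizeResponse(raw: string): string {
  let text = raw.trim()
  // Models sometimes wrap the commit message in a markdown code fence; unwrap it
  const fenced = text.match(/^```[a-z]*\n([\s\S]*?)\n?```$/)
  if (fenced) {
    text = fenced[1].trim()
  }
  // Normalise Windows line endings so output formatting is consistent
  return text.replace(/\r\n/g, "\n")
}
```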
… class

- Remove `OllamaCommitMessageGenerator` import and class usage
- Update `OllamaManager` constructor to no longer require context parameter
- Modify `generateCommitMessage` function to use the main `CommitMessageGenerator` for both providers
- Rename command reference from `selectOllamaModel` to `changeOllamaModel` for consistency
- Replace custom fetch implementation with official Ollama client library
- Consolidate model selection logic into a single configurable method
- Improve error handling with more specific error messages
- Enhance user feedback with status bar messages
- Simplify class by removing unnecessary context dependency
- Add convenience methods for initial setup and model changes
- Improve hostname validation with URL constructor
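
The URL-constructor validation described in the last bullet can be sketched roughly as follows (the helper name and error message are assumptions; only the URL-constructor technique is taken from the commit):

```typescript
// Sketch of hostname validation via the URL constructor, as described above.
// The helper name and fallback behaviour are assumptions for illustration.
function validateOllamaHostname(input: string): string {
  let url: URL
  try {
    // The URL constructor throws a TypeError on malformed input
    url = new URL(input)
  } catch {
    throw new Error(`Invalid Ollama hostname: ${input}`)
  }
  // origin drops trailing slashes and paths,
  // e.g. "http://localhost:11434/" -> "http://localhost:11434"
  return url.origin
}
```

Returning `url.origin` also explains the later test fix that removes the trailing slash from expected server URLs.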
This commit adds comprehensive tests to support the Ollama feature addition as an alternative to Anthropic's Claude API:

- Add new `CommitMessageGenerator` class that supports both Anthropic and Ollama providers
- Implement `OllamaManager` for managing Ollama model selection and configuration
- Add extensive test coverage for Ollama integration
- Update configuration handling to support provider selection
- Improve error handling for both Anthropic and Ollama API calls
- Update token usage logging to be more detailed and consistent
- Update model name from `claude-3-5-sonnet-latest` to `claude-sonnet-4-0`
- Reduce default temperature from 0.4 to 0.2 for more consistent results
This commit adds support for using local Ollama models as an alternative to cloud-based Anthropic models:

- Updated package description to mention Ollama support for offline usage
- Added keywords related to Ollama and local/offline AI capabilities
- Renamed command from `diffCommit.selectOllamaModel` to `diffCommit.configureOllamaModel`
- Added Ollama dependency (version 0.5.16) to package.json
…for clarity

The commit renames the command from "selectOllamaModel" to "configureOllamaModel" to better reflect its purpose, updating both the command registration and its reference in the subscriptions list.
Renames the command from `selectOllamaModel` to `configureOllamaModel` and adds the corresponding mock function. Updates tests to properly verify that each function is called exactly once instead of assuming both commands use the same underlying function.
- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
@tsdevau tsdevau self-assigned this Jun 5, 2025
@tsdevau tsdevau requested review from Copilot and removed request for Copilot June 5, 2025 12:26
@tsdevau tsdevau marked this pull request as ready for review June 5, 2025 12:28
Copilot AI left a comment
Pull Request Overview

This PR adds support for using local Ollama models alongside Anthropic, unifies commit message generation logic, and updates tests and docs accordingly.

- Introduces `diffCommit.provider` for selecting Anthropic or Ollama
- Implements `OllamaManager` with setup/change commands and error handling
- Refactors `CommitMessageGenerator` to handle both providers and updates related tests/docs

Reviewed Changes

Copilot reviewed 17 out of 17 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| test/withProgressAPI.test.ts | Updated progress message from API key validation to configuration |
| test/ollamaManager.test.ts | Added tests for Ollama model config and error scenarios |
| test/messageHandling.test.ts | Mocked default Ollama config in message handling tests |
| test/gitIntegration.test.ts | Aligned default config values (provider/hostname/model) |
| test/gitAndCommands.test.ts | Updated command registration tests for new Ollama commands |
| test/errorHandling.test.ts | Mocked Ollama API errors and added handling tests |
| test/configurationHandling.test.ts | Included provider and Ollama config in configuration tests |
| test/anthropicResponseHandling.test.ts | Adjusted console log assertions for separated token logs |
| src/ollamaManager.ts | New OllamaManager implementation for model setup and error handling |
| src/extension.ts | Integrated Ollama commands and extended commit flow for providers |
| src/configManager.ts | Extended config manager to include provider, hostname, and model |
| src/commitMessageGenerator.ts | Unified generator class for Anthropic and Ollama with prompt builder |
| package.json | Added Ollama dependency, provider config, and updated extension desc |
| README.md | Documented Ollama support, commands, and updated usage instructions |
Comments suppressed due to low confidence (2)

src/ollamaManager.ts:2

  - Importing `console` is unnecessary since it's a global in Node/VS Code; remove this import and use the global `console` directly.
import console from "console"

src/commitMessageGenerator.ts:20

  - The overload signatures declare parameters but the implementation signature takes none. Update the implementation to `constructor(...args: any[])` so it matches the overloads.
constructor() {

@tsdevau tsdevau left a comment
OK to merge. This adds a new alternate provider, local Ollama models, giving an offline and cost-free option that resolves issue #37.

tsdevau added 4 commits June 5, 2025 22:49
Refactors the CommitMessageGenerator constructor to use TypeScript's rest parameters with proper type annotations instead of accessing the arguments object directly. This improves type safety and code readability while maintaining the same functionality for both Anthropic and Ollama constructor overloads.
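
The overload-plus-rest-parameters pattern described here can be sketched as follows (field names and parameter order are assumptions for illustration; the real class's shape may differ):

```typescript
type Provider = "anthropic" | "ollama"

// Illustrative sketch of constructor overloads backed by a typed rest tuple.
// Property names and parameter order are assumptions, not the actual class.
class CommitMessageGenerator {
  readonly provider: Provider
  readonly model: string
  readonly apiKey?: string
  readonly hostname?: string

  // Overload signatures: one per provider
  constructor(provider: "anthropic", apiKey: string, model: string)
  constructor(provider: "ollama", hostname: string, model: string)
  // Implementation signature uses a typed rest tuple rather than the
  // `arguments` object, so it stays compatible with both overloads
  constructor(...args: [Provider, string, string]) {
    const [provider, credentialOrHost, model] = args
    this.provider = provider
    this.model = model
    if (provider === "anthropic") {
      this.apiKey = credentialOrHost
    } else {
      this.hostname = credentialOrHost
    }
  }
}
```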
The commit removes the trailing slash from the Ollama server URL in test expectations to ensure consistency in how the server address is referenced. This fixes potential issues with URL handling and ensures that error messages and configuration updates use the same URL format.
@tsdevau tsdevau merged commit 6abd8d1 into main Jun 5, 2025
3 checks passed
@tsdevau tsdevau deleted the ollama branch June 5, 2025 13:03
tsdevau added a commit that referenced this pull request Jun 5, 2025
… to fix format for RP

This reverts commit 6abd8d1 to be resubmitted with the correct conventional commit format for Release Please to parse the changes and generate a new release.
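
Release Please only picks up commits whose subjects follow the Conventional Commits format. Judging from the generated changelog entry below, the resubmitted commit presumably used a subject along these lines (the exact scope is an assumption):

```
feat(models): add support for local Ollama models (#41)
```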
tsdevau added a commit that referenced this pull request Jun 5, 2025
Resolves issue #37

- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
tsdevau pushed a commit that referenced this pull request Jun 5, 2025
[0.4.0](diff-commit-v0.3.9...diff-commit-v0.4.0) (2025-06-05)


### Features, Additions & Updates

* **models:** add support for local Ollama models ([#41](#41)) ([8d0e942](8d0e942))


### Work in Progress

* (rp) revert PR for "Add support for local Ollama models ([#39](#39))" to fix format for RP ([7528a09](7528a09))

---
This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).