Conversation
This commit adds support for using local Ollama models as an alternative to Anthropic's Claude models. Key changes include:

- Added configuration option to select between Anthropic and Ollama providers
- Implemented new commands for selecting and changing Ollama models
- Created `OllamaManager` class to handle model selection and server communication
- Added `OllamaCommitMessageGenerator` class for generating commit messages using Ollama
- Updated `extension.ts` to support both providers
- Added configuration options for Ollama hostname and model selection
- Updated `package.json` with new commands and configuration properties
… unified class

- Merged separate Ollama generator into main `CommitMessageGenerator` class
- Added constructor overloads to support both Anthropic and Ollama providers
- Implemented provider-specific message generation methods
- Improved error handling for both providers
- Enhanced prompt building with shared logic between providers
- Added response text normalisation to ensure consistent output formatting
- Removed redundant `ollamaCommitMessageGenerator.ts` file
- Updated logging to provide more detailed token usage information
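The response text normalisation mentioned above could look something like the following sketch. The function name and the exact rules (trimming whitespace, unwrapping a stray markdown fence that models sometimes emit around the commit message) are assumptions, not the extension's actual implementation.

```typescript
// Hypothetical normalisation step: trim surrounding whitespace and unwrap
// a markdown code fence if the model wrapped its answer in one.
function normaliseResponse(text: string): string {
  let out = text.trim()
  // Match a response that is entirely one fenced block, e.g. "```\n...\n```".
  const fence = /^```[a-z]*\n([\s\S]*?)\n```$/m
  const match = out.match(fence)
  if (match) out = match[1].trim()
  return out
}
```

Normalising here means both providers' outputs reach the commit-message box in the same shape, regardless of model quirks.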
… class

- Remove `OllamaCommitMessageGenerator` import and class usage
- Update `OllamaManager` constructor to no longer require context parameter
- Modify `generateCommitMessage` function to use the main `CommitMessageGenerator` for both providers
- Rename command reference from `selectOllamaModel` to `changeOllamaModel` for consistency
- Replace custom fetch implementation with official Ollama client library
- Consolidate model selection logic into a single configurable method
- Improve error handling with more specific error messages
- Enhance user feedback with status bar messages
- Simplify class by removing unnecessary context dependency
- Add convenience methods for initial setup and model changes
- Improve hostname validation with URL constructor
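Hostname validation via the URL constructor, as described in the last bullet, might be sketched like this. The function name, error wording, and use of `url.origin` for normalisation are assumptions for illustration.

```typescript
// A minimal sketch of hostname validation using the URL constructor.
function validateHostname(input: string): string {
  try {
    // URL throws a TypeError on malformed input, which doubles as validation.
    const url = new URL(input)
    // url.origin drops any path and trailing slash, normalising the address.
    return url.origin
  } catch {
    throw new Error(`Invalid Ollama hostname: ${input}`)
  }
}
```

Returning `url.origin` also explains the later test change that removes the trailing slash from expected server URLs: every caller sees one canonical form.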
This commit adds comprehensive tests to support the Ollama feature addition as an alternative to Anthropic's Claude API:

- Add new `CommitMessageGenerator` class that supports both Anthropic and Ollama providers
- Implement `OllamaManager` for managing Ollama model selection and configuration
- Add extensive test coverage for Ollama integration
- Update configuration handling to support provider selection
- Improve error handling for both Anthropic and Ollama API calls
- Update token usage logging to be more detailed and consistent
- Update model name from `claude-3-5-sonnet-latest` to `claude-sonnet-4-0`
- Reduce default temperature from 0.4 to 0.2 for more consistent results
This commit adds support for using local Ollama models as an alternative to cloud-based Anthropic models:

- Updated package description to mention Ollama support for offline usage
- Added keywords related to Ollama and local/offline AI capabilities
- Renamed command from `diffCommit.selectOllamaModel` to `diffCommit.configureOllamaModel`
- Added Ollama dependency (version 0.5.16) to `package.json`
…for clarity

The commit renames the command from `selectOllamaModel` to `configureOllamaModel` to better reflect its purpose, updating both the command registration and its reference in the subscriptions list.
Renames the command from `selectOllamaModel` to `configureOllamaModel` and adds the corresponding mock function. Updates tests to properly verify that each function is called exactly once instead of assuming both commands use the same underlying function.
- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
Pull Request Overview
This PR adds support for using local Ollama models alongside Anthropic, unifies commit message generation logic, and updates tests and docs accordingly.
- Introduces `diffCommit.provider` for selecting Anthropic or Ollama
- Implements `OllamaManager` with setup/change commands and error handling
- Refactors `CommitMessageGenerator` to handle both providers and updates related tests/docs
Reviewed Changes
Copilot reviewed 17 out of 17 changed files in this pull request and generated 3 comments.
Summary per file:
| File | Description |
|---|---|
| test/withProgressAPI.test.ts | Updated progress message from API key validation to configuration |
| test/ollamaManager.test.ts | Added tests for Ollama model config and error scenarios |
| test/messageHandling.test.ts | Mocked default Ollama config in message handling tests |
| test/gitIntegration.test.ts | Aligned default config values (provider/hostname/model) |
| test/gitAndCommands.test.ts | Updated command registration tests for new Ollama commands |
| test/errorHandling.test.ts | Mocked Ollama API errors and added handling tests |
| test/configurationHandling.test.ts | Included provider and Ollama config in configuration tests |
| test/anthropicResponseHandling.test.ts | Adjusted console log assertions for separated token logs |
| src/ollamaManager.ts | New OllamaManager implementation for model setup and error handling |
| src/extension.ts | Integrated Ollama commands and extended commit flow for providers |
| src/configManager.ts | Extended config manager to include provider, hostname, and model |
| src/commitMessageGenerator.ts | Unified generator class for Anthropic and Ollama with prompt builder |
| package.json | Added Ollama dependency, provider config, and updated extension desc |
| README.md | Documented Ollama support, commands, and updated usage instructions |
Comments suppressed due to low confidence (2)
`src/ollamaManager.ts:2`

- Importing `console` is unnecessary since it's a global in Node/VSCode; remove this import and use the global `console` directly: `import console from "console"`
`src/commitMessageGenerator.ts:20`

- The overload signatures declare parameters but the implementation signature (`constructor() {`) takes none. Update the implementation to `constructor(...args: any[])` so it matches the overloads.
Refactors the CommitMessageGenerator constructor to use TypeScript's rest parameters with proper type annotations instead of accessing the arguments object directly. This improves type safety and code readability while maintaining the same functionality for both Anthropic and Ollama constructor overloads.
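A typed rest-parameter implementation of the kind this refactor describes could look like the sketch below. The provider names match the PR, but the parameter shapes and field assignments are illustrative assumptions, not the extension's real API.

```typescript
// Tuple types give the rest parameter proper annotations, unlike `arguments`
// or a bare `...args: any[]`.
type AnthropicArgs = [provider: "anthropic", apiKey: string]
type OllamaArgs = [provider: "ollama", hostname: string, model: string]

class CommitMessageGenerator {
  readonly provider: "anthropic" | "ollama"

  // Overload for the Anthropic provider (hypothetical parameter list).
  constructor(provider: "anthropic", apiKey: string)
  // Overload for the Ollama provider (hypothetical parameter list).
  constructor(provider: "ollama", hostname: string, model: string)
  // One implementation signature covers both overloads via a tuple union.
  constructor(...args: AnthropicArgs | OllamaArgs) {
    this.provider = args[0]
    // Provider-specific fields would be assigned from the remaining tuple
    // elements here (apiKey for Anthropic; hostname and model for Ollama).
  }
}
```

Compared with `constructor(...args: any[])`, the tuple union keeps the compiler checking that each call site matches one of the declared overloads.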
The commit removes the trailing slash from the Ollama server URL in test expectations to ensure consistency in how the server address is referenced. This fixes potential issues with URL handling and ensures that error messages and configuration updates use the same URL format.
tsdevau added a commit that referenced this pull request on Jun 5, 2025
… to fix format for RP

This reverts commit 6abd8d1 to be resubmitted with the correct conventional commit format for Release Please to parse the changes and generate a new release.
tsdevau added a commit that referenced this pull request on Jun 5, 2025
Resolves issue #37

- Added dual provider support with a new `diffCommit.provider` configuration option
- Implemented Ollama integration with model selection and server configuration
- Added new commands for Ollama setup and model switching:
  - `DiffCommit: Configure Ollama Model`
  - `DiffCommit: Change Ollama Model`
- Added new configuration settings:
  - `diffCommit.ollamaHostname` for server connection
  - `diffCommit.ollamaModel` for model selection
- Updated documentation with Ollama requirements and setup instructions
- Enhanced error handling for Ollama-specific scenarios
- Updated workflow documentation to include provider selection
tsdevau pushed a commit that referenced this pull request on Jun 5, 2025
[0.4.0](diff-commit-v0.3.9...diff-commit-v0.4.0) (2025-06-05)

### Features, Additions & Updates

* **models:** add support for local Ollama models ([#41](#41)) ([8d0e942](8d0e942))

### Work in Progress

* (rp) revert PR for "Add support for local Ollama models ([#39](#39))" to fix format for RP ([7528a09](7528a09))

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).