Gerrit customers have asked for the functionality provided by the current chatgpt-code-review-gerrit-plugin without having to send important Intellectual Property to a public service.
During the hackathon it was therefore agreed to take the current ChatGPT Gerrit plugin and create a more generic ai-code-review
plugin that can be used with on-premise, privately hosted, or public AI services.
The starting point was to fork the current ChatGPT plugin into the gerritcodereview section under plugins/ai-code-review.
All classes that are not ChatGPT-specific (which is most of them, including the configuration options) were renamed generically, e.g.:

gptModel -> aiModel
gptToken -> aiToken
A great benefit is that Ollama has now extended its API with an OpenAI-compatible endpoint, so it accepts requests and returns responses in the same format as the ChatGPT (OpenAI) API.
A new concept has been added to the plugin to allow different AI service types to be selected, via a new configuration option called aiType.
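As an illustration, the option would typically live in the plugin's section of gerrit.config. The value OLLAMA and the model name below are assumptions for the sketch; consult the plugin documentation for the actual set of accepted aiType values:

```ini
[plugin "ai-code-review"]
    # Hypothetical values: select a locally hosted Ollama service
    aiType = OLLAMA
    aiModel = llama3.1
```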
The Ollama AI service must be run against a model which supports the "tools" and "functions" extensions, so that the prompt can instruct the model to return the response object in the correct JSON schema. Without this, the default is to return JSON embedded in Markdown, which would require much more processing on the client side. So make sure the Ollama model you choose appears in this list: ollama models (tools-filter)
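To illustrate why tool support matters, here is a minimal sketch of an OpenAI-style chat request that declares a tool with a JSON schema, which is what lets a tools-capable model answer with structured arguments instead of Markdown-wrapped JSON. The function name publish_review and its schema are illustrative assumptions, not the plugin's actual contract:

```python
import json

def build_review_request(model, diff_text):
    # Build an OpenAI-compatible /v1/chat/completions payload.
    # The "tools" entry gives the model a JSON schema to fill in,
    # so the reply arrives as structured tool-call arguments.
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a code reviewer."},
            {"role": "user", "content": diff_text},
        ],
        "tools": [{
            "type": "function",
            "function": {
                # Hypothetical tool name and schema, for illustration only
                "name": "publish_review",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "comments": {"type": "array",
                                     "items": {"type": "string"}},
                        "score": {"type": "integer"},
                    },
                    "required": ["comments", "score"],
                },
            },
        }],
    }

payload = build_review_request("llama3.1", "diff --git a/foo.c b/foo.c ...")
print(json.dumps(payload, indent=2))
```

Without the tools declaration, the same model would tend to wrap its JSON in a Markdown code fence, forcing the client to strip and re-parse it.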
A generic AI service type has also been added, to make it easier to test against a potentially new or not-yet-supported endpoint. It does so by allowing you to specify the authorization header name (key: aiAuthHeaderName) along with a different API endpoint URI to be used for the code review (key: aiChatEndpoint), e.g. /api/chat instead of /v1/chat/completions.
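Putting those two keys together, a generic-endpoint setup might look like the following sketch. The aiType value GENERIC and the header name X-Api-Key are assumptions for illustration; only the aiAuthHeaderName and aiChatEndpoint key names come from the text above:

```ini
[plugin "ai-code-review"]
    aiType = GENERIC              # hypothetical value for the generic service
    aiAuthHeaderName = X-Api-Key  # header carrying the credential instead of Authorization
    aiChatEndpoint = /api/chat    # overrides the default /v1/chat/completions
```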