Amend schedule with ai-code-review talk
- Update placeholder with a link and an overview of the generalization
  of the chat-gpt-plugin into the new ai-code-review plugin.
- Add an overview in hackathon-ai-code-review-plugin.md
- Add a speakers.md section #trevorgetty.
Change-Id: I80e523e5cd901033db17fde6b3bf0cf29bf41cb8
diff --git a/lightning-talks/hackathon-ai-code-review-plugin.md b/lightning-talks/hackathon-ai-code-review-plugin.md
new file mode 100644
index 0000000..1ec3d9e
--- /dev/null
+++ b/lightning-talks/hackathon-ai-code-review-plugin.md
@@ -0,0 +1,44 @@
+# Hackathon Outcome: Generalising the chat-gpt-plugin
+
+A requirement has been raised by customers of Gerrit: they would love to have the functionality shown
+by the current [chatgpt-code-review-gerrit-plugin](https://github.com/amarula/chatgpt-code-review-gerrit-plugin/),
+but without the need to send important intellectual property to a public service.
+
+As such, during the hackathon it was agreed that it would be good to take the current chat-gpt Gerrit plugin
+and create a more generic `ai-code-review` plugin that can be used with on-premise, privately hosted or public AI services.
+
+The starting point was to create a fork of the current chatgpt plugin in the gerritcodereview hierarchy,
+under [plugins/ai-code-review](https://gerrit-review.googlesource.com/admin/repos/plugins/ai-code-review).
+
+All classes that are not ChatGPT specific (which is most of them), including the configuration
+options, were renamed to use more generic names, for example:
+
+ - `gptModel` -> `aiModel`
+ - `gptToken` -> `aiToken`
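+
+As a minimal sketch, assuming the plugin is deployed under the name `ai-code-review` and configured in the usual
+`[plugin "..."]` section of `etc/gerrit.config` (the section name and the example values are assumptions; only the
+renamed keys come from the plugin):
+
+```
+[plugin "ai-code-review"]
+    # Renamed from gptModel: the model the AI service should use for reviews.
+    aiModel = gpt-4o
+    # Renamed from gptToken: credential for the AI service, if it requires one.
+    aiToken = your-secret-token
+```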
+
+A great benefit is that Ollama has now extended its API with OpenAI compatibility, so it presents an endpoint
+that accepts requests and returns responses in the same format as the ChatGPT (OpenAI) API.
+
+## AIType Support
+A new concept has been added to the plugin to allow the specification of different AITypes.
+
+As such, a new configuration option called `aiType` can be set to one of the following values:
+ - CHATGPT (default if not specified)
+ - OLLAMA
+ - GENERIC
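+
+A minimal sketch of selecting the type, again assuming the usual `[plugin "ai-code-review"]` configuration section:
+
+```
+[plugin "ai-code-review"]
+    # One of CHATGPT (default if not specified), OLLAMA or GENERIC.
+    aiType = OLLAMA
+```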
+
+### OLLAMA
+The Ollama AI service must be run against a model that supports the "tools" and "functions" extensions, so that the
+prompt can instruct the model to return the response object in the correct JSON schema. Without this, the default
+is to return markdown-wrapped JSON, which would require much more processing on the client side.
+So make sure that the Ollama model you choose appears in this list:
+[ollama models (tools-filter)](https://ollama.com/search?c=tools)
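+
+A possible Ollama setup, sketched under the same configuration assumptions as above; `llama3.1` is used purely as an
+example of a model that currently appears in the tools-filtered list:
+
+```
+[plugin "ai-code-review"]
+    aiType = OLLAMA
+    # Must be a model supporting the "tools"/"functions" extensions, otherwise
+    # responses come back as markdown-wrapped JSON.
+    aiModel = llama3.1
+```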
+
+### GENERIC
+The GENERIC AI service type has been added to make it easier to test against a potentially new or not yet supported
+endpoint. It does so by allowing you to specify the authorization header name (key: `aiAuthHeaderName`), along with
+a different API endpoint URI (key: `aiChatEndpoint`) to be used for the code review,
+e.g. `/api/chat` instead of `/v1/chat/completions`.
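+
+A sketch of a GENERIC setup; only the key names come from the plugin, while the header name and endpoint values are
+placeholders for whatever the target service expects:
+
+```
+[plugin "ai-code-review"]
+    aiType = GENERIC
+    # Name of the HTTP header used to pass the credential to the service.
+    aiAuthHeaderName = X-Api-Key
+    # Chat endpoint URI used for the review, instead of the default /v1/chat/completions.
+    aiChatEndpoint = /api/chat
+```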
+
+
+*[Trevor Getty, Software Architect / Cirata](../speakers.md#trevorgetty)*