Firebase Genkit <> GitHub Models Plugin
genkitx-github is a community plugin for using the GitHub Models APIs with Firebase Genkit. Built by Xavier Portilla Edo.
This Genkit plugin allows you to use GitHub Models through their official API.
Installation
Install the plugin in your project with your favorite package manager:
npm install genkitx-github
yarn add genkitx-github
pnpm add genkitx-github
Usage
Configuration
To use the plugin, you need to configure it with your GitHub token. You can do this by calling the configureGenkit function:
import {github, openAIGpt4o} from "genkitx-github";
import {configureGenkit} from "@genkit-ai/core";

configureGenkit({
  plugins: [
    github({
      githubToken: '<my-github-token>',
    }),
  ],
  logLevel: "debug",
  enableTracingAndMetrics: true,
});
You can also initialize the plugin in this way if you have set the GITHUB_TOKEN environment variable:
import {github, openAIGpt4o} from "genkitx-github";
import {configureGenkit} from "@genkit-ai/core";

configureGenkit({
  plugins: [
    github({}),
  ],
  logLevel: "debug",
  enableTracingAndMetrics: true,
});
Basic examples
The simplest way to call the text generation model is by using the helper function generate:
import {github, openAIGpt4o} from "genkitx-github";
import {generate} from "@genkit-ai/ai";

const response = await generate({
  model: openAIGpt4o,
  prompt: 'Tell me a joke.',
});

console.log(response.text());
Within a flow
import {openAIGpt4o} from "genkitx-github";
import {generate} from "@genkit-ai/ai";
import {defineFlow} from "@genkit-ai/flow";
import * as z from "zod";

export const myFlow = defineFlow(
  {
    name: 'menuSuggestionFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await generate({
      prompt: `Suggest an item for the menu of a ${subject} themed restaurant`,
      model: openAIGpt4o,
    });

    return llmResponse.text();
  }
);
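You can then call the flow from your code. A minimal sketch, assuming Genkit's runFlow helper from @genkit-ai/flow and a hypothetical module path for the flow defined above:

import {runFlow} from "@genkit-ai/flow";
import {myFlow} from "./menuSuggestionFlow"; // hypothetical path to the file containing the flow above

// Invoke the flow with its string input and print the suggestion.
const suggestion = await runFlow(myFlow, 'pirate');
console.log(suggestion);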
Tool use
import {openAIGpt4o} from "genkitx-github";
import {generate, defineTool} from "@genkit-ai/ai";
import * as z from "zod";

const specialToolInputSchema = z.object({ meal: z.enum(["breakfast", "lunch", "dinner"]) });
const specialTool = defineTool(
  {
    name: "specialTool",
    description: "Retrieves today's special for the given meal",
    inputSchema: specialToolInputSchema,
    outputSchema: z.string(),
  },
  async ({ meal }): Promise<string> => {
    // Look up today's special for the requested meal. Here we return a fixed value.
    return "Baked beans on toast";
  }
);

const result = await generate({
  model: openAIGpt4o,
  tools: [specialTool],
  prompt: "What's for breakfast?",
});

console.log(result.text());
For more detailed examples and the explanation of other functionalities, refer to the official Genkit documentation.
Supported models
This plugin supports all currently available chat/completion and embeddings models from GitHub Models, as well as image input for multimodal models.
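For example, an image can be passed alongside text using Genkit's standard media prompt parts. A minimal sketch, assuming openAIGpt4o accepts media parts and using a placeholder image URL:

import {github, openAIGpt4o} from "genkitx-github";
import {generate} from "@genkit-ai/ai";

// Multimodal prompt: an image part followed by a text part.
// The URL is a placeholder; Genkit also accepts data: URLs with base64-encoded image content.
const response = await generate({
  model: openAIGpt4o,
  prompt: [
    { media: { url: 'https://example.com/menu.jpg', contentType: 'image/jpeg' } },
    { text: 'What dishes do you see on this menu?' },
  ],
});

console.log(response.text());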
API Reference
You can find the full API reference in the API Reference Documentation.
Troubleshooting
- The GPT o1-preview and o1-mini models are still in beta. They do not support system roles or tools, and temperature and topP need to be set to 1 (see the config sketch after this list). See the OpenAI announcement here.
- Cohere models only support text output for now. Issue opened here.
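A minimal sketch of calling one of the o1 beta models under those constraints. The model export name openAIO1Mini is assumed here for illustration; check the plugin's exports for the exact name:

import {github, openAIO1Mini} from "genkitx-github"; // export name assumed for illustration
import {generate} from "@genkit-ai/ai";

// o1 beta constraints: no system role, no tools, temperature and topP fixed at 1.
const response = await generate({
  model: openAIO1Mini,
  prompt: 'Explain recursion in one paragraph.',
  config: {
    temperature: 1,
    topP: 1,
  },
});

console.log(response.text());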
Contributing
Want to contribute to the project? That's awesome! Head over to our Contribution Guidelines.
Need support?
[!NOTE]
This repository depends on Google's Firebase Genkit. For issues and questions related to Genkit, please refer to the instructions available in Genkit's repository.
Reach out by opening a discussion on GitHub Discussions.
Credits
This plugin is proudly maintained by Xavier Portilla Edo.
The inspiration, structure, and patterns for this plugin come from the Genkit Community Plugins repository built by The Fire Company, as well as the ollama plugin.
License
This project is licensed under the Apache 2.0 License.