GitLab AI Provider

A TypeScript provider that integrates GitLab Duo AI capabilities with the Vercel AI SDK. It provides unified access to GitLab's AI-powered features, including chat, agentic workflows, and tool calling.

🌟 Features

  • πŸ€– Multi-Provider Agentic Chat: Native tool calling support via GitLab's AI Gateway (Anthropic & OpenAI)
  • πŸ” Multiple Authentication: Support for OAuth, Personal Access Tokens, and OpenCode auth
  • 🌐 Self-Hosted Support: Works with both GitLab.com and self-hosted instances
  • πŸ”§ Tool Support: Native tool calling via Vercel AI SDK
  • πŸ” Project Detection: Automatic GitLab project detection from git remotes
  • πŸ’Ύ Smart Caching: Project and token caching for optimal performance
  • 🎯 Type-Safe: Complete TypeScript definitions with Zod validation

πŸ“¦ Installation

npm install @gitlab/gitlab-ai-provider

Peer Dependencies

npm install @ai-sdk/provider @ai-sdk/provider-utils

πŸš€ Quick Start

Basic Chat

import { createGitLab } from '@gitlab/gitlab-ai-provider';
import { generateText } from 'ai';

const gitlab = createGitLab({
  apiKey: process.env.GITLAB_TOKEN,
  instanceUrl: 'https://gitlab.com', // optional, defaults to gitlab.com
});

// All equivalent ways to create a chat model:
const model = gitlab('duo-chat'); // callable provider
const model2 = gitlab.chat('duo-chat'); // .chat() alias (recommended)
const model3 = gitlab.languageModel('duo-chat'); // explicit method

const { text } = await generateText({
  model: gitlab.chat('duo-chat'),
  prompt: 'Explain how to create a merge request in GitLab',
});

console.log(text);

Agentic Chat with Tool Calling

import { createGitLab } from '@gitlab/gitlab-ai-provider';
import { generateText } from 'ai';

const gitlab = createGitLab({
  apiKey: process.env.GITLAB_TOKEN,
});

// Use agentic model for native tool calling support
const model = gitlab.agenticChat('duo-chat', {
  anthropicModel: 'claude-sonnet-4-20250514',
  maxTokens: 8192,
});

const { text } = await generateText({
  model,
  prompt: 'List all open merge requests in my project',
  tools: {
    // Your custom tools here
  },
});

Model Variants

The provider automatically maps specific model IDs to their corresponding provider models (Anthropic or OpenAI) and routes requests to the appropriate AI Gateway proxy:

import { createGitLab } from '@gitlab/gitlab-ai-provider';
import { generateText } from 'ai';

const gitlab = createGitLab({
  apiKey: process.env.GITLAB_TOKEN,
});

// Anthropic Models (Claude)
const opusModel = gitlab.agenticChat('duo-chat-opus-4-5');
// Automatically uses: claude-opus-4-5-20251101

const sonnetModel = gitlab.agenticChat('duo-chat-sonnet-4-5');
// Automatically uses: claude-sonnet-4-5-20250929

const haikuModel = gitlab.agenticChat('duo-chat-haiku-4-5');
// Automatically uses: claude-haiku-4-5-20251001

// OpenAI Models (GPT-5)
const gpt5Model = gitlab.agenticChat('duo-chat-gpt-5-1');
// Automatically uses: gpt-5.1-2025-11-13

const gpt5MiniModel = gitlab.agenticChat('duo-chat-gpt-5-mini');
// Automatically uses: gpt-5-mini-2025-08-07

const codexModel = gitlab.agenticChat('duo-chat-gpt-5-codex');
// Automatically uses: gpt-5-codex

// You can still override with explicit providerModel option
const customModel = gitlab.agenticChat('duo-chat-opus-4-5', {
  providerModel: 'claude-sonnet-4-5-20250929', // Override mapping
});

Available Model Mappings:

Model ID                Provider    Backend Model
duo-chat-opus-4-5       Anthropic   claude-opus-4-5-20251101
duo-chat-sonnet-4-5     Anthropic   claude-sonnet-4-5-20250929
duo-chat-haiku-4-5      Anthropic   claude-haiku-4-5-20251001
duo-chat-gpt-5-1        OpenAI      gpt-5.1-2025-11-13
duo-chat-gpt-5-mini     OpenAI      gpt-5-mini-2025-08-07
duo-chat-gpt-5-codex    OpenAI      gpt-5-codex
duo-chat-gpt-5-2-codex  OpenAI      gpt-5.2-codex

For unmapped Anthropic model IDs, the provider defaults to claude-sonnet-4-5-20250929.
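
For example (the model ID below is hypothetical and shown only to illustrate the fallback):

// Hypothetical unmapped Anthropic-style ID: not in the table above, so the
// provider would fall back to claude-sonnet-4-5-20250929
const fallbackModel = gitlab.agenticChat('duo-chat-claude-next');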

OpenAI Models (GPT-5)

The provider supports OpenAI GPT-5 models through GitLab's AI Gateway proxy. OpenAI models are automatically detected based on the model ID and routed to the appropriate proxy endpoint.

import { createGitLab } from '@gitlab/gitlab-ai-provider';
import { generateText } from 'ai';

const gitlab = createGitLab({
  apiKey: process.env.GITLAB_TOKEN,
});

// GPT-5.1 - Most capable model
const { text } = await generateText({
  model: gitlab.agenticChat('duo-chat-gpt-5-1'),
  prompt: 'Explain GitLab CI/CD pipelines',
});

// GPT-5 Mini - Fast and efficient
const { text: quickResponse } = await generateText({
  model: gitlab.agenticChat('duo-chat-gpt-5-mini'),
  prompt: 'Summarize this code',
});

// GPT-5 Codex - Optimized for code
const { text: codeExplanation } = await generateText({
  model: gitlab.agenticChat('duo-chat-gpt-5-codex'),
  prompt: 'Refactor this function for better performance',
});

OpenAI Models with Tool Calling:

import { createGitLab } from '@gitlab/gitlab-ai-provider';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const gitlab = createGitLab({
  apiKey: process.env.GITLAB_TOKEN,
});

const { text, toolCalls } = await generateText({
  model: gitlab.agenticChat('duo-chat-gpt-5-1', {
    maxTokens: 4096,
  }),
  prompt: 'What is the weather in San Francisco?',
  tools: {
    getWeather: tool({
      description: 'Get the weather for a location',
      parameters: z.object({
        location: z.string().describe('The city name'),
      }),
      execute: async ({ location }) => {
        return { temperature: 72, condition: 'sunny', location };
      },
    }),
  },
});

Agentic Chat with Feature Flags

You can pass feature flags to enable experimental features in GitLab's AI Gateway proxy:

import { createGitLab } from '@gitlab/gitlab-ai-provider';

// Option 1: Set feature flags globally for all agentic chat models
const gitlab = createGitLab({
  apiKey: process.env.GITLAB_TOKEN,
  featureFlags: {
    duo_agent_platform_agentic_chat: true,
    duo_agent_platform: true,
  },
});

const model = gitlab.agenticChat('duo-chat');

// Option 2: Set feature flags per model (overrides global flags)
const modelWithFlags = gitlab.agenticChat('duo-chat', {
  featureFlags: {
    duo_agent_platform_agentic_chat: true,
    duo_agent_platform: true,
    custom_feature_flag: false,
  },
});

// Option 3: Merge both (model-level flags take precedence)
const gitlab2 = createGitLab({
  featureFlags: {
    duo_agent_platform: true, // will be overridden
  },
});

const mergedModel = gitlab2.agenticChat('duo-chat', {
  featureFlags: {
    duo_agent_platform: false, // overrides provider-level
    duo_agent_platform_agentic_chat: true, // adds new flag
  },
});

πŸ”‘ Authentication

Personal Access Token

const gitlab = createGitLab({
  apiKey: 'glpat-xxxxxxxxxxxxxxxxxxxx',
});

Environment Variable

export GITLAB_TOKEN=glpat-xxxxxxxxxxxxxxxxxxxx

const gitlab = createGitLab(); // Automatically uses GITLAB_TOKEN

OAuth (OpenCode Auth)

The provider automatically detects and uses OpenCode authentication if available:

const gitlab = createGitLab({
  instanceUrl: 'https://gitlab.com',
  // OAuth tokens are loaded from ~/.opencode/auth.json
});

Custom Headers

const gitlab = createGitLab({
  apiKey: 'your-token',
  headers: {
    'X-Custom-Header': 'value',
  },
});

πŸ—οΈ Architecture

Core Components

1. GitLabProvider

Main provider factory that creates language models with different capabilities.

interface GitLabProvider {
  (modelId: string): LanguageModelV2;
  languageModel(modelId: string): LanguageModelV2;
  agenticChat(modelId: string, options?: GitLabAgenticOptions): GitLabAgenticLanguageModel;
}

2. GitLabAnthropicLanguageModel

Provides native tool calling through GitLab's Anthropic proxy.

  • Uses Claude models via https://cloud.gitlab.com/ai/v1/proxy/anthropic/
  • Automatic token refresh and retry logic
  • Direct access token management
  • Supports all Anthropic tool calling features

3. GitLabOpenAILanguageModel

Provides native tool calling through GitLab's OpenAI proxy.

  • Uses GPT-5 models via https://cloud.gitlab.com/ai/v1/proxy/openai/
  • Automatic token refresh and retry logic
  • Direct access token management
  • Supports all OpenAI tool calling features including parallel tool calls

Supporting Utilities

GitLabProjectDetector

Automatically detects GitLab projects from git remotes.

const detector = new GitLabProjectDetector({
  instanceUrl: 'https://gitlab.com',
  getHeaders: () => ({ Authorization: `Bearer ${token}` }),
});

const project = await detector.detectProject(process.cwd());
// Returns: { id: 12345, path: 'group/project', namespaceId: 67890 }

GitLabProjectCache

Caches project information with TTL.

const cache = new GitLabProjectCache(5 * 60 * 1000); // 5 minutes
cache.set('key', project);
const cached = cache.get('key');

GitLabOAuthManager

Manages OAuth token lifecycle.

const oauthManager = new GitLabOAuthManager();

// Exchange authorization code
const tokens = await oauthManager.exchangeAuthorizationCode({
  instanceUrl: 'https://gitlab.com',
  code: 'auth-code',
  codeVerifier: 'verifier',
});

// Refresh tokens
const refreshed = await oauthManager.refreshIfNeeded(tokens);

GitLabDirectAccessClient

Manages direct access tokens for Anthropic proxy.

const client = new GitLabDirectAccessClient({
  instanceUrl: 'https://gitlab.com',
  getHeaders: () => ({ Authorization: `Bearer ${token}` }),
});

const directToken = await client.getDirectAccessToken();
// Returns: { token: 'xxx', headers: {...}, expiresAt: 123456 }

πŸ“š API Reference

Provider Configuration

interface GitLabProviderSettings {
  instanceUrl?: string; // Default: 'https://gitlab.com'
  apiKey?: string; // PAT or OAuth access token
  refreshToken?: string; // OAuth refresh token
  name?: string; // Provider name prefix
  headers?: Record<string, string>; // Custom headers
  fetch?: typeof fetch; // Custom fetch implementation
  aiGatewayUrl?: string; // AI Gateway URL (default: 'https://cloud.gitlab.com')
}
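
A fuller configuration sketch using these settings; the instance URL and custom header below are placeholders, not defaults:

import { createGitLab } from '@gitlab/gitlab-ai-provider';

const gitlab = createGitLab({
  instanceUrl: 'https://gitlab.example.com', // placeholder self-hosted instance
  apiKey: process.env.GITLAB_TOKEN,
  name: 'gitlab', // provider name prefix
  headers: { 'X-Request-Source': 'docs-example' }, // hypothetical custom header
  aiGatewayUrl: 'https://cloud.gitlab.com',
  fetch: globalThis.fetch, // or a custom fetch implementation
});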

Environment Variables

Variable                Description                                    Default
GITLAB_TOKEN            GitLab Personal Access Token or OAuth token   -
GITLAB_AI_GATEWAY_URL   AI Gateway URL for Anthropic proxy             https://cloud.gitlab.com
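
For example, both can be set in the environment before creating the provider (values below are placeholders; the gateway URL shown equals the default):

export GITLAB_TOKEN=glpat-xxxxxxxxxxxxxxxxxxxx
export GITLAB_AI_GATEWAY_URL=https://cloud.gitlab.com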

Agentic Chat Options

interface GitLabAgenticOptions {
  providerModel?: string; // Override the backend model (e.g., 'claude-sonnet-4-5-20250929' or 'gpt-5.1-2025-11-13')
  maxTokens?: number; // Default: 8192
  featureFlags?: Record<string, boolean>; // GitLab feature flags
}

Note: The providerModel option allows you to override the automatically mapped model. The provider will validate that the override is compatible with the model ID's provider (e.g., you cannot use an OpenAI model with a duo-chat-opus-* model ID).
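
A short sketch of the override and the validation described above; the incompatible pairing is left commented out:

import { createGitLab } from '@gitlab/gitlab-ai-provider';

const gitlab = createGitLab({ apiKey: process.env.GITLAB_TOKEN });

// Compatible: an Anthropic model ID paired with an Anthropic backend model
const overridden = gitlab.agenticChat('duo-chat-opus-4-5', {
  providerModel: 'claude-sonnet-4-5-20250929',
});

// Incompatible: per the note above, an OpenAI backend with a duo-chat-opus-*
// model ID is rejected by the provider's validation.
// gitlab.agenticChat('duo-chat-opus-4-5', { providerModel: 'gpt-5.1-2025-11-13' });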

Error Handling

import { GitLabError } from '@gitlab/gitlab-ai-provider';

try {
  const result = await generateText({ model, prompt });
} catch (error) {
  if (error instanceof GitLabError) {
    if (error.isAuthError()) {
      console.error('Authentication failed');
    } else if (error.isRateLimitError()) {
      console.error('Rate limit exceeded');
    } else if (error.isServerError()) {
      console.error('Server error:', error.statusCode);
    }
  }
}

πŸ”§ Development

Build

npm run build          # Build once
npm run build:watch    # Build in watch mode

Testing

npm test              # Run all tests
npm run test:watch    # Run tests in watch mode

Code Quality

npm run lint          # Lint code
npm run lint:fix      # Lint and auto-fix
npm run format        # Format code
npm run format:check  # Check formatting
npm run type-check    # TypeScript type checking

Project Structure

gitlab-ai-provider/
β”œβ”€β”€ src/
β”‚   β”œβ”€β”€ index.ts                            # Main exports
β”‚   β”œβ”€β”€ gitlab-provider.ts                  # Provider factory
β”‚   β”œβ”€β”€ gitlab-anthropic-language-model.ts  # Anthropic/Claude model
β”‚   β”œβ”€β”€ gitlab-openai-language-model.ts     # OpenAI/GPT model
β”‚   β”œβ”€β”€ model-mappings.ts                   # Model ID mappings
β”‚   β”œβ”€β”€ gitlab-direct-access.ts             # Direct access tokens
β”‚   β”œβ”€β”€ gitlab-oauth-manager.ts             # OAuth management
β”‚   β”œβ”€β”€ gitlab-oauth-types.ts               # OAuth types
β”‚   β”œβ”€β”€ gitlab-project-detector.ts          # Project detection
β”‚   β”œβ”€β”€ gitlab-project-cache.ts             # Project caching
β”‚   β”œβ”€β”€ gitlab-api-types.ts                 # API types
β”‚   β”œβ”€β”€ gitlab-error.ts                     # Error handling
β”‚   └── gitlab-workflow-debug.ts            # Debug logging
β”œβ”€β”€ tests/                                  # Test files
β”œβ”€β”€ dist/                                   # Build output
β”œβ”€β”€ package.json
β”œβ”€β”€ tsconfig.json
β”œβ”€β”€ tsup.config.ts
└── vitest.config.ts

πŸ“ Code Style

  • Imports: Named imports, organized by external β†’ internal β†’ types
  • Formatting: Single quotes, semicolons, 100 char line width, 2 space indent
  • Types: Interfaces for public APIs, Zod schemas for runtime validation
  • Naming: camelCase (variables/functions), PascalCase (classes/types), kebab-case (files)
  • Exports: Named exports only (no default exports)
  • Comments: JSDoc for public APIs with @param/@returns
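
A small illustrative snippet (hypothetical helper, not part of the package) that follows these conventions, e.g. in a file named format-greeting.ts:

import { z } from 'zod';

/** Hypothetical schema, shown only to illustrate Zod-based runtime validation. */
export const greetingSchema = z.object({
  name: z.string(),
});

export type Greeting = z.infer<typeof greetingSchema>;

/**
 * Formats a validated greeting (illustrative example only).
 * @param greeting - Parsed greeting input.
 * @returns A formatted greeting string.
 */
export function formatGreeting(greeting: Greeting): string {
  return `Hello, ${greeting.name}!`;
}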


🀝 Contributing

Contributions are welcome! Please see our Contributing Guide for detailed guidelines on:

  • Code style and conventions
  • Development workflow
  • Testing requirements
  • Submitting merge requests
  • Developer Certificate of Origin and License

Quick Start for Contributors:

  • Commit Messages: Use conventional commits format

    feat(scope): add new feature
    fix(scope): fix bug
    docs(scope): update documentation
    
  • Code Quality: Ensure all checks pass

    npm run lint
    npm run type-check
    npm test
    
  • Testing: Add tests for new features
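
A minimal vitest sketch of such a test; the import path and assertion are illustrative and assume that creating the provider does not require a network call:

import { describe, expect, it } from 'vitest';

import { createGitLab } from '../src';

describe('createGitLab', () => {
  it('exposes the chat model factory', () => {
    const gitlab = createGitLab({ apiKey: 'glpat-test-token' }); // placeholder token
    expect(typeof gitlab.chat).toBe('function');
  });
});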

πŸ™ Acknowledgments

This project is built on top of:

  • The Vercel AI SDK (@ai-sdk/provider, @ai-sdk/provider-utils)
  • GitLab Duo and the GitLab AI Gateway

Made with ❀️ for the OpenCode community
