# @wix/ai

Wix AI SDK for the Vercel AI SDK. Provides access to AI models through Wix infrastructure with automatic authentication.
> ⚠️ **Note:** This package depends on the Vercel AI SDK v6 (currently in beta).
## Supported Providers

| Provider | Status | Capabilities |
|----------|--------|--------------|
| OpenAI | ✅ Available | Text generation, chat, embeddings |
| Runware | 🔄 Planned | Image generation |

Currently, OpenAI is the only supported provider. Additional providers will be added over time.
## Installation

```bash
npm install @wix/ai ai@beta
```

Requirements:

- `ai@>=6.0.0-beta` (Vercel AI SDK v6)
- `zod@^4.1.8`
## Quick Start

Using OpenAI:

```ts
import { generateText } from 'ai';
import { openai } from '@wix/ai';

const { text } = await generateText({
  model: openai('gpt-4o'),
  prompt: 'What is the capital of France?',
});
```
## OpenAI Provider Features

- **Responses API** - Uses OpenAI's Responses API by default
- **Chat Completions** - Standard chat completions via `.chat()`
- **Embeddings** - Text embeddings for semantic search via `.embeddingModel()`
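
Each of these maps to a factory method on the provider. A minimal sketch of constructing each model (full examples follow in the API Reference):

```ts
import { openai } from '@wix/ai';

// Responses API (the default when calling the provider directly)
const responsesModel = openai('gpt-4o');

// Chat Completions API
const chatModel = openai.chat('gpt-4o');

// Text embeddings for semantic search
const embeddingModel = openai.embeddingModel('text-embedding-3-small');
```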
## Requirements

This SDK is designed to run only within Wix environments (Vibe, etc.) in backend code (see the sketch after the list below):

- ✅ Wix backend code (API routes, server functions)
- ❌ Browser/client-side code
- ❌ External environments (outside Wix)
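
A minimal sketch of a backend entry point, assuming a plain exported async function (the exact file location and routing conventions depend on your Wix environment):

```ts
// backend/ask.ts — hypothetical file name; wire it up per your environment's conventions
import { generateText } from 'ai';
import { openai } from '@wix/ai';

// Runs on the server, where Wix authentication is available to the SDK
export async function askAssistant(question: string): Promise<string> {
  const { text } = await generateText({
    model: openai('gpt-4o'),
    prompt: question,
  });
  return text;
}
```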
Technical requirements:

- Node.js 18+
- `ai` package v6.0.0+
## API Reference

### OpenAI Provider

#### Responses API (Default)
```ts
import { generateText } from 'ai';
import { openai } from '@wix/ai';

const { text } = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Explain quantum computing.',
});
```

Or explicitly:

```ts
const model = openai.responses('gpt-4o');
```
#### Chat Completions

```ts
import { generateText } from 'ai';
import { openai } from '@wix/ai';

const { text } = await generateText({
  model: openai.chat('gpt-4o'),
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is TypeScript?' },
  ],
});
```
#### Embeddings

```ts
import { embed, cosineSimilarity } from 'ai';
import { openai } from '@wix/ai';

const { embedding: embedding1 } = await embed({
  model: openai.embeddingModel('text-embedding-3-small'),
  value: 'Hello world',
});

const { embedding: embedding2 } = await embed({
  model: openai.embeddingModel('text-embedding-3-small'),
  value: 'Hi there, planet',
});

const similarity = cosineSimilarity(embedding1, embedding2);
```
Batch embeddings:

```ts
import { embedMany } from 'ai';
import { openai } from '@wix/ai';

const { embeddings } = await embedMany({
  model: openai.embeddingModel('text-embedding-3-small'),
  values: ['Hello', 'World', 'Foo', 'Bar'],
});
```
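
As a usage sketch for semantic search, assuming a small in-memory list of documents (the document texts and query below are purely illustrative):

```ts
import { embed, embedMany, cosineSimilarity } from 'ai';
import { openai } from '@wix/ai';

// Hypothetical document set; in practice these would come from your own data
const documents = ['Refund policy', 'Shipping times', 'Warranty coverage'];

const { embeddings } = await embedMany({
  model: openai.embeddingModel('text-embedding-3-small'),
  values: documents,
});

const { embedding: queryEmbedding } = await embed({
  model: openai.embeddingModel('text-embedding-3-small'),
  value: 'How long does delivery take?',
});

// Rank documents by cosine similarity to the query
const ranked = documents
  .map((doc, i) => ({ doc, score: cosineSimilarity(queryEmbedding, embeddings[i]) }))
  .sort((a, b) => b.score - a.score);
```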
### OpenAI Provider Methods

| Method | Description |
|--------|-------------|
| `openai(modelId)` | Responses API (default) |
| `openai.responses(modelId)` | Responses API (explicit) |
| `openai.chat(modelId)` | Chat Completions API |
| `openai.embeddingModel(modelId)` | Text embeddings |
## License

MIT