
@khaveeai/sdk
KhaveeAI VRM Avatar SDK - A comprehensive toolkit for creating interactive AI-powered avatars
Transform VRM 3D avatars into interactive AI characters with expressions, animations, and voice.
Build immersive AI experiences with realistic 3D avatars that can talk, express emotions, and respond intelligently to users.
First, install the core 3D rendering libraries:
npm install three @react-three/fiber @react-three/drei
# or
pnpm add three @react-three/fiber @react-three/drei
# or
yarn add three @react-three/fiber @react-three/drei
Then install the SDK packages:
npm install @khaveeai/react @khaveeai/core
# or
pnpm add @khaveeai/react @khaveeai/core
# or
yarn add @khaveeai/react @khaveeai/core
Finally, install a provider package for LLM and voice features:
# For OpenAI
npm install @khaveeai/providers-openai
# For Azure
npm install @khaveeai/providers-azure
# For development/testing
npm install @khaveeai/providers-mock
The SDK requires these peer dependencies (most React projects already have them):
{
"react": "^18.0.0 || ^19.0.0",
"react-dom": "^18.0.0 || ^19.0.0",
"three": "^0.160.0"
}
Render your first avatar:
import { Canvas } from '@react-three/fiber';
import { KhaveeProvider, VRMAvatar } from '@khaveeai/react';
export default function App() {
return (
<KhaveeProvider>
<Canvas>
<ambientLight intensity={0.5} />
<directionalLight position={[10, 10, 5]} />
<VRMAvatar
src="/models/character.vrm"
position={[0, -1, 0]}
/>
</Canvas>
</KhaveeProvider>
);
}
Add body animations by passing a map of animation clip URLs; the idle clip auto-plays on load:
const animations = {
idle: '/animations/idle.fbx', // Auto-plays on load
walk: '/animations/walk.fbx',
dance: '/animations/dance.fbx',
};
function App() {
return (
<KhaveeProvider>
<Canvas>
<VRMAvatar
src="/models/character.vrm"
animations={animations}
/>
</Canvas>
</KhaveeProvider>
);
}
Wire up LLM and voice providers to make the avatar talk:
import { Canvas } from '@react-three/fiber';
import { KhaveeProvider, VRMAvatar, useLLM, useVoice } from '@khaveeai/react';
import { OpenAIProvider } from '@khaveeai/providers-openai';
const config = {
llm: new OpenAIProvider({ apiKey: 'your-key' }),
voice: new OpenAIProvider({ apiKey: 'your-key' }),
};
function ChatInterface() {
const { streamChat } = useLLM();
const { speak } = useVoice();
const handleChat = async (userMessage: string) => {
let response = '';
// Stream LLM response
for await (const chunk of streamChat({
messages: [{ role: 'user', content: userMessage }]
})) {
if (chunk.type === 'text') {
response += chunk.delta;
}
}
// Speak the response
await speak({ text: response });
};
return (
<button onClick={() => handleChat('Hello!')}>
Say Hello
</button>
);
}
export default function App() {
return (
<KhaveeProvider config={config}>
<Canvas>
<VRMAvatar src="/models/character.vrm" />
</Canvas>
<ChatInterface />
</KhaveeProvider>
);
}
Control facial expressions from any component inside the provider:
import { useVRMExpressions } from '@khaveeai/react';
function ExpressionControls() {
const { setExpression, resetExpressions, setMultipleExpressions } = useVRMExpressions();
return (
<div>
{/* Single expression */}
<button onClick={() => setExpression('happy', 1)}>
😊 Happy
</button>
{/* Partial intensity */}
<button onClick={() => setExpression('happy', 0.5)}>
🙂 Slightly Happy
</button>
{/* Multiple expressions */}
<button onClick={() => setMultipleExpressions({
happy: 0.8,
surprised: 0.4
})}>
😲 Excited
</button>
{/* Reset all */}
<button onClick={() => resetExpressions()}>
😐 Neutral
</button>
</div>
);
}
Trigger body animations by name:
import { useVRMAnimations } from '@khaveeai/react';
function AnimationControls() {
const { animate, stopAnimation, currentAnimation } = useVRMAnimations();
return (
<div>
<button onClick={() => animate('walk')}>
🚶 Walk
</button>
<button onClick={() => animate('dance')}>
💃 Dance
</button>
<button onClick={() => animate('idle')}>
🧍 Idle
</button>
<button onClick={() => stopAnimation()}>
⏹️ Stop
</button>
<p>Current: {currentAnimation || 'none'}</p>
</div>
);
}
A typical project layout:
your-project/
├── public/
│ ├── models/
│ │ └── character.vrm # Your VRM model
│ └── animations/
│ ├── idle.fbx # Mixamo animations
│ ├── walk.fbx
│ └── dance.fbx
├── src/
│ ├── app/
│ │ └── page.tsx # Your main component
│ └── ...
└── package.json
<KhaveeProvider>
Root provider that manages VRM state and optional LLM/TTS configuration.
<KhaveeProvider config={config}>
{children}
</KhaveeProvider>
Props:
config? - Optional LLM/TTS provider configuration
children - React children
<VRMAvatar>
Renders a VRM 3D character with animations and expressions.
<VRMAvatar
src="/models/character.vrm"
animations={animations}
position={[0, -1, 0]}
rotation={[0, Math.PI, 0]}
scale={[1, 1, 1]}
/>
Props:
src - URL to the VRM model file (required)
animations? - Animation configuration (URLs to FBX files)
position? - 3D position [x, y, z] (default: [0, 0, 0])
rotation? - 3D rotation [x, y, z] (default: [0, Math.PI, 0])
scale? - 3D scale [x, y, z] (default: [1, 1, 1])
useVRMExpressions()
Control facial expressions with smooth transitions.
const {
expressions, // Current expression values
setExpression, // Set single expression
resetExpressions, // Reset all to neutral
setMultipleExpressions // Set multiple at once
} = useVRMExpressions();
Example:
setExpression('happy', 1); // Full happiness
setExpression('happy', 0.5); // Partial
setMultipleExpressions({ // Multiple
happy: 0.8,
surprised: 0.3
});
resetExpressions(); // Reset all
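VRM expression weights run from 0 (off) to 1 (full). When driving setExpression from sliders or LLM output, a small guard keeps values in range; this helper is a sketch, not part of the SDK:

```typescript
// Hypothetical helper (not an SDK API): clamp an expression weight
// to the [0, 1] range that VRM expressions expect.
function clampWeight(value: number): number {
  if (Number.isNaN(value)) return 0; // treat bad input as "off"
  return Math.min(1, Math.max(0, value));
}
```

For example, setExpression('happy', clampWeight(sliderValue)) keeps out-of-range slider values from producing broken blend shapes.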
useVRMAnimations()
Play and control body animations.
const {
animate, // Play animation by name
stopAnimation, // Stop all animations
currentAnimation // Currently playing animation name
} = useVRMAnimations();
Example:
animate('walk'); // Play walk animation
animate('dance'); // Play dance animation
stopAnimation(); // Stop all
useLLM()
Stream chat responses from LLM providers.
const { streamChat } = useLLM();
// Usage
for await (const chunk of streamChat({ messages })) {
if (chunk.type === 'text') {
console.log(chunk.delta);
}
}
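The streaming loop above can be wrapped in a small accumulator. This is a sketch written against the chunk shape shown in these examples ({ type: 'text', delta }), not an SDK export:

```typescript
// Hypothetical helper (not an SDK API): accumulate streamed text
// chunks of the { type: 'text', delta } shape into one string.
type Chunk = { type: string; delta?: string };

async function collectText(stream: AsyncIterable<Chunk>): Promise<string> {
  let response = '';
  for await (const chunk of stream) {
    if (chunk.type === 'text' && chunk.delta) {
      response += chunk.delta; // ignore non-text chunk types
    }
  }
  return response;
}
```

With this, the chat handler becomes `const response = await collectText(streamChat({ messages }));`.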
useVoice()
Text-to-speech with speaking state tracking.
const { speak, speaking } = useVoice();
// Usage
await speak({ text: 'Hello world!' });
await speak({ text: 'Hello!', voice: 'female' });
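Because speak() resolves when the utterance finishes, back-to-back lines can be serialized with a tiny queue. This is a generic sketch, not an SDK API; speakFn stands in for the speak function returned by useVoice():

```typescript
// Hypothetical helper (not an SDK API): play utterances sequentially.
// speakFn stands in for the speak() function returned by useVoice().
async function speakInOrder(
  speakFn: (opts: { text: string }) => Promise<void>,
  lines: string[],
): Promise<void> {
  for (const text of lines) {
    await speakFn({ text }); // wait for playback before the next line
  }
}
```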
useVRM()
Access the raw VRM instance for advanced use cases.
const vrm = useVRM();
if (vrm) {
console.log('VRM loaded:', vrm.meta.name);
}
useKhavee()
Access all SDK functionality at once.
const {
vrm,
setExpression,
animate,
// ... all functions
} = useKhavee();
Example: a talking avatar that smiles while speaking:
function TalkingAvatar() {
const { speak } = useVoice();
const { setExpression, resetExpressions } = useVRMExpressions();
const sayHello = async () => {
setExpression('happy', 1);
await speak({ text: 'Hello! How are you today?' });
resetExpressions();
};
return <button onClick={sayHello}>Say Hello</button>;
}
Example: combine an animation with an expression:
function DanceWithJoy() {
const { animate } = useVRMAnimations();
const { setExpression } = useVRMExpressions();
const danceHappily = () => {
animate('dance');
setExpression('happy', 1);
};
return <button onClick={danceHappily}>Dance!</button>;
}
Example: a minimal chatbot that streams a reply, then speaks it:
function ChatBot() {
const { streamChat } = useLLM();
const { speak } = useVoice();
const { setExpression } = useVRMExpressions();
const chat = async (userMessage: string) => {
let response = '';
// Stream response
for await (const chunk of streamChat({
messages: [{ role: 'user', content: userMessage }]
})) {
if (chunk.type === 'text') {
response += chunk.delta;
}
}
// Speak with expression
setExpression('happy', 0.8);
await speak({ text: response });
};
return (
<input
onKeyDown={(e) => {
if (e.key === 'Enter') {
chat(e.currentTarget.value);
}
}}
/>
);
}
Perfect for testing without API keys:
import { MockProvider } from '@khaveeai/providers-mock';
const config = {
llm: new MockProvider(),
voice: new MockProvider(),
};
Use OpenAI for both chat and speech:
import { OpenAIProvider } from '@khaveeai/providers-openai';
const config = {
llm: new OpenAIProvider({
apiKey: process.env.OPENAI_API_KEY
}),
voice: new OpenAIProvider({
apiKey: process.env.OPENAI_API_KEY
}),
};
Use Azure-hosted models:
import { AzureProvider } from '@khaveeai/providers-azure';
const config = {
llm: new AzureProvider({
apiKey: process.env.AZURE_API_KEY,
endpoint: process.env.AZURE_ENDPOINT
}),
voice: new AzureProvider({
apiKey: process.env.AZURE_API_KEY
}),
};
Create your own provider:
import { LLMProvider, VoiceProvider } from '@khaveeai/core';
class CustomProvider implements LLMProvider, VoiceProvider {
async *streamChat({ messages }) {
// Your implementation
yield { type: 'text', delta: 'Hello!' };
}
async speak({ text, voice }) {
// Your implementation
return Promise.resolve();
}
}
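To sanity-check the shape of a custom provider without the SDK installed, the interfaces can be inlined. This EchoProvider is a toy sketch that streams the last user message back one word at a time; the chunk shape matches the streaming examples above:

```typescript
// Toy sketch: interfaces inlined so the snippet runs without @khaveeai/core.
interface ChatMessage { role: string; content: string }
interface TextChunk { type: 'text'; delta: string }

class EchoProvider {
  // Streams the last user message back, one word per chunk.
  async *streamChat({ messages }: { messages: ChatMessage[] }): AsyncGenerator<TextChunk> {
    const last = messages[messages.length - 1]?.content ?? '';
    for (const word of last.split(' ')) {
      yield { type: 'text', delta: word };
    }
  }

  async speak(_opts: { text: string }): Promise<void> {
    // No-op in this sketch; a real provider would synthesize and play audio.
  }
}
```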
Recommended animations: Mixamo clips (idle, walk, dance) exported as FBX work well with the animations config.
Avatar not rendering? Check these:
- The avatar is inside a <Canvas> from @react-three/fiber
- The tree is wrapped in <KhaveeProvider>
- The scene has lights (<ambientLight>, <directionalLight>)
Animations not playing? Check these:
- The animation file URLs in the animations config are correct
Expressions not changing? Check these:
- The component is rendered inside <KhaveeProvider>
Chat or voice not responding? Check these:
- <KhaveeProvider config={...}> is configured with providers
- useLLM() or useVoice() is called inside the provider
Contributions are welcome! Please read our Contributing Guide for details.
MIT © Khavee AI
Check out our example app to see these features in action.
Built with ❤️ by the Khavee AI Team