
@lanonasis/ai-sdk
Drop-in AI SDK for browser and Node.js applications with persistent memory, chat completions, and vortexai-l0 orchestration.
Current Release: v0.3.0
Install with npm or bun:

```bash
npm install @lanonasis/ai-sdk
# or
bun add @lanonasis/ai-sdk
```
Add your API key to your environment:
```bash
# .env.local (Next.js) or .env
LANONASIS_API_KEY=lano_your_api_key_here

# For client-side usage in Next.js
NEXT_PUBLIC_LANONASIS_API_KEY=lano_your_api_key_here
```
Once configured, the SDK works seamlessly with no additional setup required.
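Keys use the `lano_` prefix, so it can help to fail fast on a missing or malformed key before constructing the client. The helper below is a hypothetical sketch, not part of the SDK:

```typescript
// Hypothetical helper (not part of the SDK): validate the key before
// constructing the client, so misconfiguration fails at startup.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.LANONASIS_API_KEY;
  if (!key) {
    throw new Error('LANONASIS_API_KEY is not set');
  }
  if (!key.startsWith('lano_')) {
    throw new Error('LANONASIS_API_KEY must use the lano_ prefix');
  }
  return key;
}

// Usage: const ai = new LanonasisAI({ apiKey: requireApiKey(process.env) });
```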
```typescript
import { LanonasisAI } from '@lanonasis/ai-sdk';

// Initialize with API key from environment
const ai = new LanonasisAI({
  apiKey: process.env.LANONASIS_API_KEY!,
  // baseUrl defaults to https://api.lanonasis.com
});

// Simple message
const response = await ai.send('Hello, who are you?');
console.log(response);

// Chat with full control
const result = await ai.chat({
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is 2+2?' },
  ],
  model: 'gpt-4o-mini',
  temperature: 0.7,
  maxTokens: 1000,
});
console.log(result.response);
```
```typescript
import { LanonasisAI } from '@lanonasis/ai-sdk';

const ai = new LanonasisAI({
  apiKey: process.env.LANONASIS_API_KEY!,
});

// Single-turn conversation
const response = await ai.chat({
  messages: [
    { role: 'user', content: 'Explain quantum computing in simple terms' },
  ],
  model: 'gpt-4o-mini',
});
console.log(response.response);
// Output: "Quantum computing is like having a super-powered calculator..."
```
```typescript
const ai = new LanonasisAI({ apiKey: process.env.LANONASIS_API_KEY! });

// Build conversation history
const messages = [
  { role: 'system', content: 'You are a coding assistant specializing in TypeScript.' },
  { role: 'user', content: 'How do I create a generic function?' },
];
const response1 = await ai.chat({ messages });
console.log(response1.response);

// Continue the conversation
messages.push({ role: 'assistant', content: response1.response });
messages.push({ role: 'user', content: 'Can you show me an example with arrays?' });
const response2 = await ai.chat({ messages });
console.log(response2.response);
```
The SDK automatically integrates with @lanonasis/memory-sdk-standalone:
```typescript
const ai = new LanonasisAI({
  apiKey: process.env.LANONASIS_API_KEY!,
  memory: {
    enabled: true,
    autoSave: true,
    contextStrategy: 'relevance',
  },
});

// Store important information
await ai.memory.createMemory({
  title: 'User Preferences',
  content: 'User prefers dark mode and concise responses',
  status: 'active',
});

// Search memories
const memories = await ai.memory.searchMemories({
  query: 'user preferences',
  status: 'active',
  threshold: 0.7,
});

// Chat with memory context
const response = await ai.chat({
  messages: [{ role: 'user', content: 'Remember my preferences?' }],
  conversationId: 'user-123-session',
});

// Get context from memories
const context = await ai.memory.searchWithContext('previous conversations');
```
```typescript
// Async iterator style
for await (const chunk of ai.chatStream({
  messages: [{ role: 'user', content: 'Tell me a story' }],
})) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}

// Callback style
await ai.streamChat(
  { messages: [{ role: 'user', content: 'Count to 10' }] },
  (chunk) => console.log(chunk.choices[0]?.delta?.content),
);
```
Use vortexai-l0 for complex workflows:
```typescript
// Remote API orchestration (default)
const result = await ai.orchestrate('Create a viral TikTok campaign');

// Local orchestration (no API call)
const localAi = new LanonasisAI({
  apiKey: 'lnss_xxx',
  useLocalOrchestration: true,
});
const localResult = await localAi.orchestrate('analyze trending hashtags');
```
```tsx
import { useLanonasis, useChat } from '@lanonasis/ai-sdk/react';

function ChatComponent() {
  const { client, isReady } = useLanonasis({
    apiKey: process.env.NEXT_PUBLIC_LANONASIS_API_KEY!,
  });
  const { messages, send, isLoading, sendWithStream } = useChat({
    client,
    systemPrompt: 'You are a helpful assistant.',
  });

  return (
    <div>
      {messages.map((msg, i) => (
        <div key={i} className={msg.role}>{msg.content}</div>
      ))}
      {isLoading && <div>Thinking...</div>}
    </div>
  );
}
```
```typescript
new LanonasisAI({
  apiKey: string;                  // Required: lano_xxx format
  baseUrl?: string;                // Default: https://api.lanonasis.com
  memoryUrl?: string;              // Default: same as baseUrl
  timeout?: number;                // Default: 30000 (30s)
  maxRetries?: number;             // Default: 3
  debug?: boolean;                 // Default: false
  organizationId?: string;         // For multi-tenant setups
  useLocalOrchestration?: boolean; // Use vortexai-l0 locally
  memory?: {
    enabled?: boolean;             // Default: true
    autoSave?: boolean;            // Default: true
    contextStrategy?: string;      // 'relevance' | 'temporal' | 'hybrid'
    maxContextTokens?: number;     // Default: 4000
  };
});
```
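The SDK's internal retry behavior is not documented here, but `maxRetries` conceptually works like a generic retry-with-exponential-backoff helper. This sketch is illustrative only, not the SDK's implementation:

```typescript
// Illustrative sketch (not the SDK's code): retry a failing async call
// up to maxRetries times with exponential backoff between attempts.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxRetries) {
        // Exponential backoff: 250ms, 500ms, 1000ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```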
The SDK connects to these endpoints by default:
| Feature | Endpoint |
|---|---|
| Chat Completions | POST /v1/chat/completions |
| Memory API | GET/POST /api/v1/memory |
| Health Check | GET /health |
All requests are authenticated via the `Authorization: Bearer lano_xxx` header.
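For reference, a raw request against these endpoints only needs the bearer header. A minimal sketch, assuming the endpoints in the table above (`lano_xxx` is a placeholder key):

```typescript
// Build the headers every authenticated request needs.
function authHeaders(apiKey: string): Record<string, string> {
  return {
    Authorization: `Bearer ${apiKey}`,
    'Content-Type': 'application/json',
  };
}

// Example: check service availability via GET /health.
// const res = await fetch('https://api.lanonasis.com/health', {
//   headers: authHeaders('lano_xxx'),
// });
// console.log(res.status);
```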
This SDK integrates with existing Lanonasis infrastructure:
```typescript
// Old (v0.1.0)
import { AiSDK } from '@lanonasis/ai-sdk';
const sdk = new AiSDK();
await sdk.orchestrate('query');

// New (v0.2.x)
import { LanonasisAI } from '@lanonasis/ai-sdk';
const ai = new LanonasisAI({ apiKey: process.env.LANONASIS_API_KEY! });
await ai.orchestrate('query');

// Or simply:
const message = await ai.send('query');
```
License: MIT
For direct access to L0's memory-plugin APIs:
```typescript
import { memoryAPI, configureMemoryPlugin } from '@lanonasis/ai-sdk';

// Configure the memory plugin (optional if using LanonasisAI)
configureMemoryPlugin({
  apiUrl: 'https://api.lanonasis.com',
  authToken: 'lano_xxx',
});

// Core operations
await memoryAPI.search('project requirements');
await memoryAPI.create({ title: 'Notes', content: '...', type: 'context' });
await memoryAPI.list({ type: 'project' });

// Intelligence features
await memoryAPI.suggestTags('memory-id');
await memoryAPI.findRelated('memory-id');
await memoryAPI.detectDuplicates(0.9);

// Behavioral features
await memoryAPI.recallBehavior({ task: 'deploy', directory: '/app' });
await memoryAPI.suggestNextAction({ task: 'fix bug', completed: ['found issue'] });
await memoryAPI.recordPattern({ trigger: 'user asked...', actions: [...] });
```
```typescript
import { LanonasisAI, createPluginManager } from '@lanonasis/ai-sdk';

const plugins = createPluginManager();
plugins.register({
  metadata: { name: 'demo', version: '1.0.0', description: 'Demo plugin' },
  triggers: ['demo'],
  handler: async (ctx) => ({ message: `Hello ${ctx.query}`, type: 'orchestration' }),
});

const sdk = new LanonasisAI({ apiKey: process.env.LANONASIS_API_KEY!, plugins });
await sdk.orchestrate('demo request');
```
The browser smoke test ensures no Node-only dependencies leak into browser builds:

```bash
cd packages/ai-sdk
bun run smoke:web   # esbuild bundle → dist-smoke/index.js
```

To build the package:

```bash
cd packages/ai-sdk
bun run build
```

vortexai-l0 provides a browser-safe entry; the CLI lives in vortexai-l0/dist/node/cli.js. Verify changes with `bun run build` and `bun run smoke:web`.