
# mars-llm

A Node.js client that retrieves Azure OpenAI credentials from Azure Key Vault and provides both streaming and non-streaming chat completion capabilities.
## Installation

```sh
npm install mars-llm
```

Or, when working from a local checkout, install dependencies with:

```sh
npm install
```
## Setup

Set the Key Vault URL:

```sh
export AZURE_KEY_VAULT_URL=https://your-keyvault.vault.azure.net/
```

Run `az login` to authenticate. Your Azure Key Vault must contain the following secrets:
| Secret Name | Description | Example Value |
|---|---|---|
| MARS-API-KEY | Your Azure OpenAI API key | abc123... |
| MARS-DEPLOYMENT | Your Azure OpenAI deployment name | gpt-4o |
| MARS-ENDPOINT | Your Azure OpenAI endpoint URL | https://your-resource.openai.azure.com/ |
| MARS-API-VERSION | API version | 2024-02-15-preview |
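As a sanity check before wiring up the client, you can verify that every required secret name from the table above exists. This helper is hypothetical and not part of mars-llm; it only compares name lists, so it runs without any Azure SDK:

```javascript
// Hypothetical helper (not part of mars-llm): the four secret names the
// README says the vault must contain.
const REQUIRED_SECRETS = [
  'MARS-API-KEY',
  'MARS-DEPLOYMENT',
  'MARS-ENDPOINT',
  'MARS-API-VERSION',
];

// Given the secret names retrieved from the vault (e.g. via
// `az keyvault secret list`), return the required names that are missing.
function missingSecrets(availableNames) {
  return REQUIRED_SECRETS.filter((name) => !availableNames.includes(name));
}
```

An empty result means the vault has everything the client expects; anything else tells you exactly which secret to add.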
You must set the Key Vault URL via an environment variable:

```sh
export AZURE_KEY_VAULT_URL=https://your-keyvault.vault.azure.net/
```

Or on Windows:

```bat
set AZURE_KEY_VAULT_URL=https://your-keyvault.vault.azure.net/
```

This environment variable is required for the client to work.
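Because the client cannot work without this variable, it can help to fail fast at startup with an actionable message. The following check is an illustrative sketch, not part of mars-llm:

```javascript
// Hypothetical startup check (not part of mars-llm): fail fast with a
// clear message when AZURE_KEY_VAULT_URL is not set.
function getKeyVaultUrl(env = process.env) {
  const url = env.AZURE_KEY_VAULT_URL;
  if (!url) {
    throw new Error(
      'AZURE_KEY_VAULT_URL is not set. Example: ' +
        'export AZURE_KEY_VAULT_URL=https://your-keyvault.vault.azure.net/'
    );
  }
  return url;
}
```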
This client uses `DefaultAzureCredential`, which automatically tries authentication methods in order:

1. **Azure CLI**: run `az login` before using the client; no additional configuration needed.
2. **Managed Identity**: works automatically when running on Azure services; no configuration needed.
3. **Environment variables**: set `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`, and `AZURE_TENANT_ID` if required.
4. **Azure PowerShell**: uses the Azure PowerShell context if available.
5. **Interactive browser**: falls back to interactive authentication if the other methods fail.
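Conceptually, this fallback chain amounts to trying each credential source in order and using the first one that succeeds. The sketch below is an illustration of that pattern only, not the real Azure SDK implementation:

```javascript
// Illustration only (not the Azure SDK): try each credential provider
// in order and return the first result that succeeds.
async function firstWorkingCredential(providers) {
  const errors = [];
  for (const provider of providers) {
    try {
      return await provider();
    } catch (err) {
      errors.push(err.message);
    }
  }
  throw new Error('No credential worked: ' + errors.join('; '));
}
```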
```js
// If installed from npm
const AzureOpenAIClient = require('mars-llm');
// If using locally
// const AzureOpenAIClient = require('./azure-openai-client');

async function example() {
  const client = new AzureOpenAIClient();

  // Simple chat
  const response = await client.chat('Hello, how are you?');
  console.log(response.content);
}
```
```js
const AzureOpenAIClient = require('./azure-openai-client');

async function advancedExample() {
  const client = new AzureOpenAIClient();

  // Chat with custom options
  const response = await client.chat('Explain quantum computing', {
    maxTokens: 200,
    temperature: 0.7,
    model: 'gpt-4o'
  });

  console.log('Response:', response.content);
  console.log('Usage:', response.usage);
}
```
```js
const AzureOpenAIClient = require('./azure-openai-client');

async function streamingExample() {
  const client = new AzureOpenAIClient();

  // Streaming chat with a callback
  await client.chatStream(
    'Tell me a story about AI',
    (chunk, fullContent) => {
      process.stdout.write(chunk); // Print each chunk as it arrives
    },
    {
      maxTokens: 500,
      temperature: 0.8
    }
  );
}
```
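To make the callback's two-argument shape concrete, here is a self-contained simulation of how `(chunk, fullContent)` accumulates. `simulateStream` is a stand-in for `client.chatStream`, not part of mars-llm; the real client streams tokens from Azure OpenAI rather than a fixed array:

```javascript
// Illustration only: how a chunk callback of the shape
// (chunk, fullContent) => {} sees a streamed reply build up.
async function simulateStream(chunks, onChunk) {
  let fullContent = '';
  for (const chunk of chunks) {
    fullContent += chunk;
    onChunk(chunk, fullContent);
  }
  return { content: fullContent };
}
```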
```js
const AzureOpenAIClient = require('./azure-openai-client');

async function promptTextExample() {
  const client = new AzureOpenAIClient();

  const prompt = 'Summarize the following text';
  const text = 'Your long text content here...';

  // Non-streaming
  const response = await client.chatCompletion(prompt, text, {
    maxTokens: 150,
    temperature: 0.5
  });

  // Streaming
  await client.chatCompletionStream(
    prompt,
    text,
    (chunk) => process.stdout.write(chunk),
    { maxTokens: 150 }
  );
}
```
```js
const AzureOpenAIClient = require('./azure-openai-client');

async function imageExample() {
  const client = new AzureOpenAIClient();

  // Analyze an image from a file
  const response1 = await client.chatWithImages(
    'What do you see in this image?',
    './my-image.jpg',
    {
      maxTokens: 300,
      imageDetail: 'high' // 'low', 'high', or 'auto'
    }
  );

  // Analyze multiple images
  const response2 = await client.chatWithImages(
    'Compare these images',
    ['./image1.jpg', './image2.jpg']
  );

  // Use a base64-encoded image
  const response3 = await client.chat('Describe this image', {
    images: [
      {
        type: 'base64',
        data: 'iVBORw0KGgoAAAANSUhEUgAAAAEAAAAB...',
        mimeType: 'image/png'
      }
    ]
  });

  // Use an image URL
  const response4 = await client.chat('What is shown here?', {
    images: [
      {
        type: 'url',
        data: 'https://example.com/image.jpg'
      }
    ]
  });

  // Streaming with images
  await client.chatStream(
    'Create a story based on this image',
    (chunk) => process.stdout.write(chunk),
    {
      images: [{ type: 'file', data: './story-image.jpg' }]
    }
  );
}
```
## API Reference

### Constructor

```js
const client = new AzureOpenAIClient(keyVaultUrl);
```

- `keyVaultUrl` (optional): Key Vault URL. If not provided, the `AZURE_KEY_VAULT_URL` environment variable is used; one of the two must be set.

### chat(message, options)

Simple chat completion without streaming.

Parameters:

- `message` (string): The message to send
- `options` (object): Optional configuration
  - `model` (string): Model name (default: `'gpt-4o'`)
  - `maxTokens` (number): Maximum tokens (default: `50`)
  - `temperature` (number): Temperature (default: `0.7`)
  - `images` (array): Array of image objects (NEW!)

Returns: Promise resolving to a response object containing `content`, `usage`, `model`, and `finishReason`.

### chatStream(message, onChunk, options)

Streaming chat completion.

Parameters:

- `message` (string): The message to send
- `onChunk` (function): Callback invoked for each chunk: `(chunk, fullContent) => {}`
- `options` (object): Optional configuration (same as `chat`, including `images`)

Returns: Promise resolving to a response object containing `content` and `model`.
### chatCompletion(prompt, text, options)

Chat completion with separate prompt and text (Python-style).

### chatCompletionStream(prompt, text, onChunk, options)

Streaming chat completion with separate prompt and text.

### chatWithImages(message, imagePaths, options) (NEW!)

Convenience method for image analysis.
Parameters:

- `message` (string): The message about the image(s)
- `imagePaths` (string|array): Single image path or array of image paths
- `options` (object): Optional configuration
  - `imageDetail` (string): `'low'`, `'high'`, or `'auto'` (default: `'auto'`)

Image objects passed via the `images` option have the following shape:

```js
{
  type: 'file' | 'base64' | 'url',
  data: 'path/to/image.jpg' | 'base64string' | 'https://example.com/image.jpg',
  mimeType: 'image/png', // required for base64
  detail: 'low' | 'high' | 'auto' // optional, default: 'auto'
}
```
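A small helper can construct a base64 image object of this shape from a raw buffer. This is a hypothetical sketch, not part of mars-llm:

```javascript
// Hypothetical helper (not part of mars-llm): build a base64 image
// object of the documented shape from a raw image buffer.
function toBase64Image(buffer, mimeType, detail = 'auto') {
  return {
    type: 'base64',
    data: buffer.toString('base64'),
    mimeType, // required for base64 images
    detail, // 'low' | 'high' | 'auto'
  };
}
```

For example, `toBase64Image(fs.readFileSync('./my-image.png'), 'image/png')` yields an object you can pass in the `images` array.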
Image detail levels:

- `low`: Faster processing, lower cost
- `high`: More detailed analysis, higher cost
- `auto`: Model decides based on the image

## Running the Examples

```sh
npm start
node example.js

# Image analysis demonstration
node samples/multimodal-example.js

# Simple image test
node samples/simple-multimodal-test.js
```
## Troubleshooting

For local development, authenticate with the Azure CLI and verify your setup:

```sh
az login && az account show
az keyvault secret show --vault-name your-vault --name MARS-API-KEY
node samples/simple-multimodal-test.js
```

## License

MIT License