
@helicone/helicone
A Node.js wrapper for the OpenAI API that logs all requests to Helicone.
This package is a simple and convenient way to log all requests made through the OpenAI API with Helicone. You can easily track and manage your OpenAI API usage and monitor your GPT models' cost, latency, and performance on the Helicone platform.
Proxy logging
To get started, install the @helicone/helicone package:
npm install @helicone/helicone
Set HELICONE_API_KEY as an environment variable:
ℹ️ You can also set the Helicone API key in your code (see below).
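For example, in a POSIX shell (the key value below is a placeholder):

```shell
# Replace the value with your own Helicone API key.
export HELICONE_API_KEY="your-helicone-api-key"
```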
Replace:
const { ClientOptions, OpenAI } = require("openai");
with:
const {
  HeliconeProxyOpenAI: OpenAI,
  IHeliconeProxyClientOptions: ClientOptions,
} = require("@helicone/helicone");
Make a request. Chat, completion, embedding, and other usage is equivalent to the OpenAI package:
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  heliconeMeta: {
    apiKey: process.env.HELICONE_API_KEY, // Can be set as env variable
    // ... additional helicone meta fields
  },
});

const chatCompletion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello world" }],
});
console.log(chatCompletion.choices[0].message);
Logging feedback
Ensure you store the helicone-id header returned in the original response:
const { data, response } = await openai.chat.completions
  .create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello world" }],
  })
  .withResponse();
const heliconeId = response.headers.get("helicone-id");
await openai.helicone.logFeedback(heliconeId, HeliconeFeedbackRating.Positive); // or Negative
interface IHeliconeMeta {
  apiKey?: string;
  properties?: { [key: string]: any };
  cache?: boolean;
  retry?: boolean | { [key: string]: any };
  rateLimitPolicy?: string | { [key: string]: any };
  user?: string;
  baseUrl?: string;
  onFeedback?: OnHeliconeFeedback; // Callback after feedback was processed
}
type OnHeliconeLog = (response: Response) => Promise<void>;
type OnHeliconeFeedback = (result: Response) => Promise<void>;
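These meta fields correspond to Helicone request headers that the proxy wrapper attaches to each call. The sketch below is illustrative only — the `heliconeHeaders` helper is hypothetical, and the wrapper performs an equivalent mapping internally — but it uses Helicone's documented header names:

```javascript
// Illustrative sketch (not part of this package's API): roughly how
// heliconeMeta fields become Helicone proxy request headers.
function heliconeHeaders(meta) {
  const headers = { "Helicone-Auth": `Bearer ${meta.apiKey}` };
  if (meta.cache) headers["Helicone-Cache-Enabled"] = "true";
  if (meta.retry) headers["Helicone-Retry-Enabled"] = "true";
  if (meta.user) headers["Helicone-User-Id"] = meta.user;
  // Custom properties are sent as Helicone-Property-<Name> headers.
  for (const [name, value] of Object.entries(meta.properties ?? {})) {
    headers[`Helicone-Property-${name}`] = String(value);
  }
  return headers;
}
```

For example, `heliconeHeaders({ apiKey: "hk-123", cache: true, properties: { Session: "24" } })` produces `Helicone-Auth`, `Helicone-Cache-Enabled`, and `Helicone-Property-Session` headers.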
A fully configured set of proxy client options (note that IHeliconeProxyClientOptions is an interface, so it types a plain object rather than being instantiated with new):

const options: IHeliconeProxyClientOptions = {
  apiKey,
  heliconeMeta: {
    apiKey: process.env.HELICONE_API_KEY,
    cache: true,
    retry: true,
    properties: {
      Session: "24",
      Conversation: "support_issue_2",
    },
    rateLimitPolicy: {
      quota: 10,
      time_window: 60,
      segment: "Session",
    },
  },
};
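When rateLimitPolicy is given as an object, it corresponds to Helicone's Helicone-RateLimit-Policy header, whose documented string form is `[quota];w=[time_window];s=[segment]` (segment is optional). A hypothetical helper, not part of this package's API, showing that serialization:

```javascript
// Illustrative sketch: serializes a rate-limit policy object into the
// "[quota];w=[time_window];s=[segment]" string form Helicone expects.
function rateLimitPolicyHeader({ quota, time_window, segment }) {
  let policy = `${quota};w=${time_window}`;
  if (segment) policy += `;s=${segment}`;
  return policy;
}

console.log(rateLimitPolicyHeader({ quota: 10, time_window: 60, segment: "Session" }));
// → "10;w=60;s=Session"
```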
Async logging
To get started, install the @helicone/helicone package:
npm install @helicone/helicone
Set HELICONE_API_KEY as an environment variable:
ℹ️ You can also set the Helicone API key in your code (see below).
Replace:
const { ClientOptions, OpenAI } = require("openai");
with:
const {
  HeliconeAsyncOpenAI: OpenAI,
  IHeliconeAsyncClientOptions: ClientOptions,
} = require("@helicone/helicone");
Make a request. Chat, completion, embedding, and other usage is equivalent to the OpenAI package:
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  heliconeMeta: {
    apiKey: process.env.HELICONE_API_KEY, // Can be set as env variable
    // ... additional helicone meta fields
  },
});

const chatCompletion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello world" }],
});
console.log(chatCompletion.choices[0].message);
With async logging, you must retrieve the helicone-id header from the log response (not the LLM response):
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  heliconeMeta: {
    apiKey: process.env.HELICONE_API_KEY,
    onLog: async (response: Response) => {
      const heliconeId = response.headers.get("helicone-id");
      await openai.helicone.logFeedback(
        heliconeId,
        HeliconeFeedbackRating.Positive
      );
    },
  },
});
Async logging loses some proxy-only features, such as caching, rate limiting, and retries, because requests go directly to OpenAI rather than through the Helicone proxy.
interface IHeliconeMeta {
  apiKey?: string;
  properties?: { [key: string]: any };
  user?: string;
  baseUrl?: string;
  onLog?: OnHeliconeLog;
  onFeedback?: OnHeliconeFeedback;
}

type OnHeliconeLog = (response: Response) => Promise<void>;
type OnHeliconeFeedback = (result: Response) => Promise<void>;
For more information, see our documentation.
FAQs
A Node.js wrapper for the OpenAI API that logs all requests to Helicone.
The npm package @helicone/helicone receives a total of 1,336 weekly downloads. As such, @helicone/helicone's popularity is classified as popular.
We found that @helicone/helicone demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 2 open source maintainers collaborating on the project.