
@imaginary-dev/openai
**This package has been deprecated, please migrate to <https://www.npmjs.com/package/@libretto/openai>.**
TypeScript wrapper around the openai library that sends events to Templatest.
```sh
npm install @imaginary-dev/openai
```

To use this library, you need to patch the openai library. Patching times calls to OpenAI and reports them to Templatest.

You'll need an API key from Templatest. Set it in the `PROMPT_API_KEY` environment variable or pass it directly to the `patch()` call. You'll also probably want to name which template you are using.
```ts
import { patch, objectTemplate } from "@imaginary-dev/openai";
import OpenAI from "openai";

async function main() {
  patch({
    apiKey: "XXX", // defaults to process.env.PROMPT_API_KEY
    // You can set this here or in the `create` call:
    // promptTemplateName: "my-template-test"
    OpenAI,
  });

  const openai = new OpenAI({
    apiKey: "YYY", // defaults to process.env.OPENAI_API_KEY
  });

  const completion = await openai.chat.completions.create({
    // Instead of a chat message array, you can pass objectTemplate instead.
    messages: objectTemplate([
      { role: "user", content: "Give a hearty welcome to our new user {name}" },
    ]) as any,
    model: "gpt-3.5-turbo",
    // Uniquely identifies this prompt within your project. Equivalent to
    // passing `promptTemplateName` to `patch()`.
    ip_prompt_template_name: "ts-client-test-chat",
    // The parameters to fill in the template.
    ip_template_params: { name: "John" },
  });
  console.log(completion.choices);
}

main();
```
You can "unpatch" the library by calling `unpatch()`. This will restore the original `create` method on the `chat.completions` object.
```ts
import { patch, objectTemplate } from "@imaginary-dev/openai";
import OpenAI from "openai";

const unpatch = patch({ OpenAI });
const openai = new OpenAI();

try {
  const completion = await openai.chat.completions.create({...});
} finally {
  unpatch();
}
```
The following options may be passed to `patch()`:

- `promptTemplateName`: A default name to associate with prompts. If provided, this is the name that will be associated with any `create` call that's made without an `ip_prompt_template_name` parameter.
- `allowUnnamedPrompts`: When set to `true`, every prompt will be sent to Templatest even if no prompt template name has been provided (either via the `promptTemplateName` option on `patch` or via the `ip_prompt_template_name` parameter added to `create`).
- `redactPii`: When `true`, the library attempts to redact certain personally identifying information (PII) before events are sent to the Templatest backend. See the `pii` package for details about the types of PII being detected and redacted. `false` by default.

The following parameters are added to the `create` call:

- `ip_template_params`: The parameters to use for template strings. This is a dictionary of key-value pairs.
- `ip_chat_id`: The id of a "chat session". If the chat API is being used in a conversational context, the same chat id can be provided so that the events are grouped together, in order. If not provided, this is left blank.
- `ip_template_chat`: The chat template to record for chat requests. This is a list of dictionaries with the following keys:
  - `role`: The role of the speaker. Either `"system"`, `"user"`, or `"ai"`.
  - `content`: The content of the message. This can be a string or a template string with `{}` placeholders.
- `ip_template_text`: The text template to record for non-chat completion requests. This is a string or a template string with `{}` placeholders.
- `ip_parent_event_id`: The UUID of the parent event. All calls with the same parent id are grouped as a "Run Group".
- `ip_feedback_key`: The optional key used to send feedback on the prompt, for use with `sendFeedback()` later. This is normally auto-generated, and the value is returned in the OpenAI response.

Sometimes the answer provided by the LLM is not ideal, and your users may be able to help you find better responses.
You can send this feedback to Templatest by calling `sendFeedback()`. This sends a feedback event to Templatest about a prompt that was previously called, and lets you review the feedback in the Templatest dashboard. You can use this feedback to develop new tests and improve your prompts.
```ts
import { patch, sendFeedback } from "@imaginary-dev/openai";
import OpenAI from "openai";

async function main() {
  patch({ OpenAI });
  const openai = new OpenAI();

  const completion = await openai.chat.completions.create({
    // ...
  });
  const answer = completion.choices[0].message.content;

  // Maybe the user didn't like the answer, so ask them for a better one.
  const betterAnswer = await askUserForBetterResult(answer);

  // If the user provided a better answer, send feedback to Templatest.
  if (betterAnswer !== answer) {
    // The feedback key is automatically injected into the OpenAI response object.
    const feedbackKey = (completion as any).ip_feedback_key;
    await sendFeedback({
      apiKey, // your Templatest API key
      feedbackKey,
      // Better answer from the user
      betterResponse: betterAnswer,
      // Rating of existing answer, from 0 to 1
      rating: 0.2,
    });
  }
}

main();
```
Note that feedback can include `rating`, `betterResponse`, or both.

Parameters:

- `rating`: a value from 0 (meaning the result was completely wrong) to 1 (meaning the result was correct)
- `betterResponse`: the better response from the user
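For example, a rating-only feedback event with no replacement text might look like this. This is a sketch: the feedback key shown is a placeholder for the `ip_feedback_key` returned by a previous patched `create` call:

```ts
import { sendFeedback } from "@imaginary-dev/openai";

// Placeholder: in practice, read this from completion.ip_feedback_key.
const feedbackKey = "...";

await sendFeedback({
  apiKey: process.env.PROMPT_API_KEY,
  feedbackKey,
  rating: 0, // the result was completely wrong
});
```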
We found that @imaginary-dev/openai demonstrates an unhealthy version release cadence and low project activity, as the last version was released a year ago. It has 3 open source maintainers collaborating on the project.