
@gpt-tag/openai
@gpt-tag/openai is the OpenAI-specific version of gpt-tag. It's designed to be easily composable within your existing application, leaving code organization and control up to you.
Install @gpt-tag/openai and make sure you have installed openai as well.
npm install @gpt-tag/openai openai
or
yarn add @gpt-tag/openai openai
gpt-tag libraries use a fluent interface to help provide maximum composability. Simply import openai from @gpt-tag/openai to start composing.
import { openai } from "@gpt-tag/openai";
// Fix the temperature at 0 for deterministic, factual answers
const factual = openai.temperature(0);
// The tagged template builds a prompt; no request is made yet
const president = factual`Who was president of the United States in 1997?`;
// .get() resolves the LLM call and returns the answer
const result = await president.get();
// Bill Clinton
You can embed the result of one tag inside of another tag to create powerful compositions. No LLM requests are made until you call .get() on a tag; at that point it resolves the necessary calls sequentially to provide a final answer.
import { openai } from "@gpt-tag/openai";
const factual = openai.temperature(0);
const opinion = openai.temperature(1);
// The president tag is embedded in the two prompts below; its call is resolved first when .get() runs
const president = factual`Who was president of the United States in 1997? Respond with only their name`;
const height = factual`What is ${president}'s height? Respond with only the height. Format: D'D"`;
const iceCream = opinion`Which flavor of ice cream would be preferred by ${president}? Choose only one. Guess if you don't know. Format: <flavor>`;
// Each .get() resolves its own chain of calls; Promise.all awaits both
const [heightAnswer, iceCreamAnswer] = await Promise.all([
  height.get(),
  iceCream.get(),
]);
// [ 6'2" , mango ]
See examples for more samples of how you can compose tags together.
Most methods return the GPTTag instance itself for fluent chaining (see the combined example after the method list below).
Return a promise that resolves any LLM calls used to compose this tag
await openai.get()
Use temperature to control randomness
openai.temperature(0.5)
Override the model used
openai.model('gpt-4')
Use a streaming response
openai.stream(true)
Transform the result of a call before returning it
openai.transform((result) => result.trim())
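For instance, the configuration methods above can be chained in a single expression before the instance is used as a template tag. The following is a minimal sketch using only the calls documented above; the variable names, prompt, and expected output are illustrative assumptions.
import { openai } from "@gpt-tag/openai";
// Each call returns the GPTTag instance, so configuration can be chained
const concise = openai
  .temperature(0)
  .model("gpt-4")
  .transform((result) => result.trim());
// The configured instance is then used as a template tag and resolved with .get()
const capital = concise`What is the capital of France? Respond with only the city name`;
const answer = await capital.get();
// Paris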
FAQs
A library for building OpenAI-powered applications using template strings
The npm package @gpt-tag/openai receives a total of 1 weekly download. As such, @gpt-tag/openai's popularity was classified as not popular.
We found that @gpt-tag/openai demonstrated an unhealthy version release cadence and low project activity because the last version was released a year ago. It has 0 open source maintainers collaborating on the project.