ts-chatgpt

A library for receiving plain, fully typed responses from the official ChatGPT API by OpenAI.

Install

npm install ts-chatgpt

It has been confirmed to work with Remix's loader function.
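
For example, a minimal sketch of calling prompt from a Remix loader. The route module and the @remix-run/node import reflect a typical Remix setup and are assumptions, not part of this library:

import { json } from "@remix-run/node";
import { prompt } from "ts-chatgpt";

// Hypothetical route loader; OPENAI_API_KEY is read from the server environment.
export async function loader() {
  const response = await prompt({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Say hello in one short sentence." }],
    options: { temperature: 0.1 },
  });
  return json(response);
}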

Usage

| Function | Description | Parameters | Return |
| --- | --- | --- | --- |
| prompt | Get a response from the ChatGPT API. | props - an object containing the model name you want to use for the ChatGPT API, an array of Message type, and an options object. | Promise<ChatGPTResponse> |

When calling this function, be sure to set the OPENAI_API_KEY environment variable to the API key you received from OpenAI.

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-4",
  messages: [
    {
      role: "user",
      content:
        "In the style of Nicholas Sparks, please summarize the following introduction. You are limited to 140 characters. 'I love Android and I develop applications using Kotlin and Jetpack Compose.'",
    },
  ],
  options: {
    temperature: 0.1,
  },
});

Since dotenv.config() is automatically called internally, developers do not need to install dotenv to load OPENAI_API_KEY themselves.

Props

When calling prompt(), you must pass an object containing the following as an argument:

| Key | Description | Type | Required |
| --- | --- | --- | --- |
| model | The model name you want to use for the ChatGPT API. | string | |
| messages | An array of Message type. | Message[] | |
| options | An object containing options. | PromptOptions | |

The following values are currently available for model; more will be added in the future.

| Model | Description | Available |
| --- | --- | --- |
| gpt-3.5-turbo-0301 | The default model. | |
| gpt-3.5-turbo | - | |
| gpt-4 | GPT-4 is the latest and most powerful model. | |

Each entry in the messages array passed to the prompt function has the following fields (a combined system and user example follows the table).

| Key | Description | Type | Required |
| --- | --- | --- | --- |
| role | The role of the message. | "system", "assistant" or "user" | |
| content | The content of the message. | string | |
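
A minimal sketch combining a system message with a user message; the message texts are illustrative only:

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-3.5-turbo",
  messages: [
    // The system message sets the assistant's behavior; the user message asks the question.
    { role: "system", content: "You are a concise assistant." },
    { role: "user", content: "Explain what a Promise is in one sentence." },
  ],
  options: { temperature: 0.2 },
});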

The following fields can be specified in the options object passed to the prompt function.

| Key | Description | Type |
| --- | --- | --- |
| apiKey | API key that can be obtained from the OpenAI configuration page. You can omit this value by setting the OPENAI_API_KEY environment variable. | string |
| temperature | The lower the temperature, the more focused and accurate the results; values at or near 0 (e.g. 0.1 or 0.2) tend to give better results in most cases. With GPT-3, higher temperatures produce more creative and random output, while with Codex, higher temperatures can produce truly random and erratic responses. | number |

For detailed specifications of the ChatGPT API, please refer to this document.
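
If you prefer not to rely on the OPENAI_API_KEY environment variable, the key can be passed explicitly through options.apiKey, as listed in the table above. A minimal sketch; the literal key below is a placeholder:

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
  options: {
    // Placeholder key: replace with your own, or omit and set OPENAI_API_KEY instead.
    apiKey: "sk-your-api-key",
    temperature: 0.1,
  },
});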

Response Type

There are two types of return values for the prompt function: ChatGPT and ChatGPTError.

| Type | Description |
| --- | --- |
| ChatGPT | The response from the ChatGPT API. |
| ChatGPTError | The response from the ChatGPT API when an error occurs. |

The ChatGPT type is as follows:

type ChatGPT = {
  choices?:
    | {
        message: {
          role: string;
          content: string;
        };
        finish_reason: string;
        index: number;
      }[]
    | undefined;
  object: string;
  id: string;
  created: number;
  model: string;
  usage: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
};

Next, the ChatGPTError type is as follows:

type ChatGPTError = {
  error: {
    message: string;
    type: string;
    param: string | null;
    code: string | null;
  };
};
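
Because the resolved value can take either shape, here is a minimal sketch of telling them apart at runtime. The local type aliases and the cast are assumptions made so the example stands alone; whether the library exports the ChatGPT and ChatGPTError types directly is not confirmed here.

import { prompt } from "ts-chatgpt";

// Minimal local aliases mirroring the shapes documented above.
type SuccessShape = {
  choices?: { message: { role: string; content: string } }[];
};
type ErrorShape = {
  error: { message: string };
};

const response = (await prompt({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Say hello." }],
  options: { temperature: 0.1 },
})) as SuccessShape | ErrorShape;

if ("error" in response) {
  // Error shape (see ChatGPTError above): surface the message.
  console.error(`ChatGPT API error: ${response.error.message}`);
} else {
  // Success shape (see ChatGPT above): read the first choice, if any.
  console.log(response.choices?.[0]?.message.content);
}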

Team

Keisuke Takagi

License

This project is licensed under the terms of the MIT license.
