llm-prompt-stream
1.0.26 (latest) · npm
Weekly downloads: 563
Maintainers: 0

📜 llm-prompt-stream

A lightweight Node.js module for streaming OpenAI responses with Markdown formatting and file saving support.

🚀 Features

  • Stream OpenAI responses in real time.
  • Format structured responses with Markdown.
  • Save streamed responses to a file.
  • Use mock data for testing.

📦 Installation

Install via npm:

npm install llm-prompt-stream

🔧 Usage

Basic Example: Streaming an OpenAI Completion

import { createCompletionAndStream, readStream } from "llm-prompt-stream";

// Your OpenAI API key
const openAIKey = "your-api-key";

const messages = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Tell me a fun fact!" },
];

async function run() {
  const stream = await createCompletionAndStream(openAIKey, messages);
  const response = await readStream(stream);
  console.log("Full Response:", response);
}

run();

Saving Streamed Response to a Markdown File

import { createCompletionAndStream, readStream } from "llm-prompt-stream";

const openAIKey = "your-api-key";
const messages = [{ role: "user", content: "Give me a summary of AI." }];

async function run() {
  const stream = await createCompletionAndStream(openAIKey, messages);
  await readStream(stream, true, "output.md");
  console.log("Response saved to output.md!");
}

run();
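
Consuming the Stream Incrementally (illustrative)

Because `createCompletionAndStream` returns a `ReadableStream` (see the API reference below), you can also consume the response incrementally with the web-standard reader API instead of waiting for `readStream` to finish. The sketch below uses only standard APIs; whether chunks arrive as strings or bytes is not specified by the package, so it handles both.

import { createCompletionAndStream } from "llm-prompt-stream";

const openAIKey = "your-api-key";
const messages = [{ role: "user", content: "Explain streaming in one paragraph." }];

async function run() {
  const stream = await createCompletionAndStream(openAIKey, messages);
  const reader = stream.getReader();
  const decoder = new TextDecoder();

  // Print each chunk as soon as it arrives rather than buffering the whole response.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(typeof value === "string" ? value : decoder.decode(value));
  }
}

run();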

🛠 API Reference

`createCompletionAndStream(openAIKey: string, messages: any[])`

Creates an OpenAI streaming completion.

  • `openAIKey` - Your OpenAI API key.
  • `messages` - Array of message objects for the chat completion.

Returns: A promise that resolves to a `ReadableStream` containing the OpenAI response.
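
For orientation only, here is a minimal sketch of how a function with this signature could be built on the official `openai` SDK. It is not the package's actual implementation; the helper name, the default model, and the choice to emit plain text deltas are all assumptions made for illustration.

// Illustrative sketch — not the package's source code.
import OpenAI from "openai";

async function createCompletionAndStreamSketch(openAIKey, messages) {
  const client = new OpenAI({ apiKey: openAIKey });

  // Request a streamed chat completion (an async iterable of chunks).
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumed default model, purely illustrative
    messages,
    stream: true,
  });

  // Expose the text deltas as a web ReadableStream, matching the documented return type.
  return new ReadableStream({
    async start(controller) {
      for await (const chunk of completion) {
        const delta = chunk.choices[0]?.delta?.content ?? "";
        if (delta) controller.enqueue(delta);
      }
      controller.close();
    },
  });
}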

`readStream(stream: ReadableStream, createFile?: boolean, outputFilename?: string)`

Reads and processes a streamed response.

  • `stream` - The OpenAI response stream.
  • `createFile` (optional) - If `true`, writes response to a file.
  • `outputFilename` (optional) - Filename for the saved response (default: `response.md`).

Returns: A promise that resolves to a string containing the full response.
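
Likewise, a reader with this shape can be sketched using only Node.js built-ins: drain the stream, concatenate the chunks, and optionally write the result to disk. The helper name below is hypothetical, and the package's real implementation may differ (for example in how it formats Markdown).

// Illustrative sketch of a readStream-style consumer — not the package's source code.
import { writeFile } from "node:fs/promises";

async function readStreamSketch(stream, createFile = false, outputFilename = "response.md") {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let fullText = "";

  // Pull chunks until the stream is exhausted, handling both string and byte chunks.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    fullText += typeof value === "string" ? value : decoder.decode(value);
  }

  // Optionally persist the accumulated response as a Markdown file.
  if (createFile) {
    await writeFile(outputFilename, fullText, "utf8");
  }

  return fullText;
}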

🧪 Running Tests

To run tests locally:

npm test

📜 License

This project is licensed under the MIT License.

🤝 Contributing

Contributions are welcome! Feel free to open an issue or submit a PR.

  • Fork the repo
  • Clone it: `git clone https://github.com/yourusername/llm-prompt-stream.git`
  • Install dependencies: `npm install`
  • Make changes & commit
  • Submit a pull request!

⭐ Support

If you like this package, give it a ⭐ on GitHub!

Package last updated on 06 Mar 2025
