
node-ai-ragbot
node-ai-ragbot is a modular and framework-agnostic Node.js package for building intelligent chatbot and voicebot systems with Retrieval-Augmented Generation (RAG), powered by OpenAI and LangChain.
It allows you to quickly integrate both text-based chatbots and voice-enabled assistants into your application with minimal setup. The package provides ready-to-use request handlers and adapters for popular frameworks, so you can focus on building your application instead of reinventing backend logic.
Key features:

- Ingests website pages or local files (.pdf, .docx, .txt, .md)
- /chat endpoint for text conversations
- /voice endpoint (speech-to-text using Whisper, text-to-speech using OpenAI TTS)

Installation:

```
npm install node-ai-ragbot
```
The easiest way to get started is with Express. You can initialize the bot and then attach the provided handlers to your routes.
```javascript
const express = require("express");
const cors = require("cors");
require("dotenv").config();

const { initRagVoiceBot, useInExpress } = require("node-ai-ragbot");

const app = express();
app.use(cors()); // enable CORS for browser clients

(async () => {
  const { chatHandler, voiceHandler } = await initRagVoiceBot({
    sources: {
      files: ["files/knowledge.txt", "files/knowledge.pdf"],
      // urls: ["https://docs.myproduct.com"],
    },
    openai: {
      apiKey: process.env.OPENAI_API_KEY,
      chat: { model: "gpt-4o", temperature: 0.3 },
      whisper: { model: "whisper-1", language: "en" },
      tts: { model: "tts-1-hd", voice: "nova" },
    },
  });

  // Mounts /api/bot/chat and /api/bot/voice
  useInExpress(app, chatHandler, voiceHandler, "/api/bot");

  app.listen(3000, () =>
    console.log("Server running on http://localhost:3000")
  );
})();
```
This will expose the following endpoints:

- POST /api/bot/chat
- POST /api/bot/voice

If you are not using Express, you can still initialize handlers with initRagVoiceBot and then connect them to your chosen framework using the adapters provided in this package.
```javascript
const {
  initRagVoiceBot,
  useInFastify,
  useInKoa,
  useInNestJS,
  useInHttp,
} = require("node-ai-ragbot");

(async () => {
  const { chatHandler, voiceHandler } = await initRagVoiceBot({
    sources: { files: ["files/knowledge.txt"] },
    openai: { apiKey: process.env.OPENAI_API_KEY },
  });

  // Fastify
  const fastify = require("fastify")();
  useInFastify(fastify, chatHandler, voiceHandler, "/api/bot");

  // Koa
  const Koa = require("koa");
  const Router = require("koa-router");
  const koaApp = new Koa();
  const koaRouter = new Router();
  useInKoa(koaRouter, chatHandler, voiceHandler, "/api/bot");
  koaApp.use(koaRouter.routes());

  // NestJS
  const { chat, voice } = useInNestJS(chatHandler, voiceHandler);
  // Example in a controller:
  // @Post("chat") chat(@Req() req, @Res() res) { return chat(req, res); }
  // @Post("voice") voice(@Req() req, @Res() res) { return voice(req, res); }

  // Raw Node.js
  const http = require("http");
  const server = http.createServer();
  useInHttp(server, chatHandler, voiceHandler, "/api/bot");
  server.listen(4000, () => console.log("Raw server on http://localhost:4000"));
})();
```
By using adapters, you can integrate seamlessly with any backend framework without being locked to a specific one.
POST /chat

Request body:

```json
{
  "question": "What is in the knowledge base?"
}
```

Response:

```json
{
  "success": true,
  "answer": "Answer from the chatbot"
}
```
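As a quick sanity check, the /chat endpoint can be called from any HTTP client. The sketch below assumes the Express server from the earlier example is running on localhost:3000 with the handlers mounted under /api/bot; the helper name `askBot` is illustrative, not part of the package.

```javascript
// Illustrative client for the /chat endpoint (assumes the Express example
// above is running on localhost:3000). Uses the global fetch available in
// Node.js 18+.
async function askBot(question) {
  const res = await fetch("http://localhost:3000/api/bot/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json(); // expected shape: { success: true, answer: "..." }
}
```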
POST /voice

Accepts an uploaded audio file (e.g. audio/webm) and responds with:

```json
{
  "success": true,
  "transcription": "Your transcribed question",
  "answer": "Generated answer",
  "audio": "Base64 encoded audio response"
}
```
You can customize how the bot processes data by passing a configuration object.
```typescript
interface RagConfig {
  sources: {
    files?: string[]; // Local file paths (.pdf, .docx, .txt, .md)
    urls?: string[];  // Website URLs
  };
  rag?: {
    maxPagesPerSite?: number; // default: 30
    textSplit?: {
      chunkSize?: number;    // default: 1000
      chunkOverlap?: number; // default: 200
    };
    topK?: number; // default: 3
  };
  openai: {
    apiKey: string;
    embeddings?: { model?: string }; // default: text-embedding-3-small
    chat?: {
      model?: string;       // default: gpt-4o
      temperature?: number; // default: 0.3
      maxTokens?: number;   // default: 200
      promptTemplate?: string;
    };
    stt?: {
      model?: string;    // default: whisper-1
      language?: string; // default: en
    };
    tts?: {
      model?: string; // default: tts-1
      voice?: string; // default: alloy
    };
  };
  logger?: Console; // default: console
}
```
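The chunkSize and chunkOverlap settings control how source text is cut into overlapping windows before embedding. The toy function below is an illustrative sketch of that sliding-window idea only, not the package's internal splitter (the package delegates splitting to LangChain):

```javascript
// Toy sliding-window text splitter, illustrating what chunkSize and
// chunkOverlap mean. Assumes chunkOverlap < chunkSize.
function splitText(text, chunkSize = 1000, chunkOverlap = 200) {
  const chunks = [];
  // Each new chunk starts (chunkSize - chunkOverlap) characters after
  // the previous one, so consecutive chunks share chunkOverlap characters.
  const step = Math.max(1, chunkSize - chunkOverlap);
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}
```

Larger overlap reduces the chance that a fact is cut in half at a chunk boundary, at the cost of more embeddings to store and search.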
The package uses LangChain’s in-memory vector store to hold embeddings of indexed data extracted from files or websites.
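At query time, the question is embedded and compared against the stored chunk vectors, and the topK most similar chunks are handed to the chat model as context. The following is a toy sketch of that retrieval step using cosine similarity; the real work is done by LangChain's vector store, and this code is illustrative only:

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks whose vectors are most similar to the query vector.
// `store` is an array of { text, vector } entries.
function topK(store, queryVec, k = 3) {
  return store
    .map(({ text, vector }) => ({ text, score: cosine(vector, queryVec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```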
```
my-app/
├── files/
│   ├── knowledge.txt
│   └── knowledge.pdf
├── server.js
├── package.json
└── README.md
```
If you encounter issues or have feature requests, please open a GitHub issue.
For custom requirements, the package can be extended by wrapping handlers or building new adapters.
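As one illustrative sketch of that extension pattern, a wrapper can add cross-cutting behavior such as request logging. This assumes the handlers are plain (req, res) Node.js functions, as the Express example suggests; the `withLogging` helper is hypothetical, not part of the package:

```javascript
// Wrap any (req, res) handler so each call is timed and logged.
// `withLogging` is an illustrative helper, not a package export.
function withLogging(handler, logger = console) {
  return async (req, res) => {
    const start = Date.now();
    try {
      return await handler(req, res);
    } finally {
      logger.log(`${req.method} ${req.url} handled in ${Date.now() - start}ms`);
    }
  };
}

// Usage sketch with the handlers from initRagVoiceBot:
// const loggedChat = withLogging(chatHandler);
// useInExpress(app, loggedChat, voiceHandler, "/api/bot");
```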
Happy building with node-ai-ragbot!