
# gpt-research
> Autonomous AI research agent that conducts comprehensive research on any topic and generates detailed reports with citations.
🔍 GPT Research is an autonomous AI research agent that conducts comprehensive research on any topic, searches the web for real-time information, and generates detailed reports with proper citations.
Built with TypeScript and optimized for both local development and serverless deployment (Vercel, AWS Lambda, etc.).
## Installation

```bash
npm install gpt-research
# or
yarn add gpt-research
# or
pnpm add gpt-research
```
## Environment Setup

Create a `.env` file in the project root:

```bash
# Required
OPENAI_API_KEY=your-openai-api-key

# Optional search providers (at least one recommended)
TAVILY_API_KEY=your-tavily-api-key    # https://tavily.com (best for AI research)
SERPER_API_KEY=your-serper-api-key    # https://serper.dev (Google search, 2,500 free/month)
GOOGLE_API_KEY=your-google-api-key    # Google Custom Search
GOOGLE_CX=your-google-custom-search-engine-id

# Optional LLM providers
ANTHROPIC_API_KEY=your-anthropic-api-key
GOOGLE_AI_API_KEY=your-google-ai-api-key
GROQ_API_KEY=your-groq-api-key
```
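These variables must be present in `process.env` at runtime. A small guard like the following (a sketch — the helper name and key list are illustrative, not part of the library) fails fast with a clear message instead of a cryptic API error later:

```javascript
// Fail fast if required environment variables are missing.
// Extend `required` with whichever optional providers you actually use.
function validateEnv(required = ['OPENAI_API_KEY']) {
  const missing = required.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
}

// Example: require the OpenAI key plus one search provider.
// validateEnv(['OPENAI_API_KEY', 'TAVILY_API_KEY']);
```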
## Quick Start

```javascript
const { GPTResearch } = require('gpt-research');
// or for TypeScript/ES modules:
// import { GPTResearch } from 'gpt-research';

async function main() {
  const researcher = new GPTResearch({
    query: 'What are the latest developments in quantum computing?',
    reportType: 'research_report',
    llmProvider: 'openai',
    apiKeys: {
      openai: process.env.OPENAI_API_KEY,
      tavily: process.env.TAVILY_API_KEY
    }
  });

  // Conduct research
  const result = await researcher.conductResearch();

  console.log(result.report);
  console.log(`Sources used: ${result.sources.length}`);
  console.log(`Cost: $${result.costs.total.toFixed(4)}`);
}

main().catch(console.error);
```
## Streaming Research

```javascript
const researcher = new GPTResearch(config);

// Stream research updates in real-time
for await (const update of researcher.streamResearch()) {
  switch (update.type) {
    case 'progress':
      console.log(`[${update.progress}%] ${update.message}`);
      break;
    case 'data':
      if (update.data?.reportChunk) {
        process.stdout.write(update.data.reportChunk);
      }
      break;
    case 'complete':
      console.log('\nResearch complete!');
      break;
  }
}
```
## Configuration Options

```typescript
interface ResearchConfig {
  // Required
  query: string;                 // Research query

  // Report configuration
  reportType?: ReportType;       // Type of report to generate
  reportFormat?: ReportFormat;   // Output format (markdown, pdf, docx)
  tone?: Tone;                   // Writing tone

  // LLM configuration
  llmProvider?: string;          // LLM provider (openai, anthropic, etc.)
  smartLLMModel?: string;        // Model for complex tasks
  fastLLMModel?: string;         // Model for simple tasks
  temperature?: number;          // Generation temperature
  maxTokens?: number;            // Max tokens per generation

  // Search configuration
  defaultRetriever?: string;     // Default search provider
  maxSearchResults?: number;     // Max results per search

  // Scraping configuration
  defaultScraper?: string;       // Default scraper (cheerio, puppeteer)
  scrapingConcurrency?: number;  // Concurrent scraping operations

  // API keys
  apiKeys?: {
    openai?: string;
    tavily?: string;
    serper?: string;
    google?: string;
    anthropic?: string;
    groq?: string;
  };
}
```
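For instance, a more fully specified config object might look like this (a sketch — the field values, including the tone and retriever names, are illustrative placeholders rather than recommendations):

```javascript
// Illustrative ResearchConfig covering the common optional fields.
const config = {
  query: 'Impact of RISC-V on embedded systems',  // required
  reportType: 'research_report',
  reportFormat: 'markdown',
  llmProvider: 'openai',
  temperature: 0.4,             // lower = more deterministic writing
  defaultRetriever: 'tavily',
  maxSearchResults: 8,
  defaultScraper: 'cheerio',
  scrapingConcurrency: 3,       // throttle to stay under rate limits
  apiKeys: {
    openai: process.env.OPENAI_API_KEY,
    tavily: process.env.TAVILY_API_KEY
  }
};

// The object is passed straight to the constructor:
// const researcher = new GPTResearch(config);
```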
## Search Providers

| Provider | Best For | Free Tier | API Key Required |
|---|---|---|---|
| Tavily | AI-optimized research | 1,000/month | Yes |
| Serper | Google search results | 2,500/month | Yes |
| Google | Custom search | 100/day | Yes |
| DuckDuckGo | Privacy-focused | Unlimited | No |
```javascript
// Configure multiple providers for redundancy
const researcher = new GPTResearch({
  query: 'Your research topic',
  retrievers: ['tavily', 'serper'], // Falls back if one fails
  apiKeys: {
    tavily: process.env.TAVILY_API_KEY,
    serper: process.env.SERPER_API_KEY
  }
});
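The fallback behavior described above amounts to trying each retriever in order until one succeeds. It could be sketched like this (an illustration of the concept, not the library's actual internals):

```javascript
// Try each retriever in order; return the first successful result set.
async function searchWithFallback(retrievers, query) {
  const errors = [];
  for (const retriever of retrievers) {
    try {
      return await retriever.search(query);
    } catch (err) {
      errors.push(`${retriever.name}: ${err.message}`);
    }
  }
  throw new Error(`All retrievers failed:\n${errors.join('\n')}`);
}

// Demo with stub retrievers: the first fails, the second succeeds.
const stubs = [
  { name: 'tavily', search: async () => { throw new Error('rate limited'); } },
  { name: 'serper', search: async (q) => [{ title: `Result for ${q}` }] }
];

searchWithFallback(stubs, 'test query').then((results) => {
  console.log(results[0].title); // "Result for test query"
});
```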
## MCP (Model Context Protocol) Support

GPT Research now supports MCP for connecting to external tools and services. MCP (Model Context Protocol) is a standardized protocol for connecting AI systems to external tools and data sources, enabling integration with a range of services through a unified interface.
### HTTP Connection

```javascript
const researcher = new GPTResearch({
  query: "Latest AI developments",
  mcpConfigs: [
    {
      name: "research-tools",
      connectionType: "http",
      connectionUrl: "https://mcp.example.com",
      connectionToken: process.env.MCP_TOKEN
    }
  ],
  useMCP: true
});
```
### Stdio Connection (Local Tools)

```javascript
const researcher = new GPTResearch({
  query: "Analyze this codebase",
  mcpConfigs: [
    {
      name: "filesystem",
      connectionType: "stdio",
      command: "npx",
      args: ["@modelcontextprotocol/filesystem-server"],
      env: { READ_ONLY: "false" }
    },
    {
      name: "git",
      connectionType: "stdio",
      command: "git-mcp",
      args: ["--repo", "."]
    }
  ]
});
```
### Mixed Connection Types

```javascript
const researcher = new GPTResearch({
  query: "Research topic",
  mcpConfigs: [
    // Local tools via stdio
    { name: "local-fs", connectionType: "stdio", command: "npx", args: ["fs-mcp"] },
    // Remote API via HTTP
    { name: "api", connectionType: "http", connectionUrl: "https://api.example.com/mcp" },
    // Real-time via WebSocket
    { name: "stream", connectionType: "websocket", connectionUrl: "wss://realtime.example.com" }
  ]
});
```
### Deployment Compatibility

| MCP Type | Local/Node.js | Vercel | Docker | VPS/Cloud |
|---|---|---|---|---|
| HTTP Servers | ✅ Full | ✅ Full | ✅ Full | ✅ Full |
| WebSocket | ✅ Full | ✅ Full | ✅ Full | ✅ Full |
| Stdio | ✅ Full | ❌ Not Supported | ✅ Full | ✅ Full |
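When the same codebase deploys both locally and to a serverless platform, unsupported stdio configs can be skipped at runtime. A possible guard (a sketch — `filterMcpConfigs` is not a library function; Vercel sets the `VERCEL` environment variable in its runtime and AWS Lambda sets `AWS_LAMBDA_FUNCTION_NAME`):

```javascript
// Drop stdio MCP configs when running on a serverless platform,
// where spawning long-lived child processes is not supported.
function filterMcpConfigs(configs, env = process.env) {
  const serverless = Boolean(env.VERCEL || env.AWS_LAMBDA_FUNCTION_NAME);
  if (!serverless) return configs;
  return configs.filter((c) => {
    if (c.connectionType === 'stdio') {
      console.warn(`Skipping stdio MCP server "${c.name}": not supported here`);
      return false;
    }
    return true;
  });
}

// On Vercel, only the HTTP config survives:
const mcpConfigs = [
  { name: 'local-fs', connectionType: 'stdio', command: 'npx' },
  { name: 'api', connectionType: 'http', connectionUrl: 'https://api.example.com/mcp' }
];
console.log(filterMcpConfigs(mcpConfigs, { VERCEL: '1' }).map((c) => c.name)); // [ 'api' ]
```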
These MCP servers can be run locally via stdio:

```bash
# File system access
npx @modelcontextprotocol/filesystem-server

# Git repository tools
npx @modelcontextprotocol/git-server

# Database query execution
npm install -g mcp-database
mcp-database

# Custom Python MCP server
python -m mcp.server

# Shell command execution
cargo install mcp-shell
mcp-shell
```
See also:

- `examples/demo-mcp.js` for an HTTP/WebSocket demo
- `examples/demo-mcp-stdio.js` for a stdio demo
- `MCP.md` for implementation details

## Vercel Deployment

Create API routes in your Next.js/Vercel project:
```javascript
// api/research/route.js
import { GPTResearch } from 'gpt-research';

export async function POST(request) {
  const { query, reportType } = await request.json();

  const researcher = new GPTResearch({
    query,
    reportType,
    apiKeys: {
      openai: process.env.OPENAI_API_KEY,
      tavily: process.env.TAVILY_API_KEY
    }
  });

  const result = await researcher.conductResearch();
  return Response.json(result);
}
```
### Streaming Endpoint

```javascript
// api/research/stream/route.js
import { GPTResearch } from 'gpt-research';

export async function POST(request) {
  const { query } = await request.json();

  const stream = new ReadableStream({
    async start(controller) {
      const researcher = new GPTResearch({ query });
      for await (const update of researcher.streamResearch()) {
        controller.enqueue(
          `data: ${JSON.stringify(update)}\n\n`
        );
      }
      controller.close();
    }
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Connection': 'keep-alive'
    }
  });
}
```
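On the client side, the `data:`-prefixed chunks produced by this route have to be split on blank lines and parsed back into update objects. A minimal parser might look like this (a sketch — `parseSSEChunk` is an illustrative helper, not part of the package):

```javascript
// Parse raw server-sent-events text (as emitted by the streaming route above)
// into an array of update objects. Events are separated by a blank line.
function parseSSEChunk(text) {
  return text
    .split('\n\n')
    .filter((block) => block.startsWith('data: '))
    .map((block) => JSON.parse(block.slice('data: '.length)));
}

const sample =
  'data: {"type":"progress","progress":40,"message":"Searching"}\n\n' +
  'data: {"type":"complete"}\n\n';
console.log(parseSSEChunk(sample).map((u) => u.type)); // [ 'progress', 'complete' ]
```

In a browser you would feed this the decoded chunks from `response.body.getReader()`, buffering any trailing partial event until the next chunk arrives.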
Add to your Vercel project settings:

```bash
OPENAI_API_KEY=your-key
TAVILY_API_KEY=your-key
SERPER_API_KEY=your-key
```
## Examples

```bash
# Basic example
npm run example

# OpenAI-only example (no web search)
npm run example:simple

# Full research with Tavily web search
npm run example:tavily

# Research using Serper (Google Search API)
npm run example:serper
```

Check the `examples/` directory for more detailed usage examples.
## Contributing

Contributions are welcome! Please feel free to submit a pull request.

## License

MIT License - see LICENSE file for details.

## Troubleshooting

- **Build errors**: Make sure you have Node.js 18+ and run `npm install`.
- **API key errors**: Verify your API keys are correct in `.env`.
- **Rate limiting**: Reduce `scrapingConcurrency` and `maxSearchResults`.
- **Memory issues**: For large research runs, increase the Node.js heap:

```bash
node --max-old-space-size=4096 your-script.js
```
If you find GPT Research helpful, please consider:
Built with ❤️ by Pablo Schaffner
Autonomous research for everyone