Naver Finance Crawl MCP


MCP (Model Context Protocol) server for crawling Korean stock market data from Naver Finance.

A Node.js-based web crawler built with TypeScript, Axios, and Cheerio, with MCP server support for AI assistants.

Features

  • crawl_top_stocks: Fetch the most searched stocks from Naver Finance with real-time data
  • crawl_stock_detail: Get comprehensive stock information by 6-digit stock code
  • MCP Server Support: Full integration with AI assistants via Model Context Protocol
  • HTTP & STDIO Transports: Flexible transport options for different use cases
  • Korean Encoding Support: Proper handling of Korean characters (EUC-KR)
  • TypeScript Support: Fully typed with strict mode enabled
  • Modular Architecture: Extensible base crawler class for custom implementations
  • HTML Parsing: Cheerio-based HTML parsing with helper utilities
  • HTTP Requests: Axios-based HTTP client with retry logic
  • Testing: Vitest with unit, integration, and E2E tests
  • Code Quality: ESLint and Prettier for code consistency

Project Structure

src/
β”œβ”€β”€ crawlers/           # Crawler implementations
β”‚   β”œβ”€β”€ baseCrawler.ts  # Base class for all crawlers
β”‚   β”œβ”€β”€ exampleCrawler.ts # Example crawler implementation
β”‚   └── index.ts
β”œβ”€β”€ utils/              # Utility functions
β”‚   β”œβ”€β”€ request.ts      # HTTP client with retry logic
β”‚   β”œβ”€β”€ parser.ts       # HTML parsing helper
β”‚   └── index.ts
β”œβ”€β”€ types/              # TypeScript type definitions
β”‚   └── index.ts
└── index.ts            # Main entry point

tests/
β”œβ”€β”€ unit/               # Unit tests
β”œβ”€β”€ integration/        # Integration tests
└── e2e/                # E2E tests

Installation

NPM

npm install -g naver-finance-crawl-mcp

Smithery

To install Naver Finance Crawl MCP Server for any client automatically via Smithery:

npx -y @smithery/cli@latest install naver-finance-crawl-mcp --client <CLIENT_NAME>

Available clients: cursor, claude, vscode, windsurf, cline, zed, etc.

Example for Cursor:

npx -y @smithery/cli@latest install naver-finance-crawl-mcp --client cursor

This will automatically configure the MCP server in your chosen client.

Development Setup

pnpm install

MCP Client Integration

Naver Finance Crawl MCP can be integrated with various AI coding assistants and IDEs that support the Model Context Protocol (MCP).

Requirements

  • Node.js >= v18.0.0
  • An MCP-compatible client (Cursor, Claude Code, VS Code, Windsurf, etc.)
Install in Cursor

Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server

Add the following configuration to your ~/.cursor/mcp.json file:

{
  "mcpServers": {
    "naver-finance": {
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}

With HTTP transport:

{
  "mcpServers": {
    "naver-finance": {
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp", "--transport", "http", "--port", "5000"]
    }
  }
}
Install in Claude Code

Run this command:

claude mcp add naver-finance -- npx -y naver-finance-crawl-mcp

Or with HTTP transport:

claude mcp add naver-finance -- npx -y naver-finance-crawl-mcp --transport http --port 5000
Install in VS Code

Add this to your VS Code MCP config file. See VS Code MCP docs for more info.

"mcp": {
  "servers": {
    "naver-finance": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}
Install in Windsurf

Add this to your Windsurf MCP config file:

{
  "mcpServers": {
    "naver-finance": {
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}
Install in Cline
  • Open Cline
  • Click the hamburger menu icon (☰) to enter the MCP Servers section
  • Choose Remote Servers tab
  • Click the Edit Configuration button
  • Add naver-finance to mcpServers:
{
  "mcpServers": {
    "naver-finance": {
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}
Install in Claude Desktop

Open Claude Desktop developer settings and edit your claude_desktop_config.json file:

{
  "mcpServers": {
    "naver-finance": {
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}
Install in Zed

Add this to your Zed settings.json:

{
  "context_servers": {
    "naver-finance": {
      "source": "custom",
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}
Install in Roo Code

Add this to your Roo Code MCP configuration file:

{
  "mcpServers": {
    "naver-finance": {
      "command": "npx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}
Using with Bun
{
  "mcpServers": {
    "naver-finance": {
      "command": "bunx",
      "args": ["-y", "naver-finance-crawl-mcp"]
    }
  }
}

Available Scripts

Development & Build

# Type check
pnpm typecheck

# Lint code
pnpm lint
pnpm lint:fix

# Run tests
pnpm test
pnpm test:ui        # UI mode
pnpm test:coverage  # With coverage report

# Build project
pnpm build

# Watch mode
pnpm dev

# Start MCP server (STDIO transport)
pnpm start

# Start MCP server (HTTP transport)
pnpm start --transport http --port 5000

# Start HTTP REST API server
pnpm start:http

Usage

Running the MCP Server

STDIO Transport (default):

naver-finance-crawl-mcp

HTTP Transport:

naver-finance-crawl-mcp --transport http --port 5000

The server provides two MCP tools that can be used by LLMs:

  • crawl_top_stocks: Fetches the most searched stocks from Naver Finance
  • crawl_stock_detail: Fetches detailed information for a specific stock by its 6-digit code
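
Beyond wiring the server into an editor, the same tools can be called programmatically. The following is a minimal sketch using the MCP TypeScript SDK (@modelcontextprotocol/sdk, which is not a dependency of this package and must be installed separately); the client name and version are placeholders.

// Sketch: calling this server's tools from a standalone MCP client.
// Assumes @modelcontextprotocol/sdk is installed; client name/version are placeholders.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn the server over STDIO, exactly as the editor configs above do.
const transport = new StdioClientTransport({
  command: 'npx',
  args: ['-y', 'naver-finance-crawl-mcp'],
});

const client = new Client({ name: 'example-client', version: '0.1.0' });
await client.connect(transport);

// List the exposed tools (crawl_top_stocks, crawl_stock_detail).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Call crawl_top_stocks (it takes no parameters) and print the raw result.
const result = await client.callTool({ name: 'crawl_top_stocks', arguments: {} });
console.log(JSON.stringify(result, null, 2));

await client.close();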

Available Tools

Naver Finance Crawl MCP provides the following tools that can be used by LLMs:

crawl_top_stocks

Crawl top searched stocks from Naver Finance. Returns a list of the most searched stocks with their codes, names, current prices, and change rates.

Parameters: None

Example Response:

{
  "success": true,
  "count": 10,
  "data": [
    {
      "code": "005930",
      "name": "μ‚Όμ„±μ „μž",
      "currentPrice": "71,000",
      "changeRate": "+2.50%"
    }
  ],
  "timestamp": "2025-01-29T12:00:00.000Z"
}
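
When consuming this result in TypeScript, the response shape can be described with an interface like the one below. This is inferred from the example above and is not a type exported by the package.

// Illustrative types derived from the example response above;
// the package does not necessarily export these interfaces.
interface TopStockEntry {
  code: string;         // 6-digit stock code, e.g. "005930"
  name: string;         // company name as listed on Naver Finance
  currentPrice: string; // formatted price string, e.g. "71,000"
  changeRate: string;   // signed percentage string, e.g. "+2.50%"
}

interface CrawlTopStocksResponse {
  success: boolean;
  count: number;
  data: TopStockEntry[];
  timestamp: string; // ISO 8601 timestamp
}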

crawl_stock_detail

Crawl detailed information for a specific stock by its 6-digit code. Returns comprehensive data including company info, stock prices, trading volume, and financial metrics.

Parameters:

  • stockCode (string, required): 6-digit stock code (e.g., "005930" for Samsung Electronics)

Example Request:

{
  "stockCode": "005930"
}

Example Response:

{
  "success": true,
  "stockCode": "005930",
  "data": {
    "companyName": "μ‚Όμ„±μ „μž",
    "currentPrice": "71,000",
    "changeRate": "+2.50%",
    "tradingVolume": "1,234,567",
    "marketCap": "423쑰원"
  },
  "metadata": {
    "url": "https://finance.naver.com/item/main.naver?code=005930",
    "statusCode": 200,
    "timestamp": "2025-01-29T12:00:00.000Z"
  }
}
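
As with crawl_top_stocks, the response can be modeled with an illustrative interface based on the fields in the example; real payloads may carry additional financial metrics.

// Illustrative type derived from the example response above;
// field names come from the sample payload, not from the package's typings.
interface CrawlStockDetailResponse {
  success: boolean;
  stockCode: string;
  data: {
    companyName: string;
    currentPrice: string;
    changeRate: string;
    tradingVolume: string;
    marketCap: string;
    [metric: string]: string; // additional metrics may be present
  };
  metadata: {
    url: string;
    statusCode: number;
    timestamp: string; // ISO 8601 timestamp
  };
}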

Usage Examples

Example 1: Get top searched stocks

In Cursor/Claude Code:

Get the top searched stocks from Naver Finance

The tool will return:

  • List of most searched stocks
  • Stock codes, names, current prices
  • Price change rates
  • Timestamp of the data

Example 2: Get detailed stock information

In Cursor/Claude Code:

Get detailed information for Samsung Electronics (stock code: 005930)

The tool will return:

  • Company name and stock code
  • Current price and change rate
  • Trading volume
  • Market capitalization
  • Additional financial metrics

In Cursor/Claude Code:

First, show me the top searched stocks.
Then, fetch detailed information for the top 3 stocks.
Analyze which stocks show the most significant price changes.

Using as a Library

You can also use the crawlers directly in your Node.js applications:

Basic Crawler Example

import { ExampleCrawler } from './src/crawlers/exampleCrawler.js';

const crawler = new ExampleCrawler({
  timeout: 10000,
  retries: 3,
});

const result = await crawler.crawl('https://example.com');
console.log(result);

Create Custom Crawler

import { BaseCrawler } from './src/crawlers/baseCrawler.js';
import { CrawlResult } from './src/types/index.js';

class MyCrawler extends BaseCrawler {
  async crawl(url: string): Promise<CrawlResult> {
    const html = await this.fetchHtml(url);
    const parser = this.parseHtml(html);

    const data = parser.parseStructure('div.item', {
      title: 'h2',
      price: 'span.price',
    });

    return {
      url,
      data,
      timestamp: new Date(),
      statusCode: 200,
    };
  }
}
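
A crawler defined this way is used like the built-in ones. The snippet below is a sketch: the options mirror the timeout/retries options from the basic example above, and the URL is a placeholder.

// Sketch: using the custom crawler defined above.
// The options mirror the basic example; the URL is a placeholder.
const myCrawler = new MyCrawler({ timeout: 10000, retries: 3 });

const result = await myCrawler.crawl('https://example.com/products');
console.log(result.data);      // parsed items, e.g. [{ title, price }, ...]
console.log(result.timestamp); // Date the crawl completed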

HTML Parsing

import { HtmlParser } from './src/utils/parser.js';

const html = '<h1>Hello</h1><p>World</p>';
const parser = new HtmlParser(html);

// Get text from elements
const title = parser.getFirstText('h1');
console.log(title); // "Hello"

// Get attributes
const links = parser.getAttributes('a[href]', 'href');

// Parse structured data
const items = parser.parseStructure('div.item', {
  name: 'h2',
  description: 'p',
});

Configuration Files

  • tsconfig.json: TypeScript compiler options
  • vitest.config.ts: Vitest test runner configuration
  • .eslintrc.json: ESLint rules configuration
  • .prettierrc.json: Prettier formatting rules

Dependencies

Production

  • axios: HTTP client library
  • cheerio: jQuery-like HTML parsing

Development

  • typescript: TypeScript compiler
  • vitest: Unit testing framework
  • @typescript-eslint/*: TypeScript linting
  • eslint: Code linting
  • prettier: Code formatting
  • tsx: TypeScript executor

Testing

The project includes comprehensive tests covering:

  • Unit tests for utilities and base classes
  • Integration tests for crawler functionality
  • E2E tests for complete workflows

Run tests with:

pnpm test          # Run all tests
pnpm test:ui       # Interactive UI
pnpm test:coverage # With coverage report
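
As a sketch of what a unit test looks like in this setup, the Vitest example below exercises the HtmlParser helper from the HTML Parsing section above. The file location, test names, and the assumption that parseStructure returns one plain object per matched element are illustrative, not taken from the repository.

// Illustrative Vitest unit test for the HtmlParser utility.
// The import path assumes the layout shown in "Project Structure";
// the expected parseStructure output shape is an assumption.
import { describe, it, expect } from 'vitest';
import { HtmlParser } from '../../src/utils/parser.js';

describe('HtmlParser', () => {
  it('extracts text from the first matching element', () => {
    const parser = new HtmlParser('<h1>Hello</h1><p>World</p>');
    expect(parser.getFirstText('h1')).toBe('Hello');
  });

  it('parses structured data from repeated elements', () => {
    const html = '<div class="item"><h2>A</h2><p>first</p></div>';
    const items = new HtmlParser(html).parseStructure('div.item', {
      name: 'h2',
      description: 'p',
    });
    expect(items).toEqual([{ name: 'A', description: 'first' }]);
  });
});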

Docker Usage

Build Docker Image

docker build -t naver-finance-crawl-mcp .

Run with Docker

With STDIO transport:

docker run -i --rm naver-finance-crawl-mcp

With HTTP transport:

docker run -d -p 5000:5000 \
  --name naver-finance \
  naver-finance-crawl-mcp \
  node dist/mcp-server.js --transport http --port 5000

Docker Compose Example

Create a docker-compose.yml:

version: '3.8'

services:
  naver-finance-crawl-mcp:
    build: .
    ports:
      - "5000:5000"
    environment:
      - PORT=5000
      - NODE_ENV=production
    command: ["node", "dist/mcp-server.js", "--transport", "http", "--port", "5000"]
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "node", "-e", "require('http').get('http://localhost:5000/mcp', (r) => {process.exit(r.statusCode === 200 ? 0 : 1)})"]
      interval: 30s
      timeout: 3s
      retries: 3
      start_period: 5s

Run with Docker Compose:

docker-compose up -d

Use Docker Image in MCP Clients

Configure your MCP client to use the Docker container:

{
  "mcpServers": {
    "naver-finance": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "naver-finance-crawl-mcp"
      ]
    }
  }
}

Architecture

The project follows a modular architecture:

  • crawlers/: Crawler implementations
    • baseCrawler.ts: Base class for all crawlers
    • naverFinanceCrawler.ts: Naver Finance stock detail crawler
    • TopStocksCrawler.ts: Top searched stocks crawler
    • exampleCrawler.ts: Example crawler implementation
  • tools/: MCP tool implementations
    • crawl-top-stocks.ts: MCP tool for top stocks
    • crawl-stock-detail.ts: MCP tool for stock details
  • utils/: Utility functions
    • request.ts: HTTP client with retry logic
    • parser.ts: HTML parsing helpers
  • types/: TypeScript type definitions
  • mcp-server.ts: MCP server entry point (STDIO/HTTP)
  • http-server.ts: REST API server entry point

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Author

greatsumini

License

MIT
