[English](README.md) | [中文](README.zh-CN.md)

o2lma

o2lma is a lightweight proxy server that presents a DeepSeek- or OpenAI-compatible API as an Ollama API. This enables tools that talk to Ollama, such as VSCode Copilot Chat, to use third-party APIs.

Features

  • API Compatibility: Serves Ollama-compatible endpoints while forwarding requests to a DeepSeek/OpenAI-style upstream API.
  • Flexible Configuration: Supports custom base URLs and API keys via command-line arguments or environment variables.
  • Lightweight: Built with Hono for minimal overhead.
  • Easy Setup: Simple installation and quick start with npm.

Installation

  • Clone the repository:

    git clone https://github.com/wrtx-dev/o2lma.git
    cd o2lma
    
  • Install dependencies:

    npm install
    
  • Build the project:

    npm run build
    
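If you want the npx o2lma command used below to resolve to this local build rather than the registry, you can optionally link the package; this assumes package.json declares a bin entry, which the CLI usage below implies:

    npm link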

Configuration

Configure the server using either command-line arguments or environment variables:

Command-line options:

    npx o2lma --url [API_BASE_URL] --apikey [API_KEY] --host [HOST] --port [PORT] --cap [CAPABILITIES]

Options:

  • --url: Base API URL (default: https://api.deepseek.com)
  • --apikey: API key for authentication
  • --host: Server host (default: localhost)
  • --port: Server port (default: 11434)
  • --cap: Additional capabilities (options: tools, thinking)
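
For example, to point the proxy at DeepSeek, listen on all interfaces, and enable tool support (the API key below is a placeholder):

    npx o2lma --url https://api.deepseek.com --apikey your-api-key --host 0.0.0.0 --port 11434 --cap tools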

Environment variables:

    export BASE_URL="https://api.deepseek.com"
    export API_KEY="your-api-key"
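
Assuming the server reads these variables at startup (as the names suggest), you can also set them for a single run:

    BASE_URL="https://api.deepseek.com" API_KEY="your-api-key" npm start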

Usage

  • Start the server:

    npm start
    

    Or for development:

    npm run dev
    
  • The server runs on http://localhost:11434 by default (configurable via --host and --port).

  • Configure your client (e.g., VSCode Copilot Chat) to use this local endpoint.
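
To confirm the proxy is reachable, you can query the version endpoint documented in the next section (shown here purely as a smoke test; the exact response body depends on the server):

    curl http://localhost:11434/api/version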

API Endpoints

The server provides the following endpoints, compatible with the Ollama API:

  • GET /api/version - Returns server version
  • POST /api/show - Returns model capabilities
  • GET /api/tags - Lists available models
  • POST /v1/chat/completions - Proxies chat completion requests to the configured upstream API
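
As an illustration, a minimal chat request through the proxy might look like the following; the model name deepseek-chat is an assumption about the configured upstream, and the accepted fields ultimately depend on that backend:

    curl http://localhost:11434/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "deepseek-chat", "messages": [{"role": "user", "content": "Hello"}]}'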

Development

Building

    npm run build

Running in development mode

    npm run dev

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

License

MIT
