llm-interface - npm Package Compare versions

Comparing version 0.0.5 to 0.0.7

docs/APIKEYS.md


config.js

@@ -15,2 +15,3 @@ /**

   rekaApiKey: process.env.REKA_API_KEY,
+  gooseApiKey: process.env.GOOSE_API_KEY,
 };
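
The added key follows the same environment-variable pattern as the existing ones. A minimal sketch of that pattern (the two field names come from the diff; everything else is illustrative):

```javascript
// Sketch of the env-driven key config shown above: each provider's API key
// is read once from the environment (the package's test setup populates
// process.env via dotenv).
const config = {
  rekaApiKey: process.env.REKA_API_KEY,
  gooseApiKey: process.env.GOOSE_API_KEY, // new in 0.0.7
};

module.exports = config;
```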


docs/API.md

@@ -123,3 +123,3 @@ # llm-interface

-### LlamaCPP
+### LLaMA.cpp

@@ -126,0 +126,0 @@ #### `sendMessage(message, options)`

@@ -5,6 +5,6 @@ # llm-interface

-First, require the handlers from the `llm-interface` package:
+First, require the LLMInterface from the `llm-interface` package:
 ```javascript
-const handlers = require("llm-interface");
+const LLMInterface = require("llm-interface");
 ```

@@ -19,3 +19,3 @@

 ```javascript
-const openai = new handlers.openai(process.env.OPENAI_API_KEY);
+const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);

@@ -47,3 +47,3 @@ const message = {

 ```javascript
-const anthropic = new handlers.anthropic(process.env.ANTHROPIC_API_KEY);
+const anthropic = new LLMInterface.anthropic(process.env.ANTHROPIC_API_KEY);

@@ -80,3 +80,3 @@ const message = {

 ```javascript
-const gemini = new handlers.gemini(process.env.GEMINI_API_KEY);
+const gemini = new LLMInterface.gemini(process.env.GEMINI_API_KEY);

@@ -108,3 +108,3 @@ const message = {

 ```javascript
-const groq = new handlers.groq(process.env.GROQ_API_KEY);
+const groq = new LLMInterface.groq(process.env.GROQ_API_KEY);

@@ -136,3 +136,3 @@ const message = {

 ```javascript
-const reka = new handlers.reka(process.env.REKA_API_KEY);
+const reka = new LLMInterface.reka(process.env.REKA_API_KEY);

@@ -158,5 +158,5 @@ const message = {

-### LlamaCPP Interface
-The LlamaCPP interface allows you to send messages to the LlamaCPP API.
+### LLaMA.cpp Interface
+The LLaMA.cpp interface allows you to send messages to the LLaMA.cpp API; this is exposed by the [LLaMA.cpp HTTP Server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server).

@@ -166,3 +166,3 @@ #### Example

 ```javascript
-const llamacpp = new handlers.llamacpp(process.env.LLAMACPP_URL);
+const llamacpp = new LLMInterface.llamacpp(process.env.LLAMACPP_URL);

@@ -169,0 +169,0 @@ const message = {
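
The reworded LLaMA.cpp section points at the llama.cpp example HTTP server. A hypothetical sketch of talking to that server's `/completion` endpoint directly (this is not the package's actual implementation; the package uses axios for this request, and built-in `fetch` is used here only to keep the sketch dependency-free):

```javascript
// Hypothetical sketch: POST a prompt to a local LLaMA.cpp HTTP server's
// /completion endpoint and return the generated text. Assumes a server
// running at a URL such as http://localhost:8080/completion.
async function completeWithLlamaCpp(url, prompt, nPredict = 64) {
  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, n_predict: nPredict }),
  });
  if (!response.ok) {
    throw new Error(`LLaMA.cpp server returned HTTP ${response.status}`);
  }
  const data = await response.json();
  return data.content; // the server replies with { content: "...", ... }
}
```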

package.json

 {
   "name": "llm-interface",
-  "version": "0.0.5",
+  "version": "0.0.7",
   "main": "src/index.js",

@@ -36,2 +36,3 @@ "description": "A simple, unified interface for integrating and interacting with multiple Large Language Model (LLM) APIs, including OpenAI, Anthropic, Google Gemini, Groq, and LlamaCPP.",

   "axios": "^1.7.2",
+  "cohere": "^1.1.1",
   "dotenv": "^16.4.5",

@@ -38,0 +39,0 @@ "groq-sdk": "^0.5.0",

README.md

@@ -5,12 +5,16 @@ # llm-interface

-![Version 0.0.4](https://img.shields.io/badge/Version-0.0.4-blue) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![Built with Node.js](https://img.shields.io/badge/Built%20with-Node.js-green)](https://nodejs.org/)
+![Version 0.0.6](https://img.shields.io/badge/Version-0.0.6-blue) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![Built with Node.js](https://img.shields.io/badge/Built%20with-Node.js-green)](https://nodejs.org/)
 ## Introduction
-The LLM Interface project is a versatile and comprehensive wrapper designed to interact with multiple Large Language Model (LLM) APIs. It simplifies integrating various LLM providers, including OpenAI, Anthropic, Google Gemini, Groq, Reka AI, and LlamaCPP, into your applications. This project aims to provide a simplified and unified interface for sending messages and receiving responses from different LLM services, making it easier for developers to work with multiple LLMs without worrying about the specific intricacies of each API.
+The LLM Interface project is a versatile and comprehensive wrapper designed to interact with multiple Large Language Model (LLM) APIs. It simplifies integrating various LLM providers, including **OpenAI, Anthropic, Google Gemini, Goose AI, Groq, Reka AI, and LLaMA.cpp**, into your applications. This project aims to provide a simplified and unified interface for sending messages and receiving responses from different LLM services, making it easier for developers to work with multiple LLMs without worrying about the specific intricacies of each API.
+## New in 0.0.7
+- **Goose AI**: Added support for Goose AI
 ## Features
 - **Unified Interface**: A single, consistent interface to interact with multiple LLM APIs.
-- **Dynamic Module Loading**: Automatically loads and manages different LLM handlers.
+- **Dynamic Module Loading**: Automatically loads and manages different LLM LLMInterface.
 - **Error Handling**: Robust error handling mechanisms to ensure reliable API interactions.

@@ -24,4 +28,3 @@ - **Extensible**: Easily extendable to support additional LLM providers as needed.

-- `dotenv`: For managing environment variables.
-- `axios`: For making HTTP requests (used in LlamaCPP).
+- `axios`: For making HTTP requests (used for Goose AI, LLaMA.cpp and Reka).
 - `@anthropic-ai/sdk`: SDK for interacting with the Anthropic API.

@@ -31,2 +34,3 @@ - `@google/generative-ai`: SDK for interacting with the Google Gemini API.

 - `openai`: SDK for interacting with the OpenAI API.
+- `dotenv`: For managing environment variables. Used by test cases.

@@ -48,3 +52,3 @@ ## Installation

 ```javascript
-const handlers = require("llm-interface");
+const LLMInterface = require("llm-interface");
 ```

@@ -55,3 +59,3 @@

 ```javascript
-import handlers from "llm-interface";
+import LLMInterface from "llm-interface";
 ```

@@ -62,3 +66,3 @@

 ```javascript
-const openai = new handlers.openai(process.env.OPENAI_API_KEY);
+const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);
 const message = {

@@ -65,0 +69,0 @@ model: "gpt-3.5-turbo",
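
Putting the renamed pieces together, post-rename usage might look like the following sketch. Only `LLMInterface.openai`, the `model` field, and the `sendMessage(message, options)` signature appear in the diffs; the `messages` array shape and the options object are assumptions:

```javascript
// Illustrative usage of the renamed export. The lazy require inside the
// function keeps the package from loading until the function is called.
async function askOpenAI(prompt) {
  const { LLMInterface } = require("llm-interface"); // 0.0.7 exports both names
  const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);
  const message = {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }], // assumed field shape
  };
  return openai.sendMessage(message, { max_tokens: 100 }); // assumed options
}
```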

src/index.js

 /**
  * @file index.js
- * @description Entry point for the LLM interface module, dynamically loading handlers for different LLM providers.
+ * @description Entry point for the LLM interface module, dynamically loading LLMInterface for different LLM providers.
  */

@@ -13,7 +13,8 @@

   groq: "./groq",
+  goose: "./goose",
 };
-const handlers = {};
+const LLMInterface = {};
 Object.keys(modules).forEach((key) => {
-  Object.defineProperty(handlers, key, {
+  Object.defineProperty(LLMInterface, key, {
     get: function () {

@@ -30,2 +31,3 @@ if (!this[`_${key}`]) {

-module.exports = handlers;
+const handlers = LLMInterface;
+module.exports = { LLMInterface, handlers };
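
The index.js hunks above center on a lazy-loading export. A self-contained sketch of that pattern (module paths are placeholders, and the caching step follows the hunk's check on a hidden underscore-prefixed property):

```javascript
// Self-contained sketch of the lazy-loading pattern from src/index.js:
// each provider module is require()'d only on first property access,
// then cached on a hidden _key property so the load happens at most once.
const modules = {
  openai: "./openai", // placeholder paths
  goose: "./goose",   // new in 0.0.7
};

const LLMInterface = {};
Object.keys(modules).forEach((key) => {
  Object.defineProperty(LLMInterface, key, {
    enumerable: true,
    get: function () {
      if (!this[`_${key}`]) {
        // First access: load and cache the handler module.
        this[`_${key}`] = require(modules[key]);
      }
      return this[`_${key}`];
    },
  });
});

// 0.0.7 keeps the old name as an alias alongside the new one.
const handlers = LLMInterface;
module.exports = { LLMInterface, handlers };
```

Note that listing the keys (`Object.keys(LLMInterface)`) does not trigger the getters, so no provider module is loaded until it is actually used.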

