@lastmileai/lastmileai

Node.js library for LastMile AI API

Version: 0.0.2 (latest)
Maintainers: 3

LastMile AI Node.js Library

This library provides access to the LastMile AI API from Node.js. It mirrors the API endpoints documented at https://lastmileai.dev/docs/api.

API Token

This library requires a LastMile AI API Token, which can be obtained from https://lastmileai.dev/settings?page=tokens.

Important note: this library should only be used from a server-side context, where the API key can be securely accessed. Using this library from client-side browser code will expose your private API key!
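
For local development, a common pattern is to keep the token in a .env file that is excluded from version control and load it into process.env before the client is constructed. Below is a minimal sketch, assuming the dotenv package and an environment variable named LASTMILEAI_API_KEY (both are conventions of this sketch, not requirements of the library):

// .env (keep this file out of version control)
// LASTMILEAI_API_KEY=your-token-here

// Load the variable into process.env at startup (assumes `npm install dotenv`).
import "dotenv/config";

if (!process.env.LASTMILEAI_API_KEY) {
  throw new Error("LASTMILEAI_API_KEY is not set");
}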

Installation

npm install lastmileai

Usage

Initialize Library with API Key

This library needs to be configured with your API Token (aka API key) obtained above. You can store the API key in an environment variable or other secure storage that your server-side code can access. For example, to initialize the library with the API key loaded from an environment variable:

import { LastMile } from "lastmileai";

const lastmile = new LastMile({apiKey: process.env.LASTMILEAI_API_KEY ?? ""});
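
The example above falls back to an empty string if the variable is missing, which only surfaces the problem when the first request fails. If you prefer to fail fast at startup, a small variant (same constructor, same environment variable name) is:

const apiKey = process.env.LASTMILEAI_API_KEY;
if (!apiKey) {
  throw new Error("LASTMILEAI_API_KEY is not set");
}
const lastmile = new LastMile({ apiKey });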

Completions -- OpenAI Models

OpenAI completions are supported out of the box for ChatGPT and GPT-3 models:

// Text completion with a GPT-3 model
const completion = await lastmile.createOpenAICompletion({
  completionParams: {
    model: "text-davinci-003",
    prompt: "Your prompt here",
  },
});
const responseText = completion.choices[0]?.text;

// Chat completion with a ChatGPT model
const chatCompletion = await lastmile.createOpenAIChatCompletion({
  completionParams: {
    model: "gpt-3.5-turbo",
    messages: [
      { role: "user", content: "Your prompt here" },
    ],
  },
});
const chatResponseText = chatCompletion.choices[0]?.message?.content;
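
Because createOpenAIChatCompletion takes an OpenAI-style messages array, you can send a whole conversation (multiple user and assistant turns) rather than a single prompt. The sketch below reuses only the parameters shown above; the system and assistant roles follow the OpenAI chat format and are an assumption here, not something this README documents:

const conversation = await lastmile.createOpenAIChatCompletion({
  completionParams: {
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Suggest a name for a note-taking app." },
      { role: "assistant", content: "How about 'Margin'?" },
      { role: "user", content: "Give me two more options." },
    ],
  },
});
const reply = conversation.choices[0]?.message?.content;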

Completions -- Custom Models

If you have fine-tuned any GPT-3-based custom models in LastMile (or via this library), you can run inference/completions against the context of their associated datasets, which is stored in the model's embedding collections:

// Look up the custom model to find its embedding collections
const model = await lastmile.readModel("Your model ID");

// Run a completion against the context stored in the model's embedding collection
const completion = await lastmile.createOpenAICompletion({
  completionParams: {
    model: "text-davinci-003",
    prompt: "Your prompt here",
  },
  embeddingCollectionId: model.embeddingCollections[0]?.id,
});
const responseText = completion.choices[0]?.text;
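
If you call a custom model from several places, the two steps above can be wrapped in a small helper. completeWithCustomModel below is a name invented for this sketch (not part of the library); it only combines the readModel and createOpenAICompletion calls already shown:

// Hypothetical convenience wrapper around the two calls shown above.
async function completeWithCustomModel(modelId, prompt) {
  // Look up the model to find its embedding collection.
  const model = await lastmile.readModel(modelId);

  // Run the completion against that collection's context.
  const completion = await lastmile.createOpenAICompletion({
    completionParams: {
      model: "text-davinci-003",
      prompt,
    },
    embeddingCollectionId: model.embeddingCollections[0]?.id,
  });
  return completion.choices[0]?.text;
}

const answer = await completeWithCustomModel("Your model ID", "Your prompt here");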


Package last updated on 05 May 2023
