forumaisdk

ForumAI NodeJS SDK

  • Version: 1.0.4 (latest, npm)
  • Maintainers: 0
  • Weekly downloads: 0 (decreased by 100%)

ForumAI SDK

This is the TypeScript implementation of the ForumAI SDK.

Setup and usage

Installation

The SDK is available on the npm registry, so you can install it with npm, yarn, or any other package manager:

npm install forumaisdk

Usage

  • With CommonJS (cjs):
const forumaisdk = require('forumaisdk');
const ModelMarket = forumaisdk.ModelMarket;
  • With ES modules (esm):
import { ModelMarket } from 'forumaisdk';

There are two modes of using the SDK to interact with a ForumAI LLM node:

  1. No response streaming

    In this mode you receive the entire chat response at once. Sample code looks like the following:

    const privateKey = process.env.PRIVATE_KEY;
    if (!privateKey) {
      throw new Error('Private key is not provided via env PRIVATE_KEY');
    }
    const model = new ModelMarket.Mixtral8x7BModelMarketTestnet(privateKey);
    const chat = [{ role: 'system', content: 'You are a helpful assistant!' }];
    
    while (true) {
      const prompt = await askQuestion('[INFO] Enter your prompt: ');
      if (['q', 'quit', 'exit'].includes(prompt)) {
        console.log('[INFO] Chat finished');
        break;
      }
      chat.push({ role: 'user', content: prompt });
      console.log('[INFO] Wait for response ...');
      const resp = await model.generate(3000, chat);
      console.log('[INFO] Response:\n', resp);
      chat.push({ role: 'assistant', content: resp });
    }
    

    The full sample code can be found here: simple-chat

  2. Response streaming

    In this mode the chat response is streamed back incrementally. Sample code looks like the following:

    const privateKey = process.env.PRIVATE_KEY;
    if (!privateKey) {
      throw new Error('Private key is not provided via env PRIVATE_KEY');
    }
    const model = new ModelMarket.Mixtral8x7BModelMarketTestnet(privateKey);
    const chat = [{ role: 'system', content: 'You are a helpful assistant!' }];
    
    while (true) {
      const prompt = await askQuestion('[INFO] Enter your prompt: ');
      if (['q', 'quit', 'exit'].includes(prompt)) {
        console.log('[INFO] Chat finished');
        break;
      }
      chat.push({ role: 'user', content: prompt });
      console.log('[INFO] Wait for response ...');
    
      const [node_url, result_code] = await model.generateSelfRequesting(3000, chat);
      let full_resp = '';
      let done = false;
      console.log('[INFO] Response:\n');
      while (!done) {
        const [resp, _done] = await model.getNextOutput(node_url, result_code, full_resp);
        full_resp += resp;
        done = _done;
        logWithoutEndLine(resp);
        await sleep(100);
      }
      logWithoutEndLine('\n');
      chat.push({ role: 'assistant', content: full_resp });
    }
    

    The full sample code can be found here: streamed-chat
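The inner polling loop of the streaming sample can also be repackaged as an async generator, so callers consume chunks with `for await` instead of managing the `done` flag themselves. This is a sketch, assuming `model.getNextOutput(nodeUrl, resultCode, textSoFar)` resolves to a `[chunk, done]` pair as shown above:

```javascript
// Sketch: wrap the streaming poll loop in an async generator.
// Assumes model.getNextOutput(nodeUrl, resultCode, textSoFar)
// resolves to [chunk, done], as in the sample above.
async function* streamResponse(model, nodeUrl, resultCode, pollMs = 100) {
  let full = '';
  let done = false;
  while (!done) {
    const [chunk, isDone] = await model.getNextOutput(nodeUrl, resultCode, full);
    full += chunk;
    done = isDone;
    if (chunk) yield chunk; // skip empty polls
    if (!done) await new Promise((resolve) => setTimeout(resolve, pollMs));
  }
}
```

A caller could then print the response with `for await (const chunk of streamResponse(model, node_url, result_code)) process.stdout.write(chunk);`.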


Package last updated on 03 Jul 2024
