@huggingface/agents
A way to call Hugging Face models and inference APIs from natural language, using an LLM.
pnpm add @huggingface/agents
npm add @huggingface/agents
yarn add @huggingface/agents
// Deno: via esm.sh
import { HfAgent } from "https://esm.sh/@huggingface/agents"
// or via the npm: specifier
import { HfAgent } from "npm:@huggingface/agents"
Check out the full documentation.
Agents.js leverages LLMs hosted as Inference APIs on HF, so you need to create an account and generate an access token.
import { HfAgent } from "@huggingface/agents";
const agent = new HfAgent("hf_...");
const code = await agent.generateCode("Draw a picture of a cat, wearing a top hat.");
console.log(code); // always good to check the generated code before running it
const outputs = await agent.evaluateCode(code);
console.log(outputs);
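If you do not want to call generateCode and evaluateCode as separate steps, the library also exposes a one-call helper. The sketch below assumes the run() method is available in the version of @huggingface/agents you install, so verify it against your installed version.
// One-shot sketch: generate and evaluate the code in a single call.
// Assumes agent.run() exists in your installed version of @huggingface/agents.
const results = await agent.run("Draw a picture of a cat, wearing a top hat.");
console.log(results);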
You can also use your own LLM by calling one of the LLMFrom* functions. You can specify any valid model on the Hub, as long as it has an inference API.
import { HfAgent, LLMFromHub } from "@huggingface/agents";
const agent = new HfAgent(
"hf_...",
LLMFromHub("hf_...", "OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5")
);
You can also specify your own endpoint, as long as it implements the same API, for example using Text Generation Inference or Inference Endpoints.
import { HfAgent, LLMFromEndpoint } from "@huggingface/agents";
const agent = new HfAgent(
"hf_...",
LLMFromEndpoint("hf_...", "http://...")
);
An LLM in this context is defined as any async function that takes a string input and returns a string. For example, if you wanted to use the OpenAI API, you could do so like this:
import { HfAgent } from "@huggingface/agents";
import { Configuration, OpenAIApi } from "openai";
const api = new OpenAIApi(new Configuration({ apiKey: "sk-..." }));
const llmOpenAI = async (prompt: string): Promise<string> => {
  return (
    (
      await api.createCompletion({
        model: "text-davinci-003",
        prompt: prompt,
        max_tokens: 1000,
      })
    ).data.choices[0].text ?? ""
  );
};

const agent = new HfAgent(
  "hf_...",
  llmOpenAI
);
// do anything you want with the agent here
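Because the contract is simply an async function from string to string, you can also wrap any HTTP text-generation backend yourself. The following is a minimal sketch assuming a locally running server that exposes a Text Generation Inference-style /generate route; the URL, route, and response shape are illustrative assumptions, not part of the library.
// Hypothetical custom LLM wrapping a local Text Generation Inference-style endpoint.
// The URL, the /generate route, and the { generated_text } response shape are assumptions for illustration.
const llmLocal = async (prompt: string): Promise<string> => {
  const res = await fetch("http://127.0.0.1:8080/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ inputs: prompt, parameters: { max_new_tokens: 900 } }),
  });
  const json = await res.json();
  return json.generated_text ?? "";
};

const localAgent = new HfAgent("hf_...", llmLocal);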
By default, agents ship with 4 tools (textToImage, textToSpeech, imageToText, speechToText), but you can easily expand the list by creating new tools and passing them in at initialization.
import { HfAgent, defaultTools, LLMFromHub } from "@huggingface/agents";
import type { Tool } from "@huggingface/agents/src/types";
// define the tool
const uppercaseTool: Tool = {
  name: "uppercase",
  description: "uppercases the input string and returns it",
  examples: [
    {
      prompt: "uppercase the string: hello world",
      code: `const output = uppercase("hello world")`,
      tools: ["uppercase"],
    },
  ],
  call: async (input) => {
    const data = await input;
    if (typeof data !== "string") {
      throw new Error("Input must be a string");
    }
    return data.toUpperCase();
  },
};

// pass it to the agent
const agent = new HfAgent(
  process.env.HF_ACCESS_TOKEN,
  LLMFromHub("hf_...", "OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5"),
  [uppercaseTool, ...defaultTools]
);
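Once registered, the custom tool is used through the same generateCode/evaluateCode flow shown earlier; the prompt below mirrors the example declared on the tool and is only illustrative.
// Illustrative usage of the custom tool through the agent.
const code = await agent.generateCode("uppercase the string: hello world");
console.log(code); // review the generated code before running it
const outputs = await agent.evaluateCode(code);
console.log(outputs);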
Dependencies: @huggingface/inference is required to call the inference endpoints themselves.
The npm package @huggingface/agents provides multi-modal agents using Hugging Face's models. It receives a total of 58 weekly downloads, so its popularity is classified as not popular. Its version release cadence and project activity are considered unhealthy because the last version was released a year ago. The package has 3 open source maintainers collaborating on the project.