# @isdk/ai-tool-llm
LLMProvider is the base class for all Large Language Model (LLM) backends; a single backend can serve several model types. All LLM output is obtained through this class.
Every LLM provider must define:

* `rule: RegExp | string | function` (optional): determines which model names this provider will service. For instance, llama.cpp might use the pattern `/[.]gguf$/`.
* `async function(input: LLMArguments)`: accepts the input and returns the LLM's output, which can be either a streamed JSON object or a non-streamed response. Note: the registered provider name is treated as the protocol part of the model URL.
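As a rough sketch (not the package's documented API), a provider serving llama.cpp GGUF models might look like the following. The import names, the class name `LlamaCppProvider`, the method name `fn` for the output function, and the placeholder backend call are all assumptions made for illustration:

```ts
// Sketch only: assumes LLMProvider and LLMArguments are exported by the package,
// and that the output function is exposed as a method named `fn`.
import { LLMProvider, type LLMArguments } from '@isdk/ai-tool-llm'

export class LlamaCppProvider extends LLMProvider {
  // Serve only model names ending in ".gguf".
  rule = /[.]gguf$/

  // Accepts the LLM arguments and returns an object shaped like the
  // AIResult schema shown below (non-streamed case).
  async fn(input: LLMArguments) {
    const text = `echo: ${JSON.stringify(input)}` // stand-in for a real backend call
    return { content: text, finishReason: 'stop' as const }
  }
}
```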
The LLM's output is structured as a JSON object adhering to the following schema:

```ts
export type AITextGenerationFinishReason =
  | 'stop'           // Model generated a stop sequence
  | 'length'         // Maximum token limit reached
  | 'content-filter' // Content violated filters
  | 'tool-calls'     // Model invoked tool calls
  | 'abort'          // Aborted by the user, or a stream timed out
  | 'error'          // Model halted due to an error
  | 'other'          // Other termination reasons
  | null;            // No specified reason

export interface AIResult<TValue = any, TOptions = any> {
  /**
   * Generated content.
   */
  content?: TValue;
  /**
   * Reason for generation termination.
   */
  finishReason?: AITextGenerationFinishReason;
  /**
   * Optional parameters associated with the result.
   */
  options?: TOptions;
}
```
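For example, a completed non-streaming generation could yield a result such as this (the content string is illustrative):

```ts
// A finished, non-streaming result: content plus the reason generation stopped.
const result: AIResult<string> = {
  content: 'The capital of France is Paris.',
  finishReason: 'stop',
}
```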
In the case of streaming output, the returned JSON objects exclude `finishReason` and include only `content`; `options` is optional.
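As a sketch of how streamed output could be consumed (the async-generator transport here is an assumption, not the package's documented streaming interface), each chunk carries only `content`:

```ts
// Hypothetical stream: partial AIResult objects without finishReason.
async function* sampleStream(): AsyncGenerator<AIResult<string>> {
  yield { content: 'The capital of ' }
  yield { content: 'France is ' }
  yield { content: 'Paris.' }
}

let text = ''
for await (const chunk of sampleStream()) {
  text += chunk.content ?? ''
}
// text is now 'The capital of France is Paris.'
```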
To register diverse LLM backends and retrieve results, invoke the LLM method. If no specific LLM backend is designated, the default backend set via `current` processes the request. The LLM method can also be used to query the parameter size of the current model.
The static method `getByModel(modelName: string)` determines the appropriate provider for a given model name.
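For instance, resolving and invoking a provider by model name might look like this; the model name, the input object, and the `fn` method name are illustrative assumptions:

```ts
// getByModel picks the provider whose `rule` matches the given model name.
const provider = LLMProvider.getByModel('qwen2-7b-instruct.q4_0.gguf')
if (provider) {
  // The exact LLMArguments shape is not shown above, so the input is illustrative.
  const result = await provider.fn({ prompt: 'Hello!' } as any)
  console.log(result.content, result.finishReason)
}
```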