
llamaindex
Use your own data with large language models (LLMs, OpenAI ChatGPT and others) in JS runtime environments with TypeScript support.
Documentation: https://ts.llamaindex.ai/
LlamaIndex.TS aims to be a lightweight, easy to use set of libraries to help you integrate large language models into your applications with your own data.
LlamaIndex.TS supports multiple JS environments, including:
For now, browser support is limited due to the lack of support for AsyncLocalStorage-like APIs in browsers.
npm install llamaindex
pnpm install llamaindex
yarn add llamaindex
See our official document: https://ts.llamaindex.ai/docs/llamaindex/getting_started/
In most cases, you'll also need to install provider packages to use LlamaIndex.TS. These add AI models, file readers for ingestion, or document storage (for example, in vector databases).
For example, to use the OpenAI LLM, you would install the following package:
npm install @llamaindex/openai
pnpm install @llamaindex/openai
yarn add @llamaindex/openai
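Once the provider package is installed, the model is typically registered globally through Settings. A minimal sketch, assuming the `OpenAI` class from `@llamaindex/openai` (the model name below is illustrative, and an `OPENAI_API_KEY` environment variable is expected):

```typescript
import { Settings } from "llamaindex";
import { OpenAI } from "@llamaindex/openai";

// Register OpenAI as the default LLM for subsequent LlamaIndex.TS operations.
// The model name is illustrative; the API key is read from OPENAI_API_KEY.
Settings.llm = new OpenAI({ model: "gpt-4o-mini", temperature: 0 });
```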
Check out our NextJS playground at https://llama-playground.vercel.app/. The source is available at https://github.com/run-llama/ts-playground
Document: A document represents a text file, PDF file or other contiguous piece of data.
Node: The basic data building block. Most commonly, these are parts of the document split into manageable pieces that are small enough to be fed into an embedding model and LLM.
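For instance, a long text can be wrapped in a Document and split into Nodes. A sketch assuming the `Document` and `SentenceSplitter` exports from `llamaindex` (chunk sizes are illustrative):

```typescript
import { Document, SentenceSplitter } from "llamaindex";

const document = new Document({
  text: "LlamaIndex.TS connects your own data to LLMs. Documents are split into nodes before indexing.",
});

// Split the document into nodes small enough for an embedding model.
const splitter = new SentenceSplitter({ chunkSize: 512, chunkOverlap: 20 });
const nodes = splitter.getNodesFromDocuments([document]);
```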
Embedding: Embeddings are sets of floating point numbers which represent the data in a Node. By comparing the similarity of embeddings, we can derive an understanding of the similarity of two pieces of data. One use case is to compare the embedding of a question with the embeddings of our Nodes to see which Nodes may contain the data needed to answer that question. Because the default service context is OpenAI, the default embedding is OpenAIEmbedding. If using different models, say through Ollama, use the corresponding Embedding class (see the documentation for all available embeddings).
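The similarity comparison itself is usually cosine similarity: the closer the score is to 1, the more alike the two pieces of data. A minimal, dependency-free sketch of the computation:

```typescript
// Cosine similarity between two embedding vectors of equal length:
// dot(a, b) / (|a| * |b|), ranging from -1 (opposite) to 1 (same direction).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical vectors score 1; orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```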
Indices: Indices store the Nodes and the embeddings of those nodes. QueryEngines retrieve Nodes from these Indices using embedding similarity.
QueryEngine: Query engines process the query you put in and give you back the result. Query engines generally combine a pre-built prompt with selected Nodes from your Index to give the LLM the context it needs to answer your query. To build a query engine from your Index (recommended), use the asQueryEngine method on your Index. See the documentation for all available query engines.
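Put together, the typical flow is: index your documents, build a query engine, and query. A sketch assuming the default OpenAI service context (requires `OPENAI_API_KEY`); the sample text and question are illustrative:

```typescript
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  const document = new Document({ text: "Abraham Lincoln was born in 1809." });

  // Build an index (embeds the nodes) and a query engine over it.
  const index = await VectorStoreIndex.fromDocuments([document]);
  const queryEngine = index.asQueryEngine();

  // The engine retrieves relevant Nodes and passes them to the LLM as context.
  const response = await queryEngine.query({ query: "When was Lincoln born?" });
  console.log(response.toString());
}

main();
```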
ChatEngine: A ChatEngine helps you build a chatbot that will interact with your Indices. See the documentation for all available chat engines.
SimplePrompt: A simple standardized function call definition that takes in inputs and formats them in a template literal. SimplePrompts can be specialized using currying and combined using other SimplePrompt functions.
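To illustrate the idea of specializing a prompt via currying, here is a minimal sketch; the `Prompt` type and function names below are hypothetical, not the library's actual SimplePrompt API:

```typescript
// A prompt is just a function from named inputs to a formatted template literal.
type Prompt = (input: Record<string, string>) => string;

const qaPrompt: Prompt = ({ context, query }) =>
  `Answer the question using only this context:\n${context}\nQuestion: ${query}`;

// "Currying": pre-fill one input to produce a specialized prompt.
const withContext =
  (prompt: Prompt, context: string) =>
  (input: Record<string, string>) =>
    prompt({ ...input, context });

const lincolnQa = withContext(qaPrompt, "Abraham Lincoln was born in 1809.");
console.log(lincolnQa({ query: "When was Lincoln born?" }));
```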
Please see our contributing guide for more information. You are highly encouraged to contribute to LlamaIndex.TS!
Please join our Discord! https://discord.com/invite/eN6D2HQ4aX