
@llamaindex/chat-ui
Chat UI components for LLM apps
@llamaindex/chat-ui is a React component library that provides ready-to-use UI elements for building chat interfaces in LLM (Large Language Model) applications. This package is designed to streamline the development of chat-based user interfaces for AI-powered applications.
You can quickly add a chatbot to your project by using the shadcn CLI command:
npx shadcn@latest add https://ui.llamaindex.ai/r/chat.json
To install the package, run the following command in your project directory:
npm install @llamaindex/chat-ui
For Tailwind CSS version 4.x, update globals.css to include the chat-ui components (update the relative path to node_modules if necessary):
@source '../node_modules/@llamaindex/chat-ui/**/*.{ts,tsx}';
For Tailwind CSS version 3.x, you need to add the following to your tailwind.config.ts file:
module.exports = {
  content: [
    'app/**/*.{ts,tsx}',
    'node_modules/@llamaindex/chat-ui/**/*.{ts,tsx}',
  ],
  // ...
}
The easiest way to get started is to connect the whole ChatSection component with the useChat hook from vercel/ai:
import { ChatSection } from '@llamaindex/chat-ui'
import { useChat } from 'ai/react'

const ChatExample = () => {
  const handler = useChat()
  return <ChatSection handler={handler} />
}
Components are designed to be composable. You can use them as is:
import { ChatSection } from '@llamaindex/chat-ui'
import { useChat } from 'ai/react'

const ChatExample = () => {
  const handler = useChat()
  return <ChatSection handler={handler} />
}
Or you can extend them with your own children components:
import { ChatSection, ChatMessages, ChatInput } from '@llamaindex/chat-ui'
import LlamaCloudSelector from './components/LlamaCloudSelector' // your custom component
import { useChat } from 'ai/react'

const ChatExample = () => {
  const handler = useChat()
  return (
    <ChatSection handler={handler}>
      <ChatMessages />
      <ChatInput>
        <ChatInput.Form className="bg-lime-500">
          <ChatInput.Field type="textarea" />
          <ChatInput.Upload />
          <LlamaCloudSelector /> {/* custom component */}
          <ChatInput.Submit />
        </ChatInput.Form>
      </ChatInput>
    </ChatSection>
  )
}
Your custom component can use the useChatUI hook to send additional data to the chat API endpoint:
import { useChatUI } from '@llamaindex/chat-ui'

const LlamaCloudSelector = () => {
  const { requestData, setRequestData } = useChatUI()
  return (
    <div>
      <select
        value={requestData?.model}
        onChange={e => setRequestData({ model: e.target.value })}
      >
        <option value="llama-3.1-70b-instruct">Pipeline 1</option>
        <option value="llama-3.1-8b-instruct">Pipeline 2</option>
      </select>
    </div>
  )
}
chat-ui components are based on shadcn components using Tailwind CSS.
You can override the default styles by changing CSS variables in the globals.css file of your Tailwind CSS configuration. For example, to change the background and foreground colors:
:root {
  --background: 0 0% 100%;
  --foreground: 222.2 84% 4.9%;
}
For a list of all available CSS variables, please refer to the Shadcn Theme Config.
Additionally, you can override each component's styles by setting custom classes in the className prop. For example, setting the background color of the ChatInput.Form component:
import { ChatSection, ChatMessages, ChatInput } from '@llamaindex/chat-ui'
import { useChat } from 'ai/react'

const ChatExample = () => {
  const handler = useChat()
  return (
    <ChatSection handler={handler}>
      <ChatMessages />
      <ChatInput>
        <ChatInput.Preview />
        <ChatInput.Form className="bg-lime-500">
          <ChatInput.Field type="textarea" />
          <ChatInput.Upload />
          <ChatInput.Submit />
        </ChatInput.Form>
      </ChatInput>
    </ChatSection>
  )
}
Inside the markdown component, we use highlight.js for code blocks, KaTeX for LaTeX, and pdf-viewer for PDF files. If your app renders code, LaTeX, or PDF files, you'll need to import their CSS files:
import '@llamaindex/chat-ui/styles/markdown.css' // code, latex and custom markdown styling
import '@llamaindex/chat-ui/styles/pdf.css' // pdf styling
import '@llamaindex/chat-ui/styles/editor.css' // document editor styling
The code.css file uses the atom-one-dark theme from highlight.js by default. There are many others to choose from at https://highlightjs.org/demo; you can use any of them by copying their CSS into your project and importing it.
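For example, assuming you have copied a theme's CSS from the highlight.js demo page into your project (the local path below is illustrative), you would import it from your app entry point alongside the chat-ui styles:

```typescript
// Illustrative sketch: swap the default atom-one-dark code theme for another
// highlight.js theme. The local file path is an assumption — use wherever you
// copied the theme CSS in your own project.
import '@llamaindex/chat-ui/styles/markdown.css' // base code/markdown styling
import './styles/github-dark.css' // your copied highlight.js theme
```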
For any language that the LLM generates, you can specify a custom renderer to render the output. We have an example for how to render mermaid code as SVG using a custom renderer.
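The dispatch idea behind custom renderers can be sketched as a map from language name to render function, with a default fallback. This is only an illustration of the concept, not the chat-ui renderer API; a real mermaid renderer would call mermaid.render to produce SVG, and a dependency-free placeholder stands in here:

```typescript
// Illustrative sketch of per-language code-block rendering with a fallback.
// The `LanguageRenderer` shape and `renderCodeBlock` helper are assumptions
// for illustration, not part of the @llamaindex/chat-ui API.
type LanguageRenderer = (code: string) => string

const languageRenderers: Record<string, LanguageRenderer> = {
  // A real implementation would call mermaid.render(id, code) and return the SVG;
  // this placeholder keeps the sketch dependency-free.
  mermaid: code => `<svg><!-- rendered from: ${code} --></svg>`,
}

function renderCodeBlock(language: string, code: string): string {
  const custom = languageRenderers[language]
  if (custom) return custom(code)
  // Default path: emit a plain <pre> block (highlight.js styling applies here).
  return `<pre><code class="language-${language}">${code}</code></pre>`
}
```

The fallback keeps every unrecognized language on the normal syntax-highlighting path, so adding a renderer for one language never affects the others.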
See the example app for a complete example. To generate a full-featured project to get started with, use create-llama.
@llamaindex/chat-ui is released under the MIT License.
If you encounter any issues or have questions, please file an issue on our GitHub repository.