@llamaindex/chat-ui
Chat UI components for LLM apps
@llamaindex/chat-ui is a React component library that provides ready-to-use UI elements for building chat interfaces in LLM (Large Language Model) applications. This package is designed to streamline the development of chat-based user interfaces for AI-powered applications.
To install the package, run the following command in your project directory:
npm install @llamaindex/chat-ui
Update your tailwind.config.ts to include the chat-ui components:
module.exports = {
content: [
'app/**/*.{ts,tsx}',
'node_modules/@llamaindex/chat-ui/**/*.{ts,tsx}',
],
// ...
}
The easiest way to get started is to connect the whole ChatSection component with the useChat hook from vercel/ai:
import { ChatSection } from '@llamaindex/chat-ui'
import { useChat } from 'ai/react'
const ChatExample = () => {
const handler = useChat()
return <ChatSection handler={handler} />
}
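For context, the useChat hook posts chat messages to an /api/chat route by default. The handler below is only a minimal sketch of what that endpoint might look like, assuming a Next.js App Router project and a recent Vercel AI SDK with the @ai-sdk/openai provider; the model name and SDK version are assumptions, not part of chat-ui:
// app/api/chat/route.ts -- illustrative only, not part of @llamaindex/chat-ui
import { openai } from '@ai-sdk/openai'
import { streamText, convertToCoreMessages } from 'ai'

export async function POST(req: Request) {
  const { messages } = await req.json()
  // Stream the model's reply back in the format useChat expects
  const result = await streamText({
    model: openai('gpt-4o-mini'), // example model, swap in your own
    messages: convertToCoreMessages(messages),
  })
  return result.toDataStreamResponse()
}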
Components are designed to be composable. You can use them as is:
import { ChatSection } from '@llamaindex/chat-ui'
import { useChat } from 'ai/react'
const ChatExample = () => {
const handler = useChat()
return <ChatSection handler={handler} />
}
Or you can extend them with your own children components:
import { ChatSection, ChatMessages, ChatInput } from '@llamaindex/chat-ui'
import LlamaCloudSelector from './components/LlamaCloudSelector' // your custom component
import { useChat } from 'ai/react'
const ChatExample = () => {
const handler = useChat()
return (
<ChatSection handler={handler}>
<ChatMessages />
<ChatInput>
<ChatInput.Preview />
<ChatInput.Form className="bg-lime-500">
<ChatInput.Field type="textarea" />
<ChatInput.Upload />
<LlamaCloudSelector /> {/* custom component */}
<ChatInput.Submit />
</ChatInput.Form>
</ChatInput>
</ChatSection>
)
}
Your custom component can use the useChatUI hook to send additional data to the chat API endpoint:
import { useChatUI } from '@llamaindex/chat-ui'
const LlamaCloudSelector = () => {
const { requestData, setRequestData } = useChatUI()
return (
<div>
<select
value={requestData?.model}
onChange={e => setRequestData({ model: e.target.value })}
>
<option value="llama-3.1-70b-instruct">Pipeline 1</option>
<option value="llama-3.1-8b-instruct">Pipeline 2</option>
</select>
</div>
)
}
chat-ui components are based on shadcn components using Tailwind CSS.
You can override the default styles by changing CSS variables in the globals.css file of your Tailwind CSS configuration. For example, to change the background and foreground colors:
:root {
--background: 0 0% 100%;
--foreground: 222.2 84% 4.9%;
}
For a list of all available CSS variables, please refer to the Shadcn Theme Config.
Additionally, you can override each component's styles by setting custom classes in the className prop. For example, to set the background color of the ChatInput.Form component:
import { ChatSection, ChatMessages, ChatInput } from '@llamaindex/chat-ui'
import { useChat } from 'ai/react'
const ChatExample = () => {
const handler = useChat()
return (
<ChatSection handler={handler}>
<ChatMessages />
<ChatInput>
<ChatInput.Preview />
<ChatInput.Form className="bg-lime-500">
<ChatInput.Field type="textarea" />
<ChatInput.Upload />
<ChatInput.Submit />
</ChatInput.Form>
</ChatInput>
</ChatSection>
)
}
Inside the markdown component, we use highlight.js for code blocks, katex for LaTeX, and pdf-viewer for PDF files. If your app renders code, LaTeX, or PDF files, you'll need to import their CSS files:
import '@llamaindex/chat-ui/styles/code.css'
import '@llamaindex/chat-ui/styles/katex.css'
import '@llamaindex/chat-ui/styles/pdf.css'
The code.css file uses the atom-one-dark theme from highlight.js by default. There are many others to choose from: https://highlightjs.org/demo
You can use any of them by copying their CSS to your project and importing it.
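As a hedged illustration, assuming you copied the github-dark theme CSS from highlight.js into your project as ./styles/github-dark.css (the file name and path are just examples), you would import that file in place of code.css:
// hypothetical path: a highlight.js theme CSS copied into your own project
import './styles/github-dark.css'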
See the example app for a complete example. To generate a full-featured project to get started with, use create-llama.
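For example, create-llama can be run with npx (assuming a recent Node.js toolchain):
npx create-llama@latest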
@llamaindex/chat-ui is released under the MIT License.
If you encounter any issues or have questions, please file an issue on our GitHub repository.