@magicul/react-chat-stream
A React hook that lets you easily integrate your custom ChatGPT-like chat in React.
Introducing @magicul/react-chat-stream: a React hook designed to simplify integrating chat streams returned by your backend, letting messages appear word by word, similar to ChatGPT.
Are you building a ChatGPT-like chat interface? Then most likely you'll want the messages to appear word by word, just like ChatGPT. Vercel recently released the Vercel AI SDK, which adds streaming-first UI helpers, but what if you want to integrate your own backend? This package solves exactly that pain point: the logic is abstracted into a React hook that handles everything for you.
If your backend returns text/event-stream, then you can use this package. It does not "fake" the response by imitating the word-by-word appearance; it literally takes the responses from your backend as they come in through the stream. The hook provides a messages object that updates so you can display the result as it gets delivered.
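For illustration, a minimal sketch of a backend that responds with text/event-stream might look like the following Node.js handler. The route, port, chunk timing, and the SSE-style data: framing are assumptions made for this sketch, not requirements documented by the package:

```ts
// Illustrative sketch only: a tiny Node.js server that streams a reply
// with the text/event-stream content type this hook consumes.
// The port, timing, and `data:` framing below are assumptions.
import { createServer } from 'http';

const server = createServer((_req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });

  const words = 'Hello from your streaming backend'.split(' ');
  let i = 0;

  // Emit one word roughly every 100 ms, then close the stream.
  const timer = setInterval(() => {
    if (i < words.length) {
      res.write(`data: ${words[i++]} \n\n`);
    } else {
      clearInterval(timer);
      res.end();
    }
  }, 100);
});

server.listen(3001);
```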
Install this package with npm:

```bash
npm i @magicul/react-chat-stream
```

Or with yarn:

```bash
yarn add @magicul/react-chat-stream
```
With the useChatStream hook, you can easily integrate your own API to stream chat responses (text/event-stream). Responses from your backend will appear word-by-word to give it a ChatGPT-like user experience. The following example demonstrates how to use the hook to integrate your own API that streams the results.

Please note: Your API has to return text/event-stream.
```tsx
import React from 'react';
import useChatStream from '@magicul/react-chat-stream';

function App() {
  const { messages, input, handleInputChange, handleSubmit } = useChatStream({
    options: {
      url: 'https://your-api-url',
      method: 'POST',
    },
    // This means that the user input will be sent as the body of the request with the key 'prompt'.
    method: {
      type: 'body',
      key: 'prompt',
    },
  });

  return (
    <div>
      {messages.map((message, index) => (
        <div key={message.id}>
          <p>
            {message.role}: {message.content}
          </p>
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input type="text" onChange={handleInputChange} value={input} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}

export default App;
```
The useChatStream hook provides a variable named messages. This messages variable comes from the internal state of the hook and contains the chat message replies received from your API. Messages are updated in real time as the stream continues to receive data: the messages variable changes as new messages received from your backend are appended.
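Because messages is ordinary React state, you can react to its updates with standard React patterns. As a hypothetical example (not part of this package), an auto-scroll helper could watch the array:

```ts
import { useEffect, useRef } from 'react';

// Hypothetical helper, not part of @magicul/react-chat-stream: keeps a chat
// container scrolled to the newest message whenever the messages array
// returned by useChatStream changes.
export function useScrollToBottom(messages: unknown[]) {
  const bottomRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    bottomRef.current?.scrollIntoView({ behavior: 'smooth' });
  }, [messages]);

  // Attach the returned ref to an empty <div /> rendered after the last message.
  return bottomRef;
}
```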
Important: For this to work, your API must stream back the results of the AI model as parts of the string you want to display. The API endpoint you provide to the hook must return a text/event-stream event stream which contains the responses you would like to display.

The input of the hook is a configuration object with the following properties:
- string - the URL of the API endpoint.
- 'GET' | 'POST' - the HTTP method to use.
- object (optional) - the query parameters to send with the request.
- object (optional) - the headers to include in the request.
- object (optional) - the body of the request.
- number (optional) - the number of characters to display per second. If this is unused, the hook will display the messages as they come in.
- 'body' | 'query' - where to include the user's input in the request.
- string - the key of the input in the request.

The output of this hook is an object with the following properties:
- Array<ChatMessage> - an array of chat messages. Each message is an object with an id (can be used as a key in the loop), a role ('bot' or 'user') and content (the content of the message); a sketch of this shape appears after this list.
- string - the current user input; you can use this value as the form input value.
- function - a function to handle the change event of the input field. Pass it to the onChange prop of your input field.
- function - a function to handle the submit event of the form. Pass it to the onSubmit prop of your form.
- boolean - a boolean indicating whether the request is in progress.

If you want to see a working example, check out the example folder for an example of how to use this package.
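Going by the output description above, each entry in messages has roughly the following shape. This is a sketch: the field names and roles come from the list above, while the exact TypeScript types are assumptions:

```ts
// Sketch of the message shape described above; exact types are assumptions.
type ChatMessage = {
  id: string;            // unique id, usable as a React key
  role: 'bot' | 'user';  // who authored the message
  content: string;       // the message text received so far
};
```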
If you use Next.js version 13 or higher as your server-side rendering framework with React, you must use the useChatStream hook inside a client component. This is because the hook relies on useState, which can only run in a client component.
Transforming a regular server component into a client component is a straightforward task. Simply add the following line at the top of your component file:
```tsx
'use client';
```
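Putting it together, a client component in a Next.js 13+ app might look like the sketch below. The component name and URL are placeholders; the hook call itself mirrors the example shown earlier:

```tsx
'use client';

import useChatStream from '@magicul/react-chat-stream';

// Placeholder component name; only the 'use client' directive and the hook
// call mirror the usage documented above.
export default function Chat() {
  const { messages } = useChatStream({
    options: { url: 'https://your-api-url', method: 'POST' },
    method: { type: 'body', key: 'prompt' },
  });

  return (
    <ul>
      {messages.map(message => (
        <li key={message.id}>{message.content}</li>
      ))}
    </ul>
  );
}
```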
FAQs
A React hook that lets you easily integrate your custom ChatGPT-like chat in React.
The npm package @magicul/react-chat-stream receives a total of 306 weekly downloads. As such, @magicul/react-chat-stream popularity was classified as not popular.
We found that @magicul/react-chat-stream demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 0 open source maintainers collaborating on the project.