
# @loopstack/chat-example-workflow
A module for the Loopstack AI automation framework.
This module provides an example workflow demonstrating how to build an interactive chat interface with an LLM.
The Chat Example Workflow shows how to create a conversational assistant that processes user messages and generates responses using an LLM. It demonstrates the core patterns for building chat-based applications in Loopstack.
By using this workflow as a reference, you'll learn how to:

- Use `wait: true` to pause the workflow for user input
- Call the `ClaudeGenerateText` tool to generate responses from the conversation history

This example is useful for developers building chatbots, virtual assistants, or any conversational AI interface.
See SETUP.md for installation and setup instructions.
The workflow begins with an @Initial method that saves a hidden system message. The message content is rendered from a Handlebars template file:
```typescript
@Initial({ to: 'waiting_for_user' })
async setup() {
  await this.repository.save(
    ClaudeMessageDocument,
    { role: 'user', content: this.render(__dirname + '/templates/systemMessage.md') },
    { meta: { hidden: true } },
  );
}
```
The { meta: { hidden: true } } option ensures the system message is included in the LLM context but not displayed in the chat UI.
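Loopstack's `this.render` helper uses the real Handlebars engine, which is not shown here. As a rough, self-contained illustration of the idea, placeholder substitution over a template string could be sketched like this (the `assistantName` variable is hypothetical):

```typescript
// Minimal sketch of Handlebars-style placeholder substitution.
// Illustration only -- Loopstack's `this.render` uses the actual
// Handlebars engine and reads the template from a file.
function render(template: string, vars: Record<string, string> = {}): string {
  // Replace every {{name}} occurrence with its value (or '' if missing).
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (_match: string, name: string) => vars[name] ?? '');
}

const systemTemplate = 'You are a helpful assistant named {{assistantName}}.';
const content = render(systemTemplate, { assistantName: 'Loopy' });
// content === 'You are a helpful assistant named Loopy.'
```

The rendered string is what gets stored as the hidden message's `content`.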
The userMessage transition uses wait: true to pause the workflow and wait for external input. A Zod schema defines the expected payload type:
```typescript
@Transition({ from: 'waiting_for_user', to: 'ready', wait: true, schema: z.string() })
async userMessage(payload: string) {
  await this.repository.save(ClaudeMessageDocument, { role: 'user', content: payload });
}
```
When the user sends a message, the payload is saved as a ClaudeMessageDocument and the workflow transitions to the ready state.
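Conceptually, the `schema` option guards the transition: the payload is validated before the handler ever runs. A hypothetical dispatcher (not Loopstack's actual internals) that enforces what `schema: z.string()` expresses might look like:

```typescript
// Hypothetical sketch of a schema-guarded transition dispatcher.
// Loopstack's real dispatcher uses the Zod schema itself; here the
// check is hand-rolled to keep the example dependency-free.
type Handler = (payload: string) => void;

function dispatch(payload: unknown, handler: Handler): void {
  // schema: z.string() effectively asserts the payload is a string.
  if (typeof payload !== 'string') {
    throw new TypeError('userMessage expects a string payload');
  }
  handler(payload);
}

const saved: string[] = [];
dispatch('Hello!', (msg) => saved.push(msg)); // handler runs, message saved
// dispatch(42, ...) would throw before the handler is invoked
```

Invalid payloads are rejected up front, so the handler body can safely assume a well-typed `payload`.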
When the workflow reaches the ready state, it calls the LLM to generate a response based on the full conversation history using messagesSearchTag:
```typescript
@Transition({ from: 'ready', to: 'waiting_for_user' })
async llmTurn() {
  const result = await this.claudeGenerateText.call({
    claude: { model: 'claude-sonnet-4-6' },
    messagesSearchTag: 'message',
  });
  await this.repository.save(ClaudeMessageDocument, result.data!, { id: result.data!.id });
}
```
The messagesSearchTag: 'message' parameter retrieves all saved ClaudeMessageDocument entries as conversation context. The LLM response is saved and the workflow loops back to waiting_for_user.
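How the tag lookup assembles context can be pictured as filtering the saved documents by tag and projecting them into role/content pairs. The sketch below is hypothetical (the `MessageDoc` shape and `collectContext` helper are illustrations, not Loopstack APIs):

```typescript
// Hypothetical sketch of how messagesSearchTag could gather context:
// every saved document carrying the tag, in order, becomes part of
// the conversation history sent to the LLM.
interface MessageDoc {
  role: 'user' | 'assistant';
  content: string;
  tags: string[];
}

function collectContext(docs: MessageDoc[], tag: string) {
  return docs
    .filter((d) => d.tags.includes(tag))
    .map(({ role, content }) => ({ role, content }));
}

const store: MessageDoc[] = [
  { role: 'user', content: 'You are a helpful assistant.', tags: ['message'] }, // hidden system message
  { role: 'user', content: 'Hi!', tags: ['message'] },
  { role: 'assistant', content: 'Hello! How can I help?', tags: ['message'] },
];

const context = collectContext(store, 'message'); // all three messages, oldest first
```

Because the hidden system message is saved with the same tag, it is always part of the retrieved context even though the UI never shows it.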
The workflow defines a prompt input widget that is enabled when waiting for user input:
```yaml
ui:
  widgets:
    - widget: prompt-input
      enabledWhen:
        - waiting_for_user
      options:
        transition: userMessage
        label: Send Message
```
The `transition: userMessage` option connects the widget to the `userMessage` method, and `enabledWhen` controls when the input is active.
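The `enabledWhen` rule boils down to a membership check: the widget is active only while the workflow sits in one of the listed states. A minimal sketch of that check (the `WidgetConfig` type and `isWidgetEnabled` helper are hypothetical, not Loopstack APIs):

```typescript
// Hypothetical sketch of the enabledWhen check performed by the UI.
interface WidgetConfig {
  widget: string;
  enabledWhen: string[];
}

function isWidgetEnabled(cfg: WidgetConfig, currentState: string): boolean {
  return cfg.enabledWhen.includes(currentState);
}

const promptInput: WidgetConfig = {
  widget: 'prompt-input',
  enabledWhen: ['waiting_for_user'],
};

isWidgetEnabled(promptInput, 'waiting_for_user'); // true  -- input accepts messages
isWidgetEnabled(promptInput, 'ready');            // false -- input disabled while the LLM responds
```

This is why the input field greys out while `llmTurn` is running: the workflow is in `ready`, which is not in the widget's `enabledWhen` list.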
The complete workflow class uses @InjectTool() to access the ClaudeGenerateText tool and extends BaseWorkflow:
```typescript
import { z } from 'zod';
import { ClaudeGenerateText, ClaudeMessageDocument } from '@loopstack/claude-module';
import { BaseWorkflow, Initial, InjectTool, Transition, Workflow } from '@loopstack/common';

@Workflow({
  uiConfig: __dirname + '/chat.ui.yaml',
})
export class ChatWorkflow extends BaseWorkflow {
  @InjectTool() claudeGenerateText: ClaudeGenerateText;

  @Initial({ to: 'waiting_for_user' })
  async setup() {
    await this.repository.save(
      ClaudeMessageDocument,
      { role: 'user', content: this.render(__dirname + '/templates/systemMessage.md') },
      { meta: { hidden: true } },
    );
  }

  @Transition({ from: 'waiting_for_user', to: 'ready', wait: true, schema: z.string() })
  async userMessage(payload: string) {
    await this.repository.save(ClaudeMessageDocument, { role: 'user', content: payload });
  }

  @Transition({ from: 'ready', to: 'waiting_for_user' })
  async llmTurn() {
    const result = await this.claudeGenerateText.call({
      claude: { model: 'claude-sonnet-4-6' },
      messagesSearchTag: 'message',
    });
    await this.repository.save(ClaudeMessageDocument, result.data!, { id: result.data!.id });
  }
}
```
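Stripped of persistence and LLM calls, the workflow is a two-state loop: `waiting_for_user → ready → waiting_for_user`. The class below is a hypothetical simulation of just that state cycle, not Loopstack code:

```typescript
// Hypothetical simulation of the chat loop's state transitions:
// setup -> waiting_for_user -> (userMessage) -> ready -> (llmTurn) -> waiting_for_user -> ...
type State = 'initial' | 'waiting_for_user' | 'ready';

class ChatLoopSketch {
  state: State = 'initial';
  transcript: string[] = [];

  setup(): void {
    this.state = 'waiting_for_user';
  }

  userMessage(text: string): void {
    if (this.state !== 'waiting_for_user') throw new Error('not waiting for input');
    this.transcript.push(`user: ${text}`);
    this.state = 'ready';
  }

  llmTurn(reply: string): void {
    if (this.state !== 'ready') throw new Error('no pending user message');
    this.transcript.push(`assistant: ${reply}`);
    this.state = 'waiting_for_user'; // loop back, ready for the next message
  }
}

const loop = new ChatLoopSketch();
loop.setup();
loop.userMessage('Hi there');
loop.llmTurn('Hello! How can I help?');
// loop.state is 'waiting_for_user' again, ready for the next message
```

The guards in each method mirror the `from:` constraints on the real transitions: a message can only be accepted while waiting, and an LLM turn can only run once a message is pending.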
This workflow uses the following Loopstack modules:
- `@loopstack/common` - Core framework functionality, `BaseWorkflow`, decorators
- `@loopstack/claude-module` - Provides the `ClaudeGenerateText` tool and `ClaudeMessageDocument`

Author: Jakob Klippel
License: MIT
FAQs
An example chat workflow to interact with an LLM.
The npm package @loopstack/chat-example-workflow receives a total of 486 weekly downloads; its popularity is classified as not popular.
We found that @loopstack/chat-example-workflow demonstrates a healthy version release cadence and project activity, as the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.