
@loopstack/chat-example-workflow
A module for the Loopstack AI automation framework.
This module provides an example workflow demonstrating how to build an interactive chat interface with an LLM.
The Chat Example Workflow shows how to create a conversational assistant that processes user messages and generates responses using an LLM. It demonstrates the core patterns for building chat-based applications in Loopstack.
By using this workflow as a reference, you'll learn how to:
- define a hidden system message that sets the assistant's behavior
- call an LLM with the conversation history and store its response as a document
- handle user input through a manually triggered transition
- access tool results via the runtime object

This example is useful for developers building chatbots, virtual assistants, or any conversational AI interface.
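The chat loop described above is a small state machine. The following sketch models just the transition sequence; the state and transition names are taken from the workflow YAML below, but the stepping engine itself is hypothetical and not Loopstack's actual implementation.

```typescript
// Minimal model of the chat workflow's state machine.
// State and transition ids mirror the YAML definition.
type State = 'start' | 'ready' | 'prompt_executed' | 'waiting_for_user';

interface Transition {
  id: string;
  from: State;
  to: State;
}

const transitions: Transition[] = [
  { id: 'greeting', from: 'start', to: 'ready' },
  { id: 'prompt', from: 'ready', to: 'prompt_executed' },
  { id: 'add_response', from: 'prompt_executed', to: 'waiting_for_user' },
  { id: 'user_message', from: 'waiting_for_user', to: 'ready' },
];

// Advance from the current state by firing the first matching transition.
function step(state: State): State {
  const t = transitions.find((t) => t.from === state);
  if (!t) throw new Error(`no transition from ${state}`);
  return t.to;
}

let state: State = 'start';
const visited: State[] = [state];
for (let i = 0; i < 4; i++) {
  state = step(state);
  visited.push(state);
}
console.log(visited.join(' -> '));
// start -> ready -> prompt_executed -> waiting_for_user -> ready
```

Note that `user_message` loops back to `ready`, which is what keeps the conversation going: each user input re-triggers the LLM prompt.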
See SETUP.md for installation and setup instructions.
The workflow begins by creating a hidden system message that defines the assistant's behavior:
- id: greeting
  from: start
  to: ready
  call:
    - tool: createDocument
      args:
        document: aiMessageDocument
        update:
          meta:
            hidden: true
          content:
            role: system
            parts:
              - type: text
                text: |
                  You are a helpful assistant named Bob.
                  Always tell the user your name.
                  Use available tools to help the user with their requests.
When the workflow reaches the ready state, it calls the LLM to generate a response based on the conversation history. The tool call is given an id so its result can be referenced later via the runtime object:
- id: prompt
  from: ready
  to: prompt_executed
  call:
    - id: llm_call
      tool: aiGenerateText
      args:
        llm:
          provider: openai
          model: gpt-4o
        messagesSearchTag: message
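The id-based referencing works like a two-level registry: results are keyed first by the transition id, then by the tool call id. The sketch below illustrates that bookkeeping; the `recordToolResult` helper and the registry shape are hypothetical, written only to show what `runtime.tools.<transition>.<call>` implies, not Loopstack's real runtime API.

```typescript
// Hypothetical sketch of how a tool result could end up under
// runtime.tools.<transitionId>.<callId>.
interface ToolResult<T> {
  data: T;
}

type ToolRegistry = Record<string, Record<string, ToolResult<unknown>>>;

const runtime = { tools: {} as ToolRegistry };

// Store a tool's output under its transition id and call id.
function recordToolResult<T>(transitionId: string, callId: string, data: T): void {
  runtime.tools[transitionId] ??= {};
  runtime.tools[transitionId][callId] = { data };
}

// After the `prompt` transition runs its `llm_call` tool:
recordToolResult('prompt', 'llm_call', {
  role: 'assistant',
  parts: [{ type: 'text', text: 'Hi, I am Bob.' }],
});

console.log(runtime.tools.prompt.llm_call.data);
```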
After the LLM generates a response, the result is stored as a document. The response data is accessed through runtime.tools.prompt.llm_call.data, which references the result of the llm_call tool in the prompt transition:
- id: add_response
  from: prompt_executed
  to: waiting_for_user
  call:
    - tool: createDocument
      args:
        document: aiMessageDocument
        update:
          content: ${{ runtime.tools.prompt.llm_call.data }}
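The `${{ ... }}` expression is essentially a dotted-path lookup into the runtime context. The resolver below is an illustrative stand-in (Loopstack's actual templating engine is not shown here); it only demonstrates the path traversal the expression implies.

```typescript
// Illustrative resolver for ${{ ... }} expressions: extracts the dotted
// path and walks it through a context object. Hypothetical, for clarity only.
function resolveExpression(template: string, context: Record<string, unknown>): unknown {
  const match = /^\$\{\{\s*(.+?)\s*\}\}$/.exec(template);
  if (!match) return template; // not an expression: pass through unchanged
  return match[1]
    .split('.')
    .reduce<unknown>((obj, key) => (obj as Record<string, unknown>)?.[key], context);
}

const context = {
  runtime: {
    tools: { prompt: { llm_call: { data: { role: 'assistant' } } } },
  },
};

console.log(resolveExpression('${{ runtime.tools.prompt.llm_call.data }}', context));
// → { role: 'assistant' }
```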
The workflow defines a custom UI action that allows users to send messages:
ui:
  widgets:
    - widget: prompt-input
      enabledWhen:
        - ready
      options:
        transition: user_message
        label: Send Message
User messages are handled through a manually triggered transition that captures the input payload:
- id: user_message
  from: waiting_for_user
  to: ready
  trigger: manual
  call:
    - id: prompt_message
      tool: createDocument
      args:
        document: aiMessageDocument
        update:
          content:
            role: user
            parts:
              - type: text
                text: ${{ runtime.transition.payload }}
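Putting the pieces together, one full chat cycle runs `user_message`, then `prompt`, then `add_response`. The sketch below walks that cycle with the LLM call stubbed out; the message shapes are inferred from the YAML above and are hypothetical, not Loopstack's actual types.

```typescript
// End-to-end sketch of one chat cycle, with the LLM stubbed.
interface MessagePart { type: 'text'; text: string; }
interface Message { role: 'system' | 'user' | 'assistant'; parts: MessagePart[]; }

const history: Message[] = [
  { role: 'system', parts: [{ type: 'text', text: 'You are a helpful assistant named Bob.' }] },
];

// user_message: capture the trigger payload as a user message document.
function userMessage(payload: string): void {
  history.push({ role: 'user', parts: [{ type: 'text', text: payload }] });
}

// prompt + add_response: generate a reply from the history and store it.
function promptAndRespond(generate: (msgs: Message[]) => string): void {
  const text = generate(history);
  history.push({ role: 'assistant', parts: [{ type: 'text', text }] });
}

userMessage('What is your name?');
promptAndRespond(() => 'I am Bob. How can I help?');
console.log(history.length); // 3
```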
The TypeScript workflow class declares the tools, documents, and runtime types used in the YAML definition:
// Import paths below are assumed from the module list at the end of this
// README: decorators and CreateDocument from @loopstack/core, AI types from
// @loopstack/ai-module.
import { Workflow, InjectTool, InjectDocument, Runtime, CreateDocument } from '@loopstack/core';
import { ToolResult } from '@loopstack/common';
import { AiGenerateText, AiMessageDocument, AiMessageDocumentContentType } from '@loopstack/ai-module';

@Workflow({
  uiConfig: __dirname + '/chat.ui.yaml',
})
export class ChatWorkflow {
  @InjectTool() createDocument: CreateDocument;
  @InjectTool() aiGenerateText: AiGenerateText;
  @InjectDocument() aiMessageDocument: AiMessageDocument;

  @Runtime()
  runtime: {
    tools: Record<'prompt', Record<'llm_call', ToolResult<AiMessageDocumentContentType>>>;
  };
}
The @Runtime() decorator provides typed access to tool results. Here, runtime.tools.prompt.llm_call gives access to the result of the llm_call tool executed in the prompt transition.
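The payoff of the `Record<'prompt', Record<'llm_call', ...>>` typing is compile-time safety: the string-literal keys mean a typo in a transition or call id is a type error, and `data` has a concrete type. The self-contained sketch below shows this with simplified local stand-ins (`ToolResult` and the message content type really live in the Loopstack modules).

```typescript
// Simplified stand-ins for the real Loopstack types.
interface ToolResult<T> { data: T; }
interface AiMessageContent { role: string; parts: { type: string; text: string }[]; }

type Runtime = {
  tools: Record<'prompt', Record<'llm_call', ToolResult<AiMessageContent>>>;
};

const runtime: Runtime = {
  tools: {
    prompt: {
      llm_call: { data: { role: 'assistant', parts: [{ type: 'text', text: 'Hello' }] } },
    },
  },
};

// Fully typed: the compiler knows `data` is AiMessageContent,
// and `runtime.tools.other` would be a compile-time error.
const reply = runtime.tools.prompt.llm_call.data;
console.log(reply.parts[0].text);
```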
This workflow uses the following Loopstack modules:
- @loopstack/core - Core framework functionality and the CreateDocument tool
- @loopstack/ai-module - Provides the AiGenerateText tool and AiMessageDocument

Author: Jakob Klippel
License: Apache-2.0