@loopstack/chat-example-workflow

A module for the Loopstack AI automation framework.

This module provides an example workflow demonstrating how to build an interactive chat interface with an LLM.

Overview

The Chat Example Workflow shows how to create a conversational assistant that processes user messages and generates responses using an LLM. It demonstrates the core patterns for building chat-based applications in Loopstack.

By using this workflow as a reference, you'll learn how to:

  • Set up a system prompt using a Handlebars template file
  • Use wait: true to pause the workflow for user input
  • Process user messages through an LLM with ClaudeGenerateText
  • Create a message loop for continuous conversation
  • Configure custom UI widgets for user input
  • Save documents with the workflow repository

This example is useful for developers building chatbots, virtual assistants, or any conversational AI interface.

Installation

See SETUP.md for installation and setup instructions.

How It Works

Key Concepts

1. System Prompt Setup

The workflow begins with an @Initial method that saves a hidden system message. The message content is rendered from a Handlebars template file:

@Initial({ to: 'waiting_for_user' })
async setup() {
  await this.repository.save(
    ClaudeMessageDocument,
    { role: 'user', content: this.render(__dirname + '/templates/systemMessage.md') },
    { meta: { hidden: true } },
  );
}

The { meta: { hidden: true } } option ensures the system message is included in the LLM context but not displayed in the chat UI.
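The template file itself is not included in this README. A minimal systemMessage.md might look like the following sketch — the wording and the `context` placeholder are illustrative assumptions, not the actual template shipped with the package:

```handlebars
You are a helpful chat assistant.

Answer the user's questions concisely and accurately.
{{#if context}}
Additional context: {{context}}
{{/if}}
```

Because the rendered message is saved with `hidden: true`, its content reaches the LLM on every turn without ever appearing in the chat UI.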

2. Waiting for User Input

The userMessage transition uses wait: true to pause the workflow and wait for external input. A Zod schema defines the expected payload type:

@Transition({ from: 'waiting_for_user', to: 'ready', wait: true, schema: z.string() })
async userMessage(payload: string) {
  await this.repository.save(ClaudeMessageDocument, { role: 'user', content: payload });
}

When the user sends a message, the payload is saved as a ClaudeMessageDocument and the workflow transitions to the ready state.
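Conceptually, the transitions form a simple two-state loop. The sketch below models that loop in plain TypeScript; the state names mirror the workflow, but the state machine itself is an illustration, not part of the Loopstack API:

```typescript
// Illustrative model of the chat workflow's two-state loop.
type State = 'waiting_for_user' | 'ready';

interface ChatState {
  state: State;
  messages: { role: 'user' | 'assistant'; content: string }[];
}

// Mirrors the userMessage transition: only fires while waiting for input.
function userMessage(s: ChatState, payload: string): ChatState {
  if (s.state !== 'waiting_for_user') throw new Error('not waiting for input');
  return {
    state: 'ready',
    messages: [...s.messages, { role: 'user', content: payload }],
  };
}

// Mirrors the llmTurn transition: generates a reply (stubbed via a callback
// here) and loops back to waiting_for_user.
function llmTurn(
  s: ChatState,
  generate: (msgs: ChatState['messages']) => string,
): ChatState {
  if (s.state !== 'ready') throw new Error('no pending user message');
  const reply = generate(s.messages);
  return {
    state: 'waiting_for_user',
    messages: [...s.messages, { role: 'assistant', content: reply }],
  };
}
```

Each round trip appends one user message and one assistant message and returns the machine to waiting_for_user, which is exactly the loop the transitions above describe.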

3. LLM Response Generation

When the workflow reaches the ready state, it calls the LLM to generate a response from the full conversation history, retrieved via messagesSearchTag:

@Transition({ from: 'ready', to: 'waiting_for_user' })
async llmTurn() {
  const result = await this.claudeGenerateText.call({
    claude: { model: 'claude-sonnet-4-6' },
    messagesSearchTag: 'message',
  });
  await this.repository.save(ClaudeMessageDocument, result.data!, { id: result.data!.id });
}

The messagesSearchTag: 'message' parameter retrieves all saved ClaudeMessageDocument entries as conversation context. The LLM response is saved and the workflow loops back to waiting_for_user.
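The retrieval logic lives inside the tool, but its effect can be approximated as a filter over the saved documents. The collectMessages helper below is a hypothetical sketch — the tag field and document shape are assumptions for illustration, not the module's real internals:

```typescript
// Hypothetical shape of a saved message document; field names are assumptions.
interface SavedDoc {
  tags: string[];
  role: 'user' | 'assistant';
  content: string;
  meta?: { hidden?: boolean };
}

// Approximates what messagesSearchTag does: select every document carrying
// the tag, in insertion order. Hidden documents are kept, since they remain
// part of the LLM context even though the UI does not render them.
function collectMessages(docs: SavedDoc[], tag: string) {
  return docs
    .filter((d) => d.tags.includes(tag))
    .map((d) => ({ role: d.role, content: d.content }));
}
```

Because every ClaudeMessageDocument saved by the workflow carries the same tag, each call to the LLM sees the hidden system prompt plus the full back-and-forth so far.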

4. Custom UI Widgets

The workflow defines a prompt input widget that is enabled when waiting for user input:

ui:
  widgets:
    - widget: prompt-input
      enabledWhen:
        - waiting_for_user
      options:
        transition: userMessage
        label: Send Message

The transition: userMessage connects the widget to the userMessage method, and enabledWhen controls when the input is active.

Workflow Class

The complete workflow class uses @InjectTool() to access the ClaudeGenerateText tool and extends BaseWorkflow:

import { z } from 'zod';
import { ClaudeGenerateText, ClaudeMessageDocument } from '@loopstack/claude-module';
import { BaseWorkflow, Initial, InjectTool, Transition, Workflow } from '@loopstack/common';

@Workflow({
  uiConfig: __dirname + '/chat.ui.yaml',
})
export class ChatWorkflow extends BaseWorkflow {
  @InjectTool() claudeGenerateText: ClaudeGenerateText;

  @Initial({ to: 'waiting_for_user' })
  async setup() {
    await this.repository.save(
      ClaudeMessageDocument,
      { role: 'user', content: this.render(__dirname + '/templates/systemMessage.md') },
      { meta: { hidden: true } },
    );
  }

  @Transition({ from: 'waiting_for_user', to: 'ready', wait: true, schema: z.string() })
  async userMessage(payload: string) {
    await this.repository.save(ClaudeMessageDocument, { role: 'user', content: payload });
  }

  @Transition({ from: 'ready', to: 'waiting_for_user' })
  async llmTurn() {
    const result = await this.claudeGenerateText.call({
      claude: { model: 'claude-sonnet-4-6' },
      messagesSearchTag: 'message',
    });
    await this.repository.save(ClaudeMessageDocument, result.data!, { id: result.data!.id });
  }
}

Dependencies

This workflow uses the following Loopstack modules:

  • @loopstack/common - Core framework functionality, BaseWorkflow, decorators
  • @loopstack/claude-module - Provides ClaudeGenerateText tool and ClaudeMessageDocument

About

Author: Jakob Klippel

License: Apache-2.0

Keywords

assistant

Package last updated on 09 Apr 2026
