@aws/genai-plugin-langgraph-agent-for-backstage

LangGraph agent module package for the GenAI AWS plugins for Backstage

Version: 0.7.1 (latest, published on npm)

Generative AI plugin for Backstage - LangGraph Agent Type

This package implements an agent for the Generative AI plugin for Backstage based on LangGraph.js.

Features:

  • ReAct pattern to use available tools to answer prompts
  • Choose between Amazon Bedrock, OpenAI, or Ollama as the model provider
  • Optionally use the Backstage sqlite/Postgres database as a checkpoint store
  • Integrate with LangFuse for observability
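
The ReAct pattern alternates model reasoning with tool calls until the model can produce a final answer (or the superstep limit is exhausted). A minimal, self-contained sketch of that loop (the names and the fake model here are illustrative, not the plugin's actual LangGraph.js implementation):

```typescript
// Toy sketch of a ReAct loop: the model "reasons", optionally calls a
// tool, observes the result, and repeats until it answers or the
// recursion limit is reached. Purely illustrative.

type Tool = (input: string) => string;

// Stand-in "model": answers as soon as it sees a tool observation,
// otherwise asks for the calculator tool.
function fakeModel(history: string[]): { tool?: string; input?: string; answer?: string } {
  const last = history[history.length - 1] ?? "";
  if (last.startsWith("observation:")) {
    return { answer: last.replace("observation: ", "") };
  }
  return { tool: "calculator", input: "2+3" };
}

function reactLoop(
  prompt: string,
  tools: Record<string, Tool>,
  recursionLimit = 25 // mirrors the plugin's recursionLimit default
): string {
  const history: string[] = [`prompt: ${prompt}`];
  for (let step = 0; step < recursionLimit; step++) {
    const decision = fakeModel(history);
    if (decision.answer !== undefined) return decision.answer; // final answer
    const observation = tools[decision.tool!](decision.input!); // act
    history.push(`observation: ${observation}`); // observe
  }
  throw new Error("recursion limit reached");
}
```

The recursionLimit guard is what the global `recursionLimit` option bounds: each think/act/observe round is a graph superstep.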

Configuration

This agent can be configured at two levels: global and per-agent.

Global

Global configuration values apply to all agents; all of the following is optional:

genai:
  langgraph:
    memory: # (Optional) Memory store to use
    recursionLimit: # (Optional) Limit the number of graph supersteps (default 25)
    langfuse: # (Optional) Configuration for LangFuse observability
      baseUrl: http://localhost:3001 # (Required) LangFuse URL
      publicKey: pk-aaa # (Required) Public key
      secretKey: sk-bbb # (Required) Secret key

The available options for memory are:

  • in-memory: (Default) Store the agent state in memory
  • backstage: Uses the Backstage database to store agent state, either sqlite or PostgreSQL depending on the configuration
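
For example, a global configuration that persists agent state in the Backstage database and raises the superstep limit could look like this (the values are illustrative; the keys are those documented above):

```yaml
genai:
  langgraph:
    memory: backstage   # persist agent state in the Backstage database
    recursionLimit: 50  # allow more supersteps than the default of 25
```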

Per-agent

Per-agent configuration applies only to the corresponding agent. The available parameters are:

genai:
  agents:
    general:
      description: [...]
      prompt: [...]
      langgraph:
        messagesMaxTokens: 100000 # (Required) Prune message history to maximum of this number of tokens
        temperature: 0 # (Optional) Model temperature
        maxTokens: 4000 # (Optional) Maximum output tokens
        topP: 0.9 # (Optional) Model topP
        # Only include the subsequent section for your model provider
        # Bedrock only
        bedrock:
          modelId: 'anthropic.claude-3-5-sonnet-20241022-v2:0' # (Required) Bedrock model ID
          region: us-west-2 # (Required) Bedrock AWS region
        # OpenAI only
        openai:
          apiKey: ${OPENAI_API_KEY} # (Required) OpenAI API key
          modelName: 'gpt-3.5-turbo-instruct' # (Optional) OpenAI model name
          baseUrl: ${OPENAI_API_BASE_URL} # (Optional) URL for OpenAI API endpoint
        # Ollama only
        ollama:
          model: ${OLLAMA_MODEL} # (Required) Ollama model name
          baseUrl: ${OLLAMA_BASE_URL} # (Required) Base URL for Ollama
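
The `messagesMaxTokens` option keeps the message history within a token budget by pruning older messages. A rough sketch of the general idea, dropping the oldest messages first (the 4-characters-per-token estimate and the function names are illustrative, not the plugin's actual pruning logic):

```typescript
// Illustrative sketch of token-budget pruning as implied by
// messagesMaxTokens: drop the oldest messages until the estimated
// token count of the history fits the budget.

interface Message {
  role: "user" | "assistant" | "system";
  content: string;
}

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // crude heuristic, not a real tokenizer
}

function pruneMessages(messages: Message[], maxTokens: number): Message[] {
  const pruned = [...messages];
  let total = pruned.reduce((sum, m) => sum + estimateTokens(m.content), 0);
  // Remove from the front (oldest first) until we fit the budget,
  // always keeping at least the most recent message.
  while (pruned.length > 1 && total > maxTokens) {
    const removed = pruned.shift()!;
    total -= estimateTokens(removed.content);
  }
  return pruned;
}
```

Pruning oldest-first preserves the most recent turns, which carry the context the model needs to continue the conversation.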

Package last updated on 04 Feb 2026
