
ui-feedback-parser

License: MIT

A Python package designed to analyze user-submitted discussions or problem descriptions about improving business interfaces. This tool processes input text and outputs structured summaries highlighting key issues, suggested improvements, or actionable insights, helping teams quickly understand and address user feedback.

Installation

pip install ui_feedback_parser

Usage

The main function ui_feedback_parser takes user input text and returns a list of extracted insights:

from ui_feedback_parser import ui_feedback_parser

# Basic usage with default LLM7
user_input = "The Chase Travel app's booking process is confusing. I couldn't find the filter options for flights."
result = ui_feedback_parser(user_input)
print(result)
# Output: ["Booking process is confusing", "Missing filter options for flights"]

Parameters

  • user_input (str): The user input text to process
  • llm (Optional[BaseChatModel]): A LangChain chat model instance to use. If not provided, defaults to ChatLLM7
  • api_key (Optional[str]): API key for LLM7. If not provided, uses the LLM7_API_KEY environment variable or falls back to the free tier (see the example below)
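
Since llm and api_key are both optional, a call that stays on the default LLM7 backend but supplies an explicit key can be sketched as follows (the key value and feedback text are placeholders):

from ui_feedback_parser import ui_feedback_parser

# llm is omitted, so the package falls back to ChatLLM7;
# api_key is forwarded to LLM7 (placeholder value)
insights = ui_feedback_parser(
    "The itinerary page doesn't show layover durations.",
    api_key="your_api_key_here",
)
for insight in insights:
    print("-", insight)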

Using Different LLM Providers

You can use other LLM providers by passing a LangChain chat model instance:

# Using OpenAI
from langchain_openai import ChatOpenAI
from ui_feedback_parser import ui_feedback_parser

llm = ChatOpenAI()
result = ui_feedback_parser("The app's search function is too slow", llm=llm)

# Using Anthropic
from langchain_anthropic import ChatAnthropic
llm = ChatAnthropic()
result = ui_feedback_parser("The checkout process has too many steps", llm=llm)

# Using Google
from langchain_google_genai import ChatGoogleGenerativeAI
llm = ChatGoogleGenerativeAI()
result = ui_feedback_parser("The dashboard doesn't show my recent trips", llm=llm)

API Key Configuration

The default rate limits for LLM7's free tier are sufficient for most use cases. For higher rate limits:

  • Set the environment variable:
export LLM7_API_KEY="your_api_key_here"
  • Or pass the key directly:
result = ui_feedback_parser(user_input, api_key="your_api_key_here")

Get a free API key at https://token.llm7.io/
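
If exporting the variable in a shell is inconvenient (for example in a notebook), the same effect can presumably be achieved by setting LLM7_API_KEY from Python before the first call, since the package reads that environment variable:

import os

# Placeholder value; set before importing/calling so the package picks it up
os.environ["LLM7_API_KEY"] = "your_api_key_here"

from ui_feedback_parser import ui_feedback_parser
result = ui_feedback_parser("The loyalty points balance is hard to find")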

Contributing

Report issues or suggest improvements on our GitHub issues page.

Author

Eugene Evstafev
Email: hi@eugene.plus
GitHub: chigwell
