
ai-workflow-utils
A comprehensive automation platform that streamlines software development workflows by integrating AI-powered content generation with popular development tools like Jira, Bitbucket, and email systems. Includes startup service management so the application can launch automatically on system boot.
The Ultimate AI-Powered Development Workflow Automation Platform
Streamline your development process with intelligent Jira ticket creation, AI-powered code reviews, and pull request creation with custom template support, all wrapped in a beautiful dark/light theme interface.
Create professional Jira tickets (Tasks, Bugs, Stories) using AI with multiple provider support:
Revolutionary AI-powered pull request creation for Atlassian Bitbucket:
Revolutionary AI-powered pull request reviews for Atlassian Bitbucket:
Comprehensive logging and monitoring system for troubleshooting and system insights:
The new API Client module provides a flexible, general-purpose interface for making API requests to any service (Jira, Bitbucket, email, or custom endpoints), exposed at the `/api/api-client` endpoint. Its configuration is stored alongside the rest of the settings in `~/.ai-workflow-utils/environment.json`.
Coming Soon: AI-powered automation, script generation, and smart workflow integration will be added in future releases.
Beautiful, adaptive interface that automatically adjusts to your preferences:
Advanced Model Context Protocol (MCP) client management for seamless AI tool integration:
```bash
# Install Ollama (if you want local AI processing)
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh

# Windows - download from https://ollama.com/

# Download the LLaVA model for image analysis
ollama pull llava

# Start the Ollama service
ollama serve
```
Then configure Ollama as your AI provider in the web interface.
```bash
npm install -g ai-workflow-utils

# Start the application directly
ai-workflow-utils
```

The application will start immediately and be available at http://localhost:3000.
For production use or to run automatically on system boot:
```bash
ai-workflow-utils startup install
```

The service will now start automatically on boot. Access it at http://localhost:3000.
Startup Service Management:
```bash
ai-workflow-utils startup status     # Check service status
ai-workflow-utils startup start      # Start the service
ai-workflow-utils startup stop       # Stop the service
ai-workflow-utils startup uninstall  # Remove startup service
```
Supported Platforms:
For detailed startup service documentation, see STARTUP.md
All configuration is managed through the web-based settings page:
http://localhost:3000/settings/environment
All changes are saved to `~/.ai-workflow-utils/environment.json` and persist across upgrades. No manual `.env` setup required!
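For reference, a hypothetical sketch of what `~/.ai-workflow-utils/environment.json` might contain. The keys mirror environment variables documented elsewhere in this README; the actual file layout is managed by the settings page and may differ:

```json
{
  "JIRA_MOCK_MODE": "false",
  "OLLAMA_MODEL": "llava:7b",
  "LOG_LEVEL": "info",
  "API_RATE_LIMIT": "100",
  "MAX_FILE_SIZE": "50MB"
}
```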
AI Workflow Utils is a fully-featured PWA! Install it as a native app for the best experience:
🖥️ Desktop Installation:
Open http://localhost:3000 in Chrome or Edge and use the browser's install prompt to add the app.

✨ PWA Benefits:
What makes it special:
Example Usage:
AI Providers Supported:
Revolutionary PR Generation:
How it works:
AI Features:
Revolutionary Code Review:
How it works:
Comprehensive System Monitoring:
How it works:
Monitoring Features:
AI Workflow Utils follows functional programming principles throughout the codebase:
Benefits:
Comprehensive Jira Mocking Service for development and testing:
```bash
# Enable mock mode (no real API calls)
JIRA_MOCK_MODE=true

# Use real Jira API
JIRA_MOCK_MODE=false
```
Mock Service Features:
Functional Mock Architecture:
```javascript
// Shared mock state (module-private)
let mockState = {};

// Pure state management helpers: return copies, never mutate
const getMockState = () => ({ ...mockState });
const updateMockState = updates => ({ ...mockState, ...updates });

// Functional API operations
export const createIssue = async issueData => {
  /* pure function */
};
export const getIssue = async issueKey => {
  /* pure function */
};
export const searchIssues = async jql => {
  /* pure function */
};
```
```
server/
├── controllers/          # Feature-based controllers
│   ├── jira/             # Jira integration
│   │   ├── services/     # Business logic services
│   │   ├── models/       # Data models
│   │   ├── utils/        # Utility functions
│   │   └── README.md     # Module documentation
│   ├── pull-request/     # PR creation & review
│   ├── email/            # Email generation
│   ├── chat/             # AI chat integration
│   └── mcp/              # Model Context Protocol client management
├── mocks/                # Mock services (excluded from npm package)
│   └── jira/             # Comprehensive Jira mocking
└── services/             # Shared services
```
Each module follows the same structure:
```bash
# Interactive setup wizard
ai-workflow-setup

# Check configuration
ai-workflow-utils --config

# Test connections
ai-workflow-utils --test

# Start in development mode
ai-workflow-utils --dev

# Enable debug logging
ai-workflow-utils --debug

# Specify custom port
ai-workflow-utils --port 8080

# View logs in real-time
ai-workflow-utils --logs

# Clear log files
ai-workflow-utils --clear-logs

# Check Ollama status
ai-workflow-utils --ollama-status

# Download recommended models
ai-workflow-utils --setup-ollama

# List available models
ollama list
```
Automatic fallback order:
1. OpenAI Compatible API (primary)
2. Ollama LLaVA (local fallback)
3. Error handling with user notification
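The fallback order above can be sketched as a simple provider chain. This is an illustration only, not the project's actual code; `primary` and `local` below are hypothetical stand-ins for real provider calls:

```javascript
// Minimal sketch of a provider fallback chain: try each provider in
// order, return the first success, and surface all errors if every
// provider fails so the user can be notified.
async function generateWithFallback(prompt, providers) {
  const errors = [];
  for (const provider of providers) {
    try {
      return await provider(prompt); // first provider that succeeds wins
    } catch (err) {
      errors.push(err); // remember the failure and try the next one
    }
  }
  // All providers failed: surface the collected errors to the caller
  throw new Error(
    `All providers failed: ${errors.map(e => e.message).join("; ")}`
  );
}

// Example with stub providers: the first fails, the second succeeds
const primary = async () => {
  throw new Error("OpenAI API unreachable");
};
const local = async p => `local result for: ${p}`;

generateWithFallback("hello", [primary, local]).then(console.log);
// → "local result for: hello"
```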
```bash
# For different OpenAI-compatible providers:
OPENAI_COMPATIBLE_MODEL=gpt-4-vision-preview       # OpenAI
OPENAI_COMPATIBLE_MODEL=claude-3-sonnet-20240229   # Anthropic
OPENAI_COMPATIBLE_MODEL=llama-2-70b-chat           # Custom API

# For Ollama local models:
OLLAMA_MODEL=llava:13b       # Larger model for better quality
OLLAMA_MODEL=llava:7b        # Faster, smaller model
OLLAMA_MODEL=codellama:7b    # Code-focused model

# Streaming configuration
STREAM_CHUNK_SIZE=1024
STREAM_TIMEOUT=60000

# Rate limiting
API_RATE_LIMIT=100
API_RATE_WINDOW=900000

# File upload limits
MAX_FILE_SIZE=50MB
ALLOWED_FILE_TYPES=jpg,jpeg,png,gif,mp4,mov,pdf,doc,docx

# Logging configuration
LOG_LEVEL=info                 # error, warn, info, debug
LOG_MAX_SIZE=10MB              # Maximum log file size
LOG_MAX_FILES=5                # Number of rotated log files
LOG_RETENTION_DAYS=30          # Days to keep log files
ENABLE_REQUEST_LOGGING=true    # Log all HTTP requests
```
```bash
# Build the application
npm run build

# Create Docker image
docker build -t ai-workflow-utils .

# Run container
docker run -p 3000:3000 --env-file .env ai-workflow-utils

# Install PM2
npm install -g pm2

# Start application
pm2 start ecosystem.config.js

# Monitor
pm2 monit

# View logs
pm2 logs ai-workflow-utils
```
```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_cache_bypass $http_upgrade;
    }
}
```
A `/health` endpoint is available for monitoring.

```bash
# Logging levels: error, warn, info, debug
LOG_LEVEL=info

# Log file rotation
LOG_MAX_SIZE=10MB
LOG_MAX_FILES=5

# Enable request logging
LOG_REQUESTS=true
```
We welcome contributions! Here's how to get started:
```bash
# Clone the repository
git clone https://github.com/anuragarwalkar/ai-workflow-utils.git
cd ai-workflow-utils

# Install dependencies
npm install

# Set up environment
cp .env.example .env
cp ui/.env.example ui/.env

# Start development server
npm run dev
```
```
ai-workflow-utils/
├── bin/      # CLI scripts
├── server/   # Backend (Node.js + Express)
├── ui/       # Frontend (React + Redux)
├── dist/     # Built files
└── docs/     # Documentation
```
Jira Ticket Creation:
```http
POST /api/jira/preview
Content-Type: application/json

{
  "prompt": "Login button not working",
  "images": ["base64-encoded-image"],
  "issueType": "Bug"
}
```
Create Pull Request Preview (Streaming):
```http
POST /api/pr/stream-preview
Content-Type: application/json

{
  "projectKey": "PROJ",
  "repoSlug": "my-repo",
  "ticketNumber": "PROJ-123",
  "branchName": "feature/my-branch"
}
```

Returns a Server-Sent Events stream with:
- status updates
- `title_chunk` events (streaming title generation)
- `title_complete` event (final title)
- `description_chunk` events (streaming description generation)
- `description_complete` event (final description)
- `complete` event (final preview data)
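A client can consume this stream with a small SSE frame parser. The sketch below assumes the server emits standard `event:`/`data:` lines separated by blank lines (the standard SSE wire format); the exact payloads emitted by `/api/pr/stream-preview` are not confirmed here:

```javascript
// Minimal SSE frame parser: splits a raw SSE text buffer into
// { event, data } objects. Illustrative only; the real payload
// shapes used by the streaming endpoint may differ.
function parseSseFrames(raw) {
  return raw
    .split("\n\n")                     // frames are separated by a blank line
    .filter(frame => frame.trim())
    .map(frame => {
      let event = "message";           // SSE default event name
      const data = [];
      for (const line of frame.split("\n")) {
        if (line.startsWith("event:")) event = line.slice(6).trim();
        else if (line.startsWith("data:")) data.push(line.slice(5).trim());
      }
      return { event, data: data.join("\n") };
    });
}

// Example: two frames as they might appear on the wire
const sample =
  "event: title_chunk\ndata: Add user\n\n" +
  "event: title_complete\ndata: Add user authentication\n\n";
console.log(parseSseFrames(sample));
```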
Create Pull Request:
```http
POST /api/pr/create
Content-Type: application/json

{
  "projectKey": "PROJ",
  "repoSlug": "my-repo",
  "ticketNumber": "PROJ-123",
  "branchName": "feature/my-branch",
  "customTitle": "feat(PROJ-123): Add user authentication",
  "customDescription": "## Summary\nAdded user authentication feature\n\n## Changes Made\n- Added login component\n- Implemented JWT tokens"
}
```
GitStash PR Review:
```http
POST /api/pr/review
Content-Type: application/json

{
  "repoUrl": "https://bitbucket.company.com/projects/PROJ/repos/repo",
  "pullRequestId": "123",
  "reviewType": "security"
}
```
File Upload:
```http
POST /api/jira/upload
Content-Type: multipart/form-data

file: [binary-data]
issueKey: "PROJ-123"
```
MCP Client Management:
```http
# Get all MCP clients
GET /api/mcp/clients

# Create new MCP client
POST /api/mcp/clients
Content-Type: application/json

{
  "name": "My MCP Server",
  "url": "http://localhost:8080/mcp",
  "token": "optional-auth-token",
  "description": "Local MCP server for custom tools",
  "enabled": true
}

# Update MCP client
PUT /api/mcp/clients/:id
Content-Type: application/json

{
  "name": "Updated MCP Server",
  "enabled": false
}

# Delete MCP client
DELETE /api/mcp/clients/:id

# Test MCP client connection
POST /api/mcp/clients/:id/test
```
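For scripting against these endpoints, a small request builder can map each action to its URL and options. This helper is a sketch, not part of the package; the endpoint paths come from the list above, but the base URL and option shapes are assumptions:

```javascript
// Build fetch-compatible request descriptions for the MCP client
// endpoints listed above. Illustrative sketch only.
const BASE = "http://localhost:3000/api/mcp/clients";

function jsonOptions(method, payload) {
  return {
    method,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  };
}

function mcpRequest(action, { id, payload } = {}) {
  switch (action) {
    case "list":   return { url: BASE, options: { method: "GET" } };
    case "create": return { url: BASE, options: jsonOptions("POST", payload) };
    case "update": return { url: `${BASE}/${id}`, options: jsonOptions("PUT", payload) };
    case "remove": return { url: `${BASE}/${id}`, options: { method: "DELETE" } };
    case "test":   return { url: `${BASE}/${id}/test`, options: { method: "POST" } };
    default:       throw new Error(`Unknown action: ${action}`);
  }
}

// Example: describe a create request, then a connection test
const create = mcpRequest("create", {
  payload: { name: "My MCP Server", url: "http://localhost:8080/mcp", enabled: true },
});
const test = mcpRequest("test", { id: "abc123" });
console.log(create.url, test.url);
```

Each result can be passed straight to `fetch(result.url, result.options)` when the server is running.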
Ollama Connection Failed:
```bash
# Check if Ollama is running
ollama list

# Start the Ollama service
ollama serve

# Pull the required model
ollama pull llava
```

Jira Authentication Error:

```bash
# Test the Jira connection
curl -H "Authorization: Bearer YOUR_TOKEN" \
  https://your-company.atlassian.net/rest/api/2/myself
```

Port Already in Use:

```bash
# Use a different port
ai-workflow-utils --port 8080

# Or kill the existing process
lsof -ti:3000 | xargs kill -9
```

```bash
# Enable detailed logging
ai-workflow-utils --debug

# Check logs
tail -f logs/app.log
```
This project is licensed under the MIT License - see the LICENSE file for details.
Special thanks to the amazing open-source community and the following technologies that make this project possible:
```bash
npm install -g ai-workflow-utils
```
⭐ Star us on GitHub if this tool helps you!
📢 Share with your team and boost everyone's productivity!
Made with ❤️ by Anurag Arwalkar
Empowering developers worldwide with AI-powered workflow automation