
@lov3kaizen/aigency-cli
Command-line interface for Aigency ADK - Build and orchestrate AI agents from your terminal.
```shell
# Global installation
npm install -g @aigency/cli

# Or use with npx
npx @aigency/cli init
```
```shell
aigency init
```
This will guide you through initial configuration: choosing a provider and entering its credentials or base URL.
```shell
aigency chat
```
Interactive chat session with your default agent.
```shell
aigency agent run default "What is the capital of France?"
```
`aigency init` - Initialize Aigency CLI configuration with interactive prompts.

```shell
aigency init
```
`aigency chat` - Start an interactive chat session.

```shell
aigency chat                  # Use default agent
aigency chat --agent my-agent # Use specific agent
aigency chat --model llama3   # Override model
```
`aigency agent` - Manage agents.

```shell
# Create a new agent
aigency agent create

# List all agents
aigency agent list

# Get agent details
aigency agent get <name>

# Run an agent with a message
aigency agent run <name> "Your message"

# Set default agent
aigency agent default <name>

# Delete an agent
aigency agent delete <name>
```
`aigency provider` - Manage providers.

```shell
# List all providers
aigency provider list

# Get provider details
aigency provider get <name>

# Add a new provider
aigency provider add

# Set default provider
aigency provider default <name>

# Delete a provider
aigency provider delete <name>
```
`aigency model` - Manage models (Ollama only).

```shell
# List available models
aigency model list

# Pull a model from Ollama
aigency model pull llama2

# Show popular models
aigency model popular
```
`aigency config` - Show current configuration.

```shell
aigency config
```
```shell
# Initialize with Anthropic
aigency init
> Cloud Provider
> Anthropic
> [Enter API Key]

# Chat with Claude
aigency chat
```
```shell
# Initialize with Ollama
aigency init
> Local Provider
> Ollama
> http://localhost:11434

# Pull a model
aigency model pull llama2

# Chat with local model
aigency chat
```
```shell
# Create a coding assistant
aigency agent create
> Name: coder
> Model: codellama
> System Prompt: You are a coding assistant...

# Create a writer assistant
aigency agent create
> Name: writer
> Model: llama2
> System Prompt: You are a creative writer...

# Use specific agent
aigency chat --agent coder
```
Configuration is stored in:
- Linux: `~/.config/aigency-cli/config.json`
- macOS: `~/Library/Preferences/aigency-cli/config.json`
- Windows: `%APPDATA%\aigency-cli\config.json`

```json
{
  "defaultProvider": "anthropic",
  "defaultAgent": "default",
  "providers": {
    "anthropic": {
      "name": "anthropic",
      "type": "anthropic",
      "apiKey": "sk-ant-...",
      "timeout": 60000
    },
    "ollama": {
      "name": "ollama",
      "type": "ollama",
      "baseUrl": "http://localhost:11434",
      "timeout": 60000
    }
  },
  "agents": {
    "default": {
      "name": "default",
      "description": "Default agent",
      "model": "claude-sonnet-4-20250514",
      "provider": "anthropic",
      "systemPrompt": "You are a helpful assistant.",
      "temperature": 0.7,
      "maxTokens": 2048
    }
  }
}
```
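To make the schema above concrete, here is a small TypeScript sketch (an illustration, not the CLI's actual source) of how an agent entry resolves to a provider entry: each agent names a provider, whose record supplies the API key or base URL for that backend.

```typescript
// Sketch of the config schema shown above. The interfaces mirror the
// JSON fields; resolveProvider is a hypothetical helper, not CLI code.
interface Provider {
  name: string;
  type: string;
  apiKey?: string;   // cloud providers
  baseUrl?: string;  // local providers such as Ollama
  timeout: number;
}

interface Agent {
  name: string;
  description: string;
  model: string;
  provider?: string; // falls back to defaultProvider when absent
  systemPrompt: string;
  temperature: number;
  maxTokens: number;
}

interface Config {
  defaultProvider: string;
  defaultAgent: string;
  providers: Record<string, Provider>;
  agents: Record<string, Agent>;
}

// Resolve which provider an agent will talk to, using the config-wide
// default when the agent does not name one explicitly.
function resolveProvider(cfg: Config, agentName: string): Provider {
  const agent = cfg.agents[agentName];
  if (!agent) throw new Error(`Unknown agent: ${agentName}`);
  const providerName = agent.provider ?? cfg.defaultProvider;
  const provider = cfg.providers[providerName];
  if (!provider) throw new Error(`Unknown provider: ${providerName}`);
  return provider;
}
```

With the example config above, the `default` agent resolves to the `anthropic` provider and its API key.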
API keys can also be set via environment variables:
```shell
export ANTHROPIC_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here
export GEMINI_API_KEY=your_key_here
```
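A plausible lookup order (an assumption about the CLI's behavior, not confirmed from its source) is that an explicit `apiKey` in `config.json` wins, and the matching environment variable is used as a fallback:

```typescript
// Hypothetical key-resolution helper: config value first, then the
// environment variable that matches the provider type.
const ENV_KEYS: Record<string, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  gemini: "GEMINI_API_KEY",
};

function resolveApiKey(
  providerType: string,
  configKey: string | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  return configKey ?? env[ENV_KEYS[providerType]];
}
```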
The CLI has first-class support for Ollama:
```shell
# Make sure Ollama is running
ollama serve

# Pull popular models
aigency model pull llama2
aigency model pull mistral
aigency model pull codellama

# List available models
aigency model list

# Show popular models
aigency model popular

# Chat with local model
aigency chat
```
Create an alias for quick access:
```shell
alias bc="aigency chat"
```
Set up multiple providers for different use cases:
```shell
aigency provider add
> Name: anthropic-prod
> Type: Anthropic

aigency provider add
> Name: ollama-dev
> Type: Ollama
```
Create agents for specific tasks:
```shell
# Coding agent
aigency agent create
> Name: code
> Model: codellama
> System Prompt: You are an expert programmer...

# Writing agent
aigency agent create
> Name: write
> Model: llama2
> System Prompt: You are a creative writer...

# Use them
aigency chat --agent code
aigency chat --agent write
```
Get detailed output:
```shell
aigency agent run default "Hello" --verbose
```
Run `aigency init` to set up your first provider.

- List providers: `aigency provider list`
- Add provider: `aigency provider add`
- List agents: `aigency agent list`
- Create agent: `aigency agent create`

Make sure Ollama is running:

```shell
ollama serve
```

Check the base URL in your provider configuration:

```shell
aigency provider get ollama
```

Pull the model first:

```shell
aigency model pull llama2
```
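Before digging into CLI configuration, it can help to confirm the Ollama server itself is reachable. Ollama exposes a REST endpoint, `GET /api/tags`, that lists locally pulled models; a quick probe (a sketch assuming Node 18+ with its built-in `fetch`) looks like this:

```typescript
// Probe an Ollama server at the given base URL. /api/tags is Ollama's
// endpoint for listing pulled models; any HTTP 200 means the server is up.
async function ollamaReachable(baseUrl: string): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`, {
      signal: AbortSignal.timeout(2000), // fail fast if nothing is listening
    });
    return res.ok;
  } catch {
    return false; // connection refused, DNS failure, or timeout
  }
}
```

If this returns `false` for `http://localhost:11434`, fix the server (or the base URL in your provider config) before troubleshooting the CLI.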
```shell
# Install dependencies
pnpm install

# Build the CLI
pnpm build

# Link for local testing
pnpm link --global

# Test commands
aigency --help
```
MIT License - see LICENSE for details
Built with ❤️ by lovekaizen