
@lov3kaizen/agentsea-cli
Command-line interface for AgentSea ADK - Build and orchestrate AI agents from your terminal.
# Global installation
npm install -g @lov3kaizen/agentsea-cli
# Or use with npx
npx @lov3kaizen/agentsea-cli init
agentsea init
This will guide you through choosing a provider (cloud or local), entering credentials or a base URL, and setting up a default agent.
agentsea chat
Interactive chat session with your default agent.
agentsea agent run default "What is the capital of France?"
agentsea init
Initialize AgentSea CLI configuration with interactive prompts.
agentsea init
agentsea chat
Start an interactive chat session.
agentsea chat # Use default agent
agentsea chat --agent my-agent # Use specific agent
agentsea chat --model llama3 # Override model
agentsea agent
Manage agents.
# Create a new agent
agentsea agent create
# List all agents
agentsea agent list
# Get agent details
agentsea agent get <name>
# Run an agent with a message
agentsea agent run <name> "Your message"
# Set default agent
agentsea agent default <name>
# Delete an agent
agentsea agent delete <name>
agentsea provider
Manage providers.
# List all providers
agentsea provider list
# Get provider details
agentsea provider get <name>
# Add a new provider
agentsea provider add
# Set default provider
agentsea provider default <name>
# Delete a provider
agentsea provider delete <name>
agentsea model
Manage models (Ollama only).
# List available models
agentsea model list
# Pull a model from Ollama
agentsea model pull llama2
# Show popular models
agentsea model popular
agentsea config
Show current configuration.
agentsea config
# Initialize with Anthropic
agentsea init
> Cloud Provider
> Anthropic
> [Enter API Key]
# Chat with Claude
agentsea chat
# Initialize with Ollama
agentsea init
> Local Provider
> Ollama
> http://localhost:11434
# Pull a model
agentsea model pull llama2
# Chat with local model
agentsea chat
# Create a coding assistant
agentsea agent create
> Name: coder
> Model: codellama
> System Prompt: You are a coding assistant...
# Create a writer assistant
agentsea agent create
> Name: writer
> Model: llama2
> System Prompt: You are a creative writer...
# Use specific agent
agentsea chat --agent coder
Configuration is stored in:
Linux: ~/.config/agentsea-cli/config.json
macOS: ~/Library/Preferences/agentsea-cli/config.json
Windows: %APPDATA%\agentsea-cli\config.json
{
"defaultProvider": "anthropic",
"defaultAgent": "default",
"providers": {
"anthropic": {
"name": "anthropic",
"type": "anthropic",
"apiKey": "sk-ant-...",
"timeout": 60000
},
"ollama": {
"name": "ollama",
"type": "ollama",
"baseUrl": "http://localhost:11434",
"timeout": 60000
}
},
"agents": {
"default": {
"name": "default",
"description": "Default agent",
"model": "claude-sonnet-4-20250514",
"provider": "anthropic",
"systemPrompt": "You are a helpful assistant.",
"temperature": 0.7,
"maxTokens": 2048
}
}
}
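Because the configuration is plain JSON, you can read values from it in scripts. A minimal sketch using Python's standard json module (the Linux config path is assumed; adjust for your platform):

```shell
# Print the default provider from the config file (Linux path assumed).
python3 -c 'import json, sys; print(json.load(open(sys.argv[1]))["defaultProvider"])' \
  ~/.config/agentsea-cli/config.json
```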
API keys can also be set via environment variables:
export ANTHROPIC_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here
export GEMINI_API_KEY=your_key_here
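In POSIX shells you can also scope a key to a single invocation by prefixing the command with the assignment; the variable never leaks into the surrounding session. The sketch below uses `sh -c 'echo ...'` as a stand-in for an agentsea command:

```shell
# The assignment applies only to this one child process.
ONE_OFF_KEY=demo-value sh -c 'echo "inside: $ONE_OFF_KEY"'
# Afterwards the variable is not set in the current shell:
echo "after: ${ONE_OFF_KEY:-unset}"
```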
The CLI has first-class support for Ollama:
# Make sure Ollama is running
ollama serve
# Pull popular models
agentsea model pull llama2
agentsea model pull mistral
agentsea model pull codellama
# List available models
agentsea model list
# Show popular models
agentsea model popular
# Chat with local model
agentsea chat
Create an alias for quick access:
alias ac="agentsea chat"
Set up multiple providers for different use cases:
agentsea provider add
> Name: anthropic-prod
> Type: Anthropic
agentsea provider add
> Name: ollama-dev
> Type: Ollama
Create agents for specific tasks:
# Coding agent
agentsea agent create
> Name: code
> Model: codellama
> System Prompt: You are an expert programmer...
# Writing agent
agentsea agent create
> Name: write
> Model: llama2
> System Prompt: You are a creative writer...
# Use them
agentsea chat --agent code
agentsea chat --agent write
Get detailed output:
agentsea agent run default "Hello" --verbose
Run agentsea init to set up your first provider.
List providers: agentsea provider list
Add provider: agentsea provider add
List agents: agentsea agent list
Create agent: agentsea agent create
Make sure Ollama is running:
ollama serve
Check the base URL in your provider configuration:
agentsea provider get ollama
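Before debugging the CLI configuration, it can help to confirm the Ollama server is reachable at all; Ollama's HTTP API listens on http://localhost:11434 by default, and its root endpoint answers when the server is up:

```shell
# Probe the Ollama base URL with a short timeout.
if curl -fsS --max-time 2 http://localhost:11434/ >/dev/null 2>&1; then
  echo "Ollama is reachable"
else
  echo "Ollama is NOT reachable; start it with: ollama serve"
fi
```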
Pull the model first:
agentsea model pull llama2
# Install dependencies
pnpm install
# Build the CLI
pnpm build
# Link for local testing
pnpm link --global
# Test commands
agentsea --help
MIT License - see LICENSE for details
Built with ❤️ by lovekaizen