Package was removed: this package is no longer available on the npm registry.

@lov3kaizen/aigency-cli

CLI tool for Aigency ADK - Build and orchestrate AI agents

Version: 0.3.0 (latest) · Source: npm · Maintainers: 1

@aigency/cli

Command-line interface for Aigency ADK - Build and orchestrate AI agents from your terminal.

License: MIT

Features

  • 🚀 Quick Setup - Initialize with interactive prompts
  • 💬 Interactive Chat - Chat with agents in your terminal
  • 🤖 Agent Management - Create, list, and manage agents
  • 🔌 Provider Support - Cloud and local providers
  • 📦 Model Management - Pull and manage Ollama models
  • ⚙️ Configuration - Persistent configuration management
  • 🎨 Beautiful Output - Colored, formatted terminal output

Installation

# Global installation
npm install -g @aigency/cli

# Or use with npx
npx @aigency/cli init

Quick Start

1. Initialize

aigency init

This will guide you through:

  • Choosing a provider (cloud or local)
  • Configuring API keys or endpoints
  • Creating a default agent

2. Start Chatting

aigency chat

Starts an interactive chat session with your default agent.

3. Run One-Off Commands

aigency agent run default "What is the capital of France?"

Commands

aigency init

Initialize Aigency CLI configuration with interactive prompts.

aigency init

aigency chat

Start an interactive chat session.

aigency chat                       # Use default agent
aigency chat --agent my-agent      # Use specific agent
aigency chat --model llama3        # Override model

aigency agent

Manage agents.

# Create a new agent
aigency agent create

# List all agents
aigency agent list

# Get agent details
aigency agent get <name>

# Run an agent with a message
aigency agent run <name> "Your message"

# Set default agent
aigency agent default <name>

# Delete an agent
aigency agent delete <name>

aigency provider

Manage providers.

# List all providers
aigency provider list

# Get provider details
aigency provider get <name>

# Add a new provider
aigency provider add

# Set default provider
aigency provider default <name>

# Delete a provider
aigency provider delete <name>

aigency model

Manage models (Ollama only).

# List available models
aigency model list

# Pull a model from Ollama
aigency model pull llama2

# Show popular models
aigency model popular

aigency config

Show current configuration.

aigency config

Examples

Cloud Provider (Anthropic)

# Initialize with Anthropic
aigency init
> Cloud Provider
> Anthropic
> [Enter API Key]

# Chat with Claude
aigency chat

Local Provider (Ollama)

# Initialize with Ollama
aigency init
> Local Provider
> Ollama
> http://localhost:11434

# Pull a model
aigency model pull llama2

# Chat with local model
aigency chat

Multiple Agents

# Create a coding assistant
aigency agent create
> Name: coder
> Model: codellama
> System Prompt: You are a coding assistant...

# Create a writer assistant
aigency agent create
> Name: writer
> Model: llama2
> System Prompt: You are a creative writer...

# Use specific agent
aigency chat --agent coder

Configuration

Configuration is stored in:

  • Linux: ~/.config/aigency-cli/config.json
  • macOS: ~/Library/Preferences/aigency-cli/config.json
  • Windows: %APPDATA%\aigency-cli\config.json
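
The per-OS lookup above can be sketched as a small shell snippet. This is a convenience sketch, not part of the CLI; the XDG_CONFIG_HOME fallback on Linux is an assumption, not documented behavior.

```shell
# Sketch: resolve the aigency-cli config path on the current platform.
case "$(uname -s)" in
  Linux)  config_path="${XDG_CONFIG_HOME:-$HOME/.config}/aigency-cli/config.json" ;;
  Darwin) config_path="$HOME/Library/Preferences/aigency-cli/config.json" ;;
  *)      config_path="$APPDATA/aigency-cli/config.json" ;;  # Windows shells expose %APPDATA% as $APPDATA
esac
echo "$config_path"
```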

Configuration Structure

{
  "defaultProvider": "anthropic",
  "defaultAgent": "default",
  "providers": {
    "anthropic": {
      "name": "anthropic",
      "type": "anthropic",
      "apiKey": "sk-ant-...",
      "timeout": 60000
    },
    "ollama": {
      "name": "ollama",
      "type": "ollama",
      "baseUrl": "http://localhost:11434",
      "timeout": 60000
    }
  },
  "agents": {
    "default": {
      "name": "default",
      "description": "Default agent",
      "model": "claude-sonnet-4-20250514",
      "provider": "anthropic",
      "systemPrompt": "You are a helpful assistant.",
      "temperature": 0.7,
      "maxTokens": 2048
    }
  }
}
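
Because the configuration is plain JSON, scripts can inspect it with standard tools. A minimal sketch, using a temporary stand-in file and python3's stdlib (jq works equally well):

```shell
# Sketch: read defaultAgent out of a config.json-shaped file.
# The temp file is a stand-in for the real config path described above.
tmp="$(mktemp)"
cat > "$tmp" <<'EOF'
{ "defaultProvider": "anthropic", "defaultAgent": "default" }
EOF
default_agent="$(python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["defaultAgent"])' "$tmp")"
echo "default agent: $default_agent"
rm -f "$tmp"
```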

Environment Variables

API keys can also be set via environment variables:

export ANTHROPIC_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here
export GEMINI_API_KEY=your_key_here
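
A quick way to see which of these keys are set in the current shell (a convenience sketch, not a CLI feature):

```shell
# Sketch: report which provider API keys are present in the environment.
status=""
for var in ANTHROPIC_API_KEY OPENAI_API_KEY GEMINI_API_KEY; do
  if printenv "$var" >/dev/null 2>&1; then
    status="$status $var=set"
  else
    status="$status $var=unset"
  fi
done
echo "$status"
```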

Supported Providers

Cloud Providers

  • Anthropic - Claude models
  • OpenAI - GPT models
  • Google - Gemini models

Local Providers

  • Ollama - Local LLM runtime
  • LM Studio - GUI for local models
  • LocalAI - OpenAI-compatible local API

Ollama Integration

The CLI has first-class support for Ollama:

# Make sure Ollama is running
ollama serve

# Pull popular models
aigency model pull llama2
aigency model pull mistral
aigency model pull codellama

# List available models
aigency model list

# Show popular models
aigency model popular

# Chat with local model
aigency chat

Tips & Tricks

1. Quick Chat

Create an alias for quick access:

alias bc="aigency chat"

2. Multiple Providers

Set up multiple providers for different use cases:

aigency provider add
> Name: anthropic-prod
> Type: Anthropic

aigency provider add
> Name: ollama-dev
> Type: Ollama

3. Specialized Agents

Create agents for specific tasks:

# Coding agent
aigency agent create
> Name: code
> Model: codellama
> System Prompt: You are an expert programmer...

# Writing agent
aigency agent create
> Name: write
> Model: llama2
> System Prompt: You are a creative writer...

# Use them
aigency chat --agent code
aigency chat --agent write

4. Verbose Mode

Get detailed output:

aigency agent run default "Hello" --verbose

Troubleshooting

"No providers configured"

Run aigency init to set up your first provider.

"Provider not found"

List providers: aigency provider list

Add provider: aigency provider add

"Agent not found"

List agents: aigency agent list

Create agent: aigency agent create

Ollama Connection Error

Make sure Ollama is running:

ollama serve

Check the base URL in your provider configuration:

aigency provider get ollama

Model Not Found (Ollama)

Pull the model first:

aigency model pull llama2

Development

# Install dependencies
pnpm install

# Build the CLI
pnpm build

# Link for local testing
pnpm link --global

# Test commands
aigency --help

License

MIT License - see LICENSE for details

Built with ❤️ by lovekaizen

Keywords

ai

Package last updated on 02 Dec 2025