
@celp/agent-fleet
A comprehensive platform for creating, managing, and deploying standalone AI agents with tool support and MCP server integration.
Clone, install, and build:

```shell
git clone <repository>
npm install
npm run build
```

Then create a `.env` file with your OpenAI API key:

```shell
cp .env.example .env
# Edit .env and add your OpenAI API key
```
CLI Features:
```shell
# Create a standalone agent repository (dependencies auto-installed)
npm run dev -- agent create-repo -n "my-assistant" -m "gpt-5" -i "You are a helpful assistant" --mcp-servers "google-workspace"

# Ready to use immediately! Chat with your agent:
npm run dev -- chat --agent "my-assistant"

# Or run directly from the agent directory:
cd ~/.agent-fleet/agents/my-assistant/
npm run dev     # Interactive development mode

# Or build and deploy
npm run build
npm start       # HTTP server mode

# Create an agent in the platform database (legacy)
npm run dev -- agent add --interactive

# Chat through the platform
npm run dev -- chat --agent my-assistant
```
Once you're chatting with an agent, try these examples to see tool execution in action:

```shell
"Calculate 25 * 4 + 10"                  # Uses calculator tool
"Read the package.json file"             # Uses file reading tool
"List files in the current directory"    # Uses directory listing
"What's the current date?"               # Uses shell command
"Search for TypeScript best practices"   # Uses web search tool
```

Note: Tools are executed automatically when the agent determines they're needed. Use the `--verbose` flag to see detailed tool inputs/outputs.
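The automatic tool execution described above follows a common dispatch pattern: the model names a tool and supplies arguments, and the runtime looks the tool up and runs it. The sketch below is not the platform's actual internals, just a minimal TypeScript illustration of that pattern; the `calculator` tool and its argument shape are assumptions for the example.

```typescript
// Minimal tool-dispatch sketch (illustrative, not the platform's real internals).
type Tool = {
  name: string;
  execute: (args: Record<string, unknown>) => unknown;
};

const tools: Tool[] = [
  {
    name: 'calculator',
    // Toy evaluator for an "a * b + c" request; a real tool would parse an expression safely.
    execute: ({ a, b, c }) => (a as number) * (b as number) + (c as number),
  },
];

// The runtime receives a tool call chosen by the model and dispatches it by name.
function dispatch(toolName: string, args: Record<string, unknown>): unknown {
  const tool = tools.find((t) => t.name === toolName);
  if (!tool) throw new Error(`Unknown tool: ${toolName}`);
  return tool.execute(args);
}

console.log(dispatch('calculator', { a: 25, b: 4, c: 10 })); // 110
```

The same lookup-by-name step is what makes the `enabledTools` allowlist (described later) enforceable: tools absent from the registry simply cannot be dispatched.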
The platform uses a distributed architecture designed for scalability:

```
┌─ AI Agent CLI Platform ─────┐      ┌─ ai-agent-runtime ───┐
│ ├─ Agent Builder ───────────┼─────▶│ ├─ AgentRuntime      │
│ ├─ Template System          │      │ ├─ Built-in Tools    │
│ ├─ Database (SQLite)        │      │ ├─ MCP Integration   │
│ └─ Repository Manager       │      │ └─ Tool-Aware AI     │
└──────────────┬──────────────┘      └──────────┬───────────┘
               │                                │
       ┌───────▼───────┐                        │
       │ Configuration │                        │
       └───────────────┘                        │
                                                │
                                       ┌────────▼───────┐
                                       │   Standalone   │
                                       │     Agents     │
                                       │ ├─ Lambda      │
                                       │ ├─ Docker      │
                                       │ ├─ HTTP Server │
                                       │ └─ Custom      │
                                       └────────────────┘
```
Create a `.env` file in the project root:

```shell
OPENAI_API_KEY=your-openai-api-key
AI_MODEL=gpt-5
```

Each agent has its own `.env` file:

```shell
# In ~/.agent-fleet/agents/my-agent/.env
OPENAI_API_KEY=your-openai-api-key
NODE_ENV=development
PORT=3000
```
To avoid setting up AWS credentials for each agent individually, create a base `.env.lambda` file in the project root:

```shell
# Copy the template and fill in your AWS credentials
cp .env.lambda.example .env.lambda
```

```shell
# In agent-fleet/.env.lambda
AWS_ACCESS_KEY_ID=your-aws-access-key-id
AWS_SECRET_ACCESS_KEY=your-aws-secret-access-key
AWS_REGION=us-east-1
AWS_ACCOUNT_ID=123456789012
ECR_REGISTRY=123456789012.dkr.ecr.us-east-1.amazonaws.com
LAMBDA_ROLE_ARN=arn:aws:iam::123456789012:role/lambda-execution-role
```

When you generate new agents, they'll automatically inherit these AWS credentials. Existing agents can sync credentials using:

```shell
cd ~/.agent-fleet/agents/my-agent
npm run sync:aws-creds
```
Add agent-specific tools in `src/custom-tools.ts`:

```typescript
import { z } from 'zod';
import type { ToolDefinition } from 'ai-agent-runtime';

export async function getCustomTools(): Promise<ToolDefinition[]> {
  return [
    {
      name: 'weather_check',
      description: 'Check current weather for a location',
      parameters: z.object({
        location: z.string().describe('City name')
      }),
      execute: async ({ location }) => {
        // Your weather API integration
        return { temperature: 72, condition: 'sunny' };
      }
    }
  ];
}
```
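Because each tool's `execute` is just an async function, it can be exercised directly in a unit test before being wired into the runtime. A self-contained sketch of that idea, with a plain type standing in for the runtime's `ToolDefinition` so the snippet needs no dependencies (the stubbed weather values are assumptions):

```typescript
// Self-contained stand-in for a tool definition (the real ToolDefinition
// comes from ai-agent-runtime; this avoids the zod dependency).
type SimpleTool = {
  name: string;
  execute: (args: { location: string }) => Promise<{ temperature: number; condition: string }>;
};

const weatherCheck: SimpleTool = {
  name: 'weather_check',
  // Stubbed integration: a real implementation would call a weather API here.
  execute: async (_args) => ({ temperature: 72, condition: 'sunny' }),
};

// Exercise the tool directly, as a unit test might, before registering it.
weatherCheck.execute({ location: 'Lisbon' }).then((result) => {
  console.log(result.condition); // sunny
});
```

Testing tools in isolation this way keeps API-integration bugs out of interactive chat sessions.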
```shell
# HTTP Server
npm run build && npm start

# Docker
npm run deploy:docker

# AWS Lambda
npm run deploy:lambda

# Development
npm run dev
```
Connect to external services using MCP servers:

```yaml
# agent.yaml
mcpServers:
  - name: filesystem
    url: npx @modelcontextprotocol/server-filesystem /allowed/path
  - name: database
    url: node ./custom-mcp-server.js
    env:
      DB_CONNECTION: "postgresql://..."
```
```shell
# Build CLI platform
npm run build
npm run typecheck

# Build runtime package
cd packages/runtime/
npm run build
```

```shell
# In agent directory
cd ~/.agent-fleet/agents/my-agent/

# Development mode
npm run dev

# Build for production
npm run build

# Type checking
npm run typecheck

# Validate configuration
npm run validate
```

```shell
# Update runtime package
cd packages/runtime/
npm version minor
npm publish

# Update specific agents
cd ~/.agent-fleet/agents/my-agent/
npm update ai-agent-runtime
npm run build && npm run deploy
```
Available templates for quick agent creation:

```shell
# Create from built-in template
npm run dev -- agent create-repo --template "sample-developer-assistant" -n "my-agent"

# Interactive template selection
npm run dev -- agent create-repo --interactive
```

⚠️ Note: The template and migrate commands were removed from the CLI:

- `npm run dev -- template [subcommand]` ❌ DELETED (files removed)
- `npm run dev -- migrate [subcommand]` ❌ DELETED (files removed)

Use `agent create-repo` with built-in templates instead.
Platform team updates the runtime:

```shell
cd packages/runtime/
# Add new tools/features
npm version minor   # 1.0.0 → 1.1.0
npm publish
```

Agent owners update selectively:

```shell
cd ~/.agent-fleet/agents/my-agent/
npm install ai-agent-runtime@1.1.0
npm run build && npm run deploy
```

Bulk updates (optional):

```shell
npm run dev -- agent bulk-update --runtime-version 1.1.0
```
```typescript
// Automatically included in each agent
import { createAgentRuntime, loadManifestFromFile } from 'ai-agent-runtime';
import { startServer } from './server.js';

const manifest = await loadManifestFromFile('agent.yaml');
const runtime = await createAgentRuntime(manifest, customTools);
await startServer(runtime, 3000);
```
```shell
# Deploy with streaming support
npm run deploy:lambda:complete

# Test your Lambda function
curl -X POST https://your-function-url.lambda-url.region.on.aws/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, Lambda!"}'
```
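The same smoke test can be scripted from TypeScript. A sketch that mirrors the curl command above, keeping the request construction pure so it can be unit-tested without network access; the `/chat` path and `message` field are taken from that example, and `buildChatRequest` is a helper name of our own, not part of the runtime:

```typescript
// Build the JSON payload and headers for the /chat endpoint, mirroring the
// curl command above. Keeping this pure makes it easy to unit-test offline.
function buildChatRequest(message: string): {
  method: 'POST';
  headers: Record<string, string>;
  body: string;
} {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message }),
  };
}

// With Node 18+ (global fetch), the smoke test becomes:
//   const res = await fetch('https://your-function-url.lambda-url.region.on.aws/chat',
//                           buildChatRequest('Hello, Lambda!'));
//   console.log(await res.text());

console.log(buildChatRequest('Hello, Lambda!').body);
```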
```dockerfile
# Automatically included in each agent
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
```
Each agent ships with built-in tools (the chat examples above exercise the calculator, file reading, directory listing, shell, and web search tools). Customize tool behavior in your `agent.yaml`:
```yaml
# Enable specific tools
enabledTools:
  - web_search
  - calculator
  - analyze_pdf

# Configure individual tools
toolConfigurations:
  web_search:
    model: gpt-5-mini        # Override model for web searches
    maxTokens: 3000          # Limit response length
    reasoning_effort: medium # For reasoning models (minimal/low/medium/high)
  analyze_pdf:
    model: gpt-5             # Model for PDF analysis
    maxTokens: 8000          # Token limit for analysis
```
Available configurations:

- `web_search`: `model` (defaults to agent model), `maxTokens` (default 5000), `reasoning_effort`
- `analyze_pdf`: `model` (defaults to agent model), `maxTokens`
Complete configuration guide: AGENT-CONFIGURATION.md
Contributions are welcome! Please read the contributing guidelines before submitting PRs.
MIT License - see LICENSE file for details.