
@xyne/xyne-cli
Xyne CLI - A powerful AI assistant in your terminal with file operations, bash mode, drag-and-drop support, and multi-provider AI integration

Xyne CLI brings the power of modern AI assistants directly to your terminal with an intuitive interface, rich file handling capabilities, and seamless integration with multiple AI providers.
Key features:

- `@filepath` to reference files in conversations
- `!` prefix to execute shell commands directly
- `@` to browse and select files interactively
- Slash commands such as `/help`, `/clear`, `/export`, and more

Install globally via npm and verify the installation:

```bash
npm install -g @xyne/xyne-cli
xyne --version
```

```bash
# Start interactive session
xyne
# Start with debug logging
xyne --debug
# Load a previous conversation
xyne --load=path/to/conversation.json
# Show help
xyne --help
# Perform update
xyne update
```

```bash
# Set up Vertex AI (requires Google Cloud SDK)
export VERTEX_PROJECT_ID="your-project-id"
export VERTEX_REGION="us-east5"
export VERTEX_MODEL="claude-sonnet-4@20250514"
export VERTEX=1
```

Note: Vertex AI is the default provider. If no environment variables are set, the system uses these defaults:

- `VERTEX_PROJECT_ID`: `dev-ai-epsilon` (default)
- `VERTEX_REGION`: `us-east5` (default)
- `VERTEX_MODEL`: `claude-sonnet-4@20250514` (default)

Supported Vertex AI Models:

- `claude-sonnet-4@20250514` - Claude Sonnet 4 with thinking capabilities
- `gemini-2.5-pro` - Google Gemini 2.5 Pro with enhanced reasoning

```bash
# Set up LiteLLM for other providers
export LITE_LLM_API_KEY="your-api-key"
export LITE_LLM_URL="https://api.openai.com/v1"
export LITE_LLM_MODEL="gpt-4"
export LITE_LLM=1
```
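
As one end-to-end example, you can point Xyne at a locally running LiteLLM proxy. This is a sketch, not from the Xyne docs: it assumes the LiteLLM proxy is installed (`pip install 'litellm[proxy]'`) and that the proxy's upstream provider key is already configured in your environment.

```bash
# Sketch: route Xyne through a local LiteLLM proxy (assumes litellm[proxy]
# is installed and the upstream provider key, e.g. OPENAI_API_KEY, is set).
litellm --model gpt-4 --port 4000 &

export LITE_LLM_API_KEY="sk-anything"        # placeholder; adjust if your proxy enforces a key
export LITE_LLM_URL="http://localhost:4000"
export LITE_LLM_MODEL="gpt-4"
export LITE_LLM=1
xyne prompt "Confirm you are reachable through the proxy"
```
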
Supported LiteLLM Models:

- `glm-45-fp8`, `glm-46-fp8`
- `claude-sonnet-4`, `claude-sonnet-4-20250514`, `claude-sonnet-4-5`
- `gemini-2.5-pro`

```bash
# Add command-based MCP servers
xyne mcp add context7 sh -c "npx -y @upstash/context7-mcp 2>/dev/null"
xyne mcp add filesystem npx @modelcontextprotocol/server-filesystem /path/to/directory
# Add HTTP transport MCP server
xyne mcp add deepwiki --transport=http --url=https://mcp.deepwiki.com/mcp
# Add with environment variables
xyne mcp add myserver npx my-mcp-server --env=API_KEY=secret --env=DEBUG=true
# Add from JSON configuration
xyne mcp add-json github '{"command":"docker","args":["run","-i","--rm","ghcr.io/github/github-mcp-server"]}'
# Add to global configuration
xyne mcp add context7 sh -c "npx -y @upstash/context7-mcp 2>/dev/null" --global
# List configured MCP servers
xyne mcp list
# Get details about a server
xyne mcp get filesystem
# Remove MCP server
xyne mcp remove filesystem
```
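
For `add-json`, the JSON follows the common MCP server-config shape: a `command`, an `args` array, and optionally an `env` object. The `env` key here is an assumption based on that convention (the documented route in Xyne is the `--env` flag shown above); a hedged sketch:

```bash
# Sketch: the "env" object is an assumption; the documented path is --env.
xyne mcp add-json github '{
  "command": "docker",
  "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
  "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>"}
}'
```
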

```bash
# Start a conversation
xyne
> Hello! How can I help you today?
# Ask questions
> What files are in my current directory?
# Get help with commands
> /help
# Basic prompt
xyne prompt "What is the capital of France?"
# Prompt with custom system prompt
xyne prompt "Help me code" --system-prompt="You are a senior software engineer"
xyne prompt "Analyze this" --system="You are concise and direct"
# Prompt with specific tools only
xyne prompt "Read and analyze files" --tools=read,grep,ls
xyne prompt "File operations only" --tools="read,write,edit"
# Combine system prompt and tools
xyne prompt "Help me debug" --system="You are helpful" --tools=read,grep,bash
# Prompt from piped input
echo "Analyze this code" | xyne prompt
cat file.txt | xyne prompt "Summarize this content"
# Flexible argument order (all equivalent)
xyne prompt "Hello world" --system="Be helpful"
xyne prompt --system="Be helpful" "Hello world"
xyne prompt --tools=ls "List files" --system="Be concise"
```
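
Because `xyne prompt` reads piped input, it slots into ordinary shell pipelines. A small sketch (assumes a git repository; it uses only the documented stdin behavior and `--system` flag):

```bash
# Sketch: summarize staged changes non-interactively.
git diff --cached | xyne prompt "Summarize these staged changes" --system="Be concise and direct"
```
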
When using --tools, you can specify any combination of:

- `read` - Read files
- `write` - Write files
- `edit` - Edit files
- `multiedit` - Multiple file edits
- `grep` - Search patterns in files
- `glob` - File pattern matching
- `ls` - Directory listing
- `bash` - Execute shell commands
- `todo-write` - Task management

```bash
# Execute shell commands with ! prefix
> !ls -la
> !git status
> !npm install express
# Or use the bash command
> /bash ls -la
# Reference files in conversation
> Can you read @README.md and summarize it?
# Drag and drop files (automatically detected)
> # Drop a file into terminal
> I just attached an image, what do you see?
# Create and edit files
> Create a new Python script that calculates fibonacci numbers
> Edit @script.py to add error handling
# Load previous conversation
xyne --load=conversation.json
# Export conversation
> /export conversation.md
> /export conversation.json
# Clear conversation
> /clear
# Enable debug mode
xyne --debug
```

| Type | Extensions | Max Size | Features |
|---|---|---|---|
| Images | .png, .jpg, .jpeg, .gif, .webp, .svg, .bmp, .ico | 10MB | Visual analysis, OCR |
| PDFs | .pdf | 25MB | Text extraction, analysis |
| Text | .txt, .md, .json, .yaml, .yml, .html, .css, .xml | 5MB | Full content analysis |
| Code | .js, .ts, .jsx, .tsx, .py, .go, .rs, .java, .cpp, .c, .php, .rb, .swift, .kt | 5MB | Syntax highlighting, analysis |

Additional Support:

- Files without extensions (.gitignore, Dockerfile, Makefile)
- @filename or @path/to/file to reference files in conversations
- @ to browse and select files interactively

When reporting bugs, please include debug logs (run with the --debug flag). For feature requests, please:

Environment variables:

| Variable | Description | Default |
|---|---|---|
| `VERTEX_PROJECT_ID` | Google Cloud Project ID | `dev-ai-epsilon` |
| `VERTEX_REGION` | Vertex AI region | `us-east5` |
| `VERTEX_MODEL` | Vertex AI model | `claude-sonnet-4@20250514` |
| `LITE_LLM_API_KEY` | LiteLLM API key | - |
| `LITE_LLM_URL` | LiteLLM base URL | - |
| `LITE_LLM_MODEL` | LiteLLM model name | - |
| `LOG_LEVEL` | Logging level | `off` |
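
To persist a provider setup across shell sessions, you can export these variables from your shell profile. A minimal sketch using only the documented variables (substitute your own project ID):

```bash
# Sketch: persist Vertex AI configuration in your shell profile.
cat >> ~/.zshrc <<'EOF'
export VERTEX_PROJECT_ID="your-project-id"
export VERTEX_REGION="us-east5"
export VERTEX_MODEL="claude-sonnet-4@20250514"
export VERTEX=1
EOF
```
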
Slash commands (available inside an interactive session):

| Command | Description |
|---|---|
| `/help` | Show available commands |
| `/clear` | Clear conversation history |
| `/export` | Export conversation |
| `/mcp` | List MCP servers |
| `/exit` | Exit the application |

CLI commands:

| Command | Description |
|---|---|
| `xyne` | Start interactive chat |
| `xyne prompt <text>` | Execute one-shot prompt |
| `xyne mcp add <name> [options...]` | Add MCP server |
| `xyne mcp remove <name>` | Remove MCP server |
| `xyne mcp list` | List all MCP servers |
| `xyne mcp get <name>` | Get MCP server details |
| `xyne mcp add-json <name> <json>` | Add MCP server from JSON |
| `xyne config` | Show current configuration |
| `xyne update` | Update to latest version |
| `xyne --help` | Show help information |
| `xyne --version` | Show version information |
This project is licensed under the MIT License - see the LICENSE file for details.
Made with ❤️ by the Xyne Team