# Xyne CLI

A powerful AI assistant in your terminal with file operations, bash mode, drag-and-drop support, and multi-provider AI integration.


Xyne CLI brings the power of modern AI assistants directly to your terminal with an intuitive interface, rich file handling capabilities, and seamless integration with multiple AI providers.
## ✨ Features

### 🎯 Core Capabilities

- **Interactive Chat Interface** - Beautiful terminal UI powered by Ink
- **Multi-Provider AI Support** - Works with Vertex AI and LiteLLM
- **File Operations** - Read, write, edit, and search files with AI assistance
- **Conversation Management** - Save, load, and resume conversations
- **Smart Context Management** - Automatic conversation compacting when context limits are reached
### 🛠️ Advanced Features

- **Bash Mode** - Execute shell commands with the `!` prefix
- **File Attachments** - Drag & drop files or use the `@filename` syntax
- **Long Paste Support** - Intelligent handling of large text pastes
- **Word Navigation** - Option+Arrow keys for word jumping and deletion
- **Command System** - Built-in `/help`, `/clear`, `/export`, and more
- **MCP Integration** - Model Context Protocol support for extensible tools
### 📁 File Handling

- **Drag & Drop Support** - Drop files directly into the terminal
- **File Type Detection** - Automatic detection of images, PDFs, code, and text files
- **Multiple Format Support** - Images (PNG, JPG, GIF, WebP), PDFs, and text files up to 25MB
- **Smart File Paths** - Use `@filepath` to reference files in conversations
### ⌨️ Productivity Features

- **Bash Mode** - Type `!` to execute shell commands directly
- **Input History** - Use ↑/↓ arrows to navigate through previous inputs
- **File Browser** - Type `@` to browse and select files interactively
- **Message Queue** - Queue messages while the AI is processing
- **Interruption Support** - Double ESC to interrupt AI processing
## 📦 Installation

### Global Installation (Recommended)

```bash
npm install -g @xyne/xyne-cli
```

### Verify Installation

```bash
xyne --version
```

### Quick Start

```bash
# Start an interactive chat session
xyne

# Start with debug logging enabled
xyne --debug

# Resume a saved conversation
xyne --load=path/to/conversation.json

# Show help
xyne --help

# Update to the latest version
xyne update
```
## 🔧 Configuration

### AI Provider Setup

#### Vertex AI (Default)

```bash
export VERTEX_PROJECT_ID="your-project-id"
export VERTEX_REGION="us-east5"
export VERTEX_MODEL="claude-sonnet-4@20250514"
export VERTEX=1
```

Note: Vertex AI is the default provider. If no environment variables are set, the following defaults are used:

- Project ID: `dev-ai-epsilon`
- Region: `us-east5`
- Model: `claude-sonnet-4@20250514`
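Xyne resolves these defaults internally; as an illustration only, the fallback behavior above can be sketched in plain shell:

```shell
# Sketch: resolve effective Vertex AI settings, falling back to the documented defaults
PROJECT_ID="${VERTEX_PROJECT_ID:-dev-ai-epsilon}"
REGION="${VERTEX_REGION:-us-east5}"
MODEL="${VERTEX_MODEL:-claude-sonnet-4@20250514}"
echo "project=$PROJECT_ID region=$REGION model=$MODEL"
```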
Supported Vertex AI Models:

- `claude-sonnet-4@20250514` - Claude Sonnet 4 with thinking capabilities
- `gemini-2.5-pro` - Google Gemini 2.5 Pro with enhanced reasoning
#### LiteLLM (Alternative)

```bash
export LITE_LLM_API_KEY="your-api-key"
export LITE_LLM_URL="https://api.openai.com/v1"
export LITE_LLM_MODEL="gpt-4"
export LITE_LLM=1
```

Supported LiteLLM Models:

- Hugging Face: `glm-45-fp8`, `glm-46-fp8`
- Anthropic: `claude-sonnet-4`, `claude-sonnet-4-20250514`, `claude-sonnet-4-5`
- Google: `gemini-2.5-pro`
### MCP Servers

```bash
# Add a stdio server via a shell wrapper
xyne mcp add context7 sh -c "npx -y @upstash/context7-mcp 2>/dev/null"

# Add a stdio server with arguments
xyne mcp add filesystem npx @modelcontextprotocol/server-filesystem /path/to/directory

# Add a server over HTTP transport
xyne mcp add deepwiki --transport=http --url=https://mcp.deepwiki.com/mcp

# Add a server with environment variables
xyne mcp add myserver npx my-mcp-server --env=API_KEY=secret --env=DEBUG=true

# Add a server from a JSON definition
xyne mcp add-json github '{"command":"docker","args":["run","-i","--rm","ghcr.io/github/github-mcp-server"]}'

# Add a server globally
xyne mcp add context7 sh -c "npx -y @upstash/context7-mcp 2>/dev/null" --global

# Inspect and manage configured servers
xyne mcp list
xyne mcp get filesystem
xyne mcp remove filesystem
```
## 🚀 Usage

### Interactive Chat

```bash
xyne
```

```
> Hello! How can I help you today?
> What files are in my current directory?
> /help
```
### One-Shot Prompts

```bash
# Basic one-shot prompt
xyne prompt "What is the capital of France?"

# Custom system prompt (long and short flag forms)
xyne prompt "Help me code" --system-prompt="You are a senior software engineer"
xyne prompt "Analyze this" --system="You are concise and direct"

# Restrict the tools available to the model
xyne prompt "Read and analyze files" --tools=read,grep,ls
xyne prompt "File operations only" --tools="read,write,edit"

# Combine a system prompt with a tool list
xyne prompt "Help me debug" --system="You are helpful" --tools=read,grep,bash

# Read input from stdin
echo "Analyze this code" | xyne prompt
cat file.txt | xyne prompt "Summarize this content"

# Flags may appear before or after the prompt text
xyne prompt "Hello world" --system="Be helpful"
xyne prompt --system="Be helpful" "Hello world"
xyne prompt --tools=ls "List files" --system="Be concise"
```
### Available Tools

When using `--tools`, you can specify any combination of:

- `read` - Read files
- `write` - Write files
- `edit` - Edit files
- `multiedit` - Multiple file edits
- `grep` - Search patterns in files
- `glob` - File pattern matching
- `ls` - Directory listing
- `bash` - Execute shell commands
- `todo-write` - Task management
### Bash Mode

```
> !ls -la
> !git status
> !npm install express
> /bash ls -la
```
### File Operations

```
> Can you read @README.md and summarize it?
>
> I just attached an image, what do you see?
> Create a new Python script that calculates Fibonacci numbers
> Edit @script.py to add error handling
```
### Advanced Features

```bash
# Resume a saved conversation
xyne --load=conversation.json

# Enable debug logging
xyne --debug
```

```
> /export conversation.md
> /export conversation.json
> /clear
```
### Keyboard Shortcuts

- **↑/↓ Arrows** - Navigate input history
- **Option + ←/→** - Jump between words
- **Option + Backspace/Delete** - Delete words
- **Ctrl + C** - Interrupt (press twice to exit)
- **Double ESC** - Interrupt AI processing or show message selection
- **`@`** - Open the file browser
- **`!`** - Enter bash mode
## 🎨 File Support

### Supported File Types

| Type | Extensions | Max Size | Capabilities |
|------|------------|----------|--------------|
| Images | `.png`, `.jpg`, `.jpeg`, `.gif`, `.webp`, `.svg`, `.bmp`, `.ico` | 10MB | Visual analysis, OCR |
| PDFs | `.pdf` | 25MB | Text extraction, analysis |
| Text | `.txt`, `.md`, `.json`, `.yaml`, `.yml`, `.html`, `.css`, `.xml` | 5MB | Full content analysis |
| Code | `.js`, `.ts`, `.jsx`, `.tsx`, `.py`, `.go`, `.rs`, `.java`, `.cpp`, `.c`, `.php`, `.rb`, `.swift`, `.kt` | 5MB | Syntax highlighting, analysis |
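As a rough illustration, the per-type size limits in the table can be expressed as a small shell helper (a sketch only; the function name `limit_for` is hypothetical, and only the limit values come from the table):

```shell
# Hypothetical helper mapping a filename to its documented size limit in bytes
limit_for() {
  case "$1" in
    *.png|*.jpg|*.jpeg|*.gif|*.webp|*.svg|*.bmp|*.ico) echo $((10 * 1024 * 1024)) ;;  # images: 10MB
    *.pdf) echo $((25 * 1024 * 1024)) ;;                                              # PDFs: 25MB
    *) echo $((5 * 1024 * 1024)) ;;                                                   # text/code: 5MB
  esac
}

limit_for photo.png   # prints 10485760
limit_for report.pdf  # prints 26214400
```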
**Additional Support:**

- Files without extensions (e.g., `.gitignore`, `Dockerfile`, `Makefile`)
- Common configuration files automatically detected as text
- Content-based detection for unknown file types
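Content-based detection is commonly done with a heuristic along these lines (a sketch, not Xyne's actual implementation): treat a file as text if its first kilobyte contains no NUL bytes.

```shell
# Heuristic sketch: a file is probably text if its first 1KB has no NUL bytes
is_probably_text() {
  total=$(head -c 1024 "$1" | wc -c)
  nonnul=$(head -c 1024 "$1" | tr -d '\0' | wc -c)
  [ "$total" -eq "$nonnul" ]
}

printf 'hello world\n' > /tmp/sample.txt
is_probably_text /tmp/sample.txt && echo "text" || echo "binary"  # prints "text"
```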
### File Attachment Methods

- **Drag & Drop** - Drop files directly into the terminal
- **File References** - Use `@filename` or `@path/to/file`
- **File Browser** - Type `@` to browse and select files
- **Clipboard** - Pasted file paths are automatically detected
## Areas for Contribution

- **New AI Providers** - Add support for additional AI providers
- **File Handlers** - Support for new file types and formats
- **UI Improvements** - Enhance the terminal interface
- **Performance** - Optimize conversation handling and file processing
- **Documentation** - Improve guides and examples
- **Testing** - Add comprehensive test coverage
## Bug Reports

When reporting bugs, please include:

- Your operating system and Node.js version
- Steps to reproduce the issue
- Expected vs. actual behavior
- Any error messages or logs (run with the `--debug` flag)
## Feature Requests

For feature requests, please:

- Search existing issues first
- Provide a clear description of the feature
- Explain the use case and benefits
- Consider contributing the feature yourself!
## 📚 Documentation

### Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `VERTEX_PROJECT_ID` | Google Cloud project ID | `dev-ai-epsilon` |
| `VERTEX_REGION` | Vertex AI region | `us-east5` |
| `VERTEX_MODEL` | Vertex AI model | `claude-sonnet-4@20250514` |
| `LITE_LLM_API_KEY` | LiteLLM API key | - |
| `LITE_LLM_URL` | LiteLLM base URL | - |
| `LITE_LLM_MODEL` | LiteLLM model name | - |
| `LOG_LEVEL` | Logging level | `off` |
### Interactive Commands

| Command | Description |
|---------|-------------|
| `/help` | Show available commands |
| `/clear` | Clear conversation history |
| `/export` | Export the conversation |
| `/mcp` | List MCP servers |
| `/exit` | Exit the application |
### CLI Commands

| Command | Description |
|---------|-------------|
| `xyne` | Start interactive chat |
| `xyne prompt <text>` | Execute a one-shot prompt |
| `xyne mcp add <name> [options...]` | Add an MCP server |
| `xyne mcp remove <name>` | Remove an MCP server |
| `xyne mcp list` | List all MCP servers |
| `xyne mcp get <name>` | Get MCP server details |
| `xyne mcp add-json <name> <json>` | Add an MCP server from JSON |
| `xyne config` | Show current configuration |
| `xyne update` | Update to the latest version |
| `xyne --help` | Show help information |
| `xyne --version` | Show version information |
## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

## 🙏 Acknowledgments

- Built with Ink for beautiful terminal UIs
- Powered by React for component architecture
- Supports Vertex AI and LiteLLM
- Inspired by the needs of developers who live in the terminal

Made with ❤️ by the Xyne Team