
lmapp
Local LLM CLI – AI everywhere. Easy, simple, and undeniable.
Online or offline. The future is yours to command.
v0.4.0 - Agent Mode. Production Ready. Fully Featured. Free.
See Demo & Features for examples and use cases.
Full installation and setup: see QUICKSTART.md. Customize your AI's behavior: see Roles & Workflows Guide.
Everyday commands:
lmapp chat # Start chatting locally
lmapp chat --agent # Start in auto-Agent Mode (Copilot-like)
lmapp server start # Start API server (Web App & VS Code)
lmapp status # Check backend/model status
Access lmapp from your browser without installing the desktop app.
lmapp server start
Then open http://localhost:8000 in your browser.
Recommended: Check out our Feature Tour to see lmapp in action!
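If you prefer to script against the server instead of using the browser, you can talk to it over HTTP. The endpoint path and payload shape below are assumptions (an OpenAI-style chat route) for illustration only; consult API_REFERENCE.md for the actual contract.

```python
import json
from urllib import request

def build_chat_request(prompt, model="mistral", base_url="http://localhost:8000"):
    """Build an HTTP request for the lmapp server.

    NOTE: the /v1/chat/completions path and payload shape are
    assumptions (OpenAI-style); see API_REFERENCE.md for the
    real API surface.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Explain quantum computing in simple terms")
# request.urlopen(req) would send it once `lmapp server start` is running
```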
We offer two ways to explore lmapp:
Explore real-world use cases including:
Turn your terminal into an autonomous coding assistant.
$ lmapp chat --agent
> Create a python script to calculate fibonacci
[AGENT] Creating fib.py...
[AGENT] Running fib.py...
$ lmapp chat --model mistral
╔════════════════════════════════════════════╗
║ Chat with Mistral (Local) ║
╚════════════════════════════════════════════╝
You: Explain quantum computing in simple terms
AI: Quantum computers use quantum bits (qubits) instead of regular bits.
While regular bits are 0 or 1, qubits can be both at once (superposition).
This lets them solve certain problems exponentially faster...
You: What are the use cases?
AI: Key use cases include:
• Drug discovery (molecular simulation)
• Finance (portfolio optimization)
• Cryptography (breaking encryption)
• Machine learning (optimization)
Turn VS Code into an AI-powered IDE with our extension.
lmapp server start
Then open http://localhost:8000 to see status and chat.
VS Code extension: lmapp-vscode (coming soon to marketplace).
$ lmapp rag index ~/my_docs
📁 Indexing documents...
✓ Processed: README.md (1,234 tokens)
✓ Processed: GUIDE.pdf (5,678 tokens)
✓ Processed: NOTES.txt (892 tokens)
✓ Index created: 7,804 tokens in 12 documents
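Under the hood, RAG indexing splits each document into overlapping chunks before embedding them. The sketch below is a toy illustration of that chunking step, not lmapp's actual implementation; real indexers split on tokens (as the counts above suggest) rather than whitespace words, and the sizes here are made up.

```python
def chunk_text(text, chunk_size=200, overlap=20):
    """Split text into overlapping word chunks.

    A toy stand-in for the chunking behind 'lmapp rag index';
    lmapp's real tokenizer and chunk sizes may differ.
    """
    words = text.split()
    chunks = []
    step = chunk_size - overlap  # slide forward, keeping `overlap` words of context
    for start in range(0, len(words), step):
        chunk = words[start:start + chunk_size]
        if chunk:
            chunks.append(" ".join(chunk))
    return chunks

doc = ("word " * 500).strip()  # a 500-word dummy document
chunks = chunk_text(doc, chunk_size=200, overlap=20)
```

The overlap ensures a sentence that straddles a chunk boundary is still retrievable from at least one chunk.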
$ lmapp rag search "how to optimize python code"
📊 Search Results (3 matches):
1. GUIDE.pdf - Line 45 (score: 0.92)
"Optimization techniques include: list comprehensions,
caching, and using built-in functions instead of loops"
2. NOTES.txt - Line 12 (score: 0.88)
"Profile code with cProfile before optimizing"
3. README.md - Line 89 (score: 0.81)
"Performance tips for production code"
$ lmapp chat --with-context
You: Summarize the best Python optimization tips from my docs
AI: Based on your documents, here are the key optimization tips:
1. Use list comprehensions instead of loops
2. Profile with cProfile before optimizing
3. Leverage built-in functions (map, filter, etc.)
4. Implement caching for expensive operations
$ lmapp batch create inputs.json
Processing 5 queries in batch...
[████████████████████] 100% (5/5)
Job created: batch_20250211_143022
Estimated time: 45 seconds
$ lmapp batch results batch_20250211_143022 --json
{
"job_id": "batch_20250211_143022",
"status": "completed",
"results": [
{"input": "Explain AI", "output": "AI is..."},
{"input": "What is ML?", "output": "Machine learning..."},
...
],
"completed_at": "2025-02-11T14:30:47Z"
}
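Because `--json` emits machine-readable output, batch results are easy to post-process in a script. A small sketch of consuming the JSON shape shown above (the sample below reproduces only the two entries displayed; the truncated remainder is omitted):

```python
import json

# Sample matching the 'lmapp batch results --json' shape shown above
raw = """
{
  "job_id": "batch_20250211_143022",
  "status": "completed",
  "results": [
    {"input": "Explain AI", "output": "AI is..."},
    {"input": "What is ML?", "output": "Machine learning..."}
  ],
  "completed_at": "2025-02-11T14:30:47Z"
}
"""

job = json.loads(raw)
# Map each input prompt to its generated output
outputs = {r["input"]: r["output"] for r in job["results"]}
```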
$ lmapp plugin list
Available Plugins:
✓ translator - Real-time translation (8 languages)
✓ summarizer - Extract key points from long text
✓ code-reviewer - Analyze code and suggest improvements
✓ sql-generator - Write SQL queries from descriptions
✓ regex-helper - Build and test regex patterns
✓ json-validator - Validate and format JSON
✓ git-helper - Explain git commands and operations
✓ api-tester - Test REST APIs interactively
$ lmapp plugin install translator
Installing translator plugin...
✓ Downloaded (245 KB)
✓ Installed successfully
Ready to use: lmapp translate --help
$ lmapp translate --text "Hello World" --to spanish
Translation (Spanish):
"¡Hola Mundo!"
$ lmapp config show
Current Configuration:
Model: mistral (7B)
Temperature: 0.7
Max Tokens: 2048
Context Size: 4096
System Prompt: You are a helpful AI assistant
$ lmapp config set temperature 0.3
✓ Configuration updated
$ lmapp config --set-prompt
Enter your custom system prompt:
> You are a Python expert. Help with code, explain concepts clearly.
✓ System prompt saved
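The settings shown by `lmapp config show` map naturally onto a small key-value file. The sketch below mirrors that shape with a plain JSON file in a temp directory; the real config location and format used by lmapp may differ, and the path here is purely for demonstration.

```python
import json
import os
import tempfile

# Hypothetical config mirroring 'lmapp config show' above
config = {
    "model": "mistral",
    "temperature": 0.7,
    "max_tokens": 2048,
    "context_size": 4096,
    "system_prompt": "You are a helpful AI assistant",
}

path = os.path.join(tempfile.gettempdir(), "lmapp_config_demo.json")
with open(path, "w") as f:
    json.dump(config, f, indent=2)

# The moral equivalent of 'lmapp config set temperature 0.3':
# read, mutate one key, write back
with open(path) as f:
    loaded = json.load(f)
loaded["temperature"] = 0.3
with open(path, "w") as f:
    json.dump(loaded, f, indent=2)
```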
$ lmapp status
Status Report:
✓ Backend: Ollama (running)
✓ Model: mistral (7.4B)
✓ Memory: 6.2 GB / 16 GB
✓ Performance: 45 tokens/sec
Scenario: Camping, road trip, flight, deployment - anywhere without internet
Question: "Why does the sky look blue?"
$ lmapp chat
> AI: Hello, how may I help you?
> You: Why does the sky look blue?
> AI: Short answer: the sky is blue because air molecules scatter
sunlight more strongly at short (blue) wavelengths than at long
(red) wavelengths — a process called Rayleigh scattering.
A bit more detail:
• Sunlight contains all visible colors
• Rayleigh scattering scales as 1/λ^4
• Blue light (~450 nm) scattered many times more than red (~650 nm)
• At sunrise/sunset, blue light scattered out → sky looks red/orange
• Clouds look white (Mie scattering of larger water droplets)
Extra neat fact: scattered sky light is partially polarized!
> AI: Is there anything else I can help you with?
Key Point: Full scientific answers WITHOUT internet
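The 1/λ^4 scaling in that answer is easy to check with two lines of arithmetic: comparing blue light (~450 nm) to red light (~650 nm) gives the relative scattering strength.

```python
# Rayleigh scattering goes as 1/wavelength^4, so the blue:red
# scattering ratio is (650/450)^4
blue_nm = 450
red_nm = 650
ratio = (red_nm / blue_nm) ** 4  # roughly 4.4x stronger for blue
```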
Scenario: Playing GTA San Andreas, need cheat codes and tips
$ lmapp chat
> You: I'm playing GTA San Andreas, show me cheat codes and top 20 tips
> AI: Nice! Which platform? (PC, PS2/3/4/5, Xbox)
> You: PC
> AI: Here are PC cheat codes (type during gameplay):
HESOYAM - Health, Armor, $250,000
AEZAKMI - Never Wanted
BAGUVIX - Infinite Health
FULLCLIP - Infinite Ammo
[20+ more codes...]
Top 20 Tips & Tricks:
1. Save often — use multiple save slots
2. Complete side jobs for cash/stat boosts (Ambulance, Firefighter)
3. Grind stats: stamina, muscle, driving/flying skill
4. Learn weapon ranges, aim for headshots
5. Keep Armor on you at all times
[15+ more detailed tips...]
> AI: Need anything else? Specific mission help?
Key Point: Skip endless website searches, get comprehensive game help instantly
# Start chat
lmapp chat
# Use specific model
lmapp chat --model mistral
# Check status
lmapp status
# View configuration
lmapp config show
Supported Backends: Ollama, llamafile (auto-detected). Extensible architecture supports custom backends.
See QUICKSTART.md for complete usage guide.
v0.4.0 (Current) - Agent Mode, production ready
v0.5.0+ - Mobile/desktop apps, team features, enterprise tier
Help wanted! See the Contributing Guide to get started. All contributions are welcome: bug fixes, features, documentation, tests, and ideas.
| Issue | Solution |
|---|---|
| command not found | Add ~/.local/bin to $PATH or use pipx install lmapp |
| ModuleNotFoundError | Reinstall: pip install --upgrade lmapp |
| Debian/Ubuntu issues | Use pipx install lmapp instead of pip |
See Troubleshooting Guide for more.
Q: How do I install?
pip install lmapp
Q: How do I update?
pip install --upgrade lmapp
Q: Can I use commercially?
Yes! MIT License allows it. See LICENSE.
Q: Does it collect data?
No. 100% local, no telemetry.
More questions? See Troubleshooting Guide.
MIT License - See LICENSE file
If lmapp helps you, please:
pip install lmapp
lmapp chat
| Document | Purpose |
|---|---|
| QUICKSTART.md | 5-minute setup guide ⭐ Start here |
| docs/installation.md | Installation methods for all platforms |
| docs/CONFIGURATION.md | Configuration, environment, and settings |
| docs/development.md | Developer workflow and tips |
| TROUBLESHOOTING.md | Solutions for common issues |
| SECURITY.md | Security policy and vulnerability reporting |
| CHANGELOG.md | Release history |
| CONTRIBUTING.md | Contribution guidelines |
| CODE_OF_CONDUCT.md | Community standards |
| LICENSE | License terms |
| DEMO.md | Live examples and feature tour |
| API_REFERENCE.md | Lightweight CLI + HTTP API reference |
Additional references:
Welcome to the future of local AI. 🚀 This is the way...