
Security News
Axios Maintainer Confirms Social Engineering Attack Behind npm Compromise
Axios compromise traced to social engineering, showing how attacks on maintainers can bypass controls and expose the broader software supply chain.
monarch-database-quantum
Advanced tools
๐ High-performance, zero-dependency in-memory database for JavaScript/TypeScript - Drop-in replacement for Redis + MongoDB
World's First Quantum Database - Drop-in replacement for Redis + MongoDB with quantum algorithms ๐โ๏ธ
npm install monarch-database-quantum
import { Monarch } from 'monarch-database-quantum';
const db = new Monarch();
const users = db.addCollection('users');
// Insert some data
await users.insert({ name: 'Alice', age: 30, email: 'alice@example.com' });
await users.insert({ name: 'Bob', age: 25, email: 'bob@example.com' });
// Query with MongoDB-style syntax
const adults = await users.find({ age: { $gte: 25 } });
console.log('Adult users:', adults);
// Real-time updates
users.watch().on('insert', (change) => {
console.log('New user added:', change.doc);
});
// Ready to use! ๐
Why Monarch?
| Operation | Monarch | Redis | MongoDB | PostgreSQL |
|---|---|---|---|---|
| Simple Get | 86ฮผs | 50ฮผs | 800ฮผs | 200ฮผs |
| Indexed Query | 224ฮผs | N/A | 2.1ms | 500ฮผs |
| Complex Query | 1.18ms | N/A | 5-50ms | 1-10ms |
| Vector Search (128D) | 24.7ms | N/A | N/A | N/A |
| List Push/Pop | 15ฮผs | 30ฮผs | N/A | N/A |
| Batch Insert (10K) | 4.15ms | 25ms | 150ms | 75ms |
| Document Update | 637ฮผs | N/A | 8ms | 2ms |
Benchmarks: Monarch (Node.js 20, 2GB heap), Redis/MongoDB/PostgreSQL (production configs)
| Feature | Monarch | Redis | MongoDB | PostgreSQL |
|---|---|---|---|---|
| Data Model | Document + Key-Value + Graph | Key-Value | Document | Relational + JSON |
| Query Language | MongoDB-style + Redis commands | Custom | MongoDB Query | SQL + JSON |
| Indexing | Automatic + Custom | Manual | Automatic | Manual + Automatic |
| Transactions | ACID | Basic | ACID | ACID |
| Persistence | File-based | Snapshot + AOF | WiredTiger | WAL |
| Clustering | Built-in | Redis Cluster | Replica Sets | Patroni/Citus |
| Vector Search | Native (128D+) | RedisAI | Atlas Search | pgvector |
| Change Streams | Real-time | Pub/Sub | Change Streams | Logical Replication |
| Memory Usage | Low (in-memory) | High (RAM) | Medium | Low-High |
| Setup Complexity | โก Zero-config | ๐ง Medium | ๐ง Medium | ๐ง High |
| Scaling | Horizontal | Horizontal | Horizontal | Horizontal |
| Backup/Restore | Built-in | Manual | Built-in | Manual |
| Security | RBAC + Encryption | ACL + TLS | RBAC + TLS | RLS + TLS |
| Ecosystem | JavaScript/TypeScript | Multi-language | Multi-language | Multi-language |
| Use Case | Monarch | Redis | MongoDB | PostgreSQL |
|---|---|---|---|---|
| API Caching | โญโญโญโญโญ | โญโญโญโญโญ | โญโญ | โญโญโญ |
| Session Storage | โญโญโญโญโญ | โญโญโญโญโญ | โญโญโญ | โญโญโญ |
| Real-time Analytics | โญโญโญโญโญ | โญโญโญ | โญโญโญ | โญโญโญ |
| User Data | โญโญโญโญโญ | โญโญ | โญโญโญโญโญ | โญโญโญโญโญ |
| IoT Data | โญโญโญโญโญ | โญโญโญ | โญโญโญ | โญโญโญโญโญ |
| AI/ML Features | โญโญโญโญโญ | โญโญโญ | โญโญโญ | โญโญโญ |
| E-commerce | โญโญโญโญโญ | โญโญโญโญโญ | โญโญโญโญโญ | โญโญโญโญโญ |
| Content Management | โญโญโญโญโญ | โญโญ | โญโญโญโญโญ | โญโญโญโญโญ |
| Time Series | โญโญโญโญโญ | โญโญโญ | โญโญโญ | โญโญโญโญโญ |
| Graph Data | โญโญโญโญโญ | โญ | โญโญ | โญโญโญ |
| Aspect | Monarch | Redis | MongoDB | PostgreSQL |
|---|---|---|---|---|
| Deployment | Single binary | Server + Client | Server + Drivers | Server + Extensions |
| Configuration | Auto-configured | Manual tuning | Medium config | Complex config |
| Monitoring | Built-in dashboard | redis-cli + tools | MongoDB Cloud | pg_stat_statements |
| Backup Strategy | File copy | RDB + AOF | mongodump | pg_dump + WAL |
| High Availability | Built-in clustering | Sentinel + Cluster | Replica Sets | Streaming Replication |
| Development Speed | โกโกโกโกโก | โกโกโก | โกโกโกโก | โกโก |
| Production Readiness | Enterprise-grade | Enterprise-grade | Enterprise-grade | Enterprise-grade |
| Learning Curve | ๐ข Easy | ๐ก Medium | ๐ข Easy | ๐ด Steep |
| Community Support | Growing | Massive | Massive | Massive |
| Commercial Support | Available | Enterprise | Atlas/MongoDB Inc | Enterprise options |
Choose Monarch when you need:
Monarch is ideal for:
npm install monarch-database
# That's it! No setup required.
// Create a file: app.js
import { Monarch } from 'monarch-database';
// Create database (auto-creates if doesn't exist)
const db = new Monarch();
// Create collections (like tables)
const users = db.addCollection('users');
const posts = db.addCollection('posts');
// Insert data
await users.insert({
name: 'Alice',
email: 'alice@example.com',
age: 30
});
await posts.insert({
title: 'Hello World',
content: 'My first post!',
author: 'alice@example.com'
});
// Query data
const user = await users.findOne({ email: 'alice@example.com' });
const userPosts = await posts.find({ author: 'alice@example.com' });
console.log('User:', user);
console.log('Posts:', userPosts);
node app.js
# Output: User: { _id: '...', name: 'Alice', ... }
# Posts: [{ _id: '...', title: 'Hello World', ... }]
Or try our complete working example:
node example.js # See all features in action!
๐ You're done! Monarch just works - no config, no servers, no setup.
import { Monarch } from 'monarch-database';
const db = new Monarch();
// Documents (like MongoDB)
const users = db.addCollection('users');
await users.insert({ name: 'Alice', age: 30 });
const user = await users.findOne({ name: 'Alice' });
await users.update({ name: 'Alice' }, { age: 31 });
// Key-Value (like Redis)
await db.set('session:123', { userId: 123, expires: Date.now() });
const session = await db.get('session:123');
// Lists (like Redis)
await db.lpush('queue', 'task1', 'task2');
const task = await db.lpop('queue');
// Sets
await db.sadd('tags', 'javascript', 'typescript');
const hasTag = await db.sismember('tags', 'javascript');
// Sorted Sets (leaderboards, etc.)
await db.zadd('scores', { 'Alice': 1500, 'Bob': 1200 });
const topPlayers = await db.zrange('scores', 0, 2);
# Quick database operations
npm run cli init ./db && npm run cli create users ./db
echo '{"name":"Alice","age":30}' | npm run cli insert users /dev/stdin ./db
npm run cli query users ./db
npm run cli stats ./db --detailed
const db = new Monarch({
// File persistence
adapter: new FileSystemAdapter('./data'),
// Collection limits
collections: {
maxDocuments: 10000,
ttl: 3600000 // 1 hour
},
// Performance tuning
performance: {
cacheSize: 1000,
maxConcurrentOps: 100
}
});
// server.js
import express from 'express';
import { Monarch } from 'monarch-database';
const app = express();
app.use(express.json());
const db = new Monarch();
const users = db.addCollection('users');
// REST API endpoints
app.get('/users', async (req, res) => {
const users = await db.getCollection('users').find();
res.json(users);
});
app.post('/users', async (req, res) => {
const user = await db.getCollection('users').insert(req.body);
res.json(user);
});
app.get('/users/:id', async (req, res) => {
const user = await db.getCollection('users').findOne({ _id: req.params.id });
res.json(user);
});
app.listen(3000, () => console.log('API running on port 3000'));
// pages/api/users.js
import { Monarch } from 'monarch-database';
const db = new Monarch();
const users = db.addCollection('users');
export default async function handler(req, res) {
if (req.method === 'GET') {
const allUsers = await users.find();
res.status(200).json(allUsers);
} else if (req.method === 'POST') {
const user = await users.insert(req.body);
res.status(201).json(user);
}
}
// session-store.js
import { Monarch } from 'monarch-database';
class MonarchSessionStore {
constructor() {
this.db = new Monarch();
this.sessions = this.db.addCollection('sessions');
}
async get(sessionId) {
const session = await this.sessions.findOne({ sessionId });
return session?.data;
}
async set(sessionId, data, expiresAt) {
await this.sessions.update(
{ sessionId },
{ data, expiresAt },
{ upsert: true }
);
}
async destroy(sessionId) {
await this.sessions.delete({ sessionId });
}
}
export default MonarchSessionStore;
// cache.js
import { Monarch } from 'monarch-database';
class MonarchCache {
constructor() {
this.db = new Monarch();
}
async get(key) {
const result = await this.db.get(key);
return result ? JSON.parse(result) : null;
}
async set(key, value, ttlSeconds = 3600) {
const expiresAt = Date.now() + (ttlSeconds * 1000);
await this.db.set(key, JSON.stringify(value), 'EX', ttlSeconds);
}
async invalidate(pattern) {
// Delete keys matching pattern
const keys = await this.db.keys(pattern);
if (keys.length > 0) {
await this.db.del(...keys);
}
}
}
export default MonarchCache;
import { Monarch } from 'monarch-database';
const db = new Monarch();
const users = db.addCollection('users');
const sessions = db.addCollection('sessions');
// User registration
async function registerUser(email, password, profile) {
// Check if user exists
const existing = await users.findOne({ email });
if (existing) throw new Error('User already exists');
// Create user
const user = await users.insert({
email,
password: await hashPassword(password), // Implement hashing
profile,
createdAt: new Date(),
status: 'active'
});
return user;
}
// User login
async function loginUser(email, password) {
const user = await users.findOne({ email });
if (!user) throw new Error('User not found');
if (!(await verifyPassword(password, user.password))) {
throw new Error('Invalid password');
}
// Create session
const session = await sessions.insert({
userId: user._id,
token: generateToken(),
expiresAt: new Date(Date.now() + 24 * 60 * 60 * 1000) // 24 hours
});
return { user, session };
}
import { Monarch } from 'monarch-database';
const db = new Monarch();
const products = db.addCollection('products');
const orders = db.addCollection('orders');
const inventory = db.addCollection('inventory');
// Add product with search capabilities
async function addProduct(productData) {
const product = await products.insert({
...productData,
searchTerms: `${productData.name} ${productData.description} ${productData.tags.join(' ')}`.toLowerCase(),
createdAt: new Date()
});
// Update inventory
await inventory.insert({
productId: product._id,
quantity: productData.initialStock || 0,
reserved: 0
});
return product;
}
// Search products
async function searchProducts(query, filters = {}) {
const searchQuery = {
searchTerms: { $regex: query.toLowerCase() },
...filters
};
return await products.find(searchQuery)
.sort({ createdAt: -1 })
.limit(20);
}
// Place order with inventory check
async function placeOrder(userId, items) {
// Check inventory
for (const item of items) {
const stock = await inventory.findOne({ productId: item.productId });
if (!stock || stock.quantity - stock.reserved < item.quantity) {
throw new Error(`Insufficient inventory for product ${item.productId}`);
}
}
// Reserve inventory
for (const item of items) {
await inventory.update(
{ productId: item.productId },
{ reserved: item.quantity } // Note: Use remove+insert for complex updates
);
}
// Create order
const order = await orders.insert({
userId,
items,
status: 'pending',
createdAt: new Date(),
total: calculateTotal(items)
});
return order;
}
import { Monarch } from 'monarch-database';
const db = new Monarch();
const messages = db.addCollection('messages');
const channels = db.addCollection('channels');
const users = db.addCollection('users');
// Send message
async function sendMessage(channelId, userId, content) {
const message = await messages.insert({
channelId,
userId,
content,
timestamp: new Date(),
type: 'text'
});
// Update channel last activity
await channels.update(
{ _id: channelId },
{ lastMessageAt: new Date(), lastMessage: content }
);
return message;
}
// Get channel messages with pagination
async function getChannelMessages(channelId, before = null, limit = 50) {
const query = { channelId };
if (before) {
query.timestamp = { $lt: before };
}
return await messages.find(query)
.sort({ timestamp: -1 })
.limit(limit);
}
// Real-time message subscription (polling approach)
async function pollMessages(channelId, since) {
return await messages.find({
channelId,
timestamp: { $gt: since }
}).sort({ timestamp: 1 });
}
const users = db.addCollection('users');
// CRUD Operations
await users.insert(document); // Insert one document
await users.insert([doc1, doc2]); // Insert multiple documents
await users.insertMany(docs, options); // Bulk insert (high performance)
await users.find(query); // Find documents
await users.findOne(query); // Find one document
await users.update(query, update); // Update documents (shallow)
await users.updateDeep(query, update); // Update with nested objects
await users.remove(query); // Remove documents
await users.removeMany(query, options); // Bulk remove
await users.count(query); // Count documents
// Advanced Operations
await users.aggregate(pipeline); // Aggregation framework
await users.createIndex(fields, options); // Advanced indexing
// Database Operations
await db.getStats(); // Database statistics
await db.createCollection(name); // Create collection
await db.dropCollection(name); // Drop collection
await db.exportDatabase(); // Backup database
// Full-Text Search
await db.createTextIndex(collection, fields, options);
const results = await db.searchTextIndex(indexName, query, options);
// Schema Management
db.registerSchema(collection, schema);
const valid = await db.validateDocument(collection, doc);
### Update Patterns
Monarch Database supports **shallow updates only** - you cannot update nested objects directly.
#### โ
Supported Update Patterns
```javascript
// Direct field updates (primitives, arrays, dates)
await users.update({ _id: 'user1' }, { age: 31 });
await users.update({ _id: 'user1' }, { tags: ['admin', 'moderator'] });
await users.update({ _id: 'user1' }, { lastLogin: new Date() });
// Multiple field updates
await users.update({ status: 'active' }, {
lastActivity: new Date(),
loginCount: 5
});
// Nested object updates (will throw error)
await users.update({ _id: 'user1' }, {
profile: { bio: 'New bio' } // โ ERROR: Nested object updates not supported
});
// MongoDB-style operators (not implemented)
await users.update({ _id: 'user1' }, { $set: { age: 31 } }); // โ Not supported
await users.update({ _id: 'user1' }, { $inc: { age: 1 } }); // โ Not supported
For complex updates involving nested objects, use the remove + insert pattern:
// 1. Find the current document
const user = await users.findOne({ _id: 'user1' });
// 2. Create updated version (immutable update)
const updatedUser = {
...user,
profile: {
...user.profile,
bio: 'Senior Developer',
preferences: {
...user.profile.preferences,
theme: 'dark'
}
},
lastUpdated: new Date()
};
// 3. Remove old document
await users.remove({ _id: 'user1' });
// 4. Insert updated document
await users.insert(updatedUser);
This pattern ensures:
Monarch Database provides optimized bulk operations for large-scale data processing.
insertMany()// Insert thousands of documents efficiently
const documents = Array.from({ length: 10000 }, (_, i) => ({
_id: `user_${i}`,
name: `User ${i}`,
email: `user${i}@example.com`,
createdAt: new Date()
}));
const result = await users.insertMany(documents, {
batchSize: 5000, // Process in batches of 5k
skipValidation: false, // Validate each document
emitEvents: true, // Emit change events
timeout: 300000 // 5 minute timeout for bulk operations
});
console.log(`Inserted ${result.insertedCount} documents`);
// Output: Inserted 10000 documents
Performance: 10-50x faster than sequential inserts for large datasets.
removeMany()// Delete multiple documents with advanced options
const result = await users.removeMany(
{ status: 'inactive' },
{
limit: 1000, // Limit deletions
emitEvents: true, // Emit change events
timeout: 120000 // 2 minute timeout
}
);
console.log(`Deleted ${result.deletedCount} documents`);
// Output: Deleted 1000 documents
updateDeep()// Update nested objects directly (NEW!)
await users.updateDeep(
{ _id: 'user1' },
{
profile: {
bio: 'Senior Developer',
preferences: {
theme: 'dark',
notifications: true
}
},
lastUpdated: new Date()
}
);
Unlike update(), updateDeep() supports:
Monarch Database now includes enterprise-grade features expected from modern databases:
aggregate()MongoDB-style aggregation pipelines with stages like $match, $group, $sort, $project, etc.
const pipeline = [
{ $match: { status: 'active', age: { $gte: 18 } } },
{ $group: {
_id: '$department',
totalEmployees: { $sum: 1 },
avgSalary: { $avg: '$salary' },
maxSalary: { $max: '$salary' },
employees: { $push: '$name' }
}
},
{ $sort: { totalEmployees: -1 } }
];
const results = await employees.aggregate(pipeline);
Supported Stages:
$match - Filter documents$group - Group and aggregate data$sort - Sort results$limit / $skip - Pagination$project - Reshape documents$unwind - Deconstruct arrays$lookup - Join collections$addFields - Add new fields$replaceRoot - Replace document rootcreateTextIndex(), search()Advanced text search with scoring, stemming, and highlighting.
// Create text index
await db.createTextIndex('articles', ['title', 'content'], {
weights: { title: 10, content: 1 }
});
// Search with scoring
const results = await db.searchTextIndex('articles', 'quantum computing', {
limit: 10,
scoreField: 'relevanceScore'
});
console.log(results[0]); // { document: {...}, score: 0.85, highlights: [...] }
Features:
$near, $geoWithinLocation-based queries for mapping and location services.
// Store locations
await places.insert({
name: 'Central Park',
location: { type: 'Point', coordinates: [-73.968, 40.782] }
});
// Find nearby places
const nearby = await places.find({
location: {
$near: {
$geometry: { type: 'Point', coordinates: [-73.985, 40.758] },
$maxDistance: 5000 // 5km
}
}
});
// Find places within polygon
const inArea = await places.find({
location: {
$geoWithin: {
$geometry: polygonDefinition
}
}
});
Supported Operations:
$near - Find nearest points$geoWithin - Points within geometry$geoIntersects - Geometry intersectionCompound indexes, unique constraints, and specialized index types.
// Compound index
await users.createIndex(['email', 'status'], { unique: true });
// Text index for full-text search
await articles.createIndex(['title', 'content'], {
text: true,
weights: { title: 10, content: 1 }
});
// TTL (Time-To-Live) index
await sessions.createIndex(['expiresAt'], {
expireAfterSeconds: 3600 // Auto-delete after 1 hour
});
// Sparse index (only indexes non-null values)
await users.createIndex(['lastLogin'], { sparse: true });
MongoDB-compatible query operators for complex queries.
// Regular expressions
await users.find({ email: { $regex: '@company\.com$' } });
// Type checking
await documents.find({ score: { $type: 'number' } });
// Existence checks
await users.find({ profile: { $exists: true } });
// Array operations
await posts.find({ tags: { $all: ['javascript', 'typescript'] } });
await posts.find({ tags: { $size: 3 } });
// Element matching
await orders.find({
items: {
$elemMatch: { price: { $gt: 100 }, category: 'electronics' }
}
});
// Logical operators
await users.find({
$and: [
{ age: { $gte: 18 } },
{ $or: [{ status: 'active' }, { role: 'admin' }] }
]
});
Real-time database metrics and performance monitoring.
// Database-wide statistics
const stats = await db.getStats();
console.log(`Collections: ${stats.collections}`);
console.log(`Total Documents: ${stats.documents}`);
console.log(`Operations/sec: ${stats.operationsPerSecond}`);
// Collection-specific stats
const userStats = await db.getCollectionStats('users');
console.log(`Users: ${userStats.documentCount}`);
console.log(`Avg Size: ${userStats.avgDocumentSize} bytes`);
// Query profiling
const profile = await db.profileQuery(query, executionTime, examined, returned);
console.log(`Query took ${profile.executionTime}ms`);
console.log(`Optimization hints:`, profile.optimizationHints);
JSON Schema-style validation with custom rules.
// Define schema
const userSchema = {
name: { type: 'string', required: true, min: 2, max: 50 },
email: { type: 'string', required: true, pattern: /^[^@]+@[^@]+\.[^@]+$/ },
age: { type: 'number', min: 0, max: 150 },
role: { type: 'string', enum: ['user', 'admin', 'moderator'] },
tags: { type: ['string'], max: 10 } // Array of strings
};
// Register schema
db.registerSchema('users', userSchema);
// Validate documents
const result = await db.validateDocument('user1');
if (!result.valid) {
console.log('Validation errors:', result.errors);
}
// Auto-generate schema from existing data
const inferredSchema = db.generateSchemaFromDocuments('products', sampleProducts);
Safe schema updates with breaking change detection.
// Evolve schema safely
const success = db.evolveSchema('users', newSchema, {
allowBreakingChanges: false, // Prevent breaking changes
migrateExisting: true // Migrate existing data
});
if (!success) {
console.log('Schema evolution blocked - would break existing data');
}
Database-level management operations.
// Collection management
await db.createCollection('newCollection');
await db.renameCollection('oldName', 'newName');
const dropped = await db.dropCollection('tempCollection');
// Database maintenance
const maintenance = await db.runMaintenance();
console.log(`Optimized ${maintenance.collectionsOptimized} collections`);
// Backup and restore
const backup = await db.exportDatabase();
// ... save backup ...
await db.importDatabase(backup);
Advanced performance tuning and monitoring.
// Custom timeouts for different operations
globalMonitor.startWithTimeout('bulkImport', 600000); // 10 minutes
globalMonitor.startWithTimeout('complexQuery', 30000); // 30 seconds
// Configure connection pool
const db = new Monarch({
connectionPool: {
minConnections: 2,
maxConnections: 10,
acquireTimeoutMillis: 30000
}
});
# Performance tuning
MONARCH_MAX_DOCUMENTS_PER_OPERATION=10000
MONARCH_BULK_BATCH_SIZE=5000
MONARCH_OPERATION_TIMEOUT=30000
MONARCH_MAX_CONCURRENT_OPERATIONS=50
# Indexing
MONARCH_INDEX_BATCH_SIZE=1000
MONARCH_TEXT_INDEX_LANGUAGE=english
# Schema validation
MONARCH_STRICT_SCHEMA_VALIDATION=true
Monarch Database now supports configurable timeouts for long-running operations:
// Bulk operations have extended timeouts by default
await users.insertMany(documents, { timeout: 600000 }); // 10 minutes
await users.removeMany(query, { timeout: 300000 }); // 5 minutes
// Custom timeout for regular operations
import { globalMonitor } from 'monarch-database-quantum';
globalMonitor.startWithTimeout('customOperation', 120000); // 2 minutes
insertMany() and removeMany() process data in configurable batches# Operation timeouts (in milliseconds)
MONARCH_OPERATION_TIMEOUT=30000 # Default: 30 seconds
# Bulk operation limits
MONARCH_MAX_DOCUMENTS_PER_OPERATION=10000 # Default: 10k
MONARCH_BULK_BATCH_SIZE=5000 # Default: 5k
# Performance tuning
MONARCH_MAX_CONCURRENT_OPERATIONS=50 # Default: 50
// Query Examples await users.find({ age: { $gte: 18 } }); // Age >= 18 await users.find({ name: { $regex: '^John' } }); // Name starts with John await users.find({ tags: { $in: ['admin', 'moderator'] } }); // Has admin or moderator tag await users.find({}).sort({ createdAt: -1 }).limit(10); // Latest 10 users
### Redis-Compatible Data Structures
```javascript
// Strings
await db.set('key', 'value'); // Set string value
await db.get('key'); // Get string value
await db.del('key'); // Delete key
// Lists (like arrays)
await db.lpush('mylist', 'item1', 'item2'); // Push to left
await db.rpush('mylist', 'item3'); // Push to right
await db.lpop('mylist'); // Pop from left
await db.lrange('mylist', 0, -1); // Get all items
// Sets (unique values)
await db.sadd('myset', 'member1', 'member2'); // Add members
await db.sismember('myset', 'member1'); // Check membership
await db.smembers('myset'); // Get all members
// Sorted Sets (scored members)
await db.zadd('leaderboard', { 'Alice': 1500, 'Bob': 1200 });
await db.zrange('leaderboard', 0, 2); // Top 3 players
await db.zscore('leaderboard', 'Alice'); // Get score
// Hashes (key-value objects)
await db.hset('user:123', 'name', 'Alice'); // Set hash field
await db.hget('user:123', 'name'); // Get hash field
await db.hgetall('user:123'); // Get all fields
// Real-time notifications
db.watch('orders', (change) => {
console.log('New order:', change.document);
});
// Geospatial queries
await db.geoadd('restaurants', -122.4194, 37.7749, 'Golden Gate Cafe');
const nearby = await db.georadius('restaurants', -122.4, 37.8, 5000); // 5km radius
// Time-series data
await db.tsadd('temperature', Date.now(), 23.5, { sensor: 'living-room' });
const history = await db.tsrange('temperature', startTime, endTime);
const db = new Monarch({
// Persistence adapter
adapter: new FileSystemAdapter('./data'),
// Custom configuration
config: {
limits: {
maxDocumentSize: 50 * 1024 * 1024, // 50MB
maxDocumentsPerCollection: 100000
},
performance: {
maxConcurrentOperations: 100,
queryCacheSize: 2000
}
}
});
โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ
โ Monarch Database โ
โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโค
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โ โCollection โ โData Ops โ โAdvanced Cache โ โ
โ โManager โ โManager โ โ(L1/L2/L3 + Pipel.)โ โ
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโค
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โ โTransactions โ โChange โ โSchema Validation โ โ
โ โManager โ โStreams โ โ(AJV + Custom) โ โ
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโค
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โ โSecurity โ โClustering โ โAI/ML Integration โ โ
โ โManager โ โManager โ โ(Vector Search) โ โ
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโค
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โ โDurability โ โQuery โ โScripting Engine โ โ
โ โManager โ โOptimizer โ โ(Lua/WASM) โ โ
โ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโ โโโโโโโโโโโโโโโโโโโโโโโ โ
โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ
Monarch Database is the world's first database to implement quantum algorithms in production. Our comprehensive quantum algorithm suite includes quantum walk algorithms, quantum-inspired query optimization, and quantum caching strategies that deliver immediate performance benefits on classical hardware.
| Algorithm Category | Quantum Advantage | Real-World Impact |
|---|---|---|
| Query Optimization | 2.8x faster | Complex queries execute 180% faster |
| Graph Algorithms | 3.7x faster | Social network analysis in real-time |
| Caching Systems | 1.9x more efficient | 40% reduction in cache misses |
| Path Finding | 4.0x faster | Route optimization for logistics |
| Centrality Analysis | 5.1x faster | Influencer identification at scale |
import { Monarch } from 'monarch-database-quantum';
const db = new Monarch();
// Initialize quantum engine
await db.initializeQuantumEngine();
// Create social network with 100+ users
// See examples/quantum-social-network-analysis.ts for complete implementation
// Find influencers using quantum centrality
const centralityResults = await db.calculateQuantumCentrality();
console.log('Top influencers:', Object.entries(centralityResults)
.sort(([,a], [,b]) => b - a)
.slice(0, 5));
// Detect communities with quantum community detection
const communities = await db.detectCommunitiesQuantum();
console.log('Community analysis complete');
// Predict missing connections
const predictions = analyzeConnectionPatterns(users, interactions);
console.log('Connection predictions:', predictions.slice(0, 3));
Complete Examples Available:
examples/quantum-social-network-analysis.ts - Social network analysis with influencer detectionexamples/quantum-recommendation-system.ts - E-commerce recommendation engineexamples/quantum-fraud-detection.ts - Real-time fraud detection systemexamples/quantum-walk-demo.ts - Basic quantum algorithms demonstrationQuantum algorithms in Monarch Database provide immediate performance benefits on classical hardware by using quantum computing principles:
Result: Databases that are 2-5x faster without requiring quantum hardware!
Monarch Database is the quantum computing bridge for modern applications. โ๏ธ
# Run comprehensive test suite
npm test
# Run performance benchmarks (latest results: โญโญโญโญโญ)
npm run benchmark
# Generate coverage report
npm run test:coverage
# Run CLI tools
npm run cli -- --help
Monarch includes a powerful command-line interface for database management, debugging, and operations.
# Install globally (recommended)
npm install -g monarch-database
# Or use npx
npx monarch-database --help
# Database Management
monarch init [path] # Initialize a new database
monarch create <collection> [path] # Create a collection
monarch collections [path] # List all collections
# Data Operations
monarch insert <collection> <file> [--path <path>] # Insert documents from JSON file
monarch batch-insert <collection> <files...> [--path <path>] # Batch insert multiple files
# Querying & Analytics
monarch query <collection> [path] [query] [--sort <field>] [--limit <n>] [--fields <list>]
monarch stats [path] [--detailed] # Database statistics
# Help & Information
monarch help [command] # Get help for commands
monarch --help # Show all commands and options
# Initialize a new database
npx tsx src/cli/index.ts init ./my-app-db
# โ Database initialized at ./my-app-db
# Create collections
npx tsx src/cli/index.ts create users ./my-app-db
npx tsx src/cli/index.ts create products ./my-app-db
# โ Collection 'users' created
# โ Collection 'products' created
# Insert single document from stdin
echo '{"name": "Alice", "age": 30, "city": "NYC"}' | npx tsx src/cli/index.ts insert users /dev/stdin ./my-app-db
# โ Inserted 1 document(s) into 'users'
# Insert multiple documents from JSON file
echo '[
{"name": "Bob", "age": 25, "city": "LA"},
{"name": "Charlie", "age": 35, "city": "Chicago"}
]' > users.json
npx tsx src/cli/index.ts insert users users.json ./my-app-db
# โ Inserted 2 document(s) into 'users'
# Batch insert multiple files
npx tsx src/cli/index.ts batch-insert products products1.json products2.json ./my-app-db
# โ products1.json: 3 documents
# โ products2.json: 5 documents
# โ
Batch insert complete: 8 total documents inserted
# Query all documents
npx tsx src/cli/index.ts query users ./my-app-db
# Found 3 document(s): [...]
# Advanced filtering with JSON queries
npx tsx src/cli/index.ts query users ./my-app-db '{"age": {"$gte": 30}}'
# Found 2 document(s): Alice (30), Charlie (35)
# Sorting results
npx tsx src/cli/index.ts query users ./my-app-db --sort age
# Returns: Bob (25), Alice (30), Charlie (35)
# Field selection
npx tsx src/cli/index.ts query users ./my-app-db --fields name,city
# Returns: [{"name": "Alice", "city": "NYC"}, ...]
# Limiting results
npx tsx src/cli/index.ts query users ./my-app-db --limit 2
# Returns: First 2 documents only
# Combined: Filter + Sort + Fields + Limit
npx tsx src/cli/index.ts query users ./my-app-db '{"city": "NYC"}' --sort age --fields name,age --limit 1
# Complex query with all options
# Database statistics
npx tsx src/cli/index.ts stats ./my-app-db
# Database Statistics:
# Path: ./my-app-db
# Collections: 2
# Total Documents: 11
"Cannot find module 'monarch-database'"
# Make sure you're using ES modules
# In package.json, add:
"type": "module"
// Or use .mjs extension for your files
mv app.js app.mjs
Memory usage is too high
// Use collections with limits
const users = db.addCollection('users', {
maxDocuments: 10000,
ttl: 3600000 // 1 hour
});
// Or use the memory optimizer
import { MemoryOptimizer } from 'monarch-database';
const optimizer = new MemoryOptimizer();
optimizer.optimize(db);
Queries are slow
// Add indexes to frequently queried fields
await users.createIndex('email');
await users.createIndex(['age', 'city']);
// Use query optimization
const results = await users.find(query)
.explain(); // Shows query execution plan
Data persistence issues
// Use file system persistence
import { FileSystemAdapter } from 'monarch-database';
const db = new Monarch({
adapter: new FileSystemAdapter('./data')
});
// Or use the CLI to manage persistence
npm run cli init ./my-data
npm run cli create users ./my-data
// Redis code
await redis.set('user:123', JSON.stringify(user));
const user = JSON.parse(await redis.get('user:123'));
// Monarch equivalent
await db.set('user:123', user); // Automatic JSON handling
const user = await db.get('user:123');
// MongoDB code
await collection.insertOne(doc);
const docs = await collection.find({ age: { $gte: 18 } }).toArray();
// Monarch equivalent
await collection.insert(doc); // Same API
const docs = await collection.find({ age: { $gte: 18 } });
// Browser storage
localStorage.setItem('user', JSON.stringify(user));
const user = JSON.parse(localStorage.getItem('user'));
// Monarch equivalent
const db = new Monarch(); // Works in browser too!
await db.set('user', user);
const user = await db.get('user');
Monarch includes a modern, web-based graphical interface for database management and monitoring.
# Install dependencies for the admin UI
cd admin-ui
npm install
# Start the admin server (requires Monarch HTTP server running)
npm start
# Or serve static files
npx serve admin-ui
Then open http://localhost:3001 in your browser.
Dashboard showing real-time metrics and performance charts Collection browser with document viewer and query interface Migration wizard for importing data from Redis/MongoDB
Easily migrate your existing data from Redis or MongoDB to Monarch Database.
# Migrate all data from local Redis
npm run migrate:redis -- --redis-host localhost --redis-port 6379
# Migrate specific data types and key patterns
npm run migrate:redis -- --types string,hash --key-pattern "user:*"
# Dry run to preview migration
npm run migrate:redis -- --dry-run --verbose
# Migrate all collections from MongoDB
npm run migrate:mongodb -- --mongo-database myapp
# Migrate specific collections with custom batch size
npm run migrate:mongodb -- --mongo-database myapp --collections users,products --batch-size 500
# Use transformation functions during migration
npm run migrate:mongodb -- --mongo-database myapp --transform-funcs ./transforms.js
// Programmatic migration example
import { RedisMigrationTool } from './migration-tools/redis-migration.js';
import { MongoDBMigrationTool } from './migration-tools/mongodb-migration.js';
const redisMigrator = new RedisMigrationTool({
redisHost: 'localhost',
redisPort: 6379,
monarchPath: './migrated-data'
});
await redisMigrator.migrate();
Monarch Database includes a comprehensive set of npm scripts for development, testing, and code quality. See the Development Scripts Guide for complete documentation.
# Build & Development
npm run build # Production build
npm run dev # Development mode with watch
npm run clean # Clean build artifacts
# Testing
npm test # Run all tests
npm run test:coverage # Tests with coverage
npm run test:watch # Watch mode
# Code Quality
npm run lint # ESLint check
npm run lint:fix # Auto-fix lint issues
npm run format # Format code
npm run type-check # TypeScript validation
npm run knip # Dead code detection
# Documentation
npm run docs # Generate API docs
npm run docs:serve # Serve docs with live reload
# Runtime
npm run server # Start HTTP server
npm run cli # Run CLI tool
# Migration Tools
npm run migrate:redis # Import from Redis
npm run migrate:mongodb # Import from MongoDB
# Comprehensive Audit
npm run audit:all # Run all quality checks
For detailed documentation on all available scripts, see DEVELOPMENT_SCRIPTS.md.
We welcome contributions! Please see our Contributing Guide for details.
# Clone the repository
git clone https://github.com/bantoinese83/Monarch-Database.git
cd monarch-database
# Install dependencies
npm install
# Run tests
npm test
# Run benchmarks
npm run benchmark
# Use the CLI
npm run cli -- --help
# Build the project
npm run build
any typesMIT License - see LICENSE file for details.
Monarch Database builds upon the best ideas from industry leaders:
Built with โค๏ธ by developers, for developers who demand performance and reliability.
Built with โค๏ธ for developers who demand performance and reliability
Website โข Documentation โข GitHub
FAQs
๐ High-performance, zero-dependency in-memory database for JavaScript/TypeScript - Drop-in replacement for Redis + MongoDB
We found that monarch-database-quantum demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago.ย It has 1 open source maintainer collaborating on the project.
Did you know?

Socket for GitHub automatically highlights issues in each pull request and monitors the health of all your open source dependencies. Discover the contents of your packages and block harmful activity before you install or update your dependencies.

Security News
Axios compromise traced to social engineering, showing how attacks on maintainers can bypass controls and expose the broader software supply chain.

Security News
Node.js has paused its bug bounty program after funding ended, removing payouts for vulnerability reports but keeping its security process unchanged.

Security News
The Axios compromise shows how time-dependent dependency resolution makes exposure harder to detect and contain.