
sql-mongo-converter
Production-ready converter for SQL and MongoDB queries with full CRUD operations, JOINs, and advanced SQL support
SQL-Mongo Converter is a lightweight Python library for converting SQL queries into MongoDB query dictionaries and converting MongoDB query dictionaries into SQL statements. It is designed for developers who need to quickly migrate or prototype between SQL-based and MongoDB-based data models without the overhead of a full ORM.
Currently live on PyPI: https://pypi.org/project/sql-mongo-converter/
Comprehensive SQL to MongoDB Conversion:
- SELECT queries with WHERE, projections, ORDER BY, and LIMIT
- DISTINCT, GROUP BY, and HAVING via aggregation pipelines
- BETWEEN, LIKE, IN/NOT IN, IS NULL/IS NOT NULL, and AND/OR/NOT operators
- INSERT, UPDATE, and DELETE statements
- CREATE TABLE, CREATE INDEX, DROP TABLE, and DROP INDEX
- INNER and LEFT JOINs via $lookup aggregation
Bidirectional MongoDB to SQL Conversion: Translate MongoDB operations back to SQL:
- find → SELECT statements with WHERE, ORDER BY, LIMIT, and OFFSET
- insertOne/insertMany → INSERT statements
- updateMany → UPDATE statements with $set, $inc operators
- deleteMany → DELETE statements
- Aggregation pipelines with $match, $group, $lookup, $project, $sort, $limit
Production-Ready Features: a safety guard for write operations (allow_mutations flag), plus the validation, logging, and benchmarking utilities demonstrated in the advanced usage examples.
Extensible & Robust: Built to handle a wide range of query patterns with 103 passing tests and 59%+ code coverage. Easily extended to support additional SQL functions, advanced operators, and more complex query structures.
Install from PyPI:
pip install sql-mongo-converter
Clone the repository and install dependencies:
git clone https://github.com/yourusername/sql-mongo-converter.git
cd sql-mongo-converter
pip install -r requirements.txt
python setup.py install
The sql_to_mongo function converts various SQL statements into MongoDB query dictionaries. Write operations (INSERT, UPDATE, DELETE, CREATE, DROP) are enabled by default and can be disabled via the allow_mutations parameter.
from sql_mongo_converter import sql_to_mongo
# Basic SELECT
sql_query = "SELECT name, age FROM users WHERE age > 30 AND name = 'Alice';"
mongo_query = sql_to_mongo(sql_query)
print(mongo_query)
# Output:
# {
# "collection": "users",
# "find": { "age": {"$gt": 30}, "name": "Alice" },
# "projection": { "name": 1, "age": 1 }
# }
# DISTINCT query
sql_query = "SELECT DISTINCT department FROM employees;"
mongo_query = sql_to_mongo(sql_query)
# Output: {"collection": "employees", "operation": "distinct", "field": "department"}
# Aggregation with GROUP BY and HAVING
sql_query = "SELECT department, COUNT(*) FROM employees GROUP BY department HAVING COUNT(*) > 5;"
mongo_query = sql_to_mongo(sql_query)
# Output: Aggregation pipeline with $group and $match stages
# BETWEEN operator
sql_query = "SELECT * FROM products WHERE price BETWEEN 10 AND 100;"
mongo_query = sql_to_mongo(sql_query)
# Output: {"find": {"price": {"$gte": 10, "$lte": 100}}}
# LIKE with wildcards (% and _)
sql_query = "SELECT * FROM users WHERE name LIKE 'John%';"
mongo_query = sql_to_mongo(sql_query)
# Output: {"find": {"name": {"$regex": "John.*", "$options": "i"}}}
# IN and NOT IN
sql_query = "SELECT * FROM users WHERE role IN ('admin', 'manager');"
mongo_query = sql_to_mongo(sql_query)
# Output: {"find": {"role": {"$in": ["admin", "manager"]}}}
# IS NULL and IS NOT NULL
sql_query = "SELECT * FROM users WHERE email IS NOT NULL;"
mongo_query = sql_to_mongo(sql_query)
# Output: {"find": {"email": {"$ne": None}}}
# OR and NOT operators
sql_query = "SELECT * FROM users WHERE age > 30 OR status = 'active';"
mongo_query = sql_to_mongo(sql_query)
# Output: {"find": {"$or": [{"age": {"$gt": 30}}, {"status": "active"}]}}
# Single row insert
sql_query = "INSERT INTO users (name, age, email) VALUES ('Alice', 30, 'alice@example.com');"
mongo_query = sql_to_mongo(sql_query)
# Output: {"collection": "users", "operation": "insertOne", "document": {...}}
# Multiple rows insert
sql_query = "INSERT INTO users (name, age) VALUES ('Bob', 25), ('Charlie', 35);"
mongo_query = sql_to_mongo(sql_query)
# Output: {"operation": "insertMany", "documents": [{...}, {...}]}
# UPDATE with WHERE clause
sql_query = "UPDATE users SET age = 31, status = 'active' WHERE name = 'Alice';"
mongo_query = sql_to_mongo(sql_query)
# Output: {"collection": "users", "operation": "updateMany", "filter": {...}, "update": {"$set": {...}}}
# DELETE with WHERE clause
sql_query = "DELETE FROM users WHERE age < 18;"
mongo_query = sql_to_mongo(sql_query)
# Output: {"collection": "users", "operation": "deleteMany", "filter": {"age": {"$lt": 18}}}
# INNER JOIN
sql_query = """
SELECT u.name, o.order_id
FROM users u
INNER JOIN orders o ON u.user_id = o.user_id;
"""
mongo_query = sql_to_mongo(sql_query)
# Output: Aggregation pipeline with $lookup stage
# LEFT JOIN
sql_query = """
SELECT u.name, o.order_id
FROM users u
LEFT JOIN orders o ON u.user_id = o.user_id;
"""
mongo_query = sql_to_mongo(sql_query)
# Output: Aggregation pipeline with $lookup preserving unmatched documents
# CREATE TABLE with schema
sql_query = """
CREATE TABLE users (
id INT,
name VARCHAR(100),
age INT,
email VARCHAR(255)
);
"""
mongo_query = sql_to_mongo(sql_query)
# Output: createCollection with schema validation
# CREATE INDEX
sql_query = "CREATE INDEX idx_age ON users (age DESC, name ASC);"
mongo_query = sql_to_mongo(sql_query)
# Output: {"operation": "createIndex", "keys": {"age": -1, "name": 1}}
# DROP TABLE
sql_query = "DROP TABLE users;"
mongo_query = sql_to_mongo(sql_query, allow_mutations=True)
# Output: {"collection": "users", "operation": "drop"}
# DROP INDEX
sql_query = "DROP INDEX idx_age ON users;"
mongo_query = sql_to_mongo(sql_query, allow_mutations=True)
# Output: {"collection": "users", "operation": "dropIndex", "index_name": "idx_age"}
The mongo_to_sql function translates MongoDB operations back into SQL statements.
from sql_mongo_converter import mongo_to_sql
# Basic find with operators
mongo_obj = {
    "collection": "users",
    "find": {
        "$or": [
            {"age": {"$gte": 25}},
            {"status": "ACTIVE"}
        ],
        "tags": {"$in": ["dev", "qa"]}
    },
    "projection": {"age": 1, "status": 1, "tags": 1},
    "sort": [("age", 1), ("name", -1)],
    "limit": 10,
    "skip": 5
}
sql_query = mongo_to_sql(mongo_obj)
print(sql_query)
# Output:
# SELECT age, status, tags FROM users WHERE ((age >= 25) OR (status = 'ACTIVE')) AND (tags IN ('dev', 'qa'))
# ORDER BY age ASC, name DESC LIMIT 10 OFFSET 5;
# insertOne
mongo_obj = {
    "collection": "users",
    "operation": "insertOne",
    "document": {"name": "Alice", "age": 30, "email": "alice@example.com"}
}
sql_query = mongo_to_sql(mongo_obj)
# Output: INSERT INTO users (name, age, email) VALUES ('Alice', 30, 'alice@example.com');
# insertMany
mongo_obj = {
    "collection": "users",
    "operation": "insertMany",
    "documents": [
        {"name": "Bob", "age": 25},
        {"name": "Charlie", "age": 35}
    ]
}
sql_query = mongo_to_sql(mongo_obj)
# Output: INSERT INTO users (name, age) VALUES ('Bob', 25), ('Charlie', 35);
# updateMany
mongo_obj = {
    "collection": "users",
    "operation": "updateMany",
    "filter": {"name": "Alice"},
    "update": {"$set": {"age": 31, "status": "active"}}
}
sql_query = mongo_to_sql(mongo_obj)
# Output: UPDATE users SET age = 31, status = 'active' WHERE name = 'Alice';
# deleteMany
mongo_obj = {
    "collection": "users",
    "operation": "deleteMany",
    "filter": {"age": {"$lt": 18}}
}
sql_query = mongo_to_sql(mongo_obj)
# Output: DELETE FROM users WHERE age < 18;
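Because the generated SQL embeds literal values, it can be executed directly for quick experiments. A minimal sketch using Python's built-in sqlite3 module; the table schema here is illustrative, and compatibility of more advanced generated statements with SQLite's dialect is an assumption to verify:
import sqlite3
from sql_mongo_converter import mongo_to_sql

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")

insert_sql = mongo_to_sql({
    "collection": "users",
    "operation": "insertMany",
    "documents": [{"name": "Bob", "age": 25}, {"name": "Charlie", "age": 35}],
})
conn.execute(insert_sql)  # INSERT INTO users (name, age) VALUES ('Bob', 25), ('Charlie', 35);

select_sql = mongo_to_sql({
    "collection": "users",
    "find": {"age": {"$gte": 30}},
    "projection": {"name": 1, "age": 1},
})
print(conn.execute(select_sql).fetchall())  # expected: [('Charlie', 35)]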
sql_to_mongo(sql_query: str, allow_mutations: bool = True) -> dict
Description: Parses SQL statements (SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, JOIN) and converts them into MongoDB query dictionaries.
Parameters:
- sql_query: A valid SQL query string
- allow_mutations: (Optional) Enable/disable write operations (INSERT, UPDATE, DELETE, CREATE, DROP). Default: True
Returns: A dictionary containing the MongoDB operation specification. Structure varies by operation type:
SELECT queries:
{
    "collection": str,      # Table name
    "find": dict,           # Filter from WHERE clause
    "projection": dict,     # Columns to return
    "sort": list,           # Optional: sort specification
    "limit": int,           # Optional: result limit
    "skip": int             # Optional: offset
}
Aggregation queries (with GROUP BY/HAVING/DISTINCT/JOIN):
{
    "collection": str,
    "operation": "aggregate",
    "pipeline": [...]       # Aggregation pipeline stages
}
INSERT operations:
{
    "collection": str,
    "operation": "insertOne" | "insertMany",
    "document": dict,       # For insertOne
    "documents": [dict]     # For insertMany
}
UPDATE operations:
{
    "collection": str,
    "operation": "updateMany",
    "filter": dict,         # WHERE clause
    "update": {"$set": {...}}  # SET clause
}
DELETE operations:
{
    "collection": str,
    "operation": "deleteMany",
    "filter": dict          # WHERE clause
}
CREATE/DROP operations:
{
    "collection": str,
    "operation": "createCollection" | "createIndex" | "drop" | "dropIndex",
    # ... additional fields based on operation
}
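The dictionaries returned for these operations map naturally onto PyMongo collection methods. A minimal dispatch sketch, assuming pymongo is installed (it is not a dependency of this package); only the operations shown above are handled, and sort/limit/skip handling is omitted for brevity:
from pymongo import MongoClient
from sql_mongo_converter import sql_to_mongo

def run_converted(db, sql_query: str):
    spec = sql_to_mongo(sql_query)
    coll = db[spec["collection"]]
    op = spec.get("operation", "find")  # SELECT results carry no "operation" key
    if op == "find":
        return list(coll.find(spec.get("find", {}), spec.get("projection")))
    if op == "aggregate":
        return list(coll.aggregate(spec["pipeline"]))
    if op == "insertOne":
        return coll.insert_one(spec["document"])
    if op == "insertMany":
        return coll.insert_many(spec["documents"])
    if op == "updateMany":
        return coll.update_many(spec["filter"], spec["update"])
    if op == "deleteMany":
        return coll.delete_many(spec["filter"])
    raise NotImplementedError(f"operation not handled in this sketch: {op}")

db = MongoClient()["mydb"]  # "mydb" is an illustrative database name
print(run_converted(db, "SELECT name, age FROM users WHERE age > 30;"))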
mongo_to_sql(mongo_obj: dict) -> str
Description: Translates a MongoDB operation dictionary back into a SQL statement.
Parameters:
- mongo_obj: A dictionary representing a MongoDB operation with keys such as:
  - collection: Collection/table name (required)
  - operation: Operation type (optional, defaults to "find")
  - find: Filter conditions for SELECT/UPDATE/DELETE
  - projection: Columns to select
  - sort: Sort specification
  - limit/skip: Result pagination
  - document/documents: For INSERT operations
  - filter/update: For UPDATE operations
  - pipeline: For aggregation operations
Supported MongoDB operators:
- Comparison: $gt, $gte, $lt, $lte, $eq, $ne
- Membership: $in, $nin
- Logical: $and, $or, $not
- Pattern matching: $regex
- Update: $set, $inc, $unset
- Aggregation stages: $match, $group, $lookup, $project, $sort, $limit, $skip
- Joins (via $lookup)
Supported SQL features:
- Comparison operators: =, >, >=, <, <=, !=, <>
- BETWEEN ... AND ...
- LIKE with wildcards (%, _)
- IN (...), NOT IN (...)
- IS NULL, IS NOT NULL
- AND, OR, NOT
- Aggregate functions: COUNT, SUM, AVG, MIN, MAX
The package includes a comprehensive pytest test suite with 103 passing tests and 59%+ code coverage.
Create a virtual environment (optional but recommended):
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
Install test dependencies:
pip install -r requirements.txt
pip install pytest pytest-cov
Run tests:
# Run all tests
pytest tests/ -v
# With coverage report
pytest tests/ --cov=sql_mongo_converter --cov-report=html
# Quick run
pytest tests/ -q --tb=line
The test suite covers both conversion directions and the full range of supported SQL and MongoDB features described above.
Example scripts are provided in the examples/ directory:
# Basic usage examples
python examples/basic_usage.py
# Advanced features (validation, logging, benchmarking)
python examples/advanced_usage.py
These scripts demonstrate various conversion scenarios and best practices.
To build the package for distribution, ensure you have setuptools and wheel installed:
pip install setuptools wheel
Build the package:
python setup.py sdist bdist_wheel
This creates a dist/ folder with the distribution files.
To publish to PyPI, install Twine:
pip install twine
Upload your package:
twine upload dist/*
Follow the prompts for your PyPI credentials.
Contributions are welcome! To contribute:
Fork the Repository
Create a Feature Branch:
git checkout -b feature/my-new-feature
Commit Your Changes:
git commit -am "Add new feature or fix bug"
Push Your Branch:
git push origin feature/my-new-feature
Submit a Pull Request on GitHub.
For major changes, please open an issue first to discuss your ideas.
This project is licensed under the MIT License.
SQL-Mongo Converter is a comprehensive, production-ready tool that bridges SQL and MongoDB query languages with full bidirectional conversion support.
This release massively expands query support beyond the basic SELECT-only functionality: full INSERT, UPDATE, and DELETE handling; DISTINCT, GROUP BY, and HAVING aggregations; BETWEEN, LIKE, IN, and NULL operators; CREATE and DROP statements; and INNER/LEFT JOINs via $lookup aggregation.
The converter is ideal for developers migrating between SQL and MongoDB data models, building database abstraction layers, or learning NoSQL query patterns. Extend and customize as needed to support additional SQL constructs or MongoDB operators.
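As a quick sanity check of the bidirectional support, a statement can be converted to a MongoDB dictionary and back. The round trip is not guaranteed to reproduce the original text character-for-character (whitespace and operator formatting may differ), so treat this as a sketch rather than an invariant:
from sql_mongo_converter import sql_to_mongo, mongo_to_sql

original = "SELECT name, age FROM users WHERE age > 30;"
as_mongo = sql_to_mongo(original)
back_to_sql = mongo_to_sql(as_mongo)
print(as_mongo)      # {"collection": "users", "find": {...}, "projection": {...}}
print(back_to_sql)   # a SELECT equivalent to the original statement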
Happy converting! 🍃