SQL to JSON Converter
🔄 Powerful SQL to JSON converter with support for large files and multiple output formats. Converts SQL database dumps to structured JSON files.
✨ Key Features
- 🚀 Large file processing: Stream processing for SQL files in the gigabyte range
- 📁 Multiple output modes:
- Separate files: Each table becomes a separate JSON file (default)
- Combined file: All tables in one JSON file
- 💾 Smart output: Automatically creates a json-output directory with a summary file
- ⚡ High performance: Batch processing and memory optimization
- 🛡️ Error resilient: Skips unparsable statements and continues processing
- 📊 Progress tracking: Real-time progress and memory usage
- 🎯 CLI & Library: Can be used as both CLI tool and JavaScript library
📦 Installation
Use with npx (recommended)
npx sql-to-json-converter database.sql
Global installation
npm install -g sql-to-json-converter
sql-to-json database.sql
Local installation
npm install sql-to-json-converter
🚀 CLI Usage
Separate files mode (default)
npx sql-to-json-converter database.sql
npx sql-to-json-converter database.sql --output-dir my-tables
npx sql-to-json-converter database.sql --memory --batch-size 1000
Combined file mode
npx sql-to-json-converter database.sql --combined --output result.json
npx sql-to-json-converter database.sql --combined
Advanced options
npx sql-to-json-converter large-db.sql --memory --limit 100000
npx sql-to-json-converter database.sql --skip-unparsable
npx sql-to-json-converter database.sql --batch-size 2000
📚 Library Usage
Basic usage
const fs = require('fs');
const { convertSQLToJSONFiles, convertSQLToJSON } = require('sql-to-json-converter');
const sqlContent = fs.readFileSync('database.sql', 'utf8');
const result = convertSQLToJSONFiles(sqlContent, 'output-folder');
console.log(`Converted ${result.metadata.totalTables} tables`);
const combined = convertSQLToJSON(sqlContent);
console.log(combined.tables);
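If you only need the combined result written to a single file, a minimal sketch (result.json is just an example filename; it assumes the combined object is plain, JSON-serializable data, as shown in the output examples below):
const fs = require('fs');
const { convertSQLToJSON } = require('sql-to-json-converter');
const sqlContent = fs.readFileSync('database.sql', 'utf8');
const combined = convertSQLToJSON(sqlContent);
// Write the combined result as pretty-printed JSON
fs.writeFileSync('result.json', JSON.stringify(combined, null, 2));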
Advanced usage
const { SQLToJSONConverter } = require('sql-to-json-converter');
const converter = new SQLToJSONConverter({
  batchSize: 1000,
  showMemory: true,
  outputMode: 'separate',
  outputDir: 'my-json-data'
});
converter.processLargeSQL('huge-database.sql').then(() => {
  console.log('Conversion completed!');
});
API Reference
convertSQLToJSON(content, options)
convertSQLToJSONFiles(content, outputDir)
processLargeSQLFile(inputFile, outputFile)
createConverter(options)
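createConverter(options) is listed above but not demonstrated; a hedged sketch of how it is presumably used, assuming it simply returns a configured instance equivalent to new SQLToJSONConverter(options):
const { createConverter } = require('sql-to-json-converter');
// Assumption: createConverter(options) is a convenience wrapper that returns
// the same kind of converter as new SQLToJSONConverter(options).
const converter = createConverter({
  outputMode: 'separate',
  outputDir: 'json-output'
});
// processLargeSQL is the method shown in the advanced example above;
// it is assumed to return a Promise here as well.
converter.processLargeSQL('database.sql').then(() => {
  console.log('Conversion completed!');
});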
📝 Output Examples
Input SQL
CREATE TABLE users (
  id INT PRIMARY KEY AUTO_INCREMENT,
  name VARCHAR(255) NOT NULL,
  email VARCHAR(255) UNIQUE
);
INSERT INTO users VALUES (1, 'John Doe', 'john@example.com');
INSERT INTO users VALUES (2, 'Jane Smith', 'jane@example.com');
CREATE TABLE products (
  id INT PRIMARY KEY,
  name VARCHAR(100),
  price DECIMAL(10,2)
);
INSERT INTO products VALUES (1, 'Laptop', 999.99);
INSERT INTO products VALUES (2, 'Mouse', 25.50);
Separate Files Output (default)
json-output/
├── _summary.json # Overview of all tables
├── users.json # User table data
└── products.json # Product table data
users.json:
{
  "tableName": "users",
  "columns": [
    {"name": "id", "type": "INT PRIMARY KEY AUTO_INCREMENT"},
    {"name": "name", "type": "VARCHAR(255) NOT NULL"},
    {"name": "email", "type": "VARCHAR(255) UNIQUE"}
  ],
  "recordCount": 2,
  "generatedAt": "2024-01-20T10:30:00.000Z",
  "data": [
    {"id": 1, "name": "John Doe", "email": "john@example.com"},
    {"id": 2, "name": "Jane Smith", "email": "jane@example.com"}
  ]
}
_summary.json:
{
  "generatedAt": "2024-01-20T10:30:00.000Z",
  "totalTables": 2,
  "totalRecords": 4,
  "tables": [
    {"name": "users", "recordCount": 2, "fileName": "users.json"},
    {"name": "products", "recordCount": 2, "fileName": "products.json"}
  ]
}
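Because every generated file is plain JSON, the output can be consumed with nothing but Node built-ins; a small sketch that reads the summary and then each table file, following the separate-files layout shown above:
const fs = require('fs');
const path = require('path');
const outputDir = 'json-output';
// _summary.json lists every table and its file name (see the example above)
const summary = JSON.parse(fs.readFileSync(path.join(outputDir, '_summary.json'), 'utf8'));
for (const table of summary.tables) {
  const tableFile = JSON.parse(fs.readFileSync(path.join(outputDir, table.fileName), 'utf8'));
  console.log(`${table.name}: ${tableFile.recordCount} records`);
}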
🎯 CLI Options
| Option | Description | Default |
|---|---|---|
| --help, -h | Show help | |
| --version, -v | Show version | |
| --separate | Export separate files (default) | ✅ |
| --combined | Export combined file | |
| --output [file] | Output file for combined mode | |
| --output-dir [dir] | Output directory for separate mode | json-output |
| --memory, -m | Show memory usage | |
| --batch-size [num] | Batch size for processing | 500 |
| --limit [num] | Limit number of statements | |
| --skip-unparsable | Skip unparsable statements | |
🚀 Performance
File Size Guidelines
- < 10MB: In-memory processing
- > 10MB: Automatic stream processing
- > 100MB: Recommended to use the --memory flag
- > 1GB: Recommended to increase --batch-size to 2000+
Memory Optimization
npx sql-to-json-converter huge-db.sql \
--memory \
--batch-size 5000 \
--skip-unparsable \
--output-dir large-output
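The library-side equivalent of the command above, as a sketch using the constructor options documented in the Configuration Options section below:
const { SQLToJSONConverter } = require('sql-to-json-converter');
const converter = new SQLToJSONConverter({
  batchSize: 5000,           // matches --batch-size 5000
  showMemory: true,          // matches --memory
  skipUnparsable: true,      // matches --skip-unparsable
  outputMode: 'separate',
  outputDir: 'large-output'  // matches --output-dir large-output
});
converter.processLargeSQL('huge-db.sql').then(() => {
  console.log('Conversion completed!');
});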
📊 Supported SQL Statements
| Statement | Support | Notes |
|---|---|---|
| CREATE TABLE | ✅ Full | Table structure, columns, constraints |
| INSERT INTO | ✅ Full | Single and multiple value sets |
| VALUES | ✅ Full | Quoted strings, numbers, NULL |
| DROP TABLE | ✅ Skip | Ignored during processing |
| Comments | ✅ Full | -- line comments |
| Transactions | ✅ Basic | START TRANSACTION, COMMIT |
🛠 Development
Setup
git clone <repo-url>
cd sql-to-json-converter
npm install
Testing
echo "CREATE TABLE test (id INT); INSERT INTO test VALUES (1);" > test.sql
npm start test.sql
node -e "
const {convertSQLToJSONFiles} = require('./index');
const fs = require('fs');
const sql = fs.readFileSync('test.sql', 'utf8');
console.log(convertSQLToJSONFiles(sql));
"
Publishing
npm version patch|minor|major
npm publish
⚙️ Configuration Options
const options = {
  batchSize: 1000,        // statements processed per batch
  showMemory: true,       // log memory usage while converting
  limit: 50000,           // stop after this many statements
  skipUnparsable: true,   // skip statements that cannot be parsed
  outputMode: 'separate', // 'separate' or 'combined'
  outputDir: 'my-output'  // directory used in separate-files mode
};
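These options are also assumed to be accepted by convertSQLToJSON, per the convertSQLToJSON(content, options) signature in the API reference; a hedged sketch:
const fs = require('fs');
const { convertSQLToJSON } = require('sql-to-json-converter');
const sqlContent = fs.readFileSync('database.sql', 'utf8');
// Assumption: convertSQLToJSON accepts the same options object described above.
const combined = convertSQLToJSON(sqlContent, {
  limit: 50000,
  skipUnparsable: true
});
console.log(combined.tables);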
🐛 Troubleshooting
Common Issues
1. Memory errors with large files
npx sql-to-json-converter large-file.sql --batch-size 200 --memory
2. Unparsable statements
npx sql-to-json-converter problematic.sql --skip-unparsable
3. Too slow with very large files
npx sql-to-json-converter huge.sql --batch-size 2000 --skip-unparsable
📄 License
MIT License
🤝 Contributing
- Fork the repository
- Create feature branch (git checkout -b feature/amazing-feature)
- Commit changes (git commit -m 'Add amazing feature')
- Push to branch (git push origin feature/amazing-feature)
- Open Pull Request
📞 Support