
# crdb-dump

A feature-rich CLI tool for exporting and importing CockroachDB schema definitions and data in SQL, JSON, YAML, or chunked CSV formats. It supports parallel chunked exports, manifest checksums, BYTES/UUID/ARRAY types, permission introspection, secure resumable imports, S3-compatible storage (MinIO, Cohesity), region-aware filtering, and automatic retry logic.
## Features

- Schema export in `sql`, `json`, or `yaml` format
- `COPY`-based imports with chunk-level tracking (`--parallel-load`) and manifest verification
- Schema diffing against a previous export (`--diff`)
- Logging to `logs/crdb_dump.log`
- Resumable imports via `--resume-log` or `--resume-log-dir`
- Region-aware filtering (`--region`)
- S3-compatible storage (`--use-s3`) with MinIO, Cohesity, or AWS (via `boto3`)
- CSV validation against the live schema (`--validate-csv`)

## Installation

```sh
pip install crdb-dump
```
## Local Testing

```sh
./test-local.sh
```

This script will, among other steps, use `boto3` to create the S3 buckets needed for the test run.

## Usage

```sh
crdb-dump --help
crdb-dump export --help
crdb-dump load --help
```

Example usage:

```sh
crdb-dump export --db=mydb --data --per-table
crdb-dump load --db=mydb --schema=... --data-dir=... --resume-log=resume.json
```
export CRDB_URL="cockroachdb://root@localhost:26257/defaultdb?sslmode=disable"
# or
export CRDB_URL="postgresql://root@localhost:26257/defaultdb?sslmode=disable"
Alternatively:
--db mydb --host localhost --certs-dir ~/certs
Use --print-connection
to verify resolved URL.
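For example, a quick way to confirm what the flags resolve to (a sketch, assuming the connection flags are accepted alongside the `export` subcommand):

```sh
# Print the resolved connection URL before running a real export
crdb-dump export --db mydb --host localhost --certs-dir ~/certs --print-connection
```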
## Schema Export

```sh
crdb-dump export \
  --db=mydb \
  --per-table \
  --data \
  --data-format=csv \
  --chunk-size=1000 \
  --data-order=id \
  --data-compress \
  --data-parallel \
  --verify \
  --include-permissions \
  --archive
```
### Schema Export Options

| Option | Description |
|---|---|
| `--per-table` | One file per object (e.g., `table_users.sql`) |
| `--format` | Output format: `sql`, `json`, or `yaml` |
| `--diff` | Show schema diff vs. a previous `.sql` file |
| `--tables` | Comma-separated fully qualified names to include |
| `--exclude-tables` | Skip specific fully qualified table names |
| `--include-permissions` | Export roles, grants, and memberships |
| `--region` | Only export tables matching this region |
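For instance, the options above can be combined to export just a subset of tables as JSON, grants included (a sketch; the table names are placeholders):

```sh
crdb-dump export \
  --db=mydb \
  --format=json \
  --tables=mydb.public.users,mydb.public.orders \
  --include-permissions
```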
### Data Export Options

| Option | Description |
|---|---|
| `--data` | Enable data export |
| `--data-format` | Format: `csv` or `sql` |
| `--chunk-size` | Number of rows per chunk |
| `--data-split` | Output one file per table |
| `--data-compress` | Output `.csv.gz` |
| `--data-order` | Order rows by column(s) |
| `--data-order-desc` | Use descending order |
| `--data-parallel` | Parallel export across tables |
| `--verify` | Verify chunk checksums |
| `--region` | Filter tables by region in manifests |
| `--use-s3` | Upload exported chunks to S3 |
| `--s3-bucket` | S3 bucket name |
| `--s3-prefix` | Key prefix under which to store chunks |
| `--s3-endpoint` | S3-compatible endpoint URL |
| `--s3-access-key` | S3 access key (can use env) |
| `--s3-secret-key` | S3 secret key (can use env) |
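As a lighter-weight variant of the CSV export shown earlier, data can also be emitted as SQL with one file per table (a sketch built from the documented flags):

```sh
crdb-dump export --db=mydb --data --data-format=sql --data-split
```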
## Data Import

```sh
crdb-dump load \
  --db=mydb \
  --schema=crdb_dump_output/mydb/mydb_schema.sql \
  --data-dir=crdb_dump_output/mydb \
  --resume-log=resume.json \
  --validate-csv \
  --parallel-load \
  --print-connection
```
### Load Options

| Option | Description |
|---|---|
| `--schema` | `.sql` file to apply |
| `--data-dir` | Folder containing chunked CSV + manifests |
| `--resume-log` | Track loaded chunks in a single JSON file |
| `--resume-log-dir` | Per-table resume logs (e.g., `resume/users.json`) |
| `--validate-csv` | Ensure chunk headers match the DB schema |
| `--parallel-load` | Load chunks in parallel |
| `--region` | Only import chunks from matching region |
| `--dry-run` | Print actions but don't execute |
| `--use-s3` | Download chunks from S3 |
| `--s3-bucket` | S3 bucket name |
| `--s3-prefix` | Path prefix inside the bucket |
| `--s3-endpoint` | S3-compatible endpoint (MinIO, Cohesity) |
| `--s3-access-key` | S3 access key |
| `--s3-secret-key` | S3 secret key |
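For example, `--dry-run` pairs well with region filtering to preview exactly which chunks would be imported (a sketch; the region name is a placeholder):

```sh
crdb-dump load \
  --db=mydb \
  --data-dir=crdb_dump_output/mydb \
  --region=us-east1 \
  --dry-run
```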
## Reliability

✅ Retries failed operations with exponential backoff

✅ Resumable imports:

- `--resume-log` (single file)
- `--resume-log-dir` (per-table)
- `--resume-strict` (abort on failure)

Resume state is written after each successful chunk, so restarts are safe and idempotent.
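In practice, that means an interrupted load can simply be re-run with the same arguments (a sketch; chunks already recorded in `resume.json` are skipped on the second run):

```sh
# First run is interrupted partway through; completed chunks are recorded in resume.json.
crdb-dump load --db=mydb --data-dir=crdb_dump_output/mydb --resume-log=resume.json

# Re-running the identical command skips recorded chunks; add --resume-strict
# to abort on the first failed chunk instead of continuing.
crdb-dump load --db=mydb --data-dir=crdb_dump_output/mydb --resume-log=resume.json --resume-strict
```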
## S3 Round Trip (MinIO)

Export to MinIO:

```sh
crdb-dump export \
  --db=mydb \
  --per-table \
  --data \
  --chunk-size=1000 \
  --data-format=csv \
  --use-s3 \
  --s3-bucket=crdb-test-bucket \
  --s3-endpoint=http://localhost:9000 \
  --s3-access-key=minioadmin \
  --s3-secret-key=minioadmin \
  --s3-prefix=test1/ \
  --out-dir=crdb_dump_output
```
Load from MinIO:

```sh
crdb-dump load \
  --db=mydb \
  --data-dir=crdb_dump_output/mydb \
  --resume-log-dir=resume/ \
  --parallel-load \
  --validate-csv \
  --use-s3 \
  --s3-bucket=crdb-test-bucket \
  --s3-endpoint=http://localhost:9000 \
  --s3-access-key=minioadmin \
  --s3-secret-key=minioadmin \
  --s3-prefix=test1/
```
## Schema Diff

```sh
crdb-dump export --db=mydb --diff=old_schema.sql
```

Output:

```
crdb_dump_output/mydb/mydb_schema.diff
```
## Testing

```sh
pytest -m unit
pytest -m integration
./test-local.sh
```

## Contributing

Pull requests welcome! Star ⭐ the repo, file issues, or request features.