
aws-db-migration is a migration management project that runs both locally and as an AWS Lambda function. It aims to make migrating relatively small MySQL databases between a local machine and the AWS cloud easy, in either direction.
A migration is executed in three steps: create a MySQL dump file, upload it to S3, and restore the target database from that file.

Migration from local to cloud: a MySQL dump file is created from the local database and uploaded to S3; the Lambda function (this project) is then invoked to restore the cloud database from the most recently uploaded S3 file.

Migration from cloud to local: the Lambda function (this project) is invoked to create a MySQL dump file from the cloud database and upload it to S3; the most recent dump file is then downloaded and the local database is restored from it.
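Conceptually, the three steps boil down to shelling out to the MySQL client tools and the AWS CLI. The sketch below only builds the commands such a flow would run; the helper names and flags are illustrative assumptions, not the project's actual internals.

```python
# Illustrative sketch of the three migration steps as shell commands.
# These helpers are assumptions for explanation only, not the
# project's real implementation.
import shlex


def dump_command(user: str, database: str, dump_file: str) -> str:
    # Step 1: create a MySQL dump file (--single-transaction gives a
    # consistent snapshot without locking the whole database).
    return (f"mysqldump --single-transaction -u {shlex.quote(user)} -p "
            f"{shlex.quote(database)} > {shlex.quote(dump_file)}")


def upload_command(dump_file: str, bucket: str) -> str:
    # Step 2: upload the dump file to S3.
    return f"aws s3 cp {shlex.quote(dump_file)} s3://{bucket}/{shlex.quote(dump_file)}"


def restore_command(user: str, database: str, dump_file: str) -> str:
    # Step 3: restore the target database from the dump file.
    return (f"mysql -u {shlex.quote(user)} -p {shlex.quote(database)} "
            f"< {shlex.quote(dump_file)}")


print(dump_command("root", "mydb", "mydb.sql"))
print(upload_command("mydb.sql", "my-migration-bucket"))
print(restore_command("root", "mydb", "mydb.sql"))
```

The direction of the migration only changes where steps 1 and 3 run: locally or inside the Lambda function.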
Why a Lambda function? Why not migrate directly between the two databases? The answer is simple: of course you can! However, if a direct connection is possible, your cloud database is most likely deployed incorrectly. A cloud database should NOT be accessible from the whole internet; it should live in a private subnet with a strict security group attached. In that setup, the only way to migrate between local and cloud is through an additional AWS resource inside the VPC, e.g. Lambda, EC2, ECS, etc. This project uses Lambda to keep things lightweight.
pip install aws-db-migration
or:
./install.sh
Note that this project must be deployed as a Lambda function with network access to your cloud database.

Note that database credentials can be provided either with the DatabaseCredentials class or through environment variables (refer to the .env.example file).
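As a sketch of the environment-variable route: the actual variable names are listed in .env.example; the names used below are illustrative assumptions, not the library's real keys.

```python
# Hypothetical sketch: the real variable names are defined in
# .env.example; MYSQL_* below are assumed names for illustration.
import os

os.environ["MYSQL_USERNAME"] = "username"   # assumed name
os.environ["MYSQL_PASSWORD"] = "password"   # assumed name
os.environ["MYSQL_DATABASE"] = "database"   # assumed name
os.environ["MYSQL_HOST"] = "localhost"      # assumed name
os.environ["MYSQL_PORT"] = "3306"           # assumed name

# With the environment populated, the credentials class could then be
# constructed without explicit arguments:
# db_credentials = DatabaseCredentials()
print(os.environ["MYSQL_HOST"])
```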
# Migrate a local database to the cloud.
from aws_db_migration.run_local import RunLocal
from aws_db_migration.database_credentials import DatabaseCredentials
from aws_db_migration.aws_credentials import AwsCredentials

# Credentials for the local MySQL database.
db_credentials = DatabaseCredentials(
    username='username',
    password='password',
    database_name='database',
    host='localhost',
    port='3306'
)

# Called with no arguments, credentials are presumably resolved from
# the environment (see .env.example).
aws_credentials = AwsCredentials()

# Dump the local database, upload the dump file to S3 and invoke the
# Lambda function to restore the cloud database from it.
RunLocal(aws_credentials, db_credentials).to_cloud()
# Migrate a cloud database to the local machine.
from aws_db_migration.run_local import RunLocal
from aws_db_migration.database_credentials import DatabaseCredentials
from aws_db_migration.aws_credentials import AwsCredentials

# Credentials for the local MySQL database to restore into.
db_credentials = DatabaseCredentials(
    username='username',
    password='password',
    database_name='database',
    host='localhost',
    port='3306'
)

# Called with no arguments, credentials are presumably resolved from
# the environment (see .env.example).
aws_credentials = AwsCredentials()

# Invoke the Lambda function to dump the cloud database to S3, then
# download the dump file and restore the local database from it.
RunLocal(aws_credentials, db_credentials).from_cloud()
# Register optional callbacks around the dump file download.
from aws_db_migration.run_local import RunLocal
from aws_db_migration.database_credentials import DatabaseCredentials
from aws_db_migration.aws_credentials import AwsCredentials

def f1():
    print('Pre-download!!!')

def f2():
    print('Post-download!!!')

# With no arguments, credentials fall back to environment variables
# (see .env.example).
runner = RunLocal(AwsCredentials(), DatabaseCredentials())

# The hooks fire just before and just after the dump file is
# downloaded from S3.
runner.pre_download = f1
runner.post_download = f2
Dump databases in single transactions to avoid data corruption.
Add the ability to select a database revision.
Use absolute paths to call the amazon-linux mysql/mysqldump binaries.
Fix environment bug.
Fix critical bug when backing up a database.
Refactor the way AWS credentials and parameters are provided.
Add an example of how to add triggers.
README fixes.
Add README and HISTORY files.
Add parameter explanations in the .env.example file.
Major project refactor and file renames.