AWS DB Migration
Short description
A project for migrating relatively small databases from a local machine to the cloud and vice versa.
Long description
A project that migrates your local MySQL database to an AWS cloud database and vice versa.
It runs both as an AWS Lambda function and on a local machine.
A migration is executed in three steps: creating a MySQL dump file, uploading it to S3, and restoring
the database from that file.
Migration from local to cloud works like this: a MySQL dump file is created from the
local database, the file is uploaded to S3, and a Lambda function (this project) is
invoked to restore the cloud database from the freshly uploaded S3 file.
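Under the hood, the local-to-cloud flow boils down to roughly the following sketch (the bucket, key and
function names are hypothetical placeholders; the library wraps all of this for you):

import subprocess
import boto3

# 1. Create a MySQL dump file of the local database.
with open('backup.sql', 'wb') as dump_file:
    subprocess.run(
        ['mysqldump', '--single-transaction', '-h', 'localhost', '-u', 'username', '-ppassword', 'database'],
        stdout=dump_file,
        check=True
    )

# 2. Upload the dump file to S3.
boto3.client('s3').upload_file('backup.sql', 'my-migration-bucket', 'backup.sql')

# 3. Invoke the Lambda function (this project) to restore the cloud database from the uploaded file.
boto3.client('lambda').invoke(FunctionName='aws-db-migration')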
Migration from cloud to local works like this:
a Lambda function (this project) is invoked to create a MySQL dump file from
the cloud database and upload it to S3. The most recent dump file is then downloaded and
the local database is restored from it.
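The cloud-to-local flow is the mirror image. Again, a rough sketch with placeholder names rather than
the project's actual code:

import subprocess
import boto3

# 1. Invoke the Lambda function (this project); it dumps the cloud database and uploads the dump to S3.
boto3.client('lambda').invoke(FunctionName='aws-db-migration')

# 2. Download the most recent dump file from S3.
boto3.client('s3').download_file('my-migration-bucket', 'backup.sql', 'backup.sql')

# 3. Restore the local database from the downloaded file.
with open('backup.sql', 'rb') as dump_file:
    subprocess.run(
        ['mysql', '-h', 'localhost', '-u', 'username', '-ppassword', 'database'],
        stdin=dump_file,
        check=True
    )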
Why a Lambda function? Why not migrate directly between the two databases? Well,
the answer is simple - of course you can! However, in that case your cloud database is
most likely deployed incorrectly. A cloud database should NOT be accessible to the
whole internet. It should be deployed in a private subnet with a strict security group
attached to it. That way, the only way to migrate between local and cloud is
through an additional AWS resource, e.g. Lambda, EC2, ECS, etc. This project
uses Lambda to keep things lightweight.
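For illustration, a strict setup might allow inbound MySQL traffic to the database only from the
security group attached to the Lambda function. A rough boto3 sketch with placeholder group IDs:

import boto3

ec2 = boto3.client('ec2')

# Allow MySQL (port 3306) traffic to the database's security group only from the
# Lambda function's security group. Both group IDs below are placeholders.
ec2.authorize_security_group_ingress(
    GroupId='sg-database-placeholder',
    IpPermissions=[{
        'IpProtocol': 'tcp',
        'FromPort': 3306,
        'ToPort': 3306,
        'UserIdGroupPairs': [{'GroupId': 'sg-lambda-placeholder'}]
    }]
)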
Prerequisites
Local prerequisites
- MySQL server installed.
- MySQL client installed.
- Database set up.
- This project installed with:
pip install aws-db-migration
or:
./install.sh
Cloud prerequisites
- MySQL set up in the AWS cloud (e.g. RDS).
- This project deployed as a Lambda function with a configured environment
(refer to the .env.example file).
Usage
Note that this project must be deployed as a Lambda function and have access to your cloud database.
Note that database credentials can be provided either with the DatabaseCredentials class
or through environment variables (refer to the .env.example file).
Migration to cloud
from aws_db_migration.run_local import RunLocal
from aws_db_migration.database_credentials import DatabaseCredentials
from aws_db_migration.aws_credentials import AwsCredentials

# Credentials for the local MySQL database that will be dumped.
db_credentials = DatabaseCredentials(
    username='username',
    password='password',
    database_name='database',
    host='localhost',
    port='3306'
)

aws_credentials = AwsCredentials()

# Dump the local database, upload the dump to S3 and trigger a restore of the cloud database.
RunLocal(aws_credentials, db_credentials).to_cloud()
Migration from cloud
from aws_db_migration.run_local import RunLocal
from aws_db_migration.database_credentials import DatabaseCredentials
from aws_db_migration.aws_credentials import AwsCredentials

# Credentials for the local MySQL database that will be restored.
db_credentials = DatabaseCredentials(
    username='username',
    password='password',
    database_name='database',
    host='localhost',
    port='3306'
)

aws_credentials = AwsCredentials()

# Invoke the Lambda to dump the cloud database, then download the dump and restore the local database.
RunLocal(aws_credentials, db_credentials).from_cloud()
Adding post/pre triggers
from aws_db_migration.run_local import RunLocal
from aws_db_migration.database_credentials import DatabaseCredentials
from aws_db_migration.aws_credentials import AwsCredentials

def f1():
    print('Pre-download!!!')

def f2():
    print('Post-download!!!')

runner = RunLocal(AwsCredentials(), DatabaseCredentials())

# Callables assigned to these attributes are invoked before and after the dump file is downloaded.
runner.pre_download = f1
runner.post_download = f2
Release history
3.1.1
Dump databases in a single transaction to avoid data corruption.
3.1.0
Add ability to select database revision.
3.0.3
Use absolute paths to call the amazon-linux mysql(dump) binaries.
3.0.2
Fix an environment bug.
3.0.1
Fix a critical bug when backing up a database.
3.0.0
Refactor the way AWS credentials and parameters are provided.
2.0.4
Add an example of how to add triggers.
2.0.3
README fixes.
2.0.2
Add README and HISTORY files.
2.0.1
Add parameter explanations in .env.example file.
2.0.0
Major project refactor and file renames.