
google-datacatalog-mysql-connector
Library for ingesting MySQL metadata into Google Cloud Data Catalog.
Disclaimer: This is not an officially supported Google product.
Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.
With virtualenv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies. Make sure you use Python 3.6+.
Mac/Linux:

```bash
pip3 install virtualenv
virtualenv --python python3.6 <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install google-datacatalog-mysql-connector
```
Windows:

```bash
pip3 install virtualenv
virtualenv --python python3.6 <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install google-datacatalog-mysql-connector
```
To install from source, clone the repository and install from the connector directory:

```bash
git clone https://github.com/GoogleCloudPlatform/datacatalog-connectors-rdbms/
cd datacatalog-connectors-rdbms/google-datacatalog-mysql-connector
pip3 install virtualenv
virtualenv --python python3.6 <your-env>
source <your-env>/bin/activate
pip install .
```
Create a Google Cloud service account with Data Catalog access and save its JSON key as:

```
<YOUR-CREDENTIALS_FILES_FOLDER>/mysql2dc-credentials.json
```

This folder and file will be required in the next steps.
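As an illustration, such a key could be produced with the gcloud CLI along the lines below; the service account name and the roles/datacatalog.admin binding are assumptions for the sketch, so grant whatever role your organization's policy actually requires.

```bash
# Illustrative sketch (assumes an authenticated gcloud CLI). The account
# name and role binding are examples, not requirements of this library.
gcloud iam service-accounts create mysql2dc-sa \
  --project=google_cloud_project_id \
  --display-name="mysql2dc connector"
gcloud projects add-iam-policy-binding google_cloud_project_id \
  --member="serviceAccount:mysql2dc-sa@google_cloud_project_id.iam.gserviceaccount.com" \
  --role="roles/datacatalog.admin"
gcloud iam service-accounts keys create \
  "<YOUR-CREDENTIALS_FILES_FOLDER>/mysql2dc-credentials.json" \
  --iam-account="mysql2dc-sa@google_cloud_project_id.iam.gserviceaccount.com"
```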
Replace the values below according to your environment:

```bash
export GOOGLE_APPLICATION_CREDENTIALS=data_catalog_credentials_file

export MYSQL2DC_DATACATALOG_PROJECT_ID=google_cloud_project_id
export MYSQL2DC_DATACATALOG_LOCATION_ID=google_cloud_location_id
export MYSQL2DC_MYSQL_SERVER=mysql_server
export MYSQL2DC_MYSQL_USERNAME=mysql_username
export MYSQL2DC_MYSQL_PASSWORD=mysql_password
export MYSQL2DC_MYSQL_DATABASE=mysql_database
# If supplied, the CSV is used and the MySQL server credentials are ignored.
export MYSQL2DC_RAW_METADATA_CSV=mysql_raw_csv
```
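If you run the connector repeatedly, a common shell convenience is to keep these exports in a file and source it once per session; this is generic shell practice, not something the connector requires, and the filename below is arbitrary.

```bash
# Generic convenience, not a connector requirement: keep the exports above
# in a file (name is arbitrary) and load them into the current shell.
source ./mysql2dc.env
```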
Along with default metadata, the connector can also ingest optional metadata, such as the number of rows in each table. The table below shows which metadata is scraped by default and which is configurable.
| Metadata | Description | Scraped by default | Config option |
| --- | --- | --- | --- |
| database_name | Name of a database | Y | --- |
| table_name | Name of a table | Y | --- |
| table_type | Type of a table (BASE, VIEW, etc.) | Y | --- |
| create_time | When the table was created | Y | --- |
| update_time | When the table was updated | Y | --- |
| table_size_mb | Size of a table, in MB | Y | --- |
| column_name | Name of a column | Y | --- |
| column_type | Column data type | Y | --- |
| column_default_value | Default value of a column | Y | --- |
| column_nullable | Whether a column is nullable | Y | --- |
| column_char_length | Character length of values in a column | Y | --- |
| column_numeric_precision | Numeric precision of values in a column | Y | --- |
| ANALYZE TABLE statement | Statement to refresh metadata information | N | refresh_metadata_tables |
| table_rows | Number of rows in a table | N | sync_row_counts |
The sample configuration file ingest_cfg.yaml in the repository root shows the expected layout. If you want to run optional queries, add ingest_cfg.yaml to the directory from which you execute the connector and adapt it to your needs.
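To illustrate, a configuration enabling both optional behaviors from the table above could be written as below; only the two option names come from this document, so treat the exact file layout as an assumption to verify against the sample ingest_cfg.yaml in the repository root.

```bash
# Illustrative sketch only: the option names come from the table above,
# but verify the exact layout against the sample ingest_cfg.yaml shipped
# in the repository root.
cat > ingest_cfg.yaml <<'EOF'
refresh_metadata_tables: true
sync_row_counts: true
EOF
```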
When running the ANALYZE TABLE statement, the connector credentials need the INSERT privilege on the tables being analyzed; otherwise you will receive the following error:

```
mysql.connector.errors.ProgrammingError: 1142 (42000): INSERT command denied to user
'read-only'@'{HOST}' for table '{TABLE_NAME}'
```

If the connector user must keep read-only access, make sure the refresh_metadata_tables flag is disabled.
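As an illustration, a database administrator could grant the required privilege along these lines; the user, host, and database names are placeholders matching the environment setup above, not values the connector prescribes.

```bash
# Illustrative sketch: MySQL's ANALYZE TABLE requires SELECT and INSERT
# privileges on the analyzed tables. Placeholders follow the naming used
# in the environment setup above.
mysql --host=mysql_server --user=admin_user -p \
  -e "GRANT SELECT, INSERT ON mysql_database.* TO 'mysql_username'@'%';"
```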
Run the connector:

```bash
google-datacatalog-mysql-connector \
  --datacatalog-project-id=$MYSQL2DC_DATACATALOG_PROJECT_ID \
  --datacatalog-location-id=$MYSQL2DC_DATACATALOG_LOCATION_ID \
  --mysql-host=$MYSQL2DC_MYSQL_SERVER \
  --mysql-user=$MYSQL2DC_MYSQL_USERNAME \
  --mysql-pass=$MYSQL2DC_MYSQL_PASSWORD \
  --mysql-database=$MYSQL2DC_MYSQL_DATABASE \
  --raw-metadata-csv=$MYSQL2DC_RAW_METADATA_CSV
```
The --datacatalog-entry-resource-url-prefix option is useful when the connector cannot accurately determine the database hostname, for example when running behind proxies, load balancers, or database read replicas. In that case you can specify the prefix of your master instance so that the resource URL points to the exact database where the data is stored.
```bash
google-datacatalog-mysql-connector \
  --datacatalog-project-id=$MYSQL2DC_DATACATALOG_PROJECT_ID \
  --datacatalog-location-id=$MYSQL2DC_DATACATALOG_LOCATION_ID \
  --datacatalog-entry-resource-url-prefix project/database-instance \
  --mysql-host=$MYSQL2DC_MYSQL_SERVER \
  --mysql-user=$MYSQL2DC_MYSQL_USERNAME \
  --mysql-pass=$MYSQL2DC_MYSQL_PASSWORD \
  --mysql-database=$MYSQL2DC_MYSQL_DATABASE \
  --raw-metadata-csv=$MYSQL2DC_RAW_METADATA_CSV
```
Alternatively, build the container image and run the connector through Docker, mounting your credentials folder into the container at /data:

```bash
docker build -t mysql2datacatalog .
docker run --rm --tty -v YOUR-CREDENTIALS_FILES_FOLDER:/data mysql2datacatalog \
  --datacatalog-project-id=$MYSQL2DC_DATACATALOG_PROJECT_ID \
  --datacatalog-location-id=$MYSQL2DC_DATACATALOG_LOCATION_ID \
  --mysql-host=$MYSQL2DC_MYSQL_SERVER \
  --mysql-user=$MYSQL2DC_MYSQL_USERNAME \
  --mysql-pass=$MYSQL2DC_MYSQL_PASSWORD \
  --mysql-database=$MYSQL2DC_MYSQL_DATABASE \
  --raw-metadata-csv=$MYSQL2DC_RAW_METADATA_CSV
```
A cleanup script is available under tools to remove the Data Catalog entries created by the connector:

```bash
# List of projects split by comma. Can be a single value without a comma.
export MYSQL2DC_DATACATALOG_PROJECT_IDS=my-project-1,my-project-2

# Run the clean-up.
python tools/cleanup_datacatalog.py --datacatalog-project-ids=$MYSQL2DC_DATACATALOG_PROJECT_IDS
```
Format the code with yapf, and optionally install it as a pre-commit hook:

```bash
pip install --upgrade yapf

# Auto-update files.
yapf --in-place --recursive src tests

# Show diff.
yapf --diff --recursive src tests

# Set up the pre-commit hook, from the root of your git project.
curl -o pre-commit.sh https://raw.githubusercontent.com/google/yapf/master/plugins/pre-commit.sh
chmod a+x pre-commit.sh
mv pre-commit.sh .git/hooks/pre-commit
```
Lint with flake8:

```bash
pip install --upgrade flake8
flake8 src tests
```
Run the tests:

```bash
python setup.py test
```
If a connector execution hits a Data Catalog quota limit, an error is raised and logged with the following detail, depending on the performed operation (READ/WRITE/SEARCH):

```
status = StatusCode.RESOURCE_EXHAUSTED
details = "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'."
debug_error_string =
"{"created":"@1587396969.506556000", "description":"Error received from peer ipv4:172.217.29.42:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'.","grpc_status":8}"
```

For more information about Data Catalog quotas, see the Data Catalog quota documentation.
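Because the limits in the error above are per-minute, a blunt but workable mitigation is to wait and rerun. The loop below is a generic shell sketch of that idea, not a feature of the connector; the retry count and sleep interval are arbitrary.

```bash
# Generic workaround sketch: rerun the connector after a pause when a run
# aborts (e.g., on RESOURCE_EXHAUSTED). Not a connector feature.
for attempt in 1 2 3; do
  if google-datacatalog-mysql-connector \
      --datacatalog-project-id=$MYSQL2DC_DATACATALOG_PROJECT_ID \
      --datacatalog-location-id=$MYSQL2DC_DATACATALOG_LOCATION_ID \
      --mysql-host=$MYSQL2DC_MYSQL_SERVER \
      --mysql-user=$MYSQL2DC_MYSQL_USERNAME \
      --mysql-pass=$MYSQL2DC_MYSQL_PASSWORD \
      --mysql-database=$MYSQL2DC_MYSQL_DATABASE; then
    break  # run finished successfully
  fi
  echo "Attempt $attempt failed; waiting 60s for per-minute quota to reset..." >&2
  sleep 60
done
```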