Load data from on-prem SQL Server, PostgreSQL, or MySQL databases to Azure Storage as JSON or CSV.
DB2Azure is a Python package designed to streamline loading data from SQL Server (MSSQL), PostgreSQL, and MySQL databases into Azure Blob Storage in both JSON and CSV formats. It automates data extraction and upload through three loader modules, MSSQLLoader for SQL Server, PostgreLoader for PostgreSQL, and MySQLLoader for MySQL, each providing methods for executing SQL queries and transferring the results directly to Azure Blob Storage.
DB2Azure depends on the following packages:
- pyodbc
- psycopg
- pymysql
- azure-storage-blob
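These should normally be pulled in automatically when you install DB2Azure, but if you ever need them separately, a single command (an optional convenience, not part of the original instructions) covers all four:
pip install pyodbc psycopg pymysql azure-storage-blob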
To install the DB2Azure package, use the following pip command:
pip install DB2Azure
Alternatively, clone the repository and install it manually:
git clone https://github.com/mr-speedster/DB2Azure.git
cd DB2Azure
pip install .
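A quick way to confirm the installation worked is to import the three loader classes used throughout this README (a minimal sanity check, nothing more):
# Verify that all three loaders are importable from the db2azure module
from db2azure import MSSQLLoader, PostgreLoader, MySQLLoader
print("DB2Azure loaders imported successfully")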
To use the SQL Server loader, use the MSSQLLoader class in the db2azure module. The MSSQLLoader class allows you to execute SQL queries and upload the resulting data to Azure Blob Storage in either JSON or CSV format.
from db2azure import MSSQLLoader
# SQL Query
query = "SELECT [UserID],[FirstName],[LastName],[Email],[Age] FROM [SampleDB].[dbo].[Users]"
# SQL Server connection string
sql_conn = r"Driver=<driver>;Server=<server_name>;Database=<database>;Trusted_Connection=yes;"
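# For reference, a filled-in connection string might look like the following.
# These are hypothetical values; "ODBC Driver 17 for SQL Server" is a commonly
# installed pyodbc driver name, but check which drivers exist on your system.
# sql_conn = r"Driver={ODBC Driver 17 for SQL Server};Server=localhost;Database=SampleDB;Trusted_Connection=yes;"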
# Azure Blob Storage configurations
azure_config_json = {
    'container_name': "your_container",
    'folder_path': "your_folder",
    'file_name': "your_file.json",
    'azure_blob_url': "https://your_account_name.blob.core.windows.net",
    'sas_token': "your_sas_token"
}
azure_config_csv = {
    'container_name': "your_container",
    'folder_path': "your_folder",
    'file_name': "your_file.csv",
    'azure_blob_url': "https://your_account_name.blob.core.windows.net",
    'sas_token': "your_sas_token"
}
# Load to JSON
json_status = MSSQLLoader.load_to_json(query, sql_conn, azure_config_json)
print("JSON Upload Status:", json_status)
# Load to CSV
csv_status = MSSQLLoader.load_to_csv(query, sql_conn, azure_config_csv)
print("CSV Upload Status:", csv_status)
To use the PostgreSQL loader, use the PostgreLoader class in the db2azure module. The PostgreLoader class operates similarly to MSSQLLoader, but it works with PostgreSQL databases.
from db2azure import PostgreLoader
# PostgreSQL Query
query = "SELECT user_id, first_name, last_name, email, age FROM public.users;"
# PostgreSQL connection parameters
connection_params = {
    "host": "localhost",            # e.g., "localhost" or an IP address
    "port": "5432",                 # default PostgreSQL port
    "dbname": "SampleDB",           # name of the database
    "user": "postgres",             # PostgreSQL username
    "password": "<your_password>"   # PostgreSQL password
}
# Azure Blob Storage configurations
azure_config_json = {
    'container_name': "your_container",
    'folder_path': "your_folder",
    'file_name': "your_file.json",
    'azure_blob_url': "https://your_account_name.blob.core.windows.net",
    'sas_token': "your_sas_token"
}
azure_config_csv = {
    'container_name': "your_container",
    'folder_path': "your_folder",
    'file_name': "your_file.csv",
    'azure_blob_url': "https://your_account_name.blob.core.windows.net",
    'sas_token': "your_sas_token"
}
# Load to JSON
json_status = PostgreLoader.load_to_json(query, connection_params, azure_config_json)
print("JSON Upload Status:", json_status)
# Load to CSV
csv_status = PostgreLoader.load_to_csv(query, connection_params, azure_config_csv)
print("CSV Upload Status:", csv_status)
To use the MySQL loader, use the MySQLLoader class in the db2azure module. The MySQLLoader class works similarly to MSSQLLoader and PostgreLoader, but it is designed for MySQL databases.
from db2azure import MySQLLoader
# MySQL Query
query = "SELECT * FROM SampleDB.Users"
# MySQL connection parameters
mysql_conn = {
    "host": "localhost",    # e.g., "localhost" or an IP address
    "port": "3306",         # default MySQL port
    "database": "SampleDB", # name of the database
    "user": "*****",        # MySQL username
    "password": "*****"     # MySQL password
}
# Azure Blob Storage configurations
azure_config_json = {
    'container_name': "your_container",
    'folder_path': "your_folder",
    'file_name': "your_file.json",
    'azure_blob_url': "https://your_account_name.blob.core.windows.net",
    'sas_token': "your_sas_token"
}
azure_config_csv = {
    'container_name': "your_container",
    'folder_path': "your_folder",
    'file_name': "your_file.csv",
    'azure_blob_url': "https://your_account_name.blob.core.windows.net",
    'sas_token': "your_sas_token"
}
# Load to JSON
json_status = MySQLLoader.load_to_json(query, mysql_conn, azure_config_json)
print("JSON Upload Status:", json_status)
# Load to CSV
csv_status = MySQLLoader.load_to_csv(query, mysql_conn, azure_config_csv)
print("CSV Upload Status:", csv_status)
MSSQLLoader
- load_to_json(sql_query, connection_string, azure_config): Loads data from SQL Server to a JSON file in Azure Blob Storage.
- load_to_csv(sql_query, connection_string, azure_config): Loads data from SQL Server to a CSV file in Azure Blob Storage.
PostgreLoader
- load_to_json(sql_query, connection_params, azure_config): Loads data from PostgreSQL to a JSON file in Azure Blob Storage.
- load_to_csv(sql_query, connection_params, azure_config): Loads data from PostgreSQL to a CSV file in Azure Blob Storage.
MySQLLoader
- load_to_json(sql_query, connection_params, azure_config): Loads data from MySQL to a JSON file in Azure Blob Storage.
- load_to_csv(sql_query, connection_params, azure_config): Loads data from MySQL to a CSV file in Azure Blob Storage.
For each loader (SQL Server, PostgreSQL, MySQL), you will need to provide the following configuration:
- SQL Server: a connection_string parameter to configure the connection to your SQL Server.
- PostgreSQL: a connection_params dictionary to configure the connection to your PostgreSQL database.
- MySQL: a connection_params dictionary to configure the connection to your MySQL database.
- All loaders: an azure_config dictionary, containing container_name, folder_path, file_name, azure_blob_url, and sas_token, to specify where and how the data should be uploaded to Azure Blob Storage.
If any error occurs during the data extraction or upload process, the methods will return an error response containing:
- status: Always "error".
- message: The error message describing the issue.
Example error response:
{
    "status": "error",
    "message": "Connection failed: 'Your error message here'"
}
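In calling code, this can be handled by checking the status field of the returned dictionary. A minimal sketch, reusing the query, sql_conn, and azure_config_json variables from the SQL Server example above (and assuming, since it is not documented here, that successful calls also return a dictionary):
# Inspect the status dictionary returned by any of the loaders
result = MSSQLLoader.load_to_json(query, sql_conn, azure_config_json)
if result.get("status") == "error":
    # The documented error fields are "status" and "message"
    print("Upload failed:", result["message"])
else:
    print("Upload result:", result)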
DB2Azure is licensed under the MIT License. See the LICENSE file for more details.
We welcome contributions! Feel free to open an issue or submit a pull request for any improvements or bug fixes.