SQL2Azure is a Python package designed to simplify the process of loading data from an on-premises SQL Server database into Azure Blob Storage. With this package, users can execute SQL queries to retrieve data and upload it in two formats: JSON or CSV. Whether you’re looking to move data for backup, reporting, or integration purposes, SQL2Azure makes it easy to seamlessly transfer your database information to Azure.
SQL2Azure transfers data directly from an on-premises SQL Server database to an Azure Blob Storage container. The package provides two main functions for uploading data: load_to_json and load_to_csv.
The tool connects to your SQL Server using a connection string, executes a SQL query, fetches the data, and then uploads it to a specified Azure Blob Storage container in the format of your choice.
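Conceptually, the JSON path pairs each fetched row with its column names, serializes the result, and uploads it. The serialization step might look like the sketch below — the rows_to_json helper, the column list, and the sample rows are illustrative, not SQL2Azure's actual internals:

```python
import json

def rows_to_json(columns, rows):
    # Pair each row tuple with its column names to build a list of dicts,
    # then serialize; default=str handles dates and Decimals from the DB.
    records = [dict(zip(columns, row)) for row in rows]
    return json.dumps(records, default=str)

# Sample data standing in for a pyodbc cursor's results.
cols = ["UserID", "FirstName", "Age"]
rows = [(1, "Ada", 36), (2, "Grace", 45)]
print(rows_to_json(cols, rows))
# [{"UserID": 1, "FirstName": "Ada", "Age": 36}, {"UserID": 2, "FirstName": "Grace", "Age": 45}]
```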
Before you start using SQL2Azure, make sure you have the following prerequisites:
- pyodbc: For connecting to SQL Server and executing queries.
- azure-storage-blob: For interacting with Azure Blob Storage.

These dependencies are listed in the requirements.txt file.
To install them, use the following command:
pip install -r requirements.txt
You can install SQL2Azure by using pip or by cloning this repository.
pip install SQL2Azure
Clone the repository and install it using the following commands:
git clone https://github.com/yourusername/SQL2Azure.git
cd SQL2Azure
pip install .
Once the package is installed, you can start using it to load data from your SQL Server to Azure Blob Storage. The package has two main functions, one for loading data as JSON and another for CSV. Here’s how you can use them:
To load data from SQL Server to Azure Blob Storage in JSON format, use the load_to_json function. It accepts the following parameters:

- sql_query: The SQL query you want to run on the database.
- connection_string: The connection string to the SQL Server.
- container_name: The name of the Azure Blob Storage container where the file will be uploaded.
- folder_path: The folder path within the container where the file will be saved.
- file_name: The name of the blob (file) in the Azure container.
- azure_blob_url: The base URL for your Azure Blob Storage account.
- sas_token: The SAS token for accessing Azure Blob Storage.

from sql2azure.json_loader import load_to_json
# SQL Query
query = "SELECT [UserID],[FirstName],[LastName],[Email],[Age] FROM [SampleDB].[dbo].[Users]"
# SQL Server connection string
sql_conn = r"Driver={SQL Server};Server=<SQL_SERVER_HOST>;Database=<SQL_SERVER_DB>;Trusted_Connection=yes;"
# Azure Blob Storage parameters
container = "<AZURE_CONTAINER_NAME>"
folder_json = "/data/json/"
json_blob = "user.json"
azure_blob_url = "<AZURE_BLOB_URL>"
sas_token = "<SAS_TOKEN>"
# Load to JSON
json_status = load_to_json(query, sql_conn, container, folder_json, json_blob, azure_blob_url, sas_token)
print("JSON Upload Status:", json_status)
To load data in CSV format, use the load_to_csv function. It takes the same parameters as load_to_json, but the data will be saved in CSV format instead of JSON.
from sql2azure.csv_loader import load_to_csv
# SQL Query
query = "SELECT [UserID],[FirstName],[LastName],[Email],[Age] FROM [SampleDB].[dbo].[Users]"
# SQL Server connection string
sql_conn = r"Driver={SQL Server};Server=<SQL_SERVER_HOST>;Database=<SQL_SERVER_DB>;Trusted_Connection=yes;"
# Azure Blob Storage parameters
container = "<AZURE_CONTAINER_NAME>"
folder_csv = "/data/csv/"
csv_blob = "user.csv"
azure_blob_url = "<AZURE_BLOB_URL>"
sas_token = "<SAS_TOKEN>"
# Load to CSV
csv_status = load_to_csv(query, sql_conn, container, folder_csv, csv_blob, azure_blob_url, sas_token)
print("CSV Upload Status:", csv_status)
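Given these parameters, the destination of the upload is presumably the account's base URL joined with the container, folder path, and file name, with the SAS token appended as a query string. The helper below makes that assembly explicit — it is an illustration of how the pieces fit together, not SQL2Azure's actual code, and the sample account name and token are made up:

```python
def build_blob_url(azure_blob_url, container, folder_path, file_name, sas_token):
    # Join URL segments without doubling slashes, then append the SAS token.
    base = azure_blob_url.rstrip("/")
    path = "/".join(p.strip("/") for p in (container, folder_path, file_name) if p.strip("/"))
    return f"{base}/{path}?{sas_token.lstrip('?')}"

url = build_blob_url("https://myaccount.blob.core.windows.net",
                     "mycontainer", "/data/csv/", "user.csv", "sv=2024-01-01&sig=abc")
print(url)
# https://myaccount.blob.core.windows.net/mycontainer/data/csv/user.csv?sv=2024-01-01&sig=abc
```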
Here’s a simple demo of how to use both functions:
from sql2azure import json_loader
from sql2azure import csv_loader
# SQL Query
query = "SELECT [UserID],[FirstName],[LastName],[Email],[Age] FROM [SampleDB].[dbo].[Users]"
# SQL Server connection string
sql_conn = r"Driver={SQL Server};Server=<SQL_SERVER_HOST>;Database=<SQL_SERVER_DB>;Trusted_Connection=yes;"
# Azure Blob Storage parameters
container = "<AZURE_CONTAINER_NAME>"
folder_json = "/data/json/"
folder_csv = "/data/csv/"
json_blob = "user.json"
csv_blob = "user.csv"
azure_blob_url = "<AZURE_BLOB_URL>"
sas_token = "<SAS_TOKEN>"
# Load to JSON
json_status = json_loader.load_to_json(query, sql_conn, container, folder_json, json_blob, azure_blob_url, sas_token)
print("JSON Upload Status:", json_status)
# Load to CSV
csv_status = csv_loader.load_to_csv(query, sql_conn, container, folder_csv, csv_blob, azure_blob_url, sas_token)
print("CSV Upload Status:", csv_status)
The status returned from both functions reports the result of the operation. For a successful operation, it contains:

- status: "success"
- rows_uploaded: The number of rows uploaded to the Azure container.

If an error occurs, the status will be "error", and an error message will be included.
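Since both loaders return the same status structure, callers can branch on it uniformly. A minimal sketch, assuming the status is a dict with the keys described above (the "message" key name for the error text is a hypothetical, not confirmed by the package docs):

```python
def summarize_upload(status):
    # Turn the loader's status dict into a one-line summary for logging.
    if status.get("status") == "success":
        return f"Uploaded {status.get('rows_uploaded', 0)} rows"
    # The "message" key is an assumption; inspect the actual dict on error.
    return f"Upload failed: {status.get('message', 'unknown error')}"

print(summarize_upload({"status": "success", "rows_uploaded": 120}))
# Uploaded 120 rows
print(summarize_upload({"status": "error", "message": "invalid SAS token"}))
# Upload failed: invalid SAS token
```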
© 2024 Ajith D. All rights reserved.
This project is licensed under the MIT License - see the LICENSE file for details.