A package for database session management using sqlalchemy and snowpark libraries
This Python package simplifies the management of database sessions using SQLAlchemy and Snowpark. It allows users to create a session with Snowflake, PostgreSQL, and other supported databases using a connection string or environment variables.

This README provides comprehensive instructions for installing, setting up, and using the sqlalchemy-snowpark module, so that users can easily establish connections to their data warehouses and use the built-in methods to query them and manage results.
You can install the module using pip:
pip install sqlalchemy-snowpark
### Snowflake

This Python module helps establish a database session to Snowflake using SQLAlchemy or the Snowpark Python connector. It supports creating connections via a provided connection string or by using environment variables for credentials.
Requires DB_SOURCE, USERNAME, HOST, PASSWORD, ROLE, WAREHOUSE, and DATABASE.
If you want to create a session using SQLAlchemy, set the following environment variable:

export DB_ENGINE=sqlalchemy

If you want to create a Snowpark session, set the following environment variable:

export DB_ENGINE=snowpark
If you have a connection string, you can create a session like this:

from sqlalchemy_snowpark.connection import get_db_session

connection_string = "snowflake://user:password@account/database/schema?warehouse=warehouse&role=role"
session = get_db_session(connection_string)
session.close()
Environment Variables

The following environment variables are required if no connection string is provided:
Note: For a Snowpark session (DB_ENGINE=snowpark), only this environment-variable option will work.
export SNOWFLAKE_USER={snowflake_username}
export SNOWFLAKE_PASSWORD={snowflake_password}
export SNOWFLAKE_ACCOUNT={snowflake_account}
export SNOWFLAKE_DATABASE={snowflake_database}
export SNOWFLAKE_SCHEMA={snowflake_schema}
export SNOWFLAKE_WAREHOUSE={snowflake_warehouse}
export SNOWFLAKE_ROLE={snowflake_role}
export SNOWFLAKE_HOST={snowflake_host}
from sqlalchemy_snowpark.connection import get_db_session
session = get_db_session()
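Putting the pieces together, here is a minimal end-to-end sketch that sets the credentials from Python instead of the shell and runs a quick test query. The credential values are placeholders, and setting them via os.environ assumes the variables are read when get_db_session is called:

```python
import os

from sqlalchemy_snowpark.connection import get_db_session

# Placeholder credentials; normally these come from `export ...` in the shell.
os.environ["DB_ENGINE"] = "sqlalchemy"  # or "snowpark"
os.environ["SNOWFLAKE_USER"] = "my_user"
os.environ["SNOWFLAKE_PASSWORD"] = "my_password"
os.environ["SNOWFLAKE_ACCOUNT"] = "my_account"
os.environ["SNOWFLAKE_DATABASE"] = "my_database"
os.environ["SNOWFLAKE_SCHEMA"] = "my_schema"
os.environ["SNOWFLAKE_WAREHOUSE"] = "my_warehouse"
os.environ["SNOWFLAKE_ROLE"] = "my_role"
os.environ["SNOWFLAKE_HOST"] = "my_account.snowflakecomputing.com"

session = get_db_session()
# Simple connectivity check using the fetchone helper documented below.
print(session.execute("SELECT CURRENT_VERSION()").fetchone())
session.close()
```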
If a network policy is activated in the Snowflake account and incoming IPs are not allowed or are restricted, the Step Function IP needs to be whitelisted. Follow the steps below:
1. Navigate to the Admin -> Security section by clicking on "Admin" in the left navigation panel.
2. Switch to Network Rules and create a new rule by clicking the + Network Rule button.
   a. Name: SFN_RULE
   b. Choose Type: IPv4 and Mode: Ingress
   c. Under Identifiers -> Add IP, enter 18.210.244.167
3. Switch to Network Policy and create a new policy by clicking the + Network Policy button.
   a. Name: SFN_POLICY
   b. Under the Allowed section, in the Select Rule dropdown, select SFN_RULE, then click the Create button to create the policy.
   c. Click the dotted icon (...) at the end of the policy name and click Activate to activate the policy.
4. Navigate back to the worksheet and replace the placeholder with the Step Functions public IP address.
ALTER NETWORK POLICY SFN_POLICY SET ALLOWED_IP_LIST=('18.210.244.167')
### Redshift

Requires USERNAME, HOST, PASSWORD, and DATABASE.
### Direct Connection (Redshift in Public Subnet)
from sqlalchemy_snowpark.connection import get_db_session
from sqlalchemy.engine.url import URL
# Define the connection parameters
redshift_connection_string = URL.create(
drivername="redshift+redshift_connector", # The driver to use
username="your_username", # Your Redshift username
password="your_password", # Your Redshift password
host="your_redshift_cluster_host", # Redshift cluster endpoint
port=5439, # Default port for Redshift
database="your_database_name", # The name of your Redshift database
query={"sslmode": "verify-ca"} # Optional: to ensure the connection is encrypted
)
session = get_db_session(redshift_connection_string)
session.close()
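If you prefer not to build a URL object, a plain SQLAlchemy-style connection string should also work, since get_db_session accepts string connection URLs elsewhere in this README. This is a minimal sketch under that assumption, reusing the same driver name:

```python
from sqlalchemy_snowpark.connection import get_db_session

# Sketch: plain connection-string form of the Redshift connection above.
# Assumes get_db_session accepts a string URL, as in the Snowflake example.
redshift_connection_string = (
    "redshift+redshift_connector://your_username:your_password"
    "@your_redshift_cluster_host:5439/your_database_name"
)

session = get_db_session(redshift_connection_string)
session.close()
```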
Environment Variables

The following environment variables are required if no connection string is provided:
export REDSHIFT_USERNAME={redshift_username}
export REDSHIFT_PASSWORD={redshift_password}
export REDSHIFT_HOST={redshift_host}
export REDSHIFT_DATABASE={redshift_database}
from sqlalchemy_snowpark.connection import get_db_session
session = get_db_session()
### PostgreSQL

Requires USERNAME, HOST, PASSWORD, and DATABASE.
from sqlalchemy_snowpark.connection import get_db_session

postgresql_connection_string = f"postgresql+psycopg2://{username}:{password}@{host}:5432/{database}"
session = get_db_session(postgresql_connection_string)
session.close()
Environment Variables

The following environment variables are required if no connection string is provided:
export POSTGRESQL_USERNAME={postgresql_username}
export POSTGRESQL_PASSWORD={postgresql_password}
export POSTGRESQL_HOST={postgresql_host}
export POSTGRESQL_DATABASE={postgresql_database}
from sqlalchemy_snowpark.connection import get_db_session
session = get_db_session()
Once the session is established, you can interact with your data warehouse using most of SQLAlchemy's ORM capabilities.
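Several of the helper methods below take a model_class argument. As a minimal sketch, assuming standard SQLAlchemy declarative models are accepted, here is a hypothetical model used in the later examples (the table and column names are made up for illustration):

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

# Hypothetical model for illustration; replace with your own table definition.
class Employee(Base):
    __tablename__ = "employee"

    id = Column(Integer, primary_key=True)
    name = Column(String)
    department = Column(String)
```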
1. fetchone(): Fetches a single record from the result set.
Usage
result = db_session.execute(query).fetchone()
# OR
result = db_session.execute(query).fetchone(get_obj=True)
2. fetchmany(): Fetches the specified number of records from the result set.
Usage
result = db_session.execute(query).fetchmany(count=10, get_obj=False)
# OR
result = db_session.execute(query).fetchmany(count=10, get_obj=True)
3. fetchall(): Fetches all records from the result set.
Usage
result = db_session.execute(query).fetchall()
# OR
result = db_session.execute(query).fetchall(get_obj=True)
4. mappings_one(): Fetches a single record as a dictionary-like mapping.
Usage
result = db_session.execute(query).mappings_one()
5. mappings_all(): Fetches all records as dictionary-like mappings.
Usage
result = db_session.execute(query).mappings_all()
6. close(): Closes the database session.
Usage
db_session.close()
7. execute(): Executes a raw SQL query.
Usage
db_session.execute({query}).fetchone()
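For example, a short sketch tying execute to the fetch helpers above (the table name and SQL are illustrative assumptions, not part of the package):

```python
# Hypothetical query; replace with your own SQL.
query = "SELECT name, department FROM employee LIMIT 5"

rows = db_session.execute(query).fetchall()        # list of rows
mapped = db_session.execute(query).mappings_all()  # list of dict-like rows
```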
8. commit(): Commits the transaction.
Usage
db_session.commit()
9. rollback(): Rolls back the current transaction.
Usage
db_session.rollback()
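As a minimal sketch of how commit and rollback are typically combined (the UPDATE statement and table name are illustrative assumptions):

```python
from sqlalchemy_snowpark.connection import get_db_session

db_session = get_db_session()
try:
    # Hypothetical statement; replace with your own DML.
    db_session.execute("UPDATE employee SET department = 'Data' WHERE id = 1")
    db_session.commit()
except Exception:
    # Undo any partial work if the statement fails.
    db_session.rollback()
    raise
finally:
    db_session.close()
```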
10. add(): Inserts a record for the given model class. Note: model_class represents a SQLAlchemy model class.
Usage
db_session.add(model_class={model_class}, data={data_in_dict_format})
11. create_table(): Creates the database table for the given model class.
Usage
db_session.create_table(cls=model_class, checkfirst=True)
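A minimal sketch combining create_table and add, assuming the hypothetical Employee model defined earlier:

```python
from sqlalchemy_snowpark.connection import get_db_session

db_session = get_db_session()

# Create the table if it does not already exist, then insert one record.
db_session.create_table(cls=Employee, checkfirst=True)
db_session.add(model_class=Employee, data={"id": 1, "name": "Ada", "department": "Data"})
db_session.commit()
db_session.close()
```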
12. query(): Queries records of a model class with optional fields, filter, limit, and offset.
Parameters:
i) When only one logical operator is defined, the filter value should be a dictionary.
ii) When multiple operators are defined, the filter value should be a list containing a single dictionary, where each key denotes another logical operator.
Usage
filter = {
"filter_1": f"='{filter_1_value}'",
"filter_2": f"='{filter_2_value}'",
"filter_3": f"!='{filter_3_value}'",
"filter_4": f"in ({filter_4_value}, {filter_5_value})",
}
result = db_session.query(
model_class=ModelClassName,
fields=["field_name"],
filter=filter,
limit=10,
offset=10
).fetchone()
13. update(): Updates records of the model class that match the given filter.
Usage
update_filter = {
"filter_1": f"='{filter_1_value}'",
"filter_2": f"='{filter_2_value}'",
}
update_record = dict()
update_record["column_1"] = "value_1"
db_session.update(
model_class=ModelClassName,
data=update_record,
filter=update_filter,
)
14. delete(): Deletes records of the model class that match the given filter.
Usage
db_session.delete(
model_class=ModelClassName,
filter={"column_1": f"='{value_1}'"},
)
The get_db_session function prints exceptions to help identify issues during the connection process. Ensure that the provided connection details are accurate and the data warehouse is accessible.
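Since connection problems surface as exceptions, a simple guard like the following sketch can make failures explicit in your own code. This assumes a failed connection raises rather than silently returning a session:

```python
from sqlalchemy_snowpark.connection import get_db_session

try:
    session = get_db_session()
except Exception as exc:
    # Connection details are wrong or the warehouse is unreachable.
    raise SystemExit(f"Could not establish a database session: {exc}")
```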
This module simplifies the process of connecting to various data warehouses. Follow the setup instructions carefully, and refer to the examples for guidance on using the get_db_session function. For further assistance, check the documentation or raise an issue on the project's GitHub repository.