testgres is a PostgreSQL testing utility. Both Python 2.7 and 3.3+ are supported.
To install testgres, run:
pip install testgres
We encourage you to use virtualenv for your testing environment.
Note: by default testgres runs initdb, pg_ctl, and psql provided by PATH.
There are several ways to specify a custom postgres installation:

- PG_CONFIG environment variable pointing to the pg_config executable;
- PG_BIN environment variable pointing to the directory with executable files.

Example:
export PG_BIN=$HOME/pg_10/bin
python my_tests.py
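If you prefer to keep everything in the test script itself, the same variable can also be set from Python before any nodes are created. This is a minimal sketch, assuming testgres reads PG_BIN from the process environment when it looks up the binaries; the path below is a placeholder:

import os
import testgres

# placeholder path; PG_BIN should point at the directory containing initdb, pg_ctl and psql
os.environ['PG_BIN'] = '/home/user/pg_10/bin'

# nodes created after this point should use the binaries from PG_BIN,
# assuming testgres reads the variable when it resolves the binaries
with testgres.get_new_node() as node:
    node.init().start()
    print(node.execute('select version()'))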
Here is an example of what you can do with testgres:
# create a node with random name, port, etc.
with testgres.get_new_node() as node:

    # run initdb
    node.init()

    # start PostgreSQL
    node.start()

    # execute a query in a default DB
    print(node.execute('select 1'))

# ... node stops and its files are about to be removed
There are four API methods for running queries:
| Command | Description |
|---|---|
| node.psql(query, ...) | Runs query via the psql command and returns a tuple (error code, stdout, stderr). |
| node.safe_psql(query, ...) | Same as psql() except that it returns only stdout. If an error occurs during execution, an exception is thrown. |
| node.execute(query, ...) | Connects to PostgreSQL using psycopg2 or pg8000 (depending on which one is installed on your system) and returns a two-dimensional array with data. |
| node.connect(dbname, ...) | Returns a connection wrapper (NodeConnection) capable of running several queries within a single transaction. |
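For example, assuming a node that has already been initialized and started (as in the snippet above), the first three methods could be used roughly like this. The query is passed as the first argument, following the table above; some testgres versions expect the database name as the first positional argument instead:

# 'node' is an initialized and started node, as in the snippet above
ret_code, out, err = node.psql('select 2 + 2')   # tuple (error code, stdout, stderr)
out = node.safe_psql('select 2 + 2')             # stdout only; raises an exception on error
rows = node.execute('select 2 + 2')              # two-dimensional array of rows, e.g. [(4,)]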
The last method, node.connect(), is the most powerful: you can use begin(isolation_level), commit() and rollback():
with node.connect() as con:
    con.begin('serializable')
    print(con.execute('select %s', 1))
    con.rollback()
By default, cleanup() removes all temporary files (DB files, logs etc.) that were created by testgres' API methods. If you'd like to keep logs, execute configure_testgres(node_cleanup_full=False) before running any tests.
Note: context managers (aka with) call stop() and cleanup() automatically.
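Combining the two notes above, a minimal sketch of a test that relies on the automatic cleanup while keeping the logs for later inspection might look like this:

# keep logs after cleanup() (partial cleanup instead of a full one)
testgres.configure_testgres(node_cleanup_full=False)

with testgres.get_new_node() as node:
    node.init().start()
    print(node.execute('select 1'))
# stop() and cleanup() are called here automatically;
# with node_cleanup_full=False the logs are preserved for inspection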
testgres supports Python logging, which means that you can aggregate logs from several nodes into one file:
import logging
# write everything to /tmp/testgres.log
logging.basicConfig(filename='/tmp/testgres.log')
# enable logging, and create two different nodes
testgres.configure_testgres(use_python_logging=True)
node1 = testgres.get_new_node().init().start()
node2 = testgres.get_new_node().init().start()
# execute a few queries
node1.execute('select 1')
node2.execute('select 2')
# disable logging
testgres.configure_testgres(use_python_logging=False)
Look at the tests/test_simple.py file for a complete example of the logging configuration.
It's quite easy to create a backup and start a new replica:
with testgres.get_new_node('master') as master:
    master.init().start()

    # create a backup
    with master.backup() as backup:

        # create and start a new replica
        replica = backup.spawn_replica('replica').start()

        # catch up with master node
        replica.catchup()

        # execute a dummy query
        print(replica.execute('postgres', 'select 1'))
testgres is also capable of running benchmarks using pgbench:
with testgres.get_new_node('master') as master:

    # start a new node
    master.init().start()

    # initialize default DB and run bench for 10 seconds
    res = master.pgbench_init(scale=2).pgbench_run(time=10)
    print(res)
It's often useful to extend the default configuration provided by testgres. testgres has a default_conf() function that helps control some basic options. The append_conf() function can be used to add custom lines to the configuration file:
ext_conf = "shared_preload_libraries = 'postgres_fdw'"

# initialize a new node
with testgres.get_new_node().init() as master:

    # ... do something ...

    # reset main config file
    master.default_conf(fsync=True,
                        allow_streaming=True)

    # add a new config line
    master.append_conf('postgresql.conf', ext_conf)
Note that default_conf() is called by the init() function; both of them overwrite the configuration file, which means that they should be called before append_conf().
Testgres supports the creation of PostgreSQL nodes on a remote host. This is useful when you want to run distributed tests involving multiple nodes spread across different machines.
To use this feature, you need to use the RemoteOperations class. This feature is only supported on Linux. Here is an example of how you might set this up:
from testgres import ConnectionParams, RemoteOperations, TestgresConfig, get_remote_node

# Set up connection params
conn_params = ConnectionParams(
    host='your_host',          # replace with your host
    username='user_name',      # replace with your username
    ssh_key='path_to_ssh_key'  # replace with your SSH key path
)
os_ops = RemoteOperations(conn_params)

# Add remote testgres config before test
TestgresConfig.set_os_ops(os_ops=os_ops)

# Proceed with your test
def test_basic_query(self):
    with get_remote_node(conn_params=conn_params) as node:
        node.init().start()
        res = node.execute('SELECT 1')
        self.assertEqual(res, [(1,)])