DelphaDBManagement
Management API for Delpha to work with data inside the Cassandra cluster.
This repository is Python-based for now, but is meant to stay open to other languages (Java, JavaScript, ...).
Module to handle DB management on our Cassandra cluster.
Security Objectives
We need a truly secure way to handle data:
Private keys are generated by Delpha only and stored inside an authentication table in Cassandra. Users must provide a matching private key to access the DBMS. Each key is bound to a dedicated organisation, and only one private key is tagged as Admin, unlocking full access to the DBMS.
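The key-matching step described above can be sketched as follows. This is an illustrative sketch only, not the actual implementation: it assumes the authentication table stores a SHA-256 hash of each private key, and all function names are hypothetical.

```python
import hashlib
import hmac

def hash_private_key(private_key: str) -> str:
    """Hash a private key before storing it in the authentication table (assumed scheme)."""
    return hashlib.sha256(private_key.encode("utf-8")).hexdigest()

def key_matches(provided_key: str, stored_hash: str) -> bool:
    """Constant-time comparison, to avoid leaking timing information."""
    return hmac.compare_digest(hash_private_key(provided_key), stored_hash)

stored = hash_private_key("org-42-private-key")  # hypothetical organisation key
print(key_matches("org-42-private-key", stored))  # True
print(key_matches("wrong-key", stored))           # False
```

Hashing at rest means a leaked authentication table does not expose the raw keys themselves.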
Installation
Requirements
Python >= 3.8
cassandra-driver
simple-salesforce
requests
boto3
Install Dependencies
Common
pip install -r requirements.txt
Salesforce Manager
Salesforce data manager using the Salesforce API.
- Parameter : instance_name: Salesforce instance name
- Parameter : client_id: Consumer key of the authorized connected app
- Parameter : client_secret: Consumer secret of the authorized connected app
- Parameter : username: username to log in to the Salesforce instance
- Parameter : password: password to log in to the Salesforce instance
- Parameter : security_token: user's security token
sf_manager = SalesforceManager(instance_name, consumer_key, consumer_secret, salesforce_username, salesforce_pwd, personal_token)
sf_manager.help()
res, size = sf_manager.query("SELECT Name from Contact")
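For orientation, here is a hedged sketch of what a wrapper like `SalesforceManager.query` might do on top of the simple-salesforce dependency listed above. The `unpack_query_result` helper and the credential values are assumptions; only the SOQL response shape (`totalSize`, `records`) comes from the library's documented output.

```python
def unpack_query_result(response: dict):
    """Return (records, size) from a simple-salesforce SOQL query response."""
    records = [
        {k: v for k, v in rec.items() if k != "attributes"}  # drop Salesforce metadata
        for rec in response.get("records", [])
    ]
    return records, response.get("totalSize", len(records))

def example_usage():
    # Not executed here: requires real Salesforce credentials.
    from simple_salesforce import Salesforce  # third-party dependency

    sf = Salesforce(
        username="user@example.com",  # hypothetical credentials
        password="secret",
        security_token="token",
    )
    res, size = unpack_query_result(sf.query("SELECT Name FROM Contact"))
    return res, size
```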
Cassandra Manager
Cassandra Database manager : CassandraManager.
- Parameter : pem_file_path: String path of the .pem file to use for the 2FA connection
- Parameter : config_file_path: String path of the .yaml file to use to get credentials
- Parameter : key_space: String keyspace to set for the cluster session
cass_cluster = CassandraManager(pem_file_path, config_file_path, "delpha_actions")
cass_cluster.execute("SELECT * FROM actions_conv_by_user").all()
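As a rough sketch of what such a manager might do internally with the cassandra-driver dependency: read credentials from the parsed config file, build an SSL context from the .pem file, then open a session on the keyspace. The config keys, host, and helper name below are assumptions, not the actual CassandraManager internals.

```python
import ssl

def extract_credentials(config: dict):
    """Pull username/password out of a parsed config mapping (assumed 'auth' layout)."""
    auth = config.get("auth", {})
    return auth.get("username"), auth.get("password")

def example_usage():
    # Not executed here: requires a reachable Cassandra cluster and a real .pem file.
    from cassandra.cluster import Cluster              # third-party: cassandra-driver
    from cassandra.auth import PlainTextAuthProvider

    user, password = extract_credentials({"auth": {"username": "delpha", "password": "secret"}})
    ssl_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ssl_ctx.load_verify_locations("path/to/cert.pem")  # the .pem file from the constructor
    cluster = Cluster(
        contact_points=["127.0.0.1"],                  # hypothetical host
        ssl_context=ssl_ctx,
        auth_provider=PlainTextAuthProvider(username=user, password=password),
    )
    session = cluster.connect("delpha_actions")        # set the session keyspace
    return session.execute("SELECT * FROM actions_conv_by_user").all()
```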
Dictionary part of the Cassandra database manager : DictionaryManager.
manager = DictionaryManager(cass_cluster)
manager.list_tables
['keyspace1', 'keyspace2', ...]
AppFlow Manager
Delpha AWS data handler (AppFlow) : AppFlowManager.
- Parameter : key: String AWS access key
- Parameter : secret: String AWS secret key
- Parameter : flow_type: String flow type to handle ['contact', 'account']
- Parameter : bucket_name: String name of a specific bucket to use
- Parameter : region: String AWS region
handler = AppFlowManager(aws_key_id, aws_secret, 'contact', bucket_name)
df = handler.get_flow_parquet_data()
flow_status, flow_last_execution_record = handler.start_flow(flow_name)
handler.ensure_spark_format(flow_name)
handler.get_last_flow_id(flow_name)
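A method like `get_last_flow_id` could plausibly be built on boto3's Amazon AppFlow client, as in this hedged sketch. The `last_execution_id` helper, flow name, and credentials are assumptions; only the boto3 calls (`start_flow`, `describe_flow_execution_records`) and the `flowExecutions` response key are from the AWS API.

```python
from typing import Optional

def last_execution_id(flow_executions: list) -> Optional[str]:
    """Return the executionId of the most recently started record, if any."""
    if not flow_executions:
        return None
    latest = max(flow_executions, key=lambda rec: rec["startedAt"])
    return latest["executionId"]

def example_usage():
    # Not executed here: requires real AWS credentials and an existing flow.
    import boto3  # third-party dependency

    client = boto3.client(
        "appflow",
        aws_access_key_id="AWS_KEY",           # hypothetical credentials
        aws_secret_access_key="AWS_SECRET",
        region_name="eu-west-1",
    )
    client.start_flow(flowName="contact-flow")  # hypothetical flow name
    records = client.describe_flow_execution_records(flowName="contact-flow")
    return last_execution_id(records["flowExecutions"])
```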
S3 Manager
Delpha AWS data handler (S3) : S3Manager.
- Parameter : key: String AWS key
- Parameter : secret: String AWS secret
- Parameter : region: String AWS region
S3 = S3Manager(aws_key_id, aws_secret, region)
buckets = S3.buckets
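A `buckets` property like the one above could wrap boto3's S3 client along these lines. This is a sketch under assumptions: the `bucket_names` helper and credential values are hypothetical; only `list_buckets` and its `Buckets`/`Name` response shape come from the AWS API.

```python
def bucket_names(list_buckets_response: dict) -> list:
    """Extract bucket names from an S3 list_buckets response."""
    return [b["Name"] for b in list_buckets_response.get("Buckets", [])]

def example_usage():
    # Not executed here: requires real AWS credentials.
    import boto3  # third-party dependency

    s3 = boto3.client(
        "s3",
        aws_access_key_id="AWS_KEY",          # hypothetical credentials
        aws_secret_access_key="AWS_SECRET",
        region_name="eu-west-1",
    )
    return bucket_names(s3.list_buckets())
```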
How to - Publish the package with pip
python setup.py sdist bdist_wheel
twine upload dist/*
License