apache-airflow-providers-kettle
A simple Apache Airflow Kettle Operator that can invoke jobs and transformations on Linux-based systems.
Readme
This package provides KettleRunJobOperator and KettleRunTransformationOperator, which run Hitachi Vantara's PDI (Pentaho Data Integration) jobs and transformations from .kjb and .ktr files, respectively.
Currently there is no support for Carte server (and no repository support yet), so the only working setup is to deploy PDI in the same container as your Airflow installation and run the operators locally.
Installation is done via pip, and the package is installed into your site-packages directory like any other pip package. Run
python -m site
to find out where your site-packages are installed. The package will be located in the kettle_provider
directory.
pip install apache-airflow-providers-kettle
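To confirm where pip placed the package, you can also query Python's import machinery directly. This is a minimal stdlib-only sketch; it makes no assumptions beyond a standard Python installation:

```python
import importlib.util
from pathlib import Path

# Locate the installed kettle_provider package without importing it.
spec = importlib.util.find_spec("kettle_provider")
if spec is not None and spec.origin is not None:
    # e.g. .../site-packages/kettle_provider
    print(Path(spec.origin).parent)
else:
    print("kettle_provider is not installed in this environment")
```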
To use the Operators, first import them in your Airflow .py files (typically your DAG file).
from kettle_provider.operators.kettle_operator import KettleRunJobOperator, KettleRunTransformationOperator
The Operators can then be used just like any other.
run_job = KettleRunJobOperator(
task_id='kettle-run-job',
file='test-job.kjb',
params={
'test-parameter-1': 'test-value-1',
'test-parameter-2': 'test-value-2',
},
)
run_transformation = KettleRunTransformationOperator(
task_id='kettle-run-transformation',
file='test-transformation.ktr',
params={
'test-parameter-1': 'test-value-1',
'test-parameter-2': 'test-value-2',
},
)
Below are the parameters you can use when defining the tasks, along with their default values. The list excludes base parameters inherited from the BaseOperator class (such as task_id).
KettleRunJobOperator(
pdipath: str = '/opt/pentaho/data-integration/', # PDI installation
filepath: str = '/opt/pentaho/data-integration/jobs/', # PDI jobs directory
file: str | None = None, # .kjb file to run
logfile: str = '/opt/pentaho/data-integration/logs/pdi.kitchen.log', # logfile for kitchen runs
maxloglines: int = 0, # max log lines for kitchen logfile (0 = no limit)
maxlogtimeout: int = 0, # max log age in seconds for kitchen logfile (0 = no limit)
loglevel: str = 'Basic', # log level (Basic, Detailed, Debug, Rowlevel, Error, Nothing)
params: dict[str, str] | None = None, # dictionary of parameters
output_encoding: str = 'utf-8', # output encoding for exit commands
**kwargs
)
KettleRunTransformationOperator(
pdipath: str = '/opt/pentaho/data-integration/', # PDI installation
filepath: str = '/opt/pentaho/data-integration/transformations/', # PDI transformations directory
file: str | None = None, # .ktr file to run
logfile: str = '/opt/pentaho/data-integration/logs/pdi.pan.log', # logfile for pan runs
maxloglines: int = 0, # max log lines for pan logfile (0 = no limit)
maxlogtimeout: int = 0, # max log age in seconds for pan logfile (0 = no limit)
loglevel: str = 'Basic', # log level (Basic, Detailed, Debug, Rowlevel, Error, Nothing)
params: dict[str, str] | None = None, # dictionary of parameters
output_encoding: str = 'utf-8', # output encoding for exit commands
**kwargs
)