syntherela 0.0.3 (PyPI)

SyntheRela - Synthetic Relational Data Generation Benchmark

License: MIT

Our paper, Benchmarking the Fidelity and Utility of Synthetic Relational Data, is available on arXiv.

Installation

To install only the benchmark package, run the following command:

pip install syntherela
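To confirm the install succeeded, a quick generic check (plain Python, nothing syntherela-specific) is to ask the interpreter whether the module can be found:

```python
import importlib.util

def is_installed(name: str) -> bool:
    """Return True if the named module can be located by this interpreter."""
    return importlib.util.find_spec(name) is not None

print(is_installed("syntherela"))  # True once the package is installed
```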

Replicating the paper's results

We divide the reproducibility of the experiments into two parts: the generation of synthetic data and the evaluation of the generated data. The following sections describe how to reproduce the experiments for each part.

To reproduce some of the figures, the synthetic data needs to be downloaded first. The tables can be reproduced either with the results provided in the repository or by re-running the benchmark.

First, create a .env file in the project root containing the path to the project root: copy .env.example, rename it to .env, and update the path.
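The resulting file is a single key-value line. PROJECT_PATH below is a placeholder; use the variable name defined in .env.example and your actual checkout path:

```
PROJECT_PATH=/path/to/syntherela
```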

Download synthetic data and results

The data and results can be downloaded and extracted with the script below, or obtained directly from Google Drive.

conda activate reproduce_benchmark
./experiments/reproducibility/download_data_and_results.sh

Evaluation of synthetic data

To run the benchmark and get the results of the metrics, run:

conda activate reproduce_benchmark
./experiments/reproducibility/evaluate_relational.sh

./experiments/reproducibility/evaluate_tabular.sh

./experiments/reproducibility/evaluate_utility.sh
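The three evaluation stages above can also be driven from a single Python entry point. This is a convenience sketch, not part of the package; the script paths are taken from the repository layout, and `check=True` stops the run at the first failing stage:

```python
import subprocess

# Script paths as listed in the reproducibility instructions above.
EVALUATION_SCRIPTS = [
    ["./experiments/reproducibility/evaluate_relational.sh"],
    ["./experiments/reproducibility/evaluate_tabular.sh"],
    ["./experiments/reproducibility/evaluate_utility.sh"],
]

def run_pipeline(commands):
    """Run each command in order; check=True raises CalledProcessError on failure."""
    for cmd in commands:
        subprocess.run(cmd, check=True)

# From the repository root (with the conda env active): run_pipeline(EVALUATION_SCRIPTS)
```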

Generation of synthetic data

Depending on the synthetic data generation method, a separate Python environment is needed. Instructions for installing the required environment for each method are provided in docs/INSTALLATION.md.

After installing the required environment, the synthetic data can be generated by running the following commands:

conda activate reproduce_benchmark
./experiments/reproducibility/generation/generate_sdv.sh

conda activate rctgan
./experiments/reproducibility/generation/generate_rctgan.sh

conda activate realtabformer
./experiments/reproducibility/generation/generate_realtabformer.sh

conda activate tabular
./experiments/reproducibility/generation/generate_tabular.sh

conda activate gretel
python experiments/generation/gretel/generate_gretel.py --connection-uid <connection-uid> --model lstm
python experiments/generation/gretel/generate_gretel.py --connection-uid <connection-uid> --model actgan

cd experiments/generation/clavaddpm
./generate_clavaddpm.sh <dataset-name> <real-data-path> <synthetic-data-path>  

To generate data with MOSTLYAI, instructions are provided in experiments/generation/mostlyai/README.md.
Further instructions for GRETELAI are provided in experiments/generation/gretel/README.md.

Visualising Results

After running the benchmark, you can visualize the results with the script below. The figures will be saved to results/figures/:

conda activate reproduce_benchmark
./experiments/reproducibility/generate_figures.sh

Reproducing Tables

To reproduce the tables, run the script below. The tables will be saved as .tex files in results/tables/:

conda activate reproduce_benchmark
./experiments/reproducibility/generate_tables.sh

Adding a new metric

The documentation for adding a new metric can be found in docs/ADDING_A_METRIC.md.
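As an illustration of the kind of computation a fidelity metric performs (a generic example, not the package's actual metric interface; see docs/ADDING_A_METRIC.md for that), a single-column metric for categorical data could compare the real and synthetic value distributions:

```python
from collections import Counter

def total_variation_distance(real, synthetic):
    """Total variation distance between the empirical distributions of two
    categorical columns: 0.0 for identical distributions, 1.0 for disjoint ones."""
    p, q = Counter(real), Counter(synthetic)
    n, m = len(real), len(synthetic)
    support = set(p) | set(q)  # every category seen in either column
    return 0.5 * sum(abs(p[c] / n - q[c] / m) for c in support)
```

A metric like this scores each column; a benchmark then aggregates the per-column scores across tables.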

Synthetic Data Methods

Open Source Methods

* Denotes the method does not have a public implementation available.

Commercial Providers

A list of commercial synthetic relational data providers is available in docs/SYNTHETIC_DATA_TOOLS.md.

Conflicts of Interest

The authors declare no conflict of interest and are not associated with any of the evaluated commercial synthetic data providers.

Citation

If you use SyntheRela in your work, please cite our paper:

@misc{hudovernik2024benchmarkingsyntheticrelationaldata,
      title={Benchmarking the Fidelity and Utility of Synthetic Relational Data}, 
      author={Valter Hudovernik and Martin Jurkovič and Erik Štrumbelj},
      year={2024},
      eprint={2410.03411},
      archivePrefix={arXiv},
      primaryClass={cs.DB},
      url={https://arxiv.org/abs/2410.03411}, 
}
