Privacy auditing library for assessing data privacy risks in ML models
Privacy Meter is an open-source library designed to audit data privacy in statistical and machine learning algorithms, targeting researchers and practitioners in sensitive domains like healthcare and finance. It provides quantitative assessments of privacy risks using state-of-the-art membership inference attacks, helping organizations comply with data protection regulations like GDPR.
How It Works
Privacy Meter employs a configuration-driven approach, using YAML files to specify models, datasets, and privacy games. It supports multiple auditing methodologies, including membership inference, range membership inference, and dataset usage cardinality inference, to detect leakage of individual training points, of points in the vicinity of training points, and of the fraction of a dataset used in training. The library can also audit differential privacy (DP) lower bounds.
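To illustrate the core idea behind the membership inference audits mentioned above, here is a minimal, self-contained sketch of a loss-threshold membership inference attack on synthetic per-example losses. This is a generic illustration of the technique, not Privacy Meter's API; the loss distributions and threshold are made-up toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical numbers): per-example losses for members
# (points seen in training) and non-members. Members typically have
# lower loss, which is the signal a loss-threshold attack exploits.
member_losses = rng.normal(loc=0.2, scale=0.1, size=1000)
nonmember_losses = rng.normal(loc=0.8, scale=0.3, size=1000)

def predict_member(losses, threshold):
    # Predict "member" whenever the loss falls below the threshold.
    return losses < threshold

threshold = 0.5
tpr = predict_member(member_losses, threshold).mean()     # true positive rate
fpr = predict_member(nonmember_losses, threshold).mean()  # false positive rate

# Membership advantage: 0 means no measurable leakage, 1 means the
# attacker perfectly distinguishes members from non-members.
advantage = tpr - fpr
print(f"TPR={tpr:.2f}  FPR={fpr:.2f}  advantage={advantage:.2f}")
```

A quantitative auditor reports metrics like the TPR/FPR trade-off (or the advantage) over many thresholds; a large gap between the two loss distributions indicates that the model leaks information about its training data.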
Quick Start & Requirements
Install dependencies with pip install -r requirements.txt or conda env create -f env.yaml. A .models_metadata.json file is required.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
Last commit: 4 months ago; the project is currently marked inactive.