DiCE by interpretml

Counterfactual explanation SDK for ML models

Created 6 years ago · 1,446 stars · Top 28.3% on SourcePulse

View on GitHub
Project Summary

DiCE generates diverse counterfactual explanations for any machine learning model, letting users explore "what-if" scenarios and identify actionable recourse. It is designed for ML model developers, decision subjects, and decision-makers seeking interpretable and truthful explanations.

How It Works

DiCE frames counterfactual explanation generation as an optimization problem, similar to finding adversarial examples. It aims to find feature-perturbed versions of an input that alter the model's prediction while ensuring diversity and feasibility of the changes. The library supports tunable parameters for diversity and proximity, as well as constraints on feature modifications to ensure practical relevance.
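For context, the accompanying paper (Mothilal, Sharma, and Tan, 2020) frames generating k counterfactuals c_1, …, c_k for an input x roughly as the optimization below, where f is the model, yloss pulls predictions toward the desired outcome y, dist measures proximity to x, and a determinantal-point-process term rewards diversity; λ_1 and λ_2 are the tunable proximity and diversity weights. The notation is paraphrased from the paper, not the library's exact internals:

    C(x) = \arg\min_{c_1,\dots,c_k} \;
           \frac{1}{k} \sum_{i=1}^{k} \mathrm{yloss}\bigl(f(c_i), y\bigr)
         + \frac{\lambda_1}{k} \sum_{i=1}^{k} \mathrm{dist}(c_i, x)
         - \lambda_2 \, \mathrm{dpp\_diversity}(c_1, \dots, c_k)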

Quick Start & Requirements

  • Install via pip: pip install dice-ml or conda install -c conda-forge dice-ml.
  • For the latest development version, clone the repo and run pip install -e .
  • Requires Python 3+.
  • Supports TensorFlow 2, scikit-learn, and PyTorch models.
  • Documentation and example notebooks are available on the project site; a minimal usage sketch follows this list.
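A minimal end-to-end sketch of the documented workflow follows; the sklearn dataset, model choice, and parameter values here are illustrative placeholders, not part of DiCE itself:

    import dice_ml
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier

    # Illustrative setup: train any sklearn classifier on a numeric dataset.
    data = load_breast_cancer(as_frame=True)
    df = data.frame  # feature columns plus a "target" column
    model = RandomForestClassifier(random_state=0).fit(
        df.drop(columns="target"), df["target"])

    # Wrap the data and model in DiCE's interfaces.
    d = dice_ml.Data(dataframe=df,
                     continuous_features=data.feature_names.tolist(),
                     outcome_name="target")
    m = dice_ml.Model(model=model, backend="sklearn")

    # Generate 4 diverse counterfactuals for one query instance,
    # targeting the opposite predicted class.
    exp = dice_ml.Dice(d, m, method="random")
    query = df.drop(columns="target").iloc[[0]]
    cfs = exp.generate_counterfactuals(query, total_CFs=4,
                                       desired_class="opposite")
    cfs.visualize_as_dataframe(show_only_changes=True)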

Highlighted Details

  • Generates diverse sets of counterfactual explanations.
  • Supports feature constraints (e.g., features_to_vary, permitted_range).
  • Allows tuning of proximity and diversity weights (see the sketch after this list).
  • Can work with pre-trained models without requiring the original training data.
  • Offers model-agnostic and gradient-based explanation methods.
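A sketch of the constraint and weighting knobs listed above, continuing the quick-start example; the column names and weight values are illustrative, and the proximity/diversity weights shown apply to the genetic method:

    # Restrict which features may change, and bound a numeric feature.
    cfs = exp.generate_counterfactuals(
        query,
        total_CFs=4,
        desired_class="opposite",
        features_to_vary=["mean radius", "mean texture"],   # placeholder columns
        permitted_range={"mean radius": [10.0, 20.0]},      # illustrative bounds
    )

    # With method="genetic", the proximity/diversity trade-off is tunable.
    exp_gen = dice_ml.Dice(d, m, method="genetic")
    cfs2 = exp_gen.generate_counterfactuals(
        query,
        total_CFs=4,
        desired_class="opposite",
        proximity_weight=0.5,   # higher -> counterfactuals stay closer to the input
        diversity_weight=1.0,   # higher -> counterfactuals differ more from each other
    )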

Maintenance & Community

  • Active development with contributions from Microsoft Research.
  • Community engagement via GitHub issues and pull requests.
  • Follows the Microsoft Open Source Code of Conduct.

Licensing & Compatibility

  • MIT License.
  • Permissive for commercial use and integration with closed-source applications.

Limitations & Caveats

  • Currently lacks support for generating explanations in natural language.
  • Planned features include support for evaluating feature attribution methods and richer feasibility constraints.
Health Check

  • Last commit: 2 months ago
  • Responsiveness: 1 day
  • Pull requests (30d): 0
  • Issues (30d): 0

Star History

  • 11 stars in the last 30 days

Explore Similar Projects

alibi by SeldonIO
  Python library for ML model inspection and interpretation
  3k stars · Top 0.1% on SourcePulse · Created 6 years ago · Updated 14 hours ago
  Starred by Chip Huyen (author of "AI Engineering", "Designing Machine Learning Systems"), Travis Addair (cofounder of Predibase), and 4 more.

lit by PAIR-code
  Interactive ML model analysis tool for understanding model behavior
  4k stars · Top 0.1% on SourcePulse · Created 5 years ago · Updated 3 weeks ago
  Starred by Chip Huyen (author of "AI Engineering", "Designing Machine Learning Systems"), Gabriel Almeida (cofounder of Langflow), and 5 more.