Python package for concise, transparent, and accurate predictive modeling. All sklearn-compatible and easy to use.
docs • imodels overview • demo notebooks
imodels overview
Modern machine-learning models are increasingly complex, often making them difficult to interpret. This package provides a simple interface for fitting and using state-of-the-art interpretable models, all compatible with scikit-learn. These models can often replace black-box models (e.g. random forests) with simpler models (e.g. rule lists) while improving interpretability and computational efficiency, all without sacrificing predictive accuracy! Simply import a classifier or regressor and use the `fit` and `predict` methods, same as standard scikit-learn models.
```python
from imodels import BoostedRulesClassifier, BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier  # see more models below
from imodels import SLIMRegressor, RuleFitRegressor

model = BoostedRulesClassifier()  # initialize a model
model.fit(X_train, y_train)  # fit model
preds = model.predict(X_test)  # discrete predictions: shape is (n_test,)
preds_proba = model.predict_proba(X_test)  # predicted probabilities: shape is (n_test, n_classes)
print(model)  # print the rule-based model
-----------------------------
# the model consists of the following 3 rules
# if X1 > 5: then 80.5% risk
# else if X2 > 5: then 40% risk
# else: 10% risk
```
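For reference, here is a complete, runnable variant of the snippet above (a minimal sketch; the synthetic dataset from scikit-learn's `make_classification` is just a stand-in for your own data):

```python
# minimal end-to-end sketch using synthetic data (assumes imodels is installed)
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from imodels import BoostedRulesClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = BoostedRulesClassifier()
model.fit(X_train, y_train)
print(model)                         # inspect the learned rules
print(model.score(X_test, y_test))   # sklearn-style accuracy
```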
Installation
Install with `pip install imodels` (see here for help).
Supported models
Model | Reference | Description |
---|---|---|
Rulefit rule set | 🗂️, 🔗, 📄 | Extracts rules from decision trees then fits a sparse linear model with them |
Skope rule set | 🗂️, 🔗 | Extracts rules from gradient-boosted trees, deduplicates them, then forms a linear combination of them based on their OOB precision |
Boosted rule set | 🗂️, 🔗, 📄 | Sequentially learns a set of rules with AdaBoost |
SLIPPER rule set | 🗂️, 📄 | Sequentially learns a set of rules with SLIPPER |
Bayesian rule set | 🗂️, 🔗, 📄 | Finds a concise rule set with Bayesian sampling (slow) |
Optimal rule list | 🗂️, 🔗, 📄 | Learns a succinct rule list using global optimization for sparsity (CORELS) |
Bayesian rule list | 🗂️, 🔗, 📄 | Learns a compact rule list distribution with Bayesian sampling (slow) |
Greedy rule list | 🗂️, 🔗 | Uses CART to learn a list (only a single path), rather than a decision tree |
OneR rule list | 🗂️, 📄 | Learns a rule list restricted to only one feature |
Optimal rule tree | 🗂️, 🔗, 📄 | Learns a succinct tree using global optimization for sparsity (GOSDT) |
Greedy rule tree | 🗂️, 🔗, 📄 | Greedily learns a tree using CART |
C4.5 rule tree | 🗂️, 🔗, 📄 | Greedily learns a tree using C4.5 |
Iterative random forest | 🗂️, 🔗, 📄 | (In progress) Repeatedly fits a random forest, giving features with high importance a higher chance of being selected |
Sparse integer linear model | 🗂️, 📄 | Sparse linear model with integer coefficients |
Sapling Sums | 🗂️, 📄 | Sum of small trees with very few total rules (SAPS) |
Shrunk trees | 🗂️, 📄 | Shrinks the predictions of a decision tree toward its ancestors (hierarchical shrinkage) |
More models | ⌛ | (Coming soon!) Popular rule sets including Lightweight Rule Induction, MLRules |

Docs 🗂️, Reference code implementation 🔗, Research paper 📄
Also see our simple function for explaining classification errors: fit an interpretable model to explain a previous model's errors (ex. in this notebook 📓; see the sketch after the discretizer table below).

Also see our fast and effective discretizers for data preprocessing:
Discretizer | Reference | Description |
---|---|---|
MDLP | 🗂️, 🔗, 📄 | Discretize using entropy minimization heuristic |
Simple | 🗂️, 🔗 | Simple KBins discretization |
Random Forest | 🗂️ | Discretize into bins based on random forest split popularity |
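As a hedged sketch of the error-explanation helper mentioned above (the `(X, predictions, y)` argument order here is an assumption; check the docs for the exact signature):

```python
# hedged sketch: explain where a black-box model errs
# (argument order (X, predictions, y) is an assumption -- check the docs)
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from imodels import explain_classification_errors

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
preds = rf.predict(X_test)

# fit an interpretable model that predicts where the random forest is wrong
explain_classification_errors(X_test, preds, y_test)
```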
The above models each take one of the following final forms, which aim to be simultaneously simple to understand and highly predictive:
Rule set | Rule list | Rule tree | Algebraic models |
---|---|---|---|
*(diagram)* | *(diagram)* | *(diagram)* | *(diagram)* |
Different models and algorithms vary not only in their final form but also in the choices made during modeling. In particular, many models differ in the 3 steps given by the table below.
- ex. RuleFit and SkopeRules: these differ only in the way they prune rules. RuleFit uses a linear model whereas SkopeRules heuristically deduplicates rules that share overlap.
- ex. Bayesian rule lists and greedy rule lists: these differ in how they select rules. Bayesian rule lists perform a global optimization over possible rule lists while greedy rule lists pick splits sequentially to maximize a given criterion.
- ex. FPSkope and SkopeRules: these differ only in the way they generate candidate rules. FPSkope uses FPGrowth whereas SkopeRules extracts rules from decision trees.

See the docs for individual models for further descriptions.
Rule candidate generation | Rule selection | Rule postprocessing |
---|---|---|
*(diagram)* | *(diagram)* | *(diagram)* |
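To see such differences concretely, one can fit two rule-set models on the same data and print their learned rules side by side (a minimal sketch on synthetic data):

```python
# compare the rules found by two different rule-set algorithms
from sklearn.datasets import make_classification
from imodels import RuleFitClassifier, SkopeRulesClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=0)

for cls in [RuleFitClassifier, SkopeRulesClassifier]:
    model = cls().fit(X, y)
    print(model)  # each model prints its learned rules
```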
The code here contains many useful and customizable functions for rule-based learning in the util folder, including functions and classes for rule deduplication, rule screening, and converting between trees, rule sets, and neural networks.
Demo notebooks
Demos are contained in the notebooks folder.
imodels demo
Shows how to fit, predict, and visualize with different interpretable models.
imodels colab demo
Shows the same content as the demo above, runnable in Google Colab.
clinical decision rule notebook
Shows an example of using imodels for deriving a clinical decision rule.
posthoc analysis
We also include some demos of posthoc analysis, which occurs after fitting models: posthoc.ipynb shows different simple analyses to interpret a trained model, and uncertainty.ipynb contains basic code to get uncertainty estimates for a model.
Support for different tasks
Different models support different machine-learning tasks. Current support for different models is given below; each of these models can be imported directly from imodels (e.g. `from imodels import RuleFitClassifier`):
Model | Binary classification | Regression |
---|---|---|
Rulefit rule set | RuleFitClassifier | RuleFitRegressor |
Skope rule set | SkopeRulesClassifier | |
Boosted rule set | BoostedRulesClassifier | |
SLIPPER rule set | SlipperClassifier | |
Bayesian rule set | BayesianRuleSetClassifier | |
Optimal rule list (CORELS) | OptimalRuleListClassifier | |
Bayesian rule list | BayesianRuleListClassifier | |
Greedy rule list | GreedyRuleListClassifier | |
OneR rule list | OneRClassifier | |
Optimal rule tree (GOSDT) | OptimalTreeClassifier | |
Greedy rule tree (CART) | GreedyTreeClassifier | GreedyTreeRegressor |
C4.5 rule tree | C45TreeClassifier | |
Iterative random forest | | |
Sparse integer linear model | SLIMClassifier | SLIMRegressor |
Sapling Sums (SAPS) | SaplingSumClassifier | SaplingSumRegressor |
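For example, a regression task simply swaps in the corresponding regressor; a minimal sketch on a synthetic dataset:

```python
# regression uses the same fit/predict interface as classification
from sklearn.datasets import make_friedman1
from imodels import RuleFitRegressor

X, y = make_friedman1(n_samples=200, n_features=5, random_state=0)

model = RuleFitRegressor()
model.fit(X, y)
print(model.predict(X[:5]))  # continuous predictions, shape (5,)
```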
References
Readings
Reference implementations (also linked above)
The code here heavily derives from the wonderful work of previous projects. We seek to extract out, unify, and maintain key parts of these projects.
- pycorels - by @fingoldin and the original CORELS team
- sklearn-expertsys - by @tmadl and @kenben based on original code by Ben Letham
- rulefit - by @christophM
- skope-rules - by the skope-rules team (including @ngoix, @floriangardin, @datajms, Bibi Ndiaye, Ronan Gautier)
- boa - by @wangtongada
Related packages
- gplearn: symbolic regression/classification
- pysr: fast symbolic regression
- pygam: generalized additive models
- interpretml: boosting-based gam
- h2o ai: gams + glms (and more)
- optbinning: data discretization / scoring models
Updates
- For updates, star the repo, see this related repo, or follow @csinva_
- Please make sure to give authors of original methods / base implementations appropriate credit!
- Contributing: pull requests very welcome!
If it's useful for you, please star/cite the package, and make sure to give authors of original methods / base implementations credit:
```
@software{imodels2021,
  title = {{imodels: a python package for fitting interpretable models}},
  journal = {Journal of Open Source Software},
  publisher = {The Open Journal},
  year = {2021},
  author = {Singh, Chandan and Nasseri, Keyan and Tan, Yan Shuo and Tang, Tiffany and Yu, Bin},
  volume = {6},
  number = {61},
  pages = {3192},
  doi = {10.21105/joss.03192},
  url = {https://doi.org/10.21105/joss.03192},
}
```
"""
.. include:: ../readme.md
"""
# Python `imodels` package for interpretable models compatible with scikit-learn.
# Github repo available [here](https://github.com/csinva/imodels)
# from .tree.iterative_random_forest.iterative_random_forest import IRFClassifier
# from .tree.optimal_classification_tree import OptimalTreeModel
from .tree.cart_wrapper import GreedyTreeClassifier, GreedyTreeRegressor
from .tree.gosdt.pygosdt import OptimalTreeClassifier
from .tree.c45_tree.c45_tree import C45TreeClassifier
from .tree.saps import SaplingSumRegressor, SaplingSumClassifier
from .tree.shrunk_tree import ShrunkTreeRegressor, ShrunkTreeClassifier, ShrunkTreeRegressorCV, ShrunkTreeClassifierCV
from .algebraic.slim import SLIMRegressor, SLIMClassifier
from .discretization.discretizer import RFDiscretizer, BasicDiscretizer
from .discretization.mdlp import MDLPDiscretizer, BRLDiscretizer
from .rule_list.bayesian_rule_list.bayesian_rule_list import BayesianRuleListClassifier
from .rule_list.corels_wrapper import OptimalRuleListClassifier
from .rule_list.greedy_rule_list import GreedyRuleListClassifier
from .rule_list.one_r import OneRClassifier
from .rule_set import boosted_rules
from .rule_set.boosted_rules import *
from .rule_set.boosted_rules import BoostedRulesClassifier
from .rule_set.brs import BayesianRuleSetClassifier
from .rule_set.fplasso import FPLassoRegressor, FPLassoClassifier
from .rule_set.fpskope import FPSkopeClassifier
from .rule_set.rule_fit import RuleFitRegressor, RuleFitClassifier
from .rule_set.skope_rules import SkopeRulesClassifier
from .rule_set.slipper import SlipperClassifier
from .util.explain_errors import explain_classification_errors
CLASSIFIERS = [BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier,
BoostedRulesClassifier, SLIMClassifier, SlipperClassifier, BayesianRuleSetClassifier,
C45TreeClassifier, OptimalTreeClassifier, OptimalRuleListClassifier, OneRClassifier,
SlipperClassifier,
SaplingSumClassifier] # , IRFClassifier
REGRESSORS = [RuleFitRegressor, SLIMRegressor, GreedyTreeClassifier, SaplingSumRegressor]
DISCRETIZERS = [RFDiscretizer, BasicDiscretizer, MDLPDiscretizer, BRLDiscretizer]
Sub-modules
- algebraic: generic class for models that take the form of algebraic equations (e.g. linear models)
- discretization
- experimental
- rule_list: generic class for models that take the form of a list of rules
- rule_set: generic class for models that take the form of a set of (potentially overlapping) rules
- tree: generic class for models that take the form of a tree of rules
- util: shared utilities for implementing different interpretable models