Optimal Feature Selection for Decision Robustness in Bayesian Networks

YooJung Choi, Adnan Darwiche, and Guy Van den Broeck.
In Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI), 2017.

PDF  BibTeX

Abstract

In many applications, one can define a large set of features to support the classification task at hand. At test time, however, these become prohibitively expensive to evaluate, and only a small subset of features is used, often selected for their information-theoretic value. For threshold-based, Naive Bayes classifiers, recent work has suggested selecting features that maximize the expected robustness of the classifier, that is, the expected probability it maintains its decision after seeing more features. We propose the first algorithm to compute this expected same-decision probability for general Bayesian network classifiers, based on compiling the network into a tractable circuit representation. Moreover, we develop a search algorithm for optimal feature selection that utilizes efficient incremental circuit modifications. Experiments on Naive Bayes, as well as more general networks, show the efficacy and distinct behavior of this decision-making approach.
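As a rough illustration of the quantity being optimized, the sketch below computes the expected same-decision probability of a feature subset by brute-force enumeration on a toy Naive Bayes model. It is not the paper's circuit-based algorithm; the CPT numbers, the 0.5 decision threshold, and every name in the code are assumptions made only for this example.

import itertools

# Toy Naive Bayes model: binary class C and three binary features F0..F2.
# All numbers and names here are illustrative assumptions, not values from the paper.
PRIOR_C1 = 0.3                                 # P(C = 1)
CPTS = [(0.2, 0.8), (0.4, 0.7), (0.5, 0.9)]    # per feature: P(F_i = 1 | C = 0), P(F_i = 1 | C = 1)
THRESHOLD = 0.5                                # decide "positive" iff P(C = 1 | evidence) >= 0.5

def joint(c, fs):
    """P(C = c, F = fs) for a full feature assignment fs (tuple of 0/1)."""
    p = PRIOR_C1 if c == 1 else 1.0 - PRIOR_C1
    for (p0, p1), f in zip(CPTS, fs):
        pf1 = p1 if c == 1 else p0
        p *= pf1 if f == 1 else 1.0 - pf1
    return p

def decision(evidence):
    """Threshold decision from P(C = 1 | evidence); evidence maps feature index -> value."""
    def marginal(c):
        return sum(joint(c, fs)
                   for fs in itertools.product((0, 1), repeat=len(CPTS))
                   if all(fs[i] == v for i, v in evidence.items()))
    p1, p0 = marginal(1), marginal(0)
    return p1 / (p1 + p0) >= THRESHOLD

def expected_sdp(subset):
    """Probability that the decision based only on `subset` agrees with the
    decision made after observing all features (brute-force enumeration)."""
    agree = 0.0
    for fs in itertools.product((0, 1), repeat=len(CPTS)):
        if decision({i: fs[i] for i in subset}) == decision(dict(enumerate(fs))):
            agree += joint(0, fs) + joint(1, fs)   # add P(F = fs)
    return agree

# Score every two-feature subset; optimal feature selection picks the one
# with the highest expected same-decision probability.
for subset in itertools.combinations(range(len(CPTS)), 2):
    print(subset, round(expected_sdp(subset), 4))

This enumeration is exponential in the number of features; the paper's contribution is to compute the same quantity for general Bayesian network classifiers by compiling the network into a tractable circuit representation, and to search over feature subsets using efficient incremental circuit modifications.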

Citation

@inproceedings{ChoiIJCAI17,
  author    = {Choi, YooJung and Darwiche, Adnan and Van den Broeck, Guy},
  title     = {Optimal Feature Selection for Decision Robustness in Bayesian Networks},
  booktitle = {Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI)},
  month     = aug,
  year      = {2017},
}

A preliminary version appeared in the IJCAI 2017 Workshop on Logical Foundations for Uncertainty and Machine Learning.