Institute for Application-oriented Knowledge Processing

Local Rule-Based Explanations

Supervisor: Johannes Fürnkranz


Motivation:

LIME [1] is one of the best-known approaches for interpreting a learned black-box model such as a deep neural network. Its key idea is to fit a simple, interpretable model, such as a linear regression model, on samples drawn from the local neighborhood of the point that needs to be explained. Single conjunctive rules could also be used as local models in such a setting.
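The LIME idea described above can be sketched in a few lines: sample perturbations around the instance, weight them by proximity, and fit a weighted linear surrogate. The black-box function, the kernel width, and the sample size below are all illustrative assumptions, not part of LIME's actual implementation.

```python
import numpy as np

# Hypothetical black-box classifier: a nonlinear decision function of two features.
def black_box(X):
    return (X[:, 0] ** 2 + X[:, 1] > 1.0).astype(float)

rng = np.random.default_rng(0)
x0 = np.array([0.8, 0.5])  # instance to be explained

# 1. Sample perturbations in the local neighborhood of x0.
Z = x0 + rng.normal(scale=0.3, size=(500, 2))
y = black_box(Z)

# 2. Weight each sample by its proximity to x0 (Gaussian kernel).
w = np.exp(-np.sum((Z - x0) ** 2, axis=1) / (2 * 0.3 ** 2))

# 3. Fit a weighted linear surrogate by least squares.
A = np.hstack([Z, np.ones((len(Z), 1))])  # add an intercept column
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)

print("local linear explanation (weight for x1, weight for x2, intercept):", coef)
```

The coefficients of the surrogate then serve as the local explanation: they indicate how strongly each feature pushes the black-box prediction around x0.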

Objective:

The task of this thesis is to implement and evaluate this approach with inductively learned rules as the local model, starting from simple strategies for forming a neighborhood, such as flipping each binary attribute in turn, and moving to more elaborate strategies, such as the one proposed in [2].

Literature

[1] Marco Túlio Ribeiro, Sameer Singh, Carlos Guestrin: "Why Should I Trust You?": Explaining the Predictions of Any Classifier. KDD 2016: 1135-1144.
[2] Riccardo Guidotti, Anna Monreale, Salvatore Ruggieri, Dino Pedreschi, Franco Turini, Fosca Giannotti: Local Rule-Based Explanations of Black Box Decision Systems. CoRR abs/1805.10820 (2018)