Interpreting Predictions from Complex Models


Understanding why a model made a certain prediction is crucial in many applications. However, with large modern datasets the best accuracy is often achieved by complex models that even experts struggle to interpret, such as ensembles or deep learning models, creating a tension between accuracy and interpretability. In response, a variety of methods have recently been proposed to help users interpret the predictions of complex models. Here, we present a unified framework for interpreting predictions, SHAP (SHapley Additive exPlanations), which assigns each feature an importance value for a particular prediction. SHAP comes with strong theoretical guarantees and is applicable to any model.

Using SHAP, we present strict improvements to both LIME (a popular model-agnostic method) and to feature attribution in tree ensembles (such as gradient boosted trees and random forests). Current attribution methods can be inconsistent: changing the model so that it relies more on a given feature can actually decrease the importance assigned to that feature. In contrast, SHAP values are guaranteed to always be consistent and locally accurate. Since SHAP strictly improves on the current state of the art, it benefits any current user of tree ensemble methods or model-agnostic explanation methods.
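To make the two guarantees concrete, here is a minimal sketch of exact Shapley-value attribution by brute-force subset enumeration (feasible only for a handful of features; the talk's methods are about computing or approximating this efficiently). The toy model `f`, the input `x`, and the all-zeros baseline are illustrative assumptions, not part of the talk; "removing" a feature is simulated by replacing it with its baseline value.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values for f at point x.

    Enumerates every subset S of the other features; a feature's value
    is its average marginal contribution across subsets, weighted by
    |S|! * (n - |S| - 1)! / n!.
    """
    n = len(x)

    def value(S):
        # Evaluate f with features in S taken from x, others from baseline.
        z = [x[i] if i in S else baseline[i] for i in range(n)]
        return f(z)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):  # subset sizes 0 .. n-1
            for S in combinations(others, k):
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += w * (value(set(S) | {i}) - value(set(S)))
    return phi

# Hypothetical toy model: one linear term plus one interaction term.
f = lambda z: 2.0 * z[0] + z[1] * z[2]
x = [1.0, 1.0, 1.0]
baseline = [0.0, 0.0, 0.0]
phi = shapley_values(f, x, baseline)

# Local accuracy: the attributions sum exactly to f(x) - f(baseline).
assert abs(sum(phi) - (f(x) - f(baseline))) < 1e-9
```

Here the linear feature receives attribution 2.0 and the interaction credit is split evenly (0.5 each) between the two interacting features. Consistency follows from the same averaging: if the model is changed so a feature's marginal contribution never decreases for any subset, its Shapley value cannot decrease.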


Scott Lundberg is a Ph.D. candidate at the University of Washington's Paul Allen School of Computer Science and Engineering, working with Professor Su-In Lee at the intersection of machine learning and health/biology. Before coming to UW, he received his B.S. and M.S. from Colorado State University in 2008, and then worked for five years as a research scientist at Numerica Corporation. Scott is currently supported by an NSF Graduate Research Fellowship and seeks to improve health and medicine through AI.
