ShapShift: Explaining Model Prediction Shifts with Subgroup Conditional Shapley Values

📰 ArXiv cs.AI

arXiv:2604.11200v1 Announce Type: cross Abstract: Changes in input distribution can induce shifts in the average predictions of machine learning models. Such prediction shifts may impact downstream business outcomes (e.g., a bank's loan approval rate), so understanding their causes can be crucial. We propose ShapShift: a Shapley value method for attributing prediction shifts to changes in the conditional probabilities of interpretable subgroups of data, where these subgroups are defined by the stru…
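The idea can be illustrated with a toy sketch (not the paper's estimator, whose details are not in this excerpt): if the model's average prediction decomposes as a proportion-weighted sum of per-subgroup mean predictions, a shift in that average between an "old" and a "new" population can be attributed to each subgroup's change in proportion via exact Shapley values. The subgroup names, proportions, and per-group means below are hypothetical.

```python
from itertools import combinations
from math import factorial

def mean_prediction(props, group_means):
    """Population mean prediction given subgroup proportions (renormalized)."""
    total = sum(props.values())
    return sum(props[g] / total * group_means[g] for g in props)

def shapley_shift(old_props, new_props, group_means):
    """Exact Shapley attribution of the mean-prediction shift to subgroups.

    Players are subgroups; a coalition's value is the mean prediction when
    coalition members use their new proportion and the rest keep the old one.
    """
    groups = list(old_props)
    n = len(groups)

    def value(coalition):
        props = {g: (new_props[g] if g in coalition else old_props[g])
                 for g in groups}
        return mean_prediction(props, group_means)

    phi = {}
    for g in groups:
        others = [h for h in groups if h != g]
        contrib = 0.0
        for k in range(n):
            for S in combinations(others, k):
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                contrib += w * (value(set(S) | {g}) - value(set(S)))
        phi[g] = contrib
    return phi

# Hypothetical loan-approval model: per-subgroup mean approval scores,
# with subgroup proportions drifting between two time periods.
group_means = {"employed": 0.8, "self_employed": 0.5, "unemployed": 0.2}
old = {"employed": 0.6, "self_employed": 0.3, "unemployed": 0.1}
new = {"employed": 0.4, "self_employed": 0.3, "unemployed": 0.3}

phi = shapley_shift(old, new, group_means)
total_shift = mean_prediction(new, group_means) - mean_prediction(old, group_means)
# Efficiency property of Shapley values: attributions sum to the total shift.
assert abs(sum(phi.values()) - total_shift) < 1e-9
```

The assertion checks the Shapley efficiency property, which is what makes the attributions a proper decomposition of the observed shift rather than heuristic scores.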

Published 14 Apr 2026