Abstract
The application of machine learning to decision-making has become widespread. However, investigating the impact of interventions on decision-making tasks poses significant challenges, particularly when observable data are limited. When an intervention is introduced into a decision-making system, its effects extend beyond the targeted variables, propagating through a cascade of causal relationships that influences the entire decision-making process. Consequently, traditional decision-making methods based on static data are insufficient, as they fail to account for the full post-intervention data. To address these challenges, this paper proposes a framework for simulating interventions in decision-making tasks. By uncovering the causal relationships within a dataset, the framework infers post-intervention data. We design two inference methods within the framework: direct computation of weights and model-fitted weights. We apply the proposed framework and algorithms to simulate data changes in two scenarios: PCOS Prediction and Law School Admissions. By incorporating validated knowledge, the experimental results demonstrate that our framework simulates the intervention process more realistically and provides more reliable outcomes for machine learning tasks than static decision-making with interventions.