Abstract
Intervention evaluation is critical for determining the value of health interventions; however, real-world implementation frequently falls short of achieving anticipated large-scale impacts. This evidence-to-practice gap often arises from challenges in capturing the complexity inherent in intervention implementation. This complexity may stem from the intervention itself, the dynamic and interrelated processes of dissemination, implementation, and sustainment, or the constraints of real-world settings characterized by interconnected systems. Integrating implementation science, which employs theories, models, and frameworks to understand the adoption and integration of evidence-based interventions, with systems science, which provides tools to model and analyze complex systems, offers a promising pathway for addressing these challenges. However, practical guidance on combining these approaches to evaluate dynamic interactions between interventions and implementation contexts, while simultaneously capturing system-level learnings, remains limited. In this methodological musing, we reflect on our experience integrating systems and implementation science to develop a conceptual and quantitative model for scenario evaluation of a maternal health service delivery redesign initiative in Kakamega, Kenya. We organize our reflections around four research objectives, articulated through three steps of an evaluation process: (i) developing a qualitative systems model using implementation frameworks and causal loop diagrams; (ii) constructing and parameterizing a quantitative computational model; and (iii) conducting scenario analyses to explore "what-if" strategies and inform adaptive planning. These reflections highlight the potential strengths of an integrated approach and offer practical considerations for researchers and practitioners evaluating complex health interventions through quantitative modeling and scenario development.