This chapter examines developments in systematic reviewing and what they mean for the search process. It introduces recent methods of synthesis, including mixed methods reviews and realist syntheses. It goes on to consider how innovative review types affect how we plan and execute search strategies. The chapter explores the value of other question formats and why you should choose a method appropriate to your review question. It starts by examining acknowledged limitations of the conventional systematic review model, both within health and across the many other disciplines in which systematic reviews are gaining traction.
Challenging the conventional systematic review model
Since the mid-1990s, the conventional systematic review model, which assumes a closely focused question, a comprehensive search and a focus on synthesising ‘stronger’ rather than ‘weaker’ evidence, has proved remarkably flexible and resilient (Petticrew, 2015). The model has survived transfer to other emerging domains for systematic reviews, such as education, management and environmental science. However, this model faces renewed challenges as decision-makers seek to address ever more complex questions and to explore complex socially-mediated interventions, requiring diverse data and approaches to synthesis, and consequent changes to how studies are identified and selected.
Petticrew characterises the current era as moving from ‘What works’ to ‘What happens’ (Petticrew, 2015). Specifically, new forms of knowledge synthesis are required to explore people's perceptions of their situation, to identify underlying theories to explain how an intervention, policy or programme works, or to understand what makes an intervention, policy or programme more or less likely to be implemented (Tricco et al., 2016a). This, in turn, requires ‘other types of search approaches (e.g. snowballing of articles, focusing on identification of key theories)’ (Tricco et al., 2016b, 20) as well as others covered in this chapter.
The time and cost associated with producing systematic reviews offer a further challenge to their use in supporting decision-making (Tricco et al., 2017). Technical aspects of the search process are challenged by the rapid review movement, which has reinterpreted ‘How far should you go?’ (Ogilvie et al., 2005) as ‘How little searching is enough?’ (Booth, 2010). One inevitable result is that three assumptions of the ‘systematic review catechism’ – the closely focused question, the comprehensive search and the privileging of rigour – are all receiving critical scrutiny.