
A Framework to Assess the Quality of Non-traditional Articles in the Field of Disaster Response and Management

  • Mary Hall, Chris Cartwright and Andrew C. K. Lee



While carrying out a scoping review of earthquake response, we found that there is no universally standardized approach for assessing the quality of disaster evidence, much of which is of variable quality or not peer reviewed. In the absence of a framework to ascertain the value and validity of this literature, there is a danger that valuable insights may be lost. We propose a theoretical framework that may, with further validation, address this gap.


Existing frameworks – quality of reporting of meta-analyses (QUORUM), meta-analysis of observational studies in epidemiology (MOOSE), the Cochrane assessment of bias, Critical Appraisal Skills Programme (CASP) checklists, strengthening the reporting of observational studies in epidemiology (STROBE), and consensus guidelines on reports of field interventions in disasters and emergencies (CONFIDE) – were analyzed to identify key domains of quality. Supporting statements, based on these existing frameworks, were developed for each domain to form an overall theoretical framework of quality. This framework was piloted on a data set of publications from a separate scoping review.


Four domains of quality were identified – robustness, generalizability, added value, and ethics – with 11 scored supporting statements. Although 73 of 111 papers (66%) scored below 70%, a sizeable portion (34%) scored higher.


Our theoretical framework presents, for debate and further validation, a method of assessing the quality of non-traditional studies, thus supporting the best-available-evidence approach to disaster response. (Disaster Med Public Health Preparedness. 2019;13:147–151)


Corresponding author

Correspondence and reprint requests to Ms Mary Hall, School of Health and Related Research, University of Sheffield, 30 Regent St, Sheffield, S1 4DA, UK.


References
1. Knox Clarke, P, Darcy, J. Insufficient Evidence? The Quality of Use of Evidence in Humanitarian Action. ALNAP Study. London: ALNAP/ODI; 2014.
2. Challen, K, Lee, ACK, Booth, A, et al. Where is the evidence for emergency planning: a scoping review? BMC Public Health. 2012;12:542.
3. Bradt, DA. Evidence-based decision making (part 1): origins and evolution in the health sciences. Prehosp Disaster Med. 2009;24(4):298-304.
4. Bradt, DA, Aitken, P. Disaster medicine reporting: the need for new guidelines and the CONFIDE statement. Emerg Med Australas. 2010;22:483-487.
5. Cartwright, C, Hall, M, Lee, ACK. The changing health priorities of earthquake response and implications for preparedness: a scoping review. Public Health. 2017;150:60-70.
6. Moher, D, Cook, DJ, Eastwood, S, et al. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUORUM statement. Br J Surg. 2000;87:1448-1454.
7. Stroup, DF, Berlin, JA, Morton, SC, et al. Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis of Observational Studies in Epidemiology (MOOSE) Group. JAMA. 2000;283:2008-2012.
9. CASP checklists. Published 2018. Accessed February 27, 2017.
10. von Elm, E, Altman, DG, Egger, M, et al. Strengthening the reporting of observational studies in epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ. 2007;335:806-808.
11. De Brún, C. Finding the evidence: a key step in the information production process. The Information Standard; 2013. Accessed October 11, 2017.
12. Evidence Aid. Published 2018. Accessed May 10, 2018.
13. Disaster Information Management Resource Center. Accessed May 10, 2018.


