
Peer-Reviewed Validation of a Comprehensive Framework for Disaster Evaluation Typologies

Diana F. Wong (a1) (a2), Caroline Spencer (a1), Leanne Boyd (a3) (a4), Frederick M. Burkle (a1) (a5) (a6), and Frank Archer (a1)

Abstract

Introduction:

The Comprehensive Framework for Disaster Evaluation Typologies, developed in 2017 (CFDET 2017), aims to unify the evaluation typologies found in the disaster setting and to facilitate agreement on their identification, structure, and interrelationships. A peer-reviewed validation process sought input from international experts in the fields of disaster medicine, disaster/emergency management, humanitarian/development, and evaluation. This paper describes the validation process, its results, and its outcomes.

Research Problem:

Previous frameworks identified in the literature lack validation and consistent terminology. To gain credibility and utility, this unique framework needed validation by international experts in the disaster setting.

Methods:

A mixed-methods approach was designed to validate the framework. An initial iterative process informed an online survey that combined five-point Likert-scale items with open-ended questions. Pre-determined consensus thresholds, informed by a targeted literature review, provided the validation criteria.
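The paper does not publish its analysis procedure in code, but the logic of testing Likert-scale responses against a pre-determined consensus threshold can be illustrated with a minimal sketch. Everything below is hypothetical: the item names, the response data, the definition of agreement as a rating of 4 or 5, and the 80% threshold are assumptions for illustration only, not values reported in the study.

```python
# Illustrative sketch only: item names, responses, the agreement rule,
# and the 80% threshold are hypothetical, not values from the study.

CONSENSUS_THRESHOLD = 0.80  # assumed pre-determined agreement threshold

# Hypothetical five-point Likert responses (1 = strongly disagree ... 5 = strongly agree)
responses = {
    "framework_elements": [5, 4, 4, 3, 5, 4, 5, 4, 2, 5],
    "usefulness_for_teaching": [5, 5, 4, 4, 4, 5, 3, 5, 4, 4],
}

def agreement_rate(ratings):
    """Proportion of respondents rating the item 4 (agree) or 5 (strongly agree)."""
    return sum(r >= 4 for r in ratings) / len(ratings)

for item, ratings in responses.items():
    rate = agreement_rate(ratings)
    status = "consensus reached" if rate >= CONSENSUS_THRESHOLD else "below threshold"
    print(f"{item}: {rate:.0%} agreement -> {status}")
```

In such a design, elements meeting the threshold would be retained, while those falling short would be flagged for revision informed by the open-ended responses.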

Results:

A sample of 33 experts from 11 countries responded to the validation process. Quantitative measures largely supported the elements and relationships of the framework, and strongly supported its value and usefulness for supporting, promoting, and undertaking evaluations, as well as for teaching evaluation in the disaster setting. Qualitative input suggested opportunities to strengthen and enhance the framework. Responses intended to clarify the barriers to, and enablers of, undertaking disaster evaluations were limited. Potential self-selection bias among respondents may be a limitation of this study; however, the attainment of high consensus thresholds provides confidence in the validity of the results.

Conclusion:

For the first time, a framework of this nature has undergone a rigorous validation process by experts in three related disciplines at an international level. The modified framework, CFDET 2018, provides a unifying structure within which existing evaluation typologies can be organized. It gives evaluators confidence to choose an appropriate strategy for their particular evaluation in the disaster setting, and it facilitates consistency of reporting across the different phases of a disaster, supporting a better understanding of the processes, outcomes, and impacts of interventions, as well as their efficacy and efficiency. Future research could create a series of toolkits to support improved disaster evaluation processes and could evaluate the utility of the framework in real-world settings.

Corresponding author

Correspondence: Diana Wong, PhD, MCP Nsg, Monash University Disaster Resilience Initiative (MUDRI), Monash University, Melbourne, Australia. E-mail: Diana.Wong2@health.nsw.gov.au


Supplementary Materials

Wong et al. supplementary material 1 (PDF, 2.8 MB)

Wong et al. supplementary material 2 (PDF, 952 KB)
