Evaluation of animal and public health surveillance systems: a systematic review

  • J. A. DREWE (a1), L. J. HOINVILLE (a2), A. J. C. COOK (a3), T. FLOYD (a2) and K. D. C. STÄRK (a1)...

Summary

Disease surveillance programmes ought to be evaluated regularly to ensure they provide valuable information efficiently. Evaluation of human and animal health surveillance programmes around the world is currently not standardized and is therefore inconsistent. The aim of this systematic review was to examine surveillance system attributes and the methods used for their assessment, together with the strengths and weaknesses of existing frameworks for evaluating surveillance in animal health, public health and allied disciplines. Information from 99 articles describing the evaluation of 101 surveillance systems was examined. A wide range of approaches for assessing 23 different system attributes was identified, although most evaluations addressed only one or two attributes and comprehensive evaluations were uncommon. Surveillance objectives were often not stated in the articles reviewed, so the reasons for choosing certain attributes for assessment were not always apparent; this has the potential to produce misleading evaluation results. Given the wide range of system attributes that may be assessed, methods should be explored that collapse these into a small number of grouped characteristics by focusing on the relationships between attributes and their links to the objectives of the surveillance system and of the evaluation. A generic and comprehensive evaluation framework could then be developed, consisting of a limited number of common attributes together with several sets of secondary attributes selected according to the disease or range of diseases under surveillance and the purpose of the surveillance. Economic evaluation should be an integral part of the surveillance evaluation process, as this would significantly benefit decision-makers, who often need to make choices based on limited or diminishing resources.

Corresponding author

*Author for correspondence: Dr J. A. Drewe, Centre for Emerging, Endemic and Exotic Diseases, Royal Veterinary College, Hawkshead Lane, North Mymms, Herts, AL9 7TA, UK. (Email: jdrewe@rvc.ac.uk)

