
Reporting and presenting information retrieval processes: the need for optimizing common practice in health technology assessment

Published online by Cambridge University Press:  13 October 2010

Christina Niederstadt
Affiliation:
Medical Review Board of the German Statutory Health Insurances Lower Saxony (MDK Niedersachsen)
Sigrid Droste
Affiliation:
Institute for Quality and Efficiency in Health Care (IQWiG)

Abstract

Background: Information retrieval (IR) in health technology assessment (HTA) calls for transparency and reproducibility, but common practice in the documentation and presentation of this process falls short of this demand.

Objectives: Our objective is to promote good IR practice by presenting the conceptualization of the retrieval process in a form readable by non-information specialists, and by reporting search strategies as they are actually processed.

Methods: We performed a comprehensive database search (April 2010) to synthesize the current state of the art. We then developed graphical and tabular presentation methods, tested their feasibility on existing research questions, and defined recommendations.

Results: No generally accepted standard for reporting IR in HTA exists. We therefore developed templates for presenting the retrieval conceptualization, database selection, and additional hand-searching, as well as for presenting search histories of complex and lengthy search strategies. No single template fits all conceptualizations, but some can be applied to most processes. Database interface providers report queries as entered, not as they are actually processed. In PubMed®, the often substantial difference between the entered and the processed query is shown under “Details.” Quality control and evaluation of search strategies with a validated tool such as the PRESS checklist is suboptimal when only entry-query-based search histories are available.
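The distinction between entered and processed queries can also be retrieved programmatically: NCBI's E-utilities return the processed form of a PubMed query in the `QueryTranslation` element of an esearch response. The sketch below parses an illustrative (not live) response; the expansion shown is a plausible example of PubMed's Automatic Term Mapping, not a verbatim PubMed output.

```python
import xml.etree.ElementTree as ET

# Illustrative esearch.fcgi response (hard-coded here to stay offline).
# PubMed's "Details" display corresponds to the <QueryTranslation> element;
# the expansion below is an assumed example of Automatic Term Mapping.
sample_response = """<eSearchResult>
  <Count>12345</Count>
  <QueryTranslation>"aspirin"[MeSH Terms] OR "aspirin"[All Fields]</QueryTranslation>
</eSearchResult>"""

entered_query = "aspirin"
processed_query = ET.fromstring(sample_response).findtext("QueryTranslation")

print("Entered:  ", entered_query)
print("Processed:", processed_query)
```

Reporting the processed form rather than the short entered form is what makes a search history reproducible and reviewable.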

Conclusions: Moving toward an internationally accepted IR reporting standard calls for advances in common reporting practice. Comprehensive, process-based reporting and presentation would make IR understandable to readers other than information specialists and facilitate quality control.

Type
THEME SECTION: INFORMATION RETRIEVAL FOR HTA
Copyright
Copyright © Cambridge University Press 2010


References

1. Booth, A. “Brimful of STARLITE”: Toward standards for reporting literature searches. J Med Libr Assoc. 2006;94:421-429, e205.
2. Busse, R, Orvain, J, Velasco, M, et al. Best practice in undertaking and reporting health technology assessments: Working group 4 report. Int J Technol Assess Health Care. 2002;18:361-422.
3. Canadian Agency for Drugs and Technologies in Health. PRESS: Peer review of electronic search strategies. Ottawa: CADTH; 2008.
4. Centre for Evidence Based Medicine (CEBM), University of Oxford. Asking focused questions. http://www.cebm.net/index.aspx?o=1036 (accessed May 11, 2010).
5. Centre for Reviews and Dissemination. Systematic reviews: CRD's guidance for undertaking reviews in health care. York: CRD, University of York; 2009.
6. Danish Centre for Health Technology Assessment. Health technology assessment handbook. Copenhagen: DACEHTA; 2008.
7. Etutorials.org. Learning UML. http://etutorials.org/programming/learning+uml/ (accessed May 11, 2010).
8. Golder, S, Loke, Y, McIntosh, HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61:440-448.
9. International Network of Agencies for Health Technology Assessment. A checklist for health technology assessment reports. Stockholm: INAHTA; 2007.
10. Kitchenham, B. Procedures for performing systematic reviews. Eversleigh: National Information and Communications Technology Centre of Excellence Australia (NICTA); 2004.
11. Liberati, A, Altman, DG, Tetzlaff, J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. PLoS Med. 2009;6:e1000100.
12. Patrick, TB, Demiris, G, Folk, LC, et al. Evidence-based retrieval in evidence-based medicine. J Med Libr Assoc. 2004;92:196-199.
13. Roundtree, AK, Kallen, MA, Lopez-Olivo, MA, et al. Poor reporting of search strategy and conflict of interest in over 250 narrative and systematic reviews of two biologic agents in arthritis: A systematic review. J Clin Epidemiol. 2009;62:128-137.
14. Sampson, M, McGowan, J, Tetzlaff, J, Cogo, E, Moher, D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol. 2008;61:748-754.
15. Sandelowski, M, Barroso, J. Handbook for synthesizing qualitative research. New York: Springer; 2007.
16. Shea, BJ, Grimshaw, JM, Wells, GA, et al. Development of AMSTAR: A measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007;7:10.
17. Spoerri, A. InfoCrystal: A visual tool for information retrieval. In: Card, SK, MacKinlay, JD, Shneiderman, B, eds. Readings in information visualization: Using vision to think. San Diego: Academic Press; 1999.
18. Stroup, DF, Berlin, JA, Morton, SC, et al. Meta-analysis of observational studies in epidemiology: A proposal for reporting. JAMA. 2000;283:2008-2012.
19. The AGREE Collaboration. Appraisal of Guidelines for Research and Evaluation: AGREE instrument. London: St George's Hospital Medical School; 2001.
20. The Campbell Collaboration Steering Committee, ed. The Campbell Collaboration information retrieval policy brief. Oslo: The Campbell Collaboration; 2004.
21. The Campbell Collaboration. Systematic review information retrieval checklist: Revised 13/02/2009. Oslo: The Campbell Collaboration; 2009.
22. The Cochrane Collaboration. Cochrane handbook for systematic reviews of interventions: Version 5.0.2. Oxford: The Cochrane Collaboration; 2009.
23. Yoshii, A, Plaut, DA, McGraw, KA, Anderson, MJ, Wellik, KE. Analysis of the reporting of search strategies in Cochrane systematic reviews. J Med Libr Assoc. 2009;97:21-29.
Supplementary material

Niederstadt and Droste supplementary material: Tables and figures (File, 200 KB)