
A Technical Review of the ISPOR Presentations Database Identified Issues in the Search Interface and Areas for Future Development

Published online by Cambridge University Press:  08 March 2022

Chris Cooper*
Affiliation:
Department of Clinical, Educational and Health Psychology, University College London, London, United Kingdom
Anna Brown
Affiliation:
Warwick Medical School, University of Warwick, Coventry, United Kingdom
Rachel Court
Affiliation:
Warwick Medical School, University of Warwick, Coventry, United Kingdom
Ute Schauberger
Affiliation:
Independent Researcher, Glasgow, Scotland
*Author for correspondence: Chris Cooper, E-mail: Ucjucc4@ucl.ac.uk

Abstract

Objective

To undertake a technical review of the search interface of the ISPOR Presentations Database. By technical review, we mean an evaluation of the technical aspects of the search interface and the functionality that a user must navigate to complete a search.

Methods

A validated checklist (Bethel and Rogers, 2014, Health Info Libr J, 31, 43-53) was used to identify where the interface performed well, where the interface was adequate, where the interface performed poorly, where functionality available in core biomedical bibliographic databases does not exist in the ISPOR database, and to establish a list of any issues arising during the review. Two researchers independently undertook the technical review in October 2021.

Results

The ISPOR database scored 35 of a possible 165 (27/111 essential criteria and 8/54 desirable criteria). Two issues arising were identified, both of which will cause searchers to miss potentially eligible abstracts: (i) search terms that include * or ? as truncation or wildcard symbols should not be capitalized (e.g., cost* not Cost*; organi?ation not Organi?ation) and (ii) quotation marks should be straight sided in phrase searching (e.g., "cost analyses" not “cost analyses”).

Conclusions

The ISPOR database is a promising and free database to identify abstracts/posters presented at ISPOR. We summarize two key issues arising, and we set out proposed changes to the search interface, including: the ability to export abstracts to a bibliographic tool, the ability to export search strategies, a researcher account, and an updated help guide. All suggestions would further improve this helpful database.

Type
Assessment
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press

The Professional Society for Health Economics and Outcomes Research, also known as ISPOR (formerly, the International Society for Pharmacoeconomics and Outcomes Research), is a not-for-profit organization focused on advancing the science and practice of health economics and outcomes research worldwide (1). As part of their work, ISPOR run a number of conferences and summits where delegates and ISPOR members can present their research. Although the presentations focus on health economics and outcomes research, the topics presented by researchers can be broad in scope. This makes the ISPOR Presentations Database a recommended resource for researchers interested in health technology assessment and a valuable resource for researchers looking for economic, cost, or outcomes data across a range of health topics (2;3).

Abstracts of podium and poster presentations from ISPOR meetings are reported in the journal Value in Health, but ISPOR also offer a free, online, searchable database, the ISPOR Presentations Database (4). In this paper, we report a technical review of the search interface of the ISPOR Presentations Database. The reason for this technical review is an observation that some of the search functionality in the presentations database does not appear to work, and that some traditional search functionality, which is not mentioned in the help guide, does appear to work. Furthermore, we see areas for development which, we believe, would improve the database. This technical review aims to empirically identify issues arising and areas for development, to provide searchers with clear guidance, and to make recommendations for improvements.

Research Aim

To undertake a technical review of the search interface of the ISPOR Presentations Database. By technical review, we mean an evaluation of the technical aspects of the search interface and the functionality that a user must navigate to complete a search (5). We subdivide this aim into five objectives:

  1. to identify areas where the interface performed well;

  2. to identify areas where the interface was adequate;

  3. to identify areas where the interface performed poorly;

  4. to identify areas where functionality available in core biomedical bibliographic databases does not exist; and

  5. to establish a list of any issues arising in review.

Methods

We apply the same checklist and methods as in our recent technical review of trials registry resources, reported elsewhere (5).

The Checklist

To ensure a transparent and replicable review, we used a checklist developed by Bethel and Rogers to assess the ability of bibliographic database platforms to process literature searches for systematic reviews. Bethel and Rogers identified ten domains that they considered necessary in the search interface of a bibliographic database to process a literature search for a systematic review (see Table 1). Within these domains, fifty-five individual criteria were identified as either essential (n = 37) or desirable (n = 18). Bethel and Rogers indicate that a researcher should assign a score of 1–3 to grade each criterion (5;6).

Table 1. Summary of the Bethel and Rogers Checklist

We used the checklist exactly as reported by Bethel and Rogers, but we amended the scoring criteria by adding a score of zero (0). The original checklist gave a score of 1 where an interface did not perform the function, or where the function was so difficult to find that it was deemed "ineffective" (6). We felt that there was a perceptible difference between an interface performing a function poorly, or making it difficult to find, and the function not existing at all. Specifically indicating which functionality does not exist could help the interpretation of findings. Our revised scoring criteria are set out in Table 2.

Table 2. Scoring and Research Aims

Undertaking the Technical Review

C.C. and A.B. independently reviewed the search interface provided on the ISPOR website: https://www.ispor.org/heor-resources/presentations-database/search.

C.C. used an Apple Macintosh (OS Catalina, Version 10.15.6) and the Firefox web browser (92.0.1, 64-bit) with a Virtual Private Network (SurfShark, version 3.7.0). All cookies were cleared prior to testing, and the author was logged out of all linked accounts (e.g., Google) (7;8).

A.B. used a Dell laptop running Windows 10 Enterprise with the Microsoft Edge (version 94.0.992.50) and Google Chrome (version 94.0.4606.81) web browsers. The laptop is managed by the University of Warwick IT department, and the author used both browsers while logged in to her university Microsoft Live account.

The authors then met to reconcile the scores awarded (as is common in resolving study selection decisions in a systematic review) and to produce a final, unified score for each criterion. R.C. acted as a third reviewer, helping to test functionality where C.C. and A.B. produced different results.

We used the criterion-level scoring to structure our results, as set out in Table 2. We took a high unified score for a criterion (3/3) to represent the best possible search experience. Lower scores (1–2) indicated where the user experience might be suboptimal or could be improved, and a score of zero (0) indicated that a feature identified by Bethel and Rogers was absent from the ISPOR database. Issues arising were identified through discussion of the results. Recommendations for improving the database were made by reviewing criteria scoring 0 or 1 and by drawing on our knowledge as expert searchers.
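To make the arithmetic behind the unified scores explicit, the following minimal sketch (in Python; the data structure is our own illustration, not part of the Bethel and Rogers checklist) shows how criterion-level scores aggregate into the essential, desirable, and overall totals reported in the Results. With 37 essential and 18 desirable criteria, each scored 0–3, the maxima are 111, 54, and 165, respectively.

    # Minimal illustrative sketch: aggregate unified criterion scores.
    # Keys are criterion names; values are (is_essential, unified score 0-3).
    from typing import Dict, Tuple

    def total_scores(scores: Dict[str, Tuple[bool, int]]) -> Tuple[int, int, int]:
        essential = sum(score for is_essential, score in scores.values() if is_essential)
        desirable = sum(score for is_essential, score in scores.values() if not is_essential)
        return essential, desirable, essential + desirable

    example = {
        "Phrase searching": (True, 2),                 # essential, adequate
        "Single character truncation": (False, 2),     # desirable, adequate
        "Controlled vocabulary (example)": (True, 1),  # essential, performs poorly
    }
    print(total_scores(example))  # (3, 2, 5)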

Results

The checklist was applied independently by the authors in October 2021. No issues were reported with the database, and we believed the platform to be working correctly at the time of the review.

Figure 1 reports the unified scoring table for each of the ten domains. The combined score was 35 of a possible 165: 27 of a possible 111 for essential criteria and 8 of a possible 54 for desirable criteria. The unified reviewer scores are reported in Figure 1 alongside detail on the tests performed and notes made in testing. Below, we summarize scoring by objective.

Figure 1. Combined results in full.

Objective 1: to identify areas where the interface performed well (unified reviewer score of 3). Nine criteria achieved a score of 3, seven of which were essential criteria. These were in the searching (syntax), performance, and other domains. The two desirable criteria with a score of 3 were in the searching (syntax) domain.

Objective 2: to identify areas where the interface was adequate (unified reviewer score of 2). Two criteria achieved a score of 2: one essential criterion (phrase searching) and one desirable criterion (single character truncation). Both were in the searching (syntax) domain.

Objective 3: to identify areas where the interface performed poorly (unified reviewer score of 1). Four criteria achieved a score of 1, all of which were essential criteria. Three related to the controlled vocabulary domain and one to the domain "other."

Objective 4: to identify areas where functionality available in core biomedical bibliographic databases does not exist (unified reviewer score of 0). Forty criteria achieved a score of 0. Twenty-five of these were essential criteria and fifteen were desirable, relating to the following domains: searching (functions/syntax), field codes, controlled vocabulary, display (search/records), downloading, search history, performance, and "other."

Discussion

Previous technical reviews using the Bethel and Rogers checklist have focused on evaluating the search interfaces of biomedical bibliographic databases and trials register resources (5;6). These reviews found similar overall scores despite differences in the resources and their interfaces. We continue to find the Bethel and Rogers checklist a useful tool for assessing general search functionality, regardless of database content or purpose. We focus now on the issues arising in this evaluation (things that a searcher using the ISPOR database should be aware of) and changes that might be made to improve the search interface in future updates.

Issues Arising

We found differences between reviewers when testing the truncation feature. In further testing, we concluded that the search interface cannot process capitalization where a search term is truncated or includes a wildcard symbol, for instance: Cost* (n = 0) versus cost* (n = 22,556) (see Supplementary Figure 1). The capitalized C in Cost* appears to prevent retrieval. This also affects left truncation, for instance: *Diabetes (n = 0) versus *diabetes (n = 4,192) (see Supplementary Figure 2).

In common with bibliographic databases, the formatting of quotation marks (speech marks) impacts retrieval. For instance, we found differences between "cost analyses" (straight quotation marks) and “cost analyses” (“curly” quotation marks) (see Supplementary Figure 3), where curly quotation marks appear to affect retrieval. This does not affect users typing their search queries directly into the interface, but it is important for searchers who write and then copy and paste their search terms into the search interface, a recommended practice when translating search strategies developed in other resources to avoid misspellings.
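Both issues can be avoided by normalizing a search string before pasting it into the interface. The following minimal sketch (in Python) shows one way to do this; normalize_ispor_query is a hypothetical helper of our own, not part of any ISPOR tooling.

    # Minimal illustrative sketch: prepare a copied search string for the
    # ISPOR search box.
    def normalize_ispor_query(query: str) -> str:
        # Replace "curly" quotation marks with straight ones, since curly
        # quotes appear to affect phrase retrieval (Supplementary Figure 3).
        query = query.replace("\u201c", '"').replace("\u201d", '"')
        # Lowercase any term containing a truncation or wildcard symbol,
        # since capitalized truncated terms appear to return no results
        # (e.g., Cost* n = 0; cost* n = 22,556). Boolean connectors are
        # unaffected, as they contain no * or ? characters.
        terms = [t.lower() if ("*" in t or "?" in t) else t for t in query.split()]
        return " ".join(terms)

    print(normalize_ispor_query("Cost* AND \u201ccost analyses\u201d"))
    # cost* AND "cost analyses"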

Changes that Might Improve the Interface

The changes are proposed from the point of view of expert searchers and researchers undertaking searches for abstracts/posters or data to inform systematic reviews or evidence synthesis. This includes both full systematic searches and ad hoc iterative searches.

We propose that the following additions would improve the search interface and searcher experience:

  1. Search history: a search history function would allow a searcher to view their searches across a session of searching. An additional use of this feature would be functionality to combine searches, for example, to build up a search. The ability to edit search lines to account for adjustments to the search as it develops would also be desirable.

  2. Ability to export search strategies: that is, the ability to export the search strategies used to identify abstracts/posters when a search is completed. Exported search data should include the number of records returned by each search line. This would improve the transparency of searching and the rigor of the interface.

  3. Ability to download abstracts to a bibliographic tool and Excel: we see this as a key change, because it would allow searchers to export data for review and citation. The ability to export data in research information systems (RIS) format or to Excel would be favored. We consider RIS format the priority, because it works with bibliographic tools, such as EndNote and Zotero, with de-duplication tools, and with resources that manage the process of evidence synthesis, such as Covidence and EPPI-Reviewer (an illustrative RIS record is sketched after this list).

  4. A researcher account function: where searchers can save searches and set up automatic update searches.

  5. Updating the help guide: we propose adding to the help guide to make the following points clear:

    5.1. the search interface supports searching using parentheses; expert searchers who wish to construct complicated search strategies may find this useful to know, especially where searches can be saved for alerts on particular topics or methods;

    5.2. the interface supports phrase searching, that is, searching for a phrase (e.g., cost analysis) as opposed to individual search terms. It should be made clear that the formatting of the quotation marks affects retrieval (e.g., straight "this", not curly “this”);

    5.3. search terms, other than Boolean connectors, should not be capitalized, since capitalization appears to impact retrieval where truncation is used; and

    5.4. both * and ? can be used as wildcard or truncation symbols; however, * represents any number of characters (including zero), whereas ? represents exactly one character. For example, economic* will retrieve economic, economics, economical, economically, and so on, whereas economic? will retrieve only economics.
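To illustrate the export format proposed in point 3, below is a sketch of how a single conference abstract might look as an RIS record. The field values are invented for illustration; the tags themselves (TY, TI, AU, T2, PY, AB, ER) are standard RIS fields recognized by tools such as EndNote and Zotero.

    TY  - CONF
    TI  - An illustrative abstract title: a cost analysis
    AU  - Researcher, A.
    AU  - Researcher, B.
    T2  - ISPOR Europe 2021 (hypothetical meeting name)
    PY  - 2021
    AB  - The abstract text would appear here.
    ER  -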

We do not consider adding adjacency/proximity searching to be a priority, since the ISPOR database is relatively small and searches do not commonly yield large numbers of abstracts, unlike searches of a bibliographic database such as Embase.

We note that there are other ways to search ISPOR content, for example, through handsearching of Value in Health, or via Embase. Further research to compare commonly used interfaces to these resources for retrieval of published/citable ISPOR abstracts would be useful.

Limitations

The criteria in the Bethel and Rogers checklist could put smaller bibliographic databases, such as ISPOR's, at a disadvantage when their search interfaces are scored. We acknowledge (above) that the size of the ISPOR database makes some functionality less important than it might be in, say, MEDLINE (e.g., field codes and command line searching). In future, especially when comparing smaller databases, it might be desirable to weight the scoring according to the focus or priorities of the resource.
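As a sketch of what such weighting might look like (in Python; the domain scores and weights below are illustrative assumptions on our part, not values proposed by Bethel and Rogers):

    # Minimal illustrative sketch: weight each domain's score by its
    # relevance to the resource under review, so that functionality that
    # matters less for a small database (e.g., field codes) counts for less.
    domain_scores = {"Searching (syntax)": 3, "Field codes": 0, "Downloading": 1}
    weights = {"Searching (syntax)": 1.0, "Field codes": 0.2, "Downloading": 0.8}

    weighted_total = sum(domain_scores[d] * weights[d] for d in domain_scores)
    weighted_max = sum(3 * w for w in weights.values())
    print(f"{weighted_total:.1f} of a possible {weighted_max:.1f}")  # 3.8 of a possible 6.0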

Conclusions

A technical review of the search interface of the ISPOR database using a validated checklist generated a score of 35 of a possible 165: 27 of a possible 111 for essential criteria and 8 of a possible 54 for desirable criteria.

Following this review, we highlight two issues for researchers using the database, both of which will cause searchers to miss potentially eligible abstracts: (i) search terms that are truncated should not be capitalized (e.g., cost* not Cost*, and *diabetes not *Diabetes) and (ii) quotation marks should be straight sided in phrase searching (e.g., "cost analyses" not “cost analyses”).

We set out proposed changes to the search interface, including the ability to export abstracts to a bibliographic tool, the ability to export search strategies, a researcher account, and an updated help guide. All of these suggestions would further improve this helpful database.

Conflicts of Interest

The authors declare no potential conflicts of interest.

Supplementary Materials

To view supplementary material for this article, please visit http://doi.org/10.1017/S0266462322000137.

Funding

C.C. and U.S. received no funding to write this paper. A.B. and R.C. were supported by the NIHR Evidence Synthesis Program (14/25/05). The views expressed in this report are those of the authors and not necessarily those of the NIHR Evidence Synthesis Program or the Department of Health and Social Care. Any errors are the responsibility of the authors.

References

1. The Professional Society for Health Economics and Outcomes Research (ISPOR). About ISPOR—the professional society for health economics and outcomes research. Available at: https://www.ispor.org/about. Published 2021. Accessed 2021.
2. Kaunelis D, Glanville J. Costs and economic evaluation. SuReInfo. Available at: https://sites.google.com/york.ac.uk/sureinfo/home/costs-and-economic-evaluation. Published 2021. Accessed 2022.
3. Gebregergish SB, Hashim M, Heeg B, Wilke T, Rauland M, Hostalek U (2020) The cost-effectiveness of metformin in pre-diabetics: A systematic literature review of health economic evaluations. Expert Rev Pharmacoecon Outcomes Res. 20, 207-19. doi:10.1080/14737167.2020.1688146.
4. The Professional Society for Health Economics and Outcomes Research (ISPOR). ISPOR Presentations Database. Available at: https://www.ispor.org/heor-resources/presentations-database/search. Published 2021. Accessed 2021.
5. Cooper C, Court R, Kotas E, Schauberger U (2021) A technical review of three clinical trials register resources indicates where improvements to the search interfaces are needed. Res Synth Methods. 12, 384-93. doi:10.1002/jrsm.1477.
6. Bethel A, Rogers M (2014) A checklist to assess database-hosting platforms for designing and running searches for systematic reviews. Health Info Libr J. 31, 43-53. doi:10.1111/hir.12054.
7. Cooper C, Lorenc T, Schauberger U (2021) What you see depends on where you sit: The effect of geographical location on web-searching for systematic reviews: A case study. Res Synth Methods. 12, 557-70. doi:10.1002/jrsm.1485.
8. Rethlefsen ML, Kirtley S, Waffenschmidt S, et al. (2021) PRISMA-S: An extension to the PRISMA Statement for reporting literature searches in systematic reviews. Syst Rev. 10, 39. doi:10.1186/s13643-020-01542-z.