
Quality Indicators for Older Persons’ Transitions in Care: A Systematic Review and Delphi Process

Published online by Cambridge University Press:  03 June 2021

Kaitlyn Tate
Affiliation:
Faculty of Nursing, University of Alberta, Edmonton, Alberta
Sarah Lee
Affiliation:
Faculty of Nursing, University of Alberta, Edmonton, Alberta
Brian H Rowe
Affiliation:
Department of Emergency Medicine, University of Alberta, Edmonton, Alberta
Garnet E Cummings*
Affiliation:
Department of Emergency Medicine, University of Alberta, Edmonton, Alberta
Jayna Holroyd-Leduc
Affiliation:
Foothills Medical Centre, University of Calgary, Calgary, Alberta
R Colin Reid
Affiliation:
School of Health and Exercise Sciences, University of British Columbia, Okanagan, Kelowna, British Columbia
Rowan El-Bialy
Affiliation:
Schulich School of Business, York University, Toronto, Ontario
Jeffrey Bakal
Affiliation:
Department of Medicine, University of Alberta, Edmonton, Alberta
Carole A Estabrooks
Affiliation:
Faculty of Nursing, University of Alberta, Edmonton, Alberta
Carol Anderson
Affiliation:
Alberta Health Services, Edmonton, Alberta
Greta G Cummings
Affiliation:
Faculty of Nursing, University of Alberta, Edmonton, Alberta
*
Corresponding Author: La correspondance et les demandes de tirés-à-part doivent être adressées à : / Correspondence and requests for offprints should be sent to: Greta G. Cummings, Ph.D., R.N., F.A.A.N., F.C.A.H.S. Faculty of Nursing University of Alberta, 5-110 Edmonton Clinical Health Academy 11405 87 Avenue Edmonton, Alberta Canada T6G 1C9. (gretac@ualberta.ca)

Abstract

We identified quality indicators (QIs) for care during transitions of older persons (≥ 65 years of age). Through a systematic literature review, we catalogued QIs related to older persons’ transitions in care among continuing care settings and between continuing care and acute care settings and back. Through two Delphi survey rounds, experts ranked the relevance, feasibility, and scientific soundness of QIs. A steering committee then reviewed QIs for the feasibility of their capture in Canadian administrative databases. Our search yielded 326 QIs from 53 sources. A final set of 38 indicators that are feasible to measure in current practice was retained. The highest proportions of indicators were for the emergency department (47%) and the Institute of Medicine (IOM) quality domain of effectiveness (39.5%). Most feasible indicators were outcome indicators. Our work highlights a lack of standardized transition QI development in practice, and the limitations of current free-text documentation systems in capturing relevant and consistent data.

Résumé

Nous avons identifié des indicateurs de qualité (IQ) liés aux soins offerts lors des transitions de personnes âgées (> 65 ans). Par une revue systématique, nous avons catalogué les IQ associés aux transitions de soins de personnes âgées qui étaient transférées entre des établissements de soins continus, ainsi qu’entre ceux-ci et des établissements de soins actifs, et inversement. Deux cycles d’enquêtes Delphi ont été effectués. Des experts ont classé la pertinence, la faisabilité et la solidité scientifique des IQ. Notre comité directeur a examiné les IQ concernant la faisabilité de leur capture dans les bases de données administratives canadiennes. Notre recherche a mené à 326 IQ provenant de 53 sources. Un total de 38 indicateurs ont été sélectionnés en considération de la faisabilité de ces mesures dans la pratique actuelle. La plus grande proportion des indicateurs visait les services d’urgence (47 %) et l’efficacité selon les domaines de qualité de l’Institute of Medicine (IOM) (39,5 %). Les indicateurs présentant la meilleure faisabilité étaient ceux liés aux résultats. Notre étude met en évidence un développement insuffisant d’IQ standardisés pour les transitions en pratique, ainsi que des limites dans les systèmes de documentation actuellement en accès libre pour l’obtention de données pertinentes et cohérentes.

Type: Article
Copyright: © Canadian Association on Gerontology 2021

Health care service delivery for Canada’s vulnerable older adult population occurs in a number of settings and involves diverse groups of health providers, professions, and services. When the health status and care needs of older persons (≥ 65 years of age) change, they can be transferred from one health care setting to another (e.g., from their residential facility to acute care settings). Care during transitions of older persons can be fragmented, delayed, not evidence informed, and unsafe (Anderson, Allan, & Finucane, 2000; Coleman, 2003; Crilly, Chaboyer, & Wallis, 2006; Reid et al., 2013; Riaz & Brown, 2019; Trahan, Spiers, & Cummings, 2016). Poor quality of care transitions between residential long-term care (LTC) facilities or community care settings and acute care settings is linked to increased length of stay in hospital, increased dissatisfaction among providers and patients, increased risk of adverse patient events, and decreased quality of health care (Callahan et al., 2012; Coleman & Berenson, 2004; Crilly et al., 2006; McCloskey, 2011; Riaz & Brown, 2019; Scott, 2010; Tisminetzky et al., 2019). Additionally, although there are established quality indicators for care delivery within facility-based care settings (e.g., Resident Assessment Instrument [RAI] indicators), whether these indicators are applicable and used for transitions remains unclear (Hutchinson et al., 2010). A particular concern is ensuring that persons who rely on others during transitions, such as older persons with moderate to severe dementia, receive optimal patient-centred care (Banerjee, 2007).

Health systems require valid and reliable measures of quality to monitor, improve, and maintain high standards of care delivery for frail older persons during care transitions. Clinicians, health care managers, and policy makers are responsible for ensuring that care delivery for older persons across health care settings is monitored and evaluated based on the best available standards. When quality indicators (QIs) are identified and reported in areas of care delivery with high potential for improvement, they can provide measures for quality of care and improved patient outcomes (Hibbard, Stockard, & Tusler, 2005; Kraska, Krummenauer, & Geraedts, 2016).

This study examined the state of established QIs for vulnerable older adults experiencing transition(s) among multiple care settings, which could be between: (1) continuing care and community settings (LTC/nursing homes; assisted or supportive living facilities that provide accommodation, meals, and personal care for those who are medically and physically stable; and independent living with or without home care support); (2) emergency or non-emergency transport via ambulance, hereafter referred to as emergency medical services (EMS); (3) emergency departments (EDs); and (4) hospital in-patient settings (see Figure 1 for settings included). Our aim was to develop and validate a ranked set of evidence-based QIs for evaluating quality of care provided during care transition, and our objectives were to:

  1. Systematically review the current state of QI literature for care transitions experienced by older persons

  2. Validate QIs for older persons’ care transitions through a Delphi process

  3. Evaluate the feasibility of implementing the full set of QIs across care transitions

  4. Translate findings into practice through an integrated knowledge translation approach

Figure 1. Care transition process

Methods

During Phase 1 we conducted a systematic scoping review, informed by Arksey and O’Malley’s framework, in which researchers select the research question, search related studies, select eligible studies, and synthesize and tabulate key information to derive a report of findings (Arksey & O’Malley, 2005). We used the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines to guide reporting of the review (Moher, Liberati, Tetzlaff, Altman, & PRISMA Group, 2009). A University Health Research Ethics Board (PRO00069167) provided ethics approval for Phase 2: the Delphi process and steering committee feasibility review.

Inclusion and Exclusion Criteria

We defined QIs as indicators developed through a predetermined systematic process in which primary data collection and/or stakeholder involvement (Delphi process or expert panel) occurred in the identification or review of indicators (De Koning, 2007). We included all literature examining QIs applied in care settings where older persons receive care during transitions: residential care facilities (LTC/nursing homes, assisted living facilities, independent living with home care support), EMS, EDs, and hospital in-patient settings. We included all types of QIs (structure, process, and outcome). We placed no limitation on year of publication. We excluded literature examining QIs focused on (1) provision of care not within or directly leading to care transitions, (2) care delivery of a specific disease or condition not directly related to the transition process, and/or (3) individuals under the age of 65 (e.g., studies on maternal or child health). We included studies published in English only, as that was the only language shared among team members.

Search Strategy

An academic health sciences librarian assisted in developing the search strategy. Search terms included “quality indicator/standard of care/benchmarking/outcome measures”, “quality of health care/process assessment”, and “quality improvement/quality assurance”. Electronic databases searched included Cochrane Database of Systematic Reviews, Elton B. Stephens Company (EBSCO)host Cumulative Index to Nursing and Allied Health Literature (CINAHL) Plus, Institute for Scientific Information (ISI) Web of Science, Ovid Embase, Ovid MEDLINE®, and Scopus. Records were downloaded into EndNote™ and duplicates were removed. We actively sought grey literature in academic, government, and institutional websites that generated reports of QIs, but did not include theoretical articles, commentaries, or practice guidelines that did not include QIs. We used a previously pilot-tested Microsoft Access electronic form for data screening and extraction (Tate et al., 2019). See Appendix 1 for the detailed search strategy.

Screening Procedures

Six research team members (K.T., S.L., R.L., F.C., G.G.C., B.H.R.) met to affirm inclusion and exclusion criteria. Following removal of duplicates, one of four partnered reviewers (K.T., S.L., R.L., F.C.) independently screened every abstract. Partnered reviewers met after review of an initial 200 abstracts to ensure consistent interpretation of the inclusion and exclusion criteria. Discrepancy meetings occurred throughout screening to compare results and ensure clarity of inclusion criteria. When reviewers could not reach consensus through discussion, the senior author (G.G.C.) made the final decision. One of four partnered reviewers (K.T., S.L., R.L., F.C.) independently screened each full text manuscript using similar procedures.

Data Extraction

The following seven data elements were extracted from each study: (1) study characteristics (e.g., year of publication, year[s] of data collection, health care setting, theoretical framework, and objectives); (2) study design; (3) identified quality indicators; (4) methods for developing QIs and data source; (5) results; (6) study limitations; and (7) study conclusions. One of four reviewers (K.T., S.L., R.L., F.C.) independently extracted data from each included article, and each extraction was then verified by a second reviewer. We did not appraise study quality, because expert panelists would appraise all candidate QIs during the Delphi process, and the quality of a study would not necessarily reflect the quality of its QIs when the study addressed only part of, or more than, QI development.

Delphi Process for Evaluation

Before the Delphi process in Phase 2, team members reviewed and categorized indicators to avoid duplicate entries and clarify indicator parameters. To map indicators to the most relevant quality domain (Institute of Medicine [U.S.], 2001), six reviewers were paired and then independently coded extracted indicators from each included study according to: care setting (sending continuing care or community setting [residential care facility, home living setting], transport 1, ED, hospital/in-patient, or other continuing care setting, and, if applicable, transport 2, receiving seniors’ facilities/home living setting), as seen in Figure 1; Donabedian framework domain (structure, process, outcome); and Institute of Medicine (IOM) Domains of Quality (safe, effective, patient-centred, timely, efficient, equitable). Discrepancy meetings between partnered reviewers were held after coding was completed to ensure agreement among reviewers.

Integrated Knowledge Approach

We invited experts via e-mail to join our expert panel to review coded QIs across care transitions through a Delphi process using online surveys. We searched for and approached potential expert panelists based on their roles as authors and practice experts from relevant literature, and through suggestions from research team members. The e-mail invitation letter included a link to a Google Form survey to record their willingness to participate. To keep track of both affirmed and declined responses, only the names and e-mail addresses were recorded. No other identifying information was collected. The expert panelist participation record was kept in a password-protected document accessible only by the local research team. We aimed to recruit at least 20 expert panelist members to ensure a diverse panel (Boulkedid, Abdoul, Loustau, Sibony, & Alberti, 2011).

The Delphi process methods were adapted from Boulkedid et al. (2011). The adapted method had previously been used by a member of our research team (Schull et al., 2010). Study data were collected and managed using REDCap electronic data capture tools (Harris et al., 2009). We provided each expert panelist with a unique survey link and participant identifier. An invitation and three subsequent e-mail reminders were sent, approximately a week apart, based on a schedule adapted from Dillman, Smyth, and Christian’s method (Dillman, 2014).

Round 1

Expert panelists were asked to rate each QI on five domains using five-point Likert scales: scientific soundness, validity, feasibility, relevance, and importance (Boulkedid et al., 2011; Schull et al., 2011). We provided panelists with information about candidate indicators from the original sources, including numerator (number of cases that met the QI criteria), denominator (total number of cases subject to meeting QI criteria), source(s), applicable care setting, and method of QI development. Identified indicators were organized into five transition care settings (sending continuing care setting [residential care facility or home living setting], transport 1, ED, hospital, transport 2, receiving continuing care setting). We strategically assigned expert panelists across these settings so that each indicator was rated by a variety of experts from different specialties (i.e., researchers, clinicians, decision makers, older adults) who had the most expertise in care delivery in that particular setting (e.g., researchers focusing on ED care and geriatricians with ED experience were assigned to evaluate ED QIs). We used all responses (fully and partially completed) to classify each indicator as retained, borderline, or discarded. Participants added comments and rationales for each indicator rating to allow for qualitative feedback between rounds (Boulkedid et al., 2011; Schull et al., 2010). Four to seven experts rated each indicator, and all responses were weighted equally and combined. Indicators with a median score ≥ 4 on soundness and on at least one of the importance or relevance measures were retained. Indicators with scores between 3.0 and 3.9 on soundness and on at least one of the importance or relevance measures were borderline and kept for repeat assessment (Boulkedid et al., 2011; Schull et al., 2010). Any indicator with a score < 3.0 on soundness was discarded.
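
The Round 1 decision rules amount to a simple classification over median panel ratings. The Python sketch below (with hypothetical function and variable names, not drawn from the study) illustrates one way to apply them; how indicators scoring high on soundness but low on both importance and relevance were handled is not stated in the text, so the sketch defaults such cases to borderline as an assumption.

```python
from statistics import median

def classify_round1(soundness, importance, relevance):
    """Classify one candidate QI from the Round 1 panel ratings.

    Each argument is a list of 1-5 Likert scores from the four to seven
    panelists who rated the indicator; all responses are weighted equally.
    """
    s = median(soundness)
    # "At least one of the importance or relevance measures": take the better median.
    other = max(median(importance), median(relevance))
    if s >= 4 and other >= 4:
        return "retained"      # moved forward to the feasibility review
    if s < 3.0:
        return "discarded"
    return "borderline"        # assumption: everything in between goes to Round 2

# Hypothetical ratings from five panelists for one indicator
print(classify_round1(soundness=[4, 5, 4, 3, 4],
                      importance=[4, 4, 5, 4, 3],
                      relevance=[3, 4, 3, 3, 4]))  # -> retained
```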

Round 2

To maintain panelists’ continued engagement, we provided feedback between Rounds 1 and 2. Experts were given median scores and their initial individual score for each borderline indicator from Round 1. Lists of retained and discarded indicators from Round 1, including ID number and QI name, were sent to each panelist. Qualitative feedback from participants in Round 1 was used to clarify parameters of QIs. In Round 2, expert panelists were asked either to keep or discard each borderline indicator using the same information provided in Round 1. Experts were divided into two groups, each comprising a variety of different specialties (i.e., we aimed to have researchers, clinicians, and older adults with lived experiences, as well as representatives from various care settings distributed evenly between groups). Participants reviewed many of the same indicators from Round 1 in Round 2. Borderline indicators that received a vote of keep from at least half of the panelists were retained; remaining indicators were reclassified as discarded. Retained indicators were further assessed for feasibility and accessibility.
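
As a companion to the Round 1 rule, the Round 2 re-classification of borderline indicators reduces to a majority tally. The short sketch below (hypothetical names, a minimal illustration rather than the study's code) captures the rule that a borderline indicator was retained when at least half of the panelists who rated it voted "keep".

```python
def classify_round2(votes):
    """Re-classify one borderline indicator from Round 2 keep/discard votes.

    `votes` holds one "keep" or "discard" string per panelist who rated the
    indicator in Round 2; a tie counts as retained ("at least half").
    """
    keeps = sum(1 for v in votes if v == "keep")
    return "retained" if keeps >= len(votes) / 2 else "discarded"

print(classify_round2(["keep", "discard", "keep", "keep"]))         # -> retained
print(classify_round2(["discard", "discard", "keep", "discard"]))   # -> discarded
```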

Feasibility Review

A steering committee completed a feasibility review of the final indicators from the Delphi rounds to determine whether current Canadian administrative databases captured each indicator, and how easily such data could be retrieved. The Older Persons’ Transitions in Care (OPTIC) steering committee consisted of research team members (academics, data specialists, and health system decision makers) with substantive research, clinical, and administrative data expertise who were representative of various care settings. Prior to the in-person feasibility review, steering committee members searched available national health systems databases (as well as databases in one Western Canadian province) and extracted data elements that could be used to measure indicators under review. Databases identified and reviewed were the Canadian Institute for Health Information’s (CIHI) National Ambulatory Care Reporting System (NACRS) and Discharge Abstract Database (DAD), Alberta Continuing Care Information System (ACCIS), Canadian Patient Experiences Reporting System (CPERS), Continuing Care Reporting System (CCRS), Pharmaceutical Information Network (PIN), and regional databases in Edmonton and Calgary. From these databases, we identified relevant individual data elements (e.g., reported 30-day readmission rates, new medication “flags” that could be used to identify whether persons left hospital with new prescriptions) for each indicator, the unit of analysis captured, and whether collection of elements was mandatory, optional, or conditionally mandatory.

The OPTIC research team categorized each indicator as one of the following: (1) an established QI currently measured with a data set; (2) an indicator for which data elements are collected but not used; (3) an indicator for which some applicable databases/elements exist but may or may not be collected; or (4) an indicator for which no applicable database/elements are currently captured. These categorizations were completed independently by one team member, verified by another, and sent to the data expert for review of indicator data availability prior to the feasibility review. During the in-person feasibility review, the OPTIC steering committee reviewed and discussed individual indicators when it was unclear if and how current data in Canadian administrative databases could be used to measure them. The steering committee determined whether capture of retained indicators was feasible with existing data, required enhanced data collection, and/or was clinically valuable for improving care during transitions of older persons.
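
The four-way categorization used to prepare indicators for the in-person review can be read as a small decision cascade over data availability. The following Python sketch is a minimal illustration of that cascade, assuming three boolean flags per indicator; the flag names are hypothetical and are not taken from the study’s data dictionary.

```python
def categorize_data_availability(currently_measured, elements_collected, elements_exist):
    """Assign one retained indicator to the four pre-review categories.

    currently_measured: an established QI already measured with a data set
    elements_collected: required data elements are collected but not used
    elements_exist:     some applicable databases/elements exist but may or
                        may not be collected
    """
    if currently_measured:
        return 1  # established QI currently measured with a data set
    if elements_collected:
        return 2  # data elements collected but not used
    if elements_exist:
        return 3  # applicable databases/elements exist, collection uncertain
    return 4      # no applicable database/elements currently captured

# e.g., an indicator whose elements are collected but not yet used for QI reporting
print(categorize_data_availability(False, True, True))  # -> 2
```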

Results

Search Results

Our electronic database search yielded 10,487 unique records. Following abstract/title screening, 1,615 articles were retrieved for full text screening, of which a final 41 articles met inclusion criteria. Twelve other sources from grey literature searches met inclusion criteria, for a total of 53 articles. See Figure 2 for PRISMA Flow diagram of search and screening results.

Figure 2. PRISMA diagram

From the 53 articles, 326 candidate QIs were identified for review through the Delphi process. After coding into applicable domains, the 326 QIs (n = 266 established and n = 60 developing) included 35 (10.7%) structure, 212 (65.0%) process, and 79 (24.2%) outcome indicators. QIs were categorized into the IOM domains of timeliness (25%), effectiveness (24%), safety (21%), patient-centredness (19%), efficiency (10%), and equity (<1%). See Figure 3 for a visual display of review results by Donabedian framework, IOM quality domain, and care setting.

Figure 3. Indicators extracted from systematic review

Delphi Process Results

Round 1

Thirty-three of 39 invited experts initially agreed to participate. Participants included researchers on transitional or geriatric care or gerontology, clinicians and decision makers with experience in quality management and/or related research, and older adults with experience as an informal caregiver or recipient of care during a care transition. Twenty-two experts completed the survey for Round 1, three partially completed the survey, five did not complete the survey, and three had to withdraw prior to Round 1 completion because of time constraints.

Of the 326 indicators included in Round 1, 80 were classified as “retained”, 92 were classified as “discarded”, and 154 were classified as “borderline”. The 154 borderline indicators were included in the Round 2 survey, while the 80 retained indicators were moved forward for feasibility review by the steering committee. Although no clear patterns of response emerged based on expert specialty, most indicators discarded in Round 1 were discarded for lack of “clinical importance or relevance”. Specifically, participants felt that some quality indicators were not relevant in Canadian contexts. One Delphi panelist stated, “This is only relevant to UK or Australia ED contexts” in reference to the QI “Proportion of patients re-attending the ED seen by a more senior member of the ED medical staff (Middle Grade or Consultant)”, and another Delphi participant commented on the QI “Availability of ED observation beds”:

[Availability of ED observation beds] is very different in different health care systems – in the United States observation beds are often a means to address billing for ED services

Other experts felt that some indicators were not clinically meaningful (e.g., length of stay [LOS] in acute care services).

LOS – hard to determine what is appropriate since this is determined by complexity of the patient. To get a better understanding of transitions and quality of care and patient flow, it is critical to look at unnecessary LOS in acute care (aka Alternate level of care – ALC: patients who are in an acute care bed who no longer need the intensity of care provided by that unit).

Others felt that some indicators were no longer important based on more current best practices (e.g., the proportion of LTC residents who experienced an unintentional discontinuation of their statins upon returning to their LTC residence after an acute-care admission).

[Statins] are often not indicated or no longer effective. Not sure why we would pick Statins to gauge “unintentional discontinuation”.

Round 2

Of 22 experts who completed Round 1 surveys, 19 participated in Round 2. A total of 154 borderline indicators were split into two different surveys of 77 indicators each to ensure survey completion. Of the borderline group of indicators, 100 additional quality indicators were retained.

After both rounds, a total of 180 indicators was retained, while 146 were discarded. Retained indicators generally covered a similar range of transition settings, Donabedian framework types, IOM domains of quality, and care settings compared with the initially identified indicators. However, notable changes among retained indicators included fewer indicators that spanned multiple settings, and fewer indicators specific to transitions and palliative care. Qualitative feedback was not solicited for this round, as the intent was to provide feedback and clarify QI parameters (if possible) for Delphi panelists between rounds (Boulkedid et al., 2011; Schull et al., 2010). See Figure 4 for Delphi process classification results.

Figure 4. Indicators retained after Delphi process

Feasibility Review

Following the OPTIC steering committee’s review of the 180 retained QIs for feasibility, 7 indicators were feasible based on current use by the CIHI, 31 additional indicators were considered feasible and retained, and 142 indicators were deemed not feasible. Indicators were not feasible if (1) individual chart review was required to ensure data availability (n = 46), (2) procedures described in the indicator were not currently being performed (n = 6), (3) the indicator was not known to be documented (n = 17), (4) further indicator clarification was required in order to reasonably capture the indicator within current data platforms (n = 8), and/or (5) the indicator lacked clinical value or relevance based on current Canadian information systems (e.g., for the indicator “time from first contact with emergency and urgent care systems [EUCS] service to definitive care”, more targeted measures for specific conditions could be tracked and used more meaningfully than a general indicator) (n = 7).

The final set of 38 feasible indicators, drawn from 21 sources (see Appendix 2), spanned the following transition care settings: ED (n = 18), seniors’ facilities (n = 4), transport (n = 1), hospital (n = 5), palliative care (n = 7), and multiple settings (n = 3). See Figure 5 for results of the feasibility review by Donabedian framework, IOM quality domain, and care setting, and Table 1 for characteristics of included and feasible QIs.

Figure 5. Indicators retained after feasibility review

Table 1. Final set of retained indicators

Note. Sources for included Quality Indicators can be seen in Appendix 2. IOM = Institute of Medicine; LTC = long-term care; A&E = accident & emergency department; CTAS = Canadian Triage and Acuity Scale.

The steering committee identified knowledge gaps during their deliberations for the feasibility review. These include lack of standardized QI development applied in practice, no feasible indicators related to equity (e.g., age, sex/gender, race), a paucity of appropriate assessments (or documentation of assessments) of older persons across settings, and little to no screening done for baseline function, delirium, dementia, or cognitive impairment. Many proposed indicators require individual chart review.

Discussion

Using a robust mixed-method design and an integrated knowledge translation approach, this study identified 326 QIs cited in the literature and explored the feasibility of their reporting using standard administrative health databases. After an expert panel review, only 38 QIs were feasible to capture with existing databases and documentation practices within the Canadian context. Most feasible indicators related to acute care settings, were outcome or process indicators, and aligned with the IOM quality domain of effectiveness. Few feasible indicators were identified for EMS transport or seniors’ residential care settings, few were structure indicators, and few addressed the IOM domains of patient-centredness and equity.

Of the QIs identified in this review, many can be used to monitor and improve transitions to and from EDs and in-patient settings, particularly pertaining to timeliness and safety in the process of care delivery. Target wait times from ED arrival to disposition for older adults are often not met, and when older adults are hospitalized, they are at high risk of experiencing adverse events such as medication-related errors and in-hospital death (Cummings et al., 2020; Riaz & Brown, 2019; Tisminetzky et al., 2019). Although many older patients are discharged back to the community, they experience high rates of repeat ED visits and unplanned hospitalizations, largely attributed to unresolved problems and limited discharge planning (Ahn, Hussein, Mahmood, & Smith, 2020; Brennan, Chan, Killeen, & Castillo, 2015; Doupe et al., 2012). Identified QIs, although not comprehensive, offer an initial framework for building a suite of QIs for various transitions for older adults. QIs discarded during the feasibility review could be re-evaluated as electronic health records evolve, to determine whether their capture could become feasible by adding or mandating data elements. Further, QIs discarded based on relevance to Canadian contexts could be reviewed to determine whether they could be clinically important if modified and tested here.

The lack of feasible indicators outside of acute care settings is concerning. Issues that occur during the onset of transfer, such as incomplete or missing data on resident condition and goals of care, can negatively influence care throughout the transition process (Griffiths, Morphet, Innes, Crawford, & Williams, 2014). Although data are available for care delivery within continuing care settings (such as RAI-Minimum Data Set [MDS] 2.0 nursing home data; Estabrooks, Knopp-Sihota, & Norton, 2013), we found a lack of rigorously developed indicators for processes leading up to a decision to transfer and for the initial patient transfer process from continuing care settings. Despite existing research regarding trigger events leading to transfer to acute care services for older persons, only one feasible QI related to a trigger event (falls) was identified, and it was captured only as an element of LTC admission, not of transfer from continuing care to acute care services (Cummings et al., 2020; Dwyer, Stoelwinder, Gabbe, & Lowthian, 2015). Other QI reviews on care delivery for older adult populations report that most indicators focus on examinations and treatment for a specific disease, although limited measures are available to monitor safety and quality concerns where care services intersect (Joling et al., 2018; Laugaland, Aase, & Barach, 2011). Our results confirm the scarcity of available, feasible indicators related to transition onset. These types of indicators are integral in elucidating early concerns in transitions, determining a reference point of patient condition and context influencing perceived quality of the transition, and identifying and evaluating potentially avoidable transitions.

Our review highlights that despite guidelines being available for standardized QI development, validation and prioritization of many QIs do not meet standards of rigor (Kötter, Blozik, & Scherer, 2012). Many QIs were validated through consensus and lacked reported empirical testing; therefore, they still require better reporting on their development methods, pilot testing, operationalization with properly developed numerators and denominators (where applicable), and evaluation through more robust quantitative and mixed-methods designs (Kötter et al., 2012; Terrell et al., 2009; Wakai et al., 2013). Unfortunately, QIs have been used in applied research or practice without the preceding research necessary to ensure validity and utility of these measures after their initial identification (Mansoor & Al-Kindi, 2017; Saver et al., 2015). Moreover, some QIs (e.g., thresholds for certain types of screening related to cancer, diabetes, and dementia, as well as QIs for prescribing practices for diabetes) that are currently being used in hospital settings and are tied to financial incentives are selected because of their measurement ease and availability rather than because of their evidence base or representation as true markers of care quality (Saver et al., 2015). Even among QI sets considered to be of high quality (interRAI-Home Care QIs, Agency for Healthcare Research and Quality prevention QI sets, and Assessing Care of Vulnerable Elders [ACOVE]-3 indicator sets), only ACOVE-3 indicators have scored high enough for methodological quality based on “scientific evidence” (Burkett, Martin-Khan, & Gray, 2017; De Koning, 2007; Joling et al., 2018; Wenger et al., 2007). Further study will ensure that QIs for older persons’ care transitions meet established standards of development, and will determine the resources required to capture data to measure QIs (van Teijlingen & Hundley, 2002).

Our findings suggest that little to no systematic screening for baseline function, delirium, dementia, or cognitive impairment is occurring or being feasibly captured as older persons transition through acute care settings (Cummings et al., 2020). Some care activities may be performed but not documented, some are documented but are not easy to capture, and some may not be performed at all in current care settings.

Tracking of currently available indicators relies primarily on chart review, potentially from multiple care settings. Having standardized documentation that prompts certain assessments or activities to be completed (vs. solely free-text charting) offers a robust opportunity to improve both the care provided and continuity in care (Hustey & Palmer, 2010; Terrell et al., 2005; Zafirau, Snyder, Hazelett, Bansal, & McMahon, 2012). Antiquated and fragmented electronic tracking systems need to be consolidated and advanced to allow health care decision makers to better evaluate and improve older persons’ care during transitions, in recognition of their distinct care needs (Allen, Hutchinson, Brown, & Livingston, 2014). Standardized electronic documentation (e.g., drop-down menus, checklists) (McLane et al., 2022) also needs to be completed across care settings to maximize the benefits of using large clinical and administrative databases efficiently. Standardized electronic documentation allows for reliable, feasible tracking, and enhances the quality and completeness of the data tracked (Vuokko, Mäkelä-Bengs, Hyppönen, Lindqvist, & Doupi, 2017). Provincial policies, clinical guidelines, and practice standards should provide direction and governance related to data specifications and documentation practices that will allow for effective data integration across care settings and regions.

The electronic capture of valid and reliable data can be used for secondary purposes, such as the creation of QI dashboards for audit and feedback targeted at improving care for older persons (Lloyd, 2017; Vuokko et al., 2017). With a standardized electronic data platform, related QIs can be captured together, thereby supporting the display of QI information with statistical interpretations for knowledge users (Schall et al., 2017). This is a necessary step toward incorporating concepts of statistical process control (using statistics to monitor and improve quality), health informatics, and meaningful use of indicators in health care systems that consider context and missing data to drive change (Lloyd, 2017; Office of the National Coordinator for Health Information Technology, 2015; Spath, 2013; Tashobya et al., 2016). Ensuring data completeness has the potential to reduce the amount of superfluous data being captured and thereby reduce the resources needed to retrieve such data (Arthofer & Girardi, 2017).

No feasible equity indicators were identified that clearly compared care received by older persons with care received by the general population or by older persons living in their own homes. However, risk-adjusted QIs can statistically account for the influence of variables such as age, sex, and chronic conditions on the values and subsequent interpretation of QIs (Joling et al., 2018). Unfortunately, many QIs identified in this study, and in another review of QIs in older persons’ community care, are neither risk adjusted nor accompanied by strategies for risk adjustment in published reports (Joling et al., 2018). Having almost no information on how care is provided for older persons compared with other populations is alarming, as older persons are identified as one of the most disadvantaged and vulnerable patient groups (Johnstone & Kanitsaki, 2008). It is imperative that future research related to care transitions focus on the development and validation of feasible equity indicators with parameters that include comparators by age (Williams & Mohammed, 2009). A minimum set of essential, cross-setting transition QIs is needed, and should be rigorously developed, validated, and evaluated using available guidelines.

Limitations and Strengths

The systematic review component of this study may be limited by publication and selection bias. Key weaknesses in QIs for transitions were related to validation, empirical testing, and reporting of their development. Difficulties emerged when seeking knowledgeable experts in both older persons’ transitions in care and QIs. Many potential panelists were acknowledged as experts in older persons’ care but were unfamiliar with what constituted rigor in QI development, despite criteria being described and available on the online surveys. This study only examined feasibility related to data capture of QIs in Canadian contexts, and our findings may not be transferable to other regions in which health policy, health care delivery systems, and health informatics systems differ.

Strengths of our study included systematic selection of indicators through trained and independent research staff, and the diversity and number of experts included in our Delphi process and steering committee for feasibility review. A comprehensive search strategy was used to mitigate publication bias and to avoid selection bias. Efforts to maintain rigor were evident through individual coding, extraction, and consensus methods used in the Delphi process and feasibility review. Diversity in both the expert panel and steering committee reduced risk of monopolization of one discipline or setting, allowing for representation from stakeholders across the continuum of care.

Conclusion

Although numerous QIs have been developed and reported, the number of feasible QIs for older persons’ transitions in care is distressingly small. The QIs that do exist for older persons’ transitions in care are primarily for acute care settings, and almost none exist for tracking transitions across settings. A set of cross-setting transition QIs is needed, and should be developed, validated, and properly operationalized using available guidelines. Measurement and documentation practices need to be improved to increase the feasibility of capturing QIs, rather than settling for a system that adapts and implements QIs conforming to current, poor reporting practices. Future QI development should focus on standardized electronic reporting systems to better track data across settings. Each setting involved in care transitions should be held accountable for improving the quality of care experienced by older persons during transitions.

Acknowledgements

We acknowledge the contributions of research assistants Rory Lepage and Francisca Claveria, who participated in abstract and full-text screening and initial indicator coding, and Stephanie Couperthwaite, who assisted in the survey development process and REDCap survey administration. Finally, we thank all the Delphi expert panelists who gave permission to be named: Tammy Hopper, James L. Silvius, Navjot Virk, Ingrid Crowther, Karen Fruetel, Deniz Cetin-Sahin, Isabelle Vedel, Tammy Damberger, Machelle Wilchesky, Michael J Bullard, Jenny Basran, Barbara Liu, John Muscedere, Erika Dempsey, Angela Gulay, Douglas Faulder, Cliff Mitchell, Alison Hutchinson, and Denise S. Cloutier.

Author Contributions

All authors participated in the feasibility review process and interpretation of phase 2 study findings, participated on the OPTIC Steering Committee, and reviewed and contributed to draft manuscript versions. Authors K.T., S.L., J.H.L., R.C.R., G.G.C., and G.E.C. were involved in categorizing quality indicators (QIs) and interpretation of study findings for phase 1. K.T., S.L., B.H.R., R.C.R., and G.G.C. contributed to clarification of inclusion and exclusion criteria for phase 1. Author R.E.B. drafted preliminary results of the Delphi process. Author J.B. provided specific content expertise on administrative databases in the feasibility review and contributed to the interpretation of phase 2 findings. Authors B.H.R., J.H.L., G.E.C., C.A.E., and G.G.C. provided expertise regarding clinical practice and clinical importance of QIs, and contributed to the interpretation of phase 2 findings. Author K.T. drafted the initial manuscript with S.L., and G.G.C., as senior author, reviewed and edited all versions of the manuscript.

Funding Statement

This project titled “Development of Quality Indicators for Older Persons’ Transitions across Care Settings: A Systematic Review and Delphi Process” (G.G. Cummings as nominated principal investigator) was funded by the Canadian Institutes of Health Research. The first author (K.T.) was funded by the Canadian Frailty Network for Phase 1 of this study. The senior author on this paper (G.G.C.) was supported by the University of Alberta Centennial Professorship during the time of this study. C.A.E. is supported through a Tier 1 Canada Research Chair in Knowledge Translation. B.H.R. held a Tier I Canada Research Chair in Evidence-Based Emergency Medicine during the time of this study.

Appendix 1: MULTIFILE Search Strategy

  1. quality indicators, health care/ or benchmarking/

  2. (benchmark* or trigger tool*).ti,ab,kf.

  3. ((quality adj3 (indicator* or measure* or metric*)) or (quality adj3 criteri*) or performance indicator* or performance measure* or clinical indicator* or clinical measure* or outcome indicator* or ((performance or clinical or outcome) adj3 metric*)).ti,ab,kf.

  4. ((quality and (standard* or measure* or indicator* or metric*)) or (performance and (indicator* or measure* or metric*))).ti,kf.

  5. (practice guidelines as topic/ or practice guideline.pt. or ((clinical or practice) adj guideline*).ti,ab,kf.) and (((safe* or efficien* or effective* or timel* or equit* or patient cent*) adj3 (care or service*)) or quality or indicator*).ti,ab,kf.

  6. (“quality of health care”/ or “outcome assessment (health care)”/ or “Process Assessment (Health Care)”/ or quality assurance, health care/) and (((safe* or efficien* or effective* or timel* or equit* or patient cent*) adj3 (care or service*)) or indicator*).ti,ab,kf.

  7. audit.ti,ab,kf,hw. and (((safe* or efficien* or effective* or timel* or equit* or patient cent*) adj3 (care or service*)) or quality or indicator*).ti,ab,kf.

  8. or/1-7

  9. nursing homes/ or Intermediate Care Facilities/ or skilled nursing facilities/ or homes for the aged/

  10. (((extended care or long term care or intermediate or skilled or residential) adj2 (facilit* or facilities)) or residential care).ti,ab,kf.

  11. (assisted living or lodge or lodges).ti,ab,kf.

  12. emergency medical services/ or advanced trauma life support care/ or emergency medical service communication systems/ or exp emergency service, hospital/ or emergency services, psychiatric/

  13. (emergency adj2 (room* or center* or centre* or facilit* or department* or ward* or service*)).ti,ab,kf.

  14. or/9-13

  15. 8 and 14

  16. home care services/ or home health nursing/

  17. (((home or community) adj2 care) or ((home or community) and (supportive living or supportive care))).ti,ab,kf.

  18. 16 or 17

Appendix 2: Sources for included Quality Indicators

Australian Commission on Safety and Quality in Health Care and NSW Therapeutic Advisory Group Inc. (2014). National quality use of medicines indicators for Australian hospitals (ACSQHC), Sydney. Retrieved 26 January 2021 from https://www.safetyandquality.gov.au/sites/default/files/migrated/SAQ127_National_QUM_Indicators_V14-FINAL-D14-39602.pdf

Berenholtz, S. M., Dorman, T., Ngo, K., & Pronovost, P. J. (2002). Qualitative review of intensive care unit quality indicators. Journal of Critical Care, 17(1), 1–12.

Coleman, P., & Nicholl, J. (2010). Consensus methods to identify a set of potential performance indicators for systems of emergency and urgent care. Journal of Health Services Research & Policy, 15(Suppl. 2), 12–18.

College of Emergency Medicine UK. (2011). Emergency department clinical quality indicators: A CEM guide to implementation. Retrieved 26 January 2021 from http://www.dickyricky.com/Medicine/Guidelines/RCEM%20-%20Royal%20College%20of%20Emergency%20Medicine/2011_03%20CEM5832%20Quality%20Indicators.pdf

Earle, C. C., Neville, B. A., Weeks, J. C., Landrum, M. B., Souza, J. M., Ayanian, J. Z., et al. (2005). Evaluating claims-based indicators of the intensity of end-of-life cancer care. International Journal for Quality in Health Care, 17(6), 505–509. https://doi.org/10.1093/intqhc/mzi061

Earle, C. C., Park, E. R., Lai, B., Weeks, J. C., Ayanian, J. Z., & Block, S. (2003). Identifying potential indicators of the quality of end-of-life cancer care from administrative data. Journal of Clinical Oncology, 21(6), 1133–1138.

Gagnon, B., Mayo, N. E., Hanley, J., & MacDonald, N. (2004). Pattern of care at the end of life: Does age make a difference in what happens to women with breast cancer? Journal of Clinical Oncology, 22(17), 3458–3465.

Grunfeld, E., Urquhart, R., Mykhalovskiy, E., Folkes, A., Johnston, G., Burge, F. I., et al. (2008). Toward population-based indicators of quality end-of-life care: Testing stakeholder agreement. Cancer, 112(10), 2301–2308.

Health Quality Ontario. (2021). System performance: Indicator library. Retrieved 26 January 2021 from https://www.hqontario.ca/System-Performance.

Joint Commission. (2015). Specifications manual for Joint Commission National Quality Core. Retrieved 14 October 2018 from https://manual.jointcommission.org/releases/TJC2015B1/TableOfContentsTJC.html

Maritz, D., Hodkinson, P., & Wallis, L. (2010). Identification of performance indicators for emergency centres in South Africa: Results of a Delphi study. International Journal of Emergency Medicine, 3(4), 341–349.

Research ANd Development (RAND) Health Corporation. (2007). Assessing care of vulnerable elders-3 quality indicators. Journal of the American Geriatrics Society, 55(S2), 465–487.

Research Triangle Institute. (2012). Nursing home MDS 3.0 quality measures: Final analytic report. Retrieved 26 January 2021 from https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/NursingHomeQualityInits/Quality-Measures-Archive

Saliba, D., Solomon, D., Rubenstein, L., Young, R., Schnelle, J., Roth, C., et al. (2005). Feasibility of quality indicators for the management of geriatric syndromes in nursing home residents. Journal of the American Medical Directors Association, 6(3), S50–S59.

Santana, M. J., & Stelfox, H. T. (2013). Trauma quality indicator consensus: Development and evaluation of evidence-informed quality indicators for adult injury care. Annals of Surgery, 259(1), 186–192.

Schull, M. J., Guttmann, A., Leaver, C. A., Vermeulen, M., Hatcher, C. M., Rowe, B. H., et al. (2011). Prioritizing performance measurement for emergency department care: consensus on evidence-based quality of care indicators. Canadian Journal of Emergency Medicine, 13(5), 300–309.

Shrank, W. H., Polinski, J. M., & Avorn, J. (2007). Quality indicators for medication use in vulnerable elders. Journal of the American Geriatrics Society, 55(Suppl. 2), S373–S382.

Tregunno, D., Baker, R. G., Barnsley, J., & Murray, M. (2004). Competing values of emergency department performance: Balancing multiple stakeholder perspectives. Health Services Research, 39(41), 771–792. https://doi.org/10.1111/j.1475-6773.2004.00257.x

United Kingdom Department of Health. (2010). Accident and emergency clinical quality indicators: Data definitions. Retrieved 14 October 2018 from https://webarchive.nationalarchives.gov.uk/20130105030902/ http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_122868

Wakai, A., O’Sullivan, R., Staunton, P., Walsh, C., Hickey, F., & Plunkett, P. K. (2013). Development of key performance indicators for emergency departments in Ireland using an electronic modified-Delphi consensus approach. European Journal of Emergency Medicine, 20(2), 109–114. https://doi.org/10.1097/MEJ.0b013e328351e5d8

Welch, S. J., Asplin, B. R., Stone-Griffith, S., Davidson, S. J., Augustine, J., & Schuur, J. (2011). Emergency department operational metrics, measures and definitions: Results of the second performance measures and benchmarking summit. Annals of Emergency Medicine, 58(1), 33–40.

References

Ahn, S.-N., Hussein, M., Mahmood, A., & Smith, M. L. (2020). Emergency department and inpatient utilization among U.S. older adults with multiple chronic conditions: A post-reform update. BMC Health Services Research, 20(1), 110. https://doi.org/10.1186/s12913-020-4902-7.CrossRefGoogle ScholarPubMed
Allen, J., Hutchinson, A. M., Brown, R., & Livingston, P. M. (2014). Quality care outcomes following transitional care interventions for older people from hospital to home: A systematic review. BMC Health Services Research, 14, 346.CrossRefGoogle ScholarPubMed
Anderson, K., Allan, D., & Finucane, P. (2000). Complaints concerning the hospital care of elderly patients: A 12-month study of one hospital’s experience. Age & Ageing, 29, 409412.CrossRefGoogle ScholarPubMed
Arksey, H., & O’Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology, 8(1), 1932.CrossRefGoogle Scholar
Arthofer, K., & Girardi, D. (2017). Data quality- and master data management – A hospital case. Studies in Health Technology and Informatics, 236, 259266.Google ScholarPubMed
Banerjee, A. (2007). An overview of long-term care in Canada and selected provinces and territories (Women and Health Care Reform). Retrieved 8 November 2019 from http://www.femmesreformesante.ca/publications/banerjee_overviewLTC.pdf.Google Scholar
Boulkedid, R., Abdoul, H., Loustau, M., Sibony, O., & Alberti, C. (2011). Using and reporting the Delphi method for selecting healthcare quality indicators: A systematic review. PLoS ONE, 6(6), e20476. https://doi.org/10.1371/journal.pone.0020476.CrossRefGoogle ScholarPubMed
Brennan, J. J., Chan, T. C., Killeen, J. P., & Castillo, E. M. (2015). Inpatient readmissions and emergency department visits within 30 days of a hospital admission. Western Journal of Emergency Medicine: Integrating Emergency Care with Population Health, 16(7), 10251029. https://doi.org/10.5811/westjem.2015.8.26157.CrossRefGoogle ScholarPubMed
Burkett, E., Martin-Khan, M. G., & Gray, L. C. (2017). Quality indicators in the care of older persons in the emergency department: A systematic review of the literature. Australasian Journal on Ageing, 36(4), 286. https://doi.org/10.1111/ajag.12451.CrossRefGoogle ScholarPubMed
Callahan, C. M., Arling, G., Tu, W., Rosenman, M. B., Counsell, S. R., Stump, T. E., et al. (2012). Transitions in care for older adults with and without dementia. Journal of the American Geriatric Society, 60(5), 813820. https://doi.org/10.1111/j.1532-5415.2012.03905.x CrossRefGoogle ScholarPubMed
Coleman, E. A. (2003). Falling through the cracks: Challenges and opportunities for improving transitional care for persons with continuous complex care needs. Journal of the American Geriatrics Society, 51(4), 549555. https://doi.org/10.1046/j.1532-5415.2003.51185.x.CrossRefGoogle ScholarPubMed
Coleman, E. A., & Berenson, R. A. (2004). Lost in transition: Challenges and opportunities for improving the quality of transitional care. Annals of Internal Medicine, 141(7), 533536. https://doi.org/10.7326/0003-4819-141-7-200410050-00009.CrossRefGoogle ScholarPubMed
Crilly, J., Chaboyer, W., & Wallis, M. (2006). Continuity of care for acutely unwell older adults from nursing homes. Scandinavian Journal of Caring Sciences, 20(2), 122134. https://doi.org/10.1111/j.1471-6712.2006.00388.x.CrossRefGoogle ScholarPubMed
Cummings, G. G., McLane, P., Reid, R. C., Tate, K., Cooper, S. L., Rowe, B. H., et al. (2020). Fractured care: A window into emergency transitions in care for LTC residents with complex health needs. Journal of Aging and Health 34(3–4), 119133. https://doi.org/10.1177/0898264318808908 CrossRefGoogle Scholar
De Koning, J. (2007). Development and validation of a measurement instrument for appraising indicator quality: Appraisal of indicators through research and evaluation (AIRE) instrument. Düsseldorf: German Medical Science GMS Publishing House.Google Scholar
Dillman, D. A. (2014). Reducing people’s reluctance to respond to surveys. In Christian, L. M. & Smyth, J. D. (Eds.), Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed., pp. 1955). Indianapolis: Wiley.Google Scholar
Doupe, M. B., Palatnick, W., Day, S., Chateau, D., Soodeen, R.-A., Burchill, C., et al. (2012). Frequent users of emergency departments: Developing standard definitions and defining prominent risk factors. Annals of Emergency Medicine, 60(1), 24–32. https://doi.org/10.1016/j.annemergmed.2011.11.036.
Dwyer, R., Stoelwinder, J., Gabbe, B., & Lowthian, J. (2015). Unplanned transfer to emergency departments for frail elderly residents of aged care facilities: A review of patient and organizational factors. Journal of the American Medical Directors Association, 16(7), 551–562. https://doi.org/10.1016/j.jamda.2015.03.007.
Estabrooks, C. A., Knopp-Sihota, J., & Norton, P. G. (2013). Practice sensitive quality indicators in MDS-RAI 2.0 nursing home data. BMC Research Notes, 6, 16.
Griffiths, D., Morphet, J., Innes, K., Crawford, K., & Williams, A. (2014). Communication between residential aged care facilities and the emergency department: A review of the literature. International Journal of Nursing Studies, 51(11), 1517–1523. https://doi.org/10.1016/j.ijnurstu.2014.06.002.
Harris, P. A., Taylor, R., Thielke, R., Payne, J., Gonzalez, N., & Conde, J. G. (2009). Research Electronic Data Capture (REDCap) — A metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42(2), 377–381. https://doi.org/10.1016/j.jbi.2008.08.010.
Hibbard, J. H., Stockard, J., & Tusler, M. (2005). Hospital performance reports: Impact on quality, market share, and reputation. Health Affairs, 24(4), 1150–1160. https://doi.org/10.1377/hlthaff.24.4.1150.
Hustey, F. M., & Palmer, R. M. (2010). An internet-based communication network for information transfer during patient transitions from skilled nursing facility to the emergency department. Journal of the American Geriatrics Society, 58(6), 1148–1152. https://doi.org/10.1111/j.1532-5415.2010.02864.x.
Hutchinson, A. M., Milke, D. L., Maisey, S., Johnson, C., Squires, J. E., Teare, G., et al. (2010). The resident assessment instrument-minimum data set 2.0 quality indicators: A systematic review. BMC Health Services Research, 10, 166. https://doi.org/10.1186/1472-6963-10-166.
Institute of Medicine (U.S.), Committee on Quality of Health Care in America. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press.
Johnstone, M., & Kanitsaki, O. (2008). Cultural racism, language prejudice and discrimination in hospital contexts: An Australian study. Diversity in Health and Social Care, 5, 12.
Joling, K. J., van Eenoo, L., Vetrano, D. L., Smaardijk, V. R., Declercq, A., Onder, G., et al. (2018). Quality indicators for community care for older people: A systematic review. PLoS ONE, 13(1), e0190298. https://doi.org/10.1371/journal.pone.0190298.
Kötter, T., Blozik, E., & Scherer, M. (2012). Methods for the guideline-based development of quality indicators – A systematic review. Implementation Science, 7(1), 21. https://doi.org/10.1186/1748-5908-7-21.
Kraska, R. A., Krummenauer, F., & Geraedts, M. (2016). Impact of public reporting on the quality of hospital care in Germany: A controlled before-after analysis based on secondary data. Health Policy (Amsterdam, Netherlands), 120(7), 770–779. https://doi.org/10.1016/j.healthpol.2016.04.020.
Laugaland, K., Aase, K., & Barach, P. (2011). Addressing risk factors for transitional care of the elderly – Literature review. Retrieved 10 November 2019 from https://www.researchgate.net/publication/267834203_Addressing_Risk_Factors_for_Transitional_Care_of_the_Elderly_-_Literature_review/link/549a20d00cf2b803713590fb/download.
Lloyd, R. (2017). Quality health care. Burlington, VT: Jones & Bartlett Learning.
Mansoor, E., & Al-Kindi, S. G. (2017). The premise and promise of big data for tracking population health: Big deal or big disappointment? Digestive Diseases and Sciences, 62(3), 562–563. https://doi.org/10.1007/s10620-017-4458-5.
McCloskey, R. (2011). A qualitative study on the transfer of residents between a nursing home and an emergency department. Journal of the American Geriatrics Society, 59(4), 717–724. https://doi.org/10.1111/j.1532-5415.2011.03337.x.
McLane, P., Tate, K., Reid, R. C., Rowe, B. H., Estabrooks, C. A., & Cummings, G. G. (2022). Addressing communication breakdowns during aged care transitions: Evaluation of a quality improvement project. Canadian Journal on Aging, 41(1).
Moher, D., Liberati, A., Tetzlaff, J., Altman, D., & PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(7), e1000097. https://doi.org/10.1371/journal.pmed.1000097.
Office of the National Coordinator for Health Information Technology. (2015). What is meaningful use? Retrieved 26 January 2021 from https://www.healthit.gov/faq/what-meaningful-use.
Reid, R., Cummings, G., Cooper, S., Abel, L., Bissell, L., Estabrooks, C., et al. (2013). The Older Persons’ Transitions in Care (OPTIC) study: Pilot testing of the transition tracking tool. BMC Health Services Research, 13, 515.
Riaz, M., & Brown, J. D. (2019). Association of adverse drug events with hospitalization outcomes and costs in older adults in the USA using the Nationwide Readmissions Database. Pharmaceutical Medicine, 33(4), 321–329. https://doi.org/10.1007/s40290-019-00286-z.
Saver, B. G., Martin, S. A., Adler, R. N., Candib, L. M., Deligiannidis, K. E., Golding, J., et al. (2015). Care that matters: Quality measurement and health care. PLoS Medicine, 12(11), e1001902. https://doi.org/10.1371/journal.pmed.1001902.
Schall, M. C., Cullen, L., Matthews, G., Pennathur, P., Chen, H., & Burrell, K. (2017). Usability evaluation and implementation of a health information technology dashboard of evidence-based quality indicators. CIN: Computers, Informatics, Nursing, 35(6), 281–288.
Schull, M. J., Guttmann, A., Leaver, C. A., Vermeulen, M., Hatcher, C. M., Rowe, B. H., et al. (2011). Prioritizing performance measurement for emergency department care: Consensus on evidence-based quality of care indicators. Canadian Journal of Emergency Medicine, 13(5), E28–E43.
Schull, M. J., Hatcher, C. M., Guttmann, A., Leaver, C. A., Vermeulen, M., Rowe, B. H., et al. (2010). Development of a consensus on evidence-based quality of care indicators for Canadian emergency departments. Toronto: Institute for Clinical Evaluative Sciences. Retrieved 9 November 2019 from https://caep.ca/wp-content/uploads/2016/03/national_ed_quality_indicators-10mar2010.pdf.
Scott, I. A. (2010). Preventing the rebound: Improving care transition in hospital discharge processes. Australian Health Review, 34(4), 445–451. https://doi.org/10.1071/AH09777.
Spath, P. (2013). Introduction to healthcare quality management (2nd ed.). Chicago: Health Administration Press.
Tashobya, C. K., Dubourg, D., Ssengooba, F., Speybroeck, N., Macq, J., & Criel, B. (2016). A comparison of hierarchical cluster analysis and league table rankings as methods for analysis and presentation of district health system performance data in Uganda. Health Policy and Planning, 31(2), 217–228. https://doi.org/10.1093/heapol/czv045.
Tate, K., Hewko, S., McLane, P., Baxter, P., Perry, K., Armijo-Olivo, S., et al. (2019). Learning to lead: A review and synthesis of literature examining health care managers’ use of knowledge. Journal of Health Services Research & Policy, 24(1), 57–70. https://doi.org/10.1177/1355819618786764.
Terrell, K. M., Brizendine, E. J., Bean, W. F., Giles, B. K., Davidson, J. R., Evers, S., et al. (2005). An extended care facility-to-emergency department transfer form improves communication. Academic Emergency Medicine, 12(2), 114–118. https://doi.org/10.1197/j.aem.2004.10.013.
Terrell, K. M., Hustey, F. M., Hwang, U., Gerson, L. W., Wenger, N. S., Miller, D. K., & the Society for Academic Emergency Medicine Geriatric Task Force. (2009). Quality indicators for geriatric emergency care. Academic Emergency Medicine, 16(5), 441–449. https://doi.org/10.1111/j.1553-2712.2009.00382.x.
Tisminetzky, M., Gurwitz, J. H., Miozzo, R., Gore, J. M., Lessard, D., Yarzebski, J., et al. (2019). Characteristics, management, and short-term outcomes of adults ≥ 65 years hospitalized with acute myocardial infarction with prior anemia and heart failure. The American Journal of Cardiology, 124(9), 1327–1332. https://doi.org/10.1016/j.amjcard.2019.07.057.
Trahan, L. M., Spiers, J. A., & Cummings, G. G. (2016). Decisions to transfer nursing home residents to emergency departments: A scoping review of contributing factors and staff perspectives. Journal of the American Medical Directors Association, 17(11), 994–1005. https://doi.org/10.1016/j.jamda.2016.05.012.
van Teijlingen, E., & Hundley, V. (2002). The importance of pilot studies. Nursing Standard, 16(40), 4.
Vuokko, R., Mäkelä-Bengs, P., Hyppönen, H., Lindqvist, M., & Doupi, P. (2017). Impacts of structuring the electronic health record: Results of a systematic literature review from the perspective of secondary use of patient data. International Journal of Medical Informatics, 97, 293–303. https://doi.org/10.1016/j.ijmedinf.2016.10.004.
Wakai, A., O’Sullivan, R., Staunton, P., Walsh, C., Hickey, F., & Plunkett, P. K. (2013). Development of key performance indicators for emergency departments in Ireland using an electronic modified-Delphi consensus approach. European Journal of Emergency Medicine, 20(2), 109–114. https://doi.org/10.1097/MEJ.0b013e328351e5d8.
Wenger, N. S., Roth, C., Shekelle, P. G., Amin, A., Besdine, R. K., Blazer, D. G., et al. (2007). Introduction to the Assessing Care of Vulnerable Elders-3 quality indicators measurement set. Journal of the American Geriatrics Society, 55(S2), S247–S252.
Williams, D. R., & Mohammed, S. A. (2009). Discrimination and racial disparities in health: Evidence and needed research. Journal of Behavioral Medicine, 32(1), 20. https://doi.org/10.1007/s10865-008-9185-0.
Zafirau, W. J., Snyder, S. S., Hazelett, S. E., Bansal, A., & McMahon, S. D. (2012). Improving transitions: Efficacy of a transfer form to communicate patients’ wishes. American Journal of Medical Quality, 27(4), 291–296.
Figure 1. Care transition process
Figure 2. PRISMA diagram
Figure 3. Indicators extracted from systematic review
Figure 4. Indicators retained after Delphi process
Figure 5. Indicators retained after feasibility review
Table 1. Final set of retained indicators