Brief intervention services provide rapid, mobile and flexible short-term delivery of interventions to resolve mental health crises. These interventions may provide an alternative pathway to the emergency department or in-patient psychiatric services for children and young people (CYP) presenting with an acute mental health condition.
Aims
To synthesise evidence on the effectiveness of brief interventions in improving mental health outcomes for CYP (0–17 years) presenting with an acute mental health condition.
Method
A systematic literature search was conducted, and the studies’ methodological quality was assessed. Five databases were searched for peer-reviewed articles between January 2000 and September 2022.
Results
We synthesised 30 articles on the effectiveness of brief interventions in the form of (a) crisis intervention, (b) integrated services, (c) group therapies, (d) individualised therapy, (e) parent–child dyadic therapy, (f) general services, (g) pharmacotherapy, (h) assessment services, (i) safety and risk planning and (j) in-hospital treatment, to improve outcomes for CYP with an acute mental health condition. Among the included studies, one was rated as providing a high level of evidence based on the National Health and Medical Research Council levels of evidence hierarchy; this was a crisis intervention showing a reduction in length of stay and in return emergency department visits. Other studies, of moderate-quality evidence, described multimodal brief interventions with suggested beneficial effects.
Conclusions
This review provides evidence to substantiate the benefits of brief interventions, in different settings, in reducing the burden on in-patient hospital services and readmission rates to the emergency department.
As the proportion of the elderly population in the USA expands, so will the demand for rehabilitation and social care, which play an important role in maintaining function and mediating motor and cognitive decline in older adults. Social robotics and telemedicine are each potential solutions, but each has limitations. To address the challenges of classical telemedicine for rehabilitation, we propose a social robot-augmented telepresence (SRAT) system, Flo, which was deployed for long-term use in a community-based rehabilitation facility catering to older adults. Our goals were to explore how clinicians and patients would use and respond to the robot during rehab interactions. In this pilot study, three clinicians were recruited and asked to rate usability after receiving training in operating the robot, and two of them conducted multiple rehab interactions with their patients using the robot (11 patients with cognitive and/or motor impairment and 23 rehab sessions delivered via SRAT in total). We report on the experience of both therapists and patients after the interactions.
In this squib, I provide arguments in favour of the view that Danish rundt is a postposition. The functional, semantic, and syntactic properties of adpositions are discussed, and I show that competing analyses of rundt are falsified, while the postposition analysis itself is not.
Patients and their families often ask clinicians to estimate when full-time care (FTC) will be needed after Alzheimer's Disease (AD) is diagnosed. Although a few predictive algorithms for duration to FTC have been created, these have not been widely adopted for clinical use due to questions regarding precision from limited sample sizes and the lack of an easy, user-friendly prediction model. Our objective was to develop a clinically relevant, data-driven predictive model using machine learning to estimate time to FTC in AD based on information gathered from a) clinical interview alone, and b) clinical interview plus neuropsychological data.
Participants and Methods:
The National Alzheimer's Coordinating Center dataset was used to examine 3,809 participants (M age at AD diagnosis = 76.05, SD = 9.76; 47.10% male; 87.20% Caucasian) with AD dementia who were aged >50 years, had no history of stroke, and were not dependent on others for basic activities of daily living at the time of diagnosis, based on qualitative self- or informant report. To develop a predictive model for time until FTC, supervised machine learning algorithms (e.g., gradient descent, gradient boosting) were implemented. In Model 1, 29 variables captured at the time of AD diagnosis and often gathered in a clinical interview, including sociodemographic factors, psychiatric conditions, medical history, and MMSE, were included. In Model 2, additional neuropsychological variables assessing episodic memory, language, attention, executive function, and processing speed were added. To train and test the algorithms, data were split into a 70:30 ratio. Prediction optimization was examined via cross-validation using 1,000 bootstrapped samples. Model evaluation included assessment of confusion matrices and calculation of accuracy and precision.
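The evaluation pipeline described above (a 70:30 train/test split and confusion-matrix-based accuracy and precision) can be sketched in plain Python. The function names, seed, and toy labels below are illustrative, not the study's actual code; the models themselves (e.g. gradient boosting) would come from a machine-learning library.

```python
import random

def train_test_split(rows, test_ratio=0.30, seed=42):
    """Shuffle and split records, e.g. into a 70:30 train/test ratio."""
    rows = rows[:]  # copy so the caller's list is untouched
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_ratio))
    return rows[:cut], rows[cut:]

def accuracy_and_precision(y_true, y_pred):
    """Derive accuracy and precision from confusion-matrix counts."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return accuracy, precision

# Splitting the 3,809 participants 70:30 gives 2,666 train / 1,143 test rows.
train, test = train_test_split(list(range(3809)))
# Toy labels only, to show the metric calculation.
acc, prec = accuracy_and_precision([1, 1, 0, 0], [1, 0, 0, 1])
```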
Results:
The average time to requiring FTC after AD diagnosis was 3.32 years (range = 0.53–14.57 years). For the clinical interview-only model (Model 1), younger age of onset, use of cholinesterase inhibitor medication, incontinence, and apathy were among the clinical variables that significantly predicted duration to FTC, with the largest effects shown for living alone, a positive family history of dementia, and lower MMSE score. In Model 2, the clinical predictors remained significant, and lower Boston Naming Test and Digit-Symbol Coding scores showed the largest effects in predicting duration to FTC among the neuropsychological measures. The final prediction models were further tested using five randomly selected cases. The estimated time to FTC was within an average of 5.2 months of the recorded event for the clinical interview model and within an average of 5.8 months for the model with neuropsychological data.
Conclusions:
Predicting when individuals diagnosed with AD will need FTC is important, as the transition often carries significant financial costs related to caregiving. Duration to FTC was predicted by clinical and neuropsychological variables that are easily obtained during standard dementia evaluations. Implementation of the model for prediction of FTC in test cases showed encouraging prognostic accuracy. The two models show promise as a first step towards creation of a user-friendly prediction calculator that could help clinicians better counsel patients on when FTC after AD diagnosis may occur, though the development of separate models for use in more diverse populations will be essential.
Episodic memory functioning is distributed across two brain circuits, one of which courses through the dorsal anterior cingulate cortex (dACC). Thus, delivering non-invasive neuromodulation technology to the dACC may improve episodic memory functioning in patients with memory problems such as in amnestic mild cognitive impairment (aMCI). This preliminary study is a randomized, double-blinded, sham-controlled clinical trial to examine if high definition transcranial direct current stimulation (HD-tDCS) can be a viable treatment in aMCI.
Participants and Methods:
Eleven aMCI participants, of whom 9 had multidomain deficits, were randomized to receive 1 mA HD-tDCS (N = 7) or sham (N = 4) stimulation. HD-tDCS was applied over ten 20-minute sessions targeting the dACC. Neuropsychological measures of episodic memory, verbal fluency, and executive function were completed at baseline and after the last HD-tDCS session. Changes in composite scores for memory and language/executive function tests were compared between groups (one-tailed t-tests with α = 0.10 for significance). Clinically significant change, defined as > 1 SD improvement on at least one test in the memory and non-memory domains, was compared between active and sham stimulation based on the frequency of participants showing such change in each group.
Results:
No statistically or clinically significant change (N−1 χ2; p = 0.62) was seen in episodic memory for the active HD-tDCS (MDiff = 4.4; SD = 17.1) or sham groups (MDiff = −0.5; SD = 9.7). However, the language and executive function composite showed statistically significant improvement (p = 0.04; MDiff = −15.3; SD = 18.4) for the active HD-tDCS group only (sham MDiff = −5.8; SD = 10.7). Multiple participants (N = 4) in the active group had clinically significant enhancement on language and executive functioning tests, while nobody in the sham group did (p = 0.04).
Conclusions:
HD-tDCS targeting the dACC had no direct benefit for episodic memory deficits in aMCI based on preliminary findings for this ongoing clinical trial. However, significant improvement in language and executive function skills occurred in response to HD-tDCS, suggesting HD-tDCS in this configuration has promising potential as an intervention for language and executive function deficits in MCI.
This article describes a robot walker based on a new single degree-of-freedom six-bar leg mechanism that provides rectilinear, non-rotating movement of the foot. The walker is statically stable and requires only two actuators, one for each side, to provide effective walking movement on a flat surface. We use curvature theory to design a four-bar linkage with a flat-sided coupler curve and then add a translating link so that the walker's foot follows this coupler curve in rectilinear movement. A prototype walker was constructed that weighs 1.6 kg, is 180 mm tall, and travels at 162 mm/s. This is an innovative legged robot with a simple, reliable design.
To identify central-line (CL)–associated bloodstream infection (CLABSI) incidence and risk factors in low- and middle-income countries (LMICs).
Design:
From July 1, 1998, to February 12, 2022, we conducted a multinational, multicenter prospective cohort study using an online standardized surveillance system and unified forms.
Setting:
The study included 728 ICUs of 286 hospitals in 147 cities in 41 African, Asian, Eastern European, Latin American, and Middle Eastern countries.
Patients:
In total, 278,241 patients followed during 1,815,043 patient days acquired 3,537 CLABSIs.
Methods:
For the CLABSI rate, we used CL days as the denominator and the number of CLABSIs as the numerator. Using multiple logistic regression, outcomes are shown as adjusted odds ratios (aORs).
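The rate definition above (CLABSIs as numerator, CL days as denominator, conventionally expressed per 1,000 CL days) reduces to a one-line calculation; the counts in the usage example are hypothetical, not the study's data.

```python
def clabsi_rate_per_1000(n_clabsi, cl_days):
    """CLABSI rate: number of CLABSIs per 1,000 central-line days."""
    return 1000 * n_clabsi / cl_days

# Hypothetical ICU: 12 CLABSIs over 2,500 CL days -> 4.8 per 1,000 CL days
rate = clabsi_rate_per_1000(12, 2500)
```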
Results:
The pooled CLABSI rate was 4.82 CLABSIs per 1,000 CL days, which is significantly higher than the rate reported by the Centers for Disease Control and Prevention National Healthcare Safety Network (CDC NHSN). We analyzed 11 variables, and the following were independently and significantly associated with CLABSI: length of stay (LOS), with risk increasing 3% daily (aOR, 1.03; 95% CI, 1.03–1.04; P < .0001); number of CL days, with risk increasing 4% per CL day (aOR, 1.04; 95% CI, 1.03–1.04; P < .0001); surgical hospitalization (aOR, 1.12; 95% CI, 1.03–1.21; P < .0001); tracheostomy use (aOR, 1.52; 95% CI, 1.23–1.88; P < .0001); hospitalization at a publicly owned facility (aOR, 3.04; 95% CI, 2.31–4.01; P < .0001) or at a teaching hospital (aOR, 2.91; 95% CI, 2.22–3.83; P < .0001); and hospitalization in a middle-income country (aOR, 2.41; 95% CI, 2.09–2.77; P < .0001). The ICU type with the highest risk was adult oncology (aOR, 4.35; 95% CI, 3.11–6.09; P < .0001), followed by pediatric oncology (aOR, 2.51; 95% CI, 1.57–3.99; P < .0001) and pediatric (aOR, 2.34; 95% CI, 1.81–3.01; P < .0001). The CL type with the highest risk was internal jugular (aOR, 3.01; 95% CI, 2.71–3.33; P < .0001), followed by femoral (aOR, 2.29; 95% CI, 1.96–2.68; P < .0001). The peripherally inserted central catheter (PICC) was the CL type with the lowest CLABSI risk (aOR, 1.48; 95% CI, 1.02–2.18; P = .04).
Conclusions:
The following CLABSI risk factors are unlikely to change: country income level, facility ownership, hospitalization type, and ICU type. These findings suggest focusing on reducing LOS, CL days, and tracheostomy use; using PICCs instead of internal-jugular or femoral CLs; and implementing evidence-based CLABSI prevention recommendations.
Rates of ventilator-associated pneumonia (VAP) in low- and middle-income countries (LMIC) are several times above those of high-income countries. The objective of this study was to identify risk factors (RFs) for VAP cases in ICUs of LMICs.
Design:
Prospective cohort study.
Setting:
This study was conducted across 743 ICUs of 282 hospitals in 144 cities in 42 Asian, African, European, Latin American, and Middle Eastern countries.
Participants:
The study included patients admitted to ICUs across 24 years.
Results:
In total, 289,643 patients were followed during 1,951,405 patient days and acquired 8,236 VAPs. We analyzed 10 independent variables. Multiple logistic regression identified the following independent VAP RFs: male sex (adjusted odds ratio [aOR], 1.22; 95% confidence interval [CI], 1.16–1.28; P < .0001); longer length of stay (LOS), which increased the risk 7% per day (aOR, 1.07; 95% CI, 1.07–1.08; P < .0001); mechanical ventilation (MV) utilization ratio (aOR, 1.27; 95% CI, 1.23–1.31; P < .0001); continuous positive airway pressure (CPAP), which carried the highest risk among ventilation modes (aOR, 13.38; 95% CI, 11.57–15.48; P < .0001); tracheostomy connected to a MV, which carried the next-highest risk (aOR, 8.31; 95% CI, 7.21–9.58; P < .0001); endotracheal tube connected to a MV (aOR, 6.76; 95% CI, 6.34–7.21; P < .0001); surgical hospitalization (aOR, 1.23; 95% CI, 1.17–1.29; P < .0001); admission to a public hospital (aOR, 1.59; 95% CI, 1.35–1.86; P < .0001); hospitalization in a middle-income country (aOR, 1.22; 95% CI, 1.15–1.29; P < .0001); admission to an adult-oncology ICU, which carried the highest risk among ICU types (aOR, 4.05; 95% CI, 3.22–5.09; P < .0001); admission to a neurologic ICU, which carried the next-highest risk (aOR, 2.48; 95% CI, 1.78–3.45; P < .0001); and admission to a respiratory ICU (aOR, 2.35; 95% CI, 1.79–3.07; P < .0001). Admission to a coronary ICU carried the lowest risk (aOR, 0.63; 95% CI, 0.51–0.77; P < .0001).
Conclusions:
Some identified VAP RFs are unlikely to change: sex, hospitalization type, ICU type, facility ownership, and country income level. Based on our results, we recommend focusing on strategies to reduce LOS, reduce the MV utilization ratio, limit CPAP use, and implement a set of evidence-based VAP prevention recommendations.
Adolescence is a period of life when dietary patterns and nutrient intakes may greatly influence adult fatness. This study assesses the tracking of energy and nutrient intakes of Ho Chi Minh City adolescents over 5 years. It explores the possible relationships between energy and the percentage of energy from macronutrients with BMI.
Methods:
Height, weight, time spent on physical activity, screen time and dietary intakes were collected annually between 2004 and 2009 among 752 junior high school students with a mean age of 11·87 years at baseline. Tracking was investigated using correlation coefficients and weighted kappa statistics (κ) for repeated measurements. Mixed-effect models were used to investigate the association of energy intakes and the percentage of energy from macronutrients with BMI.
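Tracking by correlation, as described above, compares each participant's baseline intake with their follow-up intake; a minimal sketch of the Pearson coefficient, using hypothetical intake values rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between baseline and follow-up measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical energy intakes (kJ/day) at baseline and 5-year follow-up
baseline  = [6500, 7200, 8100, 6900, 7600]
follow_up = [7000, 7900, 8000, 7400, 8300]
r = pearson_r(baseline, follow_up)
```

Under the convention used in this abstract, a coefficient between 0·2 and 0·4 would be read as moderate tracking.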
Results:
Mean BMI increased annually, with greater increases in boys than in girls. Correlation coefficients (0·2 < r < 0·4) between participants' intakes at baseline and 5-year follow-up suggest moderate tracking. Weighted kappa values were lowest for energy from carbohydrate (CHO) in both girls and boys (κ = 0·18 and 0·24, respectively), and highest for protein in girls (κ = 0·47) and fat in boys (κ = 0·48). The multilevel models showed that the following variables were significantly correlated with BMI: CHO, fat, percentage of energy from CHO and fat, time spent on moderate-to-vigorous physical activity, screen time, age and sex.
Conclusions:
The poor to fair tracking observed in this cohort suggests that individual dietary patterns exhibited in the first year are unlikely to predict energy and nutrient intakes in the fifth year.
This study aims to examine trends in our head and neck cancer patient population over the past 5 years, with an emphasis on the past 2 years, to evaluate how the coronavirus disease 2019 (COVID-19) pandemic has impacted disparities and the availability of care for patients, especially those living in rural areas. An additional aim is to identify existing disparities in the treatment of head and neck patients at our institution and determine solutions to improve patient care.
Materials and Methods:
A retrospective chart review was performed to identify patients who were consulted and subsequently treated with at least one fraction of radiation therapy at our institution with palliative or curative intent. Patient demographic information was collected including hometown, distance from the cancer centre based on zip-codes and insurance information and type of appointment (in-person or telehealth). Rural–urban continuum codes were used to determine rurality.
Results:
A total of 490 head and neck cancer patients (n = 490) were treated from 2017 to 2021. When broken down by year, there were no significant trends in the patient population regarding travel distance or rurality. Roughly 20–30% of our patients live in rural areas, and about 30% have a commute > 50 miles for radiation treatment. A majority of our patients rely on public insurance (68%), with a small percentage uninsured (4%). Telehealth visits were rare prior to 2019, numbering 5 in 2020 and 2 in 2021.
Conclusions:
Head and neck cancer patients, regardless of rurality or distance from a cancer centre, presented with symptoms alarming enough to prompt them to seek medical attention despite the limitations and difficulties of doing so during the COVID-19 pandemic in 2020. Nevertheless, providers must be aware of the potential disparities that exist in the rural population and seek to address them.
To examine the association between adherence to plant-based diets and mortality.
Design:
Prospective study. We calculated a plant-based diet index (PDI) by assigning positive scores to plant foods and reverse scores to animal foods. We also created a healthful PDI (hPDI) and an unhealthful PDI (uPDI) by further separating the healthy plant foods from less-healthy plant foods.
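A minimal sketch of the index construction described above. The quintile-based scoring (plant foods scored by intake quintile, animal foods reverse-scored) follows the usual PDI convention, and the food groups listed here are illustrative rather than the study's exact list.

```python
# Illustrative food groups only; the actual PDI uses a longer list.
PLANT_GROUPS  = {"whole_grains", "fruits", "vegetables", "legumes"}
ANIMAL_GROUPS = {"meat", "dairy", "fish", "eggs"}

def pdi_score(quintiles):
    """Overall PDI: plant foods scored positively (quintile 1 -> 1 ... 5 -> 5),
    animal foods reverse-scored (quintile 1 -> 5 ... 5 -> 1)."""
    score = 0
    for group, q in quintiles.items():
        if group in PLANT_GROUPS:
            score += q
        elif group in ANIMAL_GROUPS:
            score += 6 - q
    return score

# Hypothetical participant: high plant intake, low animal intake -> high PDI
intake = {"whole_grains": 5, "fruits": 4, "vegetables": 5, "legumes": 3,
          "meat": 1, "dairy": 2, "fish": 2, "eggs": 1}
total = pdi_score(intake)
```

An hPDI or uPDI variant would simply move the less-healthy plant groups into the reverse-scored set.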
Setting:
The VA Million Veteran Program.
Participants:
315 919 men and women aged 19–104 years who completed an FFQ at baseline.
Results:
We documented 31 136 deaths during the follow-up. A higher PDI was significantly associated with lower total mortality (hazard ratio (HR) comparing extreme deciles = 0·75, 95 % CI: 0·71, 0·79, Ptrend < 0·001). We observed an inverse association between hPDI and total mortality (HR comparing extreme deciles = 0·64, 95 % CI: 0·61, 0·68, Ptrend < 0·001), whereas uPDI was positively associated with total mortality (HR comparing extreme deciles = 1·41, 95 % CI: 1·33, 1·49, Ptrend < 0·001). Similar significant associations of PDI, hPDI and uPDI were also observed for CVD and cancer mortality. The associations of PDI with total mortality were consistent among African American and European American participants, among participants free from CVD and cancer, and among those diagnosed with a major chronic disease at baseline.
Conclusions:
A greater adherence to a plant-based diet was associated with substantially lower total mortality in this large population of veterans. These findings support recommending plant-rich dietary patterns for the prevention of major chronic diseases.
The exercise of administrative discretion by street-level workers plays a key role in shaping citizens’ access to welfare and employment services. Governance reforms of social services delivery, such as performance-based contracting, have often been driven by attempts to discipline this discretion. In several countries, these forms of market governance are now being eclipsed by new modes of digital governance that seek to reshape the delivery of services using algorithms and machine learning. Australia, a pioneer of marketisation, is one example, proposing to deploy digitalisation to fully automate most of its employment services rather than as a supplement to face-to-face case management. We examine the potential and limits of this project to replace human-to-human services with ‘machine bureaucracies’. To what extent are welfare and employment services amenable to digitalisation? What trade-offs are involved? In addressing these questions, we consider the purported benefits of machine bureaucracies in achieving higher levels of efficiency, accountability, and consistency in policy delivery. While recognising the potential benefits of machine bureaucracies for both governments and jobseekers, we argue that trade-offs will be faced between enhancing the efficiency and consistency of services and ensuring that services remain accessible and responsive to highly personalised circumstances.
To describe the use of artificial intelligence (AI)-enabled dark nudges by leading global food and beverage companies to influence consumer behaviour.
Design:
The five most recent annual reports (ranging from 2014 to 2018 or 2015 to 2019, depending on the company) and websites from twelve of the leading companies in the global food and beverage industry were reviewed to identify uses of AI and emerging technologies to influence consumer behaviour. Uses of AI and emerging technologies were categorised according to the Typology of Interventions in Proximal Physical Micro-Environments (TIPPME) framework, a tool for categorising and describing nudge-type behaviour change interventions (which has also previously been used to describe dark nudge-type approaches used by the alcohol industry).
Setting:
Not applicable.
Participants:
Twelve leading companies in the global food and beverage industry.
Results:
Text was extracted from fifty-seven documents from eleven companies. AI-enabled dark nudges used by food and beverage companies included those that altered products and objects’ availability (e.g. social listening to inform product development), position (e.g. decision technology and facial recognition to manipulate the position of products on menu boards), functionality (e.g. decision technology to prompt further purchases based on current selections) and presentation (e.g. augmented or virtual reality to deliver engaging and immersive marketing).
Conclusions:
Public health practitioners and policymakers must understand and engage with these technologies and tactics if they are to counter industry promotion of products harmful to health, particularly as investment by the industry in AI and other emerging technologies suggests their use will continue to grow.
Reducing vulnerability to environmental change must be a key component of any strategy for sustainable development. We consider the situation of the nations of the Lower Mekong, namely Cambodia, Lao PDR, and Vietnam, focusing on the threat of climate change. We distinguish between physical vulnerability, characterized in terms of spatial exposure to hazardous events, and social vulnerability, which is a function of the social conditions and historical circumstances that put people at risk. As vulnerability is a dynamic condition, we frame the assessment in terms of the processes and trends that are shaping current patterns of vulnerability and resilience. The nations of the Lower Mekong face a range of potential trends in climate, with changes in the incidence of flooding, variability in water availability, the occurrence of drought and heat stress, the frequency and/or intensity of tropical cyclones, and, in coastal areas, sea-level rise posing the major risks. A baseline assessment of the social, economic, and political trends that are influencing present-day levels of social vulnerability highlights the fact that poverty is the largest barrier to developing the capacity to cope with and adapt effectively to change. The situation of the poorest members of society is being adversely affected by trends in inequality; disparities in property rights; the dismantling of agricultural cooperatives, unions, and various forms of financial support; and changes in social structure and institutions. We identify an important tension that can exist between efforts aimed at improving the general economic situation and what is needed to improve resilience to climate stress, particularly among the rural poor. As far as adaptation is concerned, there are lessons for other regions in the traditional approaches developed within the Lower Mekong, as these nations have a rich history of managing their dynamic natural environment.
To achieve the elimination of the hepatitis C virus (HCV), sustained and sufficient levels of HCV testing are critical. The purpose of this study was to assess trends in testing and evaluate the effectiveness of strategies to diagnose people living with HCV. Data were from 12 primary care clinics in Victoria, Australia, that provide targeted services to people who inject drugs (PWID), alongside general health care. This ecological study spanned 2009–2019 and included analyses of trends in annual numbers of HCV antibody tests among individuals with no previous positive HCV antibody test recorded and in annual test yield (positive HCV antibody tests/all HCV antibody tests). Generalised linear models estimated the association between count outcomes (HCV antibody tests and positive HCV antibody tests) and time, and a χ2 test assessed the trend in test yield. A total of 44 889 HCV antibody tests were conducted in 2009–2019; test numbers increased 6% annually on average [95% confidence interval (CI) 4–9]. Test yield declined from 2009 (21%) to 2019 (9%) (χ2 P < 0.01). In more recent years (2013–2019), annual test yield remained relatively stable. Modest increases in HCV antibody testing and a stable but high test yield within clinics delivering services to PWID highlight that current testing strategies are resulting in people being diagnosed; however, further increases in the testing of people at risk of HCV or living with HCV may be needed to reach Australia's HCV elimination goals.
Many mental disorders, including depression, bipolar disorder and schizophrenia, are associated with poor dietary quality and nutrient intake. There is, however, a deficit of research looking at the relationship between obsessive–compulsive disorder (OCD) severity, nutrient intake and dietary quality.
Aims
This study aims to explore the relationship between OCD severity, nutrient intake and dietary quality.
Method
A post hoc regression analysis was conducted with data combined from two separate clinical trials, which together included 85 adults with OCD diagnosed using the Structured Clinical Interview for DSM-5. Nutrient intakes were calculated from the Dietary Questionnaire for Epidemiological Studies version 3.2, and dietary quality was scored with the Healthy Eating Index for Australian Adults – 2013.
Results
Nutrient intake in the sample largely aligned with Australian dietary guidelines. Linear regression models adjusted for gender, age and total energy intake showed no significant associations between OCD severity, nutrient intake and dietary quality (all P > 0.05). However, OCD severity was inversely associated with caffeine (β = −15.50, 95% CI −28.88 to −2.11, P = 0.024) and magnesium (β = −6.63, 95% CI −12.72 to −0.53, P = 0.034) intake after adjusting for OCD treatment resistance.
Conclusions
This study showed OCD severity had little effect on nutrient intake and dietary quality. Dietary quality scores were higher than prior studies with healthy samples, but limitations must be noted regarding comparability. Future studies employing larger sample sizes, control groups and more accurate dietary intake measures will further elucidate the relationship between nutrient intake and dietary quality in patients with OCD.
We conducted a retrospective review of a hybrid antimicrobial restriction process demonstrating adherence to appropriate use criteria in 72% of provisional-only orders, in 100% of provisional orders followed by ID orders, and in 97% of ID-initiated orders. Therapy interruptions occurred in 24% of provisional orders followed by ID orders.
We examined the return on investment (ROI) from the Endovascular Reperfusion Alberta (ERA) project, a provincially funded population-wide strategy to improve access to endovascular therapy (EVT), to inform policy regarding sustainability.
Methods:
We calculated net benefit (NB) as benefit minus cost and ROI as benefit divided by cost. Patients treated with EVT and their controls were identified from the ESCAPE trial. Using the provincial administrative databases, their health services utilization (HSU), including inpatient, outpatient, physician, and long-term care services and prescription drugs, was compared. This benefit was then extrapolated to the additional number of patients receiving EVT in 2018 and 2019 as a result of the ERA implementation. We used three time horizons: short (90 days), medium (1 year), and long term (5 years).
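The two definitions above are simple arithmetic and can be checked against the province-wide totals this abstract reports; the usage below works through the 2018 medium-term figures (total ERA cost $2.04 million, NB $9.40 million), from which the gross benefit and the reported ROI of about 5.6 follow.

```python
def net_benefit(benefit, cost):
    """NB = benefit - cost."""
    return benefit - cost

def roi(benefit, cost):
    """ROI = benefit / cost."""
    return benefit / cost

# 2018 province-wide, medium-term (1-year) figures in millions of dollars.
cost_2018 = 2.04
nb_medium = 9.40
benefit = nb_medium + cost_2018   # implied gross benefit: 11.44
ratio = roi(benefit, cost_2018)   # ~5.6, matching the reported ROI
```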
Results:
EVT was associated with reduced gross HSU costs for all three time horizons. Given that the total costs of ERA were $2.04 million in 2018 ($11,860/patient) and $3.73 million in 2019 ($17,070/patient), NB per patient in 2018 (2019) was estimated at −$7,313 (−$12,524), $54,592 ($49,381), and $47,070 ($41,859) for the short, medium, and long-term time horizons, respectively. Total NB for the province in 2018 (2019) was −$1.26 (−$2.74), $9.40 ($10.78), and $8.11 ($9.14) million; ROI ratios were 0.4 (0.3), 5.6 (3.9), and 5.0 (3.5). The probabilities of ERA being cost saving were 39% (31%), 97% (96%), and 94% (91%) for the short, medium, and long-term time horizons, respectively.
Conclusion:
The ERA program was cost saving in the medium and long-term time horizons. Results emphasized the importance of considering a broad range of HSU and long-term impact to capture the full ROI.