Vast disparities between and within American states’ responses to the COVID-19 pandemic have drawn renewed attention to whether greater centralization might enhance investments in subnational capacity and remedy subnational inequalities, or instead erode subnational organizational capacity. Developments in American public education (1997–2015) offer perspective on this puzzle, which we examine by applying interrupted time series analysis to a novel dataset to assess the implications of centralization for subnational investments in administrative and technical capacity, two dimensions of organizational capacity. We find simultaneous subnational erosion in administrative capacity and growth in technical capacity following centralization, both of which appear concentrated in low-poverty areas despite centralization’s explicit antipoverty purposes. Public education reforms highlight both the challenge of dismantling subnational inequality through centralization and the need for future research on policy designs that enable centralization to yield subnational capacity able to remedy inequality.
Coronavirus disease 2019 (COVID-19) vaccination effectiveness in healthcare personnel (HCP) has been established. However, questions remain regarding its performance in high-risk healthcare occupations and work locations. We describe the effect of a COVID-19 HCP vaccination campaign on SARS-CoV-2 infection by timing of vaccination, job type, and work location.
We conducted a retrospective review of COVID-19 vaccination acceptance, incidence of postvaccination COVID-19, hospitalization, and mortality among 16,156 faculty, students, and staff at a large academic medical center. Data were collected 8 weeks prior to the start of phase 1a vaccination of frontline employees and ended 11 weeks after campaign onset.
The COVID-19 incidence rate among HCP at our institution decreased from 3.2% during the 8 weeks prior to the start of vaccinations to 0.38% by 4 weeks after campaign initiation. COVID-19 risk was reduced among individuals who received a single vaccination (hazard ratio [HR], 0.52; 95% confidence interval [CI], 0.40–0.68; P < .0001) and was further reduced with 2 doses of vaccine (HR, 0.17; 95% CI, 0.09–0.32; P < .0001). By 2 weeks after the second dose, the observed case positivity rate was 0.04%. Among phase 1a HCP, we observed a lower risk of COVID-19 among physicians and a trend toward higher risk for respiratory therapists independent of vaccination status. Rates of infection were similar in a subgroup of nurses when examined by work location.
Our findings show the real-world effectiveness of COVID-19 vaccination in HCP. Despite these encouraging results, unvaccinated HCP remain at an elevated risk of infection, highlighting the need for targeted outreach to combat vaccine hesitancy.
Bleeding in the perioperative period of congenital heart surgery with cardiopulmonary bypass is associated with increased morbidity and mortality both from the direct effects of haemorrhage as well as the therapies deployed to restore haemostasis. Perioperative bleeding is complex and multifactorial with both patient and procedural contributions. Moreover, neonates and infants are especially at risk. The objective of this review is to summarise the evidence regarding bleeding management in paediatric surgical patients and identify strategies that might facilitate appropriate bleeding management while minimising the risk of thrombosis. We will address the use of standard and point-of-care tests, and the role of contemporary coagulation factors and other novel drugs.
The objectives of this study were (1) to develop and validate a simulation model to estimate daily probabilities of healthcare-associated infections (HAIs), length of stay (LOS), and mortality using time-varying patient- and unit-level factors including staffing adequacy, and (2) to examine whether HAI incidence varies with staffing adequacy.
The study was conducted at 2 tertiary- and quaternary-care hospitals, a pediatric acute care hospital, and a community hospital within a single New York City healthcare network.
All patients discharged from 2012 through 2016 (N = 562,435).
We developed a non-Markovian simulation to estimate daily conditional probabilities of bloodstream infection, urinary tract infection, surgical site infection, Clostridioides difficile infection, pneumonia, length of stay, and mortality. Staffing adequacy was modeled based on total nurse staffing (care supply) and the Nursing Intensity of Care Index (care demand). We compared model performance with logistic regression, and we generated case studies to illustrate daily changes in infection risk. We also described infection incidence by unit-level staffing and patient care demand on the day of infection.
Most model estimates fell within 95% confidence intervals of actual outcomes. The predictive power of the simulation model exceeded that of logistic regression (area under the curve [AUC], 0.852 and 0.816, respectively). HAI incidence was greatest when staffing was lowest and nursing care intensity was highest.
This model has potential clinical utility for identifying modifiable conditions in real time, such as low staffing coupled with high care demand.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO, and Cochrane Central (from inception to 12/01/2020) for RCTs that included the Revised Clinical Interview Schedule (CIS-R), the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses in primary care depression RCTs. Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6,024) of thirteen eligible studies (n = 6,175) provided individual patient data. There was a 31% (95% CI: 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
Total cost estimates for crime in the USA are both out of date and incomplete. We estimated the incidence and costs of personal crimes (both violent and non-violent) and property crimes in 2017. Incidence came from national arrest data, multi-state estimates of police-reported crimes per arrest, national victimization and road crash surveys, and police underreporting studies. We updated and expanded upon published unit costs. Estimated crime costs totaled $2.6 trillion ($620 billion in monetary costs plus quality-of-life losses valued at $1.95 trillion; 95% uncertainty interval, $2.2–$3.0 trillion). Violent crime accounted for 85% of costs. The principal contributors to the 10.9 million quality-adjusted life years lost were sexual violence, physical assault/robbery, and child maltreatment. Monetary expenditures caused by criminal victimization represent 3% of Gross Domestic Product, equivalent to the amount spent on national defense. These estimates exclude the additional costs of preventing and avoiding crime, such as enhanced lighting and burglar alarms. They also exclude crimes against businesses and most white-collar and corporate offenses.
Amongst patients with CHD, the time of transition to adulthood is associated with lapses in care leading to significant morbidity. The purpose of this study was to identify differences in perceptions between parents and teens in regard to transition readiness.
Responses were collected from 175 teen–parent pairs via the validated CHD Transition Readiness survey and an information request checklist. The survey was distributed via an electronic tablet at a routine clinic visit.
Parents reported a perceived knowledge gap of 29.2% (the percentage of survey items a parent believes their teen does not know), compared to teens self-reporting deficits on an average of 25.9% of survey items (p = 0.01). Agreement was lowest for long-term medical needs, physical activities allowed, insurance, and education. In regard to self-management behaviours, agreement between parent and teen was slight to moderate (weighted κ statistic = 0.18 to 0.51). For self-efficacy, agreement ranged from slight to fair (weighted κ = 0.16 to 0.28). Teens were more likely than their parents to request information (79% versus 65% requesting at least one item), particularly in regard to pregnancy/contraception and insurance.
Parents and teens differ in several key perceptions regarding knowledge, behaviours, and feelings related to the management of heart disease. Specifically, parents perceive a higher knowledge deficit, teens perceive higher self-efficacy, and parents and teens agree that self-management is low.
A case–case–control investigation (216 patients) examined the risk factors and outcomes of carbapenem-resistant Enterobacter (CR-En) acquisition. Recent exposure to fluoroquinolones, intensive care unit (ICU) stay, and rapidly fatal McCabe condition were independent predictors for acquisition. Acquiring CR-En was independently associated with discharge to a long-term care facility after being admitted from home.
Background: Pseudomonas aeruginosa is an important nosocomial pathogen associated with intrinsic and acquired resistance mechanisms to major classes of antibiotics. To better understand clinical risk factors for drug-resistant P. aeruginosa infection, decision-tree models for the prediction of fluoroquinolone- and carbapenem-resistant P. aeruginosa were constructed and compared to multivariable logistic regression models using performance characteristics. Methods: In total, 5,636 patients admitted to 4 hospitals within a New York City healthcare system from 2010 to 2016 with blood, respiratory, wound, or urine cultures growing P. aeruginosa were included in the analysis. Presence or absence of drug resistance was defined using the first culture of any source positive for P. aeruginosa during each hospitalization. To train and validate the prediction models, cases were randomly split (60:40) into training and validation datasets. Clinical decision-tree models for both fluoroquinolone and carbapenem resistance were built from the training dataset using 21 clinical variables of interest, and multivariable logistic regression models were built using the 16 clinical variables associated with resistance in bivariate analyses. Decision-tree models were optimized using K-fold cross-validation, and performance characteristics between the 4 models were compared. Results: From 2010 through 2016, the prevalence of fluoroquinolone and carbapenem resistance was 32% and 18%, respectively. For fluoroquinolone resistance, the logistic regression algorithm attained a positive predictive value (PPV) of 0.57 and a negative predictive value (NPV) of 0.73 (sensitivity, 0.27; specificity, 0.90), and the decision-tree algorithm attained a PPV of 0.65 and an NPV of 0.72 (sensitivity, 0.21; specificity, 0.95).
For carbapenem resistance, the logistic regression algorithm attained a PPV of 0.53 and an NPV of 0.85 (sensitivity, 0.20; specificity, 0.96), and the decision-tree algorithm attained a PPV of 0.59 and an NPV of 0.84 (sensitivity, 0.22; specificity, 0.96). The decision-tree partitioning algorithm identified prior fluoroquinolone resistance, skilled nursing facility (SNF) stay, sex, and length of stay as the variables of greatest importance for fluoroquinolone resistance, and prior carbapenem resistance, age, and length of stay for carbapenem resistance. The highest-performing decision tree for fluoroquinolone resistance is illustrated in Fig. 1. Conclusions: Supervised machine-learning techniques may facilitate prediction of P. aeruginosa resistance and identification of the risk factors driving resistance patterns in hospitalized patients. Such techniques may be applied to readily available clinical information from hospital electronic health records to aid clinical decision making.
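The PPV, NPV, sensitivity, and specificity figures reported above all follow from the standard confusion-matrix definitions. A minimal sketch of those definitions (the counts below are hypothetical for illustration, not the study's data):

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics used to compare prediction models."""
    return {
        "ppv": tp / (tp + fp),          # P(truly resistant | predicted resistant)
        "npv": tn / (tn + fn),          # P(truly susceptible | predicted susceptible)
        "sensitivity": tp / (tp + fn),  # fraction of resistant isolates detected
        "specificity": tn / (tn + fp),  # fraction of susceptible isolates correctly cleared
    }

# Hypothetical counts for illustration only
m = classification_metrics(tp=42, fp=23, tn=310, fn=160)
print({k: round(v, 2) for k, v in m.items()})
```

Note that with a resistance prevalence well below 50%, a model can post a high NPV and specificity while still missing most resistant isolates (low sensitivity), which is why the paired reporting above matters.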
Several studies suggest significant relationships between migration and autism spectrum disorder (ASD) but there are discrepant results. Given that no studies to date have included a pathological control group, the specificity of the results in ASD can be questioned.
To compare the migration experience (premigration, migratory trip, postmigration) in ASD and non-ASD pathological control groups, and study the relationships between migration and autism severity.
Parents’ and grandparents’ migrant status was compared in 30 prepubertal boys with ASD and 30 prepubertal boys without ASD but with language disorders, using a questionnaire including Human Development Index (HDI)/Inequality-adjusted Human Development Index (IHDI) of native countries. Autism severity was assessed using the Child Autism Rating Scale, Autism Diagnostic Observation Schedule and Autism Diagnostic Interview-Revised scales.
The parents’ and grandparents’ migrant status frequency did not differ between ASD and control groups and was not associated with autism severity. The HDI/IHDI values of native countries were significantly lower for parents and grandparents of children with ASD compared with the controls, especially for paternal grandparents. Furthermore, HDI/IDHI levels from the paternal line (father and especially paternal grandparents) were significantly negatively correlated with autism severity, particularly for social interaction impairments.
In this study, parents’ and/or grandparents’ migrant status did not discriminate ASD and pathological control groups and did not contribute either to autism severity. However, the HDI/IHDI results suggest that social adversity-related stress experienced in native countries, especially by paternal grandparents, is potentially a traumatic experience that may play a role in ASD development. A ‘premigration theory of autism’ is then proposed.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2,442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in the noncoding RNA AL589740.1, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality, and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
Quality-adjusted life-years (QALYs) and disability-adjusted life-years (DALYs) are commonly used in cost-effectiveness analysis (CEA) to measure health benefits. We sought to quantify and explain differences between QALY- and DALY-based cost-effectiveness ratios, and explore whether using one versus the other would materially affect conclusions about an intervention's cost-effectiveness.
We identified CEAs using both QALYs and DALYs from the Tufts Medical Center CEA Registry and Global Health CEA Registry, with a supplemental search to ensure comprehensive literature coverage. We calculated absolute and relative differences between the QALY- and DALY-based ratios, and compared ratios to common benchmarks (e.g., 1× gross domestic product per capita). We converted reported costs into US dollars.
Among eleven published CEAs reporting both QALYs and DALYs, seven focused on pharmaceuticals and infectious disease, and five were conducted in high-income countries. Four studies concluded that the intervention was “dominant” (cost-saving). Among the QALY- and DALY-based ratios reported from the remaining seven studies, absolute differences ranged from approximately $2 to $15,000 per unit of benefit, and relative differences from 6% to 120%, but most differences were modest in comparison with the ratio value itself. The values assigned to utility and disability weights explained most observed differences. In comparison with cost-effectiveness thresholds, conclusions were consistent regardless of the ratio type in ten of eleven cases.
Our results suggest that although QALY- and DALY-based ratios for the same intervention can differ, differences tend to be modest and do not materially affect comparisons to common cost-effectiveness thresholds.
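To make the comparison above concrete: a cost-effectiveness ratio is incremental cost divided by incremental health benefit (QALYs gained or DALYs averted), judged against a willingness-to-pay benchmark such as 1× GDP per capita. The intervention figures and threshold below are hypothetical, chosen only to illustrate how a QALY-based and DALY-based ratio can differ yet reach the same conclusion:

```python
def cost_effectiveness_ratio(incremental_cost, incremental_benefit):
    """Cost per unit of health benefit (e.g. $ per QALY gained or per DALY averted)."""
    return incremental_cost / incremental_benefit

def verdict(ratio, threshold):
    """Compare a ratio to a willingness-to-pay threshold (e.g. 1x GDP per capita)."""
    return "cost-effective" if ratio <= threshold else "not cost-effective"

# Hypothetical intervention: $150,000 extra cost; benefit measured two ways
qaly_ratio = cost_effectiveness_ratio(150_000, 12.5)  # 12.5 QALYs gained
daly_ratio = cost_effectiveness_ratio(150_000, 11.8)  # 11.8 DALYs averted
threshold = 60_000  # hypothetical benchmark in US dollars

# The two ratios differ (here by ~6%), but the conclusion is the same
relative_difference = (daly_ratio - qaly_ratio) / qaly_ratio
print(verdict(qaly_ratio, threshold), verdict(daly_ratio, threshold))
```

The gap between the two ratios comes from the denominator, i.e. the utility versus disability weights, mirroring the finding above that weight assignments explained most observed differences.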
To evaluate the National Health Safety Network (NHSN) hospital-onset Clostridioides difficile infection (HO-CDI) standardized infection ratio (SIR) risk adjustment for general acute-care hospitals with large numbers of intensive care unit (ICU), oncology unit, and hematopoietic cell transplant (HCT) patients.
Retrospective cohort study.
Eight tertiary-care referral general hospitals in California.
We used FY 2016 data and the published 2015 rebaseline NHSN HO-CDI SIR. We compared facility-wide inpatient HO-CDI events and SIRs, with and without ICU data, oncology and/or HCT unit data, and ICU bed adjustment.
For these hospitals, the median unmodified HO-CDI SIR was 1.24 (interquartile range [IQR], 1.15–1.34); 7 hospitals qualified for the highest ICU bed adjustment; 1 hospital received the second highest ICU bed adjustment; and all had oncology-HCT units with no additional adjustment per the NHSN. Removal of ICU data and the ICU bed adjustment decreased HO-CDI events (median, −25%; IQR, −20% to −29%) but increased the SIR at all hospitals (median, 104%; IQR, 90%–105%). Removal of oncology-HCT unit data decreased HO-CDI events (median, −15%; IQR, −14% to −21%) and decreased the SIR at all hospitals (median, −8%; IQR, −4% to −11%).
For tertiary-care referral hospitals with specialized ICUs and a large number of ICU beds, the ICU bed adjustor functions as a global adjustment in the SIR calculation, accounting for the increased complexity of patients in ICUs and non-ICUs at these facilities. However, the SIR decrease with removal of oncology and HCT unit data, even with the ICU bed adjustment, suggests that an additional adjustment should be considered for oncology and HCT units within general hospitals, perhaps similar to what is done for ICU beds in the current SIR.
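The SIR at the center of this analysis is, in outline, observed events divided by risk-adjusted predicted events. A minimal sketch with hypothetical counts (not the study's data) shows how removing a high-acuity unit's data can raise the ratio: the unit contributes events to the numerator, but an even larger share of predicted events to the denominator:

```python
def sir(observed, predicted):
    """NHSN-style standardized infection ratio: observed / risk-adjusted predicted events."""
    return observed / predicted

# Hypothetical facility-wide counts, for illustration only
obs_all, pred_all = 124, 100.0  # facility-wide SIR of 1.24
obs_icu, pred_icu = 31, 45.0    # ICU share: fewer observed events than predicted credit

sir_all = sir(obs_all, pred_all)
sir_no_icu = sir(obs_all - obs_icu, pred_all - pred_icu)

# Removing the ICU drops observed events by 25% but predicted events by 45%,
# so the SIR rises -- the same direction of change the study reports.
print(round(sir_all, 2), round(sir_no_icu, 2))
```

The same arithmetic run in the opposite direction (a unit whose predicted-event share is smaller than its observed share, as with the oncology-HCT units here) lowers the SIR when removed, which is what motivates the call for an additional adjustment.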
In Wales, devolution signified an opportunity for the Welsh government to do politics differently. In particular, there was a focus on public participation as a mechanism for improvements to the economy, social outcomes and public services (Welsh Government, 2004). In ambition, at least, the devolution experiment in Wales anticipated the development of regulations for the engagement of its citizens. This chapter considers the role of community anchor organisations in the ‘flagship’ regeneration programme of the National Assembly for Wales, ‘Communities First’, launched in 2001 and later terminated in March 2018. The programme started as a ‘bottom-up’ initiative for engaging with disadvantaged communities at the margins, setting up regulatory structures to deliver that vision; became a reduced and more competitive programme from 2008/09, with more defined outcomes; and then entered its final phase in 2012, with ‘clusters’ of communities that were expected to deliver government-driven outcomes on health, learning and, in particular, employability through a system of results-based accountability (RBA). In the process, regulation for engagement shifted to regulatory structures and processes that controlled engagement: the regulation of engagement.
Other research has traced the evolution of the programme (Pearce, 2012; Dicks, 2014) in the context of a bold policy experiment in a devolved context while the programme was still live. This chapter, however, unpicks the story of its evolution and demise from the perspectives of community development advisors and community development practitioners, the latter based in two community organisations in South Wales: South Riverside Community Development Centre (SRCDC) in Cardiff and 3Gs Community Development Trust in Merthyr Tydfil. Both organisations were involved in the Productive Margins programme and in the design and analysis of this research. Both pre-existed the Communities First programme and were charged with its delivery to local people. We look at the regulatory context in which these organisations found themselves and how they negotiated the demands of the state-funded programme, on the one hand, and their accountabilities to the communities that they believed they represented, on the other. A key question remains as to whether the involvement of community organisations in state-funded programmes can facilitate regulation for engagement for social change or whether their power to improve the well-being of the communities they represent might better be served in providing alternative modes of living.
A new model of radicalisation has appeared in Western countries since the 2010s. Radical groups are smaller, less hierarchical and are mainly composed of young, homegrown individuals. The aim of this review is to decipher the profiles of the European adolescents and young adults who have embraced the cause of radical Islamism and to define the role of psychiatry in dealing with this issue.
We performed a systematic search in several databases from January 2010 to July 2017 and reviewed the relevant studies that included European adolescents and/or young adults and presented empirical data.
In total, 22 qualitative and quantitative studies from various fields and using different methodologies were reviewed. Psychotic disorders are rare among radicalised youths. However, they show numerous risk factors in common with adolescent psychopathologies. We develop a comprehensive three-level model to explain the phenomenon of radicalisation among young Europeans: (1) individual risk factors include psychological vulnerabilities such as early experiences of abandonment, perceived injustice and personal uncertainty; (2) micro-environmental risk factors include family dysfunction and friendships with radicalised individuals; (3) societal risk factors include geopolitical events and societal changes such as Durkheim’s concept of anomie. Some systemic factors are also implicated, as there is a specific encounter between recruiters and the individual: the former use sectarian techniques to isolate and dehumanise the latter and to offer them a new societal model.
There are many similarities between psychopathological manifestations of adolescence and mechanisms at stake during the radicalisation process. As a consequence, and despite the rarity of psychotic disorders, mental health professionals have a role to play in the treatment and understanding of radical engagement among European youth. Studies with empirical data are limited, and more research should be promoted (in particular in females and in non-Muslim communities) to better understand the phenomenon and to propose recommendations for prevention and treatment.
Preparing for and responding to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE) is a national need. These incidents require specific subject-matter expertise yet share commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency, together with establishment of the CBRNE medical operations science support expert, informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Nurses’ broad knowledge and treatment skills are instrumental to disaster management. Roles, responsibilities, and practice take on additional dimensions to their regular roles during these times. Despite this crucial position, the literature indicates a gap between their actual work in emergencies and the investment in training and establishing response plans.
To explore trends in disaster nursing reflected in professional literature, link these trends to current disaster nursing competencies and standards, and reflect based on the literature how nursing can better contribute to disaster management.
A systematic literature review was conducted using six electronic databases, examining peer-reviewed English-language journal articles. Selected publications were examined to explore the domains of disaster nursing: policy, education, practice, and research. An additional consideration was the scope of each paper: local, national, regional, or international. The International Council of Nurses' (ICN) disaster nursing competencies are examined in this context.
The search yielded 171 articles that met the inclusion criteria. Articles were published between 2001 and 2018, showing an annual increase. Of the articles, 48% (n = 82) were research studies and 12% (n = 20) were defined as dealing with management issues. Classified by domain, 48% (n = 82) dealt with practical implications of disaster nursing and 35% (n = 60) discussed educational issues. Only 11% of the papers reviewed policy matters, and of these, two included research. Classified by scope, about 11% (n = 18) had an international perspective.
Current standards attribute a greater leadership role to disaster nursing in disaster preparedness, particularly from a policy perspective. However, this study indicates that only about 11% of the publications reviewed addressed policy issues and management matters. A high percentage of the educational publications discuss the importance of including disaster nursing issues in curricula. Advancing this area will require dedicated studies.