The Fontan Outcomes Network was created to improve outcomes for children and adults with single ventricle CHD living with Fontan circulation. The network mission is to optimise longevity and quality of life by improving physical health, neurodevelopmental outcomes, resilience, and emotional health for these individuals and their families. This manuscript describes the systematic design of this new learning health network, including the initial steps in development of a national, lifespan registry, and pilot testing of data collection forms at 10 congenital heart centres.
Time is of the essence to continue the pandemic disaster cycle with a comprehensive post-COVID-19 health care delivery system RECOVERY analysis, plan, and operation at the local, regional, and state levels. The second wave of the COVID-19 pandemic response is not the ripples of acute COVID-19 patient clusters that will persist until a vaccine strategy is designed and implemented to effect herd immunity. The COVID-19 second wave is the patients who have had their primary and specialty care delayed. This exponential wave of patients requires prompt health care delivery system planning and response.
Between 1934 and the time of the 1940 Census, the US government built and leased 30,151 units of public housing, but we know little about the residents who benefited from this housing. We use a unique methodology that compares addresses of five public housing developments to complete-count data from the 1940 Census to identify residents of public housing in New York City at the time of the census. We compare these residents to the larger pool of residents living in New York City in 1940 who were eligible to apply for the housing to assess how closely housing authorities adhered to the intent of the National Industrial Recovery Act (1933) and the Housing Act of 1937. This comparison produces a picture of whom public housing administrators considered deserving of this public benefit at the dawn of the public housing program in the United States. Results indicate a shift toward serving households with lower incomes over time. All the developments had a consistent preference for households with a “nuclear family” structure, but policies favoring racial segregation and other discretion on the part of housing authorities for tenant selection created distinct populations across housing developments. Households headed by a naturalized citizen were favored over households headed by a native-born citizen in nearly all the public housing projects. This finding suggests a more nuanced understanding of who public housing administrators considered deserving of the first public housing than archival research accounts had previously indicated.
To describe pathogen distribution and rates for central-line–associated bloodstream infections (CLABSIs) from different acute-care locations during 2011–2017 to inform prevention efforts.
CLABSI data from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) were analyzed. Percentages and pooled mean incidence density rates were calculated for a variety of pathogens and stratified by acute-care location groups (adult intensive care units [ICUs], pediatric ICUs [PICUs], adult wards, pediatric wards, and oncology wards).
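As an illustrative sketch of the pooled mean incidence density calculation referenced above (the NHSN convention expresses CLABSI rates per 1,000 central-line days; the counts, grouping labels, and Python framing here are hypothetical, not NHSN data):

```python
# Illustrative sketch only: hypothetical counts, not NHSN data.
# Pooled mean incidence density = total CLABSIs / total central-line days x 1,000.
from collections import defaultdict

reports = [
    # (location group, CLABSI count, central-line days) -- hypothetical values
    ("adult ICU", 12, 9800),
    ("adult ICU", 7, 6100),
    ("pediatric ward", 3, 4200),
]

totals = defaultdict(lambda: [0, 0])
for group, clabsis, line_days in reports:
    totals[group][0] += clabsis
    totals[group][1] += line_days

for group, (clabsis, line_days) in totals.items():
    pooled_rate = clabsis / line_days * 1000  # per 1,000 central-line days
    print(f"{group}: {pooled_rate:.2f} CLABSIs per 1,000 central-line days")
```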
From 2011 to 2017, 136,264 CLABSIs were reported to the NHSN by adult and pediatric acute-care locations; adult ICUs and wards reported the most CLABSIs: 59,461 (44%) and 40,763 (30%), respectively. In 2017, the most common pathogens were Candida spp/yeast in adult ICUs (27%) and Enterobacteriaceae in adult wards, pediatric wards, oncology wards, and PICUs (23%–31%). Most pathogen-specific CLABSI rates decreased over time, excepting Candida spp/yeast in adult ICUs and Enterobacteriaceae in oncology wards, which increased, and Staphylococcus aureus rates in pediatric locations, which did not change.
The pathogens associated with CLABSIs differ across acute-care location groups. Pathogen-targeted prevention efforts, such as those aimed at preventing Candida spp/yeast and Enterobacteriaceae CLABSIs, could augment current prevention strategies and might further reduce national rates.
Novel tools for early diagnosis and monitoring of schistosomiasis are urgently needed. This study aimed to validate parasite-derived miRNAs as potential novel biomarkers for the detection of human Schistosoma japonicum infection. A total of 21 miRNAs were initially validated by real-time polymerase chain reaction (RT-PCR) using serum samples of S. japonicum-infected BALB/c mice. Of these, 6 miRNAs were further validated with a human cohort of individuals from a schistosomiasis-endemic area of the Philippines. RT-PCR analysis showed that two parasite-derived miRNAs (sja-miR-2b-5p and sja-miR-2c-5p) could detect infected individuals with low infection intensity with moderate sensitivity/specificity values of 66%/68% and 55%/80%, respectively. Analysis of the combined data for the two parasite miRNAs revealed a specificity of 77.4% and a sensitivity of 60.0% with an area under the curve (AUC) value of 0.6906 (P = 0.0069); however, a duplex RT-PCR targeting both sja-miR-2b-5p and sja-miR-2c-5p did not result in an increased diagnostic performance compared with the singleplex assays. Furthermore, the serum level of sja-miR-2c-5p correlated significantly with faecal egg counts, whereas the other five miRNAs did not. Targeting S. japonicum-derived miRNAs in serum resulted in a moderate diagnostic performance when applied to a low schistosome infection intensity setting.
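As a minimal sketch of how diagnostic performance figures such as these are commonly derived (synthetic serum-signal values and an arbitrary threshold, not data or code from this study):

```python
# Synthetic example of computing sensitivity, specificity, and AUC for a
# serum miRNA readout at a chosen threshold; not data from this study.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
infected = rng.normal(1.5, 1.0, 60)    # hypothetical normalised signal, cases
uninfected = rng.normal(0.5, 1.0, 60)  # hypothetical signal, controls

y_true = np.r_[np.ones_like(infected), np.zeros_like(uninfected)]
signal = np.r_[infected, uninfected]

auc = roc_auc_score(y_true, signal)

threshold = 1.0                        # arbitrary cut-off for illustration
y_pred = signal >= threshold
sensitivity = (y_pred & (y_true == 1)).sum() / (y_true == 1).sum()
specificity = (~y_pred & (y_true == 0)).sum() / (y_true == 0).sum()

print(f"AUC={auc:.3f}  sensitivity={sensitivity:.1%}  specificity={specificity:.1%}")
```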
Consanguineous marriages potentially play an important role in the transmission of β-thalassaemia in many communities. This study aimed to determine the rate and socio-demographic associations of consanguineous marriages and to assess the influence on the prevalence of β-thalassaemia in Sri Lanka. Three marriage registrars from each district of Sri Lanka were randomly selected to prospectively collect data on all couples who registered their marriage during a 6-month period starting 1st July 2009. Separately, the parents of patients with β-thalassaemia were interviewed to identify consanguinity. A total of 5255 marriages were recorded from 22 districts. The average age at marriage was 27.3 (±6.1) years for males and 24.1 (±5.7) years for females. A majority (71%) of marriages were ‘love’ marriages, except in the Moor community where 84% were ‘arranged’ marriages. Overall, the national consanguinity rate was 7.4%. It was significantly higher among ethnic Tamils (22.4%) compared with Sinhalese (3.8%) and Moors (3.2%) (p < 0.001). Consanguinity rates were also higher in ‘arranged’ as opposed to ‘love’ marriages (11.7% vs 5.6%, p < 0.001). In patients with β-thalassaemia, the overall consanguinity rate was 14.5%; it was highest among Tamils (44%) and lowest among Sinhalese (12%). Parental consanguinity among patients with β-thalassaemia was double the national average. Although consanguinity is not the major factor in the transmission of the disease in the country, emphasis should be given to this significant practice when conducting β-thalassaemia prevention and awareness campaigns, especially in high-prevalence communities.
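As an illustrative sketch of how consanguinity rates can be compared across groups with a chi-square test (the counts below are hypothetical, not the study data):

```python
# Hypothetical contingency table (consanguineous vs other marriages, by group);
# the counts are illustrative, not the study data.
import numpy as np
from scipy.stats import chi2_contingency

#                  Tamil  Sinhalese  Moor
consanguineous = [   112,       150,   16]
other          = [   388,      3800,  484]

table = np.array([consanguineous, other])
chi2, p, dof, expected = chi2_contingency(table)
rates = table[0] / table.sum(axis=0)
print(f"rates={np.round(rates, 3)}  chi2={chi2:.1f}  dof={dof}  p={p:.3g}")
```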
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other commonly used methodologies for neurocognitive impairment (NCI) in HIV including the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods, in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati criteria versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were formed using GDS criteria in place of Frascati criteria.
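For readers unfamiliar with the GDS approach referenced above, the following is a minimal sketch of the commonly cited convention (demographically corrected T-scores mapped to deficit scores, averaged, and compared with a 0.5 cut-off); the cut-points and example scores are illustrative assumptions, not details from this paper:

```python
# Illustrative GDS calculation using the commonly cited T-score-to-deficit
# mapping; cut-points and scores are assumptions, not taken from this study.
def deficit_score(t_score: float) -> int:
    if t_score >= 40: return 0   # within normal limits
    if t_score >= 35: return 1   # mild deficit
    if t_score >= 30: return 2   # mild-to-moderate
    if t_score >= 25: return 3   # moderate
    if t_score >= 20: return 4   # moderate-to-severe
    return 5                     # severe

def global_deficit_score(t_scores: list[float]) -> float:
    return sum(deficit_score(t) for t in t_scores) / len(t_scores)

t_scores = [42, 38, 33, 45, 29, 41]             # hypothetical test battery
gds = global_deficit_score(t_scores)
print(f"GDS={gds:.2f}  impaired={gds >= 0.5}")  # >= 0.5 commonly treated as impaired
```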
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
Disturbances in trait emotions are a predominant feature in schizophrenia. However, less is known about (a) differences in trait emotion across phases of the illness such as the clinical high-risk (CHR) phase and (b) whether abnormalities in trait emotion that are associated with negative symptoms are driven by primary (i.e. idiopathic) or secondary (e.g. depression, anxiety) factors.
To examine profiles of trait affective disturbance and their clinical correlates in individuals with schizophrenia and individuals at CHR for psychosis.
In two studies (sample 1: 56 out-patients diagnosed with schizophrenia and 34 demographically matched individuals without schizophrenia (controls); sample 2: 50 individuals at CHR and 56 individuals not at CHR (controls)), participants completed self-report trait positive affect and negative affect questionnaires, clinical symptom interviews (positive, negative, disorganised, depression, anxiety) and community-based functional outcome measures.
Both clinical groups reported lower levels of positive affect (specific to joy among individuals with schizophrenia) and higher levels of negative affect compared with controls. For individuals with schizophrenia, links were found between positive affect and negative symptoms (which remained after controlling for secondary factors) and between negative affect and positive symptoms. For individuals at CHR, links were found between both affect dimensions and both types of symptom (which were largely accounted for by secondary factors).
Both clinical groups showed some evidence of reduced trait positive affect and elevated trait negative affect, suggesting that increasing trait positive affect and reducing trait negative affect is an important treatment goal across both populations. Clinical correlates of these emotional abnormalities were more integrally linked to clinical symptoms in individuals with schizophrenia and more closely linked to secondary influences such as depression and anxiety in individuals at CHR.
Complex challenges may arise when patients present to emergency services with an advance decision to refuse life-saving treatment following suicidal behaviour.
To investigate the use of advance decisions to refuse treatment in the context of suicidal behaviour from the perspective of clinicians and people with lived experience of self-harm and/or psychiatric services.
Forty-one participants aged 18 or over from hospital services (emergency departments, liaison psychiatry and ambulance services) and groups of individuals with experience of psychiatric services and/or self-harm were recruited to six focus groups in a multisite study in England. Data were collected in 2016 using a structured topic guide and included a fictional vignette. They were analysed using thematic framework analysis.
Advance decisions to refuse treatment for suicidal behaviour were contentious across groups. Three main themes emerged from the data: (a) they may enhance patient autonomy and aid clarity in acute emergencies, but also create legal and ethical uncertainty over treatment following self-harm; (b) they are anxiety provoking for clinicians; and (c) in practice, there are challenges in validation (for example, validating the patient’s mental capacity at the time of writing), time constraints and significant legal/ethical complexities.
The potential for patients to refuse life-saving treatment following suicidal behaviour in a legal document was challenging and anxiety provoking for participants. Clinicians should act with caution given the potential for recovery and fluctuations in suicidal ideation. Currently, advance decisions to refuse treatment have questionable use in the context of suicidal behaviour given the challenges in validation. Discussion and further patient research are needed in this area.
Declaration of interest
D.G., K.H. and N.K. are members of the Department of Health's (England) National Suicide Prevention Advisory Group. N.K. chaired the National Institute for Health and Care Excellence (NICE) guideline development group for the longer-term management of self-harm and the NICE Topic Expert Group (which developed the quality standards for self-harm services). He is currently chair of the updated NICE guideline for Depression. K.H. and D.G. are NIHR Senior Investigators. K.H. is also supported by the Oxford Health NHS Foundation Trust and N.K. by the Greater Manchester Mental Health NHS Foundation Trust.
Acute blood loss represents a leading cause of death in both civilian and battlefield trauma, despite the prioritization of massive hemorrhage control by well-adopted trauma guidelines. Current Tactical Combat Casualty Care (TCCC) and Tactical Emergency Casualty Care (TECC) guidelines recommend the application of a tourniquet to treat life-threatening extremity hemorrhages. While extremely effective at controlling blood loss, the proper application of a tourniquet is associated with severe pain and could lead to transient loss of limb function impeding the ability to self-extricate or effectively employ weapons systems. As a potential alternative, Innovative Trauma Care (San Antonio, Texas USA) has developed an external soft-tissue hemostatic clamp that could potentially provide effective hemorrhage control without the aforementioned complications and loss of limb function. Thus, this study sought to investigate the effectiveness of blood loss control by an external soft-tissue hemostatic clamp versus a compression tourniquet.
The external soft-tissue hemostatic clamp would be non-inferior at controlling intravascular fluid loss after damage to the femoral and popliteal arteries in a normotensive, coagulopathic, cadaveric lower-extremity flow model using an inert blood analogue, as compared to a compression tourniquet.
Using a fresh, coagulopathic, normotensive cadaveric-extremity flow model with simulated vascular flow, this study compared the effectiveness of the external soft-tissue hemostatic clamp and the compression tourniquet in controlling fluid loss from simulated femoral and posterior tibial artery lacerations. A sample of 16 fresh, un-embalmed human cadaver lower extremities was used in this randomized, balanced, two-treatment, two-period, two-sequence crossover design. Statistical significance of the treatment comparisons was assessed with paired t-tests. Results were expressed as the mean and standard deviation (SD).
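As a minimal sketch of the paired comparison described above (synthetic fluid-loss values and SciPy's paired t-test; not the study data or analysis code):

```python
# Synthetic paired t-test comparing fluid loss (mL) under the two treatments
# applied to the same cadaveric limbs; values are illustrative only.
import numpy as np
from scipy import stats

clamp_loss = np.array([118.0, 125.5, 110.2, 130.8, 115.4, 122.1, 108.9, 127.3])
tourniquet_loss = np.array([15.2, 18.4, 12.9, 20.1, 14.7, 16.3, 13.8, 17.5])

t_stat, p_value = stats.ttest_rel(clamp_loss, tourniquet_loss)
diff = clamp_loss - tourniquet_loss
print(f"mean difference = {diff.mean():.1f} mL (SD {diff.std(ddof=1):.1f}), "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```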
Mean intravascular fluid loss from simulated arterial wounds was greater with the external soft-tissue hemostatic clamp than with the compression tourniquet, both at the lower leg (119.8 mL versus 15.9 mL; P <.001) and in the thigh (103.1 mL versus 5.2 mL; P <.001).
In this hemorrhagic, coagulopathic, cadaveric-extremity experimental flow model, the use of the external soft-tissue hemostatic clamp as a hasty hemostatic adjunct was associated with statistically significant greater fluid loss than with the use of the compression tourniquet.
Paquette R, Bierle R, Wampler D, Allen P, Cooley C, Ramos R, Michalek J, Gerhardt RT. External soft-tissue hemostatic clamp compared to a compression tourniquet as primary hemorrhage control device in pilot flow model study. Prehosp Disaster Med. 2019;34(2):175–181
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
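As an illustrative sketch of a multinomial logistic regression of the kind referenced in the Methods (synthetic predictors and placeholder outcome labels, not the study data):

```python
# Hypothetical multinomial logistic regression predicting neurocognitive status
# (SA / CN / CI) from a few candidate predictors; data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 300
X = np.column_stack([
    rng.normal(57, 4, n),      # age (years)
    rng.normal(105, 12, n),    # verbal IQ
    rng.integers(0, 2, n),     # diabetes (0/1)
    rng.poisson(4, n),         # depressive symptom count
])
y = rng.choice(["SA", "CN", "CI"], size=n)   # placeholder outcome labels

# With more than two classes, the lbfgs solver fits a multinomial model by default.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(clf.classes_, np.round(clf.predict_proba(X[:1])[0], 2))))
```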
Detailed study of subsurface deposits in the Polish Sudeten Foreland, particularly with reference to provenance data, has revealed that an extensive preglacial drainage system developed there in the Pliocene–Early Pleistocene, with both similarities and differences in comparison with the present-day Odra (Oder) system. This foreland is at the northern edge of an intensely deformed upland, metamorphosed during the Variscan orogeny, with faulted horsts and grabens reactivated in the Late Cenozoic. The main arm of preglacial drainage of this area, at least until the early Middle Pleistocene, was the Palaeo–Nysa Kłodzka, precursor of the Odra left-bank tributary of that name. Significant preglacial evolution of this drainage system can be demonstrated, including incision into the landscape, prior to its disruption by glaciation in the Elsterian (Sanian) and again in the early Saalian (Odranian), which resulted in burial of the preglacial fluvial archives by glacial and fluvioglacial deposits. No later ice sheets reached the area, in which the modern drainage pattern became established, the rivers incising afresh into the landscape and forming post-Saalian terrace systems. Issues of compatibility of this record with the progressive uplift implicit in the formation of conventional terrace systems are examined, with particular reference to crustal properties, which are shown to have had an important influence on landscape and drainage evolution in the region.
The 6th Willi Steiner Memorial Lecture was delivered by David Allen Green and took place on 8 June 2017 during the Annual Conference of the British and Irish Association of Law Librarians (BIALL) which was held in Manchester. His talk concerned libraries and public policy with particular reference to Brexit. He addressed the issues of how a debate like Brexit can be better informed and to what extent reliable legal and policy information makes any difference. In essence, he looks at how good information can help shape Brexit. This article is a later write-up from David's speaking notes. The lecture was, coincidentally, given on the same day as the general election.
To determine risk factors independent of length of stay (LOS) for Staphylococcus aureus acquisition in infants admitted to the neonatal intensive care unit (NICU).
Retrospective matched case–case-control study.
Quaternary-care referral NICU at a large academic children’s hospital.
Infants admitted between January 2014 and March 2016 at a level IV NICU who acquired methicillin-resistant (MRSA) or methicillin-susceptible (MSSA) S. aureus were matched with controls by duration of exposure to determine risk factors for acquisition. A secondary post hoc analysis was performed on the entire cohort of at-risk infants for risk factors identified in the primary analysis to further quantify risk.
In total, 1,751 infants were admitted during the study period, with 199 infants identified as having S. aureus prevalent on admission. There were 246 incident S. aureus acquisitions in the remaining at-risk infant cohort. On matched analysis, infants housed in a single-bed unit had a significantly decreased risk of both MRSA (P=.03) and MSSA (P=.01) acquisition compared with infants housed in multibed pods. Across the entire cohort, pooled S. aureus acquisition was significantly lower in infants housed in single-bed units (hazard ratio, 0.46; confidence interval, 0.34–0.62).
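As an illustrative sketch of how a hazard ratio such as the one above might be estimated (using the third-party lifelines library on synthetic data; column names, values, and censoring rule are assumptions, not the study's analysis):

```python
# Hypothetical Cox proportional hazards sketch: time to S. aureus acquisition
# versus bed design, using the third-party lifelines library; the data,
# column names, and censoring rule are assumptions, not the study's analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
single_bed = rng.integers(0, 2, n)
baseline = rng.exponential(60, n)                       # days to acquisition
time_to_event = baseline * np.where(single_bed == 1, 2.0, 1.0)
acquired = (time_to_event <= 90).astype(int)            # censor at 90 days
days_at_risk = np.minimum(time_to_event, 90)

df = pd.DataFrame({"days_at_risk": days_at_risk,
                   "acquired": acquired,
                   "single_bed": single_bed})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_at_risk", event_col="acquired")
cph.print_summary()   # hazard ratio for single_bed = exp(coef)
```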
NICU bed design is significantly associated with S. aureus acquisition in hospitalized infants independent of LOS.
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (42% vs. 54%), and domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79, 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially of Puerto Rican (vs. Mexican) origin/descent had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry. Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
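As a minimal sketch of how an adjusted odds ratio like the one reported above is typically obtained (statsmodels logistic regression on synthetic data; variable names and effect sizes are hypothetical, not from this study):

```python
# Hypothetical logistic regression: global NCI (0/1) on ethnicity, adjusting
# for a covariate such as nadir CD4; data and effect sizes are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 800
latino = rng.integers(0, 2, n)
nadir_cd4 = rng.normal(180, 90, n) - 30 * latino        # hypothetical confounding
logit = -0.3 + 0.45 * latino - 0.004 * nadir_cd4
nci = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"nci": nci, "latino": latino, "nadir_cd4": nadir_cd4})
model = smf.logit("nci ~ latino + nadir_cd4", data=df).fit(disp=False)

odds_ratio = np.exp(model.params["latino"])
ci_low, ci_high = np.exp(model.conf_int().loc["latino"])
print(f"adjusted OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```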