This study examined struggles to establish autonomy and relatedness with peers in adolescence and early adulthood as predictors of advanced epigenetic aging assessed at age 30. Participants (N = 154; 67 male and 87 female) were observed repeatedly, along with close friends and romantic partners, from ages 13 through 29. Observed difficulty establishing close friendships characterized by mutual autonomy and relatedness from ages 13 to 18, an interview-assessed attachment state of mind lacking autonomy and valuing of attachment at 24, and self-reported difficulties in social integration across adolescence and adulthood were all linked to greater epigenetic age at 30, after accounting for chronological age, gender, race, and income. Analyses assessing the unique and combined effects of these factors, along with lifetime history of cigarette smoking, indicated that each of these factors, except for adult social integration, contributed uniquely to explaining epigenetic age acceleration. Results are interpreted as evidence that the adolescent preoccupation with peer relationships may be highly functional given the relevance of such relationships to long-term physical outcomes.
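Epigenetic age acceleration of the kind reported here is conventionally computed as the residual from regressing epigenetic (clock-estimated) age on chronological age. A minimal sketch with entirely hypothetical ages (none of these values come from the study):

```python
import numpy as np

# Hypothetical data (illustrative only, not from the study):
chron = np.array([25.0, 28.0, 30.0, 33.0, 36.0])  # chronological ages
epi   = np.array([24.0, 30.5, 29.0, 36.0, 35.5])  # epigenetic clock estimates

# Regress epigenetic age on chronological age; the residuals are the
# epigenetic age acceleration for each participant.
slope, intercept = np.polyfit(chron, epi, 1)
acceleration = epi - (slope * chron + intercept)
```

Because least-squares residuals sum to zero by construction, "accelerated" aging is defined relative to the sample regression line: positive residuals indicate participants who are epigenetically older than their chronological age predicts.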
Performance characteristics of SARS-CoV-2 nucleic acid detection assays are understudied within contexts of low pre-test probability, including screening asymptomatic persons without epidemiological links to confirmed cases, or asymptomatic surveillance testing. SARS-CoV-2 detection without symptoms may represent presymptomatic or asymptomatic infection, resolved infection with persistent RNA shedding, or a false-positive test. This study assessed the positive predictive value of SARS-CoV-2 real-time reverse transcription polymerase chain reaction (rRT-PCR) assays by retesting positive specimens from 5 pre-test probability groups ranging from high to low with an alternate assay.
Methods:
In total, 122 rRT-PCR positive specimens collected from unique patients between March and July 2020 were retested using a laboratory-developed nested RT-PCR assay targeting the RNA-dependent RNA polymerase (RdRp) gene followed by Sanger sequencing.
Results:
Significantly fewer (15.6%) positive results in the lowest pre-test probability group (facilities with institution-wide screening having ≤3 positive asymptomatic cases) were reproduced with the nested RdRp gene RT-PCR assay than in each of the 4 groups with higher pre-test probability (individual group range, 50.0%–85.0%).
Conclusions:
Large-scale SARS-CoV-2 screening testing initiatives among low pre-test probability populations should be evaluated thoroughly prior to implementation given the risk of false-positive results and consequent potential for harm at the individual and population level.
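The dependence of positive predictive value on pre-test probability that motivates this conclusion follows directly from Bayes’ theorem. A minimal sketch with illustrative sensitivity and specificity values (assumed for demonstration, not estimates from this study):

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Assumed assay characteristics (illustrative only).
sens, spec = 0.95, 0.999

ppv_high = ppv(sens, spec, 0.10)    # symptomatic group, high pre-test probability
ppv_low = ppv(sens, spec, 0.0001)   # asymptomatic mass screening, low prevalence
```

With the same assay, PPV exceeds 99% at 10% prevalence but falls below 10% at 0.01% prevalence: most positives in a very-low-prevalence population are expected to be false, consistent with the low reproducibility of positive results observed in the lowest pre-test probability group.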
A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including rationale for setting a reference nutrient intake (10 µg/d; 400 IU/d) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. Latest data indicate that UK population vitamin D intakes and status remain relatively unchanged since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose–response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for choice of fortificant. Other considerations for successful fortification strategies include: (i) need for ‘real-world’ cost information for use in modelling work; (ii) supportive food legislation; (iii) improved consumer and health professional understanding of vitamin D’s importance; (iv) clinical consequences of inadequate vitamin D status and (v) consistent communication of Government advice across health/social care professions, and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
In two visual world experiments we disentangled the influence of order of mention (first vs. second mention), grammatical role (subject vs. object), and semantic role (proto-agent vs. proto-patient) on 7- to 10-year-olds’ real-time interpretation of German pronouns. Children listened to SVO or OVS sentences containing active accusative verbs (küssen “to kiss”) in Experiment 1 (N = 72), or dative object-experiencer verbs (gefallen “to like”) in Experiment 2 (N = 64). This was followed by the personal pronoun er or the demonstrative pronoun der. Interpretive preferences for er were most robust when high prominence cues (first mention, subject, proto-agent) were aligned onto the same entity; and the same applied to der for low prominence cues (second mention, object, proto-patient). These preferences were reduced in conditions where cues were misaligned, and there was evidence that each cue independently influenced performance. Crucially, individual variation in age predicted adult-like weighting preferences for semantic cues (Schumacher, Roberts & Järvikivi, 2017).
During March 27–July 14, 2020, the Centers for Disease Control and Prevention’s National Healthcare Safety Network extended its surveillance to hospital capacities in response to the COVID-19 pandemic. The data showed wide variations across hospitals in case burden, bed occupancies, ventilator usage, and healthcare personnel and supply status. These data were used to inform emergency responses.
This systematic review examined the phenomenon of trust during public health emergency events. The literature reviewed comprised field studies conducted with people directly affected or likely to be affected by such events, and included quantitative, qualitative, mixed-method, and case study primary studies in English (N = 38) as well as Arabic, Chinese, French, Russian, and Spanish (all non-English N = 30). Studies were mostly from high- and middle-income countries, and the event most covered was infectious disease. Findings from individual studies were first synthesized within methods and evaluated for certainty/confidence, and then synthesized across methods. The final set of 11 findings synthesized across methods identified a set of activities for enhancing trust and showed that it is a multi-faceted and dynamic concept.
Intensity in adolescent romantic relationships was examined as a long-term predictor of higher adult blood pressure in a community sample followed from age 17 to 31 years. Romantic intensity in adolescence – measured via the amount of time spent alone with a partner and the duration of the relationship – was predicted by parents’ psychologically controlling behavior and was in turn found to predict higher resting adult systolic and diastolic blood pressure even after accounting for relevant covariates. The prediction to adult blood pressure was partially mediated via conflict in nonromantic adult friendships and intensity in adult romantic relationships. Even after accounting for these mediators, however, a direct path from adolescent romantic intensity to higher adult blood pressure remained. Neither family income in adolescence nor trait measures of personality assessed in adulthood accounted for these findings. The results of this study are interpreted both as providing further support for the view that adolescent social relationship qualities have substantial long-term implications for adult health, as well as suggesting a potential physiological mechanism by which adolescent relationships may be linked to adult health outcomes.
Whole-rock major- and trace-element data are presented on a sample collection from the >3 Ga Amikoq Layered Complex (ALC), and hosting amphibolites within the Mesoarchean Akia terrane, SW Greenland. The lithologies range from leuconorite to melanorite/feldspathic orthopyroxenite, orthopyroxenite to harzburgite through to dunite, and tholeiitic basaltic–picritic mafic host rocks. The Amikoq Layered Complex samples are primitive (Mg#: 65–89) with elevated Ni and Cr contents. However, the absence of troctolitic lithologies and the presence of two orthopyroxene compositional trends suggest that the successions might not be comagmatic. On the basis of trace-element cumulate models, relatively low Ni contents and minor negative Sr-Eu anomalies in some high-Ti ultramafic rocks, it is not possible to exclude a petrogenesis related to a melt similar to that of the mafic host rocks. Ultramafic samples with U-shaped trace-element distribution patterns are petrogenetically related to the noritic sequences, either through cumulus mineral accumulation or melt-rock reactions. Assimilation-fractional-crystallisation modelling of melanorites nevertheless requires the parental melt to have been contaminated/mixed with a component of island-arc-like tholeiite affinity. A boninite-like parental melt might have been derived from the subcontinental lithospheric mantle of the Akia terrane, or alternatively via assimilation of an ultramafic parental melt with island-arc-like tholeiite. Given the complex geological evolution and high-grade metamorphic overprint of the Amikoq Layered Complex, we are unable to differentiate between the two models.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified, with 235 self-reports. The mean number of tests per week was 3500, with approximately 80 of these positive (11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
The rapid spread of severe acute respiratory coronavirus virus 2 (SARS-CoV-2) throughout key regions of the United States in early 2020 placed a premium on timely, national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), the nation’s largest hospital surveillance system, launched a module for collecting hospital coronavirus disease 2019 (COVID-19) data. We present time-series estimates of the critical hospital capacity indicators from April 1 to July 14, 2020.
Design:
From March 27 to July 14, 2020, the NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and the availability and/or use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near–real-time daily national and state estimates to be computed.
Results:
During the pandemic’s April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased from April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest regions after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July.
Conclusions:
The NHSN hospital capacity estimates served as important, near–real-time indicators of the pandemic’s magnitude, spread, and impact, providing quantitative guidance for the public health response. Use of the estimates detected the rise of hospitalizations in specific geographic regions in June after they declined from a peak in April. Patient outcomes appeared to improve from early April to mid-July.
Energy deficit is common during prolonged periods of strenuous physical activity and limited sleep, but the extent to which appetite suppression contributes is unclear. The aim of this randomised crossover study was to determine the effects of energy balance on appetite and physiological mediators of appetite during a 72-h period of high physical activity energy expenditure (about 9·6 MJ/d (2300 kcal/d)) and limited sleep designed to simulate military operations (SUSOPS). Ten men consumed an energy-balanced diet while sedentary for 1 d (REST) followed by energy-balanced (BAL) and energy-deficient (DEF) controlled diets during SUSOPS. Appetite ratings, gastric emptying time (GET) and appetite-mediating hormone concentrations were measured. Energy balance was positive during BAL (18 (sd 20) %) and negative during DEF (–43 (sd 9) %). Relative to REST, hunger, desire to eat and prospective consumption ratings were all higher during DEF (26 (sd 40) %, 56 (sd 71) %, 28 (sd 34) %, respectively) and lower during BAL (–55 (sd 25) %, −52 (sd 27) %, −54 (sd 21) %, respectively; Pcondition < 0·05). Fullness ratings did not differ from REST during DEF, but were 65 (sd 61) % higher during BAL (Pcondition < 0·05). Regression analyses predicted hunger and prospective consumption would be reduced and fullness increased if energy balance was maintained during SUSOPS, and energy deficits of ≥25 % would be required to elicit increases in appetite. Between-condition differences in GET and appetite-mediating hormones identified slowed gastric emptying, increased anorexigenic hormone concentrations and decreased fasting acylated ghrelin concentrations as potential mechanisms of appetite suppression. Findings suggest that physiological responses that suppress appetite may deter energy balance from being achieved during prolonged periods of strenuous activity and limited sleep.
Premature ejaculation (PE) and erectile dysfunction (ED) are prevalent sexual problems, with evidence to suggest variation across sexual orientation. Contributing factors have traditionally been divided into organic and psychological categories. While limited research has found support for the influence of metacognitive beliefs, these studies did not investigate potential differences in sexual orientation.
Aim:
The current study aimed to investigate the differences in metacognitive beliefs in men with or without PE and/or ED and whether these varied according to sexual orientation.
Method:
A sample of 531 men was recruited (65 met criteria for PE only, 147 for ED, 83 with PE and ED, and 236 healthy controls). Within this sample, 188 men identified as heterosexual, 144 as bisexual, and 199 as homosexual. Participants completed a cross-sectional online survey consisting of psychometric measures.
Results:
Participants with PE and ED scored significantly higher in cognitive confidence, thoughts concerning uncontrollability and danger, and need to control thoughts than the PE only, ED only, and healthy control groups. Furthermore, the PE only group scored significantly higher than healthy controls for cognitive confidence, and the ED only group scored significantly higher for thoughts concerning uncontrollability and danger. There were no significant differences between differing sexual orientations for men with or without PE and/or ED.
Conclusions:
Congruent with previous research, metacognitive beliefs play a role in PE and/or ED, and this role does not differ by sexual orientation. The findings highlight that assessment and intervention regarding metacognitive beliefs may be beneficial for men of all sexual orientations with PE and/or ED.
There is mounting evidence for the potential for the natural dietary antioxidant and anti-inflammatory amino acid l-Ergothioneine (ERGO) to prevent or mitigate chronic diseases of aging. This has led to the suggestion that it could be considered a ‘longevity vitamin.’ ERGO is produced in nature only by certain fungi and a few other microbes. Mushrooms are, by far, the leading dietary source of ERGO, but it is found in small amounts throughout the food chain, most likely due to soil-borne fungi passing it on to plants. Because some common agricultural practices can disrupt beneficial fungus–plant root relationships, ERGO levels in foods grown under those conditions could be compromised. Thus, research is needed to further analyse the role agricultural practices play in the availability of ERGO in the human diet and its potential to improve our long-term health.
Clinical intuition suggests that personality disorders hinder the treatment of depression, but research findings are mixed. One reason for this might be the way in which current assessment measures conflate general aspects of personality disorders, such as overall severity, with specific aspects, such as stylistic tendencies. The goal of this study was to clarify the unique contributions of the general and specific aspects of personality disorders to depression outcomes.
Methods
Patients admitted to the Menninger Clinic, Houston, between 2012 and 2015 (N = 2352) were followed over a 6–8-week course of multimodal inpatient treatment. Personality disorder symptoms were assessed with the Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders, 4th edition Axis II Personality Screening Questionnaire at admission, and depression severity was assessed using the Patient Health Questionnaire-9 every fortnight. General and specific personality disorder factors estimated with a confirmatory bifactor model were used to predict latent growth curves of depression scores in a structural equation model.
Results
The general factor predicted higher initial depression scores but not different rates of change. By contrast, the specific borderline factor predicted slower rates of decline in depression scores, while the specific antisocial factor predicted a U-shaped pattern of change.
Conclusions
Personality disorder symptoms are best represented by a general factor that reflects overall personality disorder severity, and specific factors that reflect unique personality styles. The general factor predicts overall depression severity, while specific factors predict poorer prognosis, an effect that may be masked in prior studies that do not separate the two.
The metamorphic history of the Mesoarchean Amikoq Layered Complex within the Akia terrane of SW Greenland was characterised by electron microprobe mineral data and detailed petrography on 12 representative samples, integrated with zircon U–Pb geochronology and petrology. The complex intruded into a >3004 Ma supracrustal association now consisting of granoblastic metabasites with subordinate quartz-rich gneiss. Supracrustal host rocks contain a relict high-temperature assemblage of orthopyroxene–clinopyroxene (± pigeonite exsolution lamellae, exsolved at ~975–1010°C), which is interpreted to pre-date the Amikoq intrusion. Cumulate to granoblastic-textured rocks of the main Amikoq Layered Complex range modally from leuconorite to melanorite, orthopyroxenite to harzburgite/dunite and rare hornblende melagabbro. Observed mineralogy of main complex noritic lithologies is essentially relict igneous with orthopyroxene–biotite and hornblende–plagioclase thermometers yielding temperatures of ~800–1070°C. An anatectic zircon megacryst from a patchy quartzo–feldspathic leucosome hosted in an orthopyroxene-dominated Amikoq rock reflects local anatexis at peak metamorphic P–T conditions and yields an intrusion minimum age of 3004 ± 9 Ma. Field observations indicate local anatexis of orthopyroxene-dominated lithologies, possibly indicating a post-intrusion peak temperature of >900°C. The last preserved stages of retrogression are recorded in paragneiss plagioclase–garnet, biotite–garnet and host rock ilmenite–magnetite pairs (≤3 kbar and ~380–560°C).
The Amikoq Complex intruded a MORB-like crustal section and the former remained relatively undisturbed in terms of modal mineralogy. Preservation of igneous textures and mineralogy are related to an anhydrous, high-grade metamorphic history that essentially mimics igneous crystallisation conditions, whereas local high-strain zones acted as fluid pathways resulting in hydrous breakdown of igneous minerals. There is no evidence of equilibration of the intrusion at sub-amphibolite-facies conditions.
Hookworms are some of the most widespread of the soil-transmitted helminths (STH), with an estimated 438.9 million people infected. Until relatively recently, Ancylostoma ceylanicum was regarded as a rare cause of hookworm infection in humans, with little public health relevance. However, recent advances in molecular diagnostics have revealed a much higher prevalence of this zoonotic hookworm than previously thought, particularly in Asia. This study examined the prevalence of STH and A. ceylanicum in the municipalities of Palapag and Laoang in the Philippines, utilizing real-time polymerase chain reaction (PCR) on stool samples previously collected as part of a cross-sectional survey of schistosomiasis japonica. Prevalence of hookworm in humans was high, with 52.8% (n = 228/432) of individuals positive for any hookworm, 34.5% (n = 149/432) infected with Necator americanus, and 29.6% (n = 128/432) with Ancylostoma spp.; of these, 34 were PCR-positive for A. ceylanicum. Among the dogs sampled, 12 of 33 were PCR-positive for A. ceylanicum. This is the first study to utilize molecular diagnostics to identify A. ceylanicum in the Philippines, with both humans and dogs infected. Control and elimination of this zoonotic hookworm will require a multifaceted approach including chemotherapy of humans, identification of animal reservoirs, improvements in health infrastructure, and health education to help prevent infection.
This 17-year prospective study applied a social-developmental lens to the challenge of distinguishing predictors of adolescent-era substance use from predictors of longer term adult substance use problems. A diverse community sample of 168 individuals was repeatedly assessed from age 13 to age 30 using test, self-, parent-, and peer-report methods. As hypothesized, substance use within adolescence was linked to a range of likely transient social and developmental factors that are particularly salient during the adolescent era, including popularity with peers, peer substance use, parent–adolescent conflict, and broader patterns of deviant behavior. Substance abuse problems at ages 27–30 were best predicted, even after accounting for levels of substance use in adolescence, by adolescent-era markers of underlying deficits, including lack of social skills and poor self-concept. The factors that best predicted levels of adolescent-era substance use were not generally predictive of adult substance abuse problems in multivariate models (either with or without accounting for baseline levels of use). Results are interpreted as suggesting that recognizing the developmental nature of adolescent-era substance use may be crucial to distinguishing factors that predict socially driven and/or relatively transient use during adolescence from factors that predict long-term problems with substance abuse that extend well into adulthood.