Suicide is a major public health problem and a cause of premature mortality. With a view to prevention, a great deal of research has been devoted to the determinants of suicide, focusing mostly on individual risk factors, particularly depression. In addition to causes intrinsic to the individual, the social environment has also been widely studied, particularly social isolation. This paper examines the social dimension of suicide etiology through a review of the literature on the relationship between suicide and social isolation.
Methods
Searches of Medline (via PubMed) and PsycINFO were conducted. The keywords were “suicid*” AND “isolation.”
Results
Of the 2,684 articles initially retrieved, 46 were included in the review.
Conclusions
Supported by well-established theoretical foundations, mainly those developed by E. Durkheim and T. Joiner, a large majority of the included articles endorse the idea of a causal relationship between social isolation and suicide and, conversely, a protective effect of social support against suicide. Moreover, the association between suicide and social isolation varies with age, gender, psychopathology, and specific circumstances. The social etiology of suicide has implications for intervention and future research.
Suicide is one of the main preventable causes of death. Artificial intelligence (AI) could improve methods for assessing suicide risk. The objective of this review is to assess the potential of AI in identifying patients who are at risk of attempting suicide.
Methods
A systematic review of the literature was conducted on PubMed, EMBASE, and SCOPUS databases, using relevant keywords.
Results
This search identified 296 studies. Seventeen studies, published between 2014 and 2020 and matching the inclusion criteria, were selected as relevant. The included studies aimed to predict individual suicide risk or to identify at-risk individuals within a specific population. Overall, AI performance was good, although it varied across algorithms and application settings.
Conclusions
AI appears to have high potential for identifying patients at risk of suicide. The precise use of these algorithms in clinical practice, as well as the ethical issues they raise, remains to be clarified.
This study sought to identify coronavirus disease 2019 (COVID-19) risk communication materials distributed in Jamaica to mitigate the effects of the disease outbreak. It also sought to explore the effects of health risk communication on vulnerable groups in the context of the pandemic.
Methods:
A qualitative study was conducted, including a content analysis of health risk communications and in-depth interviews with 35 purposively selected participants: elderly persons, persons with physical disabilities, persons with mental health disorders, representatives of government agencies, advocacy and service groups, and caregivers of the vulnerable. Axial coding was applied to data from the interviews, and all data were analyzed using the constant comparison technique.
Results:
Twelve of the 141 COVID-19 risk communication messages directly targeted the vulnerable. All participants were aware of the relevant risk communication and largely complied. Barriers to messaging awareness and compliance included inappropriate message medium for the deaf and blind, rural location, lack of Internet service or digital devices, limited technology skills, and limited connection to agencies that serve the vulnerable.
Conclusion:
The vulnerable are at increased risk in times of crisis. Targeted information was not sufficiently accessible to ensure universal access to health information and support for vulnerable persons, regardless of location or type of vulnerability.
Alcohol use disorder (AUD) and schizophrenia (SCZ) frequently co-occur, and large-scale genome-wide association studies (GWAS) have identified significant genetic correlations between these disorders.
Methods
We used the largest published GWAS for AUD (total cases = 77 822) and SCZ (total cases = 46 827) to identify genetic variants that influence both disorders (with either the same or opposite direction of effect) and those that are disorder-specific.
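As a rough illustration of how such overlap can be classified from two sets of summary statistics, the sketch below merges hypothetical AUD and SCZ GWAS files and bins genome-wide significant variants by direction of effect. The file names, column names, and the simple p-value overlap are assumptions for illustration; the study's actual pleiotropy analysis is more involved (e.g., LD clumping and dedicated cross-trait methods).

```python
# Illustrative sketch: classifying overlapping GWAS signals for two disorders.
# Column names (snp, beta, p) are hypothetical; real pipelines use LD clumping
# and dedicated pleiotropy methods rather than this simple p-value overlap.
import pandas as pd

GWS = 5e-8  # conventional genome-wide significance threshold

aud = pd.read_csv("aud_sumstats.tsv", sep="\t")   # columns: snp, beta, p
scz = pd.read_csv("scz_sumstats.tsv", sep="\t")

both = aud.merge(scz, on="snp", suffixes=("_aud", "_scz"))

sig_both = both[(both.p_aud < GWS) & (both.p_scz < GWS)]
same_dir = sig_both[(sig_both.beta_aud * sig_both.beta_scz) > 0]
opp_dir = sig_both[(sig_both.beta_aud * sig_both.beta_scz) < 0]
aud_only = both[(both.p_aud < GWS) & (both.p_scz >= GWS)]
scz_only = both[(both.p_scz < GWS) & (both.p_aud >= GWS)]

print(len(same_dir), "same direction,", len(opp_dir), "opposite,",
      len(aud_only) + len(scz_only), "disorder-specific")
```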
Results
We identified 55 independent genome-wide significant single nucleotide polymorphisms with the same direction of effect on AUD and SCZ, 8 with robust effects in opposite directions, and 98 with disorder-specific effects. We also found evidence for 12 genes whose pleiotropic associations with AUD and SCZ are consistent with mediation via gene expression in the prefrontal cortex. The genetic covariance between AUD and SCZ was concentrated in genomic regions functional in brain tissues (p = 0.001).
Conclusions
Our findings provide further evidence that SCZ shares meaningful genetic overlap with AUD.
Colleges and universities around the world employed diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified, 235 of them through self-reports. The mean number of tests per week was 3500, with approximately 80 of these positive (about 11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
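The headline figures above are internally consistent; a quick back-of-the-envelope check in Python, using only the numbers reported in this abstract:

```python
# Back-of-the-envelope check of the figures reported above.
from datetime import date

tests, test_positives, self_reports = 62_970, 1_435, 235
days = (date(2020, 12, 8) - date(2020, 8, 1)).days          # 129 days of operations

print(f"positivity rate: {test_positives / tests:.2%}")      # -> 2.28%
print(f"total cases: {test_positives + self_reports}")       # -> 1670
print(f"tests per week: {tests / (days / 7):.0f}")           # roughly 3400 (reported ~3500)
print(f"positives per day: {test_positives / days:.1f}")     # roughly 11 per day
```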
Objective: We evaluated whether memory recall following an extended (1 week) delay predicts cognitive and brain structural trajectories in older adults.
Method:
Clinically normal older adults (52–92 years old) were followed longitudinally for up to 8 years after completing a memory paradigm at baseline [Story Recall Test (SRT)] that assessed delayed recall at 30 min and 1 week. Subsets of the cohort underwent neuroimaging (N = 134, mean age = 75) and neuropsychological testing (N = 178–207, mean ages = 74–76) at annual study visits occurring approximately 15–18 months apart. Mixed-effects regression models evaluated if baseline SRT performance predicted longitudinal changes in gray matter volumes and cognitive composite scores, controlling for demographics.
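A minimal sketch of the kind of mixed-effects model described above, assuming hypothetical variable names (participant identifier, years since baseline, a cognitive composite, baseline 1-week SRT recall, and demographic covariates); the study's exact model specification is not reproduced here.

```python
# Sketch of a longitudinal mixed-effects model of the kind described above:
# baseline 1-week SRT recall predicting change in a cognitive composite.
# Column names are hypothetical stand-ins for the study's variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("longitudinal_visits.csv")

model = smf.mixedlm(
    "episodic_memory ~ years * srt_week_recall + age + sex + education",
    data=df,
    groups=df["subject_id"],    # random effects grouped by participant
    re_formula="~years",        # random intercept and slope for time
)
result = model.fit()
print(result.summary())         # years:srt_week_recall is the term of interest
```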
Results:
Worse SRT 1-week recall was associated with more precipitous rates of longitudinal decline in medial temporal lobe volumes (p = .037), episodic memory (p = .003), and executive functioning (p = .011), but not occipital lobe or total gray matter volumes (demonstrating neuroanatomical specificity; p > .58). By contrast, SRT 30-min recall was only associated with longitudinal decline in executive functioning (p = .044).
Conclusions:
Memory paradigms that capture longer-term recall may be particularly sensitive to age-related medial temporal lobe changes and neurodegenerative disease trajectories.
Rey’s Auditory Verbal Learning Test (AVLT) is a widely used word list memory test. We update normative data to include adjustment for verbal memory performance differences between men and women and illustrate the effect of this sex adjustment and the importance of excluding participants with mild cognitive impairment (MCI) from normative samples.
Method:
This study advances Mayo's Older Americans Normative Studies (MOANS) by using a new population-based sample through the Mayo Clinic Study of Aging, which randomly samples residents of Olmsted County, Minnesota, from age- and sex-stratified groups. Regression-based normative T-score formulas were derived from 4428 cognitively unimpaired adults aged 30–91 years. Fully adjusted T-scores correct for age, sex, and education. We also derived T-scores that correct for (1) age or (2) age and sex. Test–retest reliability data are provided.
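For readers unfamiliar with the approach, the sketch below shows a generic regression-based normative T-score calculation: regress raw scores on demographics in an unimpaired sample, then scale the standardized residual to mean 50 and SD 10. It is illustrative only and does not reproduce the published MOANS formulas; file and column names are assumptions.

```python
# Generic regression-based normative T-scores (illustrative, not the
# published MOANS formulas): residuals from a demographic regression in a
# cognitively unimpaired sample are scaled to mean 50, SD 10.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

norms = pd.read_csv("normative_sample.csv")       # hypothetical: raw, age, sex, education

fit = smf.ols("raw ~ age + C(sex) + education", data=norms).fit()
resid_sd = np.sqrt(fit.mse_resid)                 # residual standard deviation

def fully_adjusted_t(examinees: pd.DataFrame) -> pd.Series:
    """T-score adjusted for age, sex, and education."""
    expected = fit.predict(examinees)
    return 50 + 10 * (examinees["raw"] - expected) / resid_sd
```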
Results:
In raw score analyses, sex explained a significant amount of variance in performance above and beyond age (8–10%). Applying the original age-adjusted MOANS norms to the current sample yielded significantly fewer participants than expected with low delayed recall performance, particularly among women. Even when new T-scores adjusted only for age were derived from this sample, scores <40 T occurred more frequently among men and less frequently among women relative to T-scores that also adjusted for sex.
Conclusions:
Our findings highlight the importance of using normative data that adjust for sex with measures of verbal memory and provide new normative data that allow for this adjustment for the AVLT.
Rapidly rising jökulhlaups, or glacial outburst floods, are a phenomenon with a high potential for damage. The initiation and propagation processes of a rapidly rising jökulhlaup are still not fully understood. Seismic monitoring can contribute to an improved process understanding, but comprehensive long-term seismic monitoring campaigns capturing the dynamics of a rapidly rising jökulhlaup have not been reported so far. To fill this gap, we installed a seismic network at the marginal, ice-dammed lake of the A.P. Olsen Ice Cap (APO) in NE Greenland. Episodic outbursts from the lake cause flood waves in the Zackenberg river, characterized by a rapid discharge increase within a few hours. Our six-month-long seismic dataset covers the whole fill-and-drain cycle of the ice-dammed lake in 2012 and includes one of the most destructive floods recorded so far for the Zackenberg river. Seismic event detection and localization reveal abundant surface crevassing that correlates with changes in river discharge. Seismic interferometry suggests the existence of a thin basal sedimentary layer. We show that the ballistic part of the first surface waves can potentially be used to infer medium changes in both the ice body and the basal layer. Interpretation of time-lapse interferograms is complicated by a varying ambient noise source distribution.
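The core operation behind the noise-based interferometry mentioned above is cross-correlation of ambient noise between station pairs. The sketch below shows a minimal single-pair cross-correlation; it omits the preprocessing (spectral whitening, temporal normalization, stacking) that a real monitoring workflow would require, and all names are placeholders.

```python
# Minimal sketch of ambient-noise cross-correlation between two stations,
# the basic operation behind the seismic interferometry mentioned above.
# Real workflows add spectral whitening, temporal normalization and stacking.
import numpy as np

def noise_cross_correlation(tr_a: np.ndarray, tr_b: np.ndarray,
                            fs: float, max_lag_s: float):
    """Cross-correlate two equal-length noise records, trimmed to +/- max_lag_s."""
    n = len(tr_a)
    spec = np.fft.rfft(tr_a, 2 * n) * np.conj(np.fft.rfft(tr_b, 2 * n))
    cc = np.fft.irfft(spec)
    cc = np.concatenate([cc[-n + 1:], cc[:n]])      # arrange lags -(n-1) .. n-1
    lags = np.arange(-n + 1, n) / fs
    keep = np.abs(lags) <= max_lag_s
    return lags[keep], cc[keep]
```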
Wildlife is an essential component of all ecosystems. Most places in the world do not have local, timely information on which species are present or how their populations are changing. With the arrival of new technologies, camera traps have become a popular way to collect wildlife data. However, data collection has increased at a much faster rate than the development of tools to manage, process and analyse these data. Without these tools, wildlife managers and other stakeholders have little information to effectively manage, understand and monitor wildlife populations. We identify four barriers that are hindering the widespread use of camera trap data for conservation. We propose specific solutions to remove these barriers, integrated in a modern technology platform called Wildlife Insights. We present an architecture for this platform and describe its main components. We recognize the potential risks of publishing shared biodiversity data and discuss a framework to mitigate those risks. Finally, we discuss a strategy to ensure platforms like Wildlife Insights are sustainable and have an enduring impact on the conservation of wildlife.
The exploitation of works or other subject matter of protection as provided for under Union law has, first and foremost, been set out in the all-embracing provision of article 3 of the InfoSoc Directive. In its first paragraph, article 3 applies to works of whatever kind and, in its second paragraph, to the related rights as enumerated therein.1
Passive seismology allows measurement of the structure of glaciers and ice sheets. However, most techniques used so far in this context are based on horizontally homogeneous media whose parameters vary only with depth (1-D approximations), which are appropriate only for a subset of glaciers. Here, we analyze seismic noise records from three different types of glaciers (plateau, valley and avalanching glacier) to characterize the influence of glacier geometry on the seismic wavefield. Using horizontal-to-vertical spectral ratios, polarization analysis and modal analysis, we show that the plateau glacier and the valley glacier can be treated as 1-D, whereas the relatively small avalanching glacier shows 3-D effects due to its bed topography and deep crevasses. In principle, the techniques proposed here might allow monitoring of such crevasses and their depth, and thus constrain a key parameter of avalanching and calving glacier fronts.
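As an illustration of the horizontal-to-vertical spectral ratio technique mentioned above, a basic single-station H/V estimate can be computed from three-component noise as sketched below; the Welch window length and the quadratic mean of the horizontals are assumptions, not the parameters used in the study.

```python
# Basic horizontal-to-vertical spectral ratio (H/V) sketch for a
# three-component noise record; window length and smoothing are assumptions.
import numpy as np
from scipy.signal import welch

def hv_ratio(north: np.ndarray, east: np.ndarray, vertical: np.ndarray,
             fs: float, nperseg: int = 4096):
    """Return frequencies and the H/V amplitude spectral ratio."""
    f, pnn = welch(north, fs=fs, nperseg=nperseg)
    _, pee = welch(east, fs=fs, nperseg=nperseg)
    _, pzz = welch(vertical, fs=fs, nperseg=nperseg)
    horizontal = np.sqrt((pnn + pee) / 2.0)   # quadratic mean of horizontal amplitudes
    return f, horizontal / np.sqrt(pzz)       # the peak frequency relates to layer geometry
```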
Background: Amyotrophic lateral sclerosis (ALS) is a progressive motor neuron disease resulting in muscle weakness, dysarthria and dysphagia, and ultimately respiratory failure leading to death. Half of the ALS patients survive less than 3 years, and 80% of the patients survive less than 5 years. Riluzole is the only approved medication in Canada with randomized controlled clinical trial evidence to slow the progression of ALS, albeit only to a modest degree. The Canadian Neuromuscular Disease Registry (CNDR) collects data on over 140 different neuromuscular diseases including ALS across ten academic institutions and 28 clinics including ten multidisciplinary ALS clinics. Methods: In this study, CNDR registry data were analyzed to examine potential differences in ALS care among provinces in time to diagnosis, riluzole and feeding tube use. Results: Significant differences were found among provinces, in time to diagnosis from symptom onset, in the use of riluzole and in feeding tube use. Conclusions: Future investigations should be undertaken to identify factors contributing to such differences, and to propose potential interventions to address the provincial differences reported.
To identify predominant dietary patterns in four African populations and examine their association with obesity.
Design
Cross-sectional study.
Setting/Subjects
We used data from the Africa/Harvard School of Public Health Partnership for Cohort Research and Training (PaCT) pilot study established to investigate the feasibility of a multi-country longitudinal study of non-communicable chronic disease in sub-Saharan Africa. We applied principal component analysis to dietary intake data collected from an FFQ developed for PaCT to ascertain dietary patterns in Tanzania, South Africa, and peri-urban and rural Uganda. The sample consisted of 444 women and 294 men.
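A minimal sketch of PCA-derived dietary patterns of the kind described above: food-group intakes from the FFQ are standardized and the leading components are interpreted as patterns via their loadings. The file, column names, and the choice of two components are illustrative assumptions, not the PaCT specification.

```python
# Illustrative PCA of FFQ food-group intakes to derive dietary patterns.
# File and column names, and the two-component choice, are assumptions.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

ffq = pd.read_csv("pact_ffq_foodgroups.csv")            # rows: participants, cols: food groups
food_groups = ffq.drop(columns=["participant_id"])

X = StandardScaler().fit_transform(food_groups)
pca = PCA(n_components=2).fit(X)

loadings = pd.DataFrame(pca.components_.T, index=food_groups.columns,
                        columns=["pattern_1", "pattern_2"])   # inspect to label patterns
scores = pca.transform(X)
tertile = pd.qcut(scores[:, 1], 3, labels=["low", "mid", "high"])   # e.g., for OR models
```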
Results
We identified two dietary patterns: the Mixed Diet pattern characterized by high intakes of unprocessed foods such as vegetables and fresh fish, but also cold cuts and refined grains; and the Processed Diet pattern characterized by high intakes of salad dressing, cold cuts and sweets. Women in the highest tertile of the Processed Diet pattern score were 3·00 times more likely to be overweight (95 % CI 1·66, 5·45; prevalence=74 %) and 4·24 times more likely to be obese (95 % CI 2·23, 8·05; prevalence=44 %) than women in this pattern’s lowest tertile (both P<0·0001; prevalence=47 and 14 %, respectively). We found similarly strong associations in men. There was no association between the Mixed Diet pattern and overweight or obesity.
Conclusions
We identified two major dietary patterns in several African populations, a Mixed Diet pattern and a Processed Diet pattern. The Processed Diet pattern was associated with obesity.
Little is known about the association of cortical Aβ with depression and anxiety among cognitively normal (CN) elderly persons.
Methods:
We conducted a cross-sectional study derived from the population-based Mayo Clinic Study of Aging in Olmsted County, Minnesota, involving CN persons aged ≥60 years who underwent PiB-PET scans and completed the Beck Depression Inventory-II (BDI-II) and Beck Anxiety Inventory (BAI). Cognitive diagnosis was made by an expert consensus panel. Participants were classified as having abnormal (≥1.4; PiB+) or normal PiB-PET (<1.4; PiB−) using a global cortical to cerebellar ratio. Multivariable logistic regression analyses were performed to calculate odds ratios (OR) and 95% confidence intervals (95% CI) after adjusting for age and sex.
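A sketch of the dichotomization and adjusted logistic regression described above, assuming hypothetical file and column names; the BAI model would be fit analogously.

```python
# Sketch of the analysis described above: dichotomize the global PiB-PET
# ratio at 1.4 and regress PiB status on symptom scores, adjusting for age
# and sex. File and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mcsa_cn_participants.csv")
df["pib_pos"] = (df["pib_suvr_global"] >= 1.4).astype(int)

fit = smf.logit("pib_pos ~ bdi_total + age + C(sex)", data=df).fit()
odds_ratios = np.exp(fit.params)        # e.g., OR per one-point BDI increase
ci = np.exp(fit.conf_int())             # 95% confidence intervals
print(pd.concat([odds_ratios, ci], axis=1))
```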
Results:
Of 1,038 CN participants (53.1% males), 379 were PiB+. Each one-point increase in BDI (OR = 1.03; 1.00–1.06) and BAI (OR = 1.04; 1.01–1.08) symptom scores was associated with increased odds of being PiB-PET+. The number of participants with BDI > 13 (clinical depression) was greater in the PiB-PET+ than the PiB-PET− group, but the difference was not significant (OR = 1.42; 0.83–2.43). Similarly, the number of participants with BAI > 10 (clinical anxiety) was greater in the PiB-PET+ than the PiB-PET− group, but the difference was not significant (OR = 1.77; 0.97–3.22).
Conclusions:
As expected, depression and anxiety levels were low in this community-dwelling sample, which likely reduced our statistical power. However, we observed an informative albeit weak association between increased BDI and BAI scores and elevated cortical amyloid deposition. This observation needs to be tested in a longitudinal cohort study.
Studies were conducted to evaluate density-dependent effects of Palmer amaranth on weed and peanut growth and peanut yield. Palmer amaranth remained taller than peanut throughout the growing season and decreased peanut canopy diameter, although Palmer amaranth density did not affect peanut height. The rapid increase in Palmer amaranth height at Goldsboro correspondingly reduced the maximum peanut canopy diameter at that location, although the growth trends for peanut canopy diameter were similar for both locations. Palmer amaranth biomass was affected by weed density when grown with peanut. Peanut pod weight decreased linearly by 2.89 kg/ha with each gram of increase in Palmer amaranth biomass per meter of crop row. Predicted peanut yield loss from season-long interference of one Palmer amaranth plant per meter of crop row was 28%. Palmer amaranth seed production was also described by the rectangular hyperbola model. At the highest density of 5.2 Palmer amaranth plants/m of crop row, 1.2 billion Palmer amaranth seed/ha were produced.
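The yield-loss and seed-production relationships referenced above follow the standard rectangular hyperbola (Cousens-type) form, sketched below. The parameter values in the example call are made up for illustration, since the fitted coefficients are not given in this abstract.

```python
# Standard rectangular hyperbola used for weed-crop interference:
# percent yield loss rises near-linearly at low weed density and
# saturates at an asymptote at high density.
def rectangular_hyperbola(density: float, i: float, a: float) -> float:
    """Percent yield loss at `density` weeds per meter of crop row.

    i: percent loss per weed as density approaches zero
    a: asymptotic maximum percent loss
    """
    return (i * density) / (1.0 + i * density / a)

# Hypothetical parameters, for illustration only:
print(rectangular_hyperbola(1.0, i=40.0, a=70.0))   # ~25.5% loss at 1 plant/m
```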
By Paul K. Kleinman, Department of Radiology, Boston Children’s Hospital and Harvard Medical School, Boston, Massachusetts, USA, and Michele M. Walters, Staff Pediatric Radiologist at Boston Children’s Hospital and Instructor in Radiology at Harvard Medical School, Boston, Massachusetts, USA
Accurate dating of fractures is critical in cases of suspected child abuse (1–8). The ability of medical professionals to assess the veracity of the history provided depends on the clinical and radiologic assessment of the presenting injury or injuries. If, for example, a single injury to a localized portion of the extremity is alleged to have occurred but multiple sites of subperiosteal new bone formation (SPNBF) and/or callus are seen on radiographs, medical providers should become suspicious and initiate further investigation. However, if the severity of the alleged injury correlates with clinical and radiographic findings and all evidence suggests that the fracture is acute, the suspicion of abuse may never arise. It is clear that the ability of the radiologist and the clinician to assess the age of bony injury is critical to a determination of suspected child abuse. The forensic requirements for establishing responsibility and determining the need for intervention by child protection agencies rest strongly on the assessments of the clinician and the radiologist regarding the nature and timing of injury. Accurate fracture dating can aid in the identification and exclusion of potential abusive perpetrators. In criminal proceedings, the requirement to assign age estimates to fractures and to determine whether there have been multiple episodes of abuse may have important implications. The presence of prior injuries may influence critical decisions regarding how defendants may be charged, how a prosecution may proceed, the jury verdict, and the penalties imposed on a convicted abuser (Fig. 6.1) (9).
As intra-thyroidal iodine stores should be maximised before conception to facilitate the increased thyroid hormone production during pregnancy, women who are planning to become pregnant should ideally consume 150 μg iodine/d (US RDA). As few UK data exist for this population group, a cross-sectional study was carried out at the University of Surrey to assess the iodine intake and status of women of childbearing age. Total iodine excretion was measured from 24 h urine samples in fifty-seven women; iodine intake was estimated by assuming that 90 % of ingested iodine was excreted. The average iodine intake was also estimated from 48 h food diaries that the participants completed. The median urinary iodine concentration value (63·1 μg/l) indicated the group to be mildly iodine deficient by WHO criteria. By contrast, the median 24 h urinary iodine excretion value (149·8 μg/24 h) indicated a relatively low risk of iodine deficiency. The median estimated iodine intake, extrapolated from urinary excretion, was 167 μg/d, whereas it was lower, at 123 μg/d, when estimated from the 48 h food diaries. Iodine intake estimated from the food diaries and 24 h urinary iodine excretion were strongly correlated (r 0·75, P < 0·001). The intake of milk, eggs and dairy products was positively associated with iodine status. The iodine status of this UK cohort is probably a best-case scenario as the women were mostly nutrition students and were recruited in the winter when milk-iodine content is at its highest; further study in more representative cohorts of UK women is required. The present study highlights the need for revised cut-off values for iodine deficiency that are method- and age group-specific.
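The extrapolation from 24 h urinary excretion to intake rests on the stated assumption that 90 % of ingested iodine is excreted; the one-line check below reproduces the reported figure to within rounding of the published median.

```python
# Reproduce the intake estimate from the stated 90% excretion assumption.
median_24h_excretion_ug = 149.8
estimated_intake_ug_per_day = median_24h_excretion_ug / 0.90
print(round(estimated_intake_ug_per_day))   # -> 166, vs the reported 167 ug/d (rounding)
```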