This article emerged as the human species has collectively been experiencing the worst global pandemic in a century. With a long view of the ecological, economic, social, and political factors that promote the emergence and spread of infectious disease, archaeologists are well positioned to examine the antecedents of the present crisis. In this article, we bring together a variety of perspectives on the issues surrounding the emergence, spread, and effects of disease in both American and Afro-Eurasian contexts. Recognizing that the human populations most severely impacted by COVID-19 are typically descendants of marginalized groups, we investigate pre- and postcontact disease vectors among Indigenous and Black communities in North America, outlining the systemic impacts of diseases and the conditions that exacerbate their spread. We look at how material culture both reflects and changes as a result of social transformations brought about by disease, the insights that paleopathology provides about the ancient human condition, and the impacts of ancient globalization on the spread of disease worldwide. By understanding the differential effects of past epidemics on diverse communities and contributing to more equitable sociopolitical agendas, archaeology can play a key role in helping to pursue a more just future.
Hyperprolific sows rear more piglets than they have teats, and to accommodate this, milk replacers are often offered as a supplement. Milk replacers are based on bovine milk, yet components of vegetable origin are often added. This may reduce growth, but could also accelerate maturational changes. Therefore, we investigated the effect of feeding piglets a milk replacer with gradually increasing levels of wheat flour on growth, gut enzyme activity and immune function compared to a diet based entirely on bovine milk. The hypothesis tested was that adding a starch component (wheat flour) induces maturation of the mucosa, as measured by higher digestive activity and improved integrity and immunity of the small intestine (SI). To test this hypothesis, piglets were removed from the sow at day 3 and fed either a pure milk replacer diet (MILK) or, from day 11, a milk replacer diet with increasing levels of wheat (WHEAT). The WHEAT piglets had increased enzyme activity of maltase and sucrase in the proximal part of the SI compared with the MILK group. There were no differences in gut morphology, histopathology or gene expression between the groups. In conclusion, the pigs given a milk replacer with added wheat displayed immunological and gut mucosal enzyme maturational changes, indicative of adaptation toward a vegetable-based diet. This was not associated with any clinical complications, and future studies are needed to show whether this could improve responses in the subsequent weaning process.
It is unclear whether olfactory deficits improve after remission in depressed patients. Therefore, we aimed to assess the olfactory performance of drug-free patients with a major depressive episode (MDE) and its change after antidepressant treatment.
In the DEP-ARREST-CLIN study, 69 drug-free patients with a current MDE in the context of major depressive disorder (MDD) were assessed for their olfactory performances and depression severity, before and after 1 (M1) and 3 (M3) months of venlafaxine antidepressant treatment. They were compared to 32 age- and sex-matched healthy controls (HCs). Olfaction was assessed with a psychophysical test, the Sniffin’ Sticks test (Threshold: T score; Discrimination: D score; Identification: I score; total score: T + D + I = TDI score) and Pleasantness (pleasantness score: p score; neutral score: N score; unpleasantness score: U score).
As compared to HCs, depressed patients had lower TDI olfactory scores [mean (s.d.) 30.0(4.5) v. 33.3(4.2), p < 0.001], T scores [5.6(2.6) v. 7.4(2.6), p < 0.01] and p scores [7.5(3.0) v. 9.8(2.8), p < 0.001], and higher N scores [3.5(2.6) v. 2.1(1.8), p < 0.01]. T, p and N scores at baseline were independent of depression and anhedonia severity. After venlafaxine treatment, significant increases in T scores [M1: 7.0(2.6) and M3: 6.8(3.1), p < 0.01] and p scores [M1: 8.1(3.0) and M3: 8.4(3.3), p < 0.05] were observed in remitters only (T: p < 0.01; p: p < 0.01). Olfaction improvement was mediated by depression improvement.
The olfactory signature of MDE is restored after venlafaxine treatment. This olfaction improvement is mediated by depression improvement.
To test the feasibility of targeted gown and glove use by healthcare personnel caring for high-risk nursing-home residents to prevent Staphylococcus aureus acquisition in short-stay residents.
Uncontrolled clinical trial.
This study was conducted in 2 community-based nursing homes in Maryland.
The study included 322 residents on mixed short- and long-stay units.
During a 2-month baseline period, all residents had nose and inguinal fold swabs taken to estimate S. aureus acquisition. The intervention was iteratively developed using a participatory human factors engineering approach. During a 2-month intervention period, healthcare personnel wore gowns and gloves for high-risk care activities while caring for residents with wounds or medical devices, and S. aureus acquisition was measured again. Whole-genome sequencing was used to assess whether the acquisition represented resident-to-resident transmission.
Among short-stay residents, the methicillin-resistant S. aureus acquisition rate decreased from 11.9% during the baseline period to 3.6% during the intervention period (odds ratio [OR], 0.28; 95% CI, 0.08–0.92; P = .026). The methicillin-susceptible S. aureus acquisition rate decreased from 9.1% during the baseline period to 4.0% during the intervention period, although this change was not statistically significant (OR, 0.41; 95% CI, 0.12–1.42; P = .15). The S. aureus resident-to-resident transmission rate decreased from 5.9% during the baseline period to 0.8% during the intervention period.
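As a sanity check, the reported odds ratios follow directly from the acquisition percentages above. A minimal sketch (the per-period denominators are not restated here, so proportions are used directly):

```python
# Recovering the reported odds ratios from the acquisition proportions.
# Note: this uses proportions rather than the raw 2x2 counts, which are
# not restated in the abstract.
def odds_ratio(p_intervention: float, p_baseline: float) -> float:
    """Odds ratio comparing intervention vs. baseline acquisition proportions."""
    odds_int = p_intervention / (1 - p_intervention)
    odds_base = p_baseline / (1 - p_baseline)
    return odds_int / odds_base

# MRSA: 11.9% baseline vs. 3.6% intervention
or_mrsa = odds_ratio(0.036, 0.119)  # ~0.28, matching the reported OR
# MSSA: 9.1% baseline vs. 4.0% intervention
or_mssa = odds_ratio(0.040, 0.091)  # ~0.42, close to the reported 0.41
```

The small discrepancy for MSSA is expected, since the published OR was computed from exact counts rather than rounded percentages.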
Targeted gown and glove use by healthcare personnel for high-risk care activities while caring for residents with wounds or medical devices, regardless of their S. aureus colonization status, is feasible and potentially decreases S. aureus acquisition and transmission in short-stay community-based nursing-home residents.
There are currently no guidelines for central-line insertion site evaluation. Our study revealed an association between insertion site inflammation (ISI) and the development of central-line–associated bloodstream infections (CLABSIs). Automated surveillance for ISI is feasible and could help prevent CLABSI.
Psychosis is associated with a reasoning bias, which manifests as a tendency to ‘jump to conclusions’. We examined this bias in people at clinical high-risk for psychosis (CHR) and investigated its relationship with their clinical outcomes.
In total, 303 CHR subjects and 57 healthy controls (HC) were included. Both groups were assessed at baseline, and after 1 and 2 years. A 'beads' task was used to assess reasoning bias. Symptoms and level of functioning were assessed using the Comprehensive Assessment of At-Risk Mental States scale (CAARMS) and the Global Assessment of Functioning (GAF), respectively. During follow-up, 58 (16.1%) of the CHR group developed psychosis (CHR-T), and 245 did not (CHR-NT). Logistic regressions, multilevel mixed models, and Cox regression were used to analyse the relationship between reasoning bias and transition to psychosis and level of functioning, at each time point.
There was no association between reasoning bias at baseline and the subsequent onset of psychosis. However, when assessed after the transition to psychosis, CHR-T participants showed a greater tendency to jump to conclusions than CHR-NT and HC participants (55% v. 17% and 17%, respectively; χ² = 8.13, p = 0.012). There was a significant association between jumping to conclusions (JTC) at baseline and a reduced level of functioning at 2-year follow-up in the CHR group after adjusting for transition, gender, ethnicity, age, and IQ.
In CHR participants, JTC at baseline was associated with adverse functioning at the follow-up. Interventions designed to improve JTC could be beneficial in the CHR population.
Background: Healthcare personnel (HCP) acquire MRSA on their gown and gloves during routine care activities for patients who are colonized or infected with MRSA at a rate of ∼15%. Certain care activities (eg, physical exam, care of endotracheal tube, wound care, and bathing/hygiene) have been associated with a higher frequency of transmission from the patient to HCP gown and gloves than other activities (ie, administration of oral medicines, glucose monitoring, and manipulation of IV tubing/medication delivery). However, quantification of MRSA contamination and risk to subsequent patients is poorly defined. Objective: We sought to determine the mean MRSA colony-forming units (CFU) found on the gloves and gowns of HCP who acquire MRSA after various care activities involving patients with MRSA. Methods: We conducted a prospective cohort study at the University of Maryland Medical Center from December 2018 to October 2019. We identified patients colonized or infected with MRSA based on culture data from the prior 7 days. HCP performing prespecified care activities on eligible patients were observed. To isolate the risk of each care activity, HCP donned new gloves and gown prior to a specific care activity. Once that care activity was performed, HCP gloves and gown were swabbed prior to any further care activities. HCP gloves were cultured with an E-swab by swabbing each digit up and down 3 times, followed by 2 circles on the palm of each hand. HCP gowns were sampled by swabbing a 15 × 30-cm area along the beltline of the gown and along each inner forearm twice. E-swab liquid was then serially diluted and plated in triplicate on CHROMagar MRSA II (BD, Sparks, MD) to obtain CFU. We calculated the median CFUs and the interquartile range (IQR) for each specific care activity, stratified by gown and gloves. Results: In total, 604 HCP–patient care interactions were observed. 
Table 1 displays the mean MRSA CFUs stratified by gown and gloves for each patient care activity of interest. Conclusions: The quantity of MRSA found on gowns and gloves varies depending on patient care activities. Recognition of differential transmission rates between various activities may allow different approaches to infection prevention, such as the use of personal protective equipment in high- versus low-risk activities and/or the use of more aggressive interventions for high-risk activities.
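The plate-count arithmetic behind CFU estimates of this kind can be sketched as follows; the dilution factor, plated volume, and colony counts below are hypothetical illustrations, not values from the study.

```python
# Illustrative sketch (not the study's actual code): estimating CFU per mL
# of the original E-swab liquid from triplicate plate counts at one dilution.
from statistics import mean

def cfu_estimate(colony_counts, dilution_factor, plated_volume_ml):
    """Average the triplicate plate counts, then scale back up by the
    dilution factor and down by the volume plated."""
    avg_colonies = mean(colony_counts)
    return avg_colonies * dilution_factor / plated_volume_ml

# Hypothetical values: 3 plates at a 1:100 dilution, 0.1 mL plated each
estimate = cfu_estimate([12, 15, 14], dilution_factor=100, plated_volume_ml=0.1)
# mean of 13.67 colonies -> ~13,667 CFU/mL in the undiluted swab liquid
```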
Infection prevention and control (IPC) workflows are often retrospective and manual. New tools, however, have entered the field to facilitate rapid prospective monitoring of infections in hospitals. Although artificial intelligence (AI)–enabled platforms facilitate timely, on-demand integration of clinical data feeds with pathogen whole-genome sequencing (WGS), a standardized workflow to fully harness the power of such tools is lacking. We report a novel, evidence-based workflow that promotes quicker infection surveillance via AI-assisted clinical and WGS data analysis. The algorithm suggests clusters based on a combination of similar minimum inhibitory concentration (MIC) data, timing of sample collection, and shared location stays between patients. It helps to proactively guide IPC professionals during investigation of infectious outbreaks and surveillance of multidrug-resistant organisms and healthcare-acquired infections. Methods: Our team established a 1-year workgroup comprising IPC practitioners, clinical experts, and scientists in the field. We held weekly roundtables to study lessons learned in an ongoing surveillance effort at a tertiary care hospital, utilizing Philips IntelliSpace Epidemiology (ISEpi), an AI-powered system, to understand how such a tool can enhance practice. Based on real-time case discussions and evidence from the literature, a workflow guidance tool and checklist were codified. Results: In our workflow, data-informed clusters posed by ISEpi underwent triage and expert follow-up analysis to assess: (1) likelihood of transmission(s); (2) potential vector(s) identity; (3) need to request WGS; and (4) intervention(s) to be pursued, if warranted. In a representative sample (spanning October 17, 2019, to November 7, 2019) of 67 total isolates suggested for inclusion in 19 unique cluster investigations, we determined that 9 investigations merited follow-up. 
Collectively, these 9 investigations involved 21 patients and required 115 minutes to review in ISEpi and an additional 70 minutes of review outside of ISEpi. After review, 6 investigations were deemed unlikely to represent a transmission; the other 3 had potential to represent transmission for which interventions would be performed. Conclusions: This study offers an important framework for adaptation of existing infection control workflow strategies to leverage the utility of rapidly integrated clinical and WGS data. This workflow can also facilitate time-sensitive decisions regarding sequencing of specific pathogens given the preponderance of available clinical data supporting investigations. In this regard, our work sets a new standard of practice: precision infection prevention (PIP). Ongoing effort is aimed at development of AI-powered capabilities for enterprise-level quality and safety improvement initiatives.
Funding: Philips Healthcare provided support for this study.
Disclosures: Alan Doty and Juan Jose Carmona report salary from Philips Healthcare.
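The weighted cluster-suggestion logic described above (combining MIC similarity, sample timing, and shared location stays) might be sketched as a pairwise score. ISEpi's actual algorithm and weights are not public, so every function name, weight, and threshold below is an assumption for illustration only.

```python
# A minimal sketch of a weighted pairwise cluster-suggestion score.
# The real ISEpi algorithm is proprietary; all weights and cutoffs here
# are hypothetical choices made purely for illustration.
def pair_score(mic_similarity: float, days_apart: int, shared_location_days: int,
               w_mic: float = 0.5, w_time: float = 0.25, w_loc: float = 0.25) -> float:
    """Score a patient pair for possible cross-transmission.
    mic_similarity: fraction of antibiotics with matching MIC category (0-1).
    days_apart: days between sample collections (closer -> higher score).
    shared_location_days: overlapping days on the same unit."""
    time_component = max(0.0, 1 - days_apart / 30)      # decays to 0 over 30 days
    loc_component = min(1.0, shared_location_days / 7)  # saturates after a week
    return w_mic * mic_similarity + w_time * time_component + w_loc * loc_component

# Pairs scoring above some threshold would be surfaced for IPC triage
flagged = pair_score(0.9, days_apart=5, shared_location_days=3) > 0.6
```

A design like this lets IPC staff tune how much weight phenotype, timing, and geography each carry before a suggested cluster reaches human review.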
Background: For the rising number of people living with dementia, cost-effective community-based interventions to support psychosocial care are needed. The FindMyApps program helps people with dementia and their caregivers learn to use tablet computers and find user-friendly apps that facilitate self-management and engagement in meaningful activities. This definitive trial builds on previous feasibility pilot trials of FindMyApps and further evaluates cost-effectiveness.
Method: This is a protocol for a non-blinded randomized controlled trial (RCT) with two arms (intervention and usual care). 150 dyads (person with dementia and their carer) will be recruited. Participants must be resident in the community, with a diagnosis of Mild Cognitive Impairment or mild dementia (Mini-Mental State Examination 17–26, or Global Deterioration Scale 3–4). Dyads will be randomly assigned in equal proportions to receive either the FindMyApps intervention (experimental arm) or usual care (control arm). Primary outcomes measured at 3 months will be: patient self-management and social participation; caregiver sense of competence. Data will be collected through questionnaires filled in by the researcher (patient outcomes) or participants themselves (carer outcomes). In addition to a main effect analysis, a cost-effectiveness analysis will take place. In line with Medical Research Council (MRC) guidance for the evaluation of complex interventions, a process analysis will be undertaken to identify factors that may influence trial outcomes. Semi-structured interviews and remotely collected data regarding use of the FindMyApps app will support the process analysis.
Result: Results of this study are expected in 2022. The study will be adequately powered to detect at least a moderate effect size of the intervention with respect to the primary outcomes.
Conclusion: This study will investigate the effectiveness and cost-effectiveness of the FindMyApps intervention. The results of the study will provide strong evidence to support or oppose scaling up implementation of the intervention. This is also an example of how the MRC framework for the evaluation of complex interventions can be implemented in practice. In a field that is often criticized for a lack of high-quality evidence, randomized controlled trials such as this one should be applied more frequently for the robust and transparent evaluation of digital tools and technologies.
Background: Infection prevention surveillance for cross-transmission is often performed by manual review of microbiologic culture results to identify geotemporally related clusters. However, the sensitivity and specificity of this approach remain uncertain. Whole-genome sequencing (WGS) analysis can provide a gold standard for identifying cross-transmission events. Objective: We employed a published WGS program, the Philips IntelliSpace Epidemiology platform, to compare the accuracy of two surveillance methods: (i) a virtual infection practitioner (VIP) with perfect recall and automated analysis of antibiotic susceptibility testing (AST), sample collection timing, and patient location data; and (ii) a novel clinical matching (CM) algorithm that provides cluster suggestions based on a nuanced weighted analysis of AST data, timing of sample collection, and shared location stays between patients. Methods: WGS was performed routinely on inpatient and emergency department isolates of Enterobacter cloacae, Enterococcus faecium, Klebsiella pneumoniae, and Pseudomonas aeruginosa at an academic medical center. Single-nucleotide variants (SNVs) were compared within core genome regions on a per-species basis to determine cross-transmission clusters. Only one unique strain per patient was included in each analysis; duplicates were excluded from the final results. Results: Between May 2018 and April 2019, clinical data from 121 patients were paired with WGS data from 28 E. cloacae, 21 E. faecium, 61 K. pneumoniae, and 46 P. aeruginosa isolates. Previously published SNV relatedness thresholds were applied to define genomically related isolates. Mapping of genomic relatedness defined clusters as follows: 4 patients in 2 E. faecium clusters and 2 patients in 1 P. aeruginosa cluster. 
The VIP method identified 12 potential clusters involving 28 patients, all of which were "pseudoclusters." The CM method identified 7 clusters consisting of 27 patients, which included 1 true E. faecium cluster of 2 patients with genomically related isolates. Conclusions: In light of the WGS data, all of the potential clusters identified by the VIP were pseudoclusters, lacking sufficient genomic relatedness. In contrast, the CM method showed increased sensitivity and specificity: it decreased the percentage of pseudoclusters by 14%, and it identified a related genomic cluster of E. faecium. These findings suggest that integrating clinical data analytics and WGS is likely to benefit institutions in limiting expenditure of resources on pseudoclusters. Therefore, WGS combined with more sophisticated surveillance approaches, rather than standard methods as modeled by the VIP, is needed to better identify and address true cross-transmission events.
Funding: This study was supported by Philips Healthcare.
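The SNV-threshold clustering step can be illustrated with a minimal sketch: isolates whose pairwise core-genome SNV distance falls at or below a per-species threshold are grouped by single linkage. The union-find helper, the distances, and the threshold value below are all hypothetical; the study applied previously published per-species thresholds.

```python
# Hedged sketch: single-linkage clustering of isolates by pairwise core-genome
# SNV distance under a relatedness threshold. All values are hypothetical.
def snv_clusters(distances: dict, isolates: list, threshold: int) -> list:
    """Group isolates whose pairwise SNV distance is <= threshold,
    via single linkage implemented with a simple union-find."""
    parent = {i: i for i in isolates}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for (a, b), d in distances.items():
        if d <= threshold:
            parent[find(a)] = find(b)  # merge the two components

    groups = {}
    for i in isolates:
        groups.setdefault(find(i), []).append(i)
    # Only groups of 2+ isolates count as putative transmission clusters
    return [g for g in groups.values() if len(g) > 1]

# Hypothetical E. faecium example: A and B related (8 SNVs), C unrelated
clusters = snv_clusters({("A", "B"): 8, ("A", "C"): 250, ("B", "C"): 240},
                        ["A", "B", "C"], threshold=20)
# -> [["A", "B"]]
```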
Residence capacity can be defined as the capacity someone requires to decide where to live. Its assessment is important in a variety of mental disorders. In this chapter we shall mainly focus on dementia, but the nature and requirements for assessment would largely be similar across all conditions. We shall also focus on the relevant law as it pertains to England and Wales, that is, the Mental Capacity Act (MCA), although, again, the nature of residence capacity and the principles for its assessment remain similar across jurisdictions.
To explore community perceptions on maternal and child nutrition issues in Sub-Saharan Africa.
Thirty focus groups with men and women from three communities facilitated by local researchers.
One urban (Soweto, South Africa) and two rural settings (Navrongo, Ghana and Nanoro, Burkina Faso) at different stages of economic transition.
Two hundred thirty-seven men and women aged 18–55 years, mostly subsistence farmers in Navrongo and Nanoro and low income in Soweto.
Differences in community concerns about maternal and child health and nutrition reflected the transitional stage of the country. Community priorities revolved around poor nutrition and hunger caused by poverty, lack of economic opportunity and traditional gender roles. Men and women felt they had limited control over food and other resources. Women wanted men to take more responsibility for domestic chores, including food provision, while men wanted more involvement in their families but felt unable to provide for them. Solutions suggested focusing on ways of increasing control over economic production, family life and domestic food supplies. Rural communities sought agricultural support, while the urban community wanted regulation of the food environment.
To be acceptable and effective, interventions to improve maternal and child nutrition need to take account of communities’ perceptions of their needs and address wider determinants of nutritional status and differences in access to food reflecting the stage of the country’s economic transition. Findings suggest that education and knowledge are necessary but not sufficient to support improvements in women’s and children’s nutritional status.
Postprandial glycaemia and insulinaemia are important risk factors for type 2 diabetes. The prevalence of insulin resistance in adolescents is increasing, but it is unknown how adolescent participant characteristics such as BMI, waist circumference, fitness and maturity offset may explain responses to a standard meal. The aim of the present study was to examine how such participant characteristics affect the postprandial glycaemic and insulinaemic responses to an ecologically valid mixed meal. Data from the control trials of three separate randomised, crossover experiments were pooled, resulting in a total of 108 participants (fifty-two boys, fifty-six girls; aged 12·5 (SD 0·6) years; BMI 19·05 (SD 2·66) kg/m2). A fasting blood sample was taken for the calculation of fasting insulin resistance, using the homoeostatic model assessment of insulin resistance (HOMA-IR). Further capillary blood samples were taken before and 30, 60 and 120 min after a standardised lunch, providing 1·5 g/kg body mass of carbohydrate, for the quantification of blood glucose and plasma insulin total AUC (tAUC). Hierarchical multiple linear regression demonstrated that the significant predictors of plasma insulin tAUC were waist circumference, physical fitness and HOMA-IR (F(3,98) = 36·78, P < 0·001, adjusted R2 = 0·515). The variance in blood glucose tAUC was not significantly explained by the predictors used (F(7,94) = 1·44, P = 0·198). The significant predictors of HOMA-IR were BMI and maturity offset (F(2,102) = 14·06, P < 0·001, adjusted R2 = 0·021). In summary, the key finding of the study is that waist circumference, followed by physical fitness, best explained the insulinaemic response to an ecologically valid standardised meal in adolescents. This has important implications for intervention, because these variables are modifiable.
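The two derived measures above can be computed in a few lines. HOMA-IR uses the standard formula (fasting glucose in mmol/L multiplied by fasting insulin in µU/mL, divided by 22·5); the tAUC is assumed here to be computed by the trapezoidal rule over the 0-, 30-, 60- and 120-min samples, and all input values below are hypothetical.

```python
# Sketch of the derived measures: standard HOMA-IR and trapezoidal tAUC.
# The trapezoidal rule is a common choice for tAUC but is an assumption here;
# all input values are hypothetical.
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR = (fasting glucose [mmol/L] x fasting insulin [uU/mL]) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

def trapezoid_tauc(times_min, values):
    """Total area under the curve (value-units x min) by the trapezoidal rule."""
    return sum((t1 - t0) * (v0 + v1) / 2
               for t0, t1, v0, v1 in zip(times_min, times_min[1:],
                                         values, values[1:]))

# Hypothetical adolescent: fasting glucose 5.0 mmol/L, insulin 9.0 uU/mL
ir = homa_ir(5.0, 9.0)  # 2.0
# Hypothetical glucose response at 0, 30, 60 and 120 min post-meal (mmol/L)
auc = trapezoid_tauc([0, 30, 60, 120], [5.0, 7.5, 6.5, 5.5])  # 757.5
```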