Few investigations have evaluated the validity of current body composition technology among racially and ethnically diverse populations. This study assessed the validity of common body composition methods in a multi-ethnic sample stratified by race and ethnicity. One hundred and ten individuals (55 % female, age: 26·5 (sd 6·9) years) identifying as Asian, African American/Black, Caucasian/White, Hispanic, Multi-racial and Native American were enrolled. Seven body composition models (dual-energy X-ray absorptiometry (DXA), air displacement plethysmography (ADP), two bioelectrical impedance devices (BIS, IB) and three multi-compartment models) were evaluated against a four-compartment criterion model by assessing total error (TE) and standard error of the estimate. For the total sample, measures of % fat and fat-free mass (FFM) from multi-compartment models were all excellent to ideal (% fat: TE = 0·94–2·37 %; FFM: TE = 0·72–1·78 kg) compared with the criterion. Measures of % fat were very good to excellent for DXA, ADP and IB (TE = 2·52–2·89 %) and fairly good for BIS (TE = 4·12 %). For FFM, single-device estimates were good (BIS; TE = 3·12 kg) to ideal (DXA, ADP, IB; TE = 1·21–2·15 kg). Results did not vary meaningfully across races and ethnicities, except that BIS was not valid for % fat among African American/Black, Caucasian/White and Multi-racial participants (TE = 4·3–4·9 %). The multi-compartment models evaluated can be utilised in a multi-ethnic sample, and in each individual race and ethnicity, to obtain highly valid results for % fat and FFM. Estimates from DXA, ADP and IB were also valid. The BIS may demonstrate greater TE across all racial and ethnic cohorts, and its results should be interpreted cautiously.
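For readers reproducing this kind of validity analysis, the sketch below shows how total error (TE) and the standard error of the estimate (SEE) can be computed for a device against a criterion measure. It is a minimal illustration on synthetic data; the sample size, means and device offset are placeholders, not the study's values.

```python
import numpy as np

def total_error(device: np.ndarray, criterion: np.ndarray) -> float:
    """Total error (TE): root mean square of device-minus-criterion differences."""
    return float(np.sqrt(np.mean((device - criterion) ** 2)))

def standard_error_of_estimate(device: np.ndarray, criterion: np.ndarray) -> float:
    """SEE: residual standard error from regressing the criterion on the device."""
    slope, intercept = np.polyfit(device, criterion, 1)
    residuals = criterion - (slope * device + intercept)
    return float(np.sqrt(np.sum(residuals ** 2) / (len(device) - 2)))

# Synthetic data: %fat from a hypothetical device vs. a 4C-style criterion
rng = np.random.default_rng(0)
fat_4c = rng.normal(25, 8, 110)               # criterion %fat
fat_device = fat_4c + rng.normal(0.5, 2.5, 110)  # device estimates with offset and noise

print(f"TE  = {total_error(fat_device, fat_4c):.2f} %fat")
print(f"SEE = {standard_error_of_estimate(fat_device, fat_4c):.2f} %fat")
```

Because TE folds both bias and random error into one number, a device with a consistent offset can show a large TE even when its SEE is small.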
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
A total of 263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
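The following is a minimal sketch of the two analysis steps described above, run on synthetic data; the column names, group labels, distributions and covariate set are illustrative assumptions, not the study dataset.

```python
import numpy as np
import pandas as pd
from scipy.stats import kruskal
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 263
df = pd.DataFrame({
    "group": rng.choice(["HIV-_noncan", "HIV+_noncan", "HIV+_mod", "HIV+_daily"], n),
    "csf_mcp1": rng.lognormal(6, 0.4, n),   # pg/mL, synthetic
    "learning_T": rng.normal(45, 10, n),    # learning domain T-score, synthetic
    "age": rng.integers(25, 70, n),
    "sex": rng.choice(["M", "F"], n),
    "race": rng.choice(["White", "Black", "Hispanic", "Other"], n),
    "education": rng.integers(8, 20, n),
    "cd4": rng.integers(200, 1200, n),
})

# Step 1 - Kruskal-Wallis: does the biomarker distribution differ across groups?
samples = [g["csf_mcp1"].to_numpy() for _, g in df.groupby("group")]
H, p = kruskal(*samples)
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.3f}")

# Step 2 - multivariable linear regression: biomarker vs. cognition, adjusted
model = smf.ols("learning_T ~ np.log(csf_mcp1) + age + sex + race + education + cd4",
                data=df).fit()
print(model.summary().tables[1])
```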
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis, and these chemokines were linked to the cognitive domain of learning, which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in mitigating the persistent inflammation and cognitive impacts of HIV.
ABSTRACT IMPACT: Melanoma leptomeningeal disease (LMD) is a devastating subtype of central nervous system (CNS) metastatic disease that is associated with limited treatment options and an extremely poor prognosis, thus requiring the development of preclinical models of LMD for therapeutic development. OBJECTIVES/GOALS:
1. Develop an immunocompetent murine model of melanoma LMD with tumors bearing genetic mutations commonly found in patients, specifically BRAF(V600E)/PTEN-/-
2. Assess the safety of intrathecal (IT) immunotherapy, specifically anti-PD1 antibody (aPD1)
3. Evaluate the therapeutic efficacy of IT aPD1 checkpoint blockade in murine melanoma LMD
METHODS/STUDY POPULATION: To develop BRAF(V600E)/PTEN-/- LMD models, we acquired BP, D4M, and D4M-UV2 (irradiated) murine melanoma cell lines and tagged them with luciferase. 1.5 × 10^4 cells were suspended in 10 µL serum-free media and injected into the cisterna magna of female C57BL/6 mice. Brain and spinal cord were harvested for histologic assessment once mice were moribund. To assess the safety of IT aPD1, we injected IT control IgG or IT aPD1 (13 µg, 26 µg, 39 µg) and monitored weights, or harvested tissue at days 7 or 14 for IHC staining of inflammation markers. To evaluate the therapeutic efficacy of IT aPD1, BP cells were injected as above. After 3 days, mice underwent imaging to confirm tumor uptake and were randomized to receive 13 µg IT control IgG or aPD1 once, plus 200 µg systemic (Sys) control IgG or aPD1 (days 0, 3, and 5), and were then monitored for survival. RESULTS/ANTICIPATED RESULTS: For LMD development, all mice survived cisternal injection of BP, D4M, and D4M-UV2 cells; median survival was 17, 19, and 30 days, respectively. The presence of leptomeningeal deposits was confirmed for all tumor-bearing mice by IHC for MART1. For safety of IT aPD1, all mice survived the procedure, and no mice displayed morbidity or >10% weight loss over 14 days of observation. IHC assessment of brain and spinal cord samples from mice treated with 13 µg aPD1 revealed focal ischemia related to the injection site and no other signs of neurological damage or inflammation. IT aPD1 treatment of mice with BP leptomeningeal tumors demonstrated no significant survival advantage, although both IT aPD1 ± Sys aPD1 groups had mice surviving to days 29 and 26, respectively, whereas all mice receiving IT control IgG ± Sys aPD1 died by day 22. DISCUSSION/SIGNIFICANCE OF FINDINGS: We demonstrate that cisternal injection of murine BRAF(V600E)/PTEN-/- melanoma cell lines yields LMD with reproducible survival and that treatment with IT aPD1 in this model is feasible and safe. Together these findings establish a new model to facilitate the development of more effective immunotherapy strategies for melanoma patients with LMD.
Background: With the emergence of antibiotic resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data that describe antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents’ medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition of multidrug resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control’s interim standard definitions. Data were analyzed using SAS v9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp, and Enterococcus spp represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible. Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp, the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents’ urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
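As an illustration of the summary statistics reported here, the sketch below computes the number tested and the percentage non-susceptible by organism and agent from isolate-level records; the rows and field names are hypothetical, not EIP data.

```python
import pandas as pd

# Hypothetical isolate-level susceptibility records (S = susceptible,
# NS = resistant or intermediate, NT = not tested).
isolates = pd.DataFrame([
    {"organism": "E. coli", "agent": "ciprofloxacin", "result": "NS"},
    {"organism": "E. coli", "agent": "ciprofloxacin", "result": "S"},
    {"organism": "E. coli", "agent": "meropenem", "result": "S"},
    {"organism": "P. mirabilis", "agent": "ciprofloxacin", "result": "NS"},
    {"organism": "P. mirabilis", "agent": "ciprofloxacin", "result": "NT"},
])

# Exclude untested results, then summarize per organism/agent pair
tested = isolates[isolates["result"] != "NT"]
summary = (
    tested.groupby(["organism", "agent"])["result"]
    .agg(n_tested="count",
         pct_nonsusceptible=lambda r: 100 * (r == "NS").mean())
    .round(1)
)
print(summary)
```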
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatments met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
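As an illustration of how such criteria can be operationalized for chart review, the sketch below encodes a simplified version of the Loeb minimum criteria for residents without an indwelling catheter; the published criteria contain further elements (e.g., fever thresholds and catheter-specific rules), and the field names are hypothetical.

```python
def meets_loeb_minimum(resident: dict) -> bool:
    """Simplified sketch of the Loeb minimum criteria for residents without
    an indwelling catheter: acute dysuria alone, or fever plus at least one
    new or worsening urinary sign. Field names are illustrative."""
    if resident["acute_dysuria"]:
        return True
    urinary_signs = any([
        resident["new_urgency"],
        resident["new_frequency"],
        resident["suprapubic_pain"],
        resident["gross_hematuria"],
        resident["costovertebral_tenderness"],
        resident["new_incontinence"],
    ])
    return resident["fever"] and urinary_signs

# Example chart abstraction: fever with new urgency, no dysuria
example = {"acute_dysuria": False, "fever": True, "new_urgency": True,
           "new_frequency": False, "suprapubic_pain": False,
           "gross_hematuria": False, "costovertebral_tenderness": False,
           "new_incontinence": False}
print(meets_loeb_minimum(example))  # True under this simplified encoding
```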
Introduction: Emergency department (ED) crowding is a major problem across Canada. We studied the ability of artificial intelligence methods to improve patient flow through the ED by predicting patient disposition using information available at triage and shortly after patients’ arrival in the ED. Methods: This retrospective study included all visits to an urban, academic, adult ED between May 2012 and June 2019. For each visit, 489 variables were extracted, including triage data that had been collected for use in the Canadian Triage and Acuity Scale (CTAS) and information regarding laboratory tests, radiological tests, consultations, and admissions. A training set consisting of all visits from May 2012 through December 2018 was used to train 5 classes of machine learning models to predict admission to the hospital from the ED. The models were trained to predict admission at the time of the patient's arrival in the ED and every 30 minutes after arrival until 6 hours into their ED stay. The performance of models was compared using the area under the ROC curve (AUC) on a test set consisting of all visits from January 2019 to June 2019. Results: The study included 536,332 visits, and the admission rate was 15.0%. Gradient boosting models generally outperformed other machine learning models. A gradient boosting model using all available data at 2 hours after patient arrival in the ED yielded a test set AUC of 0.92 [95% CI 0.91-0.93], while a model using only data available at triage yielded an AUC of 0.90 [95% CI 0.89-0.91]. The quality of predictions generally improved as predictions were made later in the patient's ED stay, reaching an AUC of 0.95 [95% CI 0.93-0.96] at 6 hours after arrival. A gradient boosting model with 20 variables available at 2 hours after patient arrival in the ED yielded an AUC of 0.91 [95% CI 0.89-0.93]. A gradient boosting model that makes predictions at 2 hours after arrival in the ED using only variables that are available at all EDs in the province of Quebec yielded an AUC of 0.91 [95% CI 0.89-0.92]. Conclusion: Machine learning can predict admission to a hospital from the ED using variables that are collected as part of routine ED care. Machine learning tools may potentially be used to help ED physicians make faster and more appropriate disposition decisions, decrease unnecessary testing, and alleviate ED crowding.
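The sketch below mirrors the study design at a small scale: a gradient-boosted classifier trained on earlier visits and evaluated by AUC on the most recent months. The features, effect sizes, and data are synthetic stand-ins, not the study's 489 variables.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Synthetic visit-level data: triage and early-stay features plus the
# admission label. Column names are hypothetical.
rng = np.random.default_rng(42)
n = 5000
visits = pd.DataFrame({
    "arrival": pd.date_range("2012-05-01", "2019-06-30", periods=n),
    "ctas_level": rng.integers(1, 6, n),
    "age": rng.integers(18, 95, n),
    "n_labs_2h": rng.poisson(3, n),
    "consult_requested": rng.integers(0, 2, n),
})
# Synthetic admission outcome driven by the features above
logit = (-3 + 0.5 * visits["consult_requested"]
         + 0.02 * visits["age"] - 0.3 * visits["ctas_level"])
visits["admitted"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# Temporal split mirroring the study: train on earlier visits, test on 2019
train = visits[visits["arrival"] < "2019-01-01"]
test = visits[visits["arrival"] >= "2019-01-01"]
features = ["ctas_level", "age", "n_labs_2h", "consult_requested"]

model = HistGradientBoostingClassifier().fit(train[features], train["admitted"])
auc = roc_auc_score(test["admitted"], model.predict_proba(test[features])[:, 1])
print(f"Test AUC: {auc:.2f}")
```

The temporal split, rather than a random one, is the important design choice: it evaluates the model the way it would actually be deployed, on visits occurring after all of its training data.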
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Obsessive-compulsive disorder (OCD) is a highly disabling condition with frequent early onset. Adult/adolescent OCD has been extensively investigated, but little is known about the prevalence and clinical characteristics of geriatric patients with OCD (G-OCD: age ≥ 65 years). The present study aimed to assess the prevalence of G-OCD and associated socio-demographic and clinical correlates in a large international sample.
Methods:
Data from 416 outpatients, participating in the ICOCS network, were assessed and categorized into 2 groups, age < vs ≥ 65 years, and then divided on the basis of the median age of the sample (age < vs ≥ 42 years). Socio-demographic and clinical variables were compared between groups (Pearson Chi-squared and t tests).
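A minimal sketch of the two comparisons named above, using hypothetical counts and synthetic onset ages rather than the ICOCS data:

```python
import numpy as np
from scipy.stats import chi2_contingency, ttest_ind

# Chi-squared test on a 2x2 table: CBT use in older (>= 65) vs. younger
# patients. Counts are hypothetical, chosen only to match the reported
# percentages (20.8% of 24 older, 41.8% of 392 younger).
cbt_table = np.array([[5, 19],      # older: received CBT, did not
                      [164, 228]])  # younger: received CBT, did not
chi2, p, dof, _ = chi2_contingency(cbt_table)
print(f"Chi-squared = {chi2:.2f}, p = {p:.3f}")

# Welch t test on age at onset; distributions set near the reported
# means and SDs, data synthetic
rng = np.random.default_rng(3)
onset_older = rng.normal(29.4, 15.1, 24)
onset_younger = rng.normal(18.7, 9.2, 392)
t, p = ttest_ind(onset_older, onset_younger, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3g}")
```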
Results:
G-OCD compared with younger patients represented a significant minority of the sample (6% vs 94%, P < .001), showing a significantly later age at onset (29.4 ± 15.1 vs 18.7 ± 9.2 years, P < .001), a more frequent adult onset (75% vs 41.1%, P < .001) and a less frequent use of cognitive-behavioural therapy (CBT) (20.8% vs 41.8%, P < .05). Female gender was more represented in G-OCD patients, though not at a statistically significant level (75% vs 56.4%, P = .07). When the whole sample was divided on the basis of the median age, previous results were confirmed for older patients, including a significantly higher presence of women (52.1% vs 63.1%, P < .05).
Conclusions:
G-OCD compared with younger patients represented a small minority of the sample and showed later age at onset, more frequent adult onset and lower CBT use. Age at onset may influence course and overall management of OCD, with additional investigation needed.
Invasive species drive biodiversity loss and lead to changes in parasite–host associations. Parasites are linked to invasions and can mediate invasion success and outcomes. We review theoretical and empirical research into parasites in biological invasions, focusing on a freshwater invertebrate study system. We focus on the effects of parasitic infection on host traits (behaviour and life history) that can mediate native/invader trophic interactions. We review evidence from the field and laboratory of parasite-driven changes in predation, intraguild predation and cannibalism. Theoretical work shows that the trait-mediated effects of parasites can be as strong as classical density effects and their impact on the host’s trophic interactions merits more consideration. We also report on evidence of broader cascading effects warranting deeper study. Biological invasion can lead to altered parasite–host associations. Focusing on amphipod invasions, we find patterns of parasite introduction and loss that mirror host invasion pathways, but also highlight the risks of introducing invasive parasites. Horizon scanning and impact predictions are vital in identifying future disease risks, potential pathways of introduction and suitable management measures for mitigation.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
A total of 592 people reported gastrointestinal illness following attendance at Street Spice, a food festival held in Newcastle-upon-Tyne, North East England, in February/March 2013. Epidemiological, microbiological and environmental investigations were undertaken to identify the source and prevent further cases. Several epidemiological analyses were conducted: a cohort study, a follow-up survey of cases, and capture–recapture to estimate the true burden of cases. Indistinguishable isolates of Salmonella Agona phage type 40 were identified in cases and on fresh curry leaves used in one of the accompaniments served at the event. Molecular testing indicated that entero-aggregative Escherichia coli and Shigella also contributed to the burden of illness. Analytical studies found strong associations between illness and eating food from a particular stall and with food items including coconut chutney, which contained fresh curry leaves. Further investigation of the food supply chain and food preparation techniques identified a lack of clear instruction on the use of fresh uncooked curry leaves in finished dishes and uncertainty about their status as a ready-to-eat product. We describe the investigation of one of the largest outbreaks of food poisoning in England, involving several gastrointestinal pathogens, including a strain of Salmonella Agona not previously seen in the UK.
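For readers unfamiliar with two-source capture–recapture, the sketch below implements the Chapman (bias-corrected Lincoln–Petersen) estimator of the true number of cases from two overlapping ascertainment sources; the counts are hypothetical, not the outbreak figures.

```python
def chapman_estimate(n1: int, n2: int, m: int) -> float:
    """Chapman's bias-corrected Lincoln-Petersen two-source
    capture-recapture estimate of the total number of cases.
    n1: cases found by source 1 (e.g. a cohort survey)
    n2: cases found by source 2 (e.g. clinical reports)
    m:  cases found by both sources"""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts for illustration only
print(f"Estimated true burden: {chapman_estimate(450, 300, 200):.0f} cases")
```

The intuition is that the smaller the overlap between the two sources, the more cases each source must be missing, so the estimated total rises as m falls.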
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
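A minimal sketch of the regression comparison described above, on synthetic twin-level data with the MZ–DZ difference set near the reported magnitude; birth-cohort and parental-education terms are omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic twin-level data: education in years by zygosity and sex,
# with a built-in 0.25-year MZ advantage for illustration
rng = np.random.default_rng(7)
n = 10_000
df = pd.DataFrame({
    "zygosity": rng.choice(["MZ", "DZ"], n, p=[0.39, 0.61]),
    "sex": rng.choice(["male", "female"], n),
})
df["edu_years"] = 12 + 0.25 * (df["zygosity"] == "MZ") + rng.normal(0, 3, n)

# Linear regression of education years on zygosity, stratified by sex;
# the coefficient is the estimated MZ-DZ difference in years
for sex, sub in df.groupby("sex"):
    fit = smf.ols("edu_years ~ zygosity", data=sub).fit()
    print(sex, round(fit.params["zygosity[T.MZ]"], 2))
```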
Elemental, chemical, and structural analysis of polycrystalline materials at the micron scale is frequently carried out using microfocused synchrotron X-ray beams, sometimes on multiple instruments. The Maia pixelated energy-dispersive X-ray area detector enables the simultaneous collection of X-ray fluorescence (XRF) and diffraction because of the relatively large solid angle and number of pixels when compared with other systems. The large solid angle also permits extraction of surface topography because of changes in self-absorption. This work demonstrates the capability of the Maia detector for simultaneous measurement of XRF and diffraction for mapping the short- and long-range order across the grain structure in a Ni polycrystalline foil.
Recent developments in U.S.–Cuba relations have resulted in growing global interest in Cuba, including its legal regime. This comprehensive Guide aims to fill a noticeable void in the availability of English-language information on this enigmatic jurisdiction's legal order and on how to conduct research related to it. Covered topics include “The Constitution,” “Legislation and Codes,” “The Judiciary,” “Cuba in the International Arena,” and “The Legal Profession.” A detailed section on “Cuban Legal Materials in U.S. and Canadian Libraries” is also featured. Although the Guide emphasizes sources in English and English-language translation, materials in Spanish are likewise included, as English-language equivalents are often unavailable. The Guide's 12 authors are members of the Latin American Law Interest Group of the American Association of Law Libraries’ Foreign, Comparative, and International Law Special Interest Section (FCIL-SIS).
Negative symptoms significantly contribute to disability and lack of community participation for low-functioning individuals with schizophrenia. Cognitive therapy has been shown to improve negative symptoms and functional outcome in this population. Elucidation of the mechanisms of the therapy would lead to a better understanding of negative symptoms and the development of more effective interventions to promote recovery. The objective of this study was to determine (1) whether guided success at a card-sorting task would produce improvement in defeatist beliefs, positive beliefs about the self, mood, and card-sorting performance, and (2) whether these changes in beliefs and mood would predict improvements in unguided card-sorting.
Methods
Individuals with schizophrenia having prominent negative symptoms and impaired neurocognitive performance (N = 35) were randomized to guided success (n = 19) or a control (n = 16) condition.
Results
Controlling for baseline performance, the experimental group performed significantly better, endorsed defeatist beliefs to a lesser degree, reported greater positive self-concept, and reported better mood than the control condition immediately after the experimental session. A composite index of change in defeatist beliefs, self-concept, and mood was significantly correlated with improvements in card-sorting.
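A sketch of how such a composite change index can be formed and correlated with task improvement, using synthetic change scores; the variable names and effect sizes are illustrative, not the study data.

```python
import numpy as np
from scipy.stats import pearsonr, zscore

# Hypothetical change scores (post minus pre) for the experimental session
rng = np.random.default_rng(11)
n = 35
d_defeatist = rng.normal(-2, 4, n)  # decrease = improvement
d_self = rng.normal(1.5, 3, n)      # increase = improvement
d_mood = rng.normal(1.0, 3, n)      # increase = improvement
d_sorting = 0.4 * (d_self + d_mood - d_defeatist) + rng.normal(0, 3, n)

# Composite index: mean of standardized change scores, with defeatist
# beliefs reverse-scored so that higher always means improvement
composite = np.mean([zscore(-d_defeatist), zscore(d_self), zscore(d_mood)], axis=0)
r, p = pearsonr(composite, d_sorting)
print(f"r = {r:.2f}, p = {p:.3f}")
```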
Conclusions
This analogue study supports the rationale of cognitive therapy and provides a general therapeutic model in which experiential interventions that produce success have a significant immediate effect on a behavioral task, mediated by changes in beliefs and mood. The rapid improvement is a promising indicator of the responsiveness of this population, often regarded as recalcitrant, to cognitively targeted behavioral interventions.
This paper seeks to establish good practice in setting inputs for operational risk models for banks, insurers and other financial service firms. It reviews Basel, Solvency II and other regulatory requirements as well as publicly available literature on operational risk modelling. It recommends a combination of historic loss data and scenario analysis for modelling of individual risks, setting out issues with these data, and outlining good practice for loss data collection and scenario analysis. It recommends the use of expert judgement for setting correlations, and addresses information requirements for risk mitigation allowances and capital allocation, before briefly covering Bayesian network methods for modelling operational risks.
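As one concrete example of combining a frequency assumption with a severity distribution, the sketch below simulates annual aggregate losses for a single operational risk under a Poisson-lognormal loss distribution approach and reads off a 99.5th-percentile capital figure; all parameters are illustrative placeholders, as would in practice be calibrated from internal loss data supplemented by scenario analysis, not recommendations.

```python
import numpy as np

# Frequency-severity Monte Carlo for a single operational risk:
# Poisson annual loss counts, lognormal severities. Parameters illustrative.
rng = np.random.default_rng(2024)
n_sims = 100_000
freq_lambda = 4.0               # expected number of losses per year
sev_mu, sev_sigma = 10.0, 1.2   # lognormal severity parameters

# Each simulated year: draw a loss count, then sum that many severities
annual_losses = np.array([
    rng.lognormal(sev_mu, sev_sigma, k).sum()
    for k in rng.poisson(freq_lambda, n_sims)
])

# Capital at the 99.5th percentile (Solvency II-style one-year VaR)
var_995 = np.percentile(annual_losses, 99.5)
print(f"Mean annual loss: {annual_losses.mean():,.0f}")
print(f"99.5% VaR: {var_995:,.0f}")
```

Aggregating several such risks requires the correlation assumptions the paper discusses, for instance via a copula across the simulated annual losses; that step is deliberately omitted here.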
The objective is to determine the nature of the unseen companion of the single-lined spectroscopic binary, WR 148 (= WN7h+?). The absence of companion lines supports a compact companion (cc) scenario. The lack of hard X-rays favours a non-compact companion scenario. Is WR 148 a commonplace WR+OB binary or a rare WR+cc binary?
The human circadian system anticipates and adapts to daily environmental changes to optimise behaviour according to time of day and temporally partitions incompatible physiological processes. At the helm of this system is a master clock in the suprachiasmatic nuclei (SCN) of the anterior hypothalamus. The SCN are primarily synchronised to the 24-h day by the light/dark cycle; however, feeding/fasting cycles are the primary time cues for clocks in peripheral tissues. Aligning feeding/fasting cycles with clock-regulated metabolic changes optimises metabolism, and studies of other animals suggest that feeding at inappropriate times disrupts circadian system organisation, and thereby contributes to adverse metabolic consequences and chronic disease development. ‘High-fat diets’ (HFD) produce particularly deleterious effects on circadian system organisation in rodents by blunting feeding/fasting cycles. Time-of-day-restricted feeding, where food availability is restricted to a period of several hours, offsets many adverse consequences of HFD in these animals; however, further evidence is required to assess whether the same is true in humans. Several nutritional compounds have robust effects on the circadian system. Caffeine, for example, can speed synchronisation to new time zones after jetlag. An appreciation of the circadian system has many implications for nutritional science and may ultimately help reduce the burden of chronic diseases.