One of the central concepts in rabbinic Judaism is the notion of the Evil Inclination, which appears to be related to similar concepts in ancient Christianity and the wider late antique world. The precise origins and understanding of the idea, however, are unknown. This volume traces the development of this concept historically in Judaism and assesses its impact on emerging Christian thought concerning the origins of sin. The chapters, which cover a wide range of sources including the Bible, the Ancient Versions, Qumran, Pseudepigrapha and Apocrypha, the Targums, and rabbinic and patristic literature, advance our understanding of the intellectual exchange between Jews and Christians in classical Antiquity, as well as the intercultural exchange between these communities and the societies in which they were situated.
We evaluated whether memory recall following an extended (1 week) delay predicts cognitive and brain structural trajectories in older adults.
Clinically normal older adults (52–92 years old) were followed longitudinally for up to 8 years after completing a memory paradigm at baseline [Story Recall Test (SRT)] that assessed delayed recall at 30 min and 1 week. Subsets of the cohort underwent neuroimaging (N = 134, mean age = 75) and neuropsychological testing (N = 178–207, mean ages = 74–76) at annual study visits occurring approximately 15–18 months apart. Mixed-effects regression models evaluated if baseline SRT performance predicted longitudinal changes in gray matter volumes and cognitive composite scores, controlling for demographics.
Worse SRT 1-week recall was associated with more precipitous rates of longitudinal decline in medial temporal lobe volumes (p = .037), episodic memory (p = .003), and executive functioning (p = .011), but not occipital lobe or total gray matter volumes (demonstrating neuroanatomical specificity; p > .58). By contrast, SRT 30-min recall was only associated with longitudinal decline in executive functioning (p = .044).
Memory paradigms that capture longer-term recall may be particularly sensitive to age-related medial temporal lobe changes and neurodegenerative disease trajectories.
Given increasing life expectancy and a growing older population, larger numbers of older adults will develop comorbidities and functional impairments. Wellness in older individuals spans the continuum of primary and secondary prevention toward healthy aging, both to help prevent these comorbidities and functional impairments and to screen for them. This chapter reviews primary and secondary prevention, with a focus on healthcare maintenance, social isolation, physical activity in older adults, and nutrition. The first section reviews United States Preventive Services Task Force (USPSTF), Centers for Disease Control and Prevention (CDC), Advisory Committee on Immunization Practices (ACIP), and American Cancer Society (ACS) recommendations for immunizations and age-appropriate screening tools, as well as comprehensive geriatric assessment. Regarding physical activity in older adults, there is an extensive review of evidence-based recommendations for patients, including guidelines and tools providers may use. The international epidemic of social isolation is reviewed, along with the importance of screening for it and its associated morbidity and mortality. Finally, screening, risk factors, and interventions for nutritional problems in older persons are reviewed.
Individuals with schizophrenia are more likely to smoke and less likely to quit smoking than those without schizophrenia. Because task persistence is lower in smokers with schizophrenia than in smokers without it, lower levels of task persistence may contribute to the greater difficulty in quitting smoking observed among smokers with schizophrenia.
To develop a feasible and acceptable intervention for smokers with schizophrenia.
Participants (N = 24) attended eight weekly individual cognitive behavioral therapy sessions for tobacco use disorder with a focus on increasing task persistence and received 10 weeks of nicotine patch.
In total, 93.8% of participants rated the intervention as at least a 6 out of 7 regarding how ‘easy to understand’ it was and 81.3% rated the treatment as at least a 6 out of 7 regarding how helpful it was to them. A total of 62.5% attended at least six of the eight sessions and session attendance was positively related to nicotine dependence and age and negatively related to self-efficacy for quitting.
This intervention was feasible and acceptable to smokers with schizophrenia. Future research will examine questions appropriate for later stages of therapy development such as initial efficacy of the intervention and task persistence as a mediator of treatment outcome.
Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
This randomized trial investigated the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality in patients with advanced cancer receiving radiation therapy as a secondary endpoint. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and weeks 4 (post-intervention), 27, and 52.
The intervention group showed statistically significantly greater improvement than the control group at week 4 in the PSQI total score and in two of its components, sleep quality and daytime dysfunction. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant between-group differences in the PSQI total score, any PSQI component score, or the ESS. At week 52, the intervention group used less sleep medication than control patients compared to baseline (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve sleep quality of advanced cancer patients undergoing radiation therapy. Those patients who completed the intervention also reported the use of less sleep medication.
To systematically assess enhanced personal protective equipment (PPE) doffing safety risks.
We employed a 3-part approach to this study: (1) hierarchical task analysis (HTA) of the PPE doffing process; (2) human factors-informed failure modes and effects analysis (FMEA); and (3) focus group sessions with a convenience sample of infection prevention (IP) subject matter experts.
A large academic US hospital with a regional Special Pathogens Treatment Center and enhanced PPE doffing protocol experience.
Eight IP experts.
The HTA was conducted jointly by 2 human-factors experts based on the Centers for Disease Control and Prevention PPE guidelines. The findings were used as a guide in 7 focus group sessions with IP experts to assess PPE doffing safety risks. For each HTA task step, IP experts identified failure mode(s), assigned priority risk scores, identified contributing factors and potential consequences, and identified potential risk mitigation strategies. Data were recorded in a tabular format during the sessions.
Of 103 identified failure modes, the highest priority scores were associated with team members moving between clean and contaminated areas, glove removal, apron removal, and self-inspection while preparing to doff. Contributing factors related to the individual (eg, technical/teamwork competency), task (eg, undetected PPE contamination), tools/technology (eg, PPE design characteristics), environment (eg, inadequate space), and organizational aspects (eg, training) were identified. Participants identified 86 types of risk mitigation strategies targeting the failure modes.
Despite detailed guidelines, our study revealed 103 enhanced PPE doffing failure modes. Analysis of the failure modes suggests potential mitigation strategies to decrease self-contamination risk during enhanced PPE doffing.
Antimicrobial stewardship programs are effective in optimizing antimicrobial prescribing patterns and decreasing the negative outcomes of antimicrobial exposure, including the emergence of multidrug-resistant organisms. In dialysis facilities, 30%–35% of antimicrobials are either not indicated or the type of antimicrobial is not optimal. Although antimicrobial stewardship programs are now implemented nationwide in hospital settings, programs specific to the maintenance dialysis facilities have not been developed.
To quantify the effect of an antimicrobial stewardship program in reducing antimicrobial prescribing.
Study design and setting
An interrupted time-series study in 6 outpatient hemodialysis facilities was conducted in which mean monthly antimicrobial doses per 100 patient months during the 12 months prior to the program were compared to those in the 12-month intervention period.
Implementation of the antimicrobial stewardship program was associated with a 6% monthly reduction in antimicrobial doses per 100 patient months during the intervention period (P=.02). The initial mean of 22.6 antimicrobial doses per 100 patient months decreased to a mean of 10.5 antimicrobial doses per 100 patient months at the end of the intervention. There were no significant changes in antimicrobial use by type, including vancomycin. Antimicrobial adjustments were recommended for 30 of 145 antimicrobial courses (20.6%) for which there were sufficient clinical data. The most frequent reasons for adjustment included de-escalation from vancomycin to cefazolin for methicillin-susceptible Staphylococcus aureus infections and discontinuation of antimicrobials when criteria for presumed infection were not met.
Within 6 hemodialysis facilities, implementation of an antimicrobial stewardship program was associated with a decline in antimicrobial prescribing, with no negative effects observed.
Background: To determine whether exosomal microRNAs (miRNAs) in cerebrospinal fluid (CSF) of patients with frontotemporal dementia (FTD) can serve as diagnostic biomarkers, we assessed miRNA expression in the Genetic FTD Initiative (GENFI) cohort and in sporadic FTD. Methods: GENFI participants were either carriers of a pathogenic mutation or at risk of carrying a mutation because a first-degree relative was a symptomatic mutation carrier. Exosomes were isolated from CSF of 23 pre-symptomatic and 15 symptomatic mutation carriers and 11 healthy non-mutation carriers. Expression of miRNAs was measured using qPCR arrays. MiRNAs differentially expressed in symptomatic compared to pre-symptomatic mutation carriers were evaluated in 17 patients with sporadic FTD, 13 patients with sporadic Alzheimer’s disease (AD), and 10 healthy controls (HCs). Results: In the GENFI cohort, miR-204-5p and miR-632 were significantly decreased in symptomatic compared to pre-symptomatic mutation carriers. Decreases in miR-204-5p and miR-632 yielded receiver operating characteristic curves with areas under the curve of 0.89 [90% CI: 0.79-0.98] and 0.81 [90% CI: 0.68-0.93], respectively, and, when combined, an area of 0.93 [90% CI: 0.87-0.99]. In sporadic FTD, only miR-632 was significantly decreased compared to sporadic AD and HCs. The decrease in miR-632 yielded an area under the curve of 0.89 [90% CI: 0.80-0.98]. Conclusions: Exosomal miR-204-5p and miR-632 have potential as diagnostic biomarkers for genetic FTD, and miR-632 also for sporadic FTD.
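The diagnostic accuracies above are summarized as areas under receiver operating characteristic curves. As an illustration only (not the study's actual analysis), a rank-based (Mann–Whitney) AUC can be computed directly from two groups of hypothetical marker values, where higher values are taken to indicate the positive class:

```python
import numpy as np

def auc_mann_whitney(controls, cases):
    """Rank-based AUC: the probability that a randomly chosen case scores
    higher than a randomly chosen control; ties count as one half."""
    controls = np.asarray(controls, dtype=float)
    cases = np.asarray(cases, dtype=float)
    # Compare every case against every control via broadcasting.
    greater = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return float((greater + 0.5 * ties) / (cases.size * controls.size))

# Hypothetical marker values, not study data:
print(auc_mann_whitney([0, 1, 2, 3], [2, 3, 4, 5]))  # 0.875
```

For markers such as miR-632, where lower expression indicates disease, one would swap the group roles or negate the scores before applying this estimator.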
A novel freeze-cast porous chitosan conduit for peripheral nerve repair, with highly aligned, double-layered porosity providing the ideal mechanical and chemical properties, was designed, manufactured, and assessed in vivo. The efficacies of the conduit and of the control inverted nerve autograft in bridging a 10-mm Lewis rat sciatic nerve gap were evaluated at 12 weeks post-implantation. Biocompatibility and regenerative efficacy of the porous chitosan conduit were evaluated through histomorphometric analysis of longitudinal and transverse sections. The porous chitosan conduit showed promising regenerative characteristics, promoting the desired neovascularization as well as axonal ingrowth and alignment through a combination of structural, mechanical, and chemical cues.
When instructors are first tasked with teaching the research methods course for their department, a common reaction is frustration and panic. Although all political scientists are trained in research methods, few besides methodologists view it as their primary or strongest area of expertise, and they are aware that the course rarely returns high teaching evaluations (Fletcher and Painter-Main 2014). Likewise, students approach their required research methods course with extreme anxiety, viewing it as the math class they were trying to avoid by majoring in political science (Bernstein and Allen 2013; Coleman and Conrad 2007). With instructors unhappily teaching the class and students dreading taking it, there is a “perfect storm” of attitudes and beliefs that is hardly likely to lead to a productive learning environment. The challenge driving this article is how to teach research methods in a rigorous, engaging way that promotes student learning without tanking scores on teaching evaluations.
Advances in the study of partial identification allow applied researchers to learn about parameters of interest without making assumptions needed to guarantee point identification. We discuss the roles that assumptions and data play in partial identification analysis, with the goal of providing information to applied researchers that can help them employ these methods in practice. To this end, we present a sample of econometric models that have been used in a variety of recent applications where parameters of interest are partially identified, highlighting common features and themes across these papers. In addition, in order to help illustrate the combined roles of data and assumptions, we present numerical illustrations for a particular application, the joint determination of wages and labor supply. Finally we discuss the benefits and challenges of using partially identifying models in empirical work and point to possible avenues of future research.
The goal of identification analysis is to determine what can be learned, using deductive reasoning, from the combination of models (sets of assumptions) and data. Standard approaches to econometric modeling in applied research make enough assumptions to ensure that parameters of interest are point identified. However, it is still possible to learn about such parameters even when they are not point identified.
Econometric models that allow for partial identification, or partially identifying models, make fewer assumptions and use them to generate bounds on the parameters of interest. Such models have a long history, with early papers including Frisch (1934), Reiersol (1941), Marschak and Andrews (1944), and Frechet (1951). The literature on the topic then remained fragmented for several decades, with further notable contributions such as Peterson (1976), Leamer (1981), Klepper and Leamer (1984), and Phillips (1989). It was not until the work of Charles Manski and co-authors, starting in the late 1980s, that a unified literature began to emerge, beginning with Manski (1989, 1990). Several influential papers by Elie Tamer and co-authors applying partial identification to a variety of econometric models (e.g., Haile and Tamer, 2003; Honore and Tamer, 2006; Ciliberto and Tamer, 2009) have helped to bring these methods into more common use.
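A minimal sketch of how a partially identifying model generates bounds is the classic missing-outcome example from Manski (1989), which is illustrative here rather than one of the applications discussed in this paper: if an outcome is known to lie in [y_min, y_max] and nothing is assumed about the missing observations, the population mean is bounded by filling the missing values with y_min or y_max. The data below are hypothetical.

```python
import numpy as np

def worst_case_bounds(y_observed, n_missing, y_min, y_max):
    """Manski-style worst-case bounds on E[Y] for a bounded outcome with
    missing data: the lower bound sets every missing value to y_min,
    the upper bound sets every missing value to y_max."""
    y_observed = np.asarray(y_observed, dtype=float)
    n = y_observed.size + n_missing
    p_obs = y_observed.size / n           # share of observed outcomes
    lower = y_observed.mean() * p_obs + y_min * (1 - p_obs)
    upper = y_observed.mean() * p_obs + y_max * (1 - p_obs)
    return float(lower), float(upper)

# Hypothetical binary outcome observed for 4 of 8 units:
print(worst_case_bounds([1, 0, 1, 1], n_missing=4, y_min=0.0, y_max=1.0))
# (0.375, 0.875)
```

The width of the interval, (y_max - y_min) times the missing share, shows how the data alone limit what can be learned; adding assumptions (e.g., on the missingness mechanism) shrinks the bounds, which is the trade-off the paper discusses.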
Epipaleolithic hunter-gatherers are often interpreted as playing an important role in the development of early cereal cultivation and subsequent farming economies in the Levant. This focus has come at the expense of understanding these people as resilient foragers who exploited a range of changing micro habitats through the Last Glacial Maximum. New phytolith data from Ohalo II seek to redress this. Ohalo II has the most comprehensive and important macrobotanical assemblage in Southwest Asia for the entire Epipaleolithic period. Here we present a phytolith investigation of 28 sediment samples to make three key contributions. First, by comparing the phytolith assemblage to a sample of the macrobotanical assemblage, we provide a baseline to help inform the interpretation of phytolith assemblages at other sites in Southwest Asia. Second, we highlight patterns of plant use at the site. We identify the importance of wetland plant resources to hut construction and provide evidence that supports previous work suggesting that grass and cereal processing may have been a largely “indoor” activity. Finally, drawing on ethnographic data from the American Great Basin, we reevaluate the significance of wetland plant resources for Epipaleolithic hunter-gatherers and argue that the wetland-centered lifeway at Ohalo II represents a wider Levantine adaptive strategy.
Moringa oleifera is a rich source of antioxidants and a promising feed for livestock, due to significant amounts of protein, vitamins, carotenoids and polyphenols, and negligible amounts of anti-nutritional factors. The current study tested whether ensiling would preserve the antioxidant capacity of M. oleifera plants, and assessed whether Moringa silage, fed as a substitute for maize silage, would confer health-promoting traits and affect milk production in dairy cows. To this end, hand-harvested M. oleifera plants were ensiled, with or without molasses and inoculants, in anaerobic jars at room temperature (25 °C) for 37 days. At the end of the storage period the silages were analysed for pH, lactic acid and acetic acid concentrations, aerobic stability, antioxidant capacity, polyphenol and protein content, and tocopherol and carotenoid concentrations. Moringa silages exhibited higher antioxidant capacity compared with fresh and dried Moringa plants, not related to polyphenol content but presumably attributable to accumulation of amino acids and low molecular weight peptides. Based on these findings, a large-scale ensiling protocol was implemented, followed by a feeding trial for dairy cows, in which Moringa silage replaced 263 g maize silage/kg in the diet. Cows fed Moringa silage had higher milk yield and antioxidant capacity and lower milk somatic cell counts compared with controls, during some stages of lactation. These findings imply that ensiling M. oleifera is an appropriate practice by which the health and production of dairy cows can be improved.
The multiple input multiple output (MIMO) antenna is at the core of presently available wireless technologies. Designing MIMO antennas within a limited space requires various approaches to mutual-coupling reduction; otherwise gain, efficiency, diversity gain, and radiation patterns will be severely degraded. Various techniques have been reported in the literature to control this degrading factor and to improve the performance of MIMO antennas. In this review paper, we carry out an extensive investigation of diversity and mutual-coupling (correlation) reduction techniques in compact MIMO antennas.
In 2007, Robert Spitzer considered validity challenges to the diagnosis of post-traumatic stress disorder (PTSD), a construct that originated when he chaired the DSM-III Task Force. Spitzer suggested changes for DSM-5, then in its planning stages, for the purpose of ‘Saving PTSD from itself’. Years later, it can be asked whether DSM-5 followed Spitzer's recommendations to advance our understanding of post-traumatic disorder.
Post-traumatic stress disorder (PTSD) is associated with elevated risk for metabolic syndrome (MetS). However, the direction of this association is not yet established, as most prior studies employed cross-sectional designs. The primary goal of this study was to evaluate bidirectional associations between PTSD and MetS using a longitudinal design.
A total of 1355 male and female veterans of the conflicts in Iraq and Afghanistan underwent PTSD diagnostic assessments and their biometric profiles pertaining to MetS were extracted from the electronic medical record at two time points (spanning ~2.5 years, n = 971 at time 2).
The prevalence of MetS among veterans with PTSD was just under 40% at both time points and was significantly greater than that for veterans without PTSD; the prevalence of MetS among those with PTSD was also elevated relative to age-matched population estimates. Cross-lagged panel models revealed that PTSD severity predicted subsequent increases in MetS severity (β = 0.08, p = 0.002), after controlling for initial MetS severity, but MetS did not predict later PTSD symptoms. Logistic regression results suggested that for every 10 PTSD symptoms endorsed at time 1, the odds of a subsequent MetS diagnosis increased by 56%.
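As a back-of-the-envelope sketch (not part of the study's analysis, and assuming the log-linearity implicit in logistic regression), the reported 56% increase in odds per 10 symptoms can be rescaled to other symptom increments:

```python
import math

# Reported in the abstract: the odds of a subsequent MetS diagnosis rise
# by 56% per 10 additional PTSD symptoms, i.e. a per-10-symptom odds
# ratio of 1.56.
OR_PER_10 = 1.56

# Implied per-symptom log-odds coefficient, assuming log-linearity:
b = math.log(OR_PER_10) / 10

OR_PER_1 = math.exp(b)        # per-symptom odds ratio, ~1.045
OR_PER_20 = math.exp(20 * b)  # per-20-symptom odds ratio = 1.56**2
```

This rescaling works because a logistic model is linear in log-odds, so odds ratios multiply across symptom increments; the increments used here are hypothetical, chosen only to illustrate the conversion.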
Results highlight the substantial cardiometabolic concerns of young veterans with PTSD and raise the possibility that PTSD may predispose individuals to accelerated aging, in part, manifested clinically as MetS. This demonstrates the need to identify those with PTSD at greatest risk for MetS and to develop interventions that improve both conditions.