Self-efficacy can be defined as individuals’ beliefs in their capability to implement a behavior needed to reach a goal or perform a task successfully. A vast amount of empirical research shows that self-efficacy is a key factor in predicting and explaining the successful initiation and maintenance of behavior change in various domains of human life. Less research has been conducted on the sources of self-efficacy (mastery experiences, vicarious experiences, verbal persuasion, somatic and affective states) and how these can be prompted in behavior change interventions. This chapter reviews primary and meta-analytic research on behavior change techniques promoting self-efficacy beliefs in interventions for change in health, work, and academic contexts. It also provides practical guidelines and concrete examples on how to design and evaluate behavior change interventions that target self-efficacy.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to be used to evaluate specific aspects of the instrument performance of both X-ray and neutron powder diffractometers. This report describes SRM 640f, the seventh generation of this powder diffraction SRM, which is designed to be used primarily for calibrating powder diffractometers with respect to line position; it also can be used for the determination of the instrument profile function. It is certified with respect to the lattice parameter and consists of approximately 7.5 g of silicon powder prepared to minimize line broadening. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the lattice parameter of the Si powder. Both statistical and systematic uncertainties have been assigned to yield a certified value for the lattice parameter at 22.5 °C of a = 0.5431144 ± 0.000008 nm.
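To illustrate how a certified lattice parameter is used for line-position calibration, the sketch below converts the certified value into the expected Bragg angles of the first few Si reflections. The Cu Kα1 wavelength and the reflection list are assumptions for illustration; they are not taken from the SRM report.

```python
import math

# Certified Si lattice parameter at 22.5 C (nm), from the SRM 640f report
A_SI_NM = 0.5431144
# Assumed instrument wavelength: Cu K-alpha1 (nm); not part of the SRM report
WAVELENGTH_NM = 0.15405929

def two_theta_deg(h, k, l, a=A_SI_NM, wavelength=WAVELENGTH_NM):
    """Bragg angle (2-theta, degrees) for a cubic reflection (hkl)."""
    d = a / math.sqrt(h * h + k * k + l * l)  # cubic d-spacing
    return 2.0 * math.degrees(math.asin(wavelength / (2.0 * d)))

# First allowed reflections of the Si diamond structure;
# (1, 1, 1) comes out near 28.44 degrees for Cu K-alpha1
for hkl in [(1, 1, 1), (2, 2, 0), (3, 1, 1)]:
    print(hkl, round(two_theta_deg(*hkl), 3))
```

A calibration compares such computed positions against the observed peak centroids to characterize instrument offsets.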
To determine the Final ICU Need in the 24 hours prior to ICU discharge for children with cardiac disease by utilising a single-centre survey.
A cross-sectional survey was utilised to determine Final ICU Need, which was categorised as “Cardiovascular”, “Respiratory”, “Feeding”, “Sedation”, “Systems Issue”, or “Other” for each encounter. Survey responses were obtained from attending physicians who discharged children (≤18 years of age with ICU length of stay >24 hours) from the Cardiac ICU between April 2016 and July 2018.
Measurements and results:
Survey response rate was 99% (n = 1073), with 667 encounters eligible for analysis. “Cardiovascular” (61%) and “Respiratory” (26%) were the most frequently chosen Final ICU Needs. From a multivariable mixed effects logistic regression model fitted to “Cardiovascular” and “Respiratory”, operations with significantly reduced odds of having “Cardiovascular” Final ICU Need included Glenn palliation (p = 0.003), total anomalous pulmonary venous connection repair (p = 0.024), truncus arteriosus repair (p = 0.044), and vascular ring repair (p < 0.001). Short lengths of stay (<7.9 days) had significantly higher odds of “Cardiovascular” Final ICU Need (p < 0.001). “Cardiovascular” and “Respiratory” Final ICU Needs were also associated with provider and ICU discharge season.
Final ICU Need is a novel metric to identify variations in Cardiac ICU utilisation and clinical trajectories. Final ICU Need was significantly influenced by benchmark operation, length of stay, provider, and season. Future applications of Final ICU Need include targeting quality and research initiatives, calibrating provider and family expectations, and identifying provider-level variability in care processes and mental models.
Functional determinations of stone tools gleaned through high-magnification usewear analysis enable archaeologists to reconstruct ancient household practices and identify diversity across regional domestic economies. A systematic obsidian usewear study with 300 specimens from the site of Altica, Mexico presented here reveals that tools from the Early–Middle Formative (1250–800 cal. b.c.) occupation were used for woodworking and subsistence-related activities. The high frequency of woodworking usewear patterns can be attributed to the construction and maintenance of the newly established settlement's households and agricultural plots. Combined with previous analyses of the site's paleoethnobotanical, osteological, and isotopic datasets, the usewear data further indicate a subsistence strategy that balanced foraging and non-intensive maize agriculture. Thanks to their proximity to the Otumba source and other sites exploiting it, Altica residents were able to employ a unifunctional tool-use approach with expedient percussion tools, which contrasts the multifunctional tool-use approaches documented at other Middle Formative sites.
Lewy body dementia, consisting of both dementia with Lewy bodies (DLB) and Parkinson's disease dementia (PDD), is considerably under-recognised clinically compared with its frequency in autopsy series.
This study investigated the clinical diagnostic pathways of patients with Lewy body dementia to assess if difficulties in diagnosis may be contributing to these differences.
We reviewed the medical notes of 74 people with DLB and 72 with non-DLB dementia matched for age, gender and cognitive performance, together with 38 people with PDD and 35 with Parkinson's disease, matched for age and gender, from two geographically distinct UK regions.
The cases of individuals with DLB took longer to reach a final diagnosis (1.2 v. 0.6 years, P = 0.017), underwent more scans (1.7 v. 1.2, P = 0.002) and had more alternative prior diagnoses (0.8 v. 0.4, P = 0.002), than the cases of those with non-DLB dementia. Individuals diagnosed in one region of the UK had significantly more core features (2.1 v. 1.5, P = 0.007) than those in the other region, and were less likely to have dopamine transporter imaging (P < 0.001). For patients with PDD, more than 1.4 years prior to receiving a dementia diagnosis: 46% (12 of 26) had documented impaired activities of daily living because of cognitive impairment, 57% (16 of 28) had cognitive impairment in multiple domains, with 38% (6 of 16) having both, and 39% (9 of 23) already receiving anti-dementia drugs.
Our results show the pathway to diagnosis of DLB is longer and more complex than for non-DLB dementia. There were also marked differences between regions in the thresholds clinicians adopt for diagnosing DLB and also in the use of dopamine transporter imaging. For PDD, a diagnosis of dementia was delayed well beyond symptom onset and even treatment.
We describe a widespread laboratory surveillance program for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) at an integrated medical campus that includes a tertiary-care center, a skilled nursing facility, a rehabilitation treatment center, and temporary shelter units. We identified 22 asymptomatic cases of SARS-CoV-2 and implemented infection control measures to prevent SARS-CoV-2 transmission in congregate settings.
Major depression (MD) is often characterised as a categorical disorder; however, observational studies comparing sub-threshold and clinical depression suggest MD is continuous. Many of these studies do not explore the full continuum and are yet to consider genetics as a risk factor. This study sought to understand if polygenic risk for MD could provide insight into the continuous nature of depression.
Factor analysis on symptom-level data from the UK Biobank (N = 148 957) was used to derive continuous depression phenotypes which were tested for association with polygenic risk scores (PRS) for a categorical definition of MD (N = 119 692).
Confirmatory factor analysis showed a five-factor hierarchical model, incorporating 15 of the original 18 items taken from the PHQ-9, GAD-7 and subjective well-being questionnaires, produced good fit to the observed covariance matrix (CFI = 0.992, TLI = 0.99, RMSEA = 0.038, SRMR = 0.031). MD PRS associated with each factor score (standardised β range: 0.057–0.064) and the association remained when the sample was stratified into case- and control-only subsets. The case-only subset had an increased association compared to controls for all factors, shown via a significant interaction between lifetime MD diagnosis and MD PRS (p value range: 2.23 × 10⁻³–3.94 × 10⁻⁷).
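As a hedged sketch of what a standardised β on this scale means (simulated data, not UK Biobank): when both the polygenic score and a factor score are z-scored, the simple regression slope equals the Pearson correlation, so effects around 0.06 are correlation-scale effect sizes.

```python
import math
import random

# Simulated illustration only: PRS and factor score with a true effect ~0.06
random.seed(0)
n = 5000
prs = [random.gauss(0, 1) for _ in range(n)]
factor = [0.06 * x + random.gauss(0, 1) for x in prs]

def zscore(values):
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))
    return [(v - mean) / sd for v in values]

zp, zf = zscore(prs), zscore(factor)
# Slope of zf on zp; with standardised variables this equals Pearson's r
beta = sum(p * f for p, f in zip(zp, zf)) / (n - 1)
print(f"standardised beta ≈ {beta:.3f}")
```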
An association between MD PRS and a continuous phenotype of depressive symptoms in case- and control-only subsets provides support against a purely categorical phenotype; indicating further insights into MD can be obtained when this within-group variation is considered. The stronger association within cases suggests this variation may be of particular importance.
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, all these treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury is attributable mostly to 2,4-D in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increased rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern for sweetpotato producers. However, in some cases, yield reduction of U.S. No. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
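For concreteness, the fractional rates used in the study translate into the following absolute doses, a simple arithmetic sketch computed from the stated 1× use rates:

```python
# Absolute doses implied by the fractional rates, computed from the stated
# 1x use rates: 2,4-D at 1.05 kg/ha and glyphosate at 1.12 kg/ha
RATE_24D, RATE_GLY = 1.05, 1.12  # kg/ha, 1x anticipated field use rates

for f in [10, 100, 250, 500, 750, 1000]:
    print(f"1/{f}x: 2,4-D {RATE_24D / f:.5f} kg/ha, "
          f"glyphosate {RATE_GLY / f:.5f} kg/ha")
```

So even the 1/10× drift-simulation rate delivers only about 0.105 kg/ha of 2,4-D, an order of magnitude below the field rate.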
Impairments in social cognition contribute significantly to disability in schizophrenia patients (SzP). Perception of facial expressions is critical for social cognition. Intact perception requires an individual to visually scan a complex dynamic social scene for transiently moving facial expressions that may be relevant for understanding the scene. The relationship of visual scanning for these facial expressions and social cognition remains unknown.
In 39 SzP and 27 healthy controls (HC), we used eye-tracking to examine the relationship between performance on The Awareness of Social Inference Test (TASIT), which tests social cognition using naturalistic video clips of social situations, and visual scanning, measuring each individual's scanning pattern relative to the mean of HC. We then examined the relationship of visual scanning to the specific visual features (motion, contrast, luminance, faces) within the video clips.
TASIT performance was significantly impaired in SzP for trials involving sarcasm (p < 10⁻⁵). Visual scanning was significantly more variable in SzP than HC (p < 10⁻⁶), and predicted TASIT performance in HC (p = 0.02) but not SzP (p = 0.91), differing significantly between groups (p = 0.04). During visual scanning, SzP were less likely to be viewing faces (p = 0.0001) and less likely to saccade to facial motion in peripheral vision (p = 0.008).
SzP show highly significant deficits in the use of visual scanning of naturalistic social scenes to inform social cognition. Alterations in visual scanning patterns may originate from impaired processing of facial motion within peripheral vision. Overall, these results highlight the utility of naturalistic stimuli in the study of social cognition deficits in schizophrenia.
Accurate near-field measurements for characterizing either deterministic or stochastic electromagnetic fields require a process that removes the influence of the probes, transmission lines, and measurement circuits. The main part of the experimental work presented here relates to a calibration procedure for a test setup consisting of a microstrip test structure and a scanning loop probe. The calibration characteristic, obtained by comparing measured and simulated results, is then used to convert the measured voltage into the magnetic field across and along the microstrip line at a specific height above it. By performing measurements and simulations of the same test structure with the loop probe in the presence of an additional scanning probe, the influence of the additional probe on the measured output is thoroughly investigated and relevant corrections are given. These corrections can be important when two-point correlation measurements are required, especially at scanning points where the two probes are close to each other.
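A minimal sketch of the voltage-to-field conversion step, assuming a scalar, frequency-dependent probe calibration factor; all numbers below are invented for illustration and do not come from the paper.

```python
# Hypothetical probe calibration: the factor obtained by comparing measured
# and simulated results converts the loop-probe output voltage into a
# magnetic field strength. All values are invented examples.
cal_factor_v_per_a_m = {1.0e9: 2.4e-3, 2.0e9: 3.1e-3}  # V per (A/m), assumed
v_measured = 1.55e-4  # V, probe output at one scan point (assumed)
freq_hz = 1.0e9

h_field = v_measured / cal_factor_v_per_a_m[freq_hz]  # A/m
print(f"H = {h_field:.4f} A/m at {freq_hz / 1e9:.0f} GHz")
```

In practice the factor varies with frequency and probe position, which is why the paper derives it from a measured-versus-simulated comparison over the scan region.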
A major concern of sweetpotato producers is the potential negative effects from herbicide drift or sprayer contamination events when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects of reduced rates of N,N-Bis-(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or a combination of these individually in separate trials with glyphosate on sweetpotato. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and a combination of the two at aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury is most attributable to the dicamba in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) observed with an increased herbicide rate of dicamba applied alone or in combination with glyphosate applied at storage root development. However, with a few exceptions, neither this relationship nor the significance of herbicide rate was observed on crop injury or sweetpotato yield when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern for sweetpotato producers.
However, in some cases yield reduction of U.S. No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× application rates of dicamba alone or with glyphosate when applied at storage root development.
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, metabolic lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
Acute cannabis administration can produce transient psychotic-like effects in healthy individuals. However, the mechanisms through which this occurs and which factors predict vulnerability remain unclear. We investigate whether cannabis inhalation leads to psychotic-like symptoms and speech illusion; and whether cannabidiol (CBD) blunts such effects (study 1) and adolescence heightens such effects (study 2).
Two double-blind placebo-controlled studies assessed speech illusion in a white noise task and psychotic-like symptoms on the Psychotomimetic States Inventory (PSI). Study 1 compared effects of Cann-CBD (cannabis containing Δ-9-tetrahydrocannabinol (THC) and negligible levels of CBD) with Cann+CBD (cannabis containing THC and CBD) in 17 adults. Study 2 compared effects of Cann-CBD in 20 adolescents and 20 adults. All participants were healthy individuals who currently used cannabis.
In study 1, relative to placebo, both Cann-CBD and Cann+CBD increased PSI scores but not speech illusion. No differences between Cann-CBD and Cann+CBD emerged. In study 2, relative to placebo, Cann-CBD increased PSI scores and incidence of speech illusion, with the odds of experiencing speech illusion 3.1 (95% CIs 1.3–7.2) times higher after Cann-CBD. No age group differences were found for speech illusion, but adults showed heightened effects on the PSI.
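An odds ratio with a confidence interval like the one reported above is conventionally derived from a logistic-regression coefficient and its standard error. A hedged sketch follows; the standard error is invented for illustration (chosen so the numbers land near the reported magnitudes), not taken from the study.

```python
import math

# Standard back-transformation from a logistic-regression coefficient to an
# odds ratio with a 95% Wald confidence interval. The coefficient and SE
# below are invented for illustration, not the study's estimates.
beta = math.log(3.1)  # log-odds ratio (assumed)
se = 0.43             # standard error of beta (assumed)

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR = {odds_ratio:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```

Note how the interval is symmetric on the log-odds scale but asymmetric once exponentiated, which is why reported CIs around an OR are typically skewed.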
Inhalation of cannabis reliably increases psychotic-like symptoms in healthy cannabis users and may increase the incidence of speech illusion. CBD did not influence psychotic-like effects of cannabis. Adolescents may be less vulnerable to acute psychotic-like effects of cannabis than adults.
It is not unusual for the cardiac anaesthetist to encounter adults with palliated, corrected or newly diagnosed congenital heart disease (CHD). It is essential, therefore, that the anaesthetist has an appreciation of the types of CHD, surgical procedures and perioperative management.
While theoretical, analytical, and methodological issues surrounding research on generations and generational differences at work have been thoroughly discussed, one topic that has received far less attention is the extent to which the inferences suggested by this research are appropriate. Therefore, the purpose of this effort is to review the recent-generations literature, identify the commonly represented inferences, and offer a critical review of the appropriateness of each. A qualitative review of the last ten years of published research found four main inferences: (1) organizations should adopt customized HR policies, (2) intergenerational conflict is inevitable, (3) generations should be led differently, and (4) the benefits of capitalizing on generational strengths. These inferences are critiqued using several different lenses including legal, methodological, practice, and theoretical. Our conclusion is that these inferences are not supported by the literature and that organizations should instead focus on broader work and workplace trends.
Clinical diagnostics in sudden onset disasters have historically been limited. We set out to design, implement, and evaluate a mobile diagnostic laboratory accompanying a type 2 emergency medical team (EMT) field hospital.
Available diagnostic platforms were reviewed and selected against in-field need. Platforms included HemoCue301/WBC DIFF, i-STAT, BIOFIRE FILMARRAY multiplex rt-PCR, Olympus BX53 microscopy, ABO/Rh grouping, and specific rapid diagnostic tests. This equipment was trialed in Katherine, Australia, and Dili, Timor-Leste.
During the initial deployment, an evaluation of FilmArray tests was successful using blood culture identification, gastrointestinal, and respiratory panels. HemoCue301 (n = 20) hemoglobin values were compared on the Sysmex XN 550 (r = 0.94). HemoCue WBC DIFF showed some variation, depending on cell type, when compared with the Sysmex XN 550 (r = 0.88–0.16). i-STAT showed nonsignificant differences against the Vitros 250. Further evaluation of FilmArray in Dili, Timor-Leste, diagnosed 117 pathogens on 168 FilmArray pouches, including 25 separate organisms on blood culture and 4 separate cerebrospinal fluid pathogens.
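The method-comparison statistic reported above is Pearson's r between paired readings from the two analysers. A minimal sketch on fabricated haemoglobin values (not the deployment's data):

```python
import math

# Fabricated paired haemoglobin readings (g/L) from two analysers,
# illustrating the correlation-based method comparison reported above
hemocue = [132, 118, 145, 101, 126, 139, 110, 122]
sysmex  = [130, 120, 147, 103, 124, 141, 108, 125]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(f"r = {pearson_r(hemocue, sysmex):.2f}")
```

A high r indicates the two devices rank and scale samples similarly, though agreement studies often supplement it with a bias analysis since correlation alone does not capture systematic offsets.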
This mobile laboratory represents a major advance in sudden onset disaster response. Setup of the service was quick (<24 hr) and transport to site rapid. Future deployment with EMT2 field hospitals in fragmented health systems after sudden onset disasters will now allow broader diagnostic capability.
This paper describes the adaptation and psychometric evaluation of the Hungarian version of the Quality of Life in Depression Scale. The adaptation procedure involved bilingual translation, field-testing for face and content validity, and assessment of the instrument's reliability and construct validity. The new language version was shown to be well accepted by respondents and to have excellent psychometric properties.