Deep neural networks (DNNs) have had extraordinary successes in classifying photographic images of objects and are often described as the best models of biological vision. This conclusion is largely based on three sets of findings: (1) DNNs are more accurate than any other model in classifying images taken from various datasets, (2) DNNs do the best job in predicting the pattern of human errors in classifying objects taken from various behavioral datasets, and (3) DNNs do the best job in predicting brain signals in response to images taken from various brain datasets (e.g., single cell responses or fMRI data). However, these behavioral and brain datasets do not test hypotheses regarding what features are contributing to good predictions and we show that the predictions may be mediated by DNNs that share little overlap with biological vision. More problematically, we show that DNNs account for almost no results from psychological research. This contradicts the common claim that DNNs are good, let alone the best, models of human object recognition. We argue that theorists interested in developing biologically plausible models of human vision need to direct their attention to explaining psychological findings. More generally, theorists need to build models that explain the results of experiments that manipulate independent variables designed to test hypotheses rather than compete on making the best predictions. We conclude by briefly summarizing various promising modelling approaches that focus on psychological data.
Supplemental food from anthropogenic sources is a source of conflict with humans for many wildlife species. Food-seeking behaviours by black bears Ursus americanus and brown bears Ursus arctos can lead to property damage, human injury and mortality of the offending bears. Such conflicts are a well-known conservation management issue wherever people live in bear habitats. In contrast, the use of anthropogenic foods by the polar bear Ursus maritimus is less common historically but is a growing conservation and management issue across the Arctic. Here we present six case studies that illustrate how negative food-related interactions between humans and polar bears can become either chronic or ephemeral and unpredictable. Our examination suggests that attractants are an increasing problem, exacerbated by climate change-driven sea-ice losses that cause increased use of terrestrial habitats by bears. Growing human populations and increased human visitation increase the likelihood of human–polar bear conflict. Efforts to reduce food conditioning in polar bears include attractant management, proactive planning and adequate resources for northern communities to reduce conflicts and improve human safety. Permanent removal of unsecured sources of nutrition, to reduce food conditioning, should begin immediately at the local level as this will help to reduce polar bear mortality.
Earth is rapidly losing free-living species. Is the same true for parasitic species? To reveal temporal trends in biodiversity, historical data are needed, but often such data do not exist for parasites. Here, parasite communities of the past were reconstructed by identifying parasites in fluid-preserved specimens held in natural history collections. Approximately 2500 macroparasites were counted from 109 English Sole (Parophrys vetulus) collected between 1930 and 2019 in the Salish Sea, Washington, USA. Alpha and beta diversity were measured to determine if and how diversity changed over time. Species richness of parasite infracommunities and community dispersion did not vary over time, but community composition of decadal component communities varied significantly over the study period. Community dissimilarity also varied: prior to the mid-20th century, parasites shifted in abundance in a seemingly stochastic manner and, after this time period, a canalization of community change was observed, where species' abundances began to shift in consistent directions. Further work is needed to elucidate potential drivers of these changes and to determine if these patterns are present in the parasite communities of other fishes of the Salish Sea.
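As a generic illustration of the alpha and beta diversity measures described above (not the authors' code), the sketch below computes infracommunity richness and a Bray–Curtis dissimilarity between two decadal component communities; the count tables and decade labels are invented, not the English Sole data.

```python
# Minimal sketch: species richness (alpha diversity) and Bray-Curtis
# dissimilarity (beta diversity) for hypothetical parasite-count tables,
# one row per host fish, one column per parasite taxon.
import numpy as np
from scipy.spatial.distance import braycurtis

rng = np.random.default_rng(0)
counts_1930s = rng.poisson(2.0, size=(10, 8))   # hypothetical decadal samples
counts_2010s = rng.poisson(2.0, size=(10, 8))

# Alpha diversity: number of parasite taxa per host (infracommunity richness).
richness_1930s = (counts_1930s > 0).sum(axis=1)
richness_2010s = (counts_2010s > 0).sum(axis=1)

# Beta diversity: Bray-Curtis dissimilarity between decadal component
# communities (taxon abundances summed over hosts within each decade).
community_1930s = counts_1930s.sum(axis=0)
community_2010s = counts_2010s.sum(axis=0)
dissimilarity = braycurtis(community_1930s, community_2010s)

print(richness_1930s.mean(), richness_2010s.mean(), round(dissimilarity, 3))
```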
Real-World Evidence is useful for validating crossover adjustment approaches, particularly when adjustment is required because a trial does not accurately reflect a health technology assessment (HTA)-relevant population. We use the advanced-stage mycosis fungoides and Sézary syndrome cutaneous T-cell lymphoma population from the MAVORIC trial, together with data from the Hospital Episode Statistics, to explore and validate crossover adjustment methods.
Introduction
The MAVORIC trial compared mogamulizumab with vorinostat in patients with mycosis fungoides (MF) or Sézary syndrome (SS), subtypes of cutaneous T-cell lymphoma. However, the treatment comparison within MAVORIC may not represent an HTA-relevant population from a UK perspective: (i) 72.6 percent of patients randomized to vorinostat switched to mogamulizumab, and (ii) vorinostat is not used in current clinical practice in the UK. This study explores adjustment of treatment effect estimates using different crossover adjustment methods and Real-World Evidence.
Methods
An advanced-stage (stage ≥IIB MF and all SS) population was included. Three methods were considered for treatment crossover adjustment. A synthetic control arm was created using the Hospital Episode Statistics (HES) dataset. Predicted survival for the MAVORIC control arm, post-crossover adjustment, was compared with the HES data to inform the selection of the most appropriate adjustment method. A direct comparison between mogamulizumab (reweighted to represent the distribution of MF/SS patients in the HES) and the synthetic control was also conducted.
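As a rough sketch of the inverse probability of censoring weighting (IPCW) idea behind one such crossover adjustment, the snippet below censors control patients at switch, reweights unswitched patients by the inverse probability of remaining unswitched, and fits a weighted Cox model. It is a simplified, baseline-covariate version on synthetic data; the column names, covariates, and weight model are hypothetical and not the MAVORIC analysis (the full method uses time-updated weights).

```python
# Simplified, baseline-covariate sketch of IPCW crossover adjustment.
# All data and column names are synthetic/hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "arm": rng.choice(["mogamulizumab", "vorinostat"], n),
    "age": rng.normal(65, 10, n),
    "time": rng.exponential(24, n),        # months of follow-up (synthetic)
    "event": rng.integers(0, 2, n),        # 1 = died, 0 = censored
    "switched": 0,
})
ctrl = df["arm"] == "vorinostat"
df.loc[ctrl, "switched"] = rng.integers(0, 2, ctrl.sum())

control = df[ctrl].copy()
# Censor control patients at crossover (here simply at half their follow-up).
sw = control["switched"] == 1
control.loc[sw, "event"] = 0
control.loc[sw, "time"] = control.loc[sw, "time"] * 0.5

# Model the probability of remaining on vorinostat from baseline covariates;
# unswitched patients are up-weighted by the inverse of that probability.
lr = LogisticRegression().fit(control[["age"]], 1 - control["switched"])
p_stay = lr.predict_proba(control[["age"]])[:, 1]
control["ipcw"] = np.where(sw, 1.0, 1.0 / p_stay)

analysis = pd.concat([df[~ctrl].assign(ipcw=1.0), control])
analysis["treat"] = (analysis["arm"] == "mogamulizumab").astype(int)

cph = CoxPHFitter()
cph.fit(analysis[["time", "event", "treat", "ipcw"]],
        duration_col="time", event_col="event",
        weights_col="ipcw", robust=True)
print(cph.hazard_ratios_)   # weighted HR: mogamulizumab vs vorinostat
```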
Results
Following crossover adjustment of the vorinostat arm, using the inverse probability of censoring weighting method, the overall survival (OS) hazard ratio (HR) estimate for mogamulizumab vs. vorinostat was 0.45 (95% confidence interval (CI): 0.19, 1.07). This adjustment method was considered the most appropriate based on an assessment of assumptions and a comparison of OS between the adjusted vorinostat data and the HES data. The OS HR estimate for reweighted mogamulizumab vs. synthetic control from HES was 0.33 (CI: 0.21, 0.50).
Conclusions
Real-World Evidence from the HES database can be used to validate crossover adjustment methods and to better reflect current clinical practice in the UK. The results of the two approaches support each other.
Patient and public involvement (PPI) plays a crucial role in ensuring research is carried out in conjunction with the people it will affect. In this article, we present our experiences and reflections from working collaboratively with patients and the public throughout the lifetime of a National Institute for Health Research (NIHR) programme grant, the Chronic Headache Education and Self-management Study (CHESS), which took place between 2015 and 2020.
PPI over the course of CHESS:
We worked closely with three leading UK migraine charities and a lay advisory group throughout the programme. We followed NIHR standards and used the Guidance for Reporting Involvement of Patients and the Public checklist. We consulted our PPI contacts using a variety of methods depending on the phase of the study and the nature of the request. This included emails, discussions, and face-to-face contact.
PPI members contributed throughout the study: in programme development, the grant application, ethics documentation, and trial oversight. During the feasibility study, they supported the development of a classification interview for chronic headache by participating in a headache classification conference, assessed the relevance and acceptability of patient-reported outcome measures by helping to analyse cognitive interview data, and tested the smartphone application, suggesting how best to present the summary of collected data for participants. As a result of PPI contributions, the content and duration of the study intervention were adapted, and a Delphi study with a consensus meeting developed a core outcome set for migraine studies.
Conclusions:
The involvement of the public and patients in CHESS allowed us to shape its overall design, develop the intervention, and establish a core outcome set for future migraine studies. We have reflected on many learning points for the future application of PPI.
Social anxiety disorder (SoAD) in youth is often treated with a generic form of cognitive behavioural therapy (CBT). Some studies have suggested that primary SoAD is associated with lower recovery rates following generic CBT compared with other anxiety disorders.
Aims:
This systematic review and meta-analysis investigated recovery rates following generic CBT for youth with primary SoAD versus other primary anxiety disorders.
Method:
Five databases (PsycINFO, Web of Science, PubMed, Embase, Medline) were searched for randomised controlled trials of generic CBT for child and/or adolescent anxiety.
Results:
Ten trials met criteria for inclusion in the systematic review, six of which presented sufficient data for inclusion in the meta-analysis. Sixty-seven trials did not report data on recovery rates relative to primary diagnosis. While most individual studies included in the systematic review were not sufficiently powered to detect a difference in recovery rates between diagnoses, there was a pattern of lower recovery rates for youth with primary SoAD. Across the trials included in the meta-analysis, the post-CBT recovery rate from primary SoAD (35%) was significantly lower than the recovery rate from other primary anxiety disorders (54%).
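As a generic sketch of how trial-level recovery rates can be pooled into a single comparison, the snippet below performs a fixed-effect, inverse-variance pooling of recovery odds ratios (SoAD vs other anxiety disorders); the per-trial counts are made up and are not the six included studies.

```python
# Generic inverse-variance pooling of log odds ratios for recovery;
# all counts are hypothetical, for illustration only.
import numpy as np
from scipy import stats

# Per trial: (recovered_soad, n_soad, recovered_other, n_other) -- invented.
trials = [(7, 20, 30, 55), (5, 18, 22, 40), (9, 25, 35, 60)]

log_or, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    lo = np.log((a * d) / (b * c))           # log odds ratio of recovery
    var = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf variance of the log OR
    log_or.append(lo)
    weights.append(1 / var)

pooled = np.average(log_or, weights=weights)
se = np.sqrt(1 / np.sum(weights))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se)
p = 2 * stats.norm.sf(abs(pooled / se))
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, p = {p:.3f}")
```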
Conclusions:
Recovery from primary SoAD is significantly less likely than recovery from any other primary anxiety disorder following generic CBT in youth. This suggests a need for research to enhance the efficacy of CBT for youth SoAD.
Background: Long-term acute-care hospitals (LTACHs) are disproportionately burdened by multidrug-resistant organisms (MDROs) like KPC-Kp. Although cohorting KPC-Kp+ patients into rooms with other carriers can be an outbreak-control strategy and may protect negative patients from colonization, it is unclear whether cohorted patients are at unintended increased risk of cross colonization with additional KPC-Kp strains. Methods: Cohorting KPC-Kp+ patients at admission into rooms with other positive patients was part of a bundled intervention that reduced transmission in a high-prevalence LTACH. Rectal surveillance culturing for KPC-Kp was performed at the start of the study, upon admission, and biweekly thereafter, capturing 94% of patients. We evaluated whole-genome sequencing (WGS) evidence of acquisition of distinct KPC-Kp strains in a convenience sample of patients positive for KPC-Kp at study start or admission to identify plausible secondary KPC-Kp acquisitions. Results: WGS multilocus sequence type (MLST) strain variability was observed among the 452 isolates from the 254 patients colonized by KPC-Kp (Fig. 1). Among the 32 patients who were positive at the beginning of the study or admission and had a secondary isolate collected at a later date (median, 89 days apart; range, 2–310 days), 17 (53%) had secondary isolates differing by MLST from their admission isolate. Although 60% of the KPC-Kp in the study was ST258, there was substantial genomic variation within ST258 isolates from the same patient (range, 0–102 genetic variants), suggesting multiple acquisitions of distinct ST258 isolates. Among the 17 patients who imported ST258 and had ST258 isolated again later, 11 (65%) carried secondary isolates genetically closer to isolates from other importing patients than to their own ST258 (Fig. 2). Examination of spatiotemporal exposures among patients with evidence of multiple acquisitions revealed that 11 (65%) patients with multiple MLSTs shared a room with a patient who was colonized with an isolate matching the secondary MLST, and 6 (35%) patients who carried multiple distinct ST258 isolates shared a room with a patient who imported these closely related isolates prior to secondary acquisition. Conclusions: Half of patients who imported KPC-Kp and had multiple isolates available had genomically supported secondary acquisitions linked to roommates who carried the acquired strains. Although cohorting is intended to protect negative patients from acquiring MDROs, this practice may promote multiple strain acquisitions by colonized patients in the cohort, potentially prolonging the period of MDRO carriage and increasing time at risk of infection. Our findings add to the debate about single-patient rooms, which may be preferred to cohorts to minimize potential harms by reducing MDRO transmission.
Cohorting patients who are colonized or infected with multidrug-resistant organisms (MDROs) protects uncolonized patients from acquiring MDROs in healthcare settings. However, the potential for cross transmission within the cohort and the possibility of colonized patients acquiring secondary isolates with additional antibiotic resistance traits are often neglected. We searched for evidence of cross transmission of KPC+ Klebsiella pneumoniae (KPC-Kp) colonization among cohorted patients in a long-term acute-care hospital (LTACH), and we evaluated the impact of secondary acquisitions on resistance potential.
Design:
Genomic epidemiological investigation.
Setting:
A high-prevalence LTACH during a bundled intervention that included cohorting KPC-Kp–positive patients.
Methods:
Whole-genome sequencing (WGS) and location data were analyzed to identify potential cases of cross transmission between cohorted patients.
Results:
Secondary KPC-Kp isolates from 19 of 28 admission-positive patients were more closely related to another patient’s isolate than to their own admission isolate. Of these 19 cases, 14 showed strong genomic evidence for cross transmission (<10 single nucleotide variants or SNVs), and most of these patients occupied shared cohort floors (12 patients) or rooms (4 patients) at the same time. Of the 14 patients with strong genomic evidence of acquisition, 12 acquired antibiotic resistance genes not found in their primary isolates.
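The key genomic comparison here, whether a patient's later isolate is closer (by single nucleotide variant distance) to another patient's isolate than to their own admission isolate, and whether it falls within the <10-SNV threshold, can be sketched as below; the pairwise distance matrix, isolate labels, and threshold usage are illustrative, not the study's pipeline.

```python
# Illustrative check: is the secondary isolate closer (by SNV distance) to
# another patient's isolate than to its own admission isolate, and within
# the <10-SNV threshold treated as strong evidence of cross transmission?
# The distance matrix and labels are made up.
import numpy as np

isolates = ["P1_adm", "P1_sec", "P2_adm", "P3_adm"]
snv = np.array([
    [0, 85, 90, 40],
    [85, 0, 6, 95],
    [90, 6, 0, 88],
    [40, 95, 88, 0],
])   # hypothetical pairwise SNV distances

own_adm, secondary, others = 0, 1, [2, 3]
d_own = snv[secondary, own_adm]
d_other = snv[secondary, others].min()
closest = isolates[others[int(np.argmin(snv[secondary, others]))]]

if d_other < d_own and d_other < 10:
    print(f"P1 secondary isolate: plausible cross transmission from {closest} "
          f"({d_other} SNVs vs {d_own} to own admission isolate)")
```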
Conclusions:
Acquisition of secondary KPC-Kp isolates carrying distinct antibiotic resistance genes was detected in nearly half of cohorted patients. These results highlight the importance of healthcare provider adherence to infection prevention protocols within cohort locations, and they indicate the need for future studies to assess whether multiple-strain acquisition increases risk of adverse patient outcomes.
To determine how children interpret terms related to food processing; whether their categorisation of foods according to processing level is consistent with the classifications used in research; and whether they associate the degree of processing with healthfulness.
Design:
Qualitative data were collected from ten focus groups. Focus groups were audio-recorded and transcribed verbatim, and thematic analysis was conducted.
Setting:
Four elementary and afterschool programmes in a large, urban school district in the USA that served predominantly low-income, racial/ethnic minority students.
Participants:
Children, 9–12 years old, in the fourth–sixth grades (n 53).
Results:
The sample was 40 % male, 47 % Hispanic with a mean age of 10·4 ± 1·1 years. Children’s understanding of unprocessed foods was well aligned with research classifications, while concordance of highly processed foods with research categorisations varied. Five primary themes regarding the way children categorised foods according to their processing level emerged: type and amount of added ingredients; preparation method; packaging and storage; change in physical state or sensory experience; and growing method. Most children associated processing level with healthfulness, describing unprocessed foods as healthier. The most common reason provided for the unhealthfulness of processed foods was added ingredients, including ‘chemicals’ and ‘sugar’.
Conclusions:
The current study demonstrated that children have a working knowledge of processing that could be leveraged to encourage healthier eating patterns; however, their understanding is not always consistent with the classification systems used in research. The vocabulary used by researchers and consumers to talk about processing must be reconciled to translate findings into actionable messages.
The Everyday Cognition (ECog) scales measure cognitively based everyday abilities across multiple domains that are affected early in the course of neurodegenerative disorders such as Alzheimer’s disease. However, the degree to which the ECog may be differentially influenced by ethnic/racial background is unknown. This study evaluates measurement invariance of the ECog across non-Hispanic White (NHW), Black, and Hispanic individuals.
Methods:
Participants included 1177 NHW, 243 Black, and 216 Hispanic older adults from the UC Davis Alzheimer’s Disease Center Cohort with available ECog data. Differential item functioning (DIF) for each ECog domain was evaluated separately for Black and Hispanic participants compared to NHW participants. An iterative multiple-group confirmatory factor analysis approach for ordinal scores was used to identify items whose measurement properties differed across groups and to adjust scores for DIF. Adjusted scores were then evaluated to test whether they were more strongly associated with cognitive function (concurrent and longitudinal change in cognition) and brain volumes (measured by brain imaging).
Results:
Varying levels, patterns, and impacts of DIF were found across domains and groups. However, the impact of DIF was relatively small, and DIF effects on scores generally were less than one-half standard error of measurement. There were no meaningful differences in associations with cognition and brain injury between DIF adjusted and unadjusted scores.
Conclusions:
Varying patterns of DIF were observed for Black and Hispanic participants on select ECog domains. Overall, DIF effects were relatively small and did not change the relationship between the ECog and other indicators of disease.
This research explores media reporting of Indigenous students’ Programme for International Student Assessment (PISA) results in two national and 11 metropolitan Australian newspapers from 2001 to 2015. Of almost 300 articles on PISA, only 10 focused on reporting of Indigenous PISA results. While general or non-Indigenous PISA results featured in media reports, especially at the time of the publication of PISA results, there was overwhelming neglect of Indigenous results and the performance gap. A thematic analysis of articles showed that mainstream PISA reporting included critical commentary that was not found in the Indigenous PISA articles. The three themes identified were: a lack of teacher quality in remote and rural schools; the debate on the Gonski funding recommendations; and the PISA achievement gap between Indigenous and non-Indigenous students. This study concluded that the overwhelming neglect is linked to media bias, which continues to drive mainstream media coverage of Indigenous Australians.
The cognitive process of worry, which keeps negative thoughts in mind and elaborates the content, contributes to the occurrence of many mental health disorders. Our principal aim was to develop a straightforward measure of general problematic worry suitable for research and clinical treatment. Our secondary aim was to develop a measure of problematic worry specifically concerning paranoid fears.
Methods
An item pool concerning worry in the past month was evaluated in 250 non-clinical individuals and 50 patients with psychosis in a worry treatment trial. Exploratory factor analysis and item response theory (IRT) informed the selection of scale items. IRT analyses were repeated with the scales administered to 273 non-clinical individuals, 79 patients with psychosis and 93 patients with social anxiety disorder. Other clinical measures were administered to assess concurrent validity. Test-retest reliability was assessed with 75 participants. Sensitivity to change was assessed with 43 patients with psychosis.
Results
A 10-item general worry scale (Dunn Worry Questionnaire; DWQ) and a five-item paranoia worry scale (Paranoia Worries Questionnaire; PWQ) were developed. All items were highly discriminative (DWQ a = 1.98–5.03; PWQ a = 4.10–10.7), indicating that small increases in latent worry lead to a high probability of item endorsement. The DWQ was highly informative across a wide range of the worry distribution, whilst the PWQ had the greatest precision at clinical levels of paranoia worry. The scales demonstrated excellent internal reliability, test-retest reliability, concurrent validity and sensitivity to change.
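To make the reported discrimination (a) values concrete, the sketch below evaluates a two-parameter logistic (2PL) item response function and its Fisher information for items with discriminations in the reported ranges; the difficulty (b) values are invented for illustration.

```python
# 2PL item characteristic curve and Fisher information: highly discriminative
# items (large a) concentrate information near their difficulty b. The a
# values echo the reported ranges; the b values are invented.
import numpy as np

def p_endorse(theta, a, b):
    """Probability of endorsing an item at latent worry level theta (2PL)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item: a^2 * P * (1 - P)."""
    p = p_endorse(theta, a, b)
    return a ** 2 * p * (1.0 - p)

theta = np.linspace(-3, 3, 7)
for a, b, label in [(2.0, 0.0, "DWQ-like item"), (8.0, 1.5, "PWQ-like item")]:
    print(label, np.round(item_information(theta, a, b), 2))
# The high-a item yields a sharp information peak near b, consistent with the
# PWQ's precision at clinical levels of paranoia worry.
```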
Conclusions
The new measures of general problematic worry and worry about paranoid fears have excellent psychometric properties.
Metal–insulator–metal (MIM) resonant absorbers comprise a conducting ground plane, a dielectric of thickness t, and thin separated metal top-surface structures of dimension l. The fundamental resonance wavelength is predicted by an analytic standing-wave model based on t, l, and the dielectric refractive index spectrum. For the dielectrics SiO2, AlN, and TiO2, values for l of a few microns give fundamental resonances in the 8–12 μm long-wave infrared (LWIR) wavelength region. Agreement with theory is better for t/l exceeding 0.1. Harmonics at shorter wavelengths were already known, but we show that there are additional resonances in the far-infrared 20–50 μm wavelength range in MIM structures designed to have LWIR fundamental resonances. These new resonances are consistent with the model if far-IR dispersion features in the index spectrum are considered. LWIR fundamental absorptions are experimentally shown to be optimized for a ratio t/l of 0.1 to 0.3 for SiO2- and AlN-based MIM absorbers, respectively, with TiO2-based MIM optimized at an intermediate ratio.
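A minimal sketch of the standing-wave picture, assuming a simplified form of the resonance condition, λ_m ≈ 2 n(λ) l / m, solved self-consistently against a dispersive index (the full model also involves the thickness t): with a toy index spectrum containing a strong far-IR feature, rather than measured SiO2/AlN/TiO2 data, extra solutions appear in the 20–50 μm range alongside the LWIR fundamental.

```python
# Self-consistent solution of the simplified standing-wave condition
# lambda = 2 * n(lambda) * l / m for a MIM absorber. The index spectrum
# n(lambda) is a toy dispersion curve, not measured dielectric data;
# l is the top-patch dimension.
import numpy as np

l_um = 2.4   # patch dimension in microns (illustrative)

def n_index(lam_um):
    """Toy refractive-index spectrum with a strong far-IR dispersion feature near 22 um."""
    return 1.9 + 6.0 / (1.0 + ((lam_um - 22.0) / 2.0) ** 2)

lam = np.linspace(2.0, 50.0, 20000)
for m in (1, 2, 3):
    residual = lam - 2.0 * n_index(lam) * l_um / m
    crossings = np.where(np.diff(np.sign(residual)) != 0)[0]
    for i in crossings:
        print(f"m={m}: resonance near {lam[i]:.1f} um")
# Dispersion features in n(lambda) create extra sign changes of the residual,
# i.e. additional far-IR resonances beyond the LWIR fundamental and its harmonics.
```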
Metal–insulator–metal (MIM) resonant absorbers comprise a conducting ground plane, a thin dielectric, and thin separated metal top-surface structures. The dielectric SiO2 strongly absorbs near 9 µm wavelength and has correspondingly strong long-wave-infrared (LWIR) dispersion for the refractive index. This dispersion results in multiple absorption resonances spanning the LWIR, which can enhance broad-band sensitivity for LWIR bolometers. Similar considerations apply to silicon nitride Si3N4. TiO2 and AlN have comparatively low dispersion and give simple single LWIR resonances. These dispersion-dependent features for infrared MIM devices are demonstrated by experiment, electrodynamic simulation, and an analytic model based on standing waves.
Vanadium oxide has application to infrared bolometers due to its high temperature coefficient of resistivity (TCR). It has attracted interest for switchable plasmonic devices due to its metal-insulator transition near room temperature. We report here the properties of vanadium oxide deposited by an aqueous spray process. The films have a ropy surface morphology with ∼70 nm surface roughness. The polycrystalline phase depends on annealing conditions. The films have a TCR of ∼2%/deg, which compares well with sputtered films. Only weak evidence is found for an insulator-metal phase transition in these films.
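As a small numerical illustration of the quoted TCR of roughly 2 %/deg: since TCR = (1/R) dR/dT = d(ln R)/dT, a linear fit of ln R versus temperature recovers it; the R(T) points below are synthetic, not the measured films.

```python
# Estimate the temperature coefficient of resistance/resistivity,
# TCR = (1/R) dR/dT = d(ln R)/dT, from synthetic R(T) data (not the
# measured films): a linear fit of ln R versus T gives TCR directly.
import numpy as np

T = np.arange(290.0, 321.0, 5.0)              # temperature, K
R = 1.0e5 * np.exp(-0.02 * (T - 300.0))       # synthetic R(T), ~ -2 %/K

slope, _ = np.polyfit(T, np.log(R), 1)
print(f"TCR = {100 * slope:.2f} %/K")         # expect roughly -2 %/K
```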
Motivated by growing concern as to the many threats that islands face, subsequent calls for more extensive island nature conservation and recent discussion in the conservation literature about the potential for wellbeing as a useful approach to understanding how conservation affects people's lives, this paper reviews the literature in order to explore how islands and wellbeing relate and how conservation might impact that relationship. We apply a three-dimensional concept of social wellbeing to structure the discussion and illustrate the importance of understanding island–wellbeing interactions in the context of material, relational and subjective dimensions, using examples from the literature. We posit that islands and their shared characteristics of ‘islandness’ provide a useful setting in which to apply social wellbeing as a generalizable framework, which is particularly adept at illuminating the relevance of social relationships and subjective perceptions in island life – aspects that are often marginalized in more economically focused conservation impact assessments. The paper then explores in more depth the influences of island nature conservation on social wellbeing and sustainability outcomes using two case studies from the global north (UK islands) and global south (the Solomon Islands). We conclude that conservation approaches that engage with all three dimensions of wellbeing seem to be associated with success.
Patients >18 years old with sepsis and concurrent bacteremia or fungemia were included in the study; patients who were pregnant, had polymicrobial septicemia, or were transferred from an outside hospital were excluded.
INTERVENTION
Prior to the intervention, polymerase chain reaction was used to identify Staphylococcus species from positive blood cultures, and traditional laboratory techniques were used to identify non-staphylococcal species. After the intervention, matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) assay and FilmArray were also used to identify additional species. During both periods, the antimicrobial stewardship team provided prospective audit and feedback for all patients on antibiotics.
RESULTS
A total of 219 patients were enrolled in the study: 115 patients prior to the intervention and 104 after the intervention. The median time to clinical response was statistically significantly shorter in the postintervention group than in the preintervention group (2 days vs 4 days, respectively; P=.002). By Cox regression, the implementation of MALDI-TOF and FilmArray was associated with shorter time to clinical response (hazard ratio [HR], 1.360; 95% confidence interval [CI], 1.018–1.816). After controlling for potential confounders, the study group was not independently associated with clinical response (adjusted HR, 1.279; 95% CI, 0.955–1.713). Mortality was numerically, but not statistically significantly, lower in the postintervention group than in the preintervention group (7.6% vs 11.4%; P=.342).
CONCLUSIONS
In the setting of an existing antimicrobial stewardship program, implementation of MALDI-TOF and FilmArray was associated with improved time to clinical response. Further research is needed to fully describe the effect of antimicrobial stewardship programs on time to clinical response.
Background: Many patients do not respond adequately to current pharmacological or psychological treatments for psychosis. Persistent persecutory delusions are common in clinical services, and cause considerable patient distress and impairment. Our aim has been to build a new translational personalized treatment, with the potential for wide use, that leads to high rates of recovery in persistent persecutory delusions. We have been developing, and evaluating individually, brief modular interventions, each targeting a key causal factor identified from our cognitive model. These modules are now combined in “The Feeling Safe Programme”. Aims: To test the feasibility of a new translational modular treatment for persistent persecutory delusions and provide initial efficacy data. Method: 12 patients with persistent persecutory delusions in the context of non-affective psychosis were offered the 6-month Feeling Safe Programme. After assessment, patients chose from a personalized menu of treatment options. Four weekly baseline assessments were carried out, followed by monthly assessments. Recovery in the delusion was defined as conviction falling below 50% (greater doubt than certainty). Results: 11 patients completed the intervention. One patient withdrew before the first monthly assessment due to physical health problems. An average of 20 sessions (SD = 4.4) were received. Posttreatment, 7 out of 11 (64%) patients had recovery in their persistent delusions. Satisfaction ratings were high. Conclusions: The Feeling Safe Programme is feasible to use and was associated with large clinical benefits. To our knowledge this is the first treatment report focused on delusion recovery. The treatment will be tested in a randomized controlled trial.