The first neuroimaging studies of intelligence were done with positron emission tomography (PET) (Haier et al., 1988). PET was expensive and invasive, but access to neuroimaging broadened when magnetic resonance imaging (MRI) became widely available around the year 2000. The advent of advanced MRI methods enabled researchers to investigate localized (region-level) associations between brain measures and measures of intelligence in healthy individuals (Gray & Thompson, 2004; Luders, Narr, Thompson, & Toga, 2009). At the whole-brain level, MRI-based studies have reported a positive association (r = .40 to .51) between some measures of intelligence and brain size (Andreasen et al., 1993; McDaniel, 2005). Several studies at the voxel and regional levels have also demonstrated positive correlations between morphometry and intelligence in brain regions especially relevant to higher cognitive functions, including the frontal, temporal, and parietal lobes, the hippocampus, and the cerebellum (Andreasen et al., 1993; Burgaleta, Johnson, Waber, Colom, & Karama, 2014; Colom et al., 2009; Karama et al., 2011; Narr et al., 2007; Shaw et al., 2006). More recently, neuroimaging studies have revealed large-scale structural and functional brain networks as potential neural substrates of intelligence (see reviews by Jung & Haier, 2007, and Barbey et al., 2012; Barbey, Colom, Paul, & Grafman, 2014; Colom, Karama, Jung, & Haier, 2010; Khundrakpam et al., 2017; Li et al., 2009; Sripada, Angstadt, Rutherford, & Taxali, 2019).
In 2020 a group of U.S. healthcare leaders formed the National Organization to Prevent Hospital-Acquired Pneumonia (NOHAP) to issue a call to action to address non–ventilator-associated hospital-acquired pneumonia (NVHAP). NVHAP is one of the most common and morbid healthcare-associated infections, but it is not tracked, reported, or actively prevented by most hospitals. This national call to action includes (1) launching a national healthcare conversation about NVHAP prevention; (2) adding NVHAP prevention measures to education for patients, healthcare professionals, and students; (3) challenging healthcare systems and insurers to implement and support NVHAP prevention; and (4) encouraging researchers to develop new strategies for NVHAP surveillance and prevention. The purpose of this document is to outline research needs to support the NVHAP call to action. Primary needs include the development of better models to estimate the economic cost of NVHAP, to elucidate the pathophysiology of NVHAP and identify the most promising pathways for prevention, to develop objective and efficient surveillance methods to track NVHAP, to rigorously test the impact of prevention strategies proposed to prevent NVHAP, and to identify the policy levers that will best engage hospitals in NVHAP surveillance and prevention. This document was developed by a joint task force including stakeholders from the Veterans’ Health Administration (VHA), the U.S. Centers for Disease Control and Prevention (CDC), The Joint Commission, the American Dental Association, the Patient Safety Movement Foundation, Oral Health Nursing Education and Practice (OHNEP), Teaching Oral-Systemic Health (TOSH), industry partners, and academia.
The Hawaiian archipelago was formerly home to one of the most species-rich land snail faunas (> 752 species), with levels of endemism > 99%. Many native Hawaiian land snail species are now extinct, and the remaining fauna is vulnerable. Unfortunately, lack of information on critical habitat requirements for Hawaiian land snails limits the development of effective conservation strategies. The purpose of this study was to examine the plant host preferences of native arboreal land snails in Puʻu Kukui Watershed, West Maui, Hawaiʻi, and compare these patterns to those from similar studies on the islands of Oʻahu and Hawaiʻi. Concordant with studies on other islands, we found that four species from three diverse families of snails in Puʻu Kukui Watershed had preferences for a few species of understorey plants. These were not the most abundant canopy or mid-canopy species, indicating that forests without key understorey plants may not support the few remaining lineages of native snails. Preference for Broussaisia arguta among various island endemic snails across all studies indicates that this species is important to restoration efforts aimed at improving snail habitat. As studies examining host plant preferences are often incongruent with studies examining snail feeding, we suggest that we are in the infancy of defining what constitutes critical habitat for most Hawaiian arboreal snails. However, our results indicate that preserving diverse native plant assemblages, particularly understorey plant species, which facilitate key interactions, is critical to the goal of conserving the remaining threatened snail fauna.
The coronavirus disease 2019 (COVID-19) pandemic has resulted in shortages of personal protective equipment (PPE), underscoring the urgent need for simple, efficient, and inexpensive methods to decontaminate masks and respirators exposed to severe acute respiratory coronavirus virus 2 (SARS-CoV-2). We hypothesized that methylene blue (MB) photochemical treatment, which has various clinical applications, could decontaminate PPE contaminated with coronavirus.
The 2 arms of the study included (1) PPE inoculation with coronaviruses followed by MB with light (MBL) decontamination treatment and (2) PPE treatment with MBL for 5 cycles of decontamination to determine maintenance of PPE performance.
MBL treatment was used to inactivate coronaviruses on 3 N95 filtering facepiece respirator (FFR) and 2 medical mask models. We inoculated FFR and medical mask materials with 3 coronaviruses, including SARS-CoV-2, and we treated them with 10 µM MB and exposed them to 50,000 lux of white light or 12,500 lux of red light for 30 minutes. In parallel, integrity was assessed after 5 cycles of decontamination using multiple US and international test methods, and the process was compared with the FDA-authorized vaporized hydrogen peroxide plus ozone (VHP+O3) decontamination method.
Overall, MBL robustly and consistently inactivated all 3 coronaviruses with 99.8% to >99.9% virus inactivation across all FFRs and medical masks tested. FFR and medical mask integrity was maintained after 5 cycles of MBL treatment, whereas 1 FFR model failed after 5 cycles of VHP+O3.
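Inactivation percentages in this range are often restated as log10 reduction values, the unit commonly used to report disinfection efficacy. A minimal sketch of that conversion (the `log10_reduction` helper is illustrative, not part of the study's methods):

```python
import math

def log10_reduction(percent_inactivation: float) -> float:
    """Convert percent virus inactivation to a log10 reduction value."""
    surviving_fraction = 1.0 - percent_inactivation / 100.0
    return -math.log10(surviving_fraction)

# The abstract's 99.8% to >99.9% inactivation corresponds to roughly
# a 2.7 to >3.0 log10 reduction in infectious virus.
print(round(log10_reduction(99.8), 2))
print(round(log10_reduction(99.9), 2))
```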
MBL treatment decontaminated respirators and masks by inactivating 3 tested coronaviruses without compromising integrity through 5 cycles of decontamination. MBL decontamination is effective, is low cost, and does not require specialized equipment, making it applicable in low- to high-resource settings.
Influencer marketing may be amplified on livestreaming platforms (e.g., Twitch) compared with asynchronous social media (e.g., YouTube). However, food and beverage marketing on Twitch has not been evaluated at a user level. The present study aimed to compare users’ self-reported exposure to food marketing and associated attitudes, consumption and purchasing behaviours on Twitch compared with YouTube. A survey administered via social media was completed by 621 Twitch users (90 % male, 64 % white, 69 % under 25 years old). Of respondents, 72 % recalled observing at least one food or beverage advertisement on Twitch. There were significant differences in the recall of specific brands advertised on Twitch (P < 0⋅01). After observing advertised products, 14 % reported craving the product and 8 % reported purchasing one. In chat rooms, 56 % observed conversations related to food and 25 % participated in such conversations. There were significant differences in the number of users who consumed various products while watching Twitch (P < 0⋅01). Of users who frequented YouTube (n 273), 65 % reported negative emotions when encountering advertising on YouTube compared with 40 % on Twitch (P < 0⋅01). A higher proportion felt Twitch's advertising primarily supported content creators (79 v. 54 %, P < 0⋅01), while a higher proportion felt that YouTube's advertising primarily supported the platform (49 v. 66 %, P < 0⋅01). These findings suggest that food marketing exposures on Twitch are noticeable, less bothersome to users, and influential on consumption and purchasing behaviours. Future studies are needed to examine how the livestreaming environment may enhance advertising effectiveness relative to asynchronous platforms.
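Comparisons such as 65 % v. 40 % can be checked with a standard two-proportion z-test. A minimal sketch only: the counts below are scaled from the abstract's percentages for illustration, and the survey's actual analysis (including any pairing of responses from the same users) may differ.

```python
import math

def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int) -> float:
    """Two-sided two-proportion z-test; returns the p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical counts: 65% vs. 40% negative emotions among n = 273
# respondents per platform, treated as independent groups.
p = two_proportion_ztest(177, 273, 109, 273)
print(p < 0.01)
```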
Cooperation among militant organizations contributes to capability but also presents security risks. This is particularly the case when organizations face substantial repression from the state. As a consequence, for cooperation to emerge and persist when it is most valuable, militant groups must have means of committing to cooperation even when the incentives to defect are high. We posit that shared ideology plays this role by providing community monitoring, authority structures, trust, and transnational networks. We test this theory using new, expansive, time-series data on relationships between militant organizations from 1950 to 2016, which we introduce here. We find that when groups share an ideology, and especially a religion, they are more likely to sustain material cooperation in the face of state repression. These findings contextualize and expand upon research demonstrating that connections between violent nonstate actors strongly shape their tactical and strategic behavior.
Science, as both a body of knowledge and a process of acquiring new knowledge, is widely regarded as playing a central role in biodiversity conservation. Science undoubtedly enhances our understanding of the drivers of biodiversity loss and assists in the formulation of practical and policy responses, but it has not yet proved sufficiently influential to reverse global trends of biodiversity decline. This review seeks to critically examine the science of biodiversity conservation and to identify any hidden assumptions that, once interrogated and explored, may assist in improving conservation science, policy and practice. By drawing on existing reviews of the literature, this review describes the major themes of the literature and examines the historical shifts in the framing of conservation. It highlights the dominance of research philosophies that view conservation through a primarily ecological lens, changes in the goal(s) of conservation and a lack of clarity over the role(s) of science in biodiversity conservation. Finally, this review offers a simple framework to more clearly and consistently conceptualize the role(s) of science in biodiversity conservation in the future. Greater critical reflection on how conservation science might better accommodate multiple knowledges, goals and values could assist in ‘opening up’ new, legitimate pathways for biodiversity conservation.
Garlic mustard [Alliaria petiolata (M. Bieb.) Cavara & Grande] is a biennial invasive plant commonly found in the northeastern and midwestern United States. Although applying herbicides after flowering is not recommended, land managers frequently wish to conduct management at this timing. We applied glyphosate and triclopyr (3% v/v and 1% v/v using 31.8% and 39.8% acid equivalent formulations, respectively) POST to established, second-year A. petiolata populations at three locations when petals were dehiscing and evaluated control, seed production, and seed viability. POST glyphosate applications at this timing provided 100% control of A. petiolata by 4 wk after treatment at all locations, whereas triclopyr efficacy was variable, providing 38% to 62% control. Seed production was only reduced at one location, with similar results regardless of treatment. Percent seed viability was also reduced; combined with reductions in seed production, this resulted in a 71% to 99% reduction in the number of viable seeds produced per plant regardless of treatment. While applications did not eliminate viable seed production, our findings indicate that glyphosate or triclopyr applied while petals are dehiscing is a viable alternative to cutting or hand pulling at this timing, as these treatments substantially decreased viable A. petiolata seed production.
Language and cognitive impairments are common consequences of stroke. These difficulties persist: 60% of stroke survivors continue to experience memory problems, 50% attention deficits, and 61% communication problems long after stroke onset. Such deficits are ‘invisible’ – evident only through patient report, behavioural observation or formal assessment. The impacts of such deficits are considerable and can include prolonged hospital stays, poorer functional recovery and reduced quality of life. Effective and timely rehabilitation of language (auditory comprehension, expressive language, reading and writing) and cognitive abilities (memory, attention, spatial awareness, perception and executive function) is crucial to optimise recovery after stroke. In this chapter we review the current evidence base and relevant clinical guidelines relating to language and cognitive impairments, and consider the implications for stroke rehabilitation practice and future research. Speech and language therapy offers benefit to people with aphasia after stroke; intensive intervention, if tolerated, likely augments the benefits. Interventions for deficits in all non-language cognitive domains exist, but need refining and evaluating more thoroughly with a wider range of methodologies.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
The potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed retention by the target weed species at crop maturity, enabling seed collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Weeds retained greater proportions of seeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
External urinary collection devices (EUCDs) may reduce indwelling catheter usage and catheter-associated urinary tract infections (CAUTIs). In this retrospective quasi-experimental study, we demonstrated that EUCD implementation in women was associated with significantly decreased indwelling catheter usage and a trend (P = .10) toward decreased CAUTI per 1,000 patient days.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
To scale out an experiential teaching kitchen in Parks and Recreation centres’ after-school programming in a large urban setting among predominantly low-income, minority children.
We evaluated the implementation of a skills-based, experiential teaching kitchen to gauge programme success. Effectiveness outcomes included pre–post measures of child-reported cooking self-efficacy, attitudes towards cooking, fruit and vegetable preference, intention to eat fruits and vegetables and willingness to try new fruits and vegetables. Process outcomes included attendance (i.e., intervention dose delivered), cost, fidelity and adaptations to the intervention.
After-school programming in Parks and Recreation Community centres in Nashville, TN.
Predominantly low-income minority children aged 6–14 years.
Of the twenty-five city community centres, twenty-one successfully implemented the programme, and nineteen of twenty-five implemented seven or more of the eight planned sessions. Among children with pre–post data (n 369), mean age was 8·8 (sd 1·9) years, and 53·7 % were female. All five effectiveness measures significantly improved (P < 0·001). Attendance at sessions ranged from 36·3 % of children not attending any sessions to 36·6 % of children attending at least four sessions. Across all centres, fidelity was 97·5 %. The average food cost per serving was $1·37.
This type of nutritional education and skills building experiential teaching kitchen can be successfully implemented in a community setting with high fidelity, effectiveness and organisational alignment, while also expanding reach to low-income, underserved children.
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
The study included 6 acute-care hospitals within the Veterans’ Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = 0.8) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
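The segmented regression described above rests on a standard interrupted time series design. A minimal sketch of how its regressors are typically constructed; the variable names and the intervention month below are hypothetical, and in the study these columns would enter a segmented negative binomial model of monthly culture counts with patient-days as an exposure offset.

```python
def its_design_row(month: int, intervention_month: int) -> list[float]:
    """One row of an interrupted time series design matrix."""
    post = 1.0 if month >= intervention_month else 0.0
    return [
        1.0,                                  # intercept: baseline rate
        float(month),                         # pre-existing secular trend
        post,                                 # immediate level change at policy start
        post * (month - intervention_month),  # slope change after the policy
    ]

# 54 monthly observations (Aug 2013 - Jan 2018), with a hypothetical
# intervention at month 24.
design = [its_design_row(m, 24) for m in range(54)]
```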