Two studies were conducted in 2022 and 2023 near Rocky Mount and Clayton, NC, to determine the optimal granular ammonium sulfate (AMS) rate and application timing for pyroxasulfone-coated AMS. In the rate study, AMS rates included 161, 214, 267, 321, 374, 428, and 481 kg ha-1, equivalent to 34, 45, 56, 67, 79, 90, and 101 kg N ha-1, respectively. All rates were coated with pyroxasulfone at 118 g ai ha-1 and top-dressed onto 5- to 7-leaf cotton. In the timing study, pyroxasulfone (118 g ai ha-1) was coated on AMS and top-dressed at 321 kg ha-1 (67 kg N ha-1) onto 5- to 7-leaf, 9- to 11-leaf, and first bloom cotton. In both studies, weed control and cotton tolerance to pyroxasulfone-coated AMS were compared to pyroxasulfone applied postemergence (POST) and postemergence-directed (POST-directed). The check in both studies received non-herbicide-treated AMS (321 kg ha-1). Before treatment applications, all plots (including the check) were maintained weed-free with glyphosate and glufosinate. In both studies, pyroxasulfone applied POST was most injurious (8 to 16%), while pyroxasulfone-coated AMS resulted in ≤ 4% injury. Additionally, no differences in cotton lint yield were observed in either study. With the exception of the lowest rate of AMS (161 kg ha-1; 79%), all AMS rates coated with pyroxasulfone controlled Palmer amaranth ≥ 83%, comparable to pyroxasulfone applied POST (92%) and POST-directed (89%). In the timing study, the application method did not affect Palmer amaranth control; however, applications made at the mid- and late timings outperformed early applications. These results indicate pyroxasulfone-coated AMS can control Palmer amaranth comparably to pyroxasulfone applied POST and POST-directed, with minimal risk of cotton injury. However, the application timing could warrant additional treatment to achieve adequate late-season weed control.
This study bridges the study of social inclusion with welfare regime theory. By linking social inclusion with welfare regimes, we establish a novel analytical framework for assessing global trends and national divergences in social inclusion based on a multidimensional view of the concept. While scholars have developed typologies for social inclusion and welfare regimes independent of each other, limited insights exist on how social inclusion relates to welfare regime typologies. We develop a social inclusion index for 225 countries using principal component analysis with 10 measures of social inclusion from the United Nations’ Sustainable Development Goals Indicators Database. We then employ clustering algorithms to inductively group countries based on the index. We find six “worlds” of social inclusion based on the index and other social factors – the Low, Mid, and High Social Inclusion regimes and the Low, Mid, and High Social Exclusion regimes.
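The index-construction pipeline described above (standardise indicators, extract a first principal component as a composite score, then cluster countries on it) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the data are synthetic, and the one-dimensional k-means step stands in for whatever clustering algorithm the study actually used.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(225, 10))  # 225 countries x 10 inclusion indicators (synthetic)

# Standardise indicators, then take the first principal component as the index
Z = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                        # eigenvector of the largest eigenvalue
index = Z @ pc1                             # one composite score per country

# Simple 1-D k-means (k = 6) on the index to recover six "worlds"
k = 6
centers = np.quantile(index, np.linspace(0.05, 0.95, k))
for _ in range(50):
    labels = np.argmin(np.abs(index[:, None] - centers[None, :]), axis=1)
    centers = np.array([index[labels == j].mean() if np.any(labels == j)
                        else centers[j] for j in range(k)])
print(index.shape, len(np.unique(labels)))
```

In practice the grouping would also incorporate the "other social factors" the abstract mentions, so the real feature matrix would be wider than the index alone.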
The relationship between measures of verbal fluency and certain personality traits is examined by factor techniques. From a matrix of eight factor scores derived from mental tests plus five personality scores, six factors were obtained. An oblique solution lends limited support to the hypothesized relationship between the two domains.
As pressures build, this study can serve as a guidepost for scholars and policymakers to learn from global trends in social inclusion and social inclusion policy. Our systematic review of global trends in social inclusion and social inclusion policy points to the general expansion and retrenchment of social inclusion policy amid increasing social exclusion associated with trends such as globalisation and neoliberalism. In the absence of recent, detailed case descriptions of social inclusion policy at the national level, we call for a renewed scholarly focus on case studies of social inclusion policy. We also discuss the likelihood that persistent climate change, migration, ageing populations, and technological innovations are poised to dramatically influence global social inclusion and suggest that future research should seek to understand the relationship between these developments and social inclusion. As we look to the future and the growing needs of excluded populations, we aim to use this study to learn from and build on these global trends to promote the inclusion of excluded groups around the world.
The aim of this study was to determine whether there was a significant change in cardiac [123I]-metaiodobenzylguanidine uptake between baseline and follow-up in individuals with mild cognitive impairment with Lewy bodies (MCI-LB) who had normal baseline scans. Eight participants with a diagnosis of probable MCI-LB and a normal baseline scan consented to a follow-up scan between 2 and 4 years after baseline. All eight repeat scans remained normal; however, in three cases uptake decreased by more than 10%. The mean change in uptake between baseline and repeat was −5.2% (range: −23.8% to +7.0%). The interpolated mean annual change in uptake was −1.6%.
Objective:
To evaluate the impact of receptive vocabulary versus years of education on neuropsychological performance of Black and White older adults.
Method:
A community-based, prospectively enrolled cohort (n = 1,007; 130 Black, 877 White) in the Emory Healthy Brain Study was administered the NIH Toolbox Picture Vocabulary Test and neuropsychological measures. Group differences were evaluated with age, sex, and education, or age, sex, and Toolbox Vocabulary scores, as covariates to determine whether performance differences between Black and White participants were attenuated or eliminated.
Results:
With vocabulary as a covariate, the main effect of race was no longer significant for the MoCA, Phonemic Fluency, Rey Auditory Verbal Learning Test, and Rey Complex Figure Test immediate and delayed recall. Although still significantly different between groups, the effect sizes for Animal Fluency, Trails B-A, Symbol Digit Modalities Test, and Rey Copy were attenuated, with the greatest reductions occurring for the Multilingual Naming Test and Judgment of Line Orientation.
Conclusions:
Findings support the value of using receptive vocabulary as a proxy for premorbid ability level when comparing the cognitive performance of Black and White older adults. The results extend investigations using measures of single word reading to encompass measures assessing word meaning.
An experiment was conducted in 2022 and 2023 near Rocky Mount and Clayton, NC, to evaluate residual herbicide-coated fertilizer for cotton tolerance and Palmer amaranth control. Treatments included acetochlor; atrazine; dimethenamid-P; diuron; flumioxazin; fluometuron; fluridone; fomesafen; linuron; metribuzin; pendimethalin; pyroxasulfone; pyroxasulfone + carfentrazone; S-metolachlor; and sulfentrazone. Each herbicide was individually coated on granular ammonium sulfate (AMS) and top-dressed at 321 kg ha-1 (67 kg N ha-1) onto 5- to 7-leaf cotton. The check received the equivalent rate of non-herbicide-treated AMS. Before top-dress, all plots (including the check) were treated with glyphosate and glufosinate to control previously emerged weeds. All herbicides resulted in transient cotton injury, except metribuzin. Cotton response to metribuzin varied by year and location. In 2022, metribuzin caused 11 to 39% and 8 to 17% injury at Clayton and Rocky Mount, respectively. In 2023, metribuzin caused 13 to 32% injury at Clayton and 73 to 84% injury at Rocky Mount. Pyroxasulfone (91%), pyroxasulfone + carfentrazone (89%), fomesafen (87%), fluridone (86%), flumioxazin (86%), and atrazine (85%) controlled Palmer amaranth ≥ 85%. Pendimethalin and fluometuron were the least effective treatments, resulting in 58% and 62% control, respectively. As anticipated, early season metribuzin injury translated into yield loss; plots treated with metribuzin yielded 640 kg ha-1 and were only comparable to linuron (790 kg ha-1). These findings suggest that, with the exception of metribuzin, residual herbicides coated on AMS may be suitable and effective in cotton production, providing growers with additional modes of action for late-season control of multiple herbicide-resistant Palmer amaranth.
In Georgia plasticulture vegetable production, a single installation of plastic mulch is used for up to five cropping cycles over an 18-mo period. Preplant applications of glyphosate and glufosinate ensure fields are weed-free before transplanting, but recent data suggest that residual activity of these herbicides may pose a risk to transplanted vegetables. Glyphosate and glufosinate were applied preplant in combination with three different planting configurations: 1) a new plant hole punched into new mulch, 2) a preexisting plant hole, or 3) a new plant hole spaced 15 cm from a preexisting plant hole (adjacent). Following herbicide application, overhead irrigation was used to remove residues from the mulch before punching transplanting holes for tomato, cucumber, or squash. Visible injury, widths, biomass, and yield of tomato, cucumber, and squash were not influenced by herbicide in the new mulch or adjacent planting configurations. When glyphosate was applied at 5.0 kg ae ha−1 and the new crop was planted into preexisting holes, tomato was injured by 45%, with reduced heights, biomass, and yields; at 2.5 kg ae ha−1, injury of 8% and a biomass reduction were observed. Cucumber and squash were injured by 23% to 32% by glyphosate at 5.0 kg ae ha−1, with reductions in growth and early-season yield; lower rates did not influence crop growth or production when the crop was placed into a preexisting plant hole. Glufosinate applied at the same rates did not affect tomato growth or yield when planted into preexisting plant holes. Cucumber, when planted into preexisting plant holes, was injured by 43% to 75% from glufosinate, with reductions in height and biomass, and yield losses of 1.3 to 2.6 kg ai ha−1; similar results from glufosinate were observed in squash. In multi-crop plasticulture production, growers should ensure vegetable transplants are placed a minimum of 15 cm away from soil exposed to these herbicides.
This chapter gives an account of Goldsmith’s relationship with the book trade in general, but more specifically with the booksellers who assisted, and sometimes troubled, his access to the republic of letters. It traces his ascent in the business of writing from his work for Ralph Griffiths as an anonymous writer of reviews through to his later, acclaimed works and his relationship, sometimes conflictual, with the ‘fame machine’ powered by the eighteenth-century book trade.
Herbicide drift to sensitive crops can result in significant injury, yield loss, and even crop destruction. When pesticide drift is reported to the Georgia Department of Agriculture (GDA), tissue samples are collected and analyzed for residues. Seven field studies were conducted in 2020 and 2021 in cooperation with the GDA to evaluate the effect of (1) the time interval between a simulated drift event and sampling, (2) low-dose herbicide rates, and (3) the sample collection method on detecting herbicide residues in cotton (Gossypium hirsutum L.) foliage. Simulated drift rates of 2,4-D, dicamba, and imazapyr were applied to non-tolerant cotton in the 8- to 9-leaf stage, with plant samples collected at 7 or 21 d after treatment (DAT). During collection, plant sampling consisted of removing entire plants or removing new growth occurring after the 7-leaf stage. Visual cotton injury from 2,4-D reached 43% to 75% at 0.001 and 0.004 kg ae ha−1, respectively; for dicamba, it was 9% to 41% at 0.003 and 0.014 kg ae ha−1, respectively; and for imazapyr, it was 1% to 74% at 0.004 and 0.03 kg ae ha−1, respectively. Yield loss was observed with both rates of 2,4-D (11% to 51%) and with the high rate of imazapyr (52%); dicamba did not influence yield. Herbicide residues were detected in 88%, 88%, and 69% of samples collected from plants treated with 2,4-D, dicamba, and imazapyr, respectively, at 7 DAT compared with 25%, 16%, and 22% when samples were collected at 21 DAT, highlighting the importance of sampling quickly after a drift event. Although the interval between drift event and sampling, drift rate, and sampling method can all influence residue detection for 2,4-D, dicamba, and imazapyr, the factor with the greatest influence is the amount of time between drift and sample collection.
As climate change progresses, natural hazards are projected to continue to increase in frequency and intensity, posing a new form of social risk, implicating both the welfare and environmental state and raising the salience of ecosocial policy as a mechanism to attend to the distributional effects of climate change mitigation and adaptation. This study posits a novel conceptual framework for ecosocial policy and offers the US ecosocial safety net as a case analysis. While we conceptualise disaster relief policy as a mode of the environmental state, it includes unique ecosocial policies that constitute the backbone of the US ecosocial safety net. This study describes and compares the developmental and functional synergies between the US welfare and environmental state manifested in the form of an ecosocial safety net by explicating the Individual Assistance Program and the National Flood Insurance Program. Our findings reveal synergies between US disaster relief and welfare, including parallel developmental trends, philosophies of deserving/undeserving, functions of racial capitalism and relationships with economic growth. This study and its conceptual framework of ecosocial policy offer a groundwork for the study of ecosocial policy in other contexts.
Integrated weed management practices that reduce selection for resistance to herbicides are critical to delay resistance. To quantify the reduction in selection for resistance placed on Palmer amaranth by 2,4-D applied postemergence in cotton, an experiment was conducted three times in Georgia during 2020 and 2021 evaluating the benefits of (i) a cover crop, (ii) preemergence herbicides, and (iii) timeliness of applications. When a timely total-postemergence program of glyphosate + 2,4-D was applied three times over the season in a conventionally tilled system, 281,690 glyphosate-resistant Palmer amaranth plants ha–1 were exposed to 2,4-D. Over 61,500 of these plants were exposed to multiple 2,4-D applications. Altering the production system to conservation tillage, and including a rolled-rye cover crop, reduced the total number of plants exposed to 2,4-D for the season by 72% and the number of plants exposed multiple times by 60%. Even more effective, including a mixture of residual preemergence herbicides reduced the number of plants exposed to 2,4-D at least once by over 99.9%, and reduced multiple exposures by over 99.3% for the season; this benefit was observed for both conventional and conservation tillage systems. Delaying the initial application of the total-postemergence program did not influence the number of Palmer amaranth plants treated at least once but increased the number of plants treated multiple times by a factor of 3.7. As a result of early-season weed competition, cotton height and yield reductions were also associated with both the lack of preemergence residuals and delayed postemergence applications. When considering the goal of minimizing the number of Palmer amaranth plants treated with a postemergence application of 2,4-D in a cotton system, preemergence herbicides were the most effective option, followed by (fb) the cover crop fb making timely postemergence applications.
However, the most effective approach was to utilize each of these tactics in the same growing season.
Objective:
To evaluate the impact of a system-wide antimicrobial stewardship program (ASP) update on intravenous (IV)-to-oral (PO) antimicrobial conversion in select community hospitals through pre- and postimplementation trend analysis.
Methods:
Retrospective study across seven hospitals: region one (four hospitals, 827 beds) with tele-ASP managed by infectious diseases (ID)-trained pharmacists and region two (three hospitals, 498 beds) without. Data were collected pre- (April 2022–September 2022) and postimplementation (April 2023–September 2023) on nine antimicrobials for IV and PO days of therapy (DOTs). Antimicrobial administration route and DOTs/1,000 patient days were extracted from the electronic medical record (EMR). Primary outcome: reduction in IV DOTs/1,000 patient days. Secondary outcomes: decrease in IV usage via PO:total antimicrobial ratios and cost reduction.
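The DOTs/1,000 patient days metric used throughout the study normalises antimicrobial use to census, so hospitals of different sizes can be compared. A minimal sketch of the calculation, using made-up counts chosen to mirror the region-one preimplementation figures reported below:

```python
# DOTs/1,000 patient days = (days of therapy / total patient days) * 1,000
def dot_rate(days_of_therapy, patient_days):
    return days_of_therapy / patient_days * 1000.0

# Hypothetical counts (not from the study's raw data):
iv_rate = dot_rate(4610, 10000)   # -> 461.0 IV DOTs/1,000 patient days
po_rate = dot_rate(2890, 10000)   # -> 289.0 PO DOTs/1,000 patient days

# PO:total ratio, the secondary outcome measure
po_share = po_rate / (iv_rate + po_rate)
print(iv_rate, po_rate, round(po_share, 2))
```

Because both rates share the same patient-day denominator, the PO:total ratio reduces to PO DOTs over total DOTs.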
Results:
In region one, IV usage decreased from 461 to 209/1,000 patient days (P < .001), while PO usage increased from 289 to 412/1,000 patient days (P < .001). Total antimicrobial use decreased from 750 to 621/1,000 patient days (P < .001). In region two, IV usage decreased from 300 to 243/1,000 patient days (P = .005), and PO usage rose from 154 to 198/1,000 patient days (P = .031). The PO:total antimicrobial ratios increased in both regions, from .42–.52 to .60–.70 in region one and from .36–.55 to .46–.55 in region two. IV cost savings amounted to $19,359.77 in region one and $4,038.51 in region two.
Conclusion:
The ASP intervention improved IV-to-PO conversion rates in both regions, highlighting the contribution of ID-trained pharmacists in enhancing ASP initiatives in region one and suggesting tele-ASP expansion may be beneficial in resource-constrained settings.
Poor mental health of university students is a growing concern for public health. Indeed, academic settings may exacerbate students’ vulnerability to mental health issues. Nonetheless, university students are often unable to seek mental health support due to barriers at both the individual and organisational levels. Digital technologies have proved effective in collecting health-related information and in managing psychological distress, representing useful instruments to tackle mental health needs, especially considering their accessibility and cost-effectiveness.
Objectives
Although digital tools are recognised to be useful for mental health support, university students’ opinions and experiences related to such interventions are still to be explored. In this qualitative research, we aimed to address this gap in the scientific literature.
Methods
Data were drawn from “the CAMPUS study”, which longitudinally assesses students’ mental health at the University of Milano-Bicocca (Italy) and the University of Surrey (United Kingdom). We conducted detailed interviews and performed thematic analysis of the transcripts. We also performed a cross-cultural comparison between Italy and the United Kingdom.
Results
Across 33 interviews, five themes were identified, and an explanatory model was developed. From the students’ perspective, social media, podcasts, and apps could be sources of significant mental health content. On the one hand, students recognised wide availability and anonymity as advantages that make digital technologies suitable for primary to tertiary prevention, to reduce mental health stigma, and as an extension of face-to-face interventions. On the other hand, perceived disadvantages were lower efficacy compared to in-person approaches, lack of personalisation, and difficulties in engagement. Students’ opinions and perspectives could be widely influenced by cultural and individual background.
Conclusions
Digital tools may be an effective option to address mental health needs of university students. Since face-to-face contact remains essential, digital interventions should be integrated with in-person ones, in order to offer a multi-modal approach to mental well-being.
The most prominent authigenic reaction in Holocene tuffaceous sediments at Teels Marsh, Nevada, is the hydration of rhyolitic glass by interstitial brines and the subsequent formation of phillipsite. This reaction has the form: rhyolitic glass + H2O → hydrous alkali aluminosilicate gel → phillipsite. Phillipsite is the most abundant authigenic phase in the tuffaceous sediments (>95%), analcime is the next most abundant phase, and clinoptilolite occurs as a trace mineral in the <2-mm fraction. Analcime forms by the reaction of phillipsite and Na+. Gaylussite and searlesite also are common authigenic phases at Teels Marsh. The concentration of silica in the interstitial brines is controlled by one or more of the authigenic reactions at less than 100 ppm. A stoichiometric equation for the reaction of phillipsite to analcime at Teels Marsh is:
Sodium and potassium activities of brines associated with both phillipsite and analcime were used to estimate the equilibrium constant for this reaction as 3.04 × 10−5. The ΔG0 value for the reaction is +6.2 kcal/mole at 25°C and 1 atm pressure. The estimated ΔG0 value of phillipsite, using this reaction, is −1072.8 kcal/mole at 25°C and 1 atm.
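The reported free energy follows from the standard relation ΔG° = −RT ln K. As a quick arithmetic check (using R in kcal mol⁻¹ K⁻¹ and T = 298.15 K):

```python
import math

# Check the reported free energy against dG0 = -R * T * ln(K)
R = 1.9872e-3   # gas constant, kcal mol^-1 K^-1
T = 298.15      # 25 degrees C in kelvin
K = 3.04e-5     # equilibrium constant estimated from brine activities

dG = -R * T * math.log(K)
print(round(dG, 1))  # ~6.2 kcal/mol, matching the reported +6.2
```

The positive ΔG° is consistent with the small equilibrium constant: ln K is negative, so −RT ln K is positive.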
The complementary feeding period (6-23 months of age) is when solid foods are introduced alongside breastmilk or infant formula and is the most significant dietary change a person will experience. The introduction of complementary foods is important to meet changing nutritional requirements(1). Despite the rising Asian population in New Zealand, and the importance of nutrition during the complementary feeding period, there is currently no research on Asian New Zealand (NZ) infants’ micronutrient intakes from complementary foods. Complementary foods are a more easily modifiable component of the diet than breastmilk or other infant milk intake. This study aimed to compare the dietary intake of micronutrients from complementary foods of Asian infants and non-Asian infants in NZ. This study reported a secondary analysis of the First Foods New Zealand cross-sectional study of infants (aged 7.0-9.9 months) in Dunedin and Auckland. 24-hour recall data were analysed using FoodFiles 10 software with the NZ food composition database FOODfiles 2018, and additional data for commercial complementary foods(2). The multiple source method was used to estimate usual dietary intake. Ethnicity was collected from the main questionnaire of the study, answered by the respondents (the infant’s parent/caregiver). Within the Asian NZ group, three Asian subgroups were identified – South East Asian, East Asian, and South Asian. The non-Asian group included all remaining participants of non-Asian ethnicities. Most nutrient reference values (NRVs)(3) available for the 7-12 month age group are for total intake from complementary foods and infant milks, so the adequacy of the micronutrient intakes from complementary foods alone could not be determined. Vitamin A was the only micronutrient investigated in this analysis that had an NRV available for complementary foods only, allowing conclusions around adequacy to be made.
The Asian NZ group (n = 99) had lower mean group intakes than the non-Asian group (n = 526) for vitamin A (274µg vs. 329µg) and vitamin B12 (0.49µg vs. 0.65µg), and similar intakes for vitamin C (27.8mg vs. 28.5mg) and zinc (1.7mg vs. 1.9mg). Mean group iron intakes were the same for both groups (3.0mg). The AI for vitamin A from complementary foods (244µg) was exceeded by the mean intakes of both groups, suggesting that vitamin A intakes were adequate. The complementary feeding period is a critical time for obtaining nutrients essential for development and growth. The results from this study indicate that Asian NZ infants have lower intakes of two of the micronutrients of interest than non-Asian infants in NZ. However, future research is needed with the inclusion of infant milk intake in these groups to understand the total intake of the micronutrients. Vitamin A intakes do appear to be adequate in NZ infants.
The prevalence of food allergies in New Zealand infants is unknown; however, it is thought to be similar to Australia, where the prevalence is over 10% of 1-year-olds(1). Current New Zealand recommendations for reducing the risk of food allergies are to: offer all infants major food allergens (in an age-appropriate texture) at the start of complementary feeding (around 6 months); ensure major allergens are given to all infants before 1 year; once a major allergen is tolerated, maintain tolerance by regularly (approximately twice a week) offering the allergen food; and continue breastfeeding while introducing complementary foods(2). To our knowledge, there is no research investigating whether parents follow these recommendations. Therefore, this study aimed to explore parental offering of major food allergens to infants during complementary feeding and parental-reported food allergies. The cross-sectional study included 625 parent-infant dyads from the multi-centred (Auckland and Dunedin) First Foods New Zealand study. Infants were 7-10 months of age, and participants were recruited in 2020-2022. This secondary analysis included the use of a study questionnaire and 24-hour diet recall data. The questionnaire determined whether the infant was currently breastfed, whether major food allergens were offered to the infant, whether parents intended to avoid any foods during the first year of life, whether the infant had any known food allergies, and if so, how they were diagnosed. For assessing consumers of major food allergens, 24-hour diet recall data were used (2 days per infant). The questionnaire was used to determine that all major food allergens were offered to 17% of infants aged 9-10 months. On the diet recall days, dairy (94.4%) and wheat (91.2%) were the most common major food allergens consumed. Breastfed infants (n = 414) were more likely to consume sesame than non-breastfed infants (n = 211) (48.8% vs 33.7%, p≤0.001).
Overall, 12.6% of infants had a parental-reported food allergy, with egg allergy being the most common (45.6% of the parents who reported a food allergy). A symptomatic response after exposure was the most common diagnostic tool. In conclusion, only 17% of infants were offered all major food allergens by 9-10 months of age. More guidance may be required to ensure current recommendations are followed and that all major food allergens are introduced by 1 year of age. These results provide critical insight into parents’ current practices, which is essential in determining whether more targeted advice regarding allergy prevention and diagnosis is required.