Widespread testing for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is necessary to curb the spread of coronavirus disease 2019 (COVID-19), but testing is undermined when the only option is a nasopharyngeal swab. Self-collected swab techniques can overcome many of the disadvantages of a nasopharyngeal swab, but they require evaluation.
Methods:
Three self-collected non-nasopharyngeal swab techniques (saline gargle, oral swab and combined oral-anterior nasal swab) were compared to a nasopharyngeal swab for SARS-CoV-2 detection at multiple COVID-19 assessment centers in Toronto, Canada. The performance characteristics of each test were assessed.
Results:
The adjusted sensitivity of the saline gargle was 0.90 (95% CI, 0.86–0.94), the oral swab was 0.82 (95% CI, 0.72–0.89) and the combined oral–anterior nasal swab was 0.87 (95% CI, 0.77–0.93) compared to a nasopharyngeal swab, which demonstrated a sensitivity of ~90% when all positive tests were the reference standard. The median cycle threshold values for the SARS-CoV-2 E-gene for concordant and discordant saline gargle specimens were 17 and 31 (P < .001), for the oral swabs these values were 17 and 28 (P < .001), and for oral–anterior nasal swabs these values were 18 and 31 (P = .007).
Conclusions:
A self-collected saline gargle and a combined oral–anterior nasal swab have sensitivities similar to that of a nasopharyngeal swab for the detection of SARS-CoV-2. These alternative collection techniques are inexpensive and can eliminate barriers to testing, particularly in underserved populations.
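Point estimates and confidence intervals like those above can be computed from raw 2×2 counts. A minimal sketch follows; the counts are hypothetical (not taken from the study), and a Wilson score interval stands in for the adjusted, reference-standard-corrected estimates the paper reports:

```python
import math

def sensitivity(true_positives, false_negatives):
    """Proportion of reference-positive specimens the alternative test detects."""
    return true_positives / (true_positives + false_negatives)

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical counts: 180 of 200 NP-swab-positive cases also positive by gargle
sens = sensitivity(180, 20)       # 0.90
lo, hi = wilson_ci(180, 200)      # roughly (0.85, 0.93)
```

The Wilson interval is preferred over the simple Wald interval for proportions near 0 or 1, which is relevant when sensitivity is high.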
The Kaskawulsh Glacier is an iconic outlet draining the icefields of the St. Elias Mountains in Yukon, Canada. We determine and attempt to interpret its catchment-wide mass budget since 2007. Using SPOT5/6/7 data we estimate a 2007–18 geodetic balance of −0.46 ± 0.17 m w.e. a⁻¹. We then compute balance fluxes and observed ice fluxes at nine flux gates to examine the discrepancy between the climatic mass balance and internal mass redistribution by glacier flow. Balance fluxes are computed using a fully distributed mass-balance model driven by downscaled and bias-corrected climate-reanalysis data. Observed fluxes are calculated using NASA ITS_LIVE surface velocities and glacier cross-sectional areas derived from ice-penetrating radar data. We find the glacier is still in the early stages of dynamic adjustment to its mass imbalance. We estimate a committed terminus retreat of ~23 km under the 2007–18 climate and a lower bound of 46 km3 of committed ice loss, equivalent to ~15% of the total glacier volume.
Influenza vaccine effectiveness (VE) wanes over the course of a temperate-climate winter season, but few data are available from tropical countries with year-round influenza virus activity. In Singapore, a retrospective cohort study of adults vaccinated from 2013 to 2017 was conducted. Influenza vaccine failure was defined as hospital admission with polymerase chain reaction-confirmed influenza infection 2–49 weeks after vaccination. Relative VE was calculated by splitting the follow-up period into 8-week episodes (Lexis expansion) and comparing the odds of influenza infection in the first 8-week period after vaccination (weeks 2–9) with those in subsequent 8-week periods, using multivariable logistic regression adjusting for patient factors and influenza virus activity. Records of 19 298 influenza vaccinations were analysed, with 617 (3.2%) influenza infections. Relative VE was stable for the first 26 weeks post-vaccination, but then declined for all three influenza types/subtypes to 69% at weeks 42–49 (95% confidence interval (CI) 52–92%, P = 0.011). VE declined fastest in older adults, in individuals with chronic pulmonary disease and in those who had been previously vaccinated within the last 2 years. Vaccine failure was significantly associated with a change in recommended vaccine strains between vaccination and observation period (adjusted odds ratio 1.26, 95% CI 1.06–1.50, P = 0.010).
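The episode-splitting (Lexis expansion) step described above is mechanical and can be sketched directly. The week boundaries follow the abstract's weeks 2–49 follow-up window; the function name is illustrative, not from the study's code:

```python
def lexis_episodes(last_week, first_week=2, width=8):
    """Split follow-up weeks [first_week, last_week] into consecutive
    fixed-width episodes, e.g. weeks 2-9, 10-17, ..., 42-49.
    In a Lexis expansion, each vaccination record then contributes one
    row per episode it was observed in, with per-episode infection status,
    before the episodes are compared by logistic regression."""
    return [(start, min(start + width - 1, last_week))
            for start in range(first_week, last_week + 1, width)]

episodes = lexis_episodes(49)
# [(2, 9), (10, 17), (18, 25), (26, 33), (34, 41), (42, 49)]
```

The first tuple is the reference period (weeks 2–9) against which the odds in later episodes are compared.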
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes, and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Spatial and temporal trends of remotely sensed sea-ice cover, sea surface temperatures, chlorophyll-a concentration and primary production in the Baffin Bay, Davis Strait and Labrador Sea were analyzed for the 1998–2017 period. We found spatial variability in the trends of these cryospheric, biological and oceanographic phenomena. For example, in the northern Baffin Bay, we observed decreases in annual sea-ice persistence, yet increases along the Labrador Sea-ice edge during winter, with the latter having significant correlations with broader atmospheric patterns. In general, we observed increases in summer sea surface temperatures across the study region, except for a small area of cooling along the southern Greenlandic coast. We also found significant negative trends in April chlorophyll-a and primary production followed by significant positive trends for both biological phenomena in May, owing to anomalously high values in 2014 and 2015. Notably, we found a significant positive correlation between the days of sea-ice presence in April and May primary production quantities. Finally, we found a significant positive trend in total annual primary production over the study period. This novel finding suggests an important relationship between the timing of breakup along the sea-ice edge and peaks in biological production.
Magnetic field-assisted freeze-casting of porous alumina structures is reported. Different freeze-casting parameters were investigated, including the composition of the original slurry (Fe3O4 and PVA content) and the control of temperature during the freeze-casting process. The optimum additive contents in the slurry were 3 wt% PVA and 6 wt% Fe3O4. These conditions provided the most unidirectional porous structures throughout the length of the sample. The sintering temperature was maintained at 1500 °C for 3 h. Applying a vertical magnetic field (parallel to the ice growth direction) together with a controlled cooling-rate mode was found to enhance the homogeneity of the porous structure across the sample. The current study suggests that magnetic field-assisted freeze-casting is a viable method to create highly anisotropic porous ceramic structures.
Field studies were conducted in 2018 and 2019 in Arkansas, Indiana, Illinois, Missouri, and Tennessee to determine if cover-crop residue interfered with herbicides that provide residual control of Palmer amaranth and waterhemp in no-till soybean. The experiments were established in the fall with planting of cover crops (cereal rye + hairy vetch). Herbicide treatments consisted of a nontreated control (no residual herbicide), acetochlor, dimethenamid-P, flumioxazin, pyroxasulfone + flumioxazin, pendimethalin, metribuzin, pyroxasulfone, and S-metolachlor. In the cover crop–alone (nontreated) treatment, Palmer amaranth took 18 d and waterhemp took 24 d to reach a height of 10 cm. Compared with this treatment, all herbicides except metribuzin increased the number of days until 10-cm Palmer amaranth was present. Flumioxazin, applied alone or in a mixture with pyroxasulfone, was most effective at delaying Palmer amaranth from reaching a height of 10 cm (35 d and 33 d, respectively). The herbicides that resulted in the lowest Palmer amaranth density (1.5 to 4 times lower) when integrated with a cover crop were pyroxasulfone + flumioxazin, flumioxazin, pyroxasulfone, and acetochlor. Those four herbicide treatments also delayed Palmer amaranth emergence for the longest period (27 to 34 d). Waterhemp density was 7 to 14 times lower with acetochlor than with any of the other herbicides. Yield differences were observed at locations with waterhemp. This research supports previous research indicating that utilizing soil-residual herbicides along with cover crops improves control of Palmer amaranth and/or waterhemp.
Even in complex cases, simple techniques can be useful for targeting a specific symptom. Intrusive mental images are highly disruptive, drive emotion, and contribute to maintaining psychopathology. Cognitive science suggests that we might target intrusive images using competing tasks.
Aims:
We describe an imagery competing task technique within cognitive behavioural therapy (CBT) with a patient with bipolar disorder and post-traumatic stress disorder (PTSD) symptoms. The intervention – including Tetris computer game-play – was used (1) to target a specific image within one therapy session, and (2) to manage multiple images in daily life.
Method:
A single case (AB) design was used. (1) To target a specific image, the patient brought the image to mind and, after mental rotation instructions and game-play practice, played Tetris for 10 minutes. Outcomes, pre- and post-technique, were: vividness/distress ratings when the image was brought to mind; reported intrusion frequency over a week. (2) To manage multiple images, the patient used the intervention after an intrusive image occurred. Outcomes were weekly measures of: (a) imagery characteristics; (b) symptoms of PTSD, anxiety, depression and mania.
Results:
(1) For the target image, there were reductions in vividness (80% to 40%), distress (70% to 0%), and intrusion frequency (daily to twice/week). (2) For multiple images, there were reductions from baseline to follow-up in (a) imagery vividness (38%), realness (66%) and compellingness (23%), and (b) PTSD symptoms (Impact of Events Scale-Revised score 26.33 to 4.83).
Conclusion:
This low-intensity intervention aiming to directly target intrusive mental imagery may offer an additional, complementary tool in CBT.
Recent declines of wild pollinators and infections in honey, bumble and other bee species have raised concerns about pathogen spillover from managed honey and bumble bees to other pollinators. Parasites of honey and bumble bees include trypanosomatids and microsporidia that often exhibit low host specificity, suggesting potential for spillover to co-occurring bees via shared floral resources. However, experimental tests of trypanosomatid and microsporidial cross-infectivity outside of managed honey and bumble bees are scarce. To characterize potential cross-infectivity of honey and bumble bee-associated parasites, we inoculated three trypanosomatids and one microsporidian into five potential hosts – including four managed species – from the apid, halictid and megachilid bee families. We found evidence of cross-infection by the trypanosomatids Crithidia bombi and C. mellificae, with evidence for replication in 3/5 and 3/4 host species, respectively. These include the first reports of experimental C. bombi infection in Megachile rotundata and Osmia lignaria, and C. mellificae infection in O. lignaria and Halictus ligatus. Although inability to control amounts inoculated in O. lignaria and H. ligatus hindered estimates of parasite replication, our findings suggest a broad host range in these trypanosomatids, and underscore the need to quantify disease-mediated threats of managed social bees to sympatric pollinators.
To evaluate the extent of hypomanic symptoms in patients presenting with a current major depressive episode (MDE) and to identify characteristics differentiating patients with hypomanic symptoms from those with pure unipolar depression, using the HCL-32 self-assessment tool.
Methods
This cross-sectional diagnostic study was performed in eighteen countries. Community- and hospital-based psychiatrists consecutively included all consulting adult patients with a diagnosis of MDE and completed a questionnaire on sociodemographic variables, diagnosis, medical history, treatment and comorbid psychiatric disorders. Each patient completed the Hypomania Self-Rating Scale (HCL-32 R2), and those scoring ≥14 were assigned a diagnosis of bipolar disorder. The frequencies of study variables in the bipolar disorder (BD) and unipolar depression subgroups were compared.
Results
A total of 5635 patients were included. Overall, 1645 (39%) had received a diagnosis of BD, 703 (16%) fulfilled DSM-IV-TR criteria for BD and 2942 (54%) scored ≥14 on the HCL-32. Patients scoring ≥14 on the HCL-32 were significantly more likely to have experienced a mood switch in response to antidepressants (OR: 3.4), a family history of bipolarity (OR: 2.4), comorbid substance abuse (OR: 2.1) or borderline personality disorder (OR: 1.7) and current mixed-state symptoms (OR: 1.5).
Conclusions
In patients with a DSM-IV MDE, self-assessed hypomanic symptoms were present in 54% of cases, whereas only 16% fulfilled DSM-IV criteria for bipolar disorder. However, these patients presented features recognised to be associated with bipolar disorder. Bipolarity may be frequent in patients presenting with a major depressive episode, and use of this questionnaire would help improve awareness and prompt better diagnosis.
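The odds ratios reported above come from a multivariable comparison, but the unadjusted version of such an estimate can be sketched directly from a 2×2 table. A minimal illustration follows; the counts are hypothetical (not from the study), and a Woolf log-normal interval stands in for the model-based confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-normal) 95% CI from a
    2x2 table: a = high scorers with the feature, b = high scorers
    without it, c = low scorers with the feature, d = low scorers
    without it. All counts must be non-zero."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for one feature (e.g. a family history of bipolarity)
or_, lo, hi = odds_ratio_ci(120, 380, 60, 440)
```

In the study itself such estimates would additionally be adjusted for covariates in a multivariable model, which this crude calculation does not do.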
Characteristics of DSM-IV attention-deficit/hyperactivity disorder (ADHD) in adults can also be found as part of other psychiatric disorders. This study investigated the specificity of adult ADHD features in relation to patients with borderline personality disorder (BPD), a syndrome which shares some of its intrinsic features with ADHD and often co-occurs with ADHD. A group of 20 adult patients selected on the basis of a diagnosis of ADHD and 20 patients selected on the basis of a diagnosis of BPD were assessed by the self-report Attention Deficit Scales for Adults (ADSA). The two groups were matched for age, verbal IQ and gender. Of the nine ADSA scales, seven showed significant inter-group differences, in particular involving attention, organisation and persistence. The ‘Consistency/Long-Term’ scale, which mainly reflects impaired task and goal persistence, was the best discriminator between the groups. Furthermore, ratings on this scale correlated significantly with the error score of a computer-administered task of spatial working memory, the performance of which has been reported to be impaired in patients with ADHD. The results provide further validation for the ADSA scales and support a previous claim that ‘long-term consistencies’, i.e., related to task and goal persistence, is ‘the centrepiece behavioural issue’ for adults with ADHD.
Selenium (Se) is an essential element for human health. However, our knowledge of the prevalence of Se deficiency is less than for other micronutrients of public health concern such as iodine, iron and zinc, especially in sub-Saharan Africa (SSA). Studies of food systems in SSA, in particular in Malawi, have revealed that human Se deficiency risks are widespread and influenced strongly by geography. Direct evidence of Se deficiency risks includes nationally representative data of Se concentrations in blood plasma and urine as population biomarkers of Se status. Long-range geospatial variation in Se deficiency risks has been linked to soil characteristics and their effects on the Se concentration of food crops. Selenium deficiency risks are also linked to socio-economic status including access to animal source foods. This review highlights the need for geospatially-resolved data on the movement of Se and other micronutrients in food systems which span agriculture–nutrition–health disciplinary domains (defined as a GeoNutrition approach). Given that similar drivers of deficiency risks for Se, and other micronutrients, are likely to occur in other countries in SSA and elsewhere, micronutrient surveillance programmes should be designed accordingly.
Stressful experiences affect biological stress systems, such as the hypothalamic–pituitary–adrenal (HPA) axis. Life stress can potentially alter regulation of the HPA axis and has been associated with poorer physical and mental health. Little, however, is known about the relative influence of stressors that are encountered at different developmental periods on acute stress reactions in adulthood. In this study, we explored three models of the influence of stress exposure on cortisol reactivity to a modified version of the Trier Social Stress Test (TSST) by leveraging 37 years of longitudinal data in a high-risk birth cohort (N = 112). The cumulative stress model suggests that accumulated stress across the lifespan leads to dysregulated reactivity, whereas the biological embedding model implicates early childhood as a critical period. The sensitization model assumes that dysregulation should only occur when stress is high both in early childhood and concurrently. All of the models predict altered reactivity, but none anticipates its exact form. We found support for both cumulative and biological embedding effects. However, when pitted against each other, early life stress predicted more blunted cortisol responses at age 37 over and above cumulative life stress. Additional analyses revealed that stress exposure in middle childhood also predicted more blunted cortisol reactivity.
A longstanding issue in the field of nutrition is the potential inaccuracy of methods traditionally used for dietary assessment (i.e. food diaries and food frequency questionnaires). It is possible to overcome the limitations and biases of these techniques by combining them with analytical measurements in human biofluids. Metabolomic technologies are gaining popularity as nutritional tools due to their capacity to measure metabolic responses to external stimuli, such as the ingestion of certain foods. This project performed both LC-MS and 1H-NMR metabolomic profiling on serum samples collected as part of the NICOLA study (Northern Irish Cohort for the Longitudinal Study of Aging) in order to discover novel dietary biomarkers. A dietary validation cohort (NIDAS) was incorporated within NICOLA, involving 45 males and 50 females, aged 50 years and over. Participants provided detailed dietary data (4-day food diary) and blood samples at two time-points, six months apart. Serum samples were processed on two analytical platforms. 1H-NMR spectra were acquired using a Bruker 600 MHz Ascent coupled to a TCI cryoprobe and processed using Bayesil (University of Alberta, Canada). A Waters TQ-S coupled with an Acquity I-class UPLC was used in combination with a targeted commercially available kit (AbsoluteIDQ p180 kit, Biocrates). Mass spectra obtained were processed with MetIDQ and verified using MassLynx (v4.1). Data were tested for normality, and metabolite concentrations were correlated with recorded dietary intake of each food type using SPSS. Additional tests (PCA, PLS-DA, ROC Curves) were performed on MetaboAnalyst 4.0 (University of Alberta, Canada). More than 50 statistically significant (P < 0.05) food-metabolite correlations were detected, 15 of which remained significant after eliminating potential confounding from sex, age and BMI. 
The strongest correlations were between fruit consumption and acetic acid, and between dairy consumption and certain glycerophospholipids (e.g. LysoPC aa C20:3). Stratifying the cohort by gender yielded further correlations, including PC ae C38:2 (dairy; males), PC aa C34:4 (dairy; females), PC aa C36:4 (dairy; females) and trans-4-Hydroxyproline (meat; males). A number of potential blood-based food biomarkers were detected, many of which are gender-specific, and some are corroborated by previously published studies. However, further validation work is required. For example, biological plausibility needs to be established, and the findings need to be reproduced in other cohorts to demonstrate their applicability in larger and more diverse populations. These results contribute greatly to the ongoing efforts to discover and validate reliable nutritional biomarkers as an objective and unbiased measurement of food intake.
Indicators are necessary to monitor national progress toward commitments made to the Convention on Biological Diversity (CBD), but countries often struggle to mobilize quantitative indicators for many biodiversity targets. To assess the extent to which countries are using measurable indicators from global and national sources, we surveyed 5th National Reports to the CBD. We found that nationally generated indicators were used 11 times more frequently than global indicators, and that only one-fifth of indicators matched those recommended by the CBD. These findings suggest that countries and indicator experts should work more closely to agree upon measurable, scalable, fit-for-purpose indicators for the next generation of CBD targets.
Evidence is clear that the nation is experiencing an increasing number of incompetent to stand trial (IST) admissions to state hospitals. As a result, defendants in need of treatment can wait in jail for weeks before being admitted for restoration of competence. This study was conducted to better understand this growing population and to inform hospital administration about the characteristics of IST admissions.
Methods.
The study was conducted at the Department of State Hospitals (DSH) facility in Napa (DSH-Napa), a 1200-bed primarily forensic inpatient psychiatric facility located in northern California. The records of patients found IST and admitted to DSH-Napa for restoration of competence between the dates of 1/1/2009 and 12/31/2016 were eligible for inclusion in the study.
Results.
There were a total of 3158 unduplicated IST admissions during the specified time period. Our data indicate that the number of admissions with more than 15 prior arrests increased significantly, from 17.7% in 2009 to 46.4% in 2016. In contrast, the percentage of patients reporting a prior inpatient psychiatric hospitalization decreased consistently over time, from more than 76% in 2009 to less than 50% in 2016.
Conclusion.
Our data add to the body of literature on the potential causes of the nationwide increase in competency referrals. The literature is clear that jails and prisons are now the primary provider of the nation’s mental health care. Our data suggest that another system has assumed this role: state hospitals and other providers charged with restoring individuals to competence.
The intrinsic oxygen fugacity of a planet profoundly influences a variety of its geochemical and geophysical aspects. Most rocky bodies in our solar system formed with oxygen fugacities approximately five orders of magnitude higher than that corresponding to a hydrogen-rich gas of solar composition. Here we derive oxygen fugacities of extrasolar rocky bodies from the elemental abundances in 15 white dwarf (WD) stars polluted by accretion of rocks. We find that the intrinsic oxygen fugacities of rocks accreted by the WDs are similar to those of terrestrial planets and asteroids in our solar system. This result suggests that at least some rocky exoplanets are geophysically and geochemically similar to Earth.
Although the majority of research on revolving-door lobbyists centers on the influence they exercise during their postgovernment careers, relatively little attention has been given to whether future career concerns affect the behavior of revolving-door lobbyists while they still work in government. We argue that the revolving door incentivizes congressional staffers to showcase their legislative skills to the lobbying market in ways that affect policymaking in Congress. Using comprehensive data on congressional staffers, we find that employing staffers who later become lobbyists is associated with higher legislative productivity for members of Congress, especially in staffers’ final terms in Congress. It is also associated with increases in a member’s bill sponsorship in the areas of health and commerce, the topics most frequently addressed by clients in the lobbying industry, as well as with granting more access to lobbying firms. These results provide systematic empirical evidence of pre-exit effects of the revolving door among congressional staff.