Temperature may regulate seed dormancy and germination and determine the geographical distribution of species. The present study investigated the thermal limits for seed germination of Polygonum ferrugineum (Polygonaceae), an aquatic emergent herb distributed throughout tropical and subtropical America. Seed germination responses to light and temperature were evaluated both before (control) and after stratification at 10, 15 and 20°C for 7, 14 and 28 d. Germination of control seeds was ~50% at 10 and 15°C, and seeds did not germinate from 20 to 30°C. The best stratification treatment was 7 d at 10°C, after which seed germination was >76% in the dark at all temperatures except 30°C, and <60% in light conditions. A thermal time approach was applied to the seed germination results. Base temperature (Tb) was 6.3°C for non-dormant seeds, optimal temperature (To) was 20.6°C, ceiling temperature (Tc(50)) was 32.8°C, and the thermal time requirement for 50% germination was 44.4°Cd. We concluded that a fraction of P. ferrugineum seeds is dormant and has a narrow thermal niche for germination (10–15°C), and that cold stratification (10°C) alleviated dormancy and amplified the thermal range permissive for germination of the species. Consequently, P. ferrugineum is expected to occur in colder environments, for example, at high altitudes. Higher temperatures reduce the probability of dormancy alleviation and the ability of seeds to germinate.
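The reported parameters fit the standard sub-optimal thermal time relation, θ50 = (T − Tb)·t50. A minimal sketch, assuming this linear sub-optimal model (valid only between Tb and To; the function name is illustrative, not from the paper):

```python
def days_to_50pct_germination(temperature_c, t_base=6.3, theta_50=44.4):
    """Sub-optimal thermal time model: theta_50 = (T - Tb) * t50, solved for t50.

    Defaults are the values reported for non-dormant P. ferrugineum seeds
    (Tb = 6.3 degC, theta_50 = 44.4 degC d); valid only for Tb < T <= To.
    """
    if temperature_c <= t_base:
        raise ValueError("no thermal time accumulates at or below Tb")
    return theta_50 / (temperature_c - t_base)

# At 20 degC: 44.4 / (20 - 6.3) = roughly 3.2 d to reach 50% germination
print(round(days_to_50pct_germination(20.0), 1))
```

Above To this relation no longer holds; the germination rate declines toward the ceiling temperature Tc(50).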
Reading difficulties are one of the most significant challenges for children with neurofibromatosis type 1 (NF1). The aims of this study were to identify and categorize the types of reading impairments experienced by children with NF1 and to establish predictors of poor reading in this population.
Children aged 7–12 years with NF1 (n = 60) were compared with typically developing children (n = 36). Poor word readers with NF1 were classified according to impairment type (i.e., phonological, surface, mixed), and their reading subskills were compared. A hierarchical multiple regression was conducted to identify predictors of word reading.
Compared to controls, children with NF1 demonstrated significantly poorer literacy abilities. Of the 49 children with NF1 classified as poor readers, 20 (41%) were classified with phonological dyslexia, 24 (49%) with mixed dyslexia, and 5 (10%) fell outside classification categories. Children with mixed dyslexia displayed the most severe reading impairments. Stronger working memory, better receptive language, and fewer inattentive behaviors predicted better word reading skills.
The majority of children with NF1 experience deficits in key reading skills which are essential for them to become successful readers. Weaknesses in working memory, receptive language, and attention are associated with reading difficulties in children with NF1.
The following position statement from the Union of the European Phoniatricians, updated on 25th May 2020 (superseding the previous statement issued on 21st April 2020), contains a series of recommendations for phoniatricians and ENT surgeons who provide and/or run voice, swallowing, speech and language, or paediatric audiology services.
This material specifically aims to inform clinical practices in countries where clinics and operating theatres are reopening for elective work. It endeavours to present a current European view in relation to common procedures, many of which fall under the aegis of aerosol generating procedures.
As evidence continues to build, some of the recommended practices will undoubtedly evolve, but it is hoped that the updated position statement will offer clinicians precepts on safe clinical practice.
Spot spraying POST herbicides is an effective approach to reduce herbicide input and weed control cost. Machine vision detection of grass or grass-like weeds in turfgrass systems is a challenging task due to the similarity in plant morphology. In this work, we explored the feasibility of using image classification with deep convolutional neural networks (DCNN), including AlexNet, GoogLeNet, and VGGNet, for detection of crabgrass species (Digitaria spp.), doveweed [Murdannia nudiflora (L.) Brenan], dallisgrass (Paspalum dilatatum Poir.), and tropical signalgrass [Urochloa distachya (L.) T.Q. Nguyen] in bermudagrass [Cynodon dactylon (L.) Pers.]. VGGNet generally outperformed AlexNet and GoogLeNet in detecting selected grassy weeds. For detection of P. dilatatum, VGGNet achieved high F1 scores (≥0.97) and recall values (≥0.99). A single VGGNet model exhibited high F1 scores (≥0.93) and recall values (1.00) that reliably detected Digitaria spp., M. nudiflora, P. dilatatum, and U. distachya. Low weed density reduced the recall values of AlexNet at detecting all weed species and GoogLeNet at detecting Digitaria spp. In comparison, VGGNet achieved excellent performances (overall accuracy = 1.00) at detecting all weed species in both high and low weed-density scenarios. These results demonstrate the feasibility of using DCNN for detection of grass or grass-like weeds in turfgrass systems.
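The F1 scores and recall values reported above follow from the standard definitions over true positives, false positives and false negatives. A minimal sketch (the counts below are invented for illustration, not taken from the study):

```python
def precision_recall_f1(tp, fp, fn):
    """Standard detection metrics from confusion counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Invented counts: 99 correct detections, 3 false alarms, 1 missed plant
precision, recall, f1 = precision_recall_f1(tp=99, fp=3, fn=1)
print(round(recall, 2), round(f1, 2))
```

A recall of 1.00, as reported for the single VGGNet model, means no target weed went undetected in the validation images.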
The use of mechanical restraint is a challenging area for psychiatry. Although mechanical restraint remains accepted as standard practice in some regions, there are ethical, legal and medical reasons to minimise or abolish its use. These concerns have intensified following the Convention on the Rights of Persons with Disabilities. Despite national policies to reduce use, the reporting of mechanical restraint has been poor, hampering a reasonable understanding of the epidemiology of restraint. This paper aims to develop a consistent measure of mechanical restraint and compare the measure within and across countries in the Pacific Rim.
We used the publicly available data from four Pacific Rim countries (Australia, New Zealand, Japan and the United States) to compare and contrast the reported rates of mechanical restraint. Summary measures were computed so as to enable international comparisons. Variation within each jurisdiction was also analysed.
International rates of mechanical restraint in 2017 varied from 0.03 (New Zealand) to 98.8 (Japan) restraint events per million population per day, a variation greater than 3000-fold. Restraint in Australia (0.17 events per million) and the United States (0.37 events per million) fell between these two extremes. Variation as measured by restraint events per 1000 bed-days was less extreme but still substantial. Within all four countries there was also significant variation in restraint across districts. Variation across time did not show a steady reduction in restraint in any country during the period for which data were available (starting from 2003 at the earliest).
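The greater-than-3000-fold figure can be checked directly from the reported extremes:

```python
# Reported 2017 rates of mechanical restraint (events per million population per day)
rates = {"New Zealand": 0.03, "Australia": 0.17, "United States": 0.37, "Japan": 98.8}

# Ratio of the highest to the lowest national rate
fold_variation = max(rates.values()) / min(rates.values())
print(round(fold_variation))  # 3293, i.e. greater than 3000-fold
```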
Policies to reduce or abolish mechanical restraint do not appear to be effecting change. It is improbable that the variation in restraint within the four examined Pacific Rim countries can be accounted for by psychopathology. Greater efforts at reporting, monitoring and carrying out interventions to achieve the stated aim of reducing restraint are urgently needed.
This article introduces the concept of playful work design—the process through which employees proactively create conditions within work activities that foster enjoyment and challenge without changing the design of the job itself. First, we review play theory and the motives people may have to play during work. In addition, we use the literature on proactive work behavior to argue that individuals can take personal initiative to increase person-job fit. Combining these literatures, we provide a theoretical framework for playful work design. We discuss the development and validation of an instrument to assess playful work design, and review recent studies to elucidate the psychological effects of playful work design and its possible outcomes. Finally, we briefly discuss practical implications.
Background: There is an unmet need for blood-based biomarkers that can reliably detect MS disease activity. Serum biomarkers of interest include neurofilament light chain (NfL), glial fibrillary acidic protein (GFAP) and Tau. Bone marrow transplantation (BMT) is reserved for aggressive forms of MS and has been shown to halt detectable CNS inflammatory activity for prolonged periods. Significant pre-treatment tissue damage followed by inflammatory disease abeyance should therefore be reflected in longitudinal sera collected from these patients. Methods: Sera were collected from 23 MS patients pre-treatment and at 3, 6, 9 and 12 months following BMT, in addition to sera from 33 non-inflammatory neurological controls. Biomarker quantification was performed with SiMoA. Results: Pre-treatment levels of serum NfL and GFAP, but not Tau, were elevated compared with controls (p=0.0001), and NfL correlated with lesion-based disease activity (6-month relapse, MRI T2 and gadolinium enhancement). At 3 months post-treatment, while NfL levels remained elevated, Tau and GFAP paradoxically increased (p=0.0023 and 0.0017, respectively). These increases at 3 months correlated with MRI 'pseudoatrophy' at 6 months. NfL and Tau levels dropped to those of controls by 6 months (p=0.0036 and 0.0159). GFAP levels declined progressively after 6 months, although even at 12 months they remained higher than in controls (p=0.004). Conclusions: NfL was the closest correlate of MS disease activity and treatment response. Chemotherapy-related toxicity may account for transient increases in NfL, Tau and MRI brain atrophy post-BMT.
Social patterning of infectious diseases is increasingly recognised. Previous studies of social determinants of acute respiratory illness (ARI) have found that highly educated and lower income families experience more illnesses. Subjective social status (SSS) has also been linked to symptomatic ARI, but the association may be confounded by household composition. We examined SSS and ARI in the Household Influenza Vaccine Evaluation (HIVE) Study in 2014–2015. We used SSS as a marker of social disadvantage and created a workplace disadvantage score for working adults. We examined the association between these measures and ARI incidence using mixed-effects Poisson regression models with random intercepts to account for household clustering. In univariate analyses, mean ARI was higher among children <5 years old (P < 0.001), and females (P = 0.004) at the individual level. At the household level, mean ARI was higher for households with at least one child <5 years than for those without (P = 0.002). In adjusted models, individuals in the lowest tertile of SSS had borderline significantly higher rates of ARI than those in the highest tertile (incidence rate ratio (IRR) 1.34, 95% confidence interval (CI) 0.98–1.92). Households in the lowest tertile of SSS had significantly higher ARI incidence in household-level models (IRR 1.46, 95% CI 1.05–2.03). We observed no association between workplace disadvantage and ARI. We detected an increase in the incidence of ARI for households with low SSS compared with those with high SSS, suggesting that socio-economic position has a meaningful impact on ARI incidence.
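For interpretation, an incidence rate ratio is simply the ratio of two incidence rates. The study's estimates come from adjusted mixed-effects Poisson models, but a crude sketch with hypothetical counts illustrates the arithmetic behind an IRR of 1.46:

```python
def incidence_rate_ratio(events_a, person_time_a, events_b, person_time_b):
    """Crude incidence rate ratio of group A relative to group B."""
    return (events_a / person_time_a) / (events_b / person_time_b)

# Hypothetical counts: 73 ARI episodes over 500 person-seasons (low-SSS households)
# versus 50 episodes over 500 person-seasons (high-SSS households)
irr = incidence_rate_ratio(73, 500, 50, 500)
print(round(irr, 2))  # 1.46, i.e. a 46% higher ARI rate in the low-SSS group
```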
Preemergence herbicides are typically applied by broadcasting to the top of raised beds before laying the plastic mulch in plasticulture production systems. Broadleaf and grass emergence is limited to transplant holes in the mulch. As a result, most herbicides are applied under the mulch in locations where weeds cannot emerge and herbicides are unnecessary. To reduce this excessive off-target application, a precision hole-punch sprayer was developed at the University of Florida for use in plasticulture production systems. The technology facilitates the application of herbicides during the hole-punch operation immediately before transplanting. Application of napropamide and S-metolachlor in an application volume of 233 L ha−1 of water using the precision hole-punch applicator had no effect on tomato and bell pepper growth and yield. Equipment accuracy ranged from 55% to 90%. Preemergence herbicide use was reduced by 88% to 92% with no reduction in weed control. The hole-punch applicator is an effective way to reduce preemergence herbicide use in transplanted vegetables grown using the plasticulture production system.
Weed interference during crop establishment is a serious concern for Florida strawberry [Fragaria×ananassa (Weston) Duchesne ex Rozier (pro sp.) [chiloensis×virginiana]] producers. In situ remote detection for precision herbicide application reduces both the risk of crop injury and herbicide inputs. Carolina geranium (Geranium carolinianum L.) is a widespread broadleaf weed within Florida strawberry production with sensitivity to clopyralid, the only available POST broadleaf herbicide. Geranium carolinianum leaf structure is distinct from that of the strawberry plant, which makes it an ideal candidate for pattern recognition in digital images via convolutional neural networks (CNNs). The study objective was to assess the precision of three CNNs in detecting G. carolinianum. Images of G. carolinianum growing in competition with strawberry were gathered at four sites in Hillsborough County, FL. Three CNNs were compared: object detection–based DetectNet, and image classification–based VGGNet and GoogLeNet. Two DetectNet networks were trained to detect either leaves or canopies of G. carolinianum. Image classification using GoogLeNet and VGGNet was largely unsuccessful during validation with whole images (F score < 0.02). CNN training using cropped images increased G. carolinianum detection during validation for VGGNet (F score = 0.77) and GoogLeNet (F score = 0.62). The G. carolinianum leaf–trained DetectNet achieved the highest F score (0.94) for plant detection during validation. Leaf-based detection led to more consistent detection of G. carolinianum within the strawberry canopy and reduced recall-related errors encountered in canopy-based training. The smaller target of leaf-based DetectNet did increase false positives, but such errors can be overcome with additional training images for network desensitization. DetectNet was the most viable CNN tested for image-based remote sensing of G. carolinianum in competition with strawberry.
Future research will identify the optimal approach for in situ detection and integrate the detection technology with a precision sprayer.
To explore the prevalence and drivers of hospital-level variability in antibiotic utilization among hematopoietic cell transplant (HCT) recipients to inform antimicrobial stewardship initiatives.
Retrospective cohort study using data merged from the Pediatric Health Information System and the Center for International Blood and Marrow Transplant Research.
The study included 27 transplant centers in freestanding children’s hospitals.
The primary outcome was days of broad-spectrum antibiotic use in the interval from day of HCT through neutrophil engraftment. Hospital antibiotic utilization rates were reported as days of therapy (DOTs) per 1,000 neutropenic days. Negative binomial regression was used to estimate hospital utilization rates, adjusting for patient covariates including demographics, transplant characteristics, and severity of illness. To better quantify the magnitude of hospital variation and to explore hospital-level drivers in addition to patient-level drivers of variation, mixed-effects negative binomial models were also constructed.
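The utilization metric can be sketched as follows (the counts below are hypothetical; the study's models additionally adjust for patient covariates):

```python
def dot_per_1000_neutropenic_days(antibiotic_days, neutropenic_days):
    """Days of therapy (DOT) normalised per 1,000 neutropenic days."""
    return antibiotic_days * 1000 / neutropenic_days

# Hypothetical hospital: 560 antibiotic-days observed over 500 neutropenic days
rate = dot_per_1000_neutropenic_days(560, 500)
print(rate)  # 1120.0 DOTs per 1,000 neutropenic days
```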
Adjusted hospital rates of antipseudomonal antibiotic use varied from 436 to 1,121 DOTs per 1,000 neutropenic days, and rates of broad-spectrum, gram-positive antibiotic use varied from 153 to 728 DOTs per 1,000 neutropenic days. We detected variability by hospital in choice of antipseudomonal agent (ie, cephalosporins, penicillins, and carbapenems), but gram-positive coverage was primarily driven by vancomycin use. Considerable center-level variability remained even after controlling for additional hospital-level factors. Antibiotic use was not strongly associated with days of significant illness or mortality.
Among a homogenous population of children undergoing HCT for acute leukemia, both the quantity and spectrum of antibiotic exposure in the immediate posttransplant period varied widely. Antimicrobial stewardship initiatives can apply these data to optimize the use of antibiotics in transplant patients.
As utilization of CT imaging has risen dramatically, evidence-based decision rules and clinical decision support (CDS) tools have been developed to avoid unnecessary CT use in low risk patients. However, their ability to change physician practice has been limited to date, with a number of barriers cited. The purpose of this study was to identify the barriers and facilitators to CDS adoption following a local CDS implementation. Methods: All emergency physicians at 4 urban EDs and 1 urgent care center were randomized to voluntary evidence-based CT imaging CDS for patients with either mild traumatic brain injury (MTBI) or suspected pulmonary embolism (PE). CDS was integrated into the computerized physician order entry (CPOE) software and triggered whenever a CT scan for an eligible patient was ordered. Physicians in both the MTBI and PE arms were ranked according to their CDS use, and a stratified sampling strategy was used to randomly select 5 physicians from each of the low, medium and high CDS use tertiles in each study arm. Each physician was invited to participate in a 30-minute semi-structured interview to assess the barriers and facilitators to CDS use. Physician responses were reported using a thematic analysis. Results: A total of 202 emergency physicians were randomized to receive CDS for either MTBI or PE, triggering CDS 4561 times, and interacting with the CDS software 1936 times (42.4%). Variation in CDS use ranged from 0% to 88.9% of eligible encounters by physician. Fourteen physicians have participated in interviews to date, and data collection is ongoing. Physicians reported that CDS use was facilitated by their confidence in the evidence supporting the CDS algorithms and that it provided documentation to reduce medico-legal risk. CDS use was not impeded by concerns over missed diagnoses or patient expectations. 
Reported barriers to CDS use included suboptimal integration into the CPOE, such as the inability to auto-populate test results; physicians also reported that CDS disrupted the ordering process and was time consuming. A common concern was that CDS was implemented too late in the workflow, as most decision making takes place at the bedside. Physicians did not view CDS as infringing on physician autonomy; however, they advised that CDS should be a passive educational option and should not automatically trigger for all physicians and eligible encounters. Conclusion: Physicians were generally supportive of CDS integration into practice and were confident that CDS is an evidence-based way to reduce unnecessary CT studies. However, concerns were raised about the optimal integration of CDS into CPOE and workflow. Physicians also stated a preference for a passive educational approach to CDS rather than an automatic triggering mechanism requiring clinical documentation.
The controls on rapid surface lake drainage on the Greenland ice sheet (GrIS) remain uncertain, making it challenging to incorporate lake drainage into models of GrIS hydrology, and so to determine the ice-dynamic impact of meltwater reaching the ice-sheet bed. Here, we first use a lake area and volume tracking algorithm to identify rapidly draining lakes within West Greenland during summer 2014. Second, we derive hydrological, morphological, glaciological and surface-mass-balance data for various factors that may influence rapid lake drainage. Third, these factors are used within Exploratory Data Analysis to examine existing hypotheses for rapid lake drainage. This involves testing for statistical differences between the rapidly and non-rapidly draining lake types, as well as examining associations between lake size and the potential controlling factors. This study shows that the two lake types are statistically indistinguishable for almost all factors investigated, except lake area. Thus, we are unable to recommend an empirically supported, deterministic alternative to the fracture area threshold parameter for modelling rapid lake drainage within existing surface-hydrology models of the GrIS. However, if improved remotely sensed datasets (e.g. ice-velocity maps, climate model outputs) were included in future research, it may be possible to detect the causes of rapid drainage.
Avian influenza virus (AIV) subtypes H5 and H7 can infect poultry causing low pathogenicity (LP) AI, but these LPAIVs may mutate to highly pathogenic AIV in chickens or turkeys causing high mortality, hence H5/H7 subtypes demand statutory intervention. Serological surveillance in the European Union provides evidence of H5/H7 AIV exposure in apparently healthy poultry. To identify the most sensitive screening method as the first step in an algorithm to provide evidence of H5/H7 AIV infection, the standard approach of H5/H7 antibody testing by haemagglutination inhibition (HI) was compared with an ELISA, which detects antibodies to all subtypes. Sera (n = 1055) from 74 commercial chicken flocks were tested by both methods. A Bayesian approach served to estimate diagnostic test sensitivities and specificities, without assuming any ‘gold standard’. Sensitivity and specificity of the ELISA was 97% and 99.8%, and for H5/H7 HI 43% and 99.8%, respectively, although H5/H7 HI sensitivity varied considerably between infected flocks. ELISA therefore provides superior sensitivity for the screening of chicken flocks as part of an algorithm, which subsequently utilises H5/H7 HI to identify infection by these two subtypes. With the calculated sensitivity and specificity, testing nine sera per flock is sufficient to detect a flock seroprevalence of 30% with 95% probability.
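The nine-sera recommendation follows from the probability of observing at least one positive result among n sampled sera. A sketch, assuming independent sampling from a large flock (the function name is illustrative; the authors' Bayesian calculation may differ in detail):

```python
def flock_detection_probability(seroprevalence, n_samples, sensitivity=1.0):
    """Probability that at least one of n randomly sampled sera tests
    positive, assuming independent draws from a large flock."""
    p_positive_test = seroprevalence * sensitivity
    return 1.0 - (1.0 - p_positive_test) ** n_samples

# Nine sera at 30% seroprevalence with a perfect test: ~0.96 (>= 0.95);
# it remains above 0.95 with the ELISA's estimated 97% sensitivity.
print(round(flock_detection_probability(0.30, 9), 2))
```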
Fragile X mental retardation 1 (FMR1) full-mutation expansion causes fragile X syndrome. Trans-generational fragile X syndrome transmission can be avoided by preimplantation genetic diagnosis (PGD). We describe a robust PGD strategy that can be applied to virtually any couple at risk of transmitting fragile X syndrome. This novel strategy utilises whole-genome amplification, followed by triplet-primed polymerase chain reaction (TP-PCR) for robust detection of expanded FMR1 alleles, in parallel with linked multi-marker haplotype analysis of 13 highly polymorphic microsatellite markers located within 1 Mb of the FMR1 CGG repeat, and the AMELX/Y dimorphism for gender identification. The assay was optimised and validated on single lymphoblasts isolated from fragile X reference cell lines, and applied to a simulated PGD case and a clinical in vitro fertilisation (IVF)-PGD case. In the simulated PGD case, definitive diagnosis of the expected results was achieved for all ‘embryos’. In the clinical IVF-PGD case, delivery of a healthy baby girl was achieved after transfer of an expansion-negative blastocyst. FMR1 TP-PCR reliably detects the presence of expansion mutations and obviates reliance on informative normal alleles for determining expansion status in female embryos. Together with multi-marker haplotyping and gender determination, misdiagnosis and diagnostic ambiguity due to allele dropout are minimised, and couple-specific assay customisation can be avoided.
Identifying youth who may engage in future substance use could facilitate early identification of substance use disorder vulnerability. We aimed to identify biomarkers that predicted future substance use in psychiatrically un-well youth.
LASSO regression for variable selection was used to predict substance use 24.3 months after neuroimaging assessment in 73 behaviorally and emotionally dysregulated youth aged 13.9 (s.d. = 2.0) years, 30 female, from three clinical sites in the Longitudinal Assessment of Manic Symptoms (LAMS) study. Predictor variables included neural activity during a reward task, cortical thickness, and clinical and demographic variables.
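LASSO couples least squares with an L1 penalty that shrinks uninformative coefficients exactly to zero, performing variable selection during model fitting. A numpy-only sketch on synthetic data (none of the study's actual predictors, values or tuning are reproduced; all names and the penalty value are illustrative):

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator used in the LASSO coordinate update."""
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, alpha, n_iter=200):
    """Minimise (1/2n)||y - Xb||^2 + alpha*||b||_1 by cyclic coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding feature j's current contribution
            partial_residual = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ partial_residual / n
            beta[j] = soft_threshold(rho, alpha) / col_norm[j]
    return beta

# Synthetic stand-in: 73 participants (as in the LAMS sample), 20 candidate
# predictors, of which only the first two actually drive the outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(73, 20))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=73)

beta = lasso_coordinate_descent(X, y, alpha=0.1)
selected = np.flatnonzero(np.abs(beta) > 1e-8)  # indices of retained predictors
```

The informative predictors are retained with shrunken coefficients while most noise predictors are zeroed out; in practice a library implementation such as scikit-learn's Lasso with a cross-validated penalty would typically be used.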
Future substance use was associated with higher left middle prefrontal cortex activity, lower left ventral anterior insula activity, thicker caudal anterior cingulate cortex, higher depression and lower mania scores, not using antipsychotic medication, more parental stress, and older age. This combination of variables explained 60.4% of the variance in future substance use, and accurately classified 83.6% of participants.
These variables explained a large proportion of the variance, were useful classifiers of future substance use, and showed the value of combining multiple domains to provide a comprehensive understanding of substance use development. This may be a step toward identifying neural measures that can identify future substance use disorder risk, and act as targets for therapeutic interventions.
Supraglacial ponds play a key role in absorbing atmospheric energy and directing it to the ice of debris-covered glaciers, but the spatial and temporal distribution of these features is not well documented. We analyse 172 Landsat TM/ETM+ scenes for the period 1999–2013 to identify thawed supraglacial ponds for the debris-covered tongues of five glaciers in the Langtang Valley of Nepal. We apply an advanced atmospheric correction routine (Landcor/6S) and use band ratio and image morphological techniques to identify ponds and validate our results with 2.5 m Cartosat-1 observations. We then characterize the spatial, seasonal and interannual patterns of ponds. We find high variability in pond incidence between glaciers (May–October means of 0.08–1.69% of debris area), with ponds most frequent in zones of low surface gradient and velocity. The ponds show pronounced seasonality, appearing in the pre-monsoon as snow melts, peaking at the monsoon onset at 2% of debris-covered area, then declining in the post-monsoon as ponds drain or freeze. Ponds are highly recurrent and persistent, with 40.5% of pond locations occurring for multiple years. Rather than a trend in pond cover over the study period, we find high interannual variability for each glacier after controlling for seasonality.
In current practice, children with anatomically normal hearts routinely undergo fluoroscopy-free ablations. Infants and children with congenital heart disease (CHD) represent the most difficult population to perform catheter ablation without fluoroscopy. We report two neonatal patients with CHD in whom cardiac ablations were performed without fluoroscopy. The first infant had pulmonary atresia with intact ventricular septum with refractory supraventricular tachycardia, and the second infant presented with Ebstein’s anomaly of the tricuspid valve along with persistent supraventricular tachycardia. Both patients underwent uncomplicated, successful ablation without recurrence of arrhythmias. These cases suggest that current approaches to minimising fluoroscopy may be useful even in challenging patients such as neonates with CHD.