Lusala (Dioscorea hirtiflora Benth. subsp. pedicellata Milne-Redh.) is an important wild edible tuber foraged widely from natural forests in Southern Zambia, but it is at risk from overharvesting and deforestation. Its propagation was investigated in glasshouse studies to explore potential domestication and future in situ and ex situ genetic resources conservation. Almost all tubers planted with visible shoot buds produced vines, with no effect of tuber size on vine emergence or tuber yield. Few tubers without visible shoot buds at planting produced vines; those that failed to vine instead re-tuberized. The progeny tubers provided good vine emergence and similar tuber yield, with vines from tubers produced by re-tuberization being more vigorous. Re-tuberization in the absence of vine emergence also occurred in other experiments. Minisetts cut from the proximal end of tubers provided better vine emergence (with more from 20-mm than from 10-mm-long sections) and greater tuber yield than mid-section or distal minisetts. Nodal stem cuttings rooted well, vined, and produced small tubers. This study shows that lusala can be propagated successfully from tubers, minisetts, nodal vine cuttings, or mini-tubers from nodal vine cuttings, for genetic resources conservation and/or domestication. Domestication is, however, likely to be hampered by the long period required for vines to emerge and establish. More sustainable foraging, including re-planting in natural forests, is recommended to balance consumption of lusala in the region and promote its long-term conservation.
We reviewed the current state of research on the applications of transcranial magnetic stimulation (TMS) and repetitive TMS (rTMS) in understanding the pathophysiology of ADHD and in its treatment.
Objectives
To assess how TMS has furthered our knowledge of neurobiological models of ADHD, to examine possible applications of rTMS in the management of ADHD, to evaluate the current state of research, and to consider directions for further work.
Methods
Literature review using an online search.
Results
The investigative studies are few in number but show some promising results. TMS adds weight to the theory that a hypofunctional dopaminergic circuit is involved in ADHD pathophysiology. The only two treatment studies using rTMS show some promise in the treatment of ADHD, such as brief improvement in attention. These studies, however, are very preliminary, have small samples, and suffer from methodological difficulties.
Conclusions
TMS has provided some useful information about the likely pathophysiology of ADHD, and results show that it is a safe and effective way to investigate and treat this condition. Much more research is needed to investigate the potential applications of this technology.
Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and to generate and evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training and the accuracy/precision of predictions on-farm. The new wave of digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MMs, which have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. In order to identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MMs and new DD methodologies (ML and DL analytics) is undertaken. We hypothesize that synergy may be achieved in several ways to advance both our predictive capabilities and our system understanding: (1) building and utilizing data streams (e.g. intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near-infrared sensors) to apply MMs in real time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, an ML framework is augmented by MM-generated parameters or predicted outcomes; and (3) hybridization of MM and DD approaches where biological bounds are placed on parameters within an MM framework and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data, to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
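As a rough illustration of hybridization route (2) above, the following Python sketch augments a data-driven model with a mechanistic-model prediction supplied as an extra input feature. Everything here is hypothetical: the "mechanistic" function, the sensor variables and the synthetic data are simplified stand-ins, not any model from the literature.

```python
# Hypothetical sketch of hybridization route (2): a data-driven model
# augmented with a mechanistic-model prediction as an input feature.
# The mechanistic "model" below is a deliberately simplified stand-in.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic sensor data for n animals: intake and an activity index.
n = 500
intake = rng.uniform(15, 25, n)        # dry-matter intake, kg/day
activity = rng.uniform(0, 1, n)        # normalized activity-sensor score

def mechanistic_yield(intake_kg):
    """Toy causal relationship: yield rises with intake, with
    diminishing returns (illustrative only, not a published MM)."""
    return 5.0 + 2.0 * intake_kg - 0.03 * intake_kg ** 2

# "True" yield = mechanistic core + effects the MM does not capture.
y = mechanistic_yield(intake) + 3.0 * activity + rng.normal(0, 0.5, n)

# Purely data-driven feature set vs. one augmented with the MM output.
X_dd = np.column_stack([intake, activity])
X_hybrid = np.column_stack([intake, activity, mechanistic_yield(intake)])

for name, X in [("data-driven only", X_dd), ("hybrid MM+DD", X_hybrid)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
    print(f"{name}: R^2 = {model.score(X_te, y_te):.3f}")
```

In toy runs of this kind the hybrid feature set tends to score at least as well as the purely data-driven one; the point of the sketch is the plumbing, not the numbers.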
Drought and high temperature each damage rice (Oryza sativa L.) crops. Their effects during seed development and maturation on subsequent seed quality development were investigated in Japonica (cv. Gleva) and Indica (cv. Aeron 1) plants grown in controlled environments and subjected to drought (irrigation ended) and/or brief high temperature (HT; 3 days at 40/30°C). Ending irrigation early in cv. Gleva (7 or 14 days after anthesis, DAA) resulted in earlier plant senescence, a more rapid decline in seed moisture content and more rapid seed quality development initially, but a substantial later decline in planta in the ability of seeds to germinate normally. Subsequent seed storage longevity amongst later harvests was greatest with no drought, because with drought it declined in planta from 16 or 22 DAA onwards, i.e. 9 or 8 days after irrigation ended, respectively. Later drought (14 or 28 DAA) also reduced seed longevity at harvest maturity (42 DAA). Well-irrigated plants provided poorer longevity the earlier during seed development they were exposed to HT (damage was greatest at anthesis and histodifferentiation, with no effect during seed maturation). Combining drought and HT damaged seed quality more than each stress alone, and more so in the Japonica cv. Gleva than the Indica cv. Aeron 1. Overall, the earlier plant drought occurred, the greater the damage to subsequent seed quality; seed quality was most vulnerable to damage from plant drought and HT at anthesis and histodifferentiation; and seed quality of the Indica rice was more resilient to damage from these stresses than that of the Japonica.
This chapter presents reflections on next-generation ethical issues by four deans at the University of Southern California: Public Policy, Medicine, Business, and Engineering. Each of the deans was asked to reflect on some of the important ethical issues that they believe we face today or that we will face in the near future. Their responses follow.
The long-standing hypothesis that seed quality improves during seed filling, is greatest at the end of seed filling, and declines thereafter (because seed deterioration was assumed to begin then) provided a template for research in seed quality development. It was rejected by investigations in which seed quality was shown to improve throughout both seed development and maturation until harvest maturity, before seed deterioration was first observed. Several other temporal patterns of seed quality development and decline have also been reported; these are portrayed and compared here. The assessment suggests that the original hypothesis was too simple, because it combined several component hypotheses: (a) the seed improvement (only) phase ends before seed deterioration (only) commences; (b) there is only a brief single point in time during seed development and maturation when, in all circumstances, seed quality is maximal; (c) the seed quality improvement phase coincides perfectly with seed filling, with deterioration only post-seed filling. It is concluded that the search for a single point of maximum seed quality was a false quest because (a) seed improvement and deterioration may cycle (sequentially if not simultaneously) during seed development and maturation; (b) the relative sensitivities of the rates of improvement and deterioration to environment may differ; and (c) the period of maximum quality may be brief or extended. Hence, when maximum quality is first attained, and for how long it is maintained, during seed development and maturation varies with genotype and environment. This is pertinent to quality seed production in current and future climates, as seed production will be affected by climate change and by a likelihood of more frequent coincidence of brief periods of extreme temperatures with highly sensitive phases of seed development and maturation. This is a possible tipping point for food security and for ecological diversity.
England has recently started a new paediatric influenza vaccine programme using a live-attenuated influenza vaccine (LAIV). There is uncertainty over how well the vaccine protects against more severe end-points. A test-negative case–control study was used to estimate vaccine effectiveness (VE) against laboratory-confirmed influenza hospitalisation in vaccine-eligible children aged 2–16 years in England in the 2015–2016 season, using a national sentinel laboratory surveillance system. Logistic regression was used to estimate VE with adjustment for sex, risk group, age group, region, ethnicity, index of multiple deprivation and month of sample collection. A total of 977 individuals were included in the study (348 cases and 629 controls). The overall adjusted VE for all study ages and vaccine types was 33.4% (95% confidence interval (CI) 2.3–54.6). Risk group was shown to be an important confounder. The adjusted VE for all influenza types was 41.9% (95% CI 7.3–63.6) for the live-attenuated vaccine and 28.8% (95% CI −31.1 to 61.3) for the inactivated vaccine. The study provides evidence of the effectiveness of influenza vaccination in preventing hospitalisation due to laboratory-confirmed influenza in children in 2015–2016 and continues to support the rollout of the LAIV childhood programme.
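For readers unfamiliar with the test-negative design, the estimator is VE = (1 − OR) × 100, where OR is the adjusted odds ratio of vaccination among test-positive cases relative to test-negative controls. The Python sketch below shows that calculation with statsmodels on synthetic data; all column names and values are hypothetical, and several of the study's confounders are omitted for brevity.

```python
# Hedged sketch of the test-negative design: VE = (1 - OR) * 100,
# with OR the adjusted odds ratio of vaccination in test-positive
# cases versus test-negative controls. Data are synthetic, not the
# study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 977  # study size reported in the abstract
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),          # 1 = influenza-positive
    "vaccinated": rng.integers(0, 2, n),
    "age_group": rng.choice(["2-4", "5-10", "11-16"], n),
    "risk_group": rng.integers(0, 2, n),
    "sample_month": rng.choice(["Dec", "Jan", "Feb", "Mar"], n),
})

# Logistic regression of case status on vaccination, adjusting for
# confounders (sex, region, ethnicity and deprivation omitted here).
fit = smf.logit("case ~ vaccinated + C(age_group) + risk_group"
                " + C(sample_month)", data=df).fit(disp=False)

odds_ratio = np.exp(fit.params["vaccinated"])
lo, hi = np.exp(fit.conf_int().loc["vaccinated"])
print(f"VE = {(1 - odds_ratio) * 100:.1f}% "
      f"(95% CI {(1 - hi) * 100:.1f} to {(1 - lo) * 100:.1f})")
```

Note that the CI bounds invert: the upper confidence limit of the odds ratio gives the lower limit of VE.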
Effective communication is a critical part of managing an emergency. During an emergency, the ways in which health agencies normally communicate warnings may not reach all of the intended audience. Not all communities are the same, and households within communities are diverse. Because different communities prefer different communication methods, community leaders and emergency planners need to know their communities’ preferred methods for seeking information about an emergency. This descriptive report explores findings from previous community assessments that have collected information on communication preferences, including television (TV), social media, and word-of-mouth (WoM) delivery methods. Data were analyzed from 12 Community Assessments for Public Health Emergency Response (CASPERs) conducted from 2014 through 2017 that included questions regarding primary and trusted communication sources. A CASPER is a rapid needs assessment designed to gather household-based information from a community. In 75.0% of the CASPERs, households reported TV as their primary source of information for specific emergency events (range = 24.0%-83.1%). The proportion of households reporting social media as their primary source of information differed widely across CASPERs (3.2%-41.8%). In five of the CASPERs, nearly one-half of households reported WoM as their primary source of information; these CASPERs were conducted in response to a specific emergency (ie, chemical spill, harmful algal bloom, hurricane, and flood). The CASPERs conducted as part of a preparedness activity had lower percentages of households reporting WoM as their primary source of information (8.3%-10.4%). The findings in this report demonstrate the need for emergency plans to include hybrid communication models, combining traditional methods with newer technologies to reach the broadest audience. Although TV was the most commonly reported preferred source of information, segments of the population relied on social media and WoM messaging. By using multiple methods for risk communication, emergency planners are more likely to reach the whole community and engage vulnerable populations that might not have access to, trust in, or understanding of traditional news sources. Multiple communication channels that include user-generated content, such as social media and WoM, can increase the timeliness of messaging and provide community members with message confirmation from sources they trust, encouraging them to take protective public health actions.
Wolkin AF, Schnall AH, Nakata NK, Ellis EM. Getting the Message Out: Social Media and Word-of-Mouth as Effective Communication Methods during Emergencies. Prehosp Disaster Med. 2019;34(1):89–94.
The biogeographic histories of parasites and pathogens are infrequently compared with those of free-living species, including their hosts. Documenting the frequency with which parasites and pathogens disperse across geographic regions contributes to understanding not only their evolution, but also the likelihood that they may become emerging infectious diseases. Avian haemosporidians (genera Plasmodium, Haemoproteus and Leucocytozoon) are globally distributed, dipteran-vectored parasites of birds. To date, over 2000 avian haemosporidian lineages have been designated by molecular barcoding methods. To achieve their current distributions, some lineages must have dispersed long distances, often over water. Here we quantify such events using the global avian haemosporidian database MalAvi and additional records, primarily from the Americas. We scored lineages as belonging to one or more global biogeographic regions based on infection records. Most lineages were restricted to a single region, but some were globally distributed. We also used part of the cytochrome b gene to create genus-level parasite phylogenies and scored well-supported nodes as having descendant lineages in regional sympatry or allopatry. Descendant sister lineages of Plasmodium, Haemoproteus and Leucocytozoon were distributed in allopatry at 11%, 16% and 15% of investigated nodes, respectively. Although a small but significant fraction of the molecular variance in cytochrome b of all three genera could be explained by biogeographic region, global parasite dispersal likely contributed to the majority of the unexplained variance. Our results suggest that avian haemosporidian parasites have faced few geographic barriers to dispersal over their evolutionary history.
Significant increases in excess all-cause mortality, particularly in the elderly, were observed during the winter of 2014/15 in England. With influenza A(H3N2) the dominant circulating influenza A subtype, this paper determines the contribution of influenza to this excess, controlling for weather. A standardised multivariable Poisson regression model was employed with weekly all-cause deaths as the dependent variable for the period 2008–2015. Adjusting for extreme temperature, a total of 26 542 (95% CI 25 301–27 804) deaths in those aged 65+ years and 1942 (95% CI 1834–2052) in 15–64-year-olds were associated with influenza from week 40, 2014 to week 20, 2015. This is compatible with the circulation of influenza A(H3N2). It is the largest estimated number of influenza-related deaths in England since prior to 2008/09. The findings highlight the potential health impact of influenza and the important role of the annual influenza vaccination programme in protecting the population, including the elderly, who are vulnerable to a severe outcome.
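The following is a hedged sketch of the general attribution approach (not the authors' exact specification): fit a Poisson regression of weekly all-cause deaths on influenza activity and an extreme-temperature indicator, then take attributable deaths as the difference between the fitted values and a counterfactual prediction with influenza activity set to zero. Data and variable names are synthetic.

```python
# Hedged sketch of influenza-attributable mortality estimation via
# Poisson regression. Synthetic data; not the study's model or inputs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
weeks = 7 * 52  # roughly the 2008-2015 study period
df = pd.DataFrame({
    "flu_activity": rng.gamma(1.0, 5.0, weeks),  # e.g. weekly positivity
    "extreme_cold": rng.integers(0, 2, weeks),   # extreme-temperature flag
})
log_mu = 8.0 + 0.004 * df.flu_activity + 0.01 * df.extreme_cold
df["deaths"] = rng.poisson(np.exp(log_mu))

fit = smf.poisson("deaths ~ flu_activity + extreme_cold",
                  data=df).fit(disp=False)

# Counterfactual: the same weeks with influenza activity removed.
baseline = fit.predict(df.assign(flu_activity=0.0))
attributable = (fit.predict(df) - baseline).sum()
print(f"Estimated influenza-attributable deaths: {attributable:,.0f}")
```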
The resilience of seed quality in rice (Oryza sativa L.) to flooding was investigated. Pot-grown plants of the japonica cv. Gleva, the indica cv. IR64, and the introgressed line IR64-Sub1 were submerged in water, to simulate flooding, for 3‒5 days at different stages of seed development and maturation. Mean seed weight, pre-harvest sprouting, ability to germinate, and subsequent longevity in air-dry storage were assessed. Whereas seed quality in both IR64 and IR64-Sub1 was resilient to submergence, in Gleva the longer the duration of submergence and the later in development when plants were submerged the greater the pre-harvest sprouting. Thousand seed dry weight was reduced more by submergence in Gleva than IR64 or IR64-Sub1. At harvest maturity, few pre-harvest sprouted seeds were able to germinate upon rehydration after desiccation to 11‒12% moisture content. Seed longevity of the non-sprouted seed fraction in air-dry hermetic storage (40°C, 15% moisture content) was not affected greatly by submergence, but longevity of the japonica rice was less than that of the indica rices due to the former's steeper seed survival curves. Longevity of the two indica rices was predicted well by the seed viability equation and previously published estimates of viability constants for rice. The greater dormancy of IR64 and IR64-Sub1, compared with Gleva, enhanced resilience to pre-harvest sprouting and reduced thousand seed dry weight from plant submergence. There was little or no effect of plant submergence on subsequent air-dry storage longevity of non-sprouted seeds in any genotype.
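For reference, the "seed viability equation" cited above is, in its standard Ellis–Roberts form, the pair of relations below, where v is the probit percentage viability after p days in storage, K_i the initial (probit) viability of the seed lot, σ the time for viability to fall by one probit, m the moisture content (% fresh weight), t the storage temperature (°C), and K_E, C_W, C_H and C_Q species-specific constants (published estimates exist for rice):

```latex
% Standard (Ellis & Roberts) improved seed viability equation
v = K_i - \frac{p}{\sigma}, \qquad
\log_{10}\sigma = K_E - C_W \log_{10} m - C_H t - C_Q t^{2}
```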
Post-harvest drying prolongs seed survival in air-dry storage; previous research has shown a benefit of drying moist rice seeds at temperatures greater than those recommended for genebanks (5–20°C). The aim of this study was to determine whether there is a temperature limit for safely drying rice seeds, and to explore whether the benefit to longevity is caused by high-temperature stress or by continued seed development. Seeds of two rice varieties were harvested at different stages of development and dried initially either over silica gel, or intermittently (8 h day⁻¹) or continuously (24 h day⁻¹) over MgCl₂, at temperatures between 15 and 60°C for up to 3 days. Seeds dried more rapidly the warmer the temperature. Subsequent seed longevity in hermetic storage (45°C and 10.9% moisture content) was substantially improved by an increase in drying temperature up to 45°C in both cultivars, and also with a further increase from 45 to 60°C in cv. ‘Macassane’. The benefit of high-temperature drying to subsequent longevity tended to diminish the later the stage of development at seed harvest. Intermittent and continuous drying at high temperatures provided broadly similar improvements to longevity, although the greatest improvements were detected in a few treatment combinations with continuous drying. Heated-air drying of rice seeds harvested before maturity improved their subsequent storage longevity by more than the improvement that occurred during continued development in planta, which may have resulted from the triggering of protection mechanisms in response to high-temperature stress.
Parasites of the genera Plasmodium and Haemoproteus (Apicomplexa: Haemosporida) are a diverse group of pathogens that infect birds nearly worldwide. Despite their ubiquity, the ecological and evolutionary factors that shape the diversity and distribution of these protozoan parasites among avian communities and geographic regions are poorly understood. Based on a survey throughout the Neotropics of the haemosporidian parasites infecting manakins (Pipridae), a family of passerine birds endemic to this region, we asked whether host relatedness, ecological similarity and geographic proximity structure parasite turnover between manakin species and local manakin assemblages. We used molecular methods to screen 1343 individuals of 30 manakin species for the presence of parasites. We found no significant correlation between manakin parasite lineage turnover and either manakin species turnover or geographic distance. Climate differences, species turnover in the larger bird community and parasite lineage turnover in non-manakin hosts likewise did not correlate with manakin parasite lineage turnover. We also found no evidence that manakin parasite lineage turnover among host species correlates with range overlap or genetic divergence among hosts. Our analyses indicate that host switching (turnover among host species) and dispersal (turnover among locations) of haemosporidian parasites in manakins are not constrained at this scale.
Human parainfluenza virus (HPIV) infections are among the commonest causes of upper and lower respiratory tract infections. In order to determine whether there have been any recent changes in HPIV epidemiology in England and Wales, laboratory surveillance data from 1998 to 2013 were analysed. The UK national laboratory surveillance database, LabBase, and the newly established laboratory-based virological surveillance system, the Respiratory DataMart System (RDMS), were used. Descriptive analysis was performed to examine the distribution of cases by year, age, sex and serotype, and the overall temporal trend was examined using the χ2 test. A random-effects model was also employed to model the number of cases. Sixty-eight per cent of all HPIV detections were of HPIV type 3 (HPIV-3). HPIV-3 infections were detected all year round but peaked annually between March and June. HPIV-1 and HPIV-2 circulated at lower levels, accounting for 20% and 8% of detections, respectively, and peaked during the last quarter of the year with a biennial cycle. HPIV-4 was detected in smaller numbers, accounting for only 4%, and was also observed mainly in the last quarter of the year. In recent years, however, HPIV-4 detection has been reported much more commonly, increasing from 0% of detections in 1998 to 3·7% in 2013. Although an overall higher proportion of HPIV infections was reported in infants (43·0%), a long-term decreasing trend in this proportion was observed, with an increase in older age groups. Continuous surveillance will be important in tracking any future changes.
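The abstract does not give the random-effects specification, so the sketch below fits a deliberately simplified stand-in: a linear mixed model of log annual counts with a random intercept per serotype, on synthetic data shaped like the trends described (rising HPIV-4 reports). It is an illustration of the modelling style, not a reconstruction of the authors' analysis.

```python
# Hedged stand-in for a random-effects analysis of annual case counts:
# linear mixed model on log counts, random intercept per serotype.
# All data below are synthetic and hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
years = np.arange(1998, 2014)
rows = []
for serotype, level in [("HPIV-1", 5.0), ("HPIV-2", 4.0),
                        ("HPIV-3", 6.5), ("HPIV-4", 2.0)]:
    trend = 0.15 if serotype == "HPIV-4" else 0.0  # rising HPIV-4 reports
    counts = np.exp(level + trend * (years - 1998)
                    + rng.normal(0, 0.2, years.size))
    rows += [{"serotype": s, "years_since_1998": y - 1998,
              "log_cases": np.log(c)}
             for s, y, c in zip([serotype] * years.size, years, counts)]
df = pd.DataFrame(rows)

fit = smf.mixedlm("log_cases ~ years_since_1998", df,
                  groups=df["serotype"]).fit()
print(fit.summary())  # fixed-effect slope = overall temporal trend
```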
In this article, we review introduced and enacted youth concussion legislation in Canada and present a conceptual framework and recommendations for future youth sport concussion laws. We conducted online searches of federal, provincial, and territorial legislatures to identify youth concussion bills that were introduced or successfully enacted into law. Internet searches were carried out on July 26 and 27, 2016. The searches identified six youth concussion bills that had been introduced in provincial legislatures: two each in Ontario and Nova Scotia and one each in British Columbia and Quebec. One of these bills (Ontario Bill 149, Rowan’s Law Advisory Committee Act, 2016) was enacted into provincial law; it is not concussion legislation as such, but rather a framework for the possible enactment of legislation. Two bills have been introduced in the federal parliament, but neither has been enacted into law. At present, no provincial or federal legislation directly legislates concussion education, prevention, management, or policy in youth sports in Canada. The conceptual framework and recommendations presented here should be used to guide the design and implementation of future youth sport concussion laws in Canada.
Climate change will alter rainfall patterns. The effect of rainfall during seed development and maturation on wheat (Triticum aestivum L.) seed quality (ability to germinate normally; air-dry longevity in hermetic storage at 40°C with c. 15% moisture content) was investigated in field experiments (2011, 2012) by providing rain shelter or simulating additional rainfall. High ability to germinate was detected from mid-seed filling until after harvest maturity. Subsequent longevity was more sensitive to stage of development. It increased progressively, reaching maximum values during maturation drying at 53–56 days after anthesis (DAA), 5–11 (2011) or 8–14 (2012) days beyond mass maturity; maximal values were maintained thereafter in 2011, whereas longevity declined with further delay to harvest in 2012. Post-anthesis protection from rain had no major effect: in later harvests longevity was slightly greater than in the control in each year, but in 2011 wetting treatments were also superior to the control. Wetting ears at any stage of development reduced longevity immediately, but considerable recovery in subsequent longevity occurred when seeds re-dried in planta for several days. The greatest damage to longevity from ear wetting occurred with treatments at about 56 DAA; recovery was poorest in absolute terms at 70 DAA (i.e. around harvest maturity), and poorest relative to gross damage at 56–70 DAA. Hence, seed quality in a strongly dormant wheat variety was resilient to rain. Net damage was greatest from rain late in maturation. The phase of seed quality improvement in planta was dynamic, with deterioration also occurring then, but with net improvement overall.
The effects of simulated additional rain (ear wetting, 25 mm) or of rain shelter imposed at different periods after anthesis on grain quality at maturity and on the dynamics of grain filling and desiccation were investigated in UK field-grown crops of wheat (Triticum aestivum L., cvar Tybalt) in 2011 and 2012, when June–August rainfall was 255·0 and 214·6 mm, respectively, and above the decadal mean (157·4 mm). Grain filling and desiccation were quantified well by broken-stick regressions and Gompertz curves, respectively. Rain shelter for 56 (2011) or 70 days (2012) after anthesis, and to a lesser extent during late maturation only, resulted in more rapid desiccation and hence more rapid progress to harvest maturity, whereas ear wetting had negligible effects, even when applied four times. Grain-filling duration was affected similarly in 2011, but with no significant effect in 2012. In both years, there were strong positive associations between final grain dry weight and duration of filling. The treatments affected all grain quality traits in 2011: nitrogen (N) and sulphur (S) concentrations, N : S ratio, sodium dodecyl sulphate (SDS) sedimentation volume, Hagberg Falling Number (HFN), and the incidence of blackpoint. Only N concentration and blackpoint were affected significantly by treatments in 2012. Rain shelter throughout grain filling reduced N concentration, whereas rain shelter reduced the incidence of blackpoint and ear wetting increased it. In 2011, rain shelter throughout reduced S concentration, increased the N : S ratio and reduced SDS sedimentation volume. Treatment effects on HFN were not consistent within or between years. Nevertheless, a comparison between the extreme treatment means in 2012 indicated that damage from late rain combined with ear wetting resulted in a reduction of c. 0·7 s in HFN per mm of August rainfall, while a comparison between samples taken immediately after ear wetting at harvest maturity and 7 days later suggested recovery from damage to HFN upon re-drying in planta. Hence, the incidence of blackpoint was the only grain quality trait affected consistently by the diverse treatments. The remaining aspects of grain quality were comparatively resilient to rain incident upon developing and maturing ears of cvar Tybalt. No consistent temporal patterns of sensitivity to shelter or ear wetting were detected for any aspect of grain quality.
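As an illustration of the curve-fitting mentioned above, the following Python sketch fits a Gompertz curve to synthetic desiccation-style data with scipy; the parameterization and data are assumptions, not the study's. A broken-stick (segmented linear) regression for grain filling could be fitted analogously with piecewise linear least squares.

```python
# Hedged sketch: fitting a Gompertz curve of the form
# y = a * exp(-b * exp(-c * t)) to synthetic dry-down data.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """Gompertz curve: asymptote a, displacement b, rate c."""
    return a * np.exp(-b * np.exp(-c * t))

# Synthetic grain dry-down data (days after anthesis vs. % progress).
rng = np.random.default_rng(4)
t = np.arange(0, 70, 7, dtype=float)
y = gompertz(t, 100.0, 8.0, 0.1) + rng.normal(0, 2.0, t.size)

params, _ = curve_fit(gompertz, t, y, p0=[100.0, 5.0, 0.05])
a, b, c = params
print(f"asymptote={a:.1f}, displacement={b:.2f}, rate={c:.3f}")
```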