This study integrated an experimental medicine approach and a randomized cross-over clinical trial design following CONSORT recommendations to evaluate a cognitive training (CT) intervention for attention deficit hyperactivity disorder (ADHD). The experimental medicine approach was adopted because of documented pathophysiological heterogeneity within the diagnosis of ADHD. The cross-over design was adopted to provide the intervention for all participants and make maximum use of data.
Children (n = 93, mean age 7.3 ± 1.1 years) with ADHD or sub-threshold ADHD symptoms were randomly assigned to CT exercises over 15 weeks, before or after 15 weeks of treatment-as-usual (TAU). Fifteen dropped out of the CT/TAU group and 12 out of the TAU/CT group, leaving 66 for cross-over analysis. Seven in the CT/TAU group completed CT before dropping out, making 73 available for experimental medicine analyses. Attention, response inhibition, and working memory were assessed before and after CT and TAU.
Children were more likely to improve with CT than TAU (27/66 v. 13/66, McNemar p = 0.02). Consistent with the experimental medicine hypotheses, responders improved on all tests of executive function (p = 0.009–0.01) while non-responders improved on none (p = 0.27–0.81). The degree of clinical improvement was predicted by baseline and change scores in focused attention and working memory (p = 0.008). The response rate was higher in inattentive and combined subtypes than hyperactive-impulsive subtype (p = 0.003).
Targeting cognitive dysfunction decreases clinical symptoms in proportion to improvement in cognition. Inattentive and combined subtypes were more likely to respond, consistent with targeted pathology and clinically relevant heterogeneity within ADHD.
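The paired responder comparison above (27/66 with CT v. 13/66 with TAU) uses McNemar's test, which depends only on the discordant pairs. A minimal sketch of the exact (binomial) version, with hypothetical discordant counts, since the abstract reports only the marginals:

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact (binomial) McNemar test on the discordant pairs.
    b = pairs improving under CT only, c = pairs improving under TAU only;
    concordant pairs do not enter the statistic."""
    n = b + c
    k = min(b, c)
    # Two-sided exact p-value: double the smaller binomial(n, 0.5) tail.
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Illustrative discordant counts (hypothetical -- the abstract reports
# the 27/66 and 13/66 marginals, not the full 2x2 table):
p = mcnemar_exact(18, 6)  # roughly 0.023 for these counts
```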
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of PHQ-8 and PHQ-9 total scores and their diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
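The cutoff comparisons above amount to computing sensitivity and specificity at each threshold. A minimal sketch of that calculation on toy data (not the study's bivariate random-effects meta-analytic model):

```python
def sens_spec(scores, has_depression, cutoff=10):
    """Sensitivity and specificity of a 'score >= cutoff' screen
    against a reference-standard diagnosis."""
    pairs = list(zip(scores, has_depression))
    tp = sum(1 for s, d in pairs if d and s >= cutoff)
    fn = sum(1 for s, d in pairs if d and s < cutoff)
    tn = sum(1 for s, d in pairs if not d and s < cutoff)
    fp = sum(1 for s, d in pairs if not d and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 4 reference-standard cases, 6 non-cases
scores = [14, 11, 9, 17, 3, 5, 12, 8, 2, 6]
cases  = [True, True, True, True, False, False, False, False, False, False]
sens, spec = sens_spec(scores, cases)  # -> (0.75, 0.8333...)
```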
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical and environmental isolates suspected in those clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance for both the identification of cross-transmission events and the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results can hold promise to reduce HAIs. A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
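Cluster identification from WGS data of this kind is commonly done by thresholding pairwise single-nucleotide variant (SNV) distances and taking connected components (single linkage). A minimal sketch of that generic approach, not the specific cloud platform used in the study; isolate names and the threshold are illustrative:

```python
def snv_clusters(distances, threshold):
    """Group isolates whose pairwise SNV distance is <= threshold,
    using union-find to collect connected components.
    distances: dict mapping (isolate_a, isolate_b) -> SNV count."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for (a, b), d in distances.items():
        if d <= threshold:
            parent[find(a)] = find(b)

    clusters = {}
    for isolate in parent:
        clusters.setdefault(find(isolate), set()).add(isolate)
    return [c for c in clusters.values() if len(c) > 1]

# Hypothetical isolates and SNV distances (threshold of 15 is illustrative):
d = {("A", "B"): 4, ("B", "C"): 9, ("A", "D"): 480, ("D", "E"): 2}
clusters = snv_clusters(d, 15)  # two clusters: {A, B, C} and {D, E}
```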
Our ALMA observations of HCO+ and HCN show redshifted absorption toward an isolated core, BHR 71. Both lines show a similar redshifted absorption profile. We also detected emission from complex organic molecules (COMs) around 345 GHz from a compact region centered on the continuum source, which is barely resolved with a beam of 0.27″, corresponding to ∼50 AU.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
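The odds ratios above come from binomial generalised linear mixed models, which require a dedicated statistics library. The headline quantity itself is simple, though; a minimal sketch of an unadjusted OR with a Wald 95% CI from a 2×2 table (counts are illustrative, not the study's data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI for the 2x2 table
    [[a, b], [c, d]] = [[exposed cases, exposed non-cases],
                        [unexposed cases, unexposed non-cases]]."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se_log), exp(log(or_) + z * se_log)

# Hypothetical counts: major depression classifications by interview type
or_, lo, hi = odds_ratio_ci(40, 160, 25, 200)  # OR = 2.0
```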
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of Astra Zeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honorarium for giving lectures and providing consultancy and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; and decision to submit the manuscript for publication.
Domesticated from a wild teosinte grass in southern Mexico more than 6,000 years ago, maize (Zea mays ssp. mays L.) is today the world's single most important food crop. In this chapter I follow the initial diffusion of this major domesticate from its origin north through Mexico into the southwestern United States and then across the southern Plains into eastern North America, and consider its evolution under human selection during its long journey. Maize provides a well-documented example of a domesticate that was not part of a crop complex carried into new regions by expanding agricultural societies, but rather was exchanged, unaccompanied by other domesticates, from group to group.
Keywords: Maize, domestication, North America, crop diffusion
Domesticated from a wild teosinte grass in southern Mexico more than 6,000 years ago, maize (Zea mays ssp. mays L.) is today the world's single most important food crop, with a recent annual harvest of more than 818 million metric tons (Varshney et al. 2012: Table 1). In this chapter I look at the early history of this major crop and follow its initial diffusion northward through Mexico into the southwestern United States and then across the Great Plains into the eastern woodlands of North America. A general temporal framework for the spread of maize throughout the Americas now exists, based on direct small-sample accelerator mass spectrometry (AMS) radiocarbon dating of cob and kernel remains recovered from sites scattered across this vast geographical area (Blake 2006). While this general spatial-temporal map of maize diffusion is still very much a work in progress (see ancientmaize.com), it does bring into clearer focus a wide range of questions, many of which are still unanswered, regarding the rates, routes, and mechanisms of initial human diffusion of the domesticate, its evolution under human selection, its adaptation to new environments, and the wide range of different ways in which it was added into and eventually came to dominate the subsistence economies of small-scale societies throughout North America.
SOUTHERN MEXICO: THE START OF THE JOURNEY
Populations of a wild teosinte grass that grow today along the central Balsas River valley in southern Mexico (Zea mays ssp. parviglumis Iltis and Doebley) have been identified as being phylogenetically most closely allied with maize, and are considered to be the present-day descendants of its probable wild progenitor (Matsuoka et al. 2002).
Weed resistance to herbicides occurs when herbicides are overused and can be mitigated by reducing their use. Reaching consensus on herbicide resistance management strategies is difficult given the weed science discipline's strong links to industry profit motives.
The dynamic model Nitrogen Dynamics in Crop rotations in Ecological Agriculture (NDICEA) was used to assess the nitrogen (N), phosphorus (P) and potassium (K) balance of long-term organic cropping trials and typical organic crop rotations on a range of soil types and rainfall zones in the UK. The measurements of soil N taken at each of the organic trial sites were also used to assess the performance of NDICEA. The modeled outputs compared well to recorded soil N levels, with relatively small error margins. NDICEA therefore seems to be a useful tool for UK organic farmers. The modeling of typical organic rotations has shown that positive N balances can be achieved, although negative N balances can occur under high rainfall conditions and on lighter soil types as a result of leaching. The analysis and modeling also showed that some organic cropping systems rely on imported sources of P and K to maintain an adequate balance, and large deficits of both nutrients are apparent in stockless systems. Although the K deficits could be addressed through the buffering capacity of minerals, the amount available for crop uptake will depend on the type and amount of minerals present, current cropping and fertilization practices and the climatic environment. A P deficit represents a more fundamental problem for the maintenance of crop yields, and the organic sector currently relies on mined sources of P, which conflicts with the International Federation of Organic Agriculture Movements' organic principles.
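A field nutrient budget of the kind NDICEA formalises is, at its simplest, inputs minus outputs per nutrient. A minimal sketch of that bookkeeping with illustrative figures (not NDICEA itself, which dynamically simulates mineralisation and leaching over the rotation):

```python
def nutrient_balance(inputs, outputs):
    """Net balance per nutrient (kg/ha/yr): positive = surplus, negative = deficit.
    inputs/outputs: dicts of {nutrient: {source: kg/ha/yr}}."""
    nutrients = set(inputs) | set(outputs)
    return {n: sum(inputs.get(n, {}).values()) - sum(outputs.get(n, {}).values())
            for n in nutrients}

# Illustrative stockless organic rotation, annualised (hypothetical values):
inputs = {
    "N": {"clover_fixation": 120, "deposition": 15},
    "P": {"deposition": 1},
    "K": {"deposition": 3},
}
outputs = {
    "N": {"crop_offtake": 95, "leaching": 25},
    "P": {"crop_offtake": 12},
    "K": {"crop_offtake": 60},
}
balance = nutrient_balance(inputs, outputs)
# N: +15 (small surplus); P: -11 and K: -57 (the stockless-system deficits)
```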
Communication between emergency department (ED) staff and parents of children with asthma may play a role in asthma exacerbation management. We investigated the extent to which parents of children with asthma implement recommendations provided by the ED staff. We asked questions on asthma triggers, ED care (including education and discharge recommendations), and asthma management strategies used at home shortly after the ED visit and again at 6 months.
A total of 148 children with asthma were recruited. Thirty-two percent of children were not on inhaled corticosteroids prior to their ED visit. Eighty percent of parents identified upper respiratory tract infections (URTIs) as the primary trigger for their child’s asthma. No parent received or implemented any specific asthma strategies to reduce the impact of URTIs; 82% of parents did not receive any printed asthma education materials. Most (66%) parents received verbal instructions on how to manage their child’s future asthma exacerbations. Of those, one-third of families were told to return to the ED. Parents were rarely advised to bring their child to their family doctor in the event of a future exacerbation. At 6 months, parents continued to use the ED services for asthma exacerbations in their children, despite reporting feeling confident in managing their child’s asthma.
Improvements are urgently needed in developing strategies to manage pediatric asthma exacerbations related to URTIs, communication with parents at discharge in acute care, and using alternate acute care services for parents who continue to rely on EDs for the initial care of mild asthma exacerbations.
Growing populations and a constrained fossil-manufactured energy supply present a major challenge for society and there is a real need to develop forms of agriculture that are less dependent on finite energy sources. It has been suggested that organic agriculture can provide a more energy efficient approach due to its focus on sustainable production methods. This review has investigated the extent to which this is true for a range of farming systems. Data from about 50 studies were reviewed with results suggesting that organic farming performs better than conventional for nearly all crop types when energy use is expressed on a unit of area basis. Results are more variable per unit of product due to the lower yield for most organic crops. For livestock, ruminant production systems tend to be more energy efficient under organic management due to the production of forage in grass–clover leys. Conversely, organic poultry tend to perform worse in terms of energy use as a result of higher feed conversion ratios and mortality rates compared to conventional fully housed or free-range systems. With regard to energy sources, there is some evidence that organic farms use more renewable energy and have less of an impact on natural ecosystems. Human energy requirements on organic farms are also higher as a result of greater system diversity and manual weed control. Overall this review has found that most organic farming systems are more energy efficient than their conventional counterparts, although there are some notable exceptions.
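The distinction the review draws between per-area and per-product energy use can be made concrete with a simple ratio. A sketch with illustrative figures (not data from the reviewed studies):

```python
def energy_per_tonne(energy_mj_per_ha, yield_t_per_ha):
    """Convert energy use per hectare into energy use per tonne of product."""
    return energy_mj_per_ha / yield_t_per_ha

# Illustrative: organic uses less energy per hectare, but a lower yield
# can narrow or reverse the advantage per unit of product.
organic = energy_per_tonne(8000, 4.0)        # 2000.0 MJ/t
conventional = energy_per_tonne(12000, 7.5)  # 1600.0 MJ/t
```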
This landmark volume eloquently underscores the enduring legacy of Jack Harlan's broad-ranging and multiple-perspective approach to considering the past development and future challenges of agricultural economies, world-wide. It also highlights the remarkable degree to which plant and animal domestication and agricultural origins continue to expand as a general research question across a wide spectrum of different disciplines in the biological and social sciences.
General areas of inquiry are continually emerging in science, and for widely varying periods of time, they attract and reward researchers, providing interesting and unfolding sequences of questions before eventually closing down as their research potential is exhausted. The evolution of agricultural economies, from first origins to future developments, is an excellent example of an extremely long-lived problem area which not only has witnessed substantial growth since the pioneering efforts of Vavilov, Braidwood, Harlan, Heiser, MacNeish, and others, but also holds the very real promise of continuing to expand and provide new research questions for generations to come.
One of the key questions of primary importance to global agriculture and food security is how to optimize sustainable intensification to balance competing demands on land for food and energy production, while ensuring the provision of ecosystem services and maintaining or increasing yields. Integrating trees and agriculture through agroforestry has been attracting increasing interest as an agroecological approach to sustainable intensification. Trees have traditionally been important elements of temperate agricultural systems around the world, but there has been increasing separation of agriculture, forestry and nature over the past few decades. This paper discusses what we can learn from traditional agroforestry systems to help develop modern systems that integrate ecological farming and agroecological advances to achieve sustainable intensification. We also discuss the existing barriers to wider adoption of agroforestry, and identify how these barriers can be overcome to promote agroforestry as a mainstream land-use system.
Meeting the needs for a growing world population calls for multifunctional land use, which can meet the multiple demands of food and fuel production, environmental and biodiversity protection, and has the capacity for adaptation or resilience to climate change. Agroforestry, a land-use system that integrates trees and shrubs with crops and/or livestock production, has been identified by the International Assessment of Agricultural Knowledge, Science and Technology for Development (IAASTD) as a ‘win–win’ approach that balances the production of commodities (food, feed, fuel, fiber, etc.) with non-commodity outputs such as environmental protection and cultural and landscape amenities. Evidence is now coming to light that supports the promotion of agroforestry in temperate developed countries as a sustainable alternative to the highly industrialized agricultural model with its associated negative environmental externalities. This paper reviews this evidence within the ‘ecosystem services’ framework to evaluate agroforestry as part of a multifunctional working landscape in temperate regions. Establishing trees on agricultural land can help to mitigate many of the negative impacts of agriculture, for example by regulating soil, water and air quality, supporting biodiversity, reducing inputs by natural regulation of pests and more efficient nutrient cycling, and by modifying local and global climates. The challenge now lies in promoting the adoption of agroforestry as a mainstream land use through research, dissemination of information and policy changes.