There is a wealth of literature on the observed association between childhood trauma and psychotic illness. However, the relationship between childhood trauma and psychosis is complex and could be explained, in part, by gene–environment correlation.
The association between schizophrenia polygenic scores (PGS) and experiencing childhood trauma was investigated using data from the Avon Longitudinal Study of Parents and Children (ALSPAC) and the Norwegian Mother, Father and Child Cohort Study (MoBa). Schizophrenia PGS were derived in each cohort for children, mothers, and fathers where genetic data were available. Measures of trauma exposure were derived based on data collected throughout childhood and adolescence (0–17 years; ALSPAC) and at age 8 years (MoBa).
Within ALSPAC, we found a positive association between schizophrenia PGS and exposure to trauma across childhood and adolescence; effect sizes were consistent for both child and maternal PGS. We found evidence of an association between the schizophrenia PGS and the majority of trauma subtypes investigated, with the exception of bullying. These results were comparable with those of MoBa. Within ALSPAC, genetic liability to a range of additional psychiatric traits was also associated with greater trauma exposure.
Results from two international birth cohorts indicate that genetic liability for a range of psychiatric traits is associated with experiencing childhood trauma. Genome-wide association studies of psychiatric phenotypes may therefore partly reflect risk factors for these phenotypes. Our findings also suggest that youth at higher genetic risk might require greater resources and support to ensure they grow up in a healthy environment.
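The polygenic scoring used in both cohorts boils down to a weighted allele count: each individual's score is the sum of their effect-allele dosages multiplied by the corresponding GWAS effect sizes. A minimal sketch with made-up dosages and weights (illustrative values, not ALSPAC/MoBa data):

```python
import numpy as np

def polygenic_score(dosages, betas):
    """Polygenic score as a weighted allele count: `dosages` is an
    (individuals x SNPs) matrix of effect-allele counts (0/1/2) and
    `betas` holds the per-SNP GWAS effect sizes for those alleles."""
    return dosages @ betas

# Toy data for three individuals and four SNPs (illustrative values only)
dosages = np.array([[0, 1, 2, 1],
                    [2, 0, 1, 0],
                    [1, 1, 1, 1]])
betas = np.array([0.10, -0.05, 0.20, 0.02])
scores = polygenic_score(dosages, betas)
# Scores are usually standardised before being regressed on the outcome
z = (scores - scores.mean()) / scores.std()
```

In practice the weights come from an external discovery GWAS, with variants clumped or thresholded beforehand; that pre-processing is omitted here.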
Observational studies have linked elevated homocysteine to vascular conditions. Folate intake has been associated with lower homocysteine concentration, although randomised controlled trials of folic acid supplementation to decrease the incidence of vascular conditions have been inconclusive. We investigated determinants of maternal homocysteine during pregnancy, particularly in a folic acid-fortified population.
Data were from the Ottawa and Kingston Birth Cohort of 8085 participants. We used multivariable regression analyses to identify factors associated with maternal homocysteine, adjusted for gestational age at bloodwork. Continuous factors were modelled using restricted cubic splines. A subgroup analysis examined the modifying effect of MTHFR 677C>T genotype on folate, in determining homocysteine concentration.
Participants were recruited in Ottawa and Kingston, Canada, from 2002 to 2009.
Women were recruited when presenting for prenatal care in the early second trimester.
In 7587 participants, factors significantly associated with higher homocysteine concentration were nulliparity, smoking and chronic hypertension, while factors significantly associated with lower homocysteine concentration were non-Caucasian race, history of a placenta-mediated complication and folic acid supplementation. Maternal age and BMI demonstrated U-shaped associations. Folic acid supplementation of >1 mg/d during pregnancy did not substantially increase folate concentration. In the subgroup analysis, MTHFR 677C>T genotype modified the effect of folate status on homocysteine concentration.
We identified determinants of maternal homocysteine relevant to the lowering of homocysteine in the post-folic acid fortification era, characterised by folate-replete populations. A focus on periconceptional folic acid supplementation and improving health status may form an effective approach to lower homocysteine.
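The restricted cubic splines mentioned in the methods constrain the fitted curve to be linear beyond the boundary knots, which stabilises the tails of associations such as the U-shaped ones reported for age and BMI. A sketch of the standard truncated-power basis (Harrell's parameterisation; the knot values here are arbitrary):

```python
import numpy as np

def rcs_basis(x, knots):
    """Restricted cubic spline (natural spline) basis in Harrell's
    truncated-power form: cubic between knots, linear beyond the
    boundary knots. Returns x itself plus k-2 nonlinear columns
    for k knots."""
    x = np.asarray(x, dtype=float)
    t = np.asarray(knots, dtype=float)
    k = len(t)
    pos = lambda u: np.clip(u, 0.0, None) ** 3  # (u)_+^3
    cols = [x]
    for j in range(k - 2):
        c = (pos(x - t[j])
             - pos(x - t[k - 2]) * (t[k - 1] - t[j]) / (t[k - 1] - t[k - 2])
             + pos(x - t[k - 1]) * (t[k - 2] - t[j]) / (t[k - 1] - t[k - 2]))
        cols.append(c)
    return np.column_stack(cols)

# Example: a 4-knot basis over an arbitrary covariate range
B = rcs_basis(np.linspace(0.0, 10.0, 101), [1.0, 3.0, 6.0, 9.0])
```

Each column of the returned matrix enters the regression as an ordinary covariate; with k knots the curve spends k − 1 degrees of freedom.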
Populations of native North American parasitoids attacking Agrilus Curtis (Coleoptera: Buprestidae) species have recently been considered as part of an augmentative biological control programme in an attempt to manage emerald ash borer, Agrilus planipennis Fairmaire, a destructive wood-boring beetle discovered in North America in 2002. We evaluate trapping methods to detect and monitor populations of two important native larval parasitoids, Phasgonophora sulcata Westwood (Hymenoptera: Chalcididae) and Atanycolus Förster (Hymenoptera: Braconidae) species, attacking emerald ash borer in its introduced range. We found that purple prism traps captured more P. sulcata than green prism traps, yellow pan traps, and log samples and thus were considered better for detecting and monitoring P. sulcata populations. Trap type did not affect the number of captures of Atanycolus species. Surprisingly, baiting prism traps with a green leaf volatile or manuka oil did not significantly increase captures of P. sulcata or Atanycolus species. Based on these results, unbaited purple prism traps would be optimal for sampling these native emerald ash borer parasitoids in long-term management programmes.
The beginning of laminar–turbulent transition is usually associated with a wave-like disturbance, but its evolution and role in precipitating the development of other flow structures are not well understood from a structure-based view. Nonlinear parabolized stability equations (NPSE) were solved numerically to simulate K-regime, N-regime and O-regime transitions; however, only the K-regime transition was examined experimentally, using both hydrogen bubble visualization and time-resolved tomographic particle image velocimetry (tomo-PIV). Based on the ‘NPSE visualization’ and ‘tomographic visualization’, at least four common characteristics of the generic transition process were identified: (i) inflectional regions representing high-shear layers (HSL) that develop in vertical velocity profiles, accompanied by ejection–sweep behaviours; (ii) low-speed streak (LSS) patterns, manifested in horizontal timelines, that seem to consist of several three-dimensional (3-D) waves; (iii) a warped wave front (WWF) pattern, displaying multiple folding processes, which develops adjacent to the LSS in the near-wall region prior to the appearance of 𝛬-vortices; (iv) a coherent 3-D wave front, similar to a soliton, in the upper boundary layer, accompanied by regions of depression along the flanks of the wave. The amplification and lift-up of a 3-D wave were determined to cause the development of the HSL, the WWF and the multiple folding of material surfaces, all of which contribute to the development of a 𝛬-vortex. The amplified 3-D wave is hypothesized to be a soliton-like coherent structure. Based on our results, a path to transition is proposed, which hypothesizes the function of the WWF in boundary-layer transition.
The prevalence of many diseases in pigs displays seasonal distributions. Despite growing concerns about the impacts of climate change, we do not yet have a good understanding of the role that weather factors play in explaining such seasonal patterns. In this study, national and county-level aggregated abattoir inspection data were assessed for England and Wales during 2010–2015. Seasonally adjusted relationships were characterised between weekly ambient maximum temperature and the prevalence of both respiratory conditions and tail biting detected at slaughter. The prevalence of respiratory conditions showed cyclical annual patterns, with peaks in the summer months and troughs in the winter months each year; however, there were no obvious associations with either high or low temperatures. The prevalence of tail biting generally increased as temperatures decreased, but associations were not supported by statistical evidence: across all counties there was a relative risk of 1.028 (95% CI 0.776–1.363) for every 1 °C fall in temperature. Whilst the seasonal patterns observed in this study are similar to those reported in previous studies, the lack of statistical evidence for an explicit association with ambient temperature may be explained by the lack of information on date of disease onset. It is also possible that other time-varying factors not investigated here are driving some of the seasonal patterns.
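The relative risk of 1.028 (95% CI 0.776–1.363) per 1 °C fall is a ratio estimated on the log scale, so its confidence interval is symmetric in log units rather than in RR units. A sketch of the usual back-transformation, using the reported figures purely as worked numbers:

```python
import math

def rr_ci(log_rr, se, z=1.96):
    """Point estimate and Wald CI for a relative risk fitted on the
    log scale (as in Poisson or log-binomial regression)."""
    return (math.exp(log_rr),
            math.exp(log_rr - z * se),
            math.exp(log_rr + z * se))

def se_from_ci(lower, upper, z=1.96):
    """Recover the approximate standard error from a reported 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

# Worked numbers from the reported estimate: RR 1.028 (0.776-1.363)
se = se_from_ci(0.776, 1.363)
rr, lo, hi = rr_ci(math.log(1.028), se)
```

An interval spanning 1 (as here) is what "not supported by statistical evidence" refers to: the data are compatible with no temperature effect.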
Palmer amaranth is the most common and troublesome weed in North Carolina sweetpotato. Field studies were conducted in Clinton, NC, in 2016 and 2017 to determine the critical timing of Palmer amaranth removal in ‘Covington’ sweetpotato. Palmer amaranth was grown with sweetpotato from transplanting to 2, 3, 4, 5, 6, 7, 8, and 9 wk after transplanting (WAP) and maintained weed-free for the remainder of the season. Palmer amaranth height and shoot dry biomass increased as removal was delayed. Season-long Palmer amaranth interference reduced marketable yields by 85% and 95% in 2016 and 2017, respectively. Sweetpotato yield loss displayed a strong inverse linear relationship with Palmer amaranth height: a 0.6% and 0.4% decrease in yield was observed for every centimeter of Palmer amaranth growth in 2016 and 2017, respectively. The critical timing for Palmer amaranth removal, based on 5% loss of marketable yield, was determined to be 2 WAP by fitting a log-logistic model to the relative yield data. These results show that Palmer amaranth is highly competitive with sweetpotato and should be managed as early as possible in the season. The requirement of an early critical timing of weed removal to prevent yield loss emphasizes the importance of early-season scouting and Palmer amaranth removal in sweetpotato fields. Any delay in removal can result in substantial yield reductions and fewer premium quality roots.
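The critical removal timing is obtained by inverting a fitted log-logistic curve of relative yield against removal time and solving for the time at which yield falls to 95% of the weed-free maximum. A sketch with hypothetical parameter values (b, d and e below are illustrative, not the fitted estimates):

```python
import math

def log_logistic(t, b, d, e):
    """Relative yield as a function of weed removal time t (weeks after
    transplanting): d is the upper limit, e the inflection point, b the slope."""
    return d / (1.0 + math.exp(b * (math.log(t) - math.log(e))))

def critical_time(y_target, b, d, e):
    """Invert the curve: removal time giving relative yield y_target
    (e.g. 0.95 for a 5% acceptable yield loss)."""
    return e * (d / y_target - 1.0) ** (1.0 / b)

# Hypothetical parameters for illustration only
b, d, e = 2.5, 1.0, 5.0
t_crit = critical_time(0.95, b, d, e)
```

With a 5% loss threshold the critical time always falls well before the inflection point e, which is why early-season scouting matters.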
Fetal growth restriction (FGR) can be defined as the failure of the fetus to meet its genetically predetermined growth potential and is associated with significant fetal and perinatal morbidity and mortality. In addition, there is evidence to suggest a longer-term impact of FGR on childhood neurodevelopmental outcomes and on cardiovascular and metabolic diseases that manifest in adulthood. However, predicting FGR is not straightforward and methods for screening and diagnosis are imprecise. In the UK and USA, ultrasound scans in the second half of pregnancy are not performed routinely but targeted at women considered to be at risk for FGR, where high risk is identified by maternal characteristics (including anthropometry and pre-existing disease), the development of complications, or clinical suspicion based on being ‘small for dates’ on physical examination. For practical purposes, FGR may be suspected if biometric measurements are below a given threshold of the distribution in the population, typically the <10th, 5th or 3rd centile for gestational age, or if there is a reduction in growth velocity (‘crossing centiles’) from previous scans. The difficulty with using biometry alone is that it does not differentiate between the growth-restricted fetus affected by placental insufficiency and the healthy, constitutionally small fetus. Therefore, additional measures may be employed to diagnose placental dysfunction, such as Doppler studies of the fetal and uteroplacental circulation, and analysis of maternal serum biomarkers. At present, the only treatment available for FGR is to expedite delivery, but at preterm gestations this can also cause harm. However, new genomics-based research could help us better understand the etiology of growth restriction and identify more accurate diagnostic biomarkers or potential therapeutic targets.
This chapter will focus on current practice in screening for and intervention in FGR and will also consider new developments and the future of the field.
The addition of Cr2O3 to modern UO2 fuel modifies the microstructure so that, through the generation of larger grains during fission, a higher proportion of fission gases can be accommodated. This reduces the pellet-cladding mechanical interaction of the fuel rods, allowing the fuels to be “burned” for longer than traditional UO2 fuel, thus maximising the energy obtained. We here describe the preparation of UO2 and Cr-doped UO2 using Hot Isostatic Pressing (HIP), as a potential method for fuel fabrication, and for development of analogue materials for spent nuclear fuel research. Characterization of the synthesised materials confirmed that high density UO2 was successfully formed, and that Cr was present as particles at grain boundaries and also within the UO2 matrix, possibly in a reduced form due to the processing conditions. In contrast to studies of Cr-doped UO2 synthesised by other methods, no significant changes to the grain size were observed in the presence of Cr.
Deep learning using convolutional neural networks represents a form of artificial intelligence where computers recognise patterns and make predictions based upon provided datasets. This study aimed to determine if a convolutional neural network could be trained to differentiate the location of the anterior ethmoidal artery as either adhered to the skull base or within a bone ‘mesentery’ on sinus computed tomography scans.
Coronal sinus computed tomography scans were reviewed by two otolaryngology residents for anterior ethmoidal artery location and used as training data for the Google Inception-V3 convolutional neural network base. The classification layer of Inception-V3 was retrained in Python using a transfer learning method to interpret the computed tomography images.
A total of 675 images from 388 patients were used to train the convolutional neural network. A further 197 unique images were used to test the algorithm; this yielded a total accuracy of 82.7 per cent (95 per cent confidence interval = 77.7–87.8), kappa statistic of 0.62 and area under the curve of 0.86.
Convolutional neural networks demonstrate promise in identifying clinically important structures in functional endoscopic sinus surgery, such as anterior ethmoidal artery location on pre-operative sinus computed tomography.
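The reported test metrics can be reproduced from a 2 × 2 confusion matrix: overall accuracy with a Wald-style confidence interval, and Cohen's kappa for chance-corrected agreement. A sketch with assumed counts — 163 correct of 197 is consistent with the quoted 82.7 per cent, but the study's actual confusion matrix is not given, so the kappa cell counts below are hypothetical:

```python
import math

def accuracy_ci(correct, n, z=1.96):
    """Proportion correct with a Wald-style 95% confidence interval."""
    p = correct / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

def cohens_kappa(tp, fp, fn, tn):
    """Chance-corrected agreement between classifier and reference labels."""
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# 163/197 is consistent with the quoted accuracy; kappa cells are hypothetical
p, lo, hi = accuracy_ci(163, 197)
kappa = cohens_kappa(40, 10, 10, 40)
```

The Wald interval is one common choice; the paper does not state which interval construction produced its 77.7–87.8 range.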
The Late Formative period immediately precedes the emergence of Tiwanaku, one of the earliest South American states, yet it is one of the most poorly understood periods in the southern Lake Titicaca Basin (Bolivia). In this article, we refine the ceramic chronology of this period with large sets of dates from eight sites, focusing on temporal inflection points in decorated ceramic styles. These points, estimated here by Bayesian models, index specific moments of change: (1) cal AD 120 (60–170, 95% probability): the first deposition of Kalasasaya red-rimmed and zonally incised styles; (2) cal AD 240 (190–340, 95% probability): a tentative estimate of the final deposition of Kalasasaya zonally incised vessels; (3) cal AD 420 (380–470, 95% probability): the final deposition of Kalasasaya red-rimmed vessels; and (4) cal AD 590 (500–660, 95% probability): the first deposition of Tiwanaku Redwares. These four modeled boundaries anchor an updated Late Formative chronology, which includes the Initial Late Formative phase, a newly identified decorative hiatus between the Middle and Late Formative periods. The models place Qeya and transitional vessels between inflection points 3 and 4 based on regionally consistent stratigraphic sequences. This more precise chronology will enable researchers to explore the trajectories of other contemporary shifts during this crucial period in Lake Titicaca Basin's prehistory.
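The modelled boundary dates above are quoted as 95% probability ranges from Bayesian models built with specialist calibration software. As a generic illustration of the final step only, the shortest interval containing 95% of posterior draws (the highest-posterior-density interval) can be read off sampled dates like this; the samples below are synthetic, not the authors' posteriors:

```python
import numpy as np

def hpd_interval(samples, mass=0.95):
    """Shortest interval containing `mass` of the posterior draws
    (the highest-posterior-density interval for a unimodal posterior)."""
    s = np.sort(np.asarray(samples, dtype=float))
    n = len(s)
    k = int(np.ceil(mass * n))           # draws the interval must contain
    widths = s[k - 1:] - s[: n - k + 1]  # width of each candidate interval
    i = int(np.argmin(widths))
    return s[i], s[i + k - 1]

# Synthetic stand-in for posterior samples of a boundary date (cal AD)
rng = np.random.default_rng(0)
draws = rng.normal(120.0, 30.0, size=5000)
lo, hi = hpd_interval(draws)
```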
This study compares the frequency and severity of influenza A/H1N1pdm09 (A/H1), influenza A/H3N2 (A/H3) and other respiratory virus infections in hospitalised patients. Data from 17 332 adult patients admitted with a respiratory illness to Sir Charles Gairdner Hospital, Perth, Western Australia, between 2012 and 2015 were linked with reverse transcription polymerase chain reaction results for respiratory viruses including A/H1, A/H3, influenza B, human metapneumovirus, respiratory syncytial virus and parainfluenza. Of these, 1753 (10.1%) had test results. Multivariable regression analyses were conducted to compare the viruses on clinical outcomes including ICU admission, ventilation, pneumonia, length of stay and death. Patients with A/H1 were more likely than patients with A/H3 to experience severe outcomes such as ICU admission (OR 2.5, 95% CI 1.2–5.5, P = 0.016) and pneumonia (OR 3.0, 95% CI 1.6–5.7, P < 0.001), and had a lower likelihood of discharge from hospital (indicating longer hospitalisation; HR 0.64, 95% CI 0.47–0.88, P = 0.005). Patients with a non-influenza respiratory virus were less likely to experience severe clinical outcomes than patients with A/H1, but had a similar likelihood compared with patients with A/H3. Overall, patients hospitalised with A/H1 had higher odds of severe outcomes than patients with A/H3 or other respiratory viruses. Knowledge of circulating influenza strains is important for healthcare preparedness.
We have detected 27 new supernova remnants (SNRs) using a new data release of the GLEAM survey from the Murchison Widefield Array telescope, including the lowest surface brightness SNR ever detected, G0.1−9.7. Our method uses spectral fitting to the radio continuum to derive spectral indices for 26 of the 27 candidates, and our low-frequency observations probe a steeper-spectrum population than previously discovered. None of the candidates have coincident WISE mid-infrared emission, further showing that the emission is non-thermal. Using pulsar associations, we derive physical properties for six candidate SNRs, finding that G0.1−9.7 may be younger than 10 kyr. Sixty per cent of the candidates subtend areas larger than 0.2 deg² on the sky, compared to <25% of previously detected SNRs. We also make the first detection of two SNRs in the Galactic longitude range 220°–240°.
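The spectral indices come from fitting a power law S ∝ ν^α to the radio continuum, which is a straight-line fit in log–log space. A sketch with a synthetic steep-spectrum source (frequencies loosely modelled on low-frequency sub-bands; flux values invented):

```python
import numpy as np

def spectral_index(freqs_mhz, fluxes_jy):
    """Least-squares power-law fit S ∝ nu^alpha in log-log space;
    returns the spectral index alpha and the log10 amplitude."""
    alpha, amp = np.polyfit(np.log10(freqs_mhz), np.log10(fluxes_jy), 1)
    return alpha, amp

# Synthetic non-thermal source with a true spectral index of -0.7
freqs = np.array([76.0, 107.0, 143.0, 170.0, 200.0, 227.0])
fluxes = 2.0 * (freqs / 200.0) ** -0.7
alpha, _ = spectral_index(freqs, fluxes)
```

Steep negative indices of this kind indicate synchrotron (non-thermal) emission, consistent with the absence of mid-infrared counterparts noted above.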
This work makes available a further portion of the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering half of the accessible galactic plane, across 20 frequency bands sampling 72–231 MHz. Unlike previous GLEAM data releases, we used multi-scale CLEAN to better deconvolve large-scale galactic structure. For the galactic longitude ranges 345° < l < 67° and 180° < l < 240°, we provide a compact source catalogue of 22 037 components selected from a 60-MHz bandwidth image centred at 200 MHz, with position accuracy better than 2 arcsec. The catalogue is 50% complete near its flux-density limit and has a reliability of 99.86%. It is available from VizieR; the images are made available on the GLEAM Virtual Observatory (VO) server and SkyView.
We examined the latest data release from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering 345° < l < 60° and 180° < l < 240°, using these data and those of the Widefield Infrared Survey Explorer to follow up candidate supernova remnants (SNRs) proposed from other sources. Of the 101 candidates proposed in the region, we are able to definitively confirm ten as SNRs, tentatively confirm two as SNRs, and reclassify five as H ii regions. A further two are detectable in our images but difficult to classify; the remaining 82 are undetectable in these data. We also investigated the 18 unclassified Multi-Array Galactic Plane Imaging Survey (MAGPIS) candidate SNRs, newly confirming three as SNRs, reclassifying two as H ii regions, and exploring the unusual spectra and morphology of two others.
Increasing weed control costs and limited herbicide options threaten vegetable crop profitability. Traditional interrow mechanical cultivation is very effective at removing weeds between crop rows, but weed control within the crop rows is necessary to establish the crop and prevent yield loss. Currently, many vegetable crops require hand weeding to remove weeds within the row that remain after traditional cultivation and herbicide use. Intelligent cultivators have come into commercial use to remove intrarow weeds and reduce the cost of hand weeding. Intelligent cultivators currently on the market, such as the Robovator, use pattern recognition to detect the crop row. These cultivators do not differentiate between crops and weeds and do not work well among high weed populations. One approach to differentiating weeds is to place a machine-detectable mark or signal on the crop (i.e., the crop has the mark and the weed does not), thereby facilitating weed/crop differentiation. Lettuce and tomato plants were marked with labels and topical markers, then cultivated with an intelligent cultivator programmed to identify the markers. Results from field trials in marked tomato and lettuce found that the intelligent cultivator removed 90% more weeds from tomato and 66% more weeds from lettuce than standard cultivators, without reducing yields. The accurate crop and weed differentiation described here resulted in a 45% to 48% reduction in hand-weeding time per hectare.
An outbreak of 18 cases of hepatitis A virus infection across five Canadian provinces was investigated. Case onsets occurred between October 2017 and May 2018. A retrospective matched case-control study was conducted to identify the likely source of the outbreak. Three matched controls were recruited for each case using a previously established control bank, supplemented by landline and cell phone call lists. Univariate and multivariate matched analyses were conducted to identify a potential outbreak source. Seventy-two per cent of controls were recruited through the control bank, requiring on average 25.5 calls per recruited control; 20% of controls were recruited through a landline sample and 8% through a cell phone sample, requiring an average of 847.3 and 331.7 calls per recruited control, respectively. Results of the analysis pointed to shrimp/prawns (odds ratio (OR) 15.75, p = 0.01) and blackberries (OR 7.21, p = 0.02) as foods of interest; however, an outbreak source could not be confirmed. The control bank proved to be a more efficient method for control recruitment than random call lists. Expanding the control bank size and using alternative methods, such as online surveys, may prove beneficial for increasing the timeliness of a case-control study during an outbreak investigation.
Smoking prevalence is higher amongst individuals with schizophrenia and depression compared with the general population. Mendelian randomisation (MR) can examine whether this association is causal using genetic variants identified in genome-wide association studies (GWAS).
We conducted two-sample MR to explore the bi-directional effects of smoking on schizophrenia and depression. For smoking behaviour, we used (1) a smoking initiation GWAS from the GSCAN consortium and (2) our own GWAS of lifetime smoking behaviour (which captures smoking duration, heaviness and cessation) in a sample of 462 690 individuals from the UK Biobank. We validated this instrument using positive control outcomes (e.g. lung cancer). For schizophrenia and depression we used GWAS from the PGC consortium.
There was strong evidence to suggest smoking is a risk factor for both schizophrenia (odds ratio (OR) 2.27, 95% confidence interval (CI) 1.67–3.08, p < 0.001) and depression (OR 1.99, 95% CI 1.71–2.32, p < 0.001). Results were consistent across both lifetime smoking and smoking initiation. We found some evidence that genetic liability to depression increases smoking (β = 0.091, 95% CI 0.027–0.155, p = 0.005) but evidence was mixed for schizophrenia (β = 0.022, 95% CI 0.005–0.038, p = 0.009) with very weak evidence for an effect on smoking initiation.
These findings suggest that the association between smoking, schizophrenia and depression is due, at least in part, to a causal effect of smoking, providing further evidence for the detrimental consequences of smoking on mental health.
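Two-sample MR estimates of the kind reported above are typically obtained by combining per-variant Wald ratios with inverse-variance weights (IVW). A minimal fixed-effect sketch on toy summary statistics (invented values, not the GSCAN/PGC data):

```python
import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effect inverse-variance-weighted MR estimate: combines
    per-variant Wald ratios (beta_out / beta_exp), each weighted by
    the precision of the variant-outcome association."""
    ratios = beta_out / beta_exp
    weights = (beta_exp / se_out) ** 2
    est = np.sum(weights * ratios) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))
    return est, se

# Toy summary statistics: true causal effect 0.5, no pleiotropy
beta_exp = np.array([0.10, 0.20, 0.15, 0.05])
beta_out = 0.5 * beta_exp
se_out = np.array([0.02, 0.02, 0.03, 0.01])
est, se = ivw_estimate(beta_exp, beta_out, se_out)
```

In real analyses the IVW estimate is usually accompanied by pleiotropy-robust sensitivity estimators (e.g. MR-Egger, weighted median), which are omitted here.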
Fossil crayfish are typically rare, worldwide. In Australia, the strictly Southern Hemisphere clade Parastacidae, while ubiquitous in modern freshwater systems, is known only from sparse fossil occurrences from the Aptian–Albian of Victoria. We expand this record to the Cenomanian of northern New South Wales, where opalized bio-gastroliths (temporary calcium storage bodies found in the foregut of pre-moult crayfish) form a significant proportion of the fauna of the Griman Creek Formation. Crayfish bio-gastroliths are exceedingly rare in the fossil record but here form a remarkable supplementary record for crayfish, whose body and trace fossils are otherwise unknown from the Griman Creek Formation. The new specimens indicate that parastacid crayfish were widespread in eastern Australia by middle Cretaceous time, occupying a variety of freshwater ecosystems from the Australian–Antarctic rift valley in the south, to the near-coastal floodplains surrounding the epeiric Eromanga Sea further to the north.
Emergency service (ambulance, police, fire) call-takers and dispatchers are often exposed to duty-related trauma, placing them at increased risk of developing mental health challenges such as stress, anxiety, depression, and posttraumatic stress disorder (PTSD). Their unique working environment also puts them at risk for physical health issues such as obesity, headache, backache, and insomnia. Along with the stress associated with being on the receiving end of difficult calls, call-takers and dispatchers also deal with the pressure and demand of following protocol despite the variability of complex and stressful situations.
A systematic literature review was conducted using the MEDLINE, PubMed, CINAHL, and PsycINFO databases.
A total of 25 publications were retrieved by the search strategy. The majority of studies (n = 13; 52%) reported a quantitative methodology, while nine (36%) reported the use of a qualitative research methodology. One study reported a mixed-methods methodology, one reported an evaluability assessment with semi-structured interviews, one reported on a case study, and one was a systematic review with a narrative synthesis.
Challenges to physical health included: shift-work leading to lack of physical activity, poor nutrition, and obesity; outdated and ergonomically ill-fitted equipment, and physically confining and isolating work spaces leading to physical injuries; inadequate breaks leading to fatigue; and high noise levels and poor lighting being correlated with higher cortisol levels. Challenges to mental health included: being exposed to traumatic calls; working in high-pressure environments with little downtime in between stressful calls; inadequate debriefing after stressful calls; inappropriate training for mental-health-related calls; and being exposed to verbally aggressive callers. Lack of support from leadership was an additional source of stress.
Emergency service call-takers and dispatchers experience both physical and mental health challenges as a result of their work, which appears to be related to a range of both operational and support-based issues. Future research should explore the long-term effects of these physical and mental health challenges.