In electron beams where space charge plays an important role in the beam transport, the beams’ transverse and longitudinal properties will become coupled. One example of this is the transverse–longitudinal correlation produced in a current-modulated beam generated in a DC electron gun, formed through the competition between the time-dependent radial space charge force and the time-independent radial focusing force. This correlation will cause both the slice radius and divergence of the beam extracted from the gun to depend on the slice current. Here we consider the transport of such a beam in a linearly tapered solenoid focusing channel. Transport performance was generally improved with longer taper lengths, minimal initial correlation between slice divergence and slice current, and moderate degrees of initial correlation between initial slice radius and slice current. Performance was also generally improved with lower slice emittances, although surprisingly transport was improved by slightly increasing the assumed slice emittance in certain limited circumstances.
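The slice-by-slice transport described above can be illustrated with the standard KV/rms beam-envelope equation, r'' = −k(z)r + K/r + ε²/r³, where k(z) is the (here linearly tapered) solenoid focusing strength, K the generalized perveance set by the slice current, and ε the slice emittance. The sketch below is a generic illustration with assumed parameter values and a plain RK4 integrator, not the authors' actual simulation:

```python
import math

def envelope(r0, rp0, K=1e-3, eps=1e-6, k0=4.0, k1=1.0, L=1.0, n=2000):
    """Integrate the envelope equation r'' = -k(z) r + K/r + eps^2/r^3
    through a linearly tapered solenoid channel whose focusing strength
    k(z) ramps from k0 to k1 over taper length L.
    Fourth-order Runge-Kutta; all parameter values are illustrative."""
    h = L / n

    def f(z, r, rp):
        k = k0 + (k1 - k0) * z / L          # linear taper of focusing
        return rp, -k * r + K / r + eps**2 / r**3

    z, r, rp = 0.0, r0, rp0
    for _ in range(n):
        a1, b1 = f(z, r, rp)
        a2, b2 = f(z + h / 2, r + h / 2 * a1, rp + h / 2 * b1)
        a3, b3 = f(z + h / 2, r + h / 2 * a2, rp + h / 2 * b2)
        a4, b4 = f(z + h, r + h * a3, rp + h * b3)
        r += h / 6 * (a1 + 2 * a2 + 2 * a3 + a4)
        rp += h / 6 * (b1 + 2 * b2 + 2 * b3 + b4)
        z += h
    return r, rp
```

Sweeping K (slice current) and the initial (r0, rp0) correlation, while varying L, reproduces the kind of taper-length study the abstract describes, at sketch level only.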
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Substantial clinical heterogeneity of major depressive disorder (MDD) suggests it may group together individuals with diverse aetiologies. Identifying distinct subtypes should lead to more effective diagnosis and treatment, while providing more useful targets for further research. Genetic and clinical overlap between MDD and schizophrenia (SCZ) suggests an MDD subtype may share underlying mechanisms with SCZ.
The present study investigated whether a neurobiologically distinct subtype of MDD could be identified by SCZ polygenic risk score (PRS). We explored interactive effects between SCZ PRS and MDD case/control status on a range of cortical, subcortical and white matter metrics among 2370 male and 2574 female UK Biobank participants.
There was a significant SCZ PRS by MDD interaction for rostral anterior cingulate cortex (RACC) thickness (β = 0.191, q = 0.043). This was driven by a positive association between SCZ PRS and RACC thickness among MDD cases (β = 0.098, p = 0.026), compared to a negative association among controls (β = −0.087, p = 0.002). MDD cases with low SCZ PRS showed thinner RACC, although the opposite difference for high-SCZ-PRS cases was not significant. There were nominal interactions for other brain metrics, but none remained significant after correcting for multiple comparisons.
Our significant results indicate that MDD case-control differences in RACC thickness vary as a function of SCZ PRS. Although this was not the case for most other brain measures assessed, our specific findings still provide some further evidence that MDD in the presence of high genetic risk for SCZ is subtly neurobiologically distinct from MDD in general.
Over the past 25 years, numerous studies utilizing both X-ray diffraction (XRD) and differential scanning calorimetry (DSC) have been reported in the literature. Generally, conventional high-temperature X-ray data are used to identify solid-state transitions, which are then correlated with thermal events observed by the calorimeter. Since changes occur in the sample during such studies, separate portions of the sample must be used for the XRD and DSC experiments. When comparing results of the two experiments, questions arise concerning sample homogeneity as well as temperature and environmental differences. In fact, no conventional high-temperature X-ray diffraction instrument can give the precise control over temperature and heating rate available with a DSC. The problems of sample inhomogeneities and instrumental differences could be avoided if X-ray diffraction and DSC could be performed simultaneously on one sample.
Objective: Few studies have investigated the assessment and functional impact of egocentric and allocentric neglect among stroke patients. This pilot study aimed to determine (1) whether allocentric and egocentric neglect could be dissociated among a sample of stroke patients using eye tracking; (2) the specific patterns of attention associated with each subtype; and (3) the nature of the relationship between neglect subtype and functional outcome. Method: Twenty acute stroke patients were administered neuropsychological assessment batteries, a pencil-and-paper Apples Test to measure neglect subtype, and an adaptation of the Apples Test with an eye tracking measure. To test clinical discriminability, twenty age- and education-matched control participants were administered the eye tracking measure of neglect. Results: The eye tracking measure identified a greater number of individuals as having egocentric and/or allocentric neglect than the pencil-and-paper Apples Test. Classification of neglect subtype based on eye tracking performance was a significant predictor of functional outcome beyond that accounted for by the neuropsychological test performance and Apples Test neglect classification. Preliminary evidence suggests that patients with no neglect symptoms had superior functional outcomes compared with patients with neglect. Patients with combined egocentric and allocentric neglect had poorer functional outcomes than those with either subtype. Functional outcomes of patients with either allocentric or egocentric neglect did not differ significantly. The applications of our findings, to improve neglect detection, are discussed. Conclusion: Results highlight the potential clinical utility of eye tracking for the assessment and identification of neglect subtype among stroke patients to predict functional outcomes. (JINS, 2019, 25, 479–489)
This replication study examined protective effects of positive childhood memories with caregivers (“angels in the nursery”) against lifespan and intergenerational transmission of trauma. More positive, elaborated angel memories were hypothesized to buffer associations between mothers’ childhood maltreatment and their adulthood posttraumatic stress disorder (PTSD) and depression symptoms, comorbid psychopathology, and children's trauma exposure. Participants were 185 mothers (M age = 30.67 years, SD = 6.44, range = 17–46 years, 54.6% Latina, 17.8% White, 10.3% African American, 17.3% other; 24% Spanish speaking) and children (M age = 42.51 months; SD = 15.95, range = 3–72 months; 51.4% male). Mothers completed the Angels in the Nursery Interview (Van Horn, Lieberman, & Harris, 2008), and assessments of childhood maltreatment, adulthood psychopathology, children's trauma exposure, and demographics. Angel memories significantly moderated associations between maltreatment and PTSD (but not depression) symptoms, comorbid psychopathology, and children's trauma exposure. For mothers with less positive, elaborated angel memories, higher levels of maltreatment predicted higher levels of psychopathology and children's trauma exposure. For mothers with more positive, elaborated memories, however, predictive associations were not significant, reflecting protective effects. Furthermore, protective effects against children's trauma exposure were significant only for female children, suggesting that angel memories may specifically buffer against intergenerational trauma from mothers to daughters.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated the use of medical devices employed for real-time diagnostic decisions to support trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph, creating the e-ACI-TIPI, using the same data sets as the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared performance to cohorts identified from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
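The ROC areas quoted above summarize how well a continuous score (such as a predicted ACS probability) separates cases from non-cases: the area equals the Mann–Whitney probability that a randomly chosen positive case outscores a randomly chosen negative one. The function below is a minimal illustration of that identity, not the study's code:

```python
def roc_auc(scores, labels):
    """ROC curve area computed as the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs in which the positive
    case scores higher, with ties counting one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfect separation gives area 1.0; one swapped pair out of four gives 0.75.
```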
Internal gravity wave energy contributes significantly to the energy budget of the oceans, affecting mixing and the thermohaline circulation. Hence it is important to determine the internal wave energy flux J = pv, where p is the pressure perturbation field and v is the velocity perturbation field. However, the pressure perturbation field is not directly accessible in laboratory or field observations. Previously, a Green’s function based method was developed to calculate the instantaneous energy flux field from a measured density perturbation field, given a constant buoyancy frequency N. Here we present methods for computing the instantaneous energy flux J for an internal wave field with vertically varying background N(z), as in the oceans, where N typically decreases by two orders of magnitude from the pycnocline to the deep ocean. Analytic methods are presented for computing J from a density perturbation field for N(z) varying linearly with z. To generalize this approach to arbitrary N(z), we present a computational method for obtaining J. The results for J for the different cases agree well with results from direct numerical simulations of the Navier–Stokes equations. Our computational method can be applied to any density perturbation data using the MATLAB graphical user interface ‘EnergyFlux’.
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding including, fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
Grommet insertion is a common surgical procedure in children. Long waiting times for grommet insertion are not unusual. This project aimed to streamline the process by introducing a pathway for audiologists to directly schedule children meeting National Institute for Health and Care Excellence Clinical Guideline 60 (‘CG60’) for grommet insertion.
Method and results
A period from June to November 2014 was retrospectively audited. Mean duration between the first audiology appointment and grommet insertion was 294.5 days (median = 310 days). Implementing the direct-listing pathway reduced this duration (mean = 232 days; median = 231 days), giving a reduction in the time between the first audiology appointment and surgery (mean difference of 62.5 days; p = 0.024) and in the time between the second audiology appointment and surgery (28 days; p = 0.009).
Direct-listing pathways for grommet insertion can reduce waiting times and expedite surgery. Implementation involves a simple alteration of current practice, adhering to National Institute for Health and Care Excellence Clinical Guideline 60. The ultimate decision regarding surgery still rests with ENT specialists.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies reported mixed findings for serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (Prace difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (Prace difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
Potential participants seek information about clinical trials for many reasons, but the process can be challenging. We analyzed 101,249 searches in ResearchMatch Trials Today, a free interface to recruiting trials from ClinicalTrials.gov. Searches from March 2015 to November 2016 included a broad range of conditions and healthy volunteer concepts, including 12,649 unique topics. Trials Today data indicate that it is being used to identify trials on a variety of topics.
Introduction: Emergency Department Overcrowding (EDOC) is a multifactorial issue that leads to Access Block for patients needing emergency care. Identified as a national problem, patients presenting to a Canadian Emergency Department (ED) at a time of overcrowding have higher rates of admission to hospital and increased seven-day mortality. Using the well-accepted input-throughput-output model to study EDOC, current research has focused on throughput as a measure of patient flow, reported as ED length of stay (LOS). Indeed, ED LOS and ED beds occupied by inpatients are two “extremely important” indicators of EDOC identified by a 2005 survey of Canadian ED directors. One proposed solution to improve ED throughput is to utilize a physician at triage (PAT) to rapidly assess newly arriving patients. In 2017, a pilot PAT program was trialed at Kelowna General Hospital (KGH), a tertiary care hospital, as part of a plan-do-study-act (PDSA) cycle. The aim was to mitigate EDOC by improving ED throughput by the end of 2018, to meet the national targets for ED LOS suggested in the 2013 CAEP position statement. Methods: During fiscal periods 1-6 (April 1 to September 7, 2017), a PAT shift occurred daily from 1000-2200, over four long weekends. ED LOS, time to inpatient bed, time to physician initial assessment (PIA), number of British Columbia Ambulance Service (BCAS) offload delays, and number of patients who left without being seen (LWBS) were extracted from an administrative database. Results were retrospectively analyzed and compared to data from 1000-2200 of non-PAT trial days during the trial periods. Results: Median ED LOS decreased from 3.8 to 3.4 hours for high-acuity patients (CTAS 1-3), from 2.1 to 1.8 hours for low-acuity patients (CTAS 4-5), and from 9.3 to 8.0 hours for all admitted patients.
During PAT trial weekends, there was a decrease in the average time to PIA by 65% (from 73 to 26 minutes for CTAS 2-5), average number of daily BCAS offload delays by 39% (from 2.3 to 1.4 delays per day), and number of patients who LWBS from 2.4% to 1.7%. Conclusion: The implementation of PAT was associated with improvements in all five measures of ED throughput, providing a potential solution for EDOC at KGH. ED LOS was reduced compared to non-PAT control days, successfully meeting the suggested national targets. PAT could improve efficiency, resulting in the ability to see more patients in the ED, and increase the quality and safety of ED practice. Next, we hope to prospectively evaluate PAT, continuing to analyze these process measures, perform a cost-benefit analysis, and formally assess ED staff and patient perceptions of the program.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, of which existence is theoretically assured when resource allocation is perfect on the farm. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with environmental performances of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach to carry out comprehensive assessments of economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with a better animal performance and less nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for enrolling scientifically sound and biologically informative metrics for agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
Studies have consistently shown that subthreshold depression is associated with an increased risk of developing major depression. However, no study has yet calculated a pooled estimate that quantifies the magnitude of this risk across multiple studies.
We conducted a systematic review to identify longitudinal cohort studies containing data on the association between subthreshold depression and future major depression. A baseline meta-analysis was conducted using the inverse variance heterogeneity method to calculate the incidence rate ratio (IRR) of major depression among people with subthreshold depression relative to non-depressed controls. Subgroup analyses were conducted to investigate whether IRR estimates differed between studies categorised by age group or sample type. Sensitivity analyses were also conducted to test the robustness of baseline results to several sources of study heterogeneity, such as the case definition for subthreshold depression.
Data from 16 studies (n = 67 318) revealed that people with subthreshold depression had an increased risk of developing major depression (IRR = 1.95, 95% confidence interval 1.28–2.97). Subgroup analyses estimated similar IRRs for different age groups (youth, adults and the elderly) and sample types (community-based and primary care). Sensitivity analyses demonstrated that baseline results were robust to different sources of study heterogeneity.
The results of this study support the scaling up of effective indicated prevention interventions for people with subthreshold depression, regardless of age group or setting.
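The pooling step above can be sketched with the standard fixed-effect inverse-variance estimator that the inverse variance heterogeneity (IVhet) method builds on; IVhet keeps the same weights but inflates the variance to absorb between-study heterogeneity. The function and data below are illustrative, not the study's:

```python
import math

def pool_log_irr(log_irrs, ses):
    """Fixed-effect inverse-variance pooling of study log incidence
    rate ratios (log-IRRs), each with its standard error.
    Returns the pooled IRR and its 95% confidence interval.
    (Sketch of the common core only; IVhet widens the interval.)"""
    w = [1 / se**2 for se in ses]                       # inverse-variance weights
    pooled = sum(wi * x for wi, x in zip(w, log_irrs)) / sum(w)
    se = math.sqrt(1 / sum(w))                          # SE of pooled log-IRR
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Two hypothetical studies, each with IRR = 2.0 and SE(log IRR) = 0.1,
# pool to IRR = 2.0 with a narrower interval than either study alone.
```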
Water cultures were significantly more sensitive than concurrently collected swab cultures (n=2,147 each) in detecting Legionella pneumophila within a Veterans Affairs healthcare system. Sensitivity for water versus swab cultures was 90% versus 30% overall, 83% versus 48% during a nosocomial Legionnaires’ disease outbreak, and 93% versus 22% post outbreak.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Despite the ubiquity of tunnel channels and valleys within formerly glaciated areas, their origin remains enigmatic. Few modern analogues exist for event-related subglacial erosion. This paper presents evidence of subglacial meltwater erosion and tunnel channel formation during the November 1996 jökulhlaup, Skeiðarárjökull, Iceland. The jökulhlaup reached a peak discharge of 45 000 to 50 000 m³ s⁻¹, with flood outbursts emanating from multiple outlets across the entire 23 km wide glacier snout. Subsequent retreat of the southeast margin of Skeiðarárjökull has revealed a tunnel channel excavated into the surrounding moraine sediment and ascending 11.5 m over a distance of 160 m from a larger trough to join the apex of an ice-contact fan formed in November 1996. The tunnel channel formed via hydro-mechanical erosion of 14 000 m³ to 24 000 m³ of unconsolidated glacier substrate, evidenced by copious rip-up clasts within the ice-contact fan. Flow reconstruction provides peak discharge estimates of 680 ± 140 m³ s⁻¹. The tunnel channel orientation, oblique to the local ice flow direction and within a col, suggests that local jökulhlaup routing was controlled by (a) subglacial topography and (b) the presence of a nearby proglacial lake. We describe the first modern example of tunnel channel formation and illustrate the importance of pressurized subglacial jökulhlaup flow for tunnel channel formation.