Aircraft ground taxiing contributes significantly to carbon emissions and engine wear. The electric towing tractor (ETT) addresses these issues by towing the aircraft to the runway end, thereby minimising ground taxiing. As ETT towing operations grow more complex, both towing distance and towing time increase significantly, and the original method for estimating the number of ETTs is no longer applicable. Because ETTs are costly to acquire, their number must be kept low without compromising operational efficiency. This paper therefore introduces, for the first time, an ETT quantity estimation model that combines a simulation model with a vehicle scheduling model. The simulation model captures the impact of ETTs on apron operations, taxiing on taxiways, and takeoffs and landings on runways. Key timing points for ETT usage by each aircraft are identified through simulation and form the input to a hard time-window vehicle scheduling model that determines the minimum number of vehicles required for airport operations. The simulation model is verified to ensure its validity. The study further explores the influence of vehicle speed and airport scale on the required number of ETTs. The results demonstrate that the simulation model effectively represents real airport operations. ETT speed, runway and taxiway configurations, takeoff and landing frequencies, and imbalances during peak periods all affect the required number of ETTs, so a comprehensive approach considering these factors is needed to determine the optimal fleet size.
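To make the scheduling step concrete, the following is a minimal sketch, not the paper's model: once simulation yields a hard [start, end] tow window for each aircraft, a lower bound on the ETT fleet size can be obtained by greedy interval partitioning. Repositioning time between jobs is ignored here (the full vehicle scheduling model would account for it), and the function name and example windows are illustrative.

```python
import heapq

def min_fleet_size(tow_jobs):
    """Lower bound on the number of tractors needed when each tow job is a
    hard [start, end] time window and a tractor can start a new job as soon
    as its previous one ends (repositioning time between stands ignored)."""
    free_at = []                             # min-heap of tractor release times
    fleet = 0
    for start, end in sorted(tow_jobs):      # process jobs by start time
        if free_at and free_at[0] <= start:
            heapq.heapreplace(free_at, end)  # reuse the earliest-free tractor
        else:
            fleet += 1                       # no tractor is free: add one
            heapq.heappush(free_at, end)
    return fleet

# Hypothetical (push-back time, runway-end arrival time) pairs in minutes
print(min_fleet_size([(0, 25), (10, 35), (20, 45), (40, 60)]))  # -> 3
```

With deadhead travel between stands included, the problem becomes the hard time-window vehicle scheduling model described above, which the paper solves to obtain the minimum fleet.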
An investigation into an outbreak of Salmonella Newport infections in Canada was initiated in July 2020. Cases were identified across several provinces through whole-genome sequencing (WGS). Exposure data were gathered through case interviews. Traceback investigations were conducted using receipts, invoices, import documentation, and menus. A total of 515 cases were identified in seven provinces, related by 0–6 whole-genome multi-locus sequence typing (wgMLST) allele differences. The median age of cases was 40 years (range 1–100), 54% were female, 19% were hospitalized, and three deaths were reported. Forty-eight location-specific case sub-clusters were identified in restaurants, grocery stores, and congregate living facilities. Of the 414 cases with exposure information available, 71% (295) reported eating onions in the week prior to becoming ill, and 80% of those reported red onions specifically. The traceback investigation identified red onions from Grower A in California, USA, as the likely source of the outbreak, and the first of many food recall warnings was issued on 30 July 2020. Salmonella was not detected in any tested food or environmental samples. This paper summarizes the collaborative efforts undertaken to investigate and control the largest Salmonella outbreak in Canada in over 20 years.
The omnipresence of change has been singled out as posing an important challenge to law, both in theory and in practice, throughout its history. Arguably, the most efficient method of adapting the law to constant change is legal education. Recent changes in the global arena have added to the complexity of the role expected of future legal talent, requiring them to acquire not only profound knowledge of local and global laws but also a variety of legal and non-legal skills. This article presents some of the principal challenges faced by law schools and legal education today. These challenges are then explored using the example of a new Bachelor of Laws (LL.B.) degree programme in Chinese Law and Global Legal Studies in the English Language, to be offered by the Faculty of Law of the University of Macau in Macao, China.
Eggs are highly digestible, nutrient-rich, and a valuable source of protein and choline, thereby promoting a range of health benefits. Several studies have found an association between protein intake and gastrointestinal microbial diversity(1), while bacterial fermentation of undigested protein in the large bowel can produce short-chain fatty acids, such as butyrate, positively influencing host metabolic health, gut integrity and immune function(2). On the other hand, dietary choline stimulates gastrointestinal bacterial production of trimethylamine and the prothrombotic compound trimethylamine-N-oxide (TMAO)(3). Despite these established links, few studies have explored the effects of whole egg intake on indices of gastrointestinal health. This systematic literature review aimed to synthesise research investigating the impact of egg-supplemented diets or egg consumption on markers of gastrointestinal health, including the microbiome, function and symptoms. The review was conducted in accordance with PRISMA guidelines. Five databases (Ovid Medline, Embase, CINAHL Plus, SCOPUS, and PsycINFO), and the reference lists of relevant papers, were searched from inception until April 2023. Studies were included if they examined the link between whole chicken egg consumption and gastrointestinal health in healthy adults (aged >16 years). Indices of gastrointestinal health were defined as any outcomes related to gastrointestinal factors, including symptoms, microbiome, inflammation, colonic fermentation and TMAO. Reviews and case studies were excluded. All studies underwent risk of bias assessment. Overall, 548 studies were identified and 19 were included after screening: 8 randomised controlled trials (RCTs), 8 cross-sectional studies and 3 prospective cohort studies. Participant numbers ranged from 20 to 32,166 and ages from 18 to 84 years. Study periods varied between 3 and 14 weeks for RCTs and 6 months to 12.5 years for prospective cohort studies. RCTs examined intakes of 1–4 eggs/day, with the majority examining 3 eggs/day (n = 6). The primary outcome across 15 articles was TMAO levels, with most reporting no significant associations (n = 13). Five studies examined inflammation, with inconsistent findings ranging from no alterations (in TNF-α, IL-8, CRP) to increases (in the anti-inflammatory marker LTB5, and TNF-α) and decreases (in IL-6, CRP). Lastly, 7 studies explored alterations in the microbiome. Two RCTs and 2 cross-sectional studies reported no alterations in microbial diversity in response to eggs, while 2 cross-sectional and 1 prospective study linked specific bacteria to consistent egg intake. Eggs were associated with species that produce butyrate (E. rectale, F. prausnitzii, M. smithii and R. bromii) and protect against metabolic syndrome (A. muciniphila). This systematic review found that egg consumption did not increase levels of the undesirable biomarker TMAO and was associated with butyrate-producing bacteria. Evidence regarding the effect of egg intake on inflammation was inconsistent. The review also revealed a general lack of research investigating whole eggs and gastrointestinal health. Carefully designed RCTs are required to improve understanding of how eggs may influence the gastrointestinal microbiome and colonic fermentation.
Eggs provide several nutrients that have been linked to neurological function. Phospholipids, which comprise 30% of the lipids in egg yolk, modulate neurotransmitter receptors and have been shown to lower reaction time in healthy adults(1). Eggs are also high in choline (340 mg per egg), a building block for acetylcholine, a neurotransmitter involved in memory, learning and attention(2). Finally, eggs contain the omega-3 fatty acid docosahexaenoic acid (DHA) (25 mg per egg), which has roles in neurological function including neurogenesis, synaptic plasticity and myelination(3). The impact of whole egg consumption on cognition has not been widely explored. This systematic review aimed to consolidate studies investigating the frequency of egg consumption, or egg-supplemented diets, in relation to cognitive function. The review followed PRISMA guidelines and involved a search of five databases (Ovid Medline, Embase, CINAHL Plus, SCOPUS, and PsycINFO) from inception until April 2023. Included studies examined the link between whole chicken egg consumption and brain function, including cognitive decline, memory, risk-taking, reaction time, decision-making and executive function, in healthy adults (aged >16 years). All studies underwent risk of bias assessment. Twelve studies were included in the review: 4 prospective cohort studies, 4 retrospective studies, 3 cross-sectional studies and 1 randomised controlled trial (RCT). Participant numbers, with the exception of the RCT, ranged from 178 to 9,028, with ages between 42 and 97 years. The duration of prospective studies varied from 2 to 5 years. Egg intake was measured via food frequency questionnaires (n = 8), 24-hr diet recalls (n = 2), a 4-day food record (n = 1) and a 7-day food record (n = 1). The RCT provided 2 DHA-fortified eggs/day compared with 2 whole eggs/day for 8 weeks. The primary outcome across 9 studies was cognitive decline, followed by memory (n = 7), reaction time (n = 2), attention (n = 2) and executive function (n = 1). For outcome measures, studies used 9 different validated task-oriented tools (including the Montreal Cognitive Assessment, n = 3, and the California Verbal Learning Test, n = 2) or 4 self-completed questionnaires. Several studies found no significant associations between egg consumption and cognitive decline (n = 4) or memory (n = 2). Conversely, 5 studies reported significant inverse associations between egg consumption and rates of cognitive decline. The RCT found that reaction times were faster on both whole eggs and DHA-fortified eggs after 8 weeks (p > 0.05 between groups). Although results conflicted, more studies found greater habitual egg consumption to be associated with reduced cognitive decline. However, the variety of outcome measures across studies makes direct comparisons challenging, preventing definitive conclusions about the impact of eggs on cognitive health. This review highlights the need for future RCTs.
The aim of this work was to determine the extent of moral distress (MD) among emergency physicians, nurses, and emergency medical service staff at the Rand Memorial Hospital (RMH) in the Bahamas, and the impact of Hurricane Dorian and the COVID-19 pandemic on MD.
Method:
A cross-sectional study was conducted using a three-part survey that collected sociodemographic information, Hurricane Dorian and COVID-19 experiences, and responses to a validated modified Moral Distress Scale (MDS).
Results:
Participants with 2 negatively impactful experiences from COVID-19 had statistically significantly increased MD compared to participants with only 1 negatively impactful experience (40.4 vs. 23.6, P = 0.014). Losing a loved one due to COVID-19 was associated with significantly decreased MD (B = -0.42, 95% CI -19.70 to -0.88, P = 0.03). Losing a loved one due to Hurricane Dorian had a non-statistically significant trend towards higher MD scores (B = 0.34, 95% CI -1.23 to 28.75, P = 0.07).
Conclusion:
The emergency medical staff at the RMH reported mild-to-moderate MD. This is one of the first studies to examine the impact of concurrent disasters on MD among emergency medical providers in the Bahamas.
Two 10-day in vitro experiments were conducted to investigate the relationship between nitrogen (N) isotope discrimination (δ15N) and ammonia (NH3) emissions from sheep manure. In Exp. 1, three manure mixtures were set up: control (C), C mixed with lignite (C + L) and grape marc (GM), with 5, 4 and 5 replications, respectively. For C, urine and faeces were collected from sheep fed a diet of 550 g lucerne hay/kg, 400 g barley grain/kg and 50 g faba bean/kg. For C + L, urine and faeces were collected from sheep fed the C diet, with 100 g of ground lignite added to each incubation system at the start of the experiment. For GM, urine and faeces were collected from sheep fed the C diet with 200 g/kg of the diet replaced by GM. In Exp. 2, three urine-faeces mixtures were set up: 2U:1F, 1.4U:1F and 1U:1F, with urine-to-faeces ratios of 2:1, 1.4:1 and 1:1, respectively, each with 5 replications. Lignite in C + L reduced cumulative manure-N loss by 81% and 68% compared with the C and GM groups, respectively (P = 0.001). Cumulative emitted manure NH3-N was lower in C + L than in the C and GM groups by 35% and 36%, respectively (P = 0.020). Emitted manure NH3-N was higher in 2U:1F than in 1.4U:1F and 1U:1F by 18% and 26%, respectively (P < 0.001). These results confirm the relationship between manure δ15N and cumulative NH3-N loss reported in earlier studies, which may be useful for estimating NH3 losses.
Recent research has shown the potential of speleothem δ13C to record a range of environmental processes. Here, we report on 230Th-dated stalagmite δ13C records for southwest Sulawesi, Indonesia, over the last 40,000 yr to investigate the relationship between tropical vegetation productivity and atmospheric methane concentrations. We demonstrate that the Sulawesi stalagmite δ13C record is driven by changes in vegetation productivity and soil respiration and explore the link between soil respiration and tropical methane emissions using HadCM3 and the Sheffield Dynamic Global Vegetation Model. The model indicates that changes in soil respiration are primarily driven by changes in temperature and CO2, in line with our interpretation of stalagmite δ13C. In turn, modelled methane emissions are driven by soil respiration, providing a mechanism that links methane to stalagmite δ13C. This relationship is particularly strong during the last glaciation, indicating a key role for the tropics in controlling atmospheric methane when emissions from high-latitude boreal wetlands were suppressed. With further investigation, the link between δ13C in stalagmites and tropical methane could provide a low-latitude proxy complementary to polar ice core records to improve our understanding of the glacial–interglacial methane budget.
Earth is immersed in the near-earth space plasma environment, which plays a vital role in protecting the planet against the impact of the solar wind and influences space activities. Investigating the physical processes that dominate this environment is important both for deepening our scientific understanding of it and for improving space-weather forecasting. As a crucial part of the National Major Scientific and Technological Infrastructure–Space Environment Simulation Research Infrastructure (SESRI) in Harbin, the Space Plasma Environment Research Facility (SPERF) includes a system that replicates the near-earth space plasma environment in the laboratory. The system aims to simulate the three-dimensional (3-D) structure and processes of the terrestrial magnetosphere for the first time in the world, providing a unique platform to reveal the physics of 3-D asymmetric magnetic reconnection at the earth's magnetopause, wave–particle interactions in the earth's radiation belt, particle dynamics during geomagnetic storms, etc. This paper presents the engineering design and construction of the near-earth space plasma simulation system of the SPERF, with a focus on the critical technologies that were resolved to achieve the scientific goals. The physical issues that can be studied with the apparatus are also sketched briefly. The earth-based system is of great value for understanding the space plasma environment and supporting space exploration.
We explored the utility of the standardized infection ratio (SIR) for surgical site infection (SSI) reporting in an Australian jurisdiction.
Design:
Retrospective chart review.
Setting:
Statewide SSI surveillance data from 2013 to 2019.
Patients:
Individuals who had coronary artery bypass graft (CABG) surgery, colorectal surgery (COLO), cesarean section (CSEC), hip prosthesis (HPRO), or knee prosthesis (KPRO) procedures.
Methods:
The SIR was calculated by dividing the number of observed infections by the number of predicted infections as determined using the National Healthcare Safety Network procedure-specific risk models. In line with a minimum precision criterion, an SIR was not calculated if the number of predicted infections was <1.
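As an illustrative sketch (not the surveillance programme's actual code), the calculation and the minimum-precision criterion described above can be expressed as follows; the function name is ours.

```python
def standardized_infection_ratio(observed, predicted, min_predicted=1.0):
    """SIR = observed SSIs / predicted SSIs, where 'predicted' comes from
    the NHSN procedure-specific risk models. Returns None when the number
    of predicted infections is below the minimum-precision threshold."""
    if predicted < min_predicted:
        return None  # SIR not calculated: too few predicted infections
    return observed / predicted

print(standardized_infection_ratio(3, 2.4))  # 1.25 -> more SSIs than predicted
print(standardized_infection_ratio(0, 0.6))  # None -> not reported
```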
Results:
An SIR >0 (≥1 observed SSI, predicted number of SSIs ≥1, no missing covariates) could be calculated for a median of 89.3% of reporting quarters for CABG, 75.0% for COLO, 69.0% for CSEC, 0% for HPRO, and 7.1% for KPRO. In total, 80.6% of the reporting quarters for which an SIR was not calculated were due to no observed infections or predicted infections <1, and 19.4% were due to missing covariates alone. Within hospitals, the median percentage of quarters during which zero infections were observed was 8.9% for CABG, 20.0% for COLO, 25.4% for CSEC, 67.3% for HPRO, and 71.4% for KPRO.
Conclusions:
Calculating an SIR for SSIs is challenging for hospitals in our regional network, primarily because of low event numbers and many facilities with predicted infections <1. Our SSI reporting will continue to use risk-indexed rates, in tandem with SIR values when the predicted number of SSIs is ≥1.
RadioTalk is a communication platform that enabled members of the Radio Galaxy Zoo (RGZ) citizen science project to engage in discussion threads and provide further descriptions of the radio subjects they were observing, in the form of tags and comments. It contains a wealth of auxiliary information that is useful for identifying the morphology of complex and extended radio sources. In this paper, we present this new dataset and, for the first time in radio astronomy, combine text and images to automatically classify radio galaxies using a multi-modal learning approach. We found that incorporating text features improved classification performance, demonstrating that text annotations are a rare but valuable source of information for classifying astronomical sources and suggesting the importance of exploiting multi-modal information in future citizen science projects. We also discovered over 10,000 new radio sources beyond the RGZ-DR1 catalogue in this dataset.
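The abstract does not specify the network architecture, so the following is only a minimal late-fusion sketch of the general multi-modal idea: precomputed image and text embeddings are concatenated and passed to a small classification head. The embedding dimensions, class count and layer sizes are illustrative placeholders, not the RGZ pipeline's actual values.

```python
import torch
import torch.nn as nn

class LateFusionClassifier(nn.Module):
    """Concatenate image and text embeddings, then classify morphology."""
    def __init__(self, img_dim=512, txt_dim=256, n_classes=4):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(img_dim + txt_dim, 128),
            nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, img_emb, txt_emb):
        fused = torch.cat([img_emb, txt_emb], dim=-1)  # joint representation
        return self.head(fused)

model = LateFusionClassifier()
logits = model(torch.randn(8, 512), torch.randn(8, 256))  # batch of 8 sources
print(logits.shape)  # torch.Size([8, 4])
```

The reported gain from text features is consistent with this style of fusion: the text branch can carry information (e.g. tags describing extended structure) that the image branch alone may miss.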
Internalising disorders are highly prevalent emotional dysregulations in preadolescence, but clinical decision-making is hampered by their high heterogeneity. During this period, impulsivity is a major risk factor for psychopathological trajectories and may contribute to this heterogeneity, given the controversial anxiety–impulsivity relationship. However, how impulsivity shapes the heterogeneous symptomatology, neurobiology, neurocognition and clinical trajectories of preadolescent internalising disorders remains unclear.
Aims
The aim was to identify impulsivity-dependent subtypes of preadolescent internalising disorders with distinct anxiety–impulsivity relationships and distinct neurobiological, genetic, cognitive and clinical-trajectory signatures.
Method
We applied a data-driven strategy to determine impulsivity-related subtypes in 2430 preadolescents with internalising disorders from the Adolescent Brain Cognitive Development study. Cross-sectional and longitudinal analyses were employed to examine subtype-specific signatures of the anxiety–impulsivity relationship, brain morphology, cognition and clinical trajectory from age 10 to 12 years.
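The abstract does not name the specific data-driven method, so the following is only an illustrative sketch of how subtyping of this kind is commonly done: standardise the impulsivity measures and cluster them, here with k-means (k = 2). The data matrix, facet count and variable names below are hypothetical placeholders, not ABCD study variables.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: 2430 patients x 5 impulsivity facets
# (random placeholder values, not real scores)
rng = np.random.default_rng(0)
X = rng.normal(size=(2430, 5))

X_std = StandardScaler().fit_transform(X)  # z-score each facet
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_std)
print(np.bincount(labels))                 # number of patients per subtype
```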
Results
We identified two distinct subtypes of patients with internalising disorders, both showing comparably high anxiety but distinguishable levels of impulsivity: enhanced (subtype 1) or decreased (subtype 2) relative to control participants. The two subtypes exhibited opposing anxiety–impulsivity relationships: higher anxiety at baseline was associated with greater lack of perseverance in subtype 1, but with lower sensation seeking in subtype 2, at baseline and follow-up. Subtype 1 demonstrated thicker prefrontal and temporal cortices, and showed genes enriched in immune-related diseases and in glutamatergic and GABAergic neurons. Subtype 1 also exhibited cognitive deficits and a detrimental trajectory characterised by increasing emotional/behavioural dysregulation and suicide risk during follow-up.
Conclusions
Our results indicate impulsivity-dependent subtypes in preadolescent internalising disorders and unify past controversies about the anxiety–impulsivity interaction. Clinically, individuals of the high-impulsivity subtype exhibit a detrimental trajectory, so early interventions are warranted.
We present the third data release from the Parkes Pulsar Timing Array (PPTA) project. The release contains observations of 32 pulsars obtained using the 64-m Parkes ‘Murriyang’ radio telescope. The data span is up to 18 yr with a typical cadence of 3 weeks. This data release is formed by combining an updated version of our second data release with ∼3 yr of more recent data, primarily obtained using an ultra-wide-bandwidth receiver system that operates between 704 and 4032 MHz. We provide calibrated pulse profiles, flux density dynamic spectra, pulse times of arrival, and initial pulsar timing models. We describe methods for processing such wide-bandwidth observations and compare this data release with our previous release.
Childhood is a crucial neurodevelopmental period. We investigated whether childhood reading for pleasure (RfP) was related to cognition, mental health, and brain structure assessed in young adolescence.
Methods
We conducted a cross-sectional and longitudinal study in a large-scale US national cohort (more than 10,000 young adolescents), using well-established linear mixed models and structural equation methods for twin, longitudinal and mediation analyses. A two-sample Mendelian randomization (MR) analysis for potential causal inference was also performed. Important factors, including socio-economic status, were controlled for.
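For readers unfamiliar with the modelling approach, here is a minimal sketch of a linear mixed model of the kind described, written with statsmodels in Python. The file name, column names and covariate set are hypothetical placeholders rather than the study's actual variables or code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant, with study site
# as the grouping factor for random intercepts
df = pd.read_csv("abcd_subset.csv")  # columns: cognition, rfp, ses, age, site

# Fixed effects for RfP and covariates; random intercept per site
model = smf.mixedlm("cognition ~ rfp + ses + age", data=df, groups=df["site"])
result = model.fit()
print(result.summary())
```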
Results
Early-initiated, long-standing childhood RfP (early RfP) was highly positively correlated with performance on cognitive tests and significantly negatively correlated with mental health problem scores in young adolescence. Participants with higher early RfP scores exhibited moderately larger total brain cortical areas and volumes, with enlarged regions including the temporal, frontal, insula and supramarginal cortices; the left angular and para-hippocampal regions; the right middle-occipital, anterior-cingulate and orbital areas; and the subcortical ventral diencephalon and thalamus. These brain structures were significantly related to cognitive and mental health scores and displayed significant mediation effects. Early RfP was longitudinally associated with higher crystallized cognition and fewer attention symptoms at follow-up. Approximately 12 h/week of regular RfP in youth appeared cognitively optimal. We further observed a moderately significant heritability of early RfP, with a considerable contribution from the environment. MR analysis revealed beneficial causal associations of early RfP with adult cognitive performance and left superior temporal structure.
Conclusions
These findings reveal, for the first time, important relationships of early RfP with subsequent brain and cognitive development and mental well-being.
Background: Although not approved by the FDA for treating insomnia, trazodone is commonly prescribed for this purpose in the US, partly because it is unscheduled and hence perceived as safer than z-drugs and benzodiazepines. This study investigated trazodone's abuse/dependence potential and safety risks. Methods: Cases involving trazodone or benzodiazepines frequently prescribed for insomnia (temazepam, triazolam, estazolam) were identified from the FDA Adverse Event Reporting System (FAERS), the National Forensic Laboratory Information System (NFLIS) for confiscation data, and the American Association of Poison Control Centers' National Poison Data System (AAPCC-NPDS). Drug-related falls risk was assessed from claims databases. Results: FAERS included 11,228 trazodone and 5,120 benzodiazepine reports. The proportions of drug-abuse and drug-dependence cases were lower for trazodone than for benzodiazepines (drug abuse: 6.4% vs 12.6%; drug dependence: 1.1% vs 3.6%). Proportions of serious cases (81.8% vs 83.9%) and deaths (35.4% vs 36.0%) were similar for trazodone and benzodiazepines. NFLIS reported 612 of 1,575,874 (0.04%) drug-seizure cases that included trazodone. AAPCC-NPDS reported 22,225 of 1,446,011 (1.54%) total case mentions of trazodone among all pharmaceuticals and 8,445 trazodone-related single-exposure cases. Falls risk over a 1-year period was reported for trazodone and benzodiazepines in Medicare beneficiaries aged ≥65 years (9.5% vs 11.3%) and commercially insured enrollees aged ≥18 years (4.6% vs 3.7%). Conclusions: Trazodone has abuse/dependence potential and important safety risks. Given limited data from well-controlled studies and its off-label use, re-evaluation of trazodone prescribing rates in patients with insomnia is warranted.
Industrial disasters can have myriad repercussions, ranging from deaths, injuries and long-term adverse health impacts on nearby populations to political fallout and environmental damage. This descriptive epidemiological analysis of industrial disasters occurring between 1995 and 2021 may provide useful insight for health-care systems and disaster medicine specialists seeking to prevent and mitigate the effects of future industrial disasters.
Methods:
Data were collected through a retrospective search of the Emergency Events Database (EM-DAT) for all industrial disasters occurring between January 1, 1995, and December 31, 2021.
Results:
A total of 1054 industrial disasters were recorded from 1995 to 2021. Most of these disasters occurred in Asia (720; 68.3%), with 131 (12.4%) in Africa, 107 (10.2%) in Europe, 94 (8.9%) in the Americas, and 2 (0.2%) in Oceania. Half of these disasters were explosions (533; 50.6%), 147 (13.9%) were collapses, 143 (13.6%) were fires, 46 (4.4%) were chemical spills, 41 (3.9%) were gas leaks, and 34 (3.2%) were poisonings. There were 6 (0.6%) oil spills and 3 (0.3%) radiation events.
Conclusions:
A total of 29,708 deaths and 57,605 injuries were recorded as a result of industrial disasters, which remain a significant contributor to the health risks of both workers and regional communities. The need for specialized emergency response training, the potential devastation of an industrial accident, and the vulnerability of critical infrastructure as a terror target highlight the need to better understand the potential immediate and long-term consequences of such events and to improve health-care responses in the future.
Many clinical trials leverage real-world data. Typically, these data are manually abstracted from electronic health records (EHRs) and entered into electronic case report forms (eCRFs), a time- and labor-intensive process that is also error-prone and may miss information. Automated transfer of data from EHRs to eCRFs has the potential to reduce the data abstraction and entry burden as well as improve data quality and safety.
Methods:
We conducted a test of automated EHR-to-eCRF data transfer for 40 participants in a clinical trial of hospitalized COVID-19 patients. We determined which coordinator-entered data could be populated automatically from the EHR (coverage) and how often the values from the automated EHR feed exactly matched the values entered by study personnel for the actual study (concordance).
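To make the two metrics concrete, here is a minimal sketch of how coverage and concordance, as defined above, could be computed for one participant. The field names and values are hypothetical; the study's actual data model is not described in this summary.

```python
def coverage_and_concordance(crf_values, ehr_values):
    """crf_values: field -> value entered by a study coordinator.
    ehr_values:  field -> value from the automated EHR feed (a field is
    absent when the feed cannot populate it). Returns fractions."""
    automated = [f for f in crf_values if f in ehr_values]
    coverage = len(automated) / len(crf_values)
    matches = sum(crf_values[f] == ehr_values[f] for f in automated)
    concordance = matches / len(automated) if automated else None
    return coverage, concordance

crf = {"hr": 88, "temp": 37.2, "wbc": 9.1, "o2_device": "nasal cannula"}
ehr = {"hr": 88, "temp": 37.2, "wbc": 9.0}
print(coverage_and_concordance(crf, ehr))  # (0.75, 0.666...)
```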
Results:
The automated EHR feed populated 10,081 of 11,952 (84%) coordinator-completed values. For fields where both the automation and study personnel provided data, the values matched exactly 89% of the time. Concordance was highest for daily lab results (94%), which also required the most personnel resources (30 minutes per participant). In a detailed analysis of 196 instances where personnel-entered and automation-entered values differed, both a study coordinator and a data analyst agreed that 152 (78%) were the result of data entry error.
Conclusions:
An automated EHR feed has the potential to significantly decrease study personnel effort while improving the accuracy of eCRF data.