This study estimates the incubation period of COVID-19 among locally transmitted cases, and its association with age to better inform public health measures in containing COVID-19. Epidemiological data of all PCR-confirmed COVID-19 cases from all restructured hospitals in Singapore were collected between 23 January 2020 and 2 April 2020. Activity mapping and detailed epidemiological investigation were conducted by trained personnel. Positive cases without clear exposure to another positive case were excluded from the analysis. One hundred and sixty-four cases (15.6% of patients) met the inclusion criteria during the defined period. The crude median incubation period was 5 days (range 1–12 days) and median age was 42 years (range 5–79 years). The median incubation period among those 70 years and older was significantly longer than those younger than 70 years (8 vis-à-vis 5 days, P = 0.040). Incubation period was negatively correlated with day of illness in both groups. These findings support current policies of 14-day quarantine periods for close contacts of confirmed cases and 28 days for monitoring infections in known clusters. An elderly person who may have a longer incubation period than a younger counterpart may benefit from earlier and proactive testing, especially after exposure to a positive case.
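The age-stratified comparison of medians above can be sketched minimally as follows; the incubation periods below are hypothetical illustrative values, not the study's data:

```python
from statistics import median

# Hypothetical incubation periods in days -- illustrative values only,
# not the cases analysed in the study.
under_70 = [3, 4, 5, 5, 6, 7, 9]
over_70 = [5, 7, 8, 8, 10, 12]

def median_incubation(days):
    """Crude median incubation period (days) for one age group."""
    return median(days)

print(median_incubation(under_70))  # -> 5
print(median_incubation(over_70))   # -> 8.0
```

The study additionally tested whether the two group medians differ (P = 0.040); a rank-based test such as Mann-Whitney U would be a typical choice for such skewed duration data.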
The present study aimed to compare the effects of drinking different types of coffee before a high-glycaemic index (GI) meal on postprandial glucose metabolism and to assess the effects of adding milk and sugar to coffee. In this randomised, crossover, acute feeding study, apparently healthy adults (n 21) consumed the test drink followed by a high-GI meal in each session. Different types of coffee (espresso, instant, boiled and decaffeinated, all with milk and sugar) and plain water were tested in separate sessions, while a subset of the participants (n 10) completed extra sessions using black coffees. Postprandial levels of glucose, insulin, active glucagon-like peptide 1 (GLP-1) and nitrotyrosine between different test drinks were compared using linear mixed models. Results showed that only preloading decaffeinated coffee with milk and sugar led to a significantly lower glucose incremental AUC (iAUC; 14 % lower, P = 0·001) than water. Preloading black coffees led to greater postprandial glucose iAUC than preloading the same coffees with milk and sugar added (iAUC 12–35 % smaller with milk and sugar, P < 0·05 for all coffee types). Active GLP-1 and nitrotyrosine levels were not significantly different between test drinks. To conclude, preloading decaffeinated coffee with milk and sugar led to a blunted postprandial glycaemic response after a subsequent high-GI meal, while adding milk and sugar to coffee could mitigate the impairing effect of black coffee on postprandial glucose responses. These findings may partly explain the positive effects of coffee consumption on glucose metabolism.
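The iAUC endpoint is a trapezoidal area above the fasting baseline. A minimal sketch follows; the truncation-at-baseline rule and the sample glucose curve are illustrative assumptions, since the abstract does not state the study's exact iAUC conventions:

```python
def incremental_auc(times, values):
    """Trapezoidal incremental AUC above the baseline (t = 0) value.
    Segments below baseline contribute zero -- one common convention;
    the study's exact iAUC rules are not stated in the abstract."""
    baseline = values[0]
    total = 0.0
    for i in range(len(times) - 1):
        h0 = max(values[i] - baseline, 0.0)
        h1 = max(values[i + 1] - baseline, 0.0)
        total += (h0 + h1) / 2.0 * (times[i + 1] - times[i])
    return total

# Hypothetical post-meal glucose curve: time in min, glucose in mmol/L.
t = [0, 30, 60, 120]
glucose = [5.0, 7.0, 6.0, 5.0]
print(incremental_auc(t, glucose))  # -> 105.0
```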
To describe the infection control preparedness measures undertaken in Hong Kong for coronavirus disease (COVID-19) due to SARS-CoV-2 (previously known as 2019 novel coronavirus) in the first 42 days after the announcement of a cluster of pneumonia in China on December 31, 2019 (day 1).
A bundled approach of active and enhanced laboratory surveillance, early airborne infection isolation, rapid molecular diagnostic testing, and contact tracing for healthcare workers (HCWs) with unprotected exposure in the hospitals was implemented. Epidemiological characteristics of confirmed cases, environmental samples, and air samples were collected and analyzed.
From day 1 to day 42, 42 of 1,275 patients (3.3%) fulfilling the criteria for active (n = 29) or enhanced laboratory surveillance (n = 13) were confirmed to have SARS-CoV-2 infection. The number of locally acquired cases increased significantly, from 1 of 13 confirmed cases (7.7%, day 22 to day 32) to 27 of 29 confirmed cases (93.1%, day 33 to day 42; P < .001). Among them, 28 patients (66.6%) came from 8 family clusters. Of 413 HCWs caring for these confirmed cases, 11 (2.7%) had unprotected exposure requiring quarantine for 14 days. None of them was infected, and nosocomial transmission of SARS-CoV-2 was not observed. Environmental surveillance was performed in the room of a patient with viral loads of 3.3 × 10⁶ copies/mL (pooled nasopharyngeal and throat swabs) and 5.9 × 10⁶ copies/mL (saliva). SARS-CoV-2 was identified in 1 of 13 environmental samples (7.7%) but not in 8 air samples collected at a distance of 10 cm from the patient’s chin, with or without a surgical mask.
Appropriate hospital infection control measures were able to prevent nosocomial transmission of SARS-CoV-2.
A better understanding of the interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but it has not been applied in FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality-of-life and psychosocial functioning in 323 adult FEP patients in Hong Kong. Graphical Least Absolute Shrinkage and Selection Operator (LASSO) combined with extended Bayesian information criterion (BIC) model selection was used for network construction. Importance of individual nodes in a generated network was quantified by centrality analyses.
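Node strength, the centrality index used in these analyses, is simply the sum of absolute edge weights attached to a node in the estimated network. A minimal sketch on a small hypothetical weight matrix (not the study's estimated FEP network):

```python
def node_strength(adj):
    """Strength centrality: sum of absolute off-diagonal edge weights
    per node. adj is a symmetric edge-weight matrix (list of lists),
    such as the partial-correlation network a graphical LASSO returns."""
    n = len(adj)
    return [sum(abs(adj[i][j]) for j in range(n) if j != i) for i in range(n)]

# Hypothetical edge weights among four illustrative nodes
# (e.g. amotivation, diminished expression, functioning, cognition).
adj = [
    [0.0, 0.3, 0.5, 0.2],
    [0.3, 0.0, 0.1, 0.0],
    [0.5, 0.1, 0.0, 0.1],
    [0.2, 0.0, 0.1, 0.0],
]
print([round(s, 3) for s in node_strength(adj)])  # -> [1.0, 0.4, 0.7, 0.3]
```

In this toy example the first node has the largest strength, which is the sense in which the study identifies amotivation as the most central node.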
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure or global strength.
Our results suggest a pivotal role for amotivation in the psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation for functional outcome in the early course of psychotic illness.
Upper respiratory tract infections (URTIs) account for substantial attendances at emergency departments (EDs). There is a need to elucidate the determinants of antibiotic prescribing in time-strapped EDs, which remain popular choices for primary care despite highly accessible primary care clinics. Semi-structured in-depth interviews were conducted with purposively sampled physicians (n = 9) in an adult ED in Singapore. All interviews were analysed using thematic analysis and further interpreted using the Social Ecological Model to explain prescribing determinants. Themes included: (1) reliance on clinical knowledge and judgement, (2) patient-related factors, (3) patient–physician relationship factors, (4) perceived practice norms, (5) policies and treatment guidelines and (6) patient education and awareness. The physicians relied strongly on their clinical knowledge and judgement in managing URTI cases and seldom interfered with their peers’ clinical decisions. Despite departmental norms of not prescribing antibiotics for URTIs, physicians would prescribe antibiotics when faced with uncertainty in patients’ diagnoses, when treating immunocompromised or older patients with comorbidities, and for patients demanding antibiotics, especially under time constraints. Participants preferred antibiotic prescribing guidelines based on local epidemiology but viewed hospital policies on prescribing as a hindrance to clinical judgement. Participants highlighted the need for more public education and awareness on the appropriate use of antibiotics and the management of URTIs. Organisational practice norms strongly influenced antibiotic prescribing decisions by physicians, who can be swayed by time pressures and patient demands. Clinical decision support tools, hospital guidelines and patient education targeting the individual, interpersonal and community levels could reduce unnecessary antibiotic use.
We argue that the ways in which we as humans derive well-being from nature – for example by harvesting firewood, selling fish or enjoying natural beauty – feed back into how we behave towards the environment. This feedback is mediated by institutions (rules, regulations) and by individual capacities to act. Understanding these relationships can guide better interventions for sustainably improving well-being and alleviating poverty. However, more attention needs to be paid to how experience-related benefits from nature influence attitudes and actions towards the environment, and how these relationships can be reflected in more environmentally sustainable development projects.
Brain tumour behaviour is driven by aberrations in the genome and epigenome. Many of these changes, such as IDH mutations in diffuse low-grade glioma (DLGG), are common amongst the same class of tumour and can be incorporated into the diagnostic criteria. However, any given tumour may have other, less common genomic aberrations that are essential for its biological behaviour, and these may inform on underlying aberrant cellular pathways and potential therapeutic agents. Precision oncology is a genomics-based approach that profiles these alterations to better manage cancer patients; it has established itself within the practice of oncology and is slowly making its way into neuro-oncology. The BC Cancer Personalized OncoGenomics (POG) program has profiled 16 adult tumours originating from the central nervous system using whole genome and transcriptome analysis (WGTA), for the first time within a meaningful clinical timeframe and setting. As expected, primary genomic drivers were consistent with their respective diagnoses, though secondary drivers were found to be unique to each tumour. Although these analyses did not result in altered clinical management for these patients, primarily owing to the limited availability of drugs or clinical trials, they highlight the heterogeneity of secondary drivers in cancers and provide clinicians with meaningful biological information. Lastly, the data generated by POG have highlighted the frequency and complexity of novel driver fusions, which are predicted to behave similarly to canonical driver events in their respective tumours. The information available to clinicians through POG has provided valuable insight into the biology of each unique tumour.
The main goal of this paper is to provide insights into swash flow dynamics, generated by a non-breaking solitary wave on a steep slope. Both laboratory experiments and numerical simulations are conducted to investigate the details of runup and rundown processes. Special attention is given to the evolution of the bottom boundary layer over the slope in terms of flow separation, vortex formation and the development of a hydraulic jump during the rundown phase. Laboratory experiments were performed to measure the flow velocity fields by means of high-speed particle image velocimetry (HSPIV). Detailed pathline patterns of the swash flows and free-surface profiles were also visualized. Highly resolved computational fluid dynamics (CFD) simulations were carried out. Numerical results are compared with laboratory measurements with a focus on the velocities inside the boundary layer. The overall agreement is excellent during the initial stage of the runup process. However, discrepancies in the model/data comparison grow as time advances because the numerical model does not simulate the shoreline dynamics accurately. Introducing small temporal and spatial shifts in the comparison yields adequate agreement during the entire rundown process. Highly resolved numerical solutions are used to study physical variables that are not measured in laboratory experiments (e.g. pressure field and bottom shear stress). It is shown that the main mechanism for vortex shedding is correlated with the large pressure gradient along the slope as the rundown flow transitions from supercritical to subcritical, under the developing hydraulic jump. Furthermore, the bottom shear stress analysis indicates that the largest values occur at the shoreline and that the relatively large bottom shear stress also takes place within the supercritical flow region, being associated with the backwash vortex system rather than the plunging wave. 
The combination of laboratory observations and numerical simulations has provided significant insight into the swash flow processes.
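The supercritical-to-subcritical transition that accompanies the developing hydraulic jump is characterised by the local Froude number. A minimal sketch follows; the velocities and depths are hypothetical rundown values for illustration, not measurements from the experiments:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude(u, h):
    """Froude number Fr = u / sqrt(g*h) for depth-averaged speed u (m/s)
    and local water depth h (m): Fr > 1 supercritical, Fr < 1 subcritical."""
    return u / math.sqrt(G * h)

# Hypothetical values during the rundown phase, for illustration only.
print(round(froude(1.5, 0.05), 2))  # -> 2.14 (supercritical backwash)
print(round(froude(0.4, 0.08), 2))  # -> 0.45 (subcritical, below the jump)
```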
While previous work showed that the Centers for Disease Control and Prevention toolkit for carbapenem-resistant Enterobacteriaceae (CRE) can reduce spread regionally, these interventions are costly, and decision makers want to know whether and when economic benefits occur.
Orange County, California
Using our Regional Healthcare Ecosystem Analyst (RHEA)-generated agent-based model of all inpatient healthcare facilities, we simulated the implementation of the CRE toolkit (active screening of interfacility transfers) in different ways and estimated their economic impacts under various circumstances.
Compared to routine control measures, screening generated cost savings by year 1 when hospitals implemented screening after identifying ≤20 CRE cases (saving $2,000–$9,000) and by year 7 if all hospitals implemented screening in a regionally coordinated manner after 1 hospital identified a CRE case (hospital perspective). Cost savings were achieved only if hospitals independently screened after identifying 10 cases (year 1, third-party payer perspective). Cost savings were achieved by year 1 if hospitals independently screened after identifying 1 CRE case and by year 3 if all hospitals coordinated and screened after 1 hospital identified 1 case (societal perspective). After a few years, all strategies cost less and had positive health effects compared to routine control measures; most strategies generated a positive cost-benefit each year.
Active screening of interfacility transfers garnered cost savings in year 1 of implementation when hospitals acted independently and by year 3 if all hospitals collectively implemented the toolkit in a coordinated manner. Despite taking longer to manifest, coordinated regional control resulted in greater savings over time.
Evidence suggests that autism and schizophrenia share similarities in genetic, neuropsychological and behavioural aspects. Although both disorders are associated with theory of mind (ToM) impairments, few studies have directly compared ToM between autism patients and schizophrenia patients. This study aimed to investigate to what extent high-functioning autism patients and schizophrenia patients share and differ in ToM performance.
Thirty high-functioning autism patients, 30 schizophrenia patients and 30 healthy individuals were recruited. Participants were matched in age, gender and estimated intelligence quotient. The verbal-based Faux Pas Task and the visual-based Yoni Task were utilised to examine first- and higher-order, affective and cognitive ToM. The task/item difficulty of two paradigms was examined using mixed model analyses of variance (ANOVAs). Multiple ANOVAs and mixed model ANOVAs were used to examine group differences in ToM.
The Faux Pas Task was more difficult than the Yoni Task. High-functioning autism patients showed more severely impaired verbal-based ToM in the Faux Pas Task than schizophrenia patients, but shared similar visual-based ToM impairments with them in the Yoni Task.
The findings that individuals with high-functioning autism shared similar but more severe impairments in verbal ToM than individuals with schizophrenia support the autism–schizophrenia continuum. The finding that verbal-based but not visual-based ToM was more impaired in high-functioning autism patients than schizophrenia patients could be attributable to the varied task/item difficulty between the two paradigms.
As energy saving, carbon reduction and environmental comfort receive increasing attention, developing and popularising high-efficiency, low-noise blowers has become a common objective across countries. This study uses CFD to calculate the flow field and performance of a blower and compares the results with experimental measurements. The characteristic curve of the blower shows that the simulated and experimental values are close to each other, differing by only 0.4%. This result demonstrates that the CFD package is a highly reliable tool for future blower design improvement. In addition, this study examines the noise distribution of the blower flow field: the periodic pressure output calculated by CFD is used as the sound source input for the sound pressure field, so as to simulate and analyse the aerodynamic noise of the flow field around the blower. The simulated noise level of the flow field around the fan is 80.5 dB(A)–81.5 dB(A), in good agreement with the measured value of 82 dB(A); the overall level is low, but the spectrum contains a sharp tonal noise. Based on the numerical results, the blower designers modified the tongue geometry to remove this sharp noise.
Phenomenological and mechanistic models are widely used to assist resource planning for pandemics and emerging infections. We conducted a systematic review to compare the methods and outputs of published phenomenological and mechanistic modelling studies pertaining to the 2013–2016 Ebola virus disease (EVD) epidemics in four West African countries – Sierra Leone, Liberia, Guinea and Nigeria. We searched the Pubmed, Embase and Scopus databases for relevant English-language publications up to December 2015. Of the 874 articles identified, 41 met our inclusion criteria. We evaluated these selected studies based on: the sources of the case data used, modelling approaches, compartments used, population mixing assumptions, model fitting and calibration approaches, sensitivity analyses used and data bias considerations. We synthesised the results of the estimated epidemiological parameters: basic reproductive number (R0), serial interval, latent period, infectious period and case fatality rate, and examined their relationships. The median of the estimated mean R0 values was between 1·30 and 1·84 in Sierra Leone, Liberia and Guinea. A much higher R0 value of 9·01 was described for Nigeria. We investigated several issues surrounding uncertainty in EVD modes of transmission and unknown observation biases in the early reported case data. We found that epidemic models offered country-specific mean R0 estimates, but these estimates were not associated with the use of several key disease parameters within their plausible ranges. Simple models generally yielded estimates of R0 similar to those from more complex models. Models that accounted for data uncertainty issues offered higher case forecasts than the actual case observations. Simple models, which offer transparency to public health policy makers, could play a critical role in advising rapid policy decisions during an epidemic emergency.
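One common way such models recover R0 is from the early exponential growth rate together with the serial interval. A crude sketch under a fixed generation-time assumption; the numbers are illustrative, not estimates from the review:

```python
import math

def r0_from_growth(growth_rate, serial_interval):
    """Crude R0 = exp(r * T) from an early exponential growth rate r
    (per day) and mean serial interval T (days), assuming a fixed
    generation time. Other generation-time distributions give different
    R0 values for the same observed growth rate."""
    return math.exp(growth_rate * serial_interval)

# Hypothetical early-epidemic values, for illustration only.
print(round(r0_from_growth(0.03, 15.0), 2))  # -> 1.57
```

This dependence on the assumed generation-time distribution is one reason the reviewed studies report R0 estimates that vary even for the same country and period.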
The oriental armyworm Mythimna separata (Lepidoptera: Noctuidae) is a migratory pest in East Asia, including China. Seasonally high temperatures in southern China and low temperatures in northern China are pressures favouring the annual migration of this species, while cold tolerance determines the northern limit of its overwintering range. A number of physiological stress responses occur in insects as a result of variations in temperature. One reaction to thermal stress is the generation of reactive oxygen species (ROS), which can be harmful by causing oxidative damage. The time-related effects (durations of 1, 4 and 7 h) of thermal stress treatments of M. separata at comparatively low (5, 10, 15 and 20°C) and high (30, 35, 40 and 45°C) temperatures on the activities of antioxidant enzymes, including superoxide dismutase (SOD), catalase (CAT), peroxidase (POX) and glutathione S-transferases (GSTs), and on total antioxidant capacity (T-AOC) were determined. Thermal stress resulted in significant elevation of the activities of SOD, CAT and GSTs, indicating that these enzymes contribute to defence mechanisms counteracting oxidative damage caused by an increase in ROS. However, at high temperatures, POX and T-AOC were also found to contribute to scavenging ROS. Our results also indicate that extreme temperatures lead to elevated ROS production in M. separata. The present study confirms that thermal stress can be responsible for oxidative damage. To overcome such stress, antioxidant enzymes play key roles in diminishing oxidative damage in M. separata.
To ascertain determinants of an interest in a career in ENT surgery through a survey of medical students and junior doctors.
A survey comprising Likert scales, forced-response and single-option questions, and free-text responses was administered at five different courses or events for those interested in a career in ENT.
The survey had an 87 per cent response rate; respondents consisted of 43 applicants for national selection, 15 foundation doctors and 23 medical students. The most important factors that encourage ENT as a career included: the variety of operative procedures, work–life balance, inherent interest in this clinical area and inspirational senior role models. Exposure to ENT in undergraduate or post-graduate training is critical in deciding to pursue this specialty.
It is important to promote those aspects of ENT surgery that attract people to it, and to argue for greater exposure to ENT during undergraduate and post-graduate training.
To study the association between gastrointestinal colonization of carbapenemase-producing Enterobacteriaceae (CPE) and proton pump inhibitors (PPIs).
We analyzed 31,526 patients with prospective collection of fecal specimens for CPE screening: upon admission (targeted screening) and during hospitalization (opportunistic screening, safety net screening, and extensive contact tracing), in our healthcare network with 3,200 beds from July 1, 2011, through December 31, 2015. Specimens were collected at least once weekly during hospitalization for CPE carriers and subjected to broth enrichment culture and multiplex polymerase chain reaction.
Of 66,672 fecal specimens collected, 345 specimens (0.5%) from 100 patients (0.3%) had CPE. The number and prevalence (per 100,000 patient-days) of CPE increased from 2 (0.3) in 2012 to 63 (8.0) in 2015 (P<.001). Male sex (odds ratio, 1.91 [95% CI, 1.15–3.18], P=.013), presence of a wound or drain (3.12 [1.70–5.71], P<.001), and use of cephalosporins (3.06 [1.42–6.59], P=.004), carbapenems (2.21 [1.10–4.48], P=.027), and PPIs (2.84 [1.72–4.71], P<.001) in the preceding 6 months were significant risk factors by multivariable analysis. Of 79 patients with serial fecal specimens, spontaneous clearance of CPE was noted in 57 (72.2%), after a median (range) of 30 (3–411) days. Compared with patients who used neither antibiotics nor PPIs, consumption of both antibiotics and PPIs after CPE identification was associated with later clearance of CPE (hazard ratio, 0.35 [95% CI, 0.17–0.73], P=.005).
Concomitant use of antibiotics and PPIs prolonged duration of gastrointestinal colonization by CPE.
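The odds ratios with 95% CIs reported above come from multivariable logistic regression; the unadjusted analogue from a 2x2 table can be sketched as below. The counts are hypothetical, and a crude OR like this does not adjust for the other covariates as the study's model does:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI from a 2x2
    table: a/b = exposed cases/non-cases, c/d = unexposed cases/non-cases.
    Illustrative only; multivariable logistic regression additionally
    adjusts for the other risk factors."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only.
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # -> 2.67 1.42 5.02
```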
Measurement error in self-reported total sugars intake may obscure associations between sugars consumption and health outcomes, and the sum of 24 h urinary sucrose and fructose may serve as a predictive biomarker of total sugars intake.
The Study of Latinos: Nutrition & Physical Activity Assessment Study (SOLNAS) was an ancillary study to the Hispanic Community Health Study/Study of Latinos (HCHS/SOL) cohort. Doubly labelled water and 24 h urinary sucrose and fructose were used as biomarkers of energy and sugars intake, respectively. Participants’ diets were assessed by up to three 24 h recalls (88 % had two or more recalls). Procedures were repeated approximately 6 months after the initial visit among a subset of ninety-six participants.
Four centres (Bronx, NY; Chicago, IL; Miami, FL; San Diego, CA) across the USA.
Men and women (n 477) aged 18–74 years.
The geometric mean of total sugars was 167·5 (95 % CI 154·4, 181·7) g/d for the biomarker-predicted and 90·6 (95 % CI 87·6, 93·6) g/d for the self-reported total sugars intake. Self-reported total sugars intake was not correlated with biomarker-predicted sugars intake (r=−0·06, P=0·20, n 450). Among the reliability sample (n 90), the reproducibility coefficient was 0·59 for biomarker-predicted and 0·20 for self-reported total sugars intake.
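The geometric mean with its 95% CI can be computed on the log scale; a minimal sketch with hypothetical intakes (the abstract does not state the study's exact CI method, so the normal approximation here is an assumption):

```python
import math
from statistics import mean, stdev

def geometric_mean_ci(xs, z=1.96):
    """Geometric mean with a normal-approximation CI computed on the
    log scale. A sketch of the summary statistic reported above; the
    study's exact CI method is not stated in the abstract."""
    logs = [math.log(x) for x in xs]
    m, s = mean(logs), stdev(logs)
    half = z * s / math.sqrt(len(logs))
    return math.exp(m), math.exp(m - half), math.exp(m + half)

# Hypothetical daily total sugars intakes (g/d), for illustration only.
gm, lo, hi = geometric_mean_ci([80.0, 95.0, 110.0, 150.0, 60.0])
print(round(gm, 1))  # -> 94.5
```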
Possible explanations for the lack of association between biomarker-predicted and self-reported sugars intake include measurement error in self-reported diet, high intra-individual variability in sugars intake, and the possibility that urinary sucrose and fructose are not a suitable proxy for total sugars intake in this study population.
Late-life depression (LLD) has been reported to present with emotion dysregulation accompanied by high perceived loneliness. Previous research has suggested that LLD is a disorder of connectivity associated with aberrant network properties. Perceived loneliness, in turn, has been found to adversely affect the brain, but little is known about its neurobiological basis in LLD. The current study investigated the relationships between structural connectivity, functional connectivity during affective processing and perceived loneliness in LLD.
The current study included 54 participants aged >60 years of whom 31 were diagnosed with LLD. Diffusion tensor imaging (DTI) data and task-based functional magnetic resonance imaging (fMRI) data of an affective processing task were collected. Network-based statistics and graph theory techniques were applied, and the participants’ perceived loneliness and depression level were measured. The affective processing task included viewing affective stimuli.
Structurally, a loneliness-related sub-network was identified across all subjects. Functionally, perceived loneliness was related to connectivity differently in LLD than in controls during the processing of negative stimuli, with aberrant networking in subcortical areas.
Perceived loneliness was found to play a unique role in relation to negative affective processing in LLD at the level of functional brain connections and networks. The findings increase our understanding of LLD and provide initial evidence of the neurobiological mechanisms of loneliness in LLD. Loneliness might be a potential intervention target in depressive patients.