Background: Singapore General Hospital (SGH) is the largest acute tertiary-care hospital in Singapore. Healthcare workers (HCWs) are at risk of acquiring COVID-19 in both the community and the workplace. SGH has a robust exposure management process including prompt contact tracing, immediate ring fencing, lockdown of affected cubicles or single-room isolation for patient contacts, and home isolation orders for staff contacts of COVID-19 cases during the containment phase of the pandemic. Contacts were also placed on enhanced surveillance with PCR testing on days 1 and 4 as well as daily antigen rapid tests (ARTs) for 10 days after exposure. Here, we describe the characteristics of HCWs with COVID-19 during the third wave of the COVID-19 pandemic. Methods: This retrospective observational study included all SGH HCWs who acquired COVID-19 during the third wave (ie, the 18-week period from September 1 to December 31, 2021) of the COVID-19 pandemic. Univariate analysis was used to compare characteristics of work-associated infection (WAI) and community-acquired infection (CAI) among HCWs. Results: Among a workforce of >10,000 at SGH, 335 HCWs acquired COVID-19 during the study period. CAI (exposure to known clusters or household contacts) accounted for 111 HCW infections (33.1%), and 48 HCWs (14.3%) had a WAI (ie, acquired at their workplace where there was no patient contact). Among WAIs, only 5 HCWs had hospital-acquired infection (confirmed by phylogenetic analysis). The sources of exposure for the remaining 176 HCWs were unknown. The weekly incidence of COVID-19 among HCWs was comparable to the epidemic curve of all cases in Singapore (Figs. 1 and 2). The mean age of HCWs with COVID-19 was 39.6 years, and most were women. At the time of the positive SARS-CoV-2 PCR test, 223 HCWs were symptomatic, and 67 (20.0%) had comorbidities. Only 16 HCWs (4.8%) required hospitalization, and all recovered fully with no mortality (Table 1). Being female was associated with community COVID-19 acquisition (OR, 4.6; P …). Conclusions: During the third wave of the COVID-19 pandemic, a higher percentage of HCWs at SGH acquired the infection from the community than from the workplace. Safe management measures, such as universal masking, social distancing, and robust exposure management processes including prompt contact tracing and environmental disinfection, can reduce the risk of COVID-19 in the hospital work environment.
Sporadic clusters of healthcare-associated coronavirus disease 2019 (COVID-19) occurred despite intense rostered routine surveillance and a highly vaccinated healthcare worker (HCW) population during a community surge of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) B.1.617.2 δ (delta) variant. Genomic analysis facilitated timely cluster detection and uncovered additional linkages via HCWs moving between clinical areas and among HCWs sharing a common lunch area, enabling early intervention.
OBJECTIVES/GOALS: To develop a two-stage convolutional neural network to identify the ovary and antral follicles within ovarian ultrasound images and to determine its reliability and feasibility compared with conventional 2D and 3D ultrasonography image analysis techniques. METHODS/STUDY POPULATION: De-identified and archived ultrasonographic images of women across the reproductive spectrum (N=500) will be used in the study. These ultrasound images will be labeled by experienced raters to train a two-stage convolutional neural network (CU-Net). CU-Net will first separate the entire ovary from the background and subsequently identify all antral follicles within the ovary. Following training, the CU-Net will evaluate a second set of independent images (N=100) to determine performance accuracy. Three specialized raters will establish the reliability and feasibility of CU-Net compared with conventional 2D and 3D ovarian ultrasound image analysis methods. RESULTS/ANTICIPATED RESULTS: The labeled training dataset of ovarian ultrasound images is expected to successfully train the CU-Net and allow for accurate identification of the ovary and the total number of antral follicles in the second testing set of ultrasound images. CU-Net is expected to have accuracy similar to the gold-standard method (2D-Offline with Grid) and to outperform other approaches, such as 2D-Real Time and 3D volume software (VOCAL and Sono-AVC). Moreover, CU-Net is anticipated to be the fastest and most reliable method across users, supporting its clinical feasibility. DISCUSSION/SIGNIFICANCE: This study will immediately translate to providing a standardized platform that can improve the accuracy and reliability of, and reduce the time required for, the evaluation of ovarian ultrasounds across users and clinical and research settings.
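As a rough illustration of the two-stage idea only (the actual CU-Net architecture is not specified in this summary, so the layer sizes and names below are assumptions), a minimal PyTorch sketch might chain two small encoder-decoder networks: the first predicts an ovary mask, and the second segments follicles within the masked ovary region.

```python
# Illustrative two-stage segmentation pipeline (a sketch, not the CU-Net
# described in the abstract; layer sizes and module names are assumptions).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class TinySegNet(nn.Module):
    """A minimal encoder-decoder that outputs a single-channel mask logit."""
    def __init__(self):
        super().__init__()
        self.enc = conv_block(1, 16)
        self.down = nn.MaxPool2d(2)
        self.mid = conv_block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = conv_block(32, 16)
        self.head = nn.Conv2d(16, 1, 1)

    def forward(self, x):
        e = self.enc(x)
        m = self.mid(self.down(e))
        d = self.dec(torch.cat([self.up(m), e], dim=1))
        return self.head(d)

class TwoStageModel(nn.Module):
    """Stage 1 segments the ovary; stage 2 segments follicles inside it."""
    def __init__(self):
        super().__init__()
        self.ovary_net = TinySegNet()
        self.follicle_net = TinySegNet()

    def forward(self, image):
        ovary_logit = self.ovary_net(image)
        ovary_mask = torch.sigmoid(ovary_logit)
        # Restrict the second stage to the predicted ovary region.
        follicle_logit = self.follicle_net(image * ovary_mask)
        return ovary_logit, follicle_logit

model = TwoStageModel()
dummy = torch.randn(2, 1, 128, 128)            # batch of grayscale ultrasound frames
ovary_logit, follicle_logit = model(dummy)
print(ovary_logit.shape, follicle_logit.shape)  # both (2, 1, 128, 128)
```

In practice each stage would be trained against its own labeled masks with a segmentation loss such as BCEWithLogitsLoss or a Dice loss.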
To describe OXA-48–like carbapenemase-producing Enterobacteriaceae (CPE) outbreaks at Singapore General Hospital between 2018 and 2020 and to determine the risk associated with OXA-48 carriage in the 2020 outbreak.
Design:
Outbreak report and case–control study.
Setting:
Singapore General Hospital (SGH) is a tertiary-care academic medical center in Singapore with 1,750 beds.
Methods:
Active surveillance for CPE is conducted for selected high-risk patient cohorts through molecular testing on rectal swabs or stool samples. Patients with CPE are isolated or placed in cohorts under contact precautions. During outbreak investigations, rectal swabs are repeated for culture. For the 2020 outbreak, a retrospective case–control study was conducted in which controls were inpatients who tested negative for OXA-48 and were selected at a 1:3 case-to-control ratio.
Results:
Hospital-wide, the median number of patients with healthcare-associated OXA-48 was 2 per month. In the 3-year period between 2018 and 2020, 3 OXA-48 outbreaks were investigated and managed, involving 4 patients with Klebsiella pneumoniae in 2018, 55 patients with K. pneumoniae or Escherichia coli in 2019, and 49 patients with multispecies Enterobacterales in 2020. During the 2020 outbreak, independent risk factors for OXA-48 carriage on multivariate analysis (49 patients and 147 controls) were diarrhea within the preceding 2 weeks (OR, 3.3; 95% CI, 1.1–10.7; P = .039), contact with an OXA-48–carrying patient (OR, 8.7; 95% CI, 1.9–39.3; P = .005), and exposure to carbapenems (OR, 17.2; 95% CI, 2.2–136; P = .007) or penicillin (OR, 16.6; 95% CI, 3.8–71.0; P < .001).
Conclusions:
Multispecies OXA-48 outbreaks in our institution are likely related to favorable ecological conditions and the selective pressure exerted by antimicrobial use. Integrating molecular surveillance with epidemiological assessment of the healthcare environment is important for understanding the risk of healthcare-associated infection to patients.
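As a generic illustration of how adjusted odds ratios with 95% confidence intervals, such as those reported above, are commonly obtained (a sketch on simulated data with hypothetical variable names, not the study's analysis code), a multivariable logistic regression can be fitted and its coefficients exponentiated as follows.

```python
# Sketch: multivariable logistic regression producing adjusted odds ratios
# with 95% CIs, on a simulated case-control dataset. Variable names
# (diarrhea, cpe_contact, carbapenem, penicillin) are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 196  # e.g. 49 cases + 147 controls (1:3 ratio)
df = pd.DataFrame({
    "case": np.r_[np.ones(49), np.zeros(147)].astype(int),
    "diarrhea": rng.integers(0, 2, n),
    "cpe_contact": rng.integers(0, 2, n),
    "carbapenem": rng.integers(0, 2, n),
    "penicillin": rng.integers(0, 2, n),
})

model = smf.logit("case ~ diarrhea + cpe_contact + carbapenem + penicillin",
                  data=df).fit(disp=False)

# Exponentiate coefficients and confidence limits to report odds ratios.
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.round(2))
```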
We study the classical Hermite–Hadamard inequality in the matrix setting. This leads to a number of interesting matrix inequalities, such as Schatten p-norm estimates.
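For reference, the classical scalar inequality being extended is the following standard fact: for a convex function $f$ on $[a,b]$,

$$
f\!\left(\frac{a+b}{2}\right) \;\le\; \frac{1}{b-a}\int_a^b f(t)\,dt \;\le\; \frac{f(a)+f(b)}{2}.
$$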
Disease-related malnutrition is prevalent among older adults; therefore, identifying the modifiable risk factors in the diet is essential for the prevention and management of disease-related malnutrition. The present study examined the cross-sectional association between dietary patterns and malnutrition in Chinese community-dwelling older adults aged ≥65 years in Hong Kong. Dietary patterns, including the Diet Quality Index International (DQI-I), Dietary Approaches to Stop Hypertension (DASH), the Mediterranean Diet Score, ‘vegetable–fruit’ pattern, ‘snack–drink–milk product’ pattern and ‘meat–fish’ pattern, were estimated and generated from a validated food frequency questionnaire. Malnutrition was classified according to the modified Global Leadership Initiative on Malnutrition (GLIM) criteria based on two phenotypic components (low body mass index and reduced muscle mass) and one aetiologic component (inflammation/disease burden). The association between the tertile or level of adherence of each dietary pattern and the modified GLIM criteria was analysed using adjusted binary logistic regression models. Data from 3694 participants were available (49 % men). Malnutrition was present in 397 participants (10⋅7 %). In men, a higher DQI-I score, a higher ‘vegetable–fruit’ pattern score and a lower ‘meat–fish’ pattern score were associated with a lower risk of malnutrition. In women, higher adherence to the DASH diet was associated with a lower risk of malnutrition. After the Bonferroni correction, the association remained statistically significant only in men for the DQI-I score. To conclude, a higher DQI-I score was associated with a lower risk of malnutrition in Chinese older men. Nutritional strategies for the prevention and management of malnutrition could potentially target dietary quality.
We obtain several norm and eigenvalue inequalities for positive matrices partitioned into four blocks. The results involve the numerical range $W(X)$ of the off-diagonal block $X$, especially the distance $d$ from $0$ to $W(X)$. A special consequence is an estimate involving this distance.
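For completeness, the standard definitions behind the notation above (not specific to this paper) are, for a square matrix $X$ acting on $\mathbb{C}^n$,

$$
W(X) \;=\; \{\, \langle Xx, x\rangle \;:\; x \in \mathbb{C}^n,\ \|x\| = 1 \,\}, \qquad
d \;=\; \operatorname{dist}\!\bigl(0, W(X)\bigr) \;=\; \inf_{z \in W(X)} |z| .
$$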
Population-based colorectal cancer (CRC) screening programs that use a fecal immunochemical test (FIT) often face noncompliance with confirmatory diagnosis and, among FIT-positive subjects who do comply, a waiting time (WT) before colonoscopy. We aimed to identify factors associated with both of these correlated problems within a single model.
Methods
A total of 294,469 subjects with either a positive FIT result or a family history of CRC, recruited from 2004 to 2013, were enrolled for analysis. We applied a hurdle Poisson regression model to jointly accommodate the hurdle of compliance and the related WT to colonoscopy, and to assess the factors responsible for this mixture of the two outcomes.
Results
The effects on compliance and WT varied with contextual factors, such as geographic area, type of screening unit, and level of urbanization. The hurdle score, representing the risk of noncompliance, and the WT score, reflecting the rate of undergoing colonoscopy, were used to classify subjects into three groups representing the degree of compliance and the level of health awareness.
Conclusion
Our model was not only applied successfully to evaluate factors associated with compliance and the WT distribution, but it was also developed into a useful assessment tool for stratifying risk and predicting whether and when screenees will comply with confirmatory diagnosis, given contextual factors and individual characteristics.
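The hurdle structure described in the Methods above can be made concrete with a small sketch: a logistic component for whether a screenee crosses the compliance hurdle, and a zero-truncated Poisson component for the waiting time (in discrete time units) among compliers. The code below fits such a model to simulated data; the covariates and variable names are purely illustrative, not the study's.

```python
# Minimal sketch of a hurdle Poisson model (illustrative; names such as
# "complied" and "wait" are hypothetical, not taken from the study data).
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

rng = np.random.default_rng(0)

# Simulated stand-in data: an intercept plus two generic covariates.
n = 5000
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.integers(0, 2, n)])

# True parameters used only to generate fake data.
gamma_true = np.array([0.5, -0.4, 0.8])   # hurdle (compliance) part
beta_true = np.array([1.2, 0.3, -0.2])    # waiting-time (count) part

complied = rng.random(n) < expit(X @ gamma_true)
lam0 = np.exp(X @ beta_true)
# Zero-truncated Poisson draws for compliers (rejection sampling of zeros).
wait = np.zeros(n, dtype=int)
for i in np.where(complied)[0]:
    y = 0
    while y == 0:
        y = rng.poisson(lam0[i])
    wait[i] = y

def negloglik(params):
    gamma, beta = params[:X.shape[1]], params[X.shape[1]:]
    p = expit(X @ gamma)                  # P(crossing the compliance hurdle)
    ll_hurdle = np.where(complied, np.log(p), np.log1p(-p)).sum()
    lam = np.exp(X[complied] @ beta)
    y = wait[complied]
    # Zero-truncated Poisson log-likelihood for positive waiting times.
    ll_count = (y * np.log(lam) - lam - gammaln(y + 1)
                - np.log1p(-np.exp(-lam))).sum()
    return -(ll_hurdle + ll_count)

fit = minimize(negloglik, np.zeros(2 * X.shape[1]), method="BFGS")
print("hurdle coefficients:", fit.x[:3])
print("count coefficients: ", fit.x[3:])
```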
In bilinguals, language proficiency has been proposed to influence the involvement of domain-general control networks in language selection. We assessed, in university student translators with moderate to high second-language (L2) proficiency depending on their translation educational level, the functional activity in key language and control areas (the caudate nucleus, anterior cingulate, and prefrontal cortex) during task and language selection in an oral production context. We found that L2 proficiency influenced the relative involvement of our regions of interest during language selection versus domain-general cognitive control processes. Whereas the left middle frontal and left caudate areas were more involved during linguistic than alphanumeric task selection in the low-L2-proficiency group, these regions were similarly involved in both tasks in the high-L2-proficiency group. These findings suggest that language selection relies primarily on a network within the domain-general cognitive control system, with an increase in resource needs when L2 proficiency is low.
A gap exists between the evidence for reducing risk of knee osteoarthritis (KOA) progression and its application in patients’ daily lives. We aimed to bridge this gap by identifying patient and family physician (FP) self-management priorities to conceptualize and develop a mobile-health application (m-health app). Our co-design approach combined priorities and concerns solicited from patients and FPs with evidence on risk of progression to design and develop a KOA self-management tool.
Methods:
Parallel qualitative research on patient and FP perspectives was conducted to inform the co-design process. Researchers from the Enhancing Alberta Primary Care Research Networks (EnACT) evaluated FPs' mental models using cognitive task analysis through structured interviews with four FPs. Using grounded theory methods, patient researchers from the Patient and Community Engagement Research (PaCER) program interviewed five patients to explore their perspectives on needs and interactions within primary care. In three co-design sessions, relevant stakeholders (four patients, five FPs, and thirteen researchers) participated to: (i) identify user needs with regard to KOA self-management; and (ii) conceptualize and determine the design priorities and functionalities of an m-health app using a modified nominal group process.
Results:
Priority measures for symptoms, activities, and quality of life from the user perspective were determined in the first two sessions. The third co-design session with our industry partner resulted in finalization of priorities through interactive patient and FP feedback. The top three features were: (i) a symptoms graph and summary; (ii) information and strategies; and (iii) setting goals. These features were used to inform the development of a minimum viable product.
Conclusions:
The novel use of co-design created directive dialog around the needs of patients, highlighting the contrasting views that exist between patients and FPs and emphasizing how exploring these differences might lead to strong design options for patient-oriented m-health apps. Characterizing these disjunctions has important implications for operationalizing patient-centered health care.
To examine score validity and reliability of a child version of the twenty-one-item Three-Factor Eating Questionnaire (CTFEQ-R21) in a sample of Canadian children and adolescents and its relationship with BMI Z-score and food/taste preferences.
Design
Cross-sectional study.
Setting
School-based.
Participants
Children (n 158), sixty-three boys (mean age 11·5 (sd 1·6) years) and ninety-five girls (11·9 (sd 1·9) years).
Results
Exploratory factor analysis revealed that the CTFEQ-R21 was best represented by four factors with item 17 removed (CFFEQ-R20), representing Cognitive Restraint (CR), Cognitive Uncontrolled Eating (UE 1), External Uncontrolled Eating (UE 2) and Emotional Eating (EE), accounting for 41·2 % of the total common variance with good scale reliability. ANOVA revealed that younger children reported higher UE 1 and CR scores than older children, and boys who reported high UE 1 scores had significantly higher BMI Z-scores. Children with high UE 1 scores reported a greater preference for high-protein and -fat foods, and high-fat savoury (HFSA) and high-fat sweet (HFSW) foods. Higher preference for high-protein, -fat and -carbohydrate foods, and HFSA, HFSW and low-fat savoury foods was found in children with high UE 2 scores.
Conclusions
The study suggests that the CFFEQ-R20 can be used to measure eating behaviour traits and associations with BMI Z-score and food/taste preferences in Canadian children and adolescents. Future research is needed to examine the validity of the questionnaire in larger samples and other geographical locations, as well as the inclusion of extraneous variables such as parental eating or socio-economic status.
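As a generic sketch of how a four-factor exploratory factor analysis of 20 questionnaire items might be run (on simulated responses with placeholder item names; this is not the study's actual analysis, and the varimax rotation here is an assumption):

```python
# Sketch: exploratory factor analysis with four factors on simulated
# 20-item questionnaire responses (item names and scale are placeholders).
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
n_children, n_items = 158, 20
responses = pd.DataFrame(
    rng.integers(1, 5, size=(n_children, n_items)).astype(float),
    columns=[f"item_{i + 1}" for i in range(n_items)],
)

fa = FactorAnalysis(n_components=4, rotation="varimax")
fa.fit(responses)

# Loadings table: rows are items, columns labeled after the abstract's factors.
loadings = pd.DataFrame(
    fa.components_.T,
    index=responses.columns,
    columns=["CR", "UE1", "UE2", "EE"],
)
print(loadings.round(2).head())
```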
Psychotropic medication use and psychiatric symptoms during pregnancy each are associated with adverse neurodevelopmental outcomes in offspring. Commonly, studies considering medication effects do not adequately assess symptoms, nor evaluate children when the effects are believed to occur, the fetal period. This study examined maternal serotonin reuptake inhibitor and polypharmacy use in relation to serial assessments of five indices of fetal neurobehavior and Bayley Scales of Infant Development at 12 months in N = 161 socioeconomically advantaged, non-Hispanic White women with a shared risk phenotype, diagnosed major depressive disorder. On average fetuses showed the expected development over gestation. In contrast, infant average Bayley psychomotor and mental development scores were low (M = 84.10 and M = 89.92, range of normal limits 85–114) with rates of delay more than 2–3 times what would be expected based on this measure's normative data. Controlling for prenatal and postnatal depressive symptoms, prenatal medication effects on neurobehavioral development were largely undetected in the fetus and infant. Mental health care directed primarily at symptoms may not address the additional psychosocial needs of women parenting infants. Speculatively, prenatal serotonin reuptake inhibitor exposure may act as a plasticity rather than risk factor, potentially enhancing receptivity to a nonoptimal postnatal environment in some mother–infant dyads.
Previous work has identified associations between psychotic experiences (PEs) and general medical conditions (GMCs), but their temporal direction remains unclear as does the extent to which they are independent of comorbid mental disorders.
Methods
In total, 28 002 adults in 16 countries from the WHO World Mental Health (WMH) Surveys were assessed for PEs, GMCs and 21 Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) mental disorders. Discrete-time survival analyses were used to estimate the associations between PEs and GMCs with various adjustments.
Results
After adjustment for comorbid mental disorders, temporally prior PEs were significantly associated with subsequent onset of 8/12 GMCs (arthritis, back or neck pain, frequent or severe headache, other chronic pain, heart disease, high blood pressure, diabetes and peptic ulcer) with odds ratios (ORs) ranging from 1.3 [95% confidence interval (CI) 1.1–1.5] to 1.9 (95% CI 1.4–2.4). In contrast, only three GMCs (frequent or severe headache, other chronic pain and asthma) were significantly associated with subsequent onset of PEs after adjustment for comorbid GMCs and mental disorders, with ORs ranging from 1.5 (95% CI 1.2–1.9) to 1.7 (95% CI 1.2–2.4).
Conclusions
PEs were associated with the subsequent onset of a wide range of GMCs, independent of comorbid mental disorders. There were also associations between some medical conditions (particularly those involving chronic pain) and subsequent PEs. Although these findings will need to be confirmed in prospective studies, clinicians should be aware that psychotic symptoms may be risk markers for a wide range of adverse health outcomes. Whether PEs are causal risk factors will require further research.
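The discrete-time survival analyses referred to above are commonly implemented as a logistic regression on a person-period data set, in which each respondent contributes one row per time interval at risk. The sketch below illustrates the expansion step and the model fit on simulated data with hypothetical variable names; it is not the WMH analysis code.

```python
# Sketch: discrete-time survival via person-period expansion and logistic
# regression on simulated data (variable names are illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
people = pd.DataFrame({
    "pid": np.arange(n),
    "prior_pe": rng.integers(0, 2, n),      # temporally prior psychotic experience
    "onset_age": rng.integers(20, 60, n),   # age at GMC onset (or censoring)
    "event": rng.integers(0, 2, n),         # 1 = GMC onset observed
})

# Expand each person into one row per year at risk (person-period format).
rows = []
for _, p in people.iterrows():
    for age in range(18, int(p.onset_age) + 1):
        rows.append({
            "pid": p.pid,
            "age": age,
            "prior_pe": p.prior_pe,
            "y": int(p.event and age == p.onset_age),  # event only in final period
        })
pp = pd.DataFrame(rows)

# Discrete-time hazard model: logit of an event in each period, given survival so far.
fit = smf.logit("y ~ prior_pe + age", data=pp).fit(disp=False)
print("OR for prior PE:", np.exp(fit.params["prior_pe"]))
```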
Traumatic events are associated with increased risk of psychotic experiences, but it is unclear whether this association is explained by mental disorders prior to psychotic experience onset.
Aims
To investigate the associations between traumatic events and subsequent psychotic experience onset after adjusting for post-traumatic stress disorder and other mental disorders.
Method
We assessed 29 traumatic event types and psychotic experiences from the World Mental Health surveys and examined the associations of traumatic events with subsequent psychotic experience onset with and without adjustments for mental disorders.
Results
Respondents who had experienced any traumatic event had three times the odds of subsequently developing psychotic experiences compared with other respondents (OR=3.1, 95% CI 2.7–3.7), with variability in the strength of association across traumatic event types. These associations persisted after adjustment for mental disorders.
Conclusions
Exposure to traumatic events predicts subsequent onset of psychotic experiences even after adjusting for comorbid mental disorders.
We report detections of thermal X-ray line emission and proper motions in the supernova remnant (SNR) RX J1713.7-3946, the prototype of the small class of synchrotron-dominated SNRs. Based on deep XMM-Newton observations, we find clear line features, including Ne Lyα, Mg Heα, and Si Heα, from the central portion of the remnant. The metal abundance ratios suggest that the thermal emission originates from core-collapse SN ejecta arising from a relatively low-mass (≲20 M⊙) progenitor. In addition, using XMM-Newton observations spanning a 13-yr interval, we have measured the expansion of the southeastern rim to be ~0.75″ yr⁻¹, or ~3500 km s⁻¹ at a distance of 1 kpc. From this, we derive an upstream density of ~0.01 cm⁻³, compatible with the lack of thermal X-rays from the shocked ambient medium. We also estimate the age of the remnant to be ~1200–1600 yr, roughly consistent with the idea that RX J1713.7-3946 is the remnant of SN 393.
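The quoted shock velocity follows from the standard conversion between proper motion and transverse velocity (a textbook relation, reproduced here as a consistency check rather than taken from the paper): for $\mu \approx 0.75''\,\mathrm{yr^{-1}}$ and $d \approx 1\ \mathrm{kpc}$,

$$
v_\perp \;\simeq\; 4.74 \left(\frac{\mu}{1''\,\mathrm{yr^{-1}}}\right)\left(\frac{d}{1\ \mathrm{pc}}\right)\ \mathrm{km\,s^{-1}}
\;\approx\; 4.74 \times 0.75 \times 1000\ \mathrm{km\,s^{-1}} \;\approx\; 3600\ \mathrm{km\,s^{-1}},
$$

in line with the ~3500 km s⁻¹ expansion velocity stated above.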
To evaluate the appropriateness of the screening strategy for healthcare personnel (HCP) during a hospital-associated Middle East Respiratory Syndrome (MERS) outbreak, we performed a serologic investigation of 189 rRT-PCR–negative HCP who had been exposed to or assigned to MERS patients. Although 20%–25% of HCP experienced MERS-like symptoms, none showed seroconversion by plaque reduction neutralization test (PRNT).
The subsurface exploration of other planetary bodies can be used to unravel their geological history and assess their habitability. On Mars in particular, present-day habitable conditions may be restricted to the subsurface. Using a deep subsurface mine, we carried out a program of extraterrestrial analog research – MINe Analog Research (MINAR). MINAR aims to carry out scientific study of the deep subsurface and to test instrumentation designed for planetary surface exploration by investigating deep subsurface geology, while establishing the potential for this technology to be transferred to the mining industry. An integrated multi-instrument suite was used to investigate samples of representative evaporite minerals from a subsurface Permian evaporite sequence, in particular to assess the mineral and elemental variations that provide small-scale regions of enhanced habitability. The instruments used were the Panoramic Camera emulator, Close-Up Imager, Raman spectrometer, Small Planetary Linear Impulse Tool, ultrasonic drill, and handheld X-ray diffraction (XRD). We present science results from the analog research and show that these instruments can be used to investigate in situ the geological context and mineralogical variations of a deep subsurface environment, and thus its habitability, from millimetre to metre scales. We also show that these instruments are complementary. For example, the identification of primary evaporite minerals such as NaCl and KCl, which are difficult to detect with portable Raman spectrometers, can be accomplished with XRD. By contrast, Raman is highly effective at locating and detecting mineral inclusions in primary evaporite minerals. MINAR demonstrates the effective use of a deep subsurface environment for planetary instrument development, for understanding the habitability of extreme deep subsurface environments on Earth and other planetary bodies, and for advancing the use of space technology in economic mining.