Due to shortages of N95 respirators during the COVID-19 pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
We developed a model to determine the number of N95 respirators needed for HCWs both in a single acute care hospital and in the United States as a whole.
For an acute care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% IPR: 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR: 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator and 10 encounters between HCWs and each COVID-19 patient per day). If each N95 is instead used for five patient encounters, the corresponding range decreases to 22 (95% IPR: 10–43) to 4,445 (95% IPR: 1,975–8,684). Increasing monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators at a 60% COVID-19 admission prevalence, 10 HCW-patient encounters per day, and 5–10 uses per N95. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR: 37.1–200.6 million) to 1.6 billion (95% IPR: 0.7–3.6 billion) as 5–90% of the population is exposed (single use), and from 17.4 million (95% IPR: 7.3–41 million) to 312.3 million (95% IPR: 131.5–737.3 million) if each respirator is used for five encounters.
Our study quantifies the number of N95 respirators needed for a given acute care hospital and nationally during the COVID-19 pandemic under varying conditions.
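The per-hospital arithmetic behind these estimates can be sketched as a simple Monte Carlo simulation. This is a hypothetical re-implementation: the lognormal length-of-stay distribution and all parameter values here are illustrative assumptions, not the published model's inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def n95_needed(admissions=400, covid_frac=0.5, encounters_per_day=10,
               uses_per_respirator=1, n_sims=10_000):
    """Monte Carlo sketch of monthly N95 demand for one hospital.

    The length-of-stay distribution (lognormal, median ~7 days) is an
    illustrative assumption, not the authors' model input.
    """
    totals = np.empty(n_sims)
    for i in range(n_sims):
        # Number of COVID-19 admissions this month
        n_covid = rng.binomial(admissions, covid_frac)
        # Simulated length of stay (days) for each COVID-19 patient
        los = rng.lognormal(mean=np.log(7.0), sigma=0.5, size=n_covid)
        # HCW-patient encounters per patient, each consuming a fraction
        # of a respirator depending on the reuse policy
        encounters = encounters_per_day * los
        totals[i] = np.ceil(encounters / uses_per_respirator).sum()
    lo, med, hi = np.percentile(totals, [2.5, 50.0, 97.5])
    return med, (lo, hi)  # median and 95% interpercentile range

median_demand, ipr = n95_needed(covid_frac=0.005)
```

Varying `covid_frac` and `uses_per_respirator` reproduces the qualitative behavior described above: demand scales with COVID-19 admission prevalence and falls roughly in proportion to the number of uses per respirator.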
The interaction between an incident shock wave and a Mach-6 undisturbed hypersonic laminar boundary layer over a cold wall is addressed using direct numerical simulations (DNS) and wall-modelled large-eddy simulations (WMLES) at different angles of incidence. At sufficiently high shock-incidence angles, the boundary layer transitions to turbulence via breakdown of near-wall streaks shortly downstream of the shock impingement, without the need for any inflow free-stream disturbances. The transition causes a significant localized increase in the Stanton number and skin-friction coefficient, with higher incidence angles augmenting the peak thermomechanical loads in an approximately linear way. Statistical analyses of the boundary layer downstream of the interaction are provided for each case; these quantify streamwise spatial variations of the Reynolds analogy factors and indicate a breakdown of Morkovin's hypothesis near the wall, where velocity and temperature become correlated. A modified strong Reynolds analogy with a fixed turbulent Prandtl number is observed to perform best. Conventional transformations fail to collapse the mean velocity profiles onto the incompressible log law. The WMLES anticipates transition and peak heating, delays separation and advances reattachment, thereby shortening the separation bubble. When the shock leads to transition, WMLES predicts the DNS peak thermomechanical loads within $\pm 10\,\%$ at a computational cost two orders of magnitude lower than DNS. Downstream of the interaction, in the turbulent boundary layer, the WMLES agrees well with DNS results for the Reynolds analogy factor, the mean profiles of velocity and temperature (including the temperature peak), and the temperature/velocity correlation.
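For reference, the quantities invoked above can be stated with their standard definitions (the exact modified-SRA form used in the paper is not reproduced here):

```latex
% Reynolds analogy factor relating heat transfer to skin friction,
% with q_w the wall heat flux and T_r the recovery temperature
s \;=\; \frac{2\,C_h}{C_f},
\qquad
C_h = \mathrm{St} = \frac{q_w}{\rho_e u_e c_p \left(T_r - T_w\right)}

% Classic strong Reynolds analogy (Morkovin): temperature and velocity
% fluctuations perfectly anticorrelated
\frac{T'}{\bar{T}} \;=\; -\,(\gamma - 1)\,M^2\,\frac{u'}{\bar{u}}
```

Modified strong Reynolds analogies replace the unit proportionality of the classic form with a factor involving the turbulent Prandtl number $Pr_t$; the abstract reports that a version with fixed $Pr_t$ performs best here.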
There is evidence that environmental and genetic risk factors for schizophrenia spectrum disorders are transdiagnostic and mediated in part through a generic pathway of affective dysregulation.
We analysed the degree to which the impact of schizophrenia polygenic risk (PRS-SZ) and childhood adversity (CA) on psychosis outcomes was contingent on the co-presence of affective dysregulation, defined as significant depressive symptoms, in (i) NEMESIS-2 (n = 6646), a representative general population sample interviewed four times over nine years, and (ii) EUGEI (n = 4068), a sample of patients with schizophrenia spectrum disorder, the siblings of these patients, and controls.
The impact of PRS-SZ on psychosis showed significant dependence on co-presence of affective dysregulation in NEMESIS-2 [relative excess risk due to interaction (RERI): 1.01, p = 0.037] and in EUGEI (RERI = 3.39, p = 0.048). This was particularly evident for delusional ideation (NEMESIS-2: RERI = 1.74, p = 0.003; EUGEI: RERI = 4.16, p = 0.019) and not for hallucinatory experiences (NEMESIS-2: RERI = 0.65, p = 0.284; EUGEI: RERI = −0.37, p = 0.547). A similar and stronger pattern of results was evident for CA (RERI delusions and hallucinations: NEMESIS-2: 3.02, p < 0.001; EUGEI: 6.44, p < 0.001; RERI delusional ideation: NEMESIS-2: 3.79, p < 0.001; EUGEI: 5.43, p = 0.001; RERI hallucinatory experiences: NEMESIS-2: 2.46, p < 0.001; EUGEI: 0.54, p = 0.465).
The results, and internal replication, suggest that the effects of known genetic and non-genetic risk factors for psychosis are mediated in part through an affective pathway, from which early states of delusional meaning may arise.
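The additive-interaction statistic reported above has a simple closed form (Rothman's relative excess risk due to interaction); a minimal helper, with illustrative variable names:

```python
def reri(rr_11, rr_10, rr_01):
    """Relative excess risk due to interaction (RERI).

    rr_11: relative risk with both exposures present
           (e.g. high PRS-SZ AND affective dysregulation) vs. neither;
    rr_10, rr_01: relative risks with only one exposure present.
    RERI > 0 indicates positive interaction on the additive scale,
    i.e. joint risk exceeding the sum of the separate risks.
    """
    return rr_11 - rr_10 - rr_01 + 1

# If both exposures together quadruple risk while each alone doubles it:
# reri(4, 2, 2) = 4 - 2 - 2 + 1 = 1, i.e. risk beyond additivity
```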
This study attempted to replicate the finding that a bias in probabilistic reasoning, the ‘jumping to conclusions’ (JTC) bias, is associated with being a sibling of a patient with schizophrenia spectrum disorder and, if so, whether this association is contingent on subthreshold delusional ideation.
Data were derived from the EUGEI project, a 25-centre, 15-country effort to study psychosis spectrum disorder. The current analyses included 1261 patients with schizophrenia spectrum disorder, 1282 siblings of patients and 1525 healthy comparison subjects, recruited in Spain (five centres), Turkey (three centres) and Serbia (one centre). The beads task was used to assess JTC bias. Lifetime experience of delusional ideation and hallucinatory experiences was assessed using the Community Assessment of Psychic Experiences. General cognitive abilities were taken into account in the analyses.
JTC bias was positively associated not only with patient status but also with sibling status [adjusted relative risk ratio (aRR): 4.23, 95% CI 3.46–5.17 for siblings and aRR: 5.07, 95% CI 4.13–6.23 for patients]. The association between JTC bias and sibling status was stronger in those with higher levels of delusional ideation (aRR interaction in siblings: 3.77, 95% CI 1.67–8.51, and in patients: 2.15, 95% CI 0.94–4.92). The association between JTC bias and sibling status was not stronger in those with higher levels of hallucinatory experiences.
These findings replicate earlier findings that JTC bias is associated with familial liability for psychosis and that this is contingent on the degree of delusional ideation but not hallucinations.
Background: During a 2017–2019 intervention in Chicago-area vSNFs to control carbapenem-resistant Enterobacteriaceae, healthcare worker adherence to hand hygiene and personal protective equipment was stubbornly inadequate (hand hygiene adherence, ~16% on room entry and ~56% on room exit), despite educational and monitoring efforts. Little is known about vSNF staff understanding of multidrug-resistant organism (MDRO) transmission. We conducted a qualitative analysis of staff members at a vSNF that included assessment of staff perceptions of personal MDRO acquisition risk and of the personal hygiene routines they followed when transitioning from work to home. Methods: Between September 2018 and November 2018, a PhD-candidate medical anthropologist conducted semistructured interviews with management (N = 5), nursing staff (N = 6), and certified nursing assistants (N = 6) at a vSNF in the Chicago region (Illinois) who had already received 1 year of MDRO staff education and hand hygiene adherence monitoring. More than 11 hours of semistructured interviews were collected and transcribed. Data collection and analysis included identifying how staff members related to their own risk of MDRO acquisition/infection and what personal hygiene routines they followed. Transcriptions of the data were analyzed using thematic coding aided by MAXQDA qualitative analysis software. Results: Staff members at all levels were able to describe their perceptions related to the risk of acquiring an MDRO and personal hygiene in great detail. The risk of acquiring an MDRO was perceived as a constant threat by staff members, who described germs as bad and everywhere (Table 1). The perceived threat of MDRO acquisition was connected to individual personal hygiene routines (eg, changing shoes before leaving work), which were considered important by staff members (Table 2). 
Nursing staff and certified nursing assistants noted that personal hygiene was a critical factor in keeping their residents, themselves, and their families free from MDROs. Conclusions: In the context of a quality improvement campaign, vSNF healthcare workers were aware of the transmissibility of microscopic MDROs and highly motivated to prevent transmission of MDROs to themselves. Such perceptions may explain why workers are differentially adherent with infection control interventions (eg, more likely to perform hand hygiene when leaving a room than when entering it, or less likely to change gowns between residents in multibed rooms if they believe they are already personally protected by a gown). Our findings suggest that interventions to improve staff adherence to infection control measures may need to address factors beyond knowledge deficit (eg, understaffing) and may need to acknowledge self-protection as a driving motivator for staff adherence.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities; that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and periodic point-prevalence surveys reported to public health, (2) cohorting or private rooms with contact precautions for CRE patients, (3) hand hygiene adherence monitoring combined with general infection control education and guidance by project coordinators and public health, and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs in a 13-mile radius from the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE culture reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing. vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week instead of daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. Also, 75 Illinois hospitals adopted automated alerts (56 during the intervention period). 
Mean CRE incidence in Cook County decreased from 59.0 cases per month during baseline to 40.6 cases per month during intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive public health regional CRE intervention was implemented that included enhanced interfacility communication and targeted infection prevention. There was a significant decline in incident CRE clinical cases in Cook County, despite high persistent CRE colonization prevalence in intervention facilities. vSNFs, where understaffing or underresourcing were common and lengths of stay ranged from months to years, had a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
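A segmented (interrupted time series) regression of the kind described above can be sketched in a few lines. This is a minimal illustrative design with a level change and a slope change at the intervention start, fit by ordinary least squares; it is not the authors' exact model specification, and the example series is invented.

```python
import numpy as np

def segmented_fit(counts, intervention_start):
    """OLS segmented regression on a monthly count series.

    Design matrix columns: intercept, baseline trend,
    post-intervention level change, post-intervention slope change.
    """
    counts = np.asarray(counts, dtype=float)
    t = np.arange(len(counts), dtype=float)
    post = (t >= intervention_start).astype(float)
    t_post = post * (t - intervention_start)
    X = np.column_stack([np.ones_like(t), t, post, t_post])
    beta, *_ = np.linalg.lstsq(X, counts, rcond=None)
    return beta  # [intercept, trend, level change, slope change]

# Invented series: ~59 cases/month at baseline declining toward ~40
baseline = [59] * 12
intervention = [52, 50, 47, 45, 44, 42, 41, 40, 40, 39, 39, 38]
beta = segmented_fit(baseline + intervention, intervention_start=12)
```

A negative level-change coefficient (`beta[2]`) captures an immediate drop at the intervention, and a negative slope-change coefficient (`beta[3]`) captures an ongoing monthly decline relative to the baseline trend.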
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
Background: In an effort to reduce inappropriate testing of hospital-onset Clostridioides difficile infection (HO-CDI), we sequentially implemented 2 strategies: an electronic health record-based clinical decision support tool that alerted ordering physicians about potentially inappropriate testing without a hard stop (intervention period 1), replaced by mandatory infectious diseases attending physician approval for any HO-CDI test order (intervention period 2). We analyzed appropriate HO-CDI testing rates of both intervention periods. Methods: We performed a retrospective study of patients 18 years or older who had an HO-CDI test (performed after hospital day 3) during 3 different periods: baseline (no intervention, September 2014–February 2015), intervention 1 (clinical decision support tool only, April 2015–September 2015), and intervention 2 (ID approval only, December 2017–September 2018). From each of the 3 periods, we randomly selected 150 patients who received HO-CDI testing (450 patients total). We restricted the study to the general medicine, bone marrow transplant, medical intensive care, and neurosurgical intensive care units. We assessed each HO-CDI test for appropriateness (see Table 1 for criteria), and we compared rates of appropriateness using the χ2 test or Kruskal–Wallis test, where appropriate. Results: In our cohort of 450 patients, the median age was 61 years, and the median hospital length of stay was 20 days. The median hospital day on which HO-CDI testing was performed differed among the 3 groups: 12 days at baseline, 10 days during intervention 1, and 8.5 days during intervention 2 (P < .001). Appropriateness of HO-CDI testing increased from baseline with both interventions, but mandatory ID approval was associated with the highest rate of testing appropriateness (Fig. 1). Reasons for inappropriate ordering did not differ among the periods, with <3 documented stools being the most common criterion for inappropriateness. 
During intervention 2, among the 33 inappropriate tests, 8 (24%) occurred with no recorded approval from an ID attending. HO-CDI test positivity rates during the 3 periods were 12%, 11%, and 21%, respectively (P = .03). Conclusions: We found that both the clinical decision support tool and mandatory ID attending physician approval improved the appropriateness of HO-CDI testing, with mandatory ID attending physician approval leading to the highest appropriateness rate. Even with mandatory ID attending physician approval, some tests continued to be ordered inappropriately per retrospective chart review; we suspect that this is partly explained by underdocumentation of criteria such as stool frequency. In healthcare settings where the appropriateness of HO-CDI testing is not optimal, mandatory ID attending physician approval may provide an option beyond clinical decision support tools.
Background: During 2017–2019 in the Chicago region, several ventilator-capable skilled nursing facilities (vSNFs) participated in a quality improvement project to control the spread of highly prevalent carbapenem-resistant Enterobacteriaceae (CRE). With guidance from regional project coordinators and public health departments that involved education, assistance with implementation, and adherence monitoring, the facilities implemented a CRE prevention bundle that included a hand hygiene campaign that promoted alcohol-based hand rub, contact precautions (personal protective equipment with glove/gown) for care of CRE-colonized residents, and 2% chlorhexidine gluconate (CHG) wipes for routine resident bathing. We conducted a qualitative study to better understand the ways that vSNF employees engage with the implementation of such infection control measures. Methods: A PhD-candidate medical anthropologist conducted semistructured interviews with management (N = 5), nursing staff (N = 6), and certified nursing assistants (N = 6) at a vSNF in the Chicago region (Illinois) between September 2018 and November 2018. More than 11 hours of semistructured interviews were collected and transcribed. Data collection and analysis focused on identifying healthcare worker experiences during an infection control intervention. Transcriptions of the data were analyzed using thematic coding aided by MAXQDA qualitative analysis software. Results: Healthcare workers described the facility using language associated with a family environment (Table 1). Furthermore, healthcare workers demonstrated motivation to implement infection control policies (Table 2). However, healthcare workers expressed cultural and structural challenges encountered during implementation, such as their belief that some infection control measures discouraged maintenance of a home-like environment, lack of time, and understaffing. 
Some healthcare workers perceived that alcohol-based hand rub was ineffective over time and left unpleasant textures on the skin. Additionally, some workers did not trust the available gown and gloves used to prevent transmission. Lastly, healthcare workers typically did not prefer 2% CHG wipes over soap and water, citing residual resident postbathing smell as one indicator of CHG ineffectiveness. Conclusions: In a vSNF we found both considerable support and challenges implementing a CRE prevention bundle from the healthcare worker perspective. Healthcare workers were dedicated to recreating a home-like environment for their residents, which sometimes felt at odds with infection control interventions. Residual misconceptions (eg, alcohol-based hand rub is not effective) and negative worker perceptions (eg, permeability of contact precaution gowns and/or residue from alcohol-based hand rub) suggest that ongoing education and participation by healthcare workers in evaluating infection control products for interventions is critical.
Background: Long-term acute-care hospitals (LTACHs) are disproportionately burdened by multidrug-resistant organisms (MDROs) such as KPC-producing Klebsiella pneumoniae (KPC-Kp). Although cohorting KPC-Kp+ patients into rooms with other carriers can be an outbreak-control strategy and may protect negative patients from colonization, it is unclear whether cohorted patients are at unintended increased risk of cross colonization with additional KPC-Kp strains. Methods: Cohorting KPC-Kp+ patients at admission into rooms with other positive patients was part of a bundled intervention that reduced transmission in a high-prevalence LTACH. Rectal surveillance culturing for KPC-Kp was performed at the start of the study, upon admission, and biweekly thereafter, capturing 94% of patients. We evaluated whole-genome sequencing (WGS) evidence of acquisition of distinct KPC-Kp strains in a convenience sample of patients positive for KPC-Kp at study start or admission to identify plausible secondary KPC-Kp acquisitions. Results: WGS multilocus sequence type (MLST) strain variability was observed among the 452 isolates from the 254 patients colonized by KPC-Kp (Fig. 1). Among the 32 patients who were positive at the beginning of the study or admission and had a secondary isolate collected at a later date (median, 89 days apart; range, 2–310 days), 17 (53%) had secondary isolates differing by MLST from their admission isolate. Although 60% of the KPC-Kp isolates in the study were ST258, there was substantial genomic variation within ST258 isolates from the same patient (range, 0–102 genetic variants), suggesting multiple acquisitions of distinct ST258 isolates. Among the 17 patients who imported ST258 and had ST258 isolated again later, 11 (65%) carried secondary isolates genetically closer to isolates from other importing patients than to their own ST258 (Fig. 2). 
Examination of spatiotemporal exposures among patients with evidence of multiple acquisitions revealed that 11 (65%) patients with multiple MLSTs shared a room with a patient who was colonized with an isolate matching the secondary MLST, and 6 (35%) patients who carried multiple distinct ST258 isolates shared a room with a patient who imported these closely related isolates prior to secondary acquisition. Conclusions: Half of patients who imported KPC-Kp and had multiple isolates available had genomically supported secondary acquisitions linked to roommates who carried the acquired strains. Although cohorting is intended to protect negative patients from acquiring MDROs, this practice may promote multiple strain acquisitions by colonized patients in the cohort, potentially prolonging the period of MDRO carriage and increasing time at risk of infection. Our findings add to the debate about single-patient rooms, which may be preferred to cohorts to minimize potential harms by reducing MDRO transmission.
Little is known about methylphenidate (MPH) use and mortality outcomes.
To investigate the association between MPH use and mortality among children with an attention-deficit hyperactivity disorder (ADHD) diagnosis.
This population-based cohort study analysed data from Taiwan's National Health Insurance Research Database (NHIRD). A total of 68 096 children and adolescents aged 4–17 years with an ADHD diagnosis and prescribed MPH between 2000 and 2010 were compared with 68 096 without an MPH prescription, matched on age, gender and year of first ADHD diagnosis. All participants were followed to death, migration, withdrawal from the National Health Insurance programme or 31 December 2013. MPH prescriptions were measured on a yearly basis during the study period, and the association between MPH use and mortality was analysed using a repeated-measures time-dependent Cox regression model. The outcome measures included all-cause, unnatural-cause (including suicide, accident and homicide) and natural-cause mortality, obtained from linkage to the National Mortality Register in Taiwan.
The MPH group had lower unadjusted all-cause, natural-, unnatural- and accident-cause mortality than the comparison group. After controlling for potential confounders, MPH use was associated with significantly lower all-cause mortality (adjusted hazard ratio [AHR] = 0.81, 95% CI 0.67–0.98, P = 0.027), delayed use of MPH was associated with higher mortality (AHR = 1.05, 95% CI 1.01–1.09) and longer MPH use was associated with lower mortality (AHR = 0.83, 95% CI 0.70–0.98).
MPH use was associated with reduced overall mortality in children with ADHD in this cohort study, but unmeasured confounding cannot be entirely excluded.
Cohorting patients who are colonized or infected with multidrug-resistant organisms (MDROs) protects uncolonized patients from acquiring MDROs in healthcare settings. The potential for cross transmission within the cohort and the possibility of colonized patients acquiring secondary isolates with additional antibiotic resistance traits are often neglected. We searched for evidence of cross transmission of KPC+ Klebsiella pneumoniae (KPC-Kp) colonization among cohorted patients in a long-term acute-care hospital (LTACH), and we evaluated the impact of secondary acquisitions on resistance potential.
Genomic epidemiological investigation.
A high-prevalence LTACH during a bundled intervention that included cohorting KPC-Kp–positive patients.
Whole-genome sequencing (WGS) and location data were analyzed to identify potential cases of cross transmission between cohorted patients.
Secondary KPC-Kp isolates from 19 of 28 admission-positive patients were more closely related to another patient’s isolate than to their own admission isolate. Of these 19 cases, 14 showed strong genomic evidence for cross transmission (<10 single nucleotide variants or SNVs), and most of these patients occupied shared cohort floors (12 patients) or rooms (4 patients) at the same time. Of the 14 patients with strong genomic evidence of acquisition, 12 acquired antibiotic resistance genes not found in their primary isolates.
Acquisition of secondary KPC-Kp isolates carrying distinct antibiotic resistance genes was detected in nearly half of cohorted patients. These results highlight the importance of healthcare provider adherence to infection prevention protocols within cohort locations, and they indicate the need for future studies to assess whether multiple-strain acquisition increases risk of adverse patient outcomes.
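The single-nucleotide-variant criterion used as genomic evidence above can be illustrated with a toy pairwise comparison. This is illustrative only: real analyses call variants from WGS reads against a reference genome rather than comparing raw strings, and the function names here are hypothetical.

```python
def snv_distance(seq_a, seq_b):
    """Count positions at which two aligned sequences differ."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(a != b for a, b in zip(seq_a, seq_b))

def plausible_transmission(seq_a, seq_b, threshold=10):
    """Flag a pair of isolates as a plausible cross-transmission event
    when they differ by fewer than `threshold` SNVs, mirroring the
    <10-SNV criterion described in the study."""
    return snv_distance(seq_a, seq_b) < threshold
```

Combining such pairwise distances with room- and floor-level location histories is what lets closely related secondary isolates be linked to specific roommates.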
This article presents a brief review of our case studies of data-driven Integrated Computational Materials Engineering (ICME) for intelligently discovering advanced structural metal materials, including lightweight materials (Ti, Mg, and Al alloys), refractory high-entropy alloys, and superalloys. The basic bonding, in terms of topology and electronic structure, is recommended as the building block/unit from which the microstructures of advanced materials are constructed. It is highlighted that the bonding charge density can not only provide atomic- and electronic-level insight into the physical nature of the chemical bonds of materials but also reveal the fundamental strengthening/embrittlement mechanisms and the local phase transformations of planar defects, paving a path toward accelerating the development of advanced metal materials via interfacial engineering. Perspectives on knowledge-based modeling/simulations, a machine-learning knowledge base, platforms, and the next-generation workforce for a sustainable ICME ecosystem are presented, calling for greater effort in the development of advanced structural metal materials and in the enhancement of research productivity and collaboration.
First-degree relatives of patients with psychotic disorder have higher levels of polygenic risk (PRS) for schizophrenia and higher levels of intermediate phenotypes.
Using two different samples for discovery (n = 336 controls and 649 siblings of patients with psychotic disorder) and replication (n = 1208 controls and 1106 siblings), we analysed the association between PRS and psychopathological and cognitive intermediate phenotypes of schizophrenia in a sample at average genetic risk (healthy controls) and a sample at higher-than-average risk (healthy siblings of patients). Two subthreshold psychosis phenotypes, as well as a standardised measure of cognitive ability based on a short form of the WAIS-III, were used. In addition, a measure of jumping-to-conclusions bias (replication sample only) was tested for association with PRS.
In both the discovery and replication samples, evidence for an association between PRS and subthreshold psychosis phenotypes was observed in the relatives of patients, whereas no association was observed in the controls. Jumping-to-conclusions bias was similarly associated with PRS only in the sibling group. Cognitive ability was weakly, negatively and non-significantly associated with PRS in both the sibling and the control groups.
The degree of endophenotypic expression of schizophrenia polygenic risk depends on having a sibling with psychotic disorder, suggestive of underlying gene–environment interaction. Cognitive biases may better index genetic risk of disorder than traditional measures of neurocognition, which instead may reflect the population distribution of cognitive ability impacting the prognosis of psychotic disorder.
OBJECTIVES/SPECIFIC AIMS: (1) Assess if the total duration of EEG suppression during a protocolized exposure to general anesthesia predicts cognitive performance in multiple cognitive domains immediately following emergence from anesthesia. (2) Assess if the total duration of EEG suppression in the same individuals predicts the rate of cognitive recovery in a three-hour period following emergence from anesthesia. METHODS/STUDY POPULATION: This was a non-specified substudy of NCT01911195, a multicenter investigation taking place at the University of Michigan, University of Pennsylvania, and Washington University in St. Louis. 30 healthy volunteers aged 20-40 years were recruited to receive general anesthesia. Participants in the anesthesia arm were anesthetized for three hours at isoflurane levels compatible with surgery (1.3 MAC). Multichannel sensor nets were used for EEG acquisition during the anesthetic exposure. EEG suppression was detected through automated voltage-thresholded classification of 2-second signal epochs, with concordance assessed across sensors. Following return of responsiveness to verbal commands, participants completed up to three hours of serial cognitive tests assessing executive function, reaction time, cognitive throughput, and working memory. Non-linear mixed effects models will be used to estimate the initial cognitive deficit and the rate of cognitive recovery following anesthetic exposure; these measures of cognitive function will be assessed in relation to total duration of suppression during anesthesia. RESULTS/ANTICIPATED RESULTS: Participants displayed wide variability in the total amount of suppression during anesthesia, with a median of 31.2 minutes and range from 0 minutes to 115.2 minutes. Initial analyses suggest that greater duration of burst suppression had a weak relationship with participants’ initial cognitive deficits upon return of responsiveness from anesthesia. 
Model generation of rate of recovery following anesthetic exposure is pending, but we anticipate this will also have a weak relationship with burst suppression. DISCUSSION/SIGNIFICANCE OF IMPACT: In healthy adults receiving a standardized exposure to anesthesia without surgery, burst suppression appears to be a poor predictor of post-anesthesia cognitive task performance. This suggests that burst suppression may have limited utility as a predictive marker of post-operative cognitive functioning, particularly in young adults without significant illness.
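The epoch-based suppression detection described in the methods can be sketched as follows. The amplitude threshold and channel-concordance rule here are hypothetical placeholders; the abstract does not report the study's exact classifier parameters.

```python
import numpy as np

def total_suppression_minutes(eeg, fs=250, epoch_s=2.0,
                              amp_uv=5.0, min_channel_frac=0.5):
    """Estimate total EEG suppression time via thresholded epochs.

    eeg: array of shape (n_channels, n_samples), in microvolts.
    An epoch is 'suppressed' on a channel when its peak-to-peak
    amplitude stays below amp_uv; the epoch counts as suppression
    when at least min_channel_frac of channels agree (concordance).
    """
    n_chan, n_samp = eeg.shape
    epoch_len = int(fs * epoch_s)
    n_epochs = n_samp // epoch_len
    suppressed = 0
    for k in range(n_epochs):
        seg = eeg[:, k * epoch_len:(k + 1) * epoch_len]
        ptp = seg.max(axis=1) - seg.min(axis=1)  # per-channel amplitude
        if (ptp < amp_uv).mean() >= min_channel_frac:
            suppressed += 1
    return suppressed * epoch_s / 60.0  # total minutes suppressed
```

Summing suppressed 2-second epochs over the whole anesthetic yields the per-participant total (e.g. the 0 to 115.2-minute range reported above).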
To investigate short- and long-term effects of Superstorm Sandy on multiple morbidities among the elderly.
We examined emergency department visits, outpatient visits, and hospital admissions for cardiovascular disease (CVD), respiratory disease, and injury among residents of 8 affected counties immediately, 4 months, and 12 months following Superstorm Sandy. Control groups were defined as visits/admissions during the identical time window in the 5 years before (2007-2011) and 1 year after (2013-2014) the storm in affected and nonaffected counties in New York. We performed Poisson regression to test whether visits/admissions increased in the periods following Superstorm Sandy while controlling for covariates.
We found that the risk of CVD, respiratory disease, and injury visits/admissions was more than twice as high immediately, 4 months, and 12 months after the storm as in the control periods. Women were at greater risk during all time periods for CVD (risk ratio [RR], 2.04) and respiratory disease (RRs, 1.89 to 1.92). Whites had a higher risk of CVD, respiratory disease, and injury than other racial groups during each period.
We observed increases in CVD, respiratory disease, and injury up to a year following Superstorm Sandy. Findings demonstrate the need to incorporate short- and long-term health effects into public health recovery. (Disaster Med Public Health Preparedness. 2019;13:28-32)
Strong strain-mediated magnetoelectric (ME) coupling in magnetic/ferroelectric heterostructures has great potential for high-frequency multiferroic devices. In this article, we present the most recent progress in integrated multiferroic devices: integrated magnetic tunable inductors with a wide operating frequency range, integrated nonreciprocal bandpass filters with dual magnetic- and electric-field tunability based on magnetostatic surface waves, and novel radio-frequency nanomechanical ME resonators with pico-Tesla sensitivity to direct-current magnetic fields. Finally, a new antenna miniaturization mechanism is introduced: acoustically actuated nanomechanical ME antennas, which reduce antenna size by one to two orders of magnitude. With high magnetic field sensitivity, the highest antenna gain among nanoscale antennas at similar frequencies, compatibility with complementary metal-oxide-semiconductor technology, and ground-plane immunity from metallic surfaces and the human body, ME antennas have a bright future in biomedical applications, wearable antennas, and the Internet of Things.