Background: Healthcare facilities have experienced many challenges during the COVID-19 pandemic, including limited personal protective equipment (PPE) supplies. Healthcare personnel (HCP) rely on PPE, vaccines, and other infection control measures to prevent SARS-CoV-2 infections. We describe PPE concerns reported by HCP who had close contact with COVID-19 patients in the workplace and tested positive for SARS-CoV-2. Method: The CDC collaborated with Emerging Infections Program (EIP) sites in 10 states to conduct surveillance for SARS-CoV-2 infections in HCP. EIP staff interviewed HCP with positive SARS-CoV-2 viral tests (ie, cases) to collect data on demographics, healthcare roles, exposures, PPE use, and concerns about their PPE use during COVID-19 patient care in the 14 days before the HCP’s SARS-CoV-2 positive test. PPE concerns were qualitatively coded as being related to supply (eg, low quality, shortages); use (eg, extended use, reuse, lack of fit test); or facility policy (eg, lack of guidance). We calculated and compared the percentages of cases reporting each concern type during the initial phase of the pandemic (April–May 2020), during the first US peak of daily COVID-19 cases (June–August 2020), and during the second US peak (September 2020–January 2021). We compared percentages using mid-P or Fisher exact tests (α = 0.05). Results: Among 1,998 HCP cases occurring during April 2020–January 2021 who had close contact with COVID-19 patients, 613 (30.7%) reported ≥1 PPE concern (Table 1). The percentage of cases reporting supply or use concerns was higher during the first peak period than the second peak period (supply concerns: 12.5% vs 7.5%; use concerns: 25.5% vs 18.2%). Conclusions: Although lower percentages of HCP cases overall reported PPE concerns after the first US peak, our results highlight the importance of developing capacity to produce and distribute PPE during times of increased demand.
The differences we observed among selected groups of cases may indicate that PPE access and use were more challenging for some, such as nonphysicians and nursing home HCP. These findings underscore the need to ensure that PPE is accessible and used correctly by HCP for whom use is recommended.
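The abstract above compares proportions with mid-P or Fisher exact tests at α = 0.05. As a minimal sketch, a one-sided Fisher exact p-value for a 2×2 table can be computed from the hypergeometric distribution using only the standard library (the counts below are illustrative, not the study's data):

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value, P(X >= a), for the 2x2 table
    [[a, b], [c, d]] under the hypergeometric null."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        p += comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    return p

# Illustrative counts: 30/100 vs 15/100 reporting a concern.
p = fisher_exact_one_sided(30, 70, 15, 85)
```

A two-sided or mid-P variant (which subtracts half the probability of the observed table) would follow the same pattern.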
Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
OBJECTIVES/GOALS: The Monte Carlo dose calculation method is often considered the “gold standard” for patient dose calculations and can be as accurate as radiation dose measurements. Our study aims to develop a true Monte Carlo model that can be implemented in our clinic as part of our routine patient-specific quality assurance. METHODS/STUDY POPULATION: We have configured and validated a model of one of our linear accelerators used for radiation therapy treatments using the EGSnrc Monte Carlo simulation software. Measured dosimetric data were obtained from the linear accelerator and used as the standard against which doses calculated with our model in EGSnrc were compared. We will compare dose calculations between commercial treatment planning systems, the EGSnrc Monte Carlo model, and patient-specific measurements. We will implement the Monte Carlo model in our clinic for routine second-checks of patient plans, and to recalculate plans delivered to patients using machine log files. RESULTS/ANTICIPATED RESULTS: Our Monte Carlo model is within 1% agreement with our measured dosimetric data, and is an accurate representation of our linear accelerators used for patient treatments. With this high level of accuracy, we have begun simulating more complex patient treatment geometries, and expect the level of accuracy to be within 1% of measured data. We believe the Monte Carlo calculation based on machine log files will correlate with patient-specific QA analysis and results. The Monte Carlo model will be a useful tool in improving our patient-specific quality assurance protocol and can be utilized in further research. DISCUSSION/SIGNIFICANCE OF IMPACT: This work can be implemented directly in clinical practice to ensure patient doses are calculated as accurately as possible. These methods can be used by clinics that do not have access to more advanced dose calculation software, ensuring accuracy for all patients undergoing radiotherapy treatments.
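EGSnrc performs full coupled photon–electron transport; as a much-simplified illustration of the core Monte Carlo idea, the sketch below samples first-interaction depths of photons in a water-like slab from the exponential attenuation law. The attenuation coefficient, geometry, and scoring are assumptions for illustration only, not the study's model:

```python
import math
import random

def toy_depth_dose(n_photons=100_000, mu=0.07, depth_cm=30.0, bin_cm=1.0, seed=1):
    """Toy Monte Carlo: sample each photon's first-interaction depth from
    the exponential attenuation law (inverse-CDF sampling) and tally the
    fraction of photons interacting in each depth bin. mu is a rough
    linear attenuation coefficient (1/cm); scatter and secondary electron
    transport, which real codes such as EGSnrc model, are ignored."""
    random.seed(seed)
    bins = [0.0] * int(depth_cm / bin_cm)
    for _ in range(n_photons):
        d = -math.log(1.0 - random.random()) / mu  # sampled depth (cm)
        if d < depth_cm:
            bins[int(d / bin_cm)] += 1.0           # score one interaction
    return [b / n_photons for b in bins]
```

With these assumptions the tally simply decays exponentially with depth; a realistic depth-dose curve would additionally show buildup from secondary electrons near the surface.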
Eye’ve Seen Enough: Oculogyric Crisis in a 13-Year-Old Male Treated for Comorbid ADHD and Psychosis After Stopping Lisdexamphetamine
Background:
Oculogyric crisis is a dystonic movement disorder caused by sustained contractions of ocular muscles that may last minutes to hours. It is known to occur during hypodopaminergic states. Combined use of stimulants and antipsychotics increases the risk of developing a hypodopaminergic state in the brain, leading to various dystonic reactions described as stimulant-antipsychotic syndrome (SAS).
Case:
A unique example of SAS occurred in a 13-year-old male with comorbid ADHD and schizoaffective disorder hospitalized for acute psychotic decompensation who had been treated for several months with risperidone and lisdexamphetamine. On admission, the patient received olanzapine 5 mg ODT for acute agitation, and lisdexamphetamine was discontinued. A cross-taper from risperidone to quetiapine was started. In this setting he developed dystonia, including an oculogyric crisis, that resolved with diphenhydramine.
Conclusions:
In this case the use of lisdexamphetamine and risperidone may have created an environment of decreased endogenous dopamine production and up-regulation of postsynaptic dopamine receptors. Upon discontinuation of the lisdexamphetamine and acute use of additional atypical antipsychotics (olanzapine and quetiapine), the body experienced a hypodopaminergic state resulting in dystonic reactions that included an oculogyric crisis. This case differs from previously reported cases in that it involved the use and discontinuation of lisdexamphetamine, whereas most reported cases involved a derivative of methylphenidate as the stimulant. This report adds to the literature showing the importance of monitoring for and being aware of potential medication interactions, especially when treating comorbid conditions. Such interactions are even more important to recognize after adding or removing either of these medications.
To evaluate the National Healthcare Safety Network (NHSN) hospital-onset Clostridioides difficile infection (HO-CDI) standardized infection ratio (SIR) risk adjustment for general acute-care hospitals with large numbers of intensive care unit (ICU), oncology unit, and hematopoietic cell transplant (HCT) patients.
Design:
Retrospective cohort study.
Setting:
Eight tertiary-care referral general hospitals in California.
Methods:
We used FY 2016 data and the published 2015 rebaseline NHSN HO-CDI SIR. We compared facility-wide inpatient HO-CDI events and SIRs, with and without ICU data, oncology and/or HCT unit data, and ICU bed adjustment.
Results:
For these hospitals, the median unmodified HO-CDI SIR was 1.24 (interquartile range [IQR], 1.15–1.34); 7 hospitals qualified for the highest ICU bed adjustment; 1 hospital received the second highest ICU bed adjustment; and all had oncology-HCT units with no additional adjustment per the NHSN. Removal of ICU data and the ICU bed adjustment decreased HO-CDI events (median, −25%; IQR, −20% to −29%) but increased the SIR at all hospitals (median, 104%; IQR, 90%–105%). Removal of oncology-HCT unit data decreased HO-CDI events (median, −15%; IQR, −14% to −21%) and decreased the SIR at all hospitals (median, −8%; IQR, −4% to −11%).
Conclusions:
For tertiary-care referral hospitals with specialized ICUs and a large number of ICU beds, the ICU bed adjustor functions as a global adjustment in the SIR calculation, accounting for the increased complexity of patients in ICUs and non-ICUs at these facilities. However, the SIR decrease with removal of oncology and HCT unit data, even with the ICU bed adjustment, suggests that an additional adjustment should be considered for oncology and HCT units within general hospitals, perhaps similar to what is done for ICU beds in the current SIR.
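Because the SIR is simply observed events divided by the number predicted by the NHSN risk model, the mechanism described above can be sketched with hypothetical counts (the numbers below are illustrative, not the study's data):

```python
def sir(observed, predicted):
    """Standardized infection ratio: observed HO-CDI events divided by
    the events predicted by the NHSN risk-adjustment model."""
    return observed / predicted

# Hypothetical facility. Excluding ICU data lowers observed events, but
# losing the ICU bed adjustment lowers predicted events even faster,
# so the SIR rises despite fewer raw events.
facility_wide = sir(62, 50.0)   # SIR = 1.24
without_icu = sir(46, 28.0)     # fewer events, higher SIR
```

The same arithmetic explains the oncology-HCT finding in reverse: removing those units lowers observed events faster than predicted events, so the SIR falls.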
We collected dietary records over the course of nine months to comprehensively characterize the consumption patterns of Malagasy people living in remote rainforest areas of north-eastern Madagascar.
Design:
The present study was a prospective longitudinal cohort study to estimate dietary diversity and nutrient intake for a suite of macronutrients, micronutrients and vitamins for 152 randomly selected households in two communities.
Setting:
Madagascar, with over 25 million people living in an area the size of France, faces a multitude of nutritional challenges. Micronutrient-poor staples, especially rice, roots and tubers, comprise nearly 80 % of the Malagasy diet by weight. The remaining dietary components (including wild foods and animal-source foods) are critical for nutrition. We focus our study in north-eastern Madagascar, characterized by access to rainforest, rice paddies and local agriculture.
Participants:
We enrolled men, women and children of both sexes and all ages in a randomly selected sample of households in two communities.
Results:
Although the Household Dietary Diversity Score and Food Consumption Score reflect high dietary diversity, the Minimum Dietary Diversity–Women indicator suggests poor micronutrient adequacy. The food intake data confirm a mixed nutritional picture. We found that the median individual consumed less than 50 % of his/her age/sex-specific Estimated Average Requirement (EAR) for vitamins A, B12, D and E, and Ca, and less than 100 % of his/her EAR for energy, riboflavin, folate and Na.
Conclusions:
Malnutrition in remote communities of north-eastern Madagascar is pervasive and multidimensional, indicating an urgent need for comprehensive public health and development interventions focused on providing nutritional security.
We sought to define the prevalence of echocardiographic abnormalities in long-term survivors of paediatric hematopoietic stem cell transplantation and determine the utility of screening in asymptomatic patients. We analysed echocardiograms performed on survivors who underwent hematopoietic stem cell transplantation from 1982 to 2006. A total of 389 patients were alive in 2017, with 114 having an echocardiogram obtained ⩾5 years post-infusion. A total of 95 patients had an echocardiogram performed for routine surveillance. The mean time post-hematopoietic stem cell transplantation was 13 years. Of 95 patients, 77 (82.1%) had ejection fraction measured, and 10/77 (13.0%) had ejection fraction z-scores ⩽−2.0, which is abnormally low. Those patients with abnormal ejection fraction were significantly more likely to have been exposed to anthracyclines or total body irradiation. Among individuals who received neither anthracyclines nor total body irradiation, only 1/31 (3.2%) was found to have an abnormal ejection fraction of 51.4%, z-score −2.73. In the cohort of 77 patients, the negative predictive value of having a normal ejection fraction given no exposure to total body irradiation or anthracyclines was 96.7% (95% confidence interval, 83.3–99.8%). Systolic dysfunction is relatively common in long-term survivors of paediatric hematopoietic stem cell transplantation who have received anthracyclines or total body irradiation. Survivors who are asymptomatic and did not receive radiation or anthracyclines likely do not require surveillance echocardiograms, unless otherwise indicated.
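The negative predictive value quoted above follows directly from the reported counts (31 survivors with neither exposure, one of whom had an abnormal ejection fraction); a quick check:

```python
def npv(true_negatives, false_negatives):
    """Negative predictive value: among those 'screening negative'
    (here, no anthracycline or total-body-irradiation exposure), the
    fraction with a truly normal ejection fraction."""
    return true_negatives / (true_negatives + false_negatives)

# From the abstract: 30 of 31 unexposed survivors had a normal EF.
value = npv(30, 1)   # 30/31, i.e., about 96.8%
```

This matches the abstract's 96.7% to within rounding; the quoted interval would come from an exact binomial method.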
We examine the morphological variation of a Paleozoic pterineid during a time of relative ecological and taxonomic stability in the Middle Devonian Appalachian Basin in central and eastern New York. We discuss the taxonomic status of the Middle Devonian bivalve Actinopteria boydi (Conrad, 1842) and quantify the variability of its shell disk as well as the width and angle of the auricles and sulci of this otherwise character-poor bivalve species using geometric morphometric techniques employing Cartesian landmarks. We compare variants from three stratigraphic levels (Skaneateles, Ludlowville, and Moscow formations) and from different habitats characterized by lithofacies.
The phenotypic variation observed in our data does not amount to an overall directional shift in morphology, i.e., it constitutes reversible change of morphology within a single variable taxon. Our study finds small-scale variation in morphology that represents evidence for ecophenotypic variation through ~3–4 Myr. Differences in substrate coupled with water energy seem to impact this taxon’s morphology. Although no clearly separated groups can be observed, material from muddy facies develops variants with, on average, rounder and broader shell disks than are found in material from silty facies. This morphology could have increased the flow rate of water channeled over the posterior shell portion, thereby improving filtration rate, which is especially beneficial in environments with low water energy.
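Landmark-based geometric morphometrics of the kind described above starts from Procrustes superimposition, which removes translation, scale, and rotation before shapes are compared. A minimal 2D sketch using complex arithmetic (a generic illustration, not the authors' pipeline):

```python
def procrustes_align(ref, target):
    """Superimpose two 2D landmark configurations (lists of (x, y)):
    remove translation (centroid), scale (centroid size), and rotation
    (optimal phase), then return the aligned target landmarks and the
    residual Procrustes distance to the reference."""
    def normalize(pts):
        z = [complex(x, y) for x, y in pts]
        c = sum(z) / len(z)                        # centroid
        z = [w - c for w in z]                     # remove translation
        s = sum(abs(w) ** 2 for w in z) ** 0.5     # centroid size
        return [w / s for w in z]                  # remove scale
    a, b = normalize(ref), normalize(target)
    rot = sum(ai.conjugate() * bi for ai, bi in zip(a, b))
    b = [bi * abs(rot) / rot for bi in b]          # undo optimal rotation
    dist = sum(abs(ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return [(w.real, w.imag) for w in b], dist
```

Shape differences such as "rounder, broader shell disks" would then be quantified on the aligned coordinates, for example by ordination of the residuals.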
Wildlife populations provide harvestable meat to people and contribute to local food security. Throughout the year, and particularly at times of agricultural food shortages, wildlife and other wild foods play a critical role in supporting food security and enhancing local human nutrition. We explored the distribution of food security benefits of agricultural food production and a particular ecosystem provisioning service – wildlife harvest in the Makira Natural Park (MNP) of Madagascar – at community, household and individual levels. We found strong variation in wildlife consumption both among communities and among households and less variation among individuals within households. Mean household wildlife consumption in the target community was 10 kg per year ranging by approximately two orders of magnitude, with poorer and more food insecure households more reliant on wildlife for food. Meats (including wildlife) appeared to be evenly distributed within households, unaffected by age, sex, birth order and body weight, while other foods (including stew, rice and other staples) appeared to be allocated based on body mass. Reductions in wildlife consumption cause increased risk of food insecurity and specific nutritional deficiencies. The findings from our multilevel study suggest that disaggregated analysis that merges ecosystem services theory and the microeconomics of resource allocation allows for a more accurate valuation approach.
Few would dispute that the nature of work, and the workers who perform it, has evolved considerably in the 70 years since the founding of the Society for Industrial and Organizational Psychology (SIOP) as the American Psychological Association's (APA's) Division 14, focused on industrial, business, and organizational psychology. Yet, in many ways the populations sampled in industrial–organizational (I-O) psychology research have failed to keep pace with this evolution. Bergman and Jean (2016) highlight how I-O research samples underrepresent (relative to the labor market) low- or medium-skill workers, wage earners, and temporary workers, resulting in a body of science that is overly focused on salaried, professional managers and executives. Though these discrepancies in the nature of individuals’ work and employment are certainly present and problematic in organizational research, one issue that should not be overlooked is that of adequately representing nationality and culture in I-O research samples.
As organizations become more complex and dynamic, individuals' ability to learn from experience becomes more important. Recently, the concept of learning agility has attracted considerable attention from human resource professionals and consultants interested in selecting on and developing employees' ability to learn from experience. However, the academic community has been largely absent from this discussion of learning agility, and the concept remains ill defined and poorly measured. This article presents a constructive critique of the existing literature on learning agility, seeks to clarify the definition and conceptualization of the construct, and situates learning agility within a broader nomological network of related constructs. We conclude by discussing several important directions for future research on learning agility.
This article responds to and extends the commentaries offered in response to our focal article on learning agility. After summarizing the basic themes in the commentaries, we use this response to clarify points that were unclear in our original article and push back on certain points raised in a few of the responses. In particular, we reframe the rigor–relevance debate from an “either–or” to a “both–and” discussion, clarify the relationship between learning agility and ability to learn, explain how learning agility in organizations moves beyond cognition, and describe how exchanges such as the one we have collectively engaged in here are central to progressing the scientific study on learning agility and its effective use in practice.
In rural Minnesota, it is common for paramedics providing advanced life support (ALS) to rendezvous with ambulances providing only basic life support (BLS). These “intercepts” presumably allow for a higher level of care when patients have certain problems or need ALS interventions. The aim of this study was to review and understand the frequency of paramedic intercepts with regard to the actual care rendered and transport urgency (lights and sirens vs. none).
Methods:
All paramedic intercepts occurring between January 2003 and December 2007 for one multi-site emergency medical services (EMS) provider were reviewed for ALS interventions and treatments provided. In addition, the urgency of responses to the dispatch call or “intercept” and transport to a receiving facility were analyzed.
Results:
During the study period, 1,675 paramedic intercepts occurred and were reviewed. The ALS ambulances responded to the dispatch emergently (lights and sirens) in 97.5% of intercepts (1,633), but emergently transported only 24.2% of the patients (405). Paramedics performed no interventions above BLS levels in 11.6% (194) of the cases. Of the remaining 1,481 patients who received ALS interventions, 955 (64.4%) received no treatment or diagnostic testing other than electrocardiographic monitoring, intravenous access, or both.
Conclusions:
A significant discrepancy between emergent responses and actual ALS care rendered during intercept calls was demonstrated. Given the significant rate of EMS worker fatalities and transferable patient care costs, further study is needed to determine whether costs and safety are potentially improved by decreasing emergent responses. Future directions include developing or emulating Medical Priority Dispatch System triage protocols for advanced services providing intercepts. In addition, further study of patient outcomes between intercept and non-intercept cases is necessary.
The electron channeling contrast imaging (ECCI) technique was utilized to investigate atomic step morphologies and dislocation densities in 3C-SiC films grown by chemical vapor deposition (CVD) on Si (001) substrates. ECCI in this study was performed inside a commercial scanning electron microscope using an electron backscatter diffraction (EBSD) system equipped with forescatter diode detectors. This approach allowed simultaneous imaging of atomic steps, verified by atomic force microscopy, and dislocations at the film surface. EBSD analysis verified the orientation and monocrystalline quality of the 3C-SiC films. Dislocation densities in 3C-SiC films were measured locally using ECCI, with qualitative verification by x-ray diffraction. Differences in the dislocation density across a 50 mm diameter 3C-SiC film could be attributed to subtle variations during the carbonization process across the substrate surface.
We use the Newell Test as a basis for evaluating ACT-R as an effective architecture for cognitive engineering. Of the 12 functional criteria discussed by Anderson & Lebiere (A&L), we discuss the strengths and weaknesses of ACT-R on the six that we postulate are the most relevant to cognitive engineering.