Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data.
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
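The core calibration step described above, mapping a measured 14C age onto a calendar age via the curve, can be sketched as a simple interpolation. The curve values below are invented placeholders for illustration, not IntCal20 data, and real calibration also propagates measurement and curve uncertainties:

```python
import numpy as np

# Toy illustration of radiocarbon calibration: invert a calibration
# curve by linear interpolation. The curve values are invented
# placeholders, NOT IntCal20 data.
cal_age = np.array([1000.0, 1100.0, 1200.0, 1300.0, 1400.0])  # cal BP
c14_age = np.array([1050.0, 1120.0, 1260.0, 1310.0, 1440.0])  # 14C yr BP

def calibrate(measured_c14_age):
    """Map a 14C age to a calendar age (assumes a monotonic curve)."""
    return float(np.interp(measured_c14_age, c14_age, cal_age))

print(calibrate(1200.0))  # ≈ 1157.1 cal BP on this toy curve
```

A real calibration program (e.g. OxCal or the rcarbon package) intersects the full measurement probability distribution with the curve rather than a single point, which is why calibrated ages are reported as ranges.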
Autocrats confront a number of threats to their power, some from within the regime and others from foreign actors. To understand how these threats interact and affect autocratic survival, we build a model where an autocratic leader can be ousted by a domestic opposition and a foreign actor. We concentrate on the impact that foreign threats have on the stability of autocratic leadership and show that the presence of foreign threats increases the probability an autocrat retains power. Focusing on two cases, one where a foreign actor and the domestic opposition have aligned interests and one where their interests are misaligned, we elucidate two distinct mechanisms. First, when interests are aligned, autocrats are compelled to increase domestic security to alleviate international pressure. Second, when interests are misaligned, autocrats exploit the downstream threat of foreign intervention to deter domestic threats. We also show that autocrats have incentives to cultivate ideological views hostile to broader interests among politically influential domestic actors.
Intensified cover cropping practices are increasingly viewed as an herbicide resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover cropping tactics, including (1) facilitation of reduced herbicide inputs, and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter- and summer-annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha-1 in corn and 3,000 to 5,500 kg ha-1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover cropping tactics providing an additive weed suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (> 7.6 cm diameter) at the time of pre-plant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (> 10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits.
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
The COVID-19 pandemic has created a high demand for personal protective equipment, including disposable N95 masks. Given the need for mask reuse, we tested the feasibility of vaporized hydrogen peroxide (VHP), ultraviolet light (UV), and ethanol decontamination strategies, assessing N95 mask integrity and the ability to remove the infectious potential of SARS-CoV-2.
Disposable N95 masks, including medical grade (1860, 1870+) and industrial grade (8511) masks, were treated by vaporized hydrogen peroxide (VHP), ultraviolet light (UV), and ethanol decontamination. Mask degradation was tested using quantitative respirator fit testing. Pooled clinical samples of SARS-CoV-2 were applied to mask samples, treated, and then either sent immediately for real-time reverse transcriptase–polymerase chain reaction (RT-PCR) or incubated with Vero E6 cells to assess for virucidal effect.
Both ethanol and UV decontamination showed functional degradation to different degrees, while VHP treatment showed no significant change after 2 treatments. We also report a single SARS-CoV-2 virucidal experiment using Vero E6 cell infection in which only ethanol treatment eliminated detectable SARS-CoV-2 RNA.
We hope our data will guide further research for evidence-based decisions on disposable N95 mask reuse and help protect caregivers from SARS-CoV-2 and other pathogens.
OBJECTIVES/GOALS: The goal of this project was to assess the scientific impact of Miami CTSI’s Mentored Career Development (KL2) Program using bibliometric tools and network visualization, in addition to the traditional metrics used, to provide a comprehensive evaluation. METHODS/STUDY POPULATION: Scholarly productivity of KL2 scholars was tracked using REDCap. For bibliometric data analysis and visualization, publications were queried using iCite (NIH Office of Portfolio Analysis) and the Web of Science database. A total of 173 publications produced by eight KL2 scholars from 2013 to 2018 were analyzed and categorized into pre-award, during-award, and post-award periods. iCite was used to assess scientific influence and translation. Scientific networks and collaboration were visualized using VOSviewer (Centre for Science and Technology Studies, Leiden University). CTSA Common Metrics were tracked using the Results Based Accountability framework. RESULTS/ANTICIPATED RESULTS: Albeit of modest size, the Miami CTSI’s KL2 Program had significant scientific productivity and impact in its first five years. Our KL2 scholars’ publications were cited twice as frequently as other papers in their fields. Further, 48% of publications post KL2 award were above the NIH 50th percentile and had higher citation impact compared with the average NIH-funded paper; 11% were in the top 10% of the NIH citation ranking. In contrast, only 20% of the publications pre-KL2 award were above the NIH 50th percentile. The program also promoted research collaboration; network visualizations indicate larger co-authorship and organization networks of KL2 scholars post-award. DISCUSSION/SIGNIFICANCE OF IMPACT: Bibliometric and data visualization approaches helped us better identify trends and gauge the effectiveness of the KL2 program. These findings provided useful insight into the scientific influence and impact of our scholars’ work.
Fluoroquinolones (FQs) and extended-spectrum cephalosporins (ESCs) are associated with higher risk of Clostridioides difficile infection (CDI). Decreasing the unnecessary use of FQs and ESCs is a goal of antimicrobial stewardship. Understanding how prescribers perceive the risks and benefits of FQs and ESCs is needed.
We conducted interviews with clinicians from 4 hospitals. Interviews elicited respondent perceptions about the risk of ESCs, FQs, and CDI. Interviews were audio recorded, transcribed, and analyzed using a flexible coding approach.
Interviews were conducted with 64 respondents (38 physicians, 7 nurses, 6 advanced practice providers, and 13 pharmacists). ESCs and FQs were perceived to have many benefits, including infrequent dosing, breadth of coverage, and greater patient adherence after hospital discharge. Prescribers stated that it was easy to make decisions about these drugs, so they were especially appealing to use in the context of time pressures. They described having difficulty discontinuing these drugs when prescribed by others due to inertia and fear. Prescribers were skeptical about targeting specific drugs as a stewardship approach and felt that the risk of a negative outcome from undertreatment of a suspected bacterial infection was a higher priority than the prevention of CDI.
Prescribers in this study perceived many advantages to using ESCs and FQs, especially under conditions of time pressure and uncertainty. In making decisions about these drugs, prescribers balance risk and benefit, and they believed that the risk of CDI was acceptable compared with the risk of undertreatment.
Individuals with tardive dyskinesia (TD) who completed a long-term study (KINECT 3 or KINECT 4) of valbenazine (40 or 80 mg/day, once-daily for up to 48 weeks followed by 4-week washout) were enrolled in a subsequent study (NCT02736955) that was primarily designed to further evaluate the long-term safety of valbenazine.
Participants were initiated at 40 mg/day (following prior valbenazine washout). At week 4, dosing was escalated to 80 mg/day based on tolerability and clinical assessment of TD; reduction to 40 mg/day was allowed for tolerability. The study was planned for 72 weeks or until termination due to commercial availability of valbenazine. Assessments included the Clinical Global Impression of Severity-TD (CGIS-TD), Patient Satisfaction Questionnaire (PSQ), and treatment-emergent adverse events (TEAEs).
At study termination, 85.7% (138/161) of participants were still active. Four participants had reached week 60, and none reached week 72. The percentage of participants with a CGIS-TD score ≤2 (normal/not ill or borderline ill) increased from study baseline (14.5% [23/159]) to week 48 (64.3% [36/56]). At baseline, 98.8% (158/160) of participants rated their prior valbenazine experience with a PSQ score ≤2 (very satisfied or somewhat satisfied). At week 48, 98.2% (55/56) remained satisfied. Before week 4 (dose escalation), 9.4% of participants had ≥1 TEAE. After week 4, the TEAE incidence was 49.0%. No TEAE occurred in ≥5% of participants during treatment (before or after week 4).
Valbenazine was well tolerated, and persistent improvements in TD were found in adults who received once-daily treatment for >1 year.
Introduction: Digital distraction is being integrated into pediatric pain care, but its efficacy is currently unknown. We conducted a systematic review to determine the effect of digital technology distraction on pain and distress for children experiencing acutely painful conditions or medical procedures. Methods: We searched eight online databases (MEDLINE, Embase, Cochrane Library, CINAHL, PsycINFO, IEEE Xplore, Ei Compendex, Web of Science), grey literature sources, scanned reference lists, and contacted experts for quantitative studies where digital technologies were used as distraction for acutely painful conditions or procedures in children. Study selection was performed by two independent reviewers with consensus. One reviewer extracted relevant study data and another verified it for accuracy. Appraisal of risk of bias within studies and the certainty of the body of evidence were performed independently in duplicate, with the final appraisal determined by consensus. The primary outcomes of interest were child pain and distress. Results: Of 3247 unique records identified by the search, we included 106 studies (n = 7820) that reported on digital technology distractors (e.g., virtual reality; videogames) used during common procedures (e.g., venipuncture, minor dental procedures, burn treatments). We located no studies reporting on painful conditions. For painful procedures, digital distraction resulted in a modest but clinically important reduction in self-reported pain (SMD -0.48, 95% CI -0.66 to -0.29, 46 RCTs, n = 3200), observer-reported pain (SMD -0.68, 95% CI -0.91 to -0.45, 17 RCTs, n = 1199), behavioural pain (SMD -0.57, 95% CI -0.94 to -0.19, 19 RCTs, n = 1173), self-reported distress (SMD -0.49, 95% CI -0.70 to -0.27, 19 RCTs, n = 1818), observer-reported distress (SMD -0.47, 95% CI -0.77 to -0.17, 10 RCTs, n = 826), and behavioural distress (SMD -0.35, 95% CI -0.59 to -0.12, 17 RCTs, n = 1264) compared to usual care. 
Few studies directly compared different distractors or provided subgroup data to inform applicability. Conclusion: Digital distraction provides modest pain and distress reduction for children undergoing painful procedures; its superiority over non-digital distractors is not established. Healthcare providers and parents should strongly consider using distractions as a pain-reduction strategy for children and teens during common painful procedures (e.g., needle pokes, dental fillings). Context, child preference, and availability should inform the choice of distractor.
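The pooled SMDs reported above are the product of inverse-variance weighting of per-study effect sizes. A minimal fixed-effect sketch with invented (smd, se) pairs, not the review's data (the review itself would use a random-effects model across heterogeneous studies):

```python
import math

# Sketch of fixed-effect inverse-variance pooling of standardized mean
# differences (SMDs). The (smd, se) pairs are invented for illustration.
studies = [(-0.50, 0.10), (-0.40, 0.15), (-0.55, 0.20)]

weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
print(round(pooled, 3), [round(x, 3) for x in ci])
```

Each study is weighted by the inverse of its variance, so precise studies dominate the pooled estimate; the pooled standard error shrinks as the weights accumulate.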
Introduction: eCTAS is a real time electronic triage decision-support tool designed to improve patient safety and quality of care by standardizing the application of the Canadian Triage and Acuity Scale (CTAS). The tool dynamically calculates a recommended CTAS score based on the presenting complaint, vital signs and selected clinical modifiers. The primary objective was to assess consistency of CTAS score distributions across 35 emergency departments (EDs) by 16 presenting complaints pre and post eCTAS implementation. Methods: This retrospective cohort study used population-based administrative data from January 2016 to December 2018 from all hospital EDs in Ontario that had implemented eCTAS with at least 9 months of data. Following a 3-month stabilization period, we compared data for 6 months post-eCTAS implementation to the same 6-month period the previous year (pre-implementation) to account for potential seasonal variation, patient volume and case-mix. We included triage encounters of adult (≥18 years) patients if they had one of 16 pre-specified high-volume, presenting complaints. A paired-samples t-test was used to determine consistency by estimating the absolute difference in CTAS distribution for each presenting complaint, by each hospital, pre and post eCTAS implementation, compared to the overall average of the 35 EDs. Results: There were 183,231 triage encounters in the pre-eCTAS cohort and 179,983 in the post-eCTAS cohort from 35 EDs across the province. Triage scores were more consistent with the overall average after eCTAS implementation in 6 (37.5%) presenting complaints: chest pain (cardiac features) (p < 0.001), extremity weakness/symptoms of cerebrovascular accident (p < 0.001), fever (p < 0.001), shortness of breath (p < 0.001), syncope (p = 0.02), and hyperglycemia (p = 0.03). 
Triage consistency was similar pre and post eCTAS implementation for the presenting complaints of altered level of consciousness, anxiety/situational crisis, confusion, depression/suicidal/deliberate self-harm, general weakness, head injury, palpitations, seizure, substance misuse/intoxication or vertigo. Conclusion: A standardized, electronic approach to performing triage assessments increased consistency in CTAS scores across many, but not all, high-volume CEDIS complaints. This does not reflect triage accuracy, as there are no known benchmarks for triage accuracy. Improvements in consistency were greatest for sentinel presenting complaints with a minimum allowable CTAS score.
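The paired-samples t-test described in the methods, comparing each hospital's pre- and post-implementation deviation from the 35-ED average, can be sketched from first principles. The deviation values below are invented for illustration:

```python
import math
import statistics

# Minimal paired-samples t-test: each hospital contributes a pre- and
# post-implementation absolute deviation from the overall ED average.
# The numbers are invented, not study data.
pre = [0.12, 0.09, 0.15, 0.11, 0.14, 0.10, 0.13, 0.08]
post = [0.07, 0.08, 0.10, 0.09, 0.08, 0.07, 0.09, 0.06]

diffs = [a - b for a, b in zip(pre, post)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)        # sample SD of the differences
t = mean_d / (sd_d / math.sqrt(n))    # t statistic with n - 1 df
print(round(t, 2))
```

A positive t here indicates smaller deviations (greater consistency) post-implementation; the p-value would come from the t distribution with n - 1 degrees of freedom.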
Background: Since January 1, 2016, 2,358 people have died from opioid poisoning in Alberta. Buprenorphine/naloxone (bup/nal) is the recommended first-line treatment for opioid use disorder (OUD), and this treatment can be initiated in emergency departments and urgent care centres (EDs). Aim Statement: This project aims to spread a quality improvement intervention to all 107 adult EDs in Alberta by March 31, 2020. The intervention supports clinicians to initiate bup/nal for eligible individuals and provide rapid referrals to OUD treatment clinics. Measures & Design: Local ED teams were identified (administrators, clinical nurse educators, physicians and, where available, pharmacists and social workers). Local teams were supported by a provincial project team (project manager, consultant, and five physician leads) through a multi-faceted implementation process using provincial order sets, clinician education products, and patient-facing information. We used administrative ED and pharmacy data to track the number of visits where bup/nal was given in the ED, and whether discharged patients continued to fill any opioid agonist treatment (OAT) prescription 30 days after their index ED visit. OUD clinics reported the number of referrals received from EDs and the number attending their first appointment. Patient safety event reports were tracked to identify any unintended negative impacts. Evaluation/Results: We report data from May 15, 2018 (program start) to September 30, 2019. Forty-nine EDs (46% of 107) implemented the program and 22 (45% of 49) reported evaluation data. There were 5385 opioid-related visits to reporting ED sites after program adoption. Bup/nal was given during 832 ED visits (663 unique patients): 7 visits in the 1st quarter the program operated, 55 in the 2nd, 74 in the 3rd, 143 in the 4th, 294 in the 5th, and 255 in the 6th.
Among 505 unique discharged patients with 30-day follow-up data available, 319 (63%) continued to fill an OAT prescription after receiving bup/nal in the ED. Sixteen (70%) of 23 community clinics provided data. EDs referred patients to these clinics 440 times, and 236 referrals (54%) attended their first follow-up appointment. Available data may under-report program impact. Five patient safety events have been reported, with no harm or minimal harm to the patient. Discussion/Impact: Results demonstrate effective spread and uptake of a standardized provincial ED-based early medical intervention program for patients who live with OUD.
The curves recommended for calibrating radiocarbon (14C) dates into absolute dates have been updated. For calibrating atmospheric samples from the Northern Hemisphere, the new curve is called IntCal20. This is accompanied by associated curves SHCal20 for the Southern Hemisphere, and Marine20 for marine samples. In this “companion article” we discuss advances and developments that have led to improvements in the updated curves and highlight some issues of relevance for the general readership. In particular, the dendrochronologically based part of the curve has seen a significant increase in data, with single-year resolution for certain time ranges, extending back to 13,910 calBP. Beyond the tree rings, the new curve is based upon an updated combination of marine corals, speleothems, macrofossils, and varved sediments and now reaches back to 55,000 calBP. Alongside these data advances, we have developed a new, bespoke statistical curve construction methodology to allow better incorporation of the diverse constituent records and produce a more robust curve with uncertainties. Combined, these data and methodological advances offer the potential for significant new insight into our past. We discuss some implications for the user, such as the dating of the Santorini eruption and also some consequences of the new curve for Paleolithic archaeology.
Increased impulsivity is a diagnostic feature of mania in bipolar disorder (BD). However, it is unclear whether increased impulsivity is also a trait feature of BD and therefore present in remission. Trait impulsivity can also be construed as a personality dimension, but the relationship between personality and impulsivity in BD has not been explored. The aim of this study was to examine the relationship of impulsivity to clinical status and personality characteristics in patients with BD.
We measured impulsivity using the Barratt Impulsiveness Scale (BIS-11) and personality dimensions using the Eysenck Personality Questionnaire (EPQ) in 106 BD patients and demographically matched healthy volunteers. Clinical symptoms were assessed in all participants using the Clinical Global Impressions Scale, the Montgomery-Asberg Depression Rating Scale and the Young Mania Rating Scale. Based on their clinical status, patients were divided into remitted (n = 36), subsyndromal (n = 25) and syndromal (n = 45) groups.
There was no difference in BIS-11 and EPQ scores between remitted patients and healthy subjects. Impulsivity, Neuroticism and Psychoticism scores were increased in subsyndromal and syndromal patients. Within the BD group, total BIS-11 score was predicted mainly by symptom severity, followed by Psychoticism and Neuroticism scores.
Increased impulsivity may not be a trait feature of BD. Symptom severity is the most significant determinant of impulsivity measures even in subsyndromal patients.
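The kind of analysis reported above, predicting total BIS-11 from symptom severity and personality scores, is a multiple regression. A self-contained sketch on synthetic data (all values invented; the planted coefficients are illustrative, not study estimates):

```python
import numpy as np

# Sketch of a multiple regression predicting total BIS-11 impulsivity
# from symptom severity, Psychoticism and Neuroticism. All data are
# synthetic; the planted coefficients are illustrative only.
rng = np.random.default_rng(0)
n = 106  # sample size matching the study's patient group
severity = rng.normal(0, 1, n)
psychoticism = rng.normal(0, 1, n)
neuroticism = rng.normal(0, 1, n)
bis_total = (60 + 5.0 * severity + 2.0 * psychoticism
             + 1.5 * neuroticism + rng.normal(0, 3, n))

# Ordinary least squares via the normal equations (lstsq).
X = np.column_stack([np.ones(n), severity, psychoticism, neuroticism])
coef, *_ = np.linalg.lstsq(X, bis_total, rcond=None)
print(np.round(coef, 2))  # intercept and the three slopes
```

With predictors on a common (standardized) scale, the relative slope magnitudes convey the ordering the abstract describes: severity first, then Psychoticism and Neuroticism.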
Before October 2012 there was no service level agreement for psychiatry cover at Whiston Hospital, an acute trust in the UK. The Crisis team would visit on a goodwill basis to assess patients. This changed when a Liaison Psychiatry (LP) service was commissioned to provide 24-hour cover, Monday to Sunday, for the adult Emergency Department (ED).
To quantify waiting times to be assessed by psychiatry, comparing the new LP service (intervention group) with its predecessor (control). The null hypothesis was that the waiting times for the control and intervention groups were the same.
The authors prospectively collected data on all referrals received by the LP service in its first three months of operation (n = 305) and retrospectively collected data on a random sample of 50 patients referred from the ED in the same months of 2011 (control).
The median time from referral to psychiatric assessment in the control group was 162.5 minutes [IQR 130–330]; the mean time was 246.16 minutes [95% CI 180 to 312]. Following the introduction of the LP service, the median time from referral to psychiatric assessment was 30 minutes [IQR 15–90]; the mean time was 79.63 minutes [95% CI 65 to 93]. When the two samples were compared using an independent t-test, they were significantly different (p < 0.002).
The new LP service has decreased the median wait for a psychiatry assessment by 132 minutes. The team currently sees 82% of referrals within 60 minutes. This improves patient safety and encourages appropriate and timely discharge.
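As a rough consistency check on the reported comparison, a Welch-style t statistic can be reconstructed from nothing but the two means and the standard errors implied by their 95% CIs (half-width divided by 1.96). This is a back-of-envelope sketch, not a re-analysis of the authors' data:

```python
import math

# Welch-style t statistic reconstructed from the summary statistics in
# the abstract: means and 95% CIs for each group. SE is approximated as
# CI half-width / 1.96. A rough sketch, not the authors' analysis.
mean_control, ci_control = 246.16, (180.0, 312.0)
mean_lp, ci_lp = 79.63, (65.0, 93.0)

se_control = (ci_control[1] - ci_control[0]) / 2 / 1.96
se_lp = (ci_lp[1] - ci_lp[0]) / 2 / 1.96

t = (mean_control - mean_lp) / math.sqrt(se_control**2 + se_lp**2)
print(round(t, 2))  # ≈ 4.84
```

A t statistic of this size is comfortably consistent with the reported p < 0.002.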
The Department of Health in the UK wants the National Health Service to make £20 billion worth of efficiency savings by 2015 for reinvestment.
In the UK, general hospitals use paper records that are then scanned to create electronic records, while psychiatric hospitals require information to be typed directly into their electronic records; the two record systems are not accessible to each other.
Liaison psychiatry assessments therefore require a written entry in the medical notes and a second entry, including a full psychiatric history, typed into the psychiatric electronic patient record.
This duplication of typed information was consuming a considerable amount of the team's time and resources, which could otherwise have been spent with patients.
To identify how much time staff spend typing information into the psychiatric electronic patient records.
For the preceding three months, we electronically measured the time spent typing information into the electronic records after every liaison psychiatry assessment. We then calculated the weekly average.
On average, about 36 to 40 hours were spent every week typing information into the electronic records.
Liaison Psychiatry should dispense with the requirement for information to be duplicated in the electronic patient records and should instead scan the written entry made in the medical notes.
This should lead to a saving of about £50,000 a year, enough to employ an additional member of staff.
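The savings estimate above follows from simple arithmetic on the audit figures. The hourly staff cost below is an assumption chosen for illustration (it is not stated in the audit), but it shows how the reported weekly typing burden scales to an annual figure of roughly £50,000:

```python
# Rough cost sketch behind the savings estimate: weekly typing hours
# converted to an annual staff cost. The hourly rate is an ASSUMED
# fully loaded cost for illustration, not a figure from the audit.
hours_per_week = 38    # midpoint of the reported 36-40 hours
hourly_cost = 25.0     # assumed cost per staff hour, GBP
weeks_per_year = 52

annual_cost = hours_per_week * hourly_cost * weeks_per_year
print(annual_cost)  # 49400.0
```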
Neurocognitive impairments robustly predict functional outcome. However, heterogeneity in neurocognition is common within diagnostic groups, and data-driven analyses reveal homogeneous neurocognitive subgroups cutting across diagnostic boundaries.
To determine whether data-driven neurocognitive subgroups of young people with emerging mental disorders are associated with 3-year functional course.
Model-based cluster analysis was applied to neurocognitive test scores across nine domains from 629 young people accessing mental health clinics. Cluster groups were compared on demographic, clinical and substance-use measures. Mixed-effects models explored associations between cluster-group membership and socio-occupational functioning (using the Social and Occupational Functioning Assessment Scale) over 3 years, adjusted for gender, premorbid IQ, level of education, depressive, positive, negative and manic symptoms, and diagnosis of a primary psychotic disorder.
Cluster analysis of neurocognitive test scores derived three subgroups described as ‘normal range’ (n = 243, 38.6%), ‘intermediate impairment’ (n = 252, 40.1%), and ‘global impairment’ (n = 134, 21.3%). The major mental disorder categories (depressive, anxiety, bipolar, psychotic and other) were represented in each neurocognitive subgroup. The global impairment subgroup had lower functioning for 3 years of follow-up; however, neither the global impairment (B = 0.26, 95% CI −0.67 to 1.20; P = 0.581) nor the intermediate impairment (B = 0.46, 95% CI −0.26 to 1.19; P = 0.211) subgroup differed from the normal range subgroup in its rate of change in functioning over time.
Neurocognitive impairment may follow a continuum of severity across the major syndrome-based mental disorders, with data-driven neurocognitive subgroups predictive of functional course. Of note, the global impairment subgroup had longstanding functional impairment despite continuing engagement with clinical services.
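The data-driven subgrouping idea above can be sketched end to end. The study used model-based (mixture) clustering; as a self-contained stand-in, the sketch below runs plain k-means on synthetic z-scored "neurocognitive" profiles with three planted subgroups sized to match the reported clusters. All data are synthetic:

```python
import numpy as np

# Toy stand-in for data-driven neurocognitive subgrouping. The study
# used model-based (mixture) clustering; here plain k-means is used as
# a self-contained substitute on synthetic profiles.
rng = np.random.default_rng(1)
centers = [0.0, -1.0, -2.5]     # planted mean z-score per subgroup
sizes = [243, 252, 134]         # subgroup sizes from the abstract
X = np.vstack([rng.normal(c, 0.4, size=(n, 9))  # 9 cognitive domains
               for c, n in zip(centers, sizes)])

# Deterministic init: one seed point drawn from each planted block.
cents = X[[0, 300, 600]].copy()
for _ in range(25):             # Lloyd's algorithm
    d = ((X[:, None, :] - cents[None]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)
    cents = np.array([X[labels == j].mean(axis=0) for j in range(3)])

print(np.round(np.sort(cents.mean(axis=1)), 1))  # ~[-2.5, -1.0, 0.0]
```

A mixture model (as in the study) additionally yields per-person membership probabilities and allows clusters of different shapes, which is why it is preferred for this kind of transdiagnostic subgrouping.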
Duchenne muscular dystrophy is associated with progressive cardiorespiratory failure, including left ventricular dysfunction.
Methods and Results:
Males with probable or definite diagnosis of Duchenne muscular dystrophy, diagnosed between 1 January, 1982 and 31 December, 2011, were identified from the Muscular Dystrophy Surveillance Tracking and Research Network database. Two non-mutually exclusive groups were created: patients with ≥2 echocardiograms and non-invasive positive pressure ventilation-compliant patients with ≥1 recorded ejection fraction. Quantitative left ventricular dysfunction was defined as an ejection fraction <55%. Qualitative dysfunction was defined as mild, moderate, or severe. Progression of quantitative left ventricular dysfunction was modelled as a continuous time-varying outcome. Change in qualitative left ventricle function was assessed by the percentage of patients within each category at each age. Forty-one percent (n = 403) had ≥2 ejection fractions containing 998 qualitative assessments with a mean age at first echo of 10.8 ± 4.6 years, with an average first ejection fraction of 63.1 ± 12.6%. Mean age at first echo with an ejection fraction <55 was 15.2 ± 3.9 years. Thirty-five percent (140/403) were non-invasive positive pressure ventilation-compliant and had ejection fraction information. The estimated rate of decline in ejection fraction from first ejection fraction was 1.6% per year and initiation of non-invasive positive pressure ventilation did not change this rate.
In our cohort, we observed that left ventricle function in patients with Duchenne muscular dystrophy declined over time, independent of non-invasive positive pressure ventilation use. Future studies are needed to examine the impact of respiratory support on cardiac function.
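The longitudinal estimate above (an average EF decline of 1.6% per year) can be illustrated with a simplified pooled least-squares fit on synthetic trajectories; the study itself modelled EF as a continuous time-varying outcome, which a mixed-effects model handles more rigorously. The patients and values below are invented, with the true slope planted at -1.6 to match the reported figure:

```python
import numpy as np

# Simplified sketch of estimating a yearly decline in ejection fraction
# (EF) by pooled least squares on synthetic patient trajectories. The
# planted slope of -1.6 %/yr matches the reported estimate; all data
# are invented. A mixed-effects model would be the rigorous choice.
rng = np.random.default_rng(2)
ages, efs = [], []
for _ in range(100):                      # synthetic patients
    start = rng.uniform(8, 14)            # age at first echo
    times = start + np.arange(0, 8, 2.0)  # echoes every 2 years
    ef0 = rng.normal(63, 5)               # baseline EF
    ef = ef0 - 1.6 * (times - start) + rng.normal(0, 2, times.size)
    ages.extend(times - start)            # years since first echo
    efs.extend(ef)

A = np.column_stack([np.ones(len(ages)), ages])
(intercept, slope), *_ = np.linalg.lstsq(A, np.array(efs), rcond=None)
print(round(float(slope), 2))             # ≈ -1.6 (% EF per year)
```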
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
All personnel who entered the SCDU and were required to measure their temperatures and complete a symptom questionnaire twice daily were eligible.
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3 vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or a response to LF antigens.
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
Myelomeningocele (MMC), the most common congenital abnormality of the central nervous system (CNS), occurs due to failure of the neural tube to close in the first 4 weeks after conception and is characterized by a fluid-filled sac containing an exposed spinal cord and nerves. Myeloschisis is similar to MMC except that a membranous sac is not present and the defect is wider (Figure 44.1). The consequence of an open neural tube defect is abnormal development of the CNS. The neural elements become damaged from exposure to the toxic effects of amniotic fluid, leading to associated long-term morbidity and mortality. Cerebrospinal fluid (CSF) leaks out through the MMC and as a consequence the hindbrain herniates into the cervical spinal canal and blocks CSF circulation, leading to hydrocephalus and brain damage. Although 75% of individuals affected with spina bifida survive to adulthood, the one-year survival rate for infants is 88–96% [1, 2]. More than 80% of affected individuals require a ventriculo-peritoneal shunt to divert CSF in order to decompress the associated hydrocephalus, and this is dependent upon lesion level, with the need being greater for those with higher level lesions. The need for a shunt is associated with complications including infection, obstruction, displacement, and shunt revisions [3, 4]. More than 75% of patients have radiographic evidence of the Chiari II malformation (hindbrain herniation, brain stem abnormalities, and a small posterior fossa) that can manifest clinically as apnea, swallowing difficulties, quadriparesis, and coordination difficulties in up to one-third of affected individuals [5–7]. Functional motor levels correlate with lesion level in approximately 39% of patients, but in over half the functional level correlates to anatomic lesions two levels higher.
Wheelchair use correlates with lesion level; 90% of patients with a thoracic lesion use a wheelchair while 45% with a lumbar lesion and 17% with a sacral lesion use a wheelchair. Bladder and bowel incontinence are also associated with MMC, necessitating the use of bowel and bladder regimens including clean intermittent catheterization and enemas. Urologic complications include recurrent urinary tract infections, vesicoureteral reflux, and upper urinary tract dilation. Additionally, the overwhelming majority of infants will require intervention for a foot deformity. For those living long term with spina bifida, up to one-third of adults require daily assistance and a high rate of unexpected death has been noted [11, 12].