To design a meditation protocol and test its feasibility, acceptability and efficacy in conjunction with yoga training (YT) for persons with schizophrenia (SZ).
The meditation protocol consisted of Anapana (observing normal respiration) and Yoga Nidra (supine, restful awareness). In a single-blind randomised controlled trial, medicated and clinically stable outpatients diagnosed with SZ were randomised to receive treatment as usual (TAU), TAU augmented with YT or TAU augmented with meditation and yoga training (MYT) for 3 weeks (N = 145). Acceptability, clinical, social and cognitive functions were assessed at 3 weeks and 3 months post-randomisation using within-group and between-group analyses with repeated measures multivariate tests.
No group-wise differences in compliance, study discontinuation, major/serious side effects or adverse events were noted. For six assessed clinical variables, changes were in the desired direction and effect sizes were greater in the MYT group than in the TAU group at both time points. Changes in social function variables were greater at 3 months than at 3 weeks. Nominally significant improvements in individual cognitive domains were noted in all groups at both time points. All effect sizes were in the small to medium range.
MYT is feasible and acceptable and shows modest benefits for persons with SZ. MYT can also improve quality of life and clinical symptoms. Larger studies of longer duration are warranted.
Cotard’s syndrome is a nihilistic delusion in which the individual believes they are dead, partly dead, or replaced by an animal. The delusion that one’s body has been replaced by a purely inanimate, azooic (but physical) entity, such as a robot or a droid, has not hitherto been described.
Case study: A 60-year-old, right-handed woman with a past history of schizophrenia presented with complaints of depression, irritability, and anger. When confronted with commitment papers signed by her father, she denied their truthfulness, insisting that he had been replaced by an imposter. This belief persisted unabated despite treatment with 20 mg of haloperidol per day. Over time, she expressed the belief that she had been replaced by another person, whom she refused to identify. The following day she refused all food and water, proclaiming that she had died and been replaced by a machine, revealing: “I am not her. I am a robot.” Soon thereafter she developed tremulousness, stiffness, and rigidity. After haloperidol was decreased and benztropine started, these parkinsonian symptoms subsided, but her delusions persisted.
Abnormalities in physical examination: General: decreased blink frequency. Neurologic examination: Mental status examination: bradyphrenia, hypoverbal, blunted affect. Oriented ×2. Motor examination: bradykinetic, cogwheel rigidity in both upper extremities. Gait examination: slow shuffling gait, reduced bilateral arm swing. Cerebellar examination: resting tremor in both upper extremities at 3 cycles per second. Other: EEG: focal sharp transients in the left temporal region. MRI with and without contrast: normal. Toxicological, metabolic, endocrine screening: normal.
This case illustrates the sequential presentation of three delusions of misidentification. Upon presentation, she exhibited Capgras syndrome, the delusional belief that a familiar person has been replaced by a double. The nidus for this may have been the discovery that her father had signed her commitment papers. This was followed by the belief that she was a double of herself, the syndrome of Reverse Subjective Doubles. Finally, she manifested Cotard’s syndrome in a previously undescribed manner, believing she had died and become a robot. Cotard’s and Capgras syndromes are known to present sequentially rather than concurrently, whereas this patient ultimately presented with all three syndromes concurrently. Drug-induced parkinsonism may have made the patient subjectively feel stiff, which she interpreted as being rigid like a robot. She was bradykinetic, did not eat or drink, and had rigidity, suggesting that these were somatic manifestations of her underlying delusion of being a robot or, alternatively, may have been the somatic nidus for the delusion. Those who present with Cotard’s syndrome warrant evaluation for underlying medical conditions that may serve as a substrate for the delusion.
The Pediatric and Congenital Electrophysiology Society (PACES) is a global organisation committed to the care of children and adults with CHD and arrhythmias.
To evaluate global needs and potential inequities as they relate to cardiac implantable electronic devices.
ARROW (Assessment of Rhythm Resources arOund the World) is an online survey about cardiac implantable electronic devices, sent electronically to physicians within the field of Cardiology, Pediatric Cardiology, Electrophysiology and Pediatric Electrophysiology.
ARROW received responses from 42 physicians in 28 countries, 50% from low-/middle-income regions. The main differences between low-/middle- and high-income regions included the availability of expertise in paediatric electrophysiology (50% versus 93%, p < 0.05) and the possibility of performing invasive procedures (35% versus 93%, p < 0.005). Implantation of devices in low-income areas relies significantly on the patient’s own resources (71%). Follow-up of devices is in the hands of a paediatric cardiologist/electrophysiologist in higher-resource centres (93% versus 50%, p < 0.05).
The ARROW survey represents an initial assessment of geographical disparities in the field of Pediatric Electrophysiology. The next step is to extend this “state of the art” assessment to other aspects of the specialty. The relevance of collecting these data before the World Congress of Pediatric Cardiology and Cardiac Surgery (WCPCCS) in 2023 in Washington DC was emphasised, in order to share the resulting information with the international community and set a plan of action to assist the development of arrhythmia services for children in developing regions of the world.
Despite evidence favoring perioperative antibiotic prophylaxis (ABP) use in patients undergoing craniotomy to reduce rates of surgical site infections (SSIs), standardized protocols are lacking. We describe demographic characteristics, risk factors, and ABP choice in patients with craniotomy complicated with SSI.
Retrospective case series from January 1, 2017, through December 31, 2020.
Tertiary-care referral center.
Adults who underwent craniotomy and were diagnosed with an SSI.
Logistic regression to estimate odds ratios and 95% confidence intervals to identify factors associated with SSIs.
In total, 5,328 patients undergoing craniotomy were identified during the study period; 59 (1.1%) suffered an SSI. Compared with non-SSI cases, patients with SSI had a significantly higher frequency of emergency procedures: 13.5% versus 5.8% (odds ratio [OR], 2.52; 95% confidence interval [CI], 1.10–5.06; P = .031). Patients with SSI had a higher rate of a dirty (5.1% vs 0.9%) and a lower rate of a clean-contaminated (3.3% vs 14.5%) wound class than those without infection (P = .002). Nearly all patients received ABP before craniotomy (98.3% in the SSI group vs 99.6% in the non-SSI group; P = .10). Combination of vancomycin and cefazolin as dual therapy was more prevalent among patients without infection (n = 1,761, 34.1%) than among those with SSI (n = 4, 6.8%) (P < .001), and was associated with decreased odds of SSI (OR, 0.17; 95% CI, 0.005–0.42; P ≤ .001).
SSIs were seen more frequently after emergent neurosurgical procedures and in cases with a dirty wound classification. The combination of prophylactic cefazolin and vancomycin was associated with decreased risk of SSI.
To assess the rate and factors associated with healthcare personnel (HCP) testing positive for severe acute respiratory coronavirus virus 2 (SARS-CoV-2) after an occupational exposure.
Retrospective cohort study.
Academic medical center with sites in Minnesota, Wisconsin, Arizona, and Florida.
HCP with a high or medium risk occupational exposure to a patient or other HCP with SARS-CoV-2.
We reviewed the records of HCP with significant occupational exposures from March 20, 2020, through December 31, 2020. We then performed regression analysis to assess the impact of demographic and occupational variables on the likelihood of testing positive for SARS-CoV-2.
In total, 2,253 confirmed occupational exposures occurred during the study period. Employees were the source for 57.1% of exposures. Overall, 101 HCP (4.5%) tested positive in the postexposure period. Of these, 80 had employee sources of exposure and 21 had patient sources of exposure. The postexposure infection rate was 6.2% when employees were the source, compared to 2.2% with patient sources. In a multivariate analysis, occupational exposure from an employee source had a higher risk of testing positive compared to a patient source (odds ratio [OR], 3.22; 95% confidence interval [CI], 1.72–6.04). Sex, age, high-risk exposure, and HCP role were not associated with an increased risk of testing positive.
The risk of acquiring coronavirus disease 2019 (COVID-19) following a significant occupational exposure has remained relatively low, even in the prevaccination era. Exposure to an infectious coworker carries a higher risk than exposure to a patient. Continued vigilance and precautions remain necessary in healthcare settings.
Health technology assessment (HTA) agencies are considering adopting a lifecycle approach to assessments to address uncertainties in the evidence base at launch and to revisit the clinical and economic value of therapies in a dynamic clinical landscape. For reassessments of therapies post launch, HTA agencies are looking to real-world evidence (RWE) to enhance the clinical and economic evidence base, though challenges and concerns in using RWE in decision-making exists. Stakeholders are embarking on demonstration projects to address the challenges and concerns and to further define when and how RWE can be used in HTA decision making. The Institute for Clinical and Economic Review piloted a 24-month observational RWE reassessment. Key learnings from this pilot include identifying the benefits and challenges with using RWE in reassessments and considerations on prioritizing and selecting topics relevant for RWE updates.
The present study aimed to quantify the burden of structural heart disease in Nepali children.
We performed a school-based cross-sectional echocardiographic screening study with cluster random sampling among children 5–16 years of age.
Between December 2012 and January 2019, 6573 children (mean age 10.6 ± 2.9 years) from 41 randomly selected schools underwent echocardiographic screening. Structural heart disease was detected in 14.0 per 1000 children (95% CI 11.3–17.1) and was congenital in 3.3 per 1000 (95% CI 2.1–5.1) and rheumatic in 10.6 per 1000 (95% CI 8.3–13.4). Rates of rheumatic heart disease were higher among children attending public as compared to private schools (OR 2.8, 95% CI 1.6–5.2, p = 0.0001).
Rheumatic heart disease accounted for three out of four cases of structural heart disease and was more common among children attending public as compared to private schools.
There is limited literature on the mental health burden among adolescents following cyberbullying.
The aim was to evaluate the association of cyberbullying with low mood and suicidality among adolescents.
We analysed data from the CDC National Youth Risk Behavior Surveillance System (YRBS) (1991–2017). Responses from adolescents related to cyberbullying and suicidality were evaluated. Chi-square tests and mixed-effects multivariable logistic regression were performed to assess the association of cyberbullying with sadness/hopelessness, suicide consideration, suicide planning, and suicide attempts.
Of a total of 10,463 adolescents, 14.8% had faced cyberbullying in the past year. Prevalence of cyberbullying was higher among youths aged 15–17 years (25% vs 26% vs 23%) and among females compared with males (68% vs 32%; p < 0.0001). Caucasians (53%) reported being cyberbullied most often, compared with Hispanics (24%) and African Americans (11%) (p < 0.0001). Cyberbullied youths had an increased prevalence of sadness/hopelessness (59.6% vs 25.8%), suicide consideration (40.4% vs 13.2%), suicide planning (33.2% vs 10.8%), and multiple suicide attempts compared with non-cyberbullied youths (p < 0.0001). On regression analysis, cyberbullied adolescents had 155% higher odds of feeling sad and hopeless [aOR = 2.55; 95% CI = 2.39–2.72], and higher odds of considering suicide [1.52 (1.39–1.66)] and of making a suicide plan [1.24 (1.13–1.36)].
In our study, cyberbullying was associated with negative mental health outcomes. Further research is warranted to examine the impact and outcomes of cyberbullying among adolescents and to guide policies that mitigate its consequences.
Anticholinergic medications block cholinergic transmission. The central effects of anticholinergic drugs can be particularly marked in patients with dementia. Furthermore, anticholinergics antagonise the effects of cholinesterase inhibitors, the main dementia treatment.
This study aimed to assess anticholinergic drug prescribing among dementia patients before and after admission to UK acute hospitals.
352 patients with dementia were included from 17 hospitals in the UK. All were admitted to surgical, medical or Care of the Elderly wards in 2019. Information about patients’ prescriptions was recorded on a standardised form. An evidence-based online calculator was used to calculate each patient’s anticholinergic drug burden. The correlation between scores on admission and at discharge was tested with Spearman’s rank correlation.
Table 1 shows patient demographics. On admission, 37.8% of patients had an anticholinergic burden score ≥1 and 5.68% a score ≥3. At discharge, 43.2% of patients had an anticholinergic burden score ≥1 and 9.1% a score ≥3. The increase was statistically significant (rho = 0.688; p = 2.2×10⁻¹⁶). The most common group of anticholinergic medications prescribed at discharge was psychotropics (see Figure 1). Among patients prescribed cholinesterase inhibitors, 44.9% were also taking anticholinergic medications.
This multicentre cross-sectional study found that people with dementia are frequently prescribed anticholinergic drugs, even if also taking cholinesterase inhibitors, and are significantly more likely to be discharged with a higher anticholinergic drug burden than on admission to hospital.
Conflict of interest
This project was planned and executed by the authors on behalf of SPARC (Student Psychiatry Audit and Research Collaborative). We thank the National Student Association of Medical Research for allowing us use of the Enketo platform. Judith Harrison was su
The usage of mobile phones has seen exponential growth worldwide.1,2 While college students use mobile applications for educational purposes, the reports of adverse health problems are emerging.3,4
To investigate the impact of mobile usage patterns on the lives of medical students and the association with psychiatric effects, namely ringxiety and nomophobia.
Data were collected from 300 medical students of Ashwini Rural Medical College, India, through a survey for this cross-sectional study. The survey captured mobile phone usage patterns, including time spent on the phone before sleep, use in classrooms or clinics, and frequency of update checks; chi-square (χ²) tests were used to assess associations.
A significant association was found between time spent on mobile phones before sleep and duration of sleep, and between mobile usage in classrooms or clinics and psychological effects (p < 0.0001). Significant associations were also observed between mobile use in classes or clinics and the frequency of update checks, and between the frequency of update checks and psychological effects (p < 0.0001). About 78% of participants were distracted during self-study by their phones. Updates were checked every 10 minutes by 14.7%, hourly by 43%, and during breaks by 42.3%. Poor network coverage caused anxiety (13.3%) and irritability (67.3%). About 41.7% of students could not abstain from mobile use for a day. On average, students used their phones for 24 minutes before going to sleep.
Our study results highlight the prevalence of ringxiety and nomophobia among medical students. With the surging dependency on mobile phones and technology, we need to cautiously monitor their adverse psychological and psychiatric effects.
To investigate the relative contributions of cerebral cortex and basal ganglia to movement stopping, we tested the optimum combination Stop Signal Reaction Time (ocSSRT) and median visual reaction time (RT) in patients with Alzheimer’s disease (AD) and Parkinson’s disease (PD) and compared values with data from healthy controls.
Thirty-five PD patients, 22 AD patients, and 29 healthy controls were recruited to this study. RT and ocSSRT were measured using a hand-held battery-operated electronic box through a stop signal paradigm.
The mean ocSSRT was found to be 309 ms, 368 ms, and 265 ms in AD, PD, and healthy controls, respectively, and significantly prolonged in PD compared to healthy controls (p = 0.001). The ocSSRT but not RT could separate AD from PD patients (p = 0.022).
Our data suggest that subcortical networks encompassing dopaminergic pathways in the basal ganglia play a more important role than cortical networks in movement-stopping. Combining ocSSRT with other putative indices or biomarkers of AD (and other dementias) could increase the accuracy of early diagnosis.
To determine the impact of electronic health record (EHR)–based interventions and test restriction on Clostridioides difficile tests (CDTs) and hospital-onset C. difficile infection (HO-CDI).
Quasi-experimental study in 3 hospitals.
957-bed academic (hospital A), 354-bed (hospital B), and 175-bed (hospital C) academic-affiliated community hospitals.
Three EHR-based interventions were sequentially implemented: (1) alert when ordering a CDT if laxatives administered within 24 hours (January 2018); (2) cancellation of CDT orders after 24 hours (October 2018); (3) contextual rule-driven order questions requiring justification when laxative administered or lack of EHR documentation of diarrhea (July 2019). In February 2019, hospital C implemented a gatekeeper intervention requiring approval for all CDTs after hospital day 3. The impact of the interventions on C. difficile testing and HO-CDI rates was estimated using an interrupted time-series analysis.
C. difficile testing was already declining in the preintervention period (annual change in incidence rate [IR], 0.79; 95% CI, 0.72–0.87) and did not decrease further with the EHR interventions. The laxative alert was temporally associated with a trend reduction in HO-CDI (annual change in IR from baseline, 0.85; 95% CI, 0.75–0.96) at hospitals A and B. The gatekeeper intervention at hospital C was associated with level (IRR, 0.50; 95% CI, 0.42–0.60) and trend reductions in C. difficile testing (annual change in IR, 0.91; 95% CI, 0.85–0.98) and level (IRR, 0.42; 95% CI, 0.22–0.81) and trend reductions in HO-CDI (annual change in IR, 0.68; 95% CI, 0.50–0.92) relative to the baseline period.
Test restriction was more effective than EHR-based clinical decision support to reduce C. difficile testing in our 3-hospital system.
Individuals with treatment-resistant depression (TRD) experience a high burden of illness. Current guidelines recommend a stepped care approach for treating depression, but the extent to which best-practice care pathways are adhered to is unclear.
To explore the extent and nature of ‘treatment gaps’ (non-adherence to stepped care pathways) experienced by a sample of patients with established TRD (non-response to two or more adequate treatments in the current depressive episode) across three cities in the UK.
Five treatment gaps were considered and compared with guidelines, in a cross-sectional retrospective analysis: delay to receiving treatment, lack of access to psychological therapies, delays to medication changes, delays to adjunctive (pharmacological augmentation) treatment and lack of access to secondary care. We additionally explored participant characteristics associated with the extent of treatment gaps experienced.
Of 178 patients with TRD, 47% had been in the current depressive episode for >1 year before initiating antidepressants; 53% had received adequate psychological therapy. A total of 47% and 51% had remained on an unsuccessful first and second antidepressant trial, respectively, for >16 weeks, and 24% and 27% for >1 year before a medication switch. Further, 54% had tried three or more antidepressant medications within their episode, and only 11% had received adjunctive treatment.
There appears to be a considerable difference between treatment guidelines for depression and the reality of care received by people with TRD. Future research examining representative samples of patients could determine recommendations for optimising care pathways, and ultimately outcomes, for individuals with this illness.
Tardive dyskinesia (TD) is a persistent and potentially disabling movement disorder associated with prolonged exposure to antipsychotics and other dopamine receptor blocking agents. Long-term safety of the approved TD medication, valbenazine, was demonstrated in 2 clinical trials (KINECT 3 [NCT02274558], KINECT 4 [NCT02405091]). Data from these trials were analyzed post hoc to evaluate the onset and resolution of adverse events (AEs).
Participants in KINECT 3 and KINECT 4 received up to 48 weeks of once-daily valbenazine (40 or 80 mg). Data from these studies were pooled and analyzed to assess the incidence, time to first occurrence, and resolution for the following AEs of potential clinical interest: akathisia, balance disorder, dizziness, parkinsonism, somnolence/sedation, suicidal behavior/ideation, and tremor.
In the pooled population (N=314), all AEs of potential clinical interest occurred in <10% of participants, with somnolence (9.6%), suicidal behavior/ideation (6.4%), and dizziness (5.7%) being the most common AEs. Mean time to first occurrence ranged from 36 days (akathisia [n=9]) to 224 days (parkinsonism [n=2]). By end of study (or last study visit), resolution of AEs was as follows: 100% (suicidal ideation/behavior, parkinsonism); >85% (somnolence/sedation, dizziness); >70% (akathisia, balance disorder, tremor).
In long-term clinical trials, the incidence of AEs of potential clinical interest was low (<10%) and most were resolved by end of treatment (>70–100%). All patients taking valbenazine should be routinely monitored for AEs, particularly those that may exacerbate the motor symptoms associated with TD.
We assessed long-term incidence and prevalence trends of dementia and parkinsonism across major ethnic and immigrant groups in Ontario.
Linking administrative databases, we established two cohorts (dementia 2001–2014 and parkinsonism 2001–2015) of all residents aged 20 to 100 years with incident diagnosis of dementia (N = 387,937) or parkinsonism (N = 59,617). We calculated age- and sex-standardized incidence and prevalence of dementia and parkinsonism by immigrant status and ethnic groups (Chinese, South Asian, and the General Population). We assessed incidence and prevalence trends using Poisson regression and Cochran–Armitage trend tests.
Across selected ethnic groups, dementia incidence and prevalence were higher in long-term residents than recent or longer-term immigrants from 2001 to 2014. During this period, age- and sex-standardized incidence of dementia in Chinese, South Asian, and the General Population increased, respectively, among longer-term immigrants (by 41%, 58%, and 42%) and long-term residents (28%, 7%, and 4%), and to a lesser degree among recent immigrants. The small number of cases precluded us from assessing parkinsonism incidence trends. For Chinese, South Asian, and the General Population, respectively, prevalence of dementia and parkinsonism modestly increased over time among recent immigrants but significantly increased among longer-term immigrants (dementia: 134%, 217%, and 117%; parkinsonism: 55%, 54%, and 43%) and long-term residents (dementia: 97%, 132%, and 71%; parkinsonism: 18%, 30%, and 29%). Adjustment for pre-existing conditions did not appear to explain incidence trends, except for stroke and coronary artery disease as potential drivers of dementia incidence.
Recent immigrants across major ethnic groups in Ontario had considerably lower rates of dementia and parkinsonism than long-term residents, but this difference diminished with longer-term immigrants.
The coronavirus disease 2019 (COVID-19) pandemic has led to significant strain on front-line healthcare workers.
In this multicentre study, we compared the psychological outcomes during the COVID-19 pandemic in various countries in the Asia-Pacific region and identified factors associated with adverse psychological outcomes.
From 29 April to 4 June 2020, the study recruited healthcare workers from major healthcare institutions in five countries in the Asia-Pacific region. A self-administered survey that collected information on prior medical conditions, presence of symptoms, and scores on the Depression Anxiety Stress Scales and the Impact of Event Scale-Revised was used. The prevalence of depression, anxiety, stress and post-traumatic stress disorder (PTSD) relating to COVID-19 was compared, and multivariable logistic regression identified independent factors associated with adverse psychological outcomes within each country.
A total of 1146 participants from India, Indonesia, Singapore, Malaysia and Vietnam were studied. Despite having the lowest volume of cases, Vietnam displayed the highest prevalence of PTSD. In contrast, Singapore reported the highest case volume, but had a lower prevalence of depression and anxiety. In the multivariable analysis, we found that lack of medical training, the presence of physical symptoms and the presence of prior medical conditions were independent predictors of adverse psychological outcomes across the participating countries.
This study highlights that the varied prevalence of psychological adversity among healthcare workers is independent of the burden of COVID-19 cases within each country. Early psychological interventions may be beneficial for the vulnerable groups of healthcare workers with presence of physical symptoms, prior medical conditions and those who are not medically trained.
Primary pests such as Rhyzopertha dominica may increase the contents of dockage, dust, and frass in the grain mass. Although it has been suggested that frass can affect the population growth of stored-product pests and the ecological interactions among primary and secondary pests in stored grain, this has not been validated experimentally. Therefore, this work tested experimentally, for the first time, the hypothesis that R. dominica wheat frass may support population increases in secondary pests such as Tribolium confusum, T. castaneum, and Oryzaephilus surinamensis. The effect of frass on secondary pest performance was compared with the effects of various physical qualities of wheat grain (i.e., intact grain kernels, grain fragments, flour, grain + frass) and an artificially enriched control diet (milled wheat kernels, oat flakes, and yeast). The results showed that clean intact grain kernels did not support the population growth of any tested species, whereas the nutrient-rich control diet provided the best support. Frass was a significantly better food medium for O. surinamensis and T. castaneum than flour or cracked grain, while T. confusum performed equally well on flour and frass. In terms of food quality and suitability for the tested species, frass occupied an intermediate position between the optimised breeding diet and simple uniform cereal diets such as cracked grain or flour. The results suggest that (i) the wheat frass of the primary pest R. dominica is a riskier food source for the development of the tested secondary pests than intact or cracked wheat grain or flour; (ii) frass has the potential to positively influence interspecific interactions between R. dominica and the tested secondary pests; and (iii) wheat grain should be cleaned if increases in R. dominica populations and/or accumulated frass are detected.
The pervasive problem of irreproducibility of preclinical research represents a substantial threat to the translation of CTSA-generated health interventions. Key stakeholders in the research process have proposed solutions to this challenge to encourage research practices that improve reproducibility. However, these proposals have had minimal impact, because they either 1. take place too late in the research process, 2. focus exclusively on the products of research instead of the processes of research, and/or 3. fail to take into account the driving incentives in the research enterprise. Because so much clinical and translational science is team-based, CTSA hubs have a unique opportunity to leverage Science of Team Science research to implement and support innovative, evidence-based, team-focused, reproducibility-enhancing activities at a project’s start, and across its evolution. Here, we describe the impact of irreproducibility on clinical and translational science, review its origins, and then describe stakeholders’ efforts to impact reproducibility, and why those efforts may not have the desired effect. Based on team-science best practices and principles of scientific integrity, we then propose ways for Translational Teams to build reproducible behaviors. We end with suggestions for how CTSAs can leverage team-based best practices and identify observable behaviors that indicate a culture of reproducible research.
Presently, evidence guiding clinicians on the optimal approach to safely screening patients for coronavirus disease 2019 (COVID-19) prior to a nonemergent hospital procedure is scarce. In this report, we describe our experience in screening for SARS-CoV-2 prior to semiurgent and urgent hospital procedures.
Retrospective case series.
A single tertiary-care medical center.
Our study cohort included patients ≥18 years of age who had semiurgent or urgent hospital procedures or surgeries.
Overall, 625 patients were screened for SARS-CoV-2 using a combination of phone questionnaire (7 days prior to the anticipated procedure), RT-PCR and chest computed tomography (CT) between March 1, 2020, and April 30, 2020.
Of the 625 patients, 520 chest CT scans (83.2%) were interpreted as normal; 1 (0.16%) had typical features of COVID-19; 18 (2.88%) had indeterminate features of COVID-19; and 86 (13.76%) had atypical features of COVID-19. In total, 640 RT-PCR tests were performed, with 1 positive result (0.15%) in a patient whose CT scan yielded an atypical finding. Of the 18 patients with chest CTs categorized as indeterminate, 5 underwent a repeat negative RT-PCR nasopharyngeal swab 1 week after their initial swab. Also, 1 patient with a chest CT categorized as typical had a follow-up negative RT-PCR, indicating that the chest CT was likely a false positive. After surgery, none of the patients developed signs or symptoms suspicious for COVID-19 that would indicate the need for a repeat RT-PCR or CT scan.
In our experience, chest CT scanning did not provide valuable information for detecting asymptomatic cases of SARS-CoV-2 (COVID-19) in our low-prevalence population.