In 2016, the National Center for Advancing Translational Sciences launched the Trial Innovation Network (TIN) to address barriers to efficient and informative multicenter trials. The TIN provides a national platform, working in partnership with 60+ Clinical and Translational Science Award (CTSA) hubs across the country to support the design and conduct of successful multicenter trials. A dedicated Hub Liaison Team (HLT) was established within each CTSA to facilitate connection between the hubs and the newly launched Trial and Recruitment Innovation Centers. Each HLT serves as an expert intermediary, connecting CTSA hub investigators with TIN support and connecting TIN research teams with potential multicenter trial site investigators. The cross-consortium Liaison Team network was developed during the first TIN funding cycle and is now a mature national network at the cutting edge of team science in clinical and translational research. The CTSA-based HLT structures and the external network structure have been developed in collaborative and iterative ways, with methods for shared learning and continuous process improvement. In this paper, we review the structure, function, and development of the Liaison Team network, discuss lessons learned during the first TIN funding cycle, and outline a path toward further network maturity.
Improving the quality and conduct of multicenter clinical trials is essential to the generation of generalizable knowledge about the safety and efficacy of healthcare treatments. Despite significant effort and expense, many clinical trials are unsuccessful. The National Center for Advancing Translational Sciences launched the Trial Innovation Network to address critical roadblocks in multicenter trials by leveraging existing infrastructure and developing operational innovations. We provide an overview of the roadblocks that created opportunities for operational innovation, our work to develop, define, and map innovations across the network, and how we implemented and disseminated mature innovations.
New technologies and disruptions related to coronavirus disease 2019 (COVID-19) have led to expansion of decentralized approaches to clinical trials. Remote tools and methods hold promise for increasing trial efficiency and reducing burdens and barriers by facilitating participation outside of traditional clinical settings and taking studies directly to participants. The Trial Innovation Network, established in 2016 by the National Center for Advancing Translational Sciences to address critical roadblocks in clinical research and accelerate the translational research process, has consulted on over 400 research study proposals to date. Its recommendations for decentralized approaches have included eConsent, participant-informed study design, remote intervention, study task reminders, social media recruitment, and return of results to participants. Some clinical trial elements have worked well when decentralized, while others, including remote recruitment and patient monitoring, need further refinement and assessment to determine their value. Partially decentralized, or “hybrid,” trials offer a first step toward optimizing remote methods. Decentralized processes demonstrate potential to improve urban-rural diversity, but their impact on inclusion of racially and ethnically marginalized populations requires further study. To optimize inclusive participation in decentralized clinical trials, efforts must be made to build trust among marginalized communities and to ensure access to remote technology.
One challenge for multisite clinical trials is ensuring that the conditions of an informative trial are incorporated into all aspects of trial planning and execution. The multicenter model can provide the potential for a more informative environment, but it can also place a trial at risk of becoming uninformative due to lack of rigor, quality control, or effective recruitment, resulting in premature discontinuation and/or non-publication. Key factors that support informativeness are having the right team and resources during study planning and implementation, and adequate funding to support performance activities. This communication draws on the experience of the National Center for Advancing Translational Sciences (NCATS) Trial Innovation Network (TIN) to develop approaches for enhancing the informativeness of clinical trials. We distilled this information into three principles: (1) assemble a diverse team, (2) leverage existing processes and systems, and (3) carefully consider budgets and contracts. The TIN, comprising NCATS, three Trial Innovation Centers, a Recruitment Innovation Center, and 60+ CTSA Program hubs, provides resources to investigators who are proposing multicenter collaborations. In addition to sharing principles that support the informativeness of clinical trials, we highlight TIN-developed resources relevant to multicenter trial initiation and conduct.
Neonates and infants who undergo congenital cardiac surgery frequently have difficulty with feeding. The factors that predispose these patients to require a gastrostomy tube have not been well defined. We aimed to report the incidence and describe hospital outcomes and characteristics in neonates and infants undergoing congenital cardiac surgery who required gastrostomy tube placement.
Materials and methods:
A retrospective review was performed on patients undergoing congenital cardiac surgery between October 2015 and December 2020. Patients were identified by International Classification of Diseases 10th Revision codes, utilising the performance improvement database Vizient® Clinical Data Base, and stratified by age at admission: neonates (<1 month) and infants (1–12 months). Outcomes were compared and comparative analysis performed between admissions with and without gastrostomy tube placement.
Results:
There were 11,793 admissions: 3,519 (29.8%) neonates and 8,274 (70.2%) infants. We found an increased incidence of gastrostomy tube placement in neonates as compared to infants following congenital cardiac surgery (23.1% versus 6%, p < 0.001). Outcomes in neonates and infants were similar, with increased length of stay and cost in those requiring a gastrostomy tube. Gastrostomy tube placement was more likely in neonates and infants with upper airway anomalies, congenital abnormalities, hospital infections, and genetic abnormalities.
Discussion:
Age at hospitalisation for congenital cardiac surgery is a definable risk factor for gastrostomy tube requirement. Additional factors also contribute to gastrostomy tube placement and should be considered when counselling families about the potential need for a gastrostomy tube.
The Trial Innovation Network has established an infrastructure for single IRB review in response to federal policies. The Network’s single IRBs (sIRBs) have successfully supported over 70 multisite studies via more than 800 reliance arrangements. This has generated several lessons learned that can benefit the national clinical research enterprise as we work to improve the conduct of clinical trials. These lessons include distinguishing the roles of the single IRB from institutional Human Research Protection programs, establishing a consistent sIRB review model, standardizing collection of local context and supplemental, study-specific information, and educating and empowering lead study teams to support their sites.
Paramedics received training in point-of-care ultrasound (POCUS) to assess for cardiac contractility during management of medical out-of-hospital cardiac arrest (OHCA). The primary outcome was the percentage of adequate POCUS video acquisition and accurate video interpretation during OHCA resuscitations. Secondary outcomes included POCUS impact on patient management and resuscitation protocol adherence.
Methods:
A prospective, observational cohort study of paramedics was performed following a four-hour training session, which included a didactic lecture and hands-on POCUS instruction. The Prehospital Echocardiogram in Cardiac Arrest (PECA) protocol was developed and integrated into the resuscitation algorithm for medical non-shockable OHCA. The ultrasound (US) images were reviewed by a single POCUS expert investigator to determine the adequacy of the POCUS video acquisition and accuracy of the video interpretation. Change in patient management and resuscitation protocol adherence data, including end-tidal carbon dioxide (EtCO2) monitoring following advanced airway placement, adrenaline administration, and compression pauses under ten seconds, were queried from the prehospital electronic health record (EHR).
Results:
Captured images were deemed adequate in 42/49 (85.7%) scans, and paramedic interpretation of sonography was accurate in 43/49 (87.8%) scans. The POCUS results altered patient management in 14/49 (28.6%) cases. Paramedics adhered to EtCO2 monitoring in 36/36 (100.0%) patients with an advanced airway, adrenaline administration in 38/38 (100.0%) patients, and compression pauses under ten seconds in 36/38 (94.7%) patients.
Conclusion:
Paramedics were able to accurately obtain and interpret cardiac POCUS videos during medical OHCA while adhering to a resuscitation protocol. These findings suggest that POCUS can be effectively integrated into paramedic protocols for medical OHCA.
This study aimed to explore the effects of adjunctive minocycline treatment on inflammatory and neurogenesis markers in major depressive disorder (MDD). Serum samples were collected from a randomised, placebo-controlled 12-week clinical trial of minocycline (200 mg/day, added to treatment as usual) for adults (n = 71) experiencing MDD to determine changes in interleukin-6 (IL-6), lipopolysaccharide binding protein (LBP) and brain-derived neurotrophic factor (BDNF). Generalised Estimating Equation (GEE) modelling explored moderation effects of baseline markers, and exploratory analyses investigated associations between markers and clinical outcomes. There was no difference between the adjunctive minocycline and placebo groups at baseline or week 12 in the levels of IL-6 (week 12: placebo 2.06 ± 1.35 pg/ml; minocycline 1.77 ± 0.79 pg/ml; p = 0.317), LBP (week 12: placebo 3.74 ± 0.95 µg/ml; minocycline 3.93 ± 1.33 µg/ml; p = 0.525) or BDNF (week 12: placebo 24.28 ± 6.69 ng/ml; minocycline 26.56 ± 5.45 ng/ml; p = 0.161). Higher IL-6 levels at baseline predicted greater clinical improvement. Exploratory analyses suggested that the change in IL-6 levels was significantly associated with anxiety symptom (HAMA; p = 0.021) and quality of life (Q-LES-Q-SF; p = 0.023) scale scores. No other clinical outcomes showed this mediation effect, nor did the other markers (LBP or BDNF) moderate clinical outcomes. There were no overall changes in IL-6, LBP or BDNF following adjunctive minocycline treatment. Exploratory analyses suggest a potential role of IL-6 in mediating anxiety symptoms in MDD. Future trials with larger sample sizes may consider enriching recruitment by identifying several markers, or a panel of factors, that better represent an inflammatory phenotype in MDD.
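For readers unfamiliar with the method, the sketch below illustrates a Generalised Estimating Equation analysis of the kind described above, with repeated biomarker measurements clustered within participants. It is a minimal illustration under assumed data structure, not the trial’s analysis code; the file name (minocycline_markers.csv) and column names (participant_id, week, treatment, il6) are hypothetical.

```python
# Minimal GEE sketch (statsmodels): repeated IL-6 measurements per participant,
# modelled by treatment group, time, and their interaction.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("minocycline_markers.csv")  # hypothetical long-format dataset

model = smf.gee(
    "il6 ~ treatment * week",                  # marker ~ group, time, interaction
    groups="participant_id",                   # repeated measures clustered by participant
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),   # working correlation for repeated measures
    family=sm.families.Gaussian(),             # continuous marker outcome
)
result = model.fit()
print(result.summary())                        # coefficients, robust SEs, p-values
```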
Many mental disorders, including depression, bipolar disorder and schizophrenia, are associated with poor dietary quality and nutrient intake. There is, however, a deficit of research looking at the relationship between obsessive–compulsive disorder (OCD) severity, nutrient intake and dietary quality.
Aims
This study aims to explore the relationship between OCD severity, nutrient intake and dietary quality.
Method
A post hoc regression analysis was conducted with data combined from two separate clinical trials that included 85 adults with OCD diagnosed using the Structured Clinical Interview for DSM-5. Nutrient intakes were calculated from the Dietary Questionnaire for Epidemiological Studies version 3.2, and dietary quality was scored with the Healthy Eating Index for Australian Adults – 2013.
Results
Nutrient intake in the sample largely aligned with Australian dietary guidelines. Linear regression models adjusted for gender, age and total energy intake showed no significant associations between OCD severity, nutrient intake and dietary quality (all P > 0.05). However, OCD severity was inversely associated with caffeine (β = −15.50, 95% CI −28.88 to −2.11, P = 0.024) and magnesium (β = −6.63, 95% CI −12.72 to −0.53, P = 0.034) intake after adjusting for OCD treatment resistance.
Conclusions
This study showed that OCD severity had little effect on nutrient intake and dietary quality. Dietary quality scores were higher than those reported in prior studies of healthy samples, although limitations regarding comparability must be noted. Future studies employing larger sample sizes, control groups and more accurate dietary intake measures will further elucidate the relationship between nutrient intake and dietary quality in patients with OCD.
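The sketch below shows one plausible specification of the adjusted linear regression described in the Results above: nutrient intake (here, caffeine) regressed on OCD severity with gender, age and total energy intake as covariates. It is not the authors’ analysis code, and the file and variable names (ocd_diet.csv, ybocs, caffeine_mg, gender, age, energy_kj) are assumed for illustration.

```python
# Minimal adjusted linear regression sketch (statsmodels formula API).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ocd_diet.csv")  # hypothetical combined-trial dataset

model = smf.ols("caffeine_mg ~ ybocs + gender + age + energy_kj", data=df).fit()
print(model.params["ybocs"])           # beta for OCD severity
print(model.conf_int().loc["ybocs"])   # 95% confidence interval for that beta
```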
Microscopic examination of blood smears remains the gold standard for laboratory inspection and diagnosis of malaria. Smear inspection is, however, time-consuming and dependent on trained microscopists, with results varying in accuracy. We sought to develop an automated image analysis method to improve accuracy and standardization of smear inspection that retains capacity for expert confirmation and image archiving. Here, we present a machine learning method that achieves red blood cell (RBC) detection, differentiation between infected and uninfected cells, and parasite life stage categorization from unprocessed, heterogeneous smear images. Built on a pretrained Faster Region-Based Convolutional Neural Network (Faster R-CNN) model for RBC detection, our model performs accurately, with an average precision of 0.99 at an intersection-over-union threshold of 0.5. Application of a residual neural network (ResNet-50) model to infected cells also performs accurately, with an area under the receiver operating characteristic curve of 0.98. Finally, combining our method with a regression model successfully recapitulates the intraerythrocytic developmental cycle with accurate life cycle stage categorization. Combined with a mobile-friendly web-based interface, called PlasmoCount, our method permits rapid navigation through and review of results for quality assurance. By standardizing assessment of Giemsa smears, our method markedly improves inspection reproducibility and presents a realistic route to both routine lab and future field-based automated malaria diagnosis.
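The sketch below illustrates a two-stage pipeline of the general kind described above: a pretrained Faster R-CNN proposes red-blood-cell bounding boxes, and a ResNet-50 classifies each detected cell. It is a minimal PyTorch/torchvision illustration, not the authors’ PlasmoCount implementation; both models would need fine-tuning on annotated Giemsa smears, and the confidence threshold and image preprocessing are assumptions.

```python
# Minimal detection-then-classification sketch, assuming off-the-shelf torchvision models.
import torch
from torchvision.models import resnet50
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor, resized_crop

detector = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()   # stage 1: candidate cell boxes
classifier = resnet50(weights="DEFAULT").eval()                # stage 2: per-cell classifier
# Both models ship with generic pretrained weights here; in practice each would be
# fine-tuned on annotated smear images (e.g., a 2-class infected/uninfected head).

def analyse_smear(image):
    """image: PIL.Image of an unprocessed smear field; returns (box, class scores) pairs."""
    x = to_tensor(image)
    results = []
    with torch.no_grad():
        detections = detector([x])[0]                          # dict with boxes, labels, scores
        for box, score in zip(detections["boxes"], detections["scores"]):
            if score < 0.5:                                    # assumed confidence threshold
                continue
            x1, y1, x2, y2 = box.int().tolist()
            crop = resized_crop(x, y1, x1, y2 - y1, x2 - x1, [224, 224])
            logits = classifier(crop.unsqueeze(0))             # class scores (meaningful after fine-tuning)
            results.append((box.tolist(), logits.softmax(-1).squeeze().tolist()))
    return results
```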
Obsessive–compulsive disorder (OCD) is often challenging to treat and resistant to psychological interventions and prescribed medications. The adjunctive use of nutraceuticals with potential neuromodulatory effects on underpinning pathways such as the glutamatergic and serotonergic systems is one novel approach.
Objective
To assess the effectiveness and safety of a purpose-formulated combination of nutraceuticals in treating OCD: N-acetyl cysteine, L-theanine, zinc, magnesium, pyridoxal-5′ phosphate, and selenium.
Methods
A 20-week open label proof-of-concept study was undertaken involving 28 participants with treatment-resistant DSM-5-diagnosed OCD, during 2017 to 2020. The primary outcome measure was the Yale-Brown Obsessive–Compulsive Scale (YBOCS), administered every 4 weeks.
Results
An intention-to-treat analysis revealed an estimated mean reduction across time (baseline to week 20) on the YBOCS total score of −7.13 (95% confidence interval = −9.24, −5.01), with a mean reduction of −1.21 points per post-baseline visit (P ≤ .001). At 20 weeks, 23% of the participants were considered “responders” (a ≥35% reduction on the YBOCS and “very much” or “much improved” on the Clinical Global Impression-Improvement scale). Statistically significant improvements were also revealed on all secondary outcomes (eg, mood, anxiety, and quality of life). Notably, treatment response on OCD outcome scales (eg, YBOCS) was greatest in those with lower baseline symptom levels, while response was limited in those with relatively more severe OCD.
Conclusions
While this pilot study lacked a placebo control, the significant time effect in this treatment-resistant OCD population is encouraging and suggests potential utility, especially for those with lower symptom levels. Our findings need to be confirmed or refuted in a follow-up placebo-controlled study.
The COVID-19 pandemic prompted the development and implementation of hundreds of clinical trials across the USA. The Trial Innovation Network (TIN), funded by the National Center for Advancing Translational Sciences, was an established clinical research network that pivoted to respond to the pandemic.
Methods:
The TIN’s three Trial Innovation Centers, Recruitment Innovation Center, and 66 Clinical and Translational Science Award hub institutions collaborated to adapt to the pandemic’s rapidly changing landscape, playing central roles in the planning and execution of pivotal studies addressing COVID-19. Our objective was to summarize the results of these collaborations and the lessons learned.
Results:
The TIN provided 29 COVID-related consults between March 2020 and December 2020, including 6 trial participation expressions of interest and 8 community engagement studios from the Recruitment Innovation Center. Key lessons learned from these experiences include the benefits of leveraging an established infrastructure, innovations surrounding remote research activities, data harmonization and central safety reviews, and early community engagement and involvement.
Conclusions:
Our experience highlighted the benefits and challenges of a multi-institutional approach to clinical research during a pandemic.
Advanced imaging techniques are enhancing research capacity focussed on the developmental origins of adult health and disease (DOHaD) hypothesis, and consequently increasing awareness of future health risks across various subareas of DOHaD research. Understanding how these advanced imaging techniques can be used, both additively and synergistically, alongside traditional techniques in animal models and human population studies is therefore of great interest to DOHaD-focussed laboratories. Global experts in advanced imaging techniques congregated at the advanced imaging workshop at the 2019 DOHaD World Congress in Melbourne, Australia. This review summarizes the presentations of new imaging modalities and novel applications to DOHaD research, as well as discussions among DOHaD researchers who are currently using advanced imaging techniques, including MRI, hyperpolarized MRI, ultrasound, and synchrotron-based techniques, to support their DOHaD research focus.
Several grass and broadleaf weed species around the world have evolved multiple-herbicide resistance at alarmingly increasing rates. Research on the biochemical and molecular resistance mechanisms of multiple-resistant weed populations indicates a prevalence of herbicide metabolism catalyzed by enzyme systems such as cytochrome P450 monooxygenases and glutathione S-transferases and, to a lesser extent, by glucosyl transferases. A symposium was conducted to gain an understanding of the current state of research on metabolic resistance mechanisms in weed species that pose major management problems around the world. These topics, as well as future directions of investigation identified in the symposium, are summarized herein. In addition, the latest information on selected topics, such as the role of safeners in inducing crop tolerance to herbicides, selectivity to clomazone, glyphosate metabolism in crops and weeds, and bioactivation of natural molecules, is reviewed.
Objectives: The objective of this study was to evaluate the feasibility and implementation of a standardized medically supervised concussion protocol established between a city-wide AAA hockey league and a multi-disciplinary concussion program. Methods: We conducted a retrospective review of injury surveillance, clinical and healthcare utilization data from all athletes evaluated and managed through the Winnipeg AAA Hockey concussion protocol during the 2016-2017 season. We also conducted post-season email surveys of head coaches and parents responsible for athletes who competed in the same season. Results: During the 2016-2017 season, 28 athletes were evaluated through the medically supervised concussion protocol, with two athletes undergoing evaluation for repeat injuries (a total of 30 suspected injuries and consultations). In all, 96.7% of the athletes managed through the concussion protocol were captured by the league-designated Concussion Protocol Coordinator and 100% of eligible athletes underwent complete medical follow-up and clearance to return to full hockey activities. Although 90% of responding head coaches and 91% of parents were aware of the concussion protocol, survey results suggest that some athletes who sustained suspected concussions were not managed through the protocol. Head coaches and parents also indicated that athlete education and communication between medical and sport stakeholders were other elements of the concussion protocol that could be improved. Conclusion: Successful implementation of a medically supervised concussion protocol for youth hockey requires clear communication between sport stakeholders and timely access to multi-disciplinary experts in traumatic brain and spine injuries. Standardized concussion protocols for youth sports may benefit from periodic evaluations by sport stakeholders and incorporation of national guideline best practices and resources.
Purpose: To examine the safety and tolerability of clinical graded aerobic treadmill testing in recovering adolescent moderate and severe traumatic brain injury (TBI) patients referred to a multidisciplinary pediatric concussion program. Methods: We completed a retrospective case series of two moderate and five severe TBI patients (mean age, 17.3 years) who underwent initial Buffalo Concussion Treadmill Testing at a mean time of 71.6 days (range, 55-87) postinjury. Results: Six patients completed one graded aerobic treadmill test each and one patient underwent initial and repeat testing. There were no complications. Five initial treadmill tests were completely tolerated and allowed an accurate assessment of exercise tolerance. Two initial tests were terminated early by the treatment team because of neurological and cardiorespiratory limitations. As a result of testing, two patients were cleared for aerobic exercise as tolerated and four patients were treated with individually tailored submaximal aerobic exercise programs resulting in subjective improvement in residual symptoms and/or exercise tolerance. Repeat treadmill testing in one patient performed after 1 month of treatment with submaximal aerobic exercise prescription was suggestive of improved exercise tolerance. One patient was able to tolerate aerobic exercise following surgery for posterior glottic stenosis. Conclusions: Preliminary results suggest that graded aerobic treadmill testing is a safe, well tolerated, and clinically useful tool to assess exercise tolerance in appropriately selected adolescent patients with TBI. Future prospective studies are needed to evaluate the effect of tailored submaximal aerobic exercise prescription on exercise tolerance and patient outcomes in recovering adolescent moderate and severe TBI patients.
Objectives: To summarize the clinical characteristics and outcomes of pediatric sports-related concussion (SRC) patients who were evaluated and managed at a multidisciplinary pediatric concussion program, and to examine the healthcare resources and personnel required to meet the needs of this patient population. Methods: We conducted a retrospective review of all pediatric SRC patients referred to the Pan Am Concussion Program from September 1st, 2013, to May 25th, 2015. Initial assessments and diagnoses were carried out by a single neurosurgeon. Return-to-play decision-making was carried out by the multidisciplinary team. Results: In total, 604 patients, including 423 pediatric SRC patients, were evaluated at the Pan Am Concussion Program during the study period. The mean age of study patients was 14.30 years (SD: 2.32; range: 7-19 years); 252 (59.57%) were male. Hockey (182; 43.03%) and soccer (60; 14.18%) were the most commonly played sports at the time of injury. Overall, 294 (69.50%) SRC patients met the clinical criteria for concussion recovery, 75 (17.73%) were lost to follow-up, and 53 (12.53%) remained in active treatment at the end of the study period. The median duration of symptoms among the 261 acute SRC patients with complete follow-up was 23 days (IQR: 15, 36). Overall, 25.30% of pediatric SRC patients underwent at least one diagnostic imaging test, and 32.62% received referral to another member of our multidisciplinary clinical team. Conclusion: Comprehensive care of pediatric SRC patients requires access to appropriate diagnostic resources and the multidisciplinary collaboration of experts with nationally and provincially recognized training in TBI.
Various medications and devices are available for facilitation of emergent endotracheal intubations (EETIs). The objective of this study was to survey which medications and devices are being utilized for intubation by Canadian physicians.
Methods
A clinical scenario-based survey was developed to determine which medications physicians would administer to facilitate EETI, their first choice of intubation device, and backup strategy should their first choice fail. The survey was distributed to Canadian emergency medicine (EM) and intensive care unit (ICU) physicians using web-based and postal methods. Physicians were asked questions based on three scenarios (trauma; pneumonia; heart failure) and responded using a 5-point scale ranging from “always” to “never” to capture usual practice.
Results
The survey response rate was 50.2% (882/1,758). Most physicians indicated a Macintosh blade with direct laryngoscopy would “always/often” be their first choice of intubation device in the three scenarios (mean 85% [79%-89%]) followed by video laryngoscopy (mean 37% [30%-49%]). The most common backup device chosen was an extraglottic device (mean 59% [56%-60%]). The medications most physicians would “always/often” administer were fentanyl (mean 45% [42%-51%]) and etomidate (mean 38% [25%-50%]). EM physicians were more likely than ICU physicians to paralyze patients for EETI (adjusted odds ratio 3.40; 95% CI 2.90-4.00).
Conclusions
Most EM and ICU physicians utilize direct laryngoscopy with a Macintosh blade as a primary device for EETI and an extraglottic device as a backup strategy. This survey highlights variation in Canadian practice patterns for some aspects of intubation in critically ill patients.
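As a minimal illustration of how an adjusted odds ratio such as the one reported in the Results above can be estimated, the sketch below fits a logistic regression of paralytic use on physician specialty with scenario as a covariate. It is not the survey’s actual analysis; the dataset and column names (eeti_survey.csv, used_paralytic, specialty, scenario) are assumed for illustration.

```python
# Minimal adjusted odds-ratio sketch via logistic regression (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eeti_survey.csv")  # hypothetical respondent-by-scenario data (0/1 outcome)

model = smf.logit("used_paralytic ~ C(specialty) + C(scenario)", data=df).fit()
odds_ratios = np.exp(model.params)       # exponentiated coefficients = adjusted odds ratios
conf_int = np.exp(model.conf_int())      # 95% CIs on the odds-ratio scale
print(odds_ratios)
print(conf_int)
```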
This study tests competing models of the relation between depression and polysubstance use over the course of adolescence. Participants included a nationwide sample of adolescents (N = 3,604), ages 12 to 17 at study Wave 1, assessed annually for 3 years. Models were tested using cohort-sequential latent growth curve modeling to determine whether depressive symptoms at baseline predicted concurrent and age-related changes in drug use, whether drug use at baseline predicted concurrent and age-related changes in depressive symptoms, and whether initial levels of depression predicted changes in substance use significantly better than vice versa. The results suggest a transactional model such that early polysubstance use promotes early depressive symptoms, which in turn convey elevated risk for increasing polysubstance use over time, which in turn conveys additional risk for future depressive symptoms, even after accounting for gender, ethnicity, and household income. In contrast, early drug use did not portend risk for future depressive symptoms. These findings suggest a complicated pattern of interrelations over time and indicate that many current models of co-occurring polysubstance use