Objective:
To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Design:
Observational cohort study and simulations of pathogen transfer.
Setting:
A Veterans Affairs hospital.
Participants:
Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Methods:
Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients’ socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess the plausibility of transfer from contaminated floors to high-touch surfaces and to assess the effectiveness of wearing slippers in reducing transfer.
Results:
Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor, followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and to high-touch surfaces.
Conclusions:
Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions such as having patients wear slippers could potentially reduce the risk for transfer of pathogens from floors to hands and high-touch surfaces.
Aggressive behavior in middle childhood can contribute to peer rejection, subsequently increasing risk for substance use in adolescence. However, the quality of peer relationships a child experiences can be associated with his or her genetic predisposition, a genotype–environment correlation (rGE). In addition, recent evidence indicates that psychosocial preventive interventions can buffer genetic predispositions for negative behavior. The current study examined associations between polygenic risk for aggression, aggressive behavior, and peer rejection from 8.5 to 10.5 years, and the subsequent influence of peer rejection on marijuana use in adolescence (n = 515; 256 control, 259 intervention). Associations were examined separately in control and intervention groups for children of families who participated in a randomized controlled trial of the family-based preventive intervention, the Family Check-Up. Using time-varying effect modeling (TVEM), polygenic risk for aggression was associated with peer rejection from approximately age 8.50 to 9.50 in the control group, but no associations were present in the intervention group. Subsequent analyses showed peer rejection mediated the association between polygenic risk for aggression and adolescent marijuana use in the control group. The role of rGEs in middle childhood peer processes and implications for preventive intervention programs for adolescent substance use are discussed.
Gloves and gowns are used during patient care to reduce contamination of personnel and prevent pathogen transmission.
Objective:
To determine whether the use of gowns adds a substantial benefit over gloves alone in preventing patient-to-patient transfer of a viral DNA surrogate marker.
Methods:
In total, 30 source patients had 1 cauliflower mosaic virus surrogate marker applied to their skin and clothing and a second to their bed rail and bedside table. Personnel caring for the source patients were randomized to wear gloves, gloves plus cover gowns, or no barrier. Interactions with up to 7 subsequent patients were observed, and the percentages of transfer of the DNA markers were compared among the 3 groups.
Results:
In comparison to the no-barrier group (57.8% transfer of 1 or both markers), there were significant reductions in transfer of the DNA markers in the gloves group (31.1% transfer; odds ratio [OR], 0.16; 95% confidence interval [CI], 0.02–0.73) and the gloves-plus-gown group (25.9% transfer; OR, 0.11; 95% CI, 0.01–0.51). The addition of a cover gown to gloves during the interaction with the source patient did not significantly reduce the transfer of the DNA marker (P = .53). During subsequent patient interactions, transfer of the DNA markers was significantly reduced if gloves plus gowns were worn and if hand hygiene was performed (P < .05).
Conclusions:
Wearing gloves or gloves plus gowns reduced the frequency of patient-to-patient transfer of a viral DNA surrogate marker. The use of gloves plus gowns during interactions with the source patient did not reduce transfer in comparison to gloves alone.
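The barrier comparisons above are reported as odds ratios with 95% confidence intervals. A minimal sketch of how such an estimate is computed from a 2×2 table follows; the counts below are hypothetical, back-calculated from the reported percentages (roughly 14/45 ≈ 31.1% transfer with gloves, 26/45 ≈ 57.8% with no barrier), and will not reproduce the abstract's model-based ORs.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = events in exposed, b = non-events in exposed,
    c = events in unexposed, d = non-events in unexposed."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: 14 transfers / 31 non-transfers (gloves)
# vs 26 transfers / 19 non-transfers (no barrier).
or_, lo, hi = odds_ratio_ci(14, 31, 26, 19)
```

An OR below 1 with an upper confidence limit below 1, as here, corresponds to the significant reduction the abstract describes.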
Elevated wall temperatures and impinging shock interactions are prevalent features in hypersonic flight, yet experimental studies examining both features together in a flight-representative environment are scarce. This work details hot-wall, hypersonic, impinging shock/boundary-layer interaction experiments performed in the T4 Stalker Tube. The model configuration was a two-dimensional heated flat plate and a shock generator. The surface of the graphite flat plate was resistively heated to a mean temperature between $T_w=298\ \textrm {K}$ and $T_w\approx 675\ \textrm {K}$ during an experimental run. An oblique shock, generated by a plate inclined at $10^{\circ }$ or $12^{\circ }$ to the free stream, impinged on the heated flat plate to induce boundary-layer separation. The primary flow condition produced Mach 7 flight-equivalent nozzle-supply enthalpy with a unit Reynolds number of $4.93\times 10^6\ \textrm {m}^{-1}$. Additional flow conditions with lower unit Reynolds numbers and flow enthalpies were considered to examine flow-separation characteristics. Schlieren and infrared thermography captured the flow field and the wall-temperature distribution, respectively. The results showed that the size of the flow separation grew with higher $T_w$ and lower unit Reynolds number. Moreover, the scaled separation of the present data deviated substantially from existing separation correlations developed from supersonic impinging-shock and hypersonic compression-ramp data, mainly because of the higher shock strength. Instead, the present data followed a scaling law that includes the pressure ratio across the impinging shock, with a slight dependence on the wall-temperature ratio.
The Cognitive Battery of the National Institutes of Health Toolbox (NIH-TB) is a collection of assessments that have been adapted and normed for administration across the lifespan and is increasingly used in large-scale population-level research. However, despite increasing adoption in longitudinal investigations of neurocognitive development, and growing recommendations that the Toolbox be used in clinical applications, little is known about the long-term temporal stability of the NIH-TB, particularly in youth.
Methods
The present study examined the long-term temporal reliability of the NIH-TB in a large cohort of youth (9–15 years old) recruited across two data collection sites. Participants were invited to complete testing annually for 3 years.
Results
Reliability was generally low-to-moderate, with intraclass correlation coefficients ranging between 0.31 and 0.76 for the full sample. There were multiple significant differences between sites, with one site generally exhibiting stronger temporal stability than the other.
Conclusions
Reliability of the NIH-TB Cognitive Battery was lower than expected given early work examining shorter test-retest intervals. Moreover, there were very few instances of tests meeting stability requirements for use in research; none of the tests exhibited adequate reliability for use in clinical applications. Reliability is paramount to establishing the validity of the tool; thus, the constructs assessed by the NIH-TB may vary over time in youth. We recommend further refinement of the NIH-TB Cognitive Battery and its norming procedures for children before further adoption as a neuropsychological assessment. We also urge researchers who have already employed the NIH-TB in their studies to interpret their results with caution.
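The intraclass correlation coefficients reported above quantify test-retest stability. As an illustrative sketch only, a one-way random-effects ICC(1,1) can be computed from mean squares as below; the abstract does not state which ICC variant the study used (two-way forms are also common), so this is an assumption.

```python
def icc_1_1(scores):
    """One-way random-effects ICC(1,1) from repeated scores:
    scores[i] holds the k measurements for subject i (equal k per subject)."""
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    means = [sum(row) / k for row in scores]
    # Between-subject and within-subject mean squares
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - means[i]) ** 2
              for i, row in enumerate(scores) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy data: four subjects measured twice with near-perfect agreement
icc = icc_1_1([[10, 11], [20, 19], [30, 31], [40, 41]])
```

With consistent repeated scores the ICC approaches 1; the 0.31–0.76 range reported above indicates substantially weaker stability.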
Decisions to treat large-vessel occlusion with endovascular therapy (EVT) or intravenous alteplase depend on how physicians weigh benefits against risks when considering patients’ comorbidities. We explored EVT/alteplase decision-making by stroke experts in the setting of comorbidity/disability.
Methods:
In an international multi-disciplinary survey, experts chose treatment approaches under current resources and under assumed ideal conditions for 10 of 22 randomly assigned case scenarios. Five included comorbidities (cancer, cardiac/respiratory/renal disease, mild cognitive impairment [MCI], physical dependence). We examined scenario/respondent characteristics associated with EVT/alteplase decisions using multivariable logistic regressions.
Results:
Among 607 physicians (38 countries), EVT was chosen less often in comorbidity-related scenarios (79.6% under current resources, 82.7% assuming ideal conditions) versus six “level-1A” scenarios for which EVT/alteplase was clearly indicated by current guidelines (91.1% and 95.1%, respectively, odds ratio [OR] [current resources]: 0.38, 95% confidence interval 0.31–0.47). However, EVT was chosen more often in comorbidity-related scenarios compared to all other 17 scenarios (79.6% versus 74.4% under current resources, OR: 1.34, 1.17–1.54). Responses favoring alteplase for comorbidity-related scenarios (e.g. 75.0% under current resources) were comparable to level-1A scenarios (72.2%) and higher than all others (60.4%). No comorbidity independently diminished EVT odds when considering all scenarios. MCI and dependence carried higher alteplase odds; cancer and cardiac/respiratory/renal disease had lower odds. Older patient age and female sex in the scenarios carried lower EVT odds. Relevant respondent characteristics included performing more EVT cases/year (higher EVT-, lower alteplase odds), practicing in East Asia (higher EVT odds), and in interventional neuroradiology (lower alteplase odds vs neurology).
Conclusion:
Moderate-to-severe comorbidities did not consistently deter experts from EVT, suggesting equipoise about withholding EVT based on comorbidities. However, alteplase was often foregone when respondents chose EVT. Differences in decision-making by patient age/sex merit further study.
The false codling moth (FCM), Thaumatotibia leucotreta (Lepidoptera: Tortricidae), is an insect pest that poses an important threat to the production and marketing of a wide range of agricultural crops in the African-Caribbean-Pacific (ACP) countries. The FCM not only reduces crop yield and quality but, as a quarantine insect pest, also restricts trade of susceptible agricultural produce on the international market. Moreover, little research has been conducted in the ACP countries on the bio-ecology and sustainable management of this pest, especially on vegetables for export. Action-oriented research aimed at understanding the bio-ecology of this important pest is therefore essential for effective management. Various management interventions against this pest have been used in some parts of the world, especially in South Africa on citrus. Currently, farm sanitation is regarded as the key management strategy. Exploring and improving other interventions, such as the sterile insect technique, monitoring and mass trapping of male moths, augmentative biological control, bio-pesticides, protected cultivation, and cold treatment, may help to mitigate the expansion of FCM into other countries, especially in the European and Mediterranean Plant Protection Organization region, where it has been a regulated insect pest since 2014. This review discusses the bio-ecology of FCM and highlights challenges and opportunities for its effective management, as well as implications for international trade, especially the export of chillies from the ACP countries into the European Union market, which imposes strict phytosanitary requirements.
The COVID-19 pandemic and subsequent state of public emergency have significantly affected older adults in Canada and worldwide. It is imperative that the gerontological response be efficient and effective. In this statement, the board members of the Canadian Association on Gerontology/L’Association canadienne de gérontologie (CAG/ACG) and the Canadian Journal on Aging/La revue canadienne du vieillissement (CJA/RCV) acknowledge the contributions of CAG/ACG members and CJA/RCV readers. We also profile the complex ways that COVID-19 is affecting older adults, from individual to population levels, and advocate for the adoption of multidisciplinary collaborative teams to bring together different perspectives, areas of expertise, and methods of evaluation in the COVID-19 response.
Alluvial mineral sands rank among the most complex subjects for mineral characterization due to the diverse range of minerals present in the sediments, which may collectively contain a daunting number of elements (>20) in major or minor concentrations (>1 wt%). To comprehensively characterize the phase abundance and chemistry of these complex mineral specimens, a method was developed using hyperspectral x-ray and cathodoluminescence mapping in an electron probe microanalyser (EPMA), coupled with automated cluster analysis and quantitative analysis of clustered x-ray spectra. This method proved successful in identifying and quantifying over 40 phases from mineral sand specimens, including unexpected phases with low modal abundance (<0.1%). The standard-based quantification method measured compositions in agreement with expected stoichiometry, with elemental detection limits in the range of <10–1,000 ppm, depending on phase abundance, and proved reliable even for challenging mineral species, such as the multi-rare earth element (REE) bearing mineral xenotime [(Y,REE)PO4] for which 24 elements were analyzed, including 12 overlapped REEs. The mineral identification procedure was also capable of characterizing mineral groups that exhibit significant compositional variability due to the substitution of multiple elements, such as garnets (Mg, Ca, Fe, Mn, Cr), pyroxenes (Mg, Ca, Fe), and amphiboles (Na, Mg, Ca, Fe, Al).
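The abstract above pairs hyperspectral mapping with automated cluster analysis to group per-pixel spectra into phases. The specific clustering software and algorithm are not named, so the following is only a toy illustration of the general idea, using a minimal k-means on short, hypothetical intensity vectors.

```python
import random

def kmeans(spectra, k, iters=50, seed=0):
    """Toy k-means: group spectra (equal-length intensity vectors)
    into k clusters by squared Euclidean distance. Illustrative only;
    the EPMA workflow in the study uses its own cluster analysis."""
    rng = random.Random(seed)
    centroids = [list(s) for s in rng.sample(spectra, k)]
    labels = [0] * len(spectra)
    for _ in range(iters):
        # Assign each spectrum to its nearest centroid
        labels = [min(range(k),
                      key=lambda j: sum((x - c) ** 2
                                        for x, c in zip(s, centroids[j])))
                  for s in spectra]
        # Recompute each centroid as the mean of its members
        for j in range(k):
            members = [s for s, l in zip(spectra, labels) if l == j]
            if members:
                centroids[j] = [sum(col) / len(members)
                                for col in zip(*members)]
    return labels

# Hypothetical 3-channel "spectra" forming two well-separated phases
data = [[1, 0, 0], [0.9, 0.1, 0], [0, 0, 1], [0.1, 0, 0.9]]
labels = kmeans(data, 2)
```

In the study's actual workflow, each resulting cluster's summed x-ray spectrum is then quantified against standards to identify the phase.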
Depression is among the most common mental illnesses in Canada. Although many factors contribute to depression, stress is among the most commonly reported. Studies suggest that marginalized groups often experience high levels of stress.
Objective
To examine associations between ethnicity and depressive symptoms among university students.
Aim
To identify if ethnic groups, particularly Aboriginal students, are at greater risk of depression.
Methods
Online survey data were collected from students attending eight universities in the Canadian Maritime Provinces (n = 10,180). Depressive symptoms were assessed using the 12-item version of the Center for Epidemiological Studies Depression Scale. Ethnicity was organized into five groups: Caucasian only, Aboriginal only, Aboriginals with other ethnicities, Mixed Ethnicity (not including Aboriginal), and Other (single ethnicity not including Aboriginal or Caucasian). Unadjusted and adjusted logistic regression models were used to assess associations between ethnicity and elevated depressive symptoms. Adjusted models accounted for demographic, socioeconomic, and behavioural characteristics.
Results
In adjusted analyses for men, Mixed (OR: 2.01; 95% CI: 1.12–3.63) and Other ethnic students (OR: 1.47; 95% CI: 1.11–1.96) were more likely to have elevated depressive symptoms than Caucasians. There were no differences between those who were Aboriginal and those who were Caucasian. In unadjusted and adjusted analyses for women, depressive symptoms in ethnic groups (including Aboriginals) were not significantly different from Caucasians.
Conclusion
Among male university students in the Maritime provinces, ethnicity (other than being Aboriginal) was associated with depressive symptoms in comparison to Caucasians, after adjusting for covariates. However, among women, ethnicity was not significantly associated with depressive symptoms.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
This article describes the development, implementation, and evaluation of a complex methotrexate ethics case used in teaching a Pharmacy Law and Ethics course. Qualitative analysis of student reflective writings provided useful insight into the students’ experience and comfort level with the final ethics case in the course. These data demonstrate a greater student appreciation of different perspectives, the potential for conflict in communicating about such cases, and the importance of patient autonomy. Faculty lessons learned are also described, facilitating adoption of this methotrexate ethics case by other healthcare profession educators.
An asymptotic model is derived for the competitive diffusion-limited evaporation of multiple thin sessile droplets under the assumption that the droplets are well separated. Exact solutions of the model are obtained for a pair of identical droplets and for a polygonal array of identical droplets, and the model is found to perform well even outside its formal range of validity, up to and including the limit of touching droplets. The shielding effect of droplets on each other is demonstrated, and the model is used to investigate the effect of this shielding on droplet evolutions and lifetimes, as well as on the coffee-ring effect. The theoretical predictions of the model are found to be in good agreement with recent experimental results for seven relatively closely spaced droplets, suggesting that the model could be a useful tool for studying a wide range of other droplet configurations.
Objective:
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection, for prevention of central-line–associated bloodstream infection (CLABSI).
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate models assessed days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated.
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% from the baseline to the intervention period (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). According to the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). According to the multivariable regression, days to removal of lines with a CLISA score of 2 or 3 was 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1% from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9% from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined whether time-of-day eating patterns were associated with mood disorders among adults.
Methods
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
Results
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (relative risk = 1.07; 95% CI 1.00–1.14).
Conclusions
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
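The prevalence ratios above come from log-binomial regression models. As a minimal sketch of the underlying quantity, a crude (unadjusted) prevalence ratio with a Katz log-based confidence interval can be computed from counts; the counts below are hypothetical and will not reproduce the abstract's adjusted estimates.

```python
from math import exp, log, sqrt

def prevalence_ratio_ci(a, n1, c, n2, z=1.96):
    """Prevalence ratio and Katz log-based 95% CI:
    a cases among n1 exposed, c cases among n2 unexposed."""
    pr = (a / n1) / (c / n2)
    se = sqrt((1 - a / n1) / a + (1 - c / n2) / c)  # SE of log(PR)
    return pr, exp(log(pr) - z * se), exp(log(pr) + z * se)

# Hypothetical counts: 24/120 with mood disorder in a persistent
# Late-pattern group vs 12/120 in the reference group.
pr, lo, hi = prevalence_ratio_ci(24, 120, 12, 120)
```

A PR of 2 with a lower confidence limit above 1 would indicate roughly double the prevalence in the exposed group, the same reading as the PR = 2.04 reported above.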
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
Methods:
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
Results:
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
Conclusions:
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this previous literature by using a genetically informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and early childhood discriminators of such patterns from the toddler period to adolescence. The sample represents a cohort of 731 toddlers and diverse families recruited based on socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child-, family-, and community-level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for those children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent low versus persistent high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and, specifically, prevention efforts addressing early child and family risk.