When a liberal-democratic state signs a treaty or wages a war, does its whole polity do those things? In this article, we approach this question via the recent social ontological literature on collective agency. We provide arguments that it does and that it does not. The arguments are presented via three considerations: the polity's control over what the state does; the polity's unity; and the influence of individual polity members. We suggest that the answer to our question differs for different liberal-democratic states and depends on two underlying considerations: (1) the amount of discretion held by the state's officeholders; (2) the extent to which the democratic procedure is deliberative rather than aggregative.
Sleep quantity and quality are associated with executive function (EF) in experimental studies, and in individuals with sleep disorders. With advancing age, sleep quantity and quality decline, as does the ability to perform EF tasks, suggesting that sleep disruption may contribute to age-related EF declines. This cross-sectional cohort study tested the hypothesis that poorer sleep quality (i.e., the frequency and duration of awakenings) and/or quantity may partly account for age-related EF deficits.
Community-dwelling older adults (N = 184) completed actigraphic sleep monitoring followed by a range of EF tasks. Two EF factors were extracted using exploratory structural equation modeling. Sleep variables did not mediate the relationship between age and EF factors. Post hoc moderated mediation analyses were conducted to test whether cognitive reserve compensates for sleep-related EF deficits, using years of education as a proxy measure of cognitive reserve.
We found a significant interaction between cognitive reserve and the number and frequency of awakenings, explaining a small (approximately 3%) but significant amount of variance in EF. Specifically, in individuals with fewer than 11 years of education, greater sleep disturbance was associated with poorer EF, but sleep did not impact EF in those with more education. There was no association between age and sleep quantity.
This study highlights the role of cognitive reserve in the sleep–EF relationship, suggesting individuals with greater cognitive reserve may be able to counter the impact of disturbed sleep on EF. Therefore, improving sleep may confer some protection against EF deficits in vulnerable older adults.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled nursing facility (SNF), and the strategies that controlled transmission.
Design, Setting, and Participants:
Cohort study conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPS) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2; whole genome sequencing (WGS) characterized viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and recommending personal protective equipment (isolation gown, gloves, N95 respirator, and eye protection) for clinical interactions in units with confirmed cases.
Of 725 staff and residents tested through targeted testing and serial PPS, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen (71%) were linked to a single unit. Targeted testing identified 17 (81%) cases; PPS identified 4 (19%). Most (71%) cases were identified prior to IPC intervention. WGS was performed on SARS-CoV-2 isolates from four staff and four residents; five were of Santa Clara County lineage and the other three were of distinct lineages.
Early implementation of targeted testing, serial PPS, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
Observing fetal development in utero is vital to further the understanding of later-life diseases. Magnetic resonance imaging (MRI) offers a tool for obtaining a wealth of information about fetal growth, development, and programming not previously available using other methods. This review provides an overview of MRI techniques used to investigate the metabolic and cardiovascular consequences of the developmental origins of health and disease (DOHaD) hypothesis. These methods add to the understanding of the developing fetus by examining fetal growth and organ development, adipose tissue and body composition, fetal oximetry, placental microstructure, diffusion, perfusion, flow, and metabolism. MRI assessment of fetal growth, organ development, metabolism, and the amount of fetal adipose tissue could give early indicators of abnormal fetal development. Noninvasive fetal oximetry can accurately measure placental and fetal oxygenation, which improves current knowledge on placental function. Additionally, measuring deficiencies in the placenta’s transport of nutrients and oxygen is critical for optimizing treatment. Overall, the detailed structural and functional information provided by MRI is valuable in guiding future investigations of DOHaD.
To develop a pediatric research agenda focused on pediatric healthcare-associated infections and antimicrobial stewardship topics that will yield the highest impact on child health.
The study included 26 geographically diverse adult and pediatric infectious diseases clinicians with expertise in healthcare-associated infection prevention and/or antimicrobial stewardship (topic identification and ranking of priorities), as well as members of the Division of Healthcare Quality Promotion at the Centers for Disease Control and Prevention (topic identification).
Using a modified Delphi approach, expert recommendations were generated through an iterative process for identifying pediatric research priorities in healthcare-associated infection prevention and antimicrobial stewardship. The multistep, 7-month process included a literature review, interactive teleconferences, web-based surveys, and 2 in-person meetings.
A final list of 12 high-priority research topics was generated in the 2 domains. High-priority healthcare-associated infection topics included judicious testing for Clostridioides difficile infection, chlorhexidine (CHG) bathing, measuring and preventing hospital-onset bloodstream infection rates, surgical site infection prevention, and surveillance and prevention of multidrug-resistant gram-negative rod infections. Antimicrobial stewardship topics included β-lactam allergy de-labeling, judicious use of perioperative antibiotics, intravenous to oral conversion of antimicrobial therapy, developing a patient-level “harm index” for antibiotic exposure, and benchmarking and/or peer comparison of antibiotic use for common inpatient conditions.
We identified 6 healthcare-associated infection topics and 6 antimicrobial stewardship topics as potentially high-impact targets for pediatric research.
Background: This review describes the epidemiology of carbapenemase-producing organisms (CPO) in both the community and hospitalized populations in the province of Alberta. Methods: Newly identified CPO-positive individuals from April 1, 2013, to March 31, 2018, were retrospectively reviewed from 3 data sources, which shared a common provincial CPO case definition: (1) positive CPO results from the Provincial Laboratory for Public Health, which provides all referral and confirmatory CPO testing; (2) CPO cases reported to Alberta Health; and (3) CPO surveillance from Alberta Health Services Infection Prevention and Control (IPC). The 3 data sources were collated, and initial CPO cases were classified according to their likely location of acquisition: hospital-acquired, hospital-identified, on admission, and community-identified. Risk factors and adverse outcomes were obtained from linkage to administrative data. Results: In total, 171 unique individuals were newly identified with a first-time CPO case. Of these, 15% (25 of 171) were hospital-acquired (HA), 21% (36 of 171) were hospital-identified (HI), 33% (57 of 171) were identified on admission, and 31% (53 of 171) were community-identified. Overall, 9% (15 of 171) resided in long-term care facilities. Of all patients in acute-care facilities, 30% (35 of 118) had infections and 70% were colonized. Overall, 38% (65 of 171) had an acute-care admission in the year prior to CPO identification; 59% (63 of 106) of those who did not have a previous admission had received healthcare outside Alberta. A large proportion of on-admission (81%, 46 of 57) and community-identified (66%, 33 of 53) cases did not have any acute-care admissions in Alberta in the previous year. Overall, 10% (14 of 171) had ICU admissions in Alberta within 30 days of CPO identification, and 5% (8 of 171) died within 30 days. The most common carbapenemase gene identified was NDM-1 (53%, 90 of 171).
Conclusions: These findings highlight the robust nature of Alberta’s provincial CPO surveillance network. We reviewed 3 different databases (laboratory, health ministry, IPC) to obtain comprehensive data and better understand the epidemiology of CPO in both community and hospital settings. More than half of the individuals with CPO were first identified in the community or on admission; most of these had received healthcare outside Alberta and had no acute-care admissions in Alberta in the previous year. It is important to be aware of the growing reservoir of CPO outside the hospital setting because it could impact future screening and management practices.
Background: Bloodstream infections (BSIs) due to methicillin-resistant Staphylococcus aureus (MRSA) are important causes of morbidity and mortality in hospitalized patients. Long-term national MRSA BSI surveillance establishes rates for internal and external comparison and provides insight into epidemiologic, molecular, and resistance trends. Here, we present and discuss national MRSA BSI incidence rates and trends over time in Canadian acute-care hospitals from 2008 to 2018. Methods: The Canadian Nosocomial Infection Surveillance Program (CNISP) is a collaborative effort of the Association of Medical Microbiology and Infectious Disease Canada and the Public Health Agency of Canada. Since 1995, the CNISP has conducted hospital-based sentinel surveillance of MRSA BSIs. Data were collected using standardized definitions and forms from hospitals that participate in the CNISP (48 hospitals in 2008 to 62 hospitals in 2018). For each MRSA BSI identified, the medical record was reviewed for clinical and demographic information, and when possible, 1 blood-culture isolate per patient was submitted to a central laboratory for further molecular characterization and susceptibility testing. Results: From 2008 to 2013, MRSA BSI rates per 10,000 patient days were relatively stable (0.60–0.56). Since 2014, MRSA BSI rates have gradually increased, from 0.66 to 1.05 in 2018. Although healthcare-associated (HA) MRSA BSI showed a minimal increase (0.40 in 2014 to 0.51 in 2018), community-acquired (CA) MRSA BSI increased by 150%, from 0.20 in 2014 to 0.50 in 2018 (Fig. 1). Laboratory characterization revealed that the proportion of isolates identified as CMRSA 2 (USA 100) decreased each year, from 39% in 2015 to 28% in 2018, while CMRSA 10 (USA 300) increased from 41% to 47%. Susceptibility testing showed a decrease in clindamycin resistance from 82% in 2013 to 41% in 2018.
Conclusions: Over the last decade, ongoing prospective MRSA BSI surveillance has shown relatively stable HA-MRSA rates, while CA-MRSA BSI rates have risen substantially. The proportion of isolates most commonly associated with HA-MRSA BSI (CMRSA 2/USA 100) is decreasing, and because resistance trends are tied to the prevalence of specific epidemic types, a large decrease in clindamycin resistance has been observed. MRSA BSI surveillance has shown a changing pattern in the epidemiology and laboratory characterization of MRSA BSI. The addition of hospitals in later years that may have had higher rates of CA-MRSA BSI could be a confounding factor. Continued comprehensive national surveillance will provide valuable information to address the challenges of infection prevention and control of MRSA BSI in hospitals.
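The rate arithmetic used throughout these surveillance reports (cases per 10,000 patient days, and the reported 150% rise in CA-MRSA BSI) can be sketched as follows; the case and patient-day counts below are illustrative, not the actual CNISP denominators:

```python
def incidence_rate(cases, patient_days, per=10_000):
    """Incidence rate expressed per `per` patient days."""
    return cases / patient_days * per

def percent_change(old_rate, new_rate):
    """Relative change between two rates, in percent."""
    return (new_rate - old_rate) / old_rate * 100

# Illustrative counts chosen only to reproduce the reported CA-MRSA BSI rates.
rate_2014 = incidence_rate(20, 1_000_000)   # 0.20 per 10,000 patient days
rate_2018 = incidence_rate(50, 1_000_000)   # 0.50 per 10,000 patient days
print(round(percent_change(rate_2014, rate_2018)))  # a 150% increase
```

The same convention (with 1,000 line days as the base) underlies the CLABSI rates reported later in this collection.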
Background: Carbapenemase-producing Enterobacterales (CPE) have rapidly become a global health concern and are associated with substantial morbidity and mortality due to limited treatment options. Travel to endemic areas, especially healthcare exposure in these areas, is an important risk factor for acquisition. We describe the evolving epidemiology, molecular features, and outcomes of CPE in Canada through surveillance by the Canadian Nosocomial Infection Surveillance Program (CNISP). Methods: CNISP has conducted surveillance for CPE among inpatients and outpatients of all ages since 2010. Participating acute-care facilities submit eligible specimens to the National Microbiology Laboratory for detection of carbapenemase production, and epidemiological data are collected. Incidence rates per 10,000 patient days are calculated based on inpatient data. Results: In total, 59 CNISP hospitals in 10 Canadian provinces, representing 21,789 beds and 6,785,013 patient days, participated in this surveillance. From 2010 to 2018, 118 (26%) CPE-infected and 547 (74%) CPE-colonized patients were identified. Few pediatric cases were identified (n = 18). Infection incidence rates remained low and stable (0.02 per 10,000 patient days in 2010 to 0.03 per 10,000 patient days in 2018), whereas colonization incidence rates increased by 89% over the surveillance period. Overall, 92% of cases were acquired in a healthcare facility: 61% (n = 278) in a Canadian healthcare facility and 31% (n = 142) in a healthcare facility outside Canada. Of the 8% of cases not acquired in a healthcare facility, 50% (16 of 32) reported travel outside of Canada in the 12 months prior to the positive culture. The distribution of carbapenemases varied by region: New Delhi metallo-β-lactamase (NDM) was dominant in western Canada (59%) and Klebsiella pneumoniae carbapenemase (KPC) in central Canada (66%).
NDM and the class D carbapenemase OXA-48 were more commonly identified among those who had traveled outside of Canada, whereas KPC was more commonly identified among patients without travel. In addition, 30-day all-cause mortality was 14% (25 of 181) among CPE-infected patients and 32% (14 of 44) among those with bacteremia. Conclusions: CPE rates remain low in Canada; however, national surveillance data suggest that the increase in CPE is now being driven by local nosocomial transmission as well as travel and healthcare within endemic areas. Changes in screening practices may have contributed to the increase in colonizations; however, these data are currently lacking and will be collected moving forward. These data highlight the need to intensify surveillance and coordinate infection control measures to prevent further spread of CPE in Canadian acute-care hospitals.
Susy Hota reports contracted research for Finch Therapeutics. Allison McGeer reports funds to her institution for projects for which she is the principal investigator from Pfizer and Merck, as well as consulting fees from the following companies: Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
Background: Nosocomial central-line–associated bloodstream infections (CLABSIs) are an important cause of morbidity and mortality in hospitalized patients. CLABSI surveillance establishes rates for internal and external comparison, identifies risk factors, and allows assessment of interventions. Objectives: To determine the frequency of CLABSIs among adult patients admitted to intensive care units (ICUs) in CNISP hospitals and to evaluate trends over time. Methods: CNISP is a collaborative effort of the Canadian Hospital Epidemiology Committee, the Association of Medical Microbiology and Infectious Disease Canada, and the Public Health Agency of Canada. Since 1995, CNISP has conducted hospital-based sentinel surveillance of healthcare-associated infections. Overall, 55 CNISP hospitals participated in ≥1 year of CLABSI surveillance. Adult ICUs were categorized as mixed ICUs or cardiovascular (CV) surgery ICUs. Data were collected using standardized definitions and collection forms, and line-day denominators were collected for each participating ICU. Negative-binomial regression was used to test for linear trends, with robust standard errors to account for clustering by hospital. The Fisher exact test was used to compare binary variables. Results: Each year, 28–42 adult ICUs participated in surveillance (27–37 mixed, 6–8 CV surgery). In both mixed ICUs and CV-ICUs, rates remained relatively stable between 2011 and 2018 (Fig. 1). In mixed ICUs, CLABSI rates were 1.0 per 1,000 line days in both 2011 and 2018 (test for linear trend, P = .66). In CV-ICUs, CLABSI rates were 1.1 per 1,000 line days in 2011 and 0.8 per 1,000 line days in 2018 (P = .19). Case age and gender distributions were consistent across the surveillance period. The 30-day all-cause mortality rate was 29% in both 2011 and 2018 (annual range, 29%–35%).
Between 2011 and 2018, the percentage of isolated microorganisms that were coagulase-negative staphylococci (CONS) decreased from 31% to 18% (P = .004). The percentage of other gram-positive organisms increased from 32% to 37% (P = .34); within this group, Bacillus increased from 0% to 4% of isolates and methicillin-susceptible Staphylococcus aureus from 2% to 6%. Gram-negative organisms increased from 21% to 27% (P = .19). Yeast represented 16% of isolates in 2011 and 18% in 2018; however, the percentage of yeast that were Candida albicans decreased over time (58% of yeast in 2011 and 30% in 2018; P = .04). In each year between 2011 and 2018, the most commonly identified microorganisms were CONS (18% in 2018) and Enterococcus spp (18% in 2018). Conclusions: Ongoing CLABSI surveillance has shown stable rates of CLABSI in adult ICUs from 2011 to 2018. The causative microorganisms have changed, with CONS decreasing from 31% to 18%.
Funding: CNISP is funded by the Public Health Agency of Canada.
Disclosures: Allison McGeer reports funds to her for studies, for which she is the principal investigator, from Pfizer and Merck, as well as consulting fees from Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
The criteria for objective memory impairment in mild cognitive impairment (MCI) are vaguely defined. Aggregating the number of abnormal memory scores (NAMS) is one way to operationalise memory impairment, which we hypothesised would predict progression to Alzheimer’s disease (AD) dementia.
As part of the Australian Imaging, Biomarkers and Lifestyle Flagship Study of Ageing, 896 older adults who did not have dementia were administered a psychometric battery including three neuropsychological tests of memory, yielding 10 indices of memory. We calculated the number of memory scores corresponding to z ≤ −1.5 (i.e., the NAMS) for each participant. Incident diagnosis of AD dementia was established by consensus of an expert panel after 3 years.
Of the 722 (80.6%) participants who were followed up, 54 (7.5%) developed AD dementia. There was a strong correlation between NAMS and probability of developing AD dementia (r = .91, p = .0003). Each abnormal memory score conferred an additional 9.8% risk of progressing to AD dementia. The area under the receiver operating characteristic curve for NAMS was 0.87 [95% confidence interval (CI) .81–.93, p < .01]. The odds ratio for NAMS was 1.67 (95% CI 1.40–2.01, p < .01) after correcting for age, sex, education, estimated intelligence quotient, subjective memory complaint, Mini-Mental State Exam (MMSE) score and apolipoprotein E ϵ4 status.
Aggregation of abnormal memory scores may be a useful way of operationalising objective memory impairment, predicting incident AD dementia and providing prognostic stratification for individuals with MCI.
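The NAMS aggregation described above is simple to operationalise; a minimal sketch (the z ≤ −1.5 cutoff comes from the study, while the ten index scores below are invented):

```python
def nams(z_scores, cutoff=-1.5):
    """Number of Abnormal Memory Scores: count of indices at or below the cutoff."""
    return sum(1 for z in z_scores if z <= cutoff)

# Ten memory indices (z-scores) for one hypothetical participant.
scores = [0.3, -1.6, -0.2, -2.1, 0.8, -1.5, -0.9, 0.1, -1.4, -3.0]
print(nams(scores))  # 4 scores fall at or below z = -1.5
```

Note that the boundary value z = −1.5 itself counts as abnormal under the "z ≤ −1.5" criterion.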
To evaluate the validity and reproducibility of a 152-item semi-quantitative food-frequency questionnaire (SFFQ) for estimating flavonoid intakes.
Over a 1-year period, participants completed two SFFQs and two weighed 7-d dietary records (7DDR). Flavonoid intakes from the SFFQ were estimated separately using the Harvard (SFFQ-Harvard) and Phenol-Explorer (SFFQ-PE) food composition databases. 7DDR flavonoid intakes were derived using the Phenol-Explorer database (7DDR-PE). Validity was assessed using Spearman’s rank correlation coefficients deattenuated for random measurement error (rs), and reproducibility was assessed using rank intraclass correlation coefficients.
This validation study primarily included participants from two large observational cohort studies.
Six hundred forty-one men and 724 women.
When compared with two 7DDR-PE, the validity of total flavonoid intake assessed by SFFQ-PE was high for both men and women (rs = 0·77 and rs = 0·74, respectively). The rs for flavonoid subclasses ranged from 0·47 for flavones to 0·78 for anthocyanins in men and from 0·46 for flavonols to 0·77 for anthocyanins in women. We observed similarly moderate (0·4–0·7) to high (≥0·7) validity when using SFFQ-Harvard estimates, except for flavones (rs = 0·25 for men and rs = 0·19 for women). The SFFQ demonstrated high reproducibility for total flavonoid and flavonoid subclass intake estimates when using either food composition database. The intraclass correlation coefficients ranged from 0·69 (flavonols-PE) to 0·80 (proanthocyanidins-PE) in men and from 0·67 (flavonols-PE) to 0·77 (flavan-3-ol monomers-Harvard) in women.
SFFQ-derived intakes of total flavonoids and flavonoid subclasses (except for flavones) are valid and reproducible for both men and women.
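Deattenuating an observed correlation for random within-person error in the reference method is conventionally done with the Rosner–Willett correction; a sketch, where the observed correlation and variance ratio are invented and only the two-replicate 7DDR design comes from the study:

```python
import math

def deattenuate(r_observed, lam, n_replicates):
    """Rosner-Willett correction of an observed validity correlation.
    lam: ratio of within- to between-person variance of the reference method.
    n_replicates: number of replicate reference measurements per person."""
    return r_observed * math.sqrt(1 + lam / n_replicates)

# Hypothetical inputs: observed r = 0.65, variance ratio 0.8, two 7DDRs.
print(round(deattenuate(0.65, 0.8, 2), 2))
```

With no within-person variation (lam = 0) the correction leaves the observed correlation unchanged, as expected.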
Adverse programming of adult non-communicable disease can be induced by poor maternal nutrition during pregnancy, and the periconception period has been identified as a particularly vulnerable window. In the current study, we used a mouse maternal low-protein diet fed either for the duration of pregnancy (LPD) or exclusively during the preimplantation period (Emb-LPD), with control nutrition provided thereafter and postnatally, to investigate effects on fetal bone development and quality. This model has been shown previously to induce cardiometabolic and neurological disease phenotypes in offspring. Three-dimensional micro-computed tomography examination at embryonic days E14.5 and E17.5, reflecting early and late stages of bone formation, demonstrated that LPD treatment caused increased bone formation of relatively high mineral density quality in males, but not females, at E14.5, disproportionate to fetal growth, with bone quality maintained at E17.5. In contrast, Emb-LPD caused a late increase in male fetal bone growth, proportionate to fetal growth, at E17.5, affecting the central and peripheral skeleton and of reduced mineral density quality relative to controls. These altered dynamics in bone growth coincide with increased placental efficiency, indicating compensatory responses to the dietary treatments. Overall, our data show that fetal bone formation and mineral quality depend upon maternal dietary protein content and are sex-specific. In particular, we find the duration and timing of poor maternal diet to be critical to the outcomes, with periconceptional protein restriction leading to male offspring with increased bone growth but poor mineral density, thereby susceptible to later disease risk.
Spinal muscular atrophy (SMA) is a devastating rare disease that affects individuals regardless of ethnicity, gender, and age. The first disease-modifying therapy for SMA, nusinersen, was approved by Health Canada, as well as by the American and European regulatory agencies, following positive clinical trial outcomes. The trials were conducted in a narrow pediatric population defined by age, severity, and genotype. Broad approval of the therapy necessitates close follow-up of potential rare adverse events and of effectiveness in the larger real-world population.
The Canadian Neuromuscular Disease Registry (CNDR) undertook an iterative multi-stakeholder process to expand the existing SMA dataset to capture items relevant to patient outcomes in a post-marketing environment. The CNDR SMA expanded registry is a longitudinal, prospective, observational study of patients with SMA in Canada designed to evaluate the safety and effectiveness of novel therapies and provide practical information unattainable in trials.
The consensus expanded dataset includes items that address therapy effectiveness and safety and is collected in a multicenter, prospective, observational study, including SMA patients regardless of therapeutic status. The expanded dataset is aligned with global datasets to facilitate collaboration. Additionally, consensus dataset development aimed to standardize appropriate outcome measures across the network and broader Canadian community. Prospective outcome studies, data use, and analyses are independent of the funding partner.
Prospective outcome data collected will provide results on safety and effectiveness in a post-therapy approval era. These data are essential to inform improvements in care and access to therapy for all SMA patients.
OBJECTIVES/GOALS: To characterize the various social and health trajectories of women released from jail, and how these trajectories influence women’s risky sexual and drug behaviors; and to identify areas in which prevention programs and community interventions can be implemented to improve social and health outcomes. METHODS/STUDY POPULATION: The present study analyzes data collected as part of the Sexual Health Empowerment (SHE) Project health literacy intervention. Participants were recruited from three county jails in the greater Kansas City area. At baseline, participants completed a survey that assessed their sociodemographic characteristics and social histories prior to incarceration. Women were recruited between 2014 and 2016 and were followed up annually after program completion with surveys assessing long-term health and social circumstances. The present study is a secondary analysis of baseline and follow-up data; final analyses will include survey data from 126 women. RESULTS/ANTICIPATED RESULTS: In this study, we use Hobfoll’s Conservation of Resources (COR) theory to conceptualize the impacts of stress on the social and health behaviors of justice-involved women in the years following release from jail. We hypothesize that “loss spirals,” a term coined by Stevan Hobfoll, create psychological stress that drives justice-involved women to adopt behaviors intended to generate more resources and help them cope with the stress. We expect to find that women struggle to maintain ties to stable housing, employment, and support, which we believe to be central to these “loss spirals,” and that the spirals are associated with sexual and drug-related health risks.
DISCUSSION/SIGNIFICANCE OF IMPACT: This study aims to define a succinct longitudinal timeline of biopsychosocial outcomes for women released from jail, in order to improve prevention and intervention techniques, improve the social and health circumstances of women leaving jail, and reduce recidivism.
The aim of this study was to explore associations between internet/email use and social isolation and loneliness in a large sample of older English adults. Data from the English Longitudinal Study of Ageing Wave 8 were used, with complete data available for 4,492 men and women aged ⩾ 50 years (mean age = 64.3, standard deviation = 13.3; 51.7% males). Binomial logistic regression was used to analyse cross-sectional associations between internet/email use and social isolation and loneliness. The majority of older adults reported using the internet/email every day (69.3%); fewer participants reported once a week (8.5%), once a month (2.6%), once every three months (0.7%), less than every three months (1.5%) or never (17.4%). No significant associations were found between internet/email use and loneliness; however, non-linear associations were found for social isolation. Older adults using the internet/email either once a week (odds ratio (OR) = 0.60, 95% confidence interval (CI) = 0.49–0.72) or once a month (OR = 0.60, 95% CI = 0.45–0.80) were significantly less likely to be socially isolated than every-day users; those using the internet/email less than once every three months were significantly more likely to be socially isolated than every-day users (OR = 2.87, 95% CI = 1.28–6.40). Once-every-three-months and never users showed no difference in social isolation compared with every-day users. Weak associations were found between different online activities and loneliness, and strong associations were found with social isolation. The study updates knowledge of older adults’ internet/email habits, the devices they use and the activities they engage in online. The findings may be important for the design of digital behaviour-change interventions in older adults, particularly interventions targeting groups at risk of loneliness and/or social isolation.
Research using single-word paradigms has established that forced language switching incurs processing costs for some bilinguals, yet less research has addressed this phenomenon at the utterance level or considered real-world applications. The current study examined the impacts of forced language switching on spoken output and stress using a simulated virtual meeting. Twenty Spanish–English heritage bilinguals responded to general work-oriented questions in monolingual English (control) or language-switching (experimental) conditions. Responses were analyzed for mean length of utterance (MLU) and type-token ratio (TTR). Multilevel modeling revealed an interaction effect of Condition (control vs. experimental) and question order on MLU, such that participants in the experimental condition produced significantly shorter utterances by the end of the task. Participants also had significantly lower lexical variation (TTR) overall in the experimental condition than in the control condition. A 2 × 2 ANOVA revealed a significant effect of Condition and an interaction of Task (pre- vs. post-task) and Condition, such that participants in the control condition reported significantly lower stress after the activity. The results demonstrate the impact of a forced-switching condition on production at the utterance level. The findings have implications for theory and for scenarios in which heritage bilinguals are asked to use multiple languages in the workplace.
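The two output measures are straightforward to compute from transcribed responses; a word-level sketch (the study's exact tokenisation rules are not specified here, and the sample utterances are invented):

```python
def mlu_words(utterances):
    """Mean length of utterance in words (morpheme-level MLU would need tagging)."""
    return sum(len(u.split()) for u in utterances) / len(utterances)

def ttr(utterances):
    """Type-token ratio: unique word forms divided by total word tokens."""
    tokens = [w.lower() for u in utterances for w in u.split()]
    return len(set(tokens)) / len(tokens)

sample = ["I think the meeting went well", "the meeting was long"]
print(mlu_words(sample))       # (6 + 4) / 2 = 5.0
print(round(ttr(sample), 2))   # 8 types / 10 tokens = 0.8
```

TTR is sensitive to sample length, which is one reason studies often report it alongside, rather than instead of, utterance-length measures.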
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band, the system temperature is approximately 22 K, and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
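The quoted system temperature can be related to achievable sensitivity through the standard radiometer equation, ΔT = T_sys / √(n_pol Δν τ); a sketch using the quoted ~22 K and the full 3,328 MHz bandwidth (the dual-polarisation assumption is mine, and real observations would also fold in antenna efficiency and RFI excision):

```python
import math

def radiometer_noise(t_sys_k, bandwidth_hz, integration_s, n_pol=2):
    """Ideal radiometer equation: RMS noise (kelvin) of a total-power measurement."""
    return t_sys_k / math.sqrt(n_pol * bandwidth_hz * integration_s)

# ~22 K system temperature over the full 704-4032 MHz band, 1 s integration.
sigma_k = radiometer_noise(22.0, 3328e6, 1.0)
print(f"{sigma_k * 1e3:.3f} mK")  # sub-millikelvin ideal noise
```

The √(Δν τ) scaling is why an ultra-wide band is attractive for continuum and pulsar-timing sensitivity, independent of its spectral coverage.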
This study integrated an experimental medicine approach and a randomized cross-over clinical trial design following CONSORT recommendations to evaluate a cognitive training (CT) intervention for attention deficit hyperactivity disorder (ADHD). The experimental medicine approach was adopted because of documented pathophysiological heterogeneity within the diagnosis of ADHD. The cross-over design was adopted to provide the intervention for all participants and make maximum use of data.
Children (n = 93, mean age 7.3 ± 1.1 years) with ADHD or sub-threshold ADHD were randomly assigned to CT exercises over 15 weeks, either before or after 15 weeks of treatment-as-usual (TAU). Fifteen children dropped out of the CT/TAU group and 12 out of the TAU/CT group, leaving 66 for the cross-over analysis. Seven in the CT/TAU group completed CT before dropping out, making 73 available for the experimental medicine analyses. Attention, response inhibition, and working memory were assessed before and after CT and TAU.
Children were more likely to improve with CT than with TAU (27/66 vs. 13/66, McNemar p = 0.02). Consistent with the experimental medicine hypotheses, responders improved on all tests of executive function (p = 0.009–0.01) while non-responders improved on none (p = 0.27–0.81). The degree of clinical improvement was predicted by baseline and change scores in focused attention and working memory (p = 0.008). The response rate was higher in the inattentive and combined subtypes than in the hyperactive-impulsive subtype (p = 0.003).
Targeting cognitive dysfunction decreases clinical symptoms in proportion to improvement in cognition. Inattentive and combined subtypes were more likely to respond, consistent with targeted pathology and clinically relevant heterogeneity within ADHD.
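The McNemar comparison above tests the discordant pairs (children who improved under one condition but not the other); an exact version can be sketched with a binomial tail sum. The discordant counts below are hypothetical, chosen only to be consistent with the reported 27/66 vs. 13/66 margins:

```python
from math import comb

def exact_mcnemar_p(b, c):
    """Exact (binomial) McNemar test: b and c are the discordant pair counts.
    Under H0, each discordant pair is equally likely to fall either way."""
    n, k = b + c, min(b, c)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)  # two-sided

# Hypothetical discordant counts consistent with 27 vs. 13 marginal improvers.
print(round(exact_mcnemar_p(23, 9), 2))  # 0.02
```

The margins alone do not determine the paired table (only b − c = 14 is fixed), which is why the counts here are labelled hypothetical.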