Time-series cross-section (TSCS) data are prevalent in political science, yet many distinct challenges presented by TSCS data remain underaddressed. We focus on how dependence in both space and time complicates estimating either spatial or temporal dependence, dynamics, and effects. Little is known about how modeling one of temporal or cross-sectional dependence well while neglecting the other affects results in TSCS analysis. We demonstrate analytically and through simulations how misspecification of either temporal or spatial dependence inflates estimates of the other dimension’s dependence and thereby induces biased estimates and tests of other covariate effects. Therefore, we recommend the spatiotemporal autoregressive distributed lag (STADL) model with distributed lags in both space and time as an effective general starting point for TSCS model specification. We illustrate with two example reanalyses and provide R code to facilitate researchers’ implementation—from automation of common spatial-weights matrices (W) through estimated spatiotemporal effects/response calculations—for their own TSCS analyses.
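To make the specification concrete, below is a minimal Python sketch of the data-generating process a STADL model targets, with both a spatial lag (ρWy_t), a temporal lag (φy_{t−1}), and a spatiotemporal lag (θWy_{t−1}). The ring-contiguity W, all parameter values, and the long-run effect calculation are illustrative assumptions; the paper itself provides R code.

```python
import numpy as np

# Sketch of a spatiotemporal autoregressive distributed lag (STADL) process:
#   y_t = rho*W y_t + phi*y_{t-1} + theta*W y_{t-1} + beta*x_t + e_t
# All parameter values and the ring-contiguity W are illustrative assumptions.
rng = np.random.default_rng(0)
N, T = 20, 50                      # cross-sectional units, time periods
rho, phi, theta, beta = 0.3, 0.5, 0.1, 1.0

# Row-standardized spatial weights for a simple "ring" contiguity structure
W = np.zeros((N, N))
for i in range(N):
    W[i, (i - 1) % N] = W[i, (i + 1) % N] = 0.5

A_inv = np.linalg.inv(np.eye(N) - rho * W)   # reduced-form multiplier (I - rho*W)^-1
x = rng.normal(size=(T, N))
y = np.zeros((T, N))
for t in range(1, T):
    shock = phi * y[t - 1] + theta * W @ y[t - 1] + beta * x[t] + rng.normal(size=N)
    y[t] = A_inv @ shock

# Long-run steady-state effects of x on y, accounting for both space and time:
# ((1 - phi)I - (rho + theta)W)^-1 * beta; diagonal = direct effects,
# off-diagonal entries = spatial spillovers.
LR = np.linalg.inv((1 - phi) * np.eye(N) - (rho + theta) * W) * beta
print("average long-run direct effect:", LR.diagonal().mean())
```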
Optimizing needleless connector hub disinfection practice is a key strategy in central-line–associated bloodstream infection (CLABSI) prevention. In this mixed-methods evaluation, three products with varying scrub times were tested in experimental disinfection studies, followed by a qualitative nursing assessment of each.
Methods:
Needleless connectors were inoculated with varying concentrations of Staphylococcus epidermidis, Pseudomonas aeruginosa, and Staphylococcus aureus, followed by disinfection with a 70% isopropyl alcohol (IPA) wipe (15-second scrub time and 15-second dry time), a 70% IPA cap (10-second scrub time and 5-second dry time), or a 3.15% chlorhexidine gluconate with 70% IPA (CHG/IPA) wipe (5-second scrub time and 5-second dry time). Cultures of needleless connectors were obtained after disinfection to quantify bacterial reduction. We then surveyed a convenience sample of nursing staff assigned to intensive care units at an academic tertiary-care hospital about the use of each product.
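Bacterial reduction of this kind is typically summarized as a log10 reduction relative to controls. A minimal sketch of that calculation; the CFU counts below are invented for illustration, not study data.

```python
import numpy as np

# Hypothetical CFU counts illustrating how a log10 reduction factor is computed
# for a disinfection product relative to sterile-water controls.
control_cfu = np.array([1e5, 8e4, 1.2e5])   # recovered CFU after sham disinfection
product_cfu = np.array([2e2, 1e2, 3e2])     # recovered CFU after a 70% IPA wipe

log_reduction = np.log10(control_cfu.mean()) - np.log10(product_cfu.mean())
print(f"log10 reduction: {log_reduction:.2f}")   # ~2.7 logs in this toy example
```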
Results:
All products reduced overall bacterial burden compared with sterile water controls; however, the IPA and CHG/IPA wipes were superior to the IPA caps when product efficacy was compared. Nursing staff reported improved compliance with CHG/IPA wipes relative to the IPA wipes and the IPA caps, with many preferring the shorter scrub and dry times required for disinfection.
Conclusion:
Achieving adequate bacterial disinfection of needleless connectors while maximizing healthcare staff compliance with scrub and dry times may be best achieved with a combination CHG/IPA wipe.
Our objective was to quantify the cross-sectional associations between dietary fatty acid (DFA) patterns and cognitive function among Hispanic/Latino adults. This study included data from 8,942 participants of the Hispanic Community Health Study/Study of Latinos, a population-based cohort study (weighted mean age 56.2 years; 55.2% female). The National Cancer Institute (NCI) method was used to estimate dietary intake from two 24-hr recalls. We derived DFA patterns using principal components analysis with 26 fatty acid and total plant and animal monounsaturated fatty acid (MUFA) input variables. Global cognitive function was calculated as the average z-score of 4 neurocognitive tests. Survey linear regression models included multiple potential confounders such as age, sex, education, depressive symptoms, physical activity, energy intake, and cardiovascular disease. DFA patterns were characterized by consumption of long-chain saturated fatty acids (SFA), animal-based MUFA, and trans fatty acids (Factor 1); short- to medium-chain SFA (Factor 2); very-long-chain omega-3 polyunsaturated fatty acids (PUFA) (Factor 3); and very-long-chain SFA and plant-based MUFA and PUFA (Factor 4). Factor 2 was associated with greater scores for global cognitive function (β=0.037 ± 0.012) and the Digit Symbol Substitution (DSS) (β=0.56±0.17), Brief Spanish English Verbal Learning-Sum (B-SEVLT) (β=0.23 ± 0.11), and B-SEVLT-Recall (β=0.11 ± 0.05) tests (P<0.05 for all). Factors 1 (β=0.04 ± 0.01) and 4 (β=0.70 ± 0.18) were associated with higher DSS test scores (P<0.05 for all). Consumption of short- to medium-chain SFA may be associated with higher cognitive function among U.S.-residing Hispanic/Latino adults. Prospective studies are necessary to confirm these findings.
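A sketch of the pattern-derivation and regression steps described above, with synthetic data standing in for the 26 fatty-acid inputs; survey weighting, rotation choices, and the full confounder set are omitted for brevity.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Derive dietary-pattern factors by PCA on standardized intake variables, then
# regress a global cognition z-score on the factor scores. Data are synthetic.
rng = np.random.default_rng(1)
n, p = 500, 26
fatty_acids = rng.normal(size=(n, p))   # placeholder intake variables
cognition_z = rng.normal(size=n)        # placeholder global z-score

scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(fatty_acids))
X = sm.add_constant(scores)             # Factors 1-4 as predictors
fit = sm.OLS(cognition_z, X).fit()
print(fit.params)                       # betas for each DFA pattern
```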
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms and suicidality, lower educational attainment, not living together, and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
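A sketch of the PGS association test implied by the betas above: regressing AAO on a standardized polygenic score so that β reads as years of onset shift per standard deviation of the score. Data are synthetic; real analyses would additionally include ancestry principal components and cohort terms.

```python
import numpy as np
import statsmodels.api as sm

# Polygenic-score association test on synthetic data: beta is interpreted as
# "years of onset shift per SD of PGS".
rng = np.random.default_rng(2)
n = 1000
pgs = rng.normal(size=n)                           # standardized polygenic score
aao = 25 - 0.35 * pgs + rng.normal(scale=8, size=n)

fit = sm.OLS(aao, sm.add_constant(pgs)).fit()
print(f"beta = {fit.params[1]:.2f} years/SD, s.e. = {fit.bse[1]:.2f}")
```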
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
In total, 263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) across study groups were assessed with Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
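A sketch of the two analysis steps with synthetic data: a Kruskal–Wallis test of a biomarker across the four study groups, then a covariate-adjusted linear model linking the biomarker to a cognitive domain score. Variable names and values are placeholders, not study data.

```python
import numpy as np
import pandas as pd
from scipy.stats import kruskal
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
# Synthetic biomarker levels for 4 groups of 50 participants each
groups = [rng.lognormal(mean=m, size=50) for m in (1.0, 1.2, 1.1, 0.9)]
H, p = kruskal(*groups)                 # nonparametric test across the 4 groups
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.3f}")

# Covariate-adjusted regression of a cognitive domain on the biomarker
df = pd.DataFrame({
    "learning": rng.normal(size=200),
    "mcp1": np.concatenate(groups),
    "age": rng.integers(25, 70, 200),
    "cd4": rng.integers(200, 1200, 200),
})
print(smf.ols("learning ~ mcp1 + age + cd4", data=df).fit().params)
```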
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis, and these chemokines were linked to the cognitive domain of learning, which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in mitigating the persistent inflammation and cognitive impacts of HIV.
Mass asymptomatic SARS-CoV-2 nucleic acid amplification testing of healthcare personnel (HCP) was performed at a large tertiary-care health system. A low period prevalence of positive HCP was observed. Of those who tested positive, half had experienced mild symptoms in retrospect. HCP with even mild symptoms should be isolated and tested.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity would change the expected event rate for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
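The jump in event rates follows from simple volume scaling: for sources distributed roughly uniformly in space, the detection rate grows with the cube of the horizon distance, which in turn scales inversely with strain noise. A toy calculation with assumed numbers:

```python
# Back-of-envelope for why modest strain-sensitivity gains change event rates so
# much: detectable volume (hence rate) scales as horizon_distance**3, and the
# horizon distance is inversely proportional to strain noise.
# Both input numbers below are illustrative assumptions, not NEMO design values.
rate_aplus = 1 / 30        # assumed ~1 post-merger detection per few decades (per year)
sensitivity_gain = 5       # assumed high-frequency strain improvement factor
rate_nemo = rate_aplus * sensitivity_gain ** 3
print(f"~{rate_nemo:.1f} detections/year")   # ~4/yr, i.e. "a few per year"
```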
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), the traumatic event experienced by most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression, as mediated through peritraumatic symptoms and 2-week depression.
Results
Eight-week depression prevalence was relatively high (27.8%) and was associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
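A minimal product-of-coefficients sketch of the mediated pathway described above (exposure → 2-week depression → 8-week depression). The data and effect sizes are synthetic, and AURORA's actual mediation models are richer than this illustration.

```python
import numpy as np
import statsmodels.api as sm

# Product-of-coefficients mediation on synthetic data: the indirect effect is
# a (exposure -> mediator) times b (mediator -> outcome, adjusting for exposure).
rng = np.random.default_rng(4)
n = 800
exposure = rng.binomial(1, 0.4, n).astype(float)   # e.g., passenger vs. driver
mediator = 0.5 * exposure + rng.normal(size=n)     # 2-week depressive symptoms
outcome = 0.6 * mediator + 0.1 * exposure + rng.normal(size=n)

a = sm.OLS(mediator, sm.add_constant(exposure)).fit().params[1]
b = sm.OLS(outcome, sm.add_constant(np.column_stack([mediator, exposure]))).fit().params[1]
print(f"indirect (mediated) effect a*b = {a * b:.2f}")
```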
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. Elucidating these processes through more in-depth analyses of the rich and evolving AURORA database may yield new targets for intervention and new tools for risk-based stratification following trauma exposure.
Little is known about the neural substrates of suicide risk in mood disorders. Improving the identification of biomarkers of suicide risk, as indicated by a history of suicide-related behavior (SB), could lead to more targeted treatments to reduce risk.
Methods
Participants were 18 young adults with a mood disorder with a history of SB (as indicated by endorsing a past suicide attempt), 60 with a mood disorder with a history of suicidal ideation (SI) but not SB, 52 with a mood disorder with no history of SI or SB (MD), and 82 healthy comparison participants (HC). Resting-state functional connectivity within and between intrinsic neural networks, including cognitive control network (CCN), salience and emotion network (SEN), and default mode network (DMN), was compared between groups.
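As an illustration of the within- and between-network connectivity measure used here, a sketch that correlates ROI time series and averages the correlations inside and across network blocks. The time series are synthetic and the CCN/SEN/DMN labels are placeholders for real parcellation assignments.

```python
import numpy as np

# Correlate synthetic ROI time series, then average correlations within and
# between assumed network blocks (within-network means exclude the diagonal).
rng = np.random.default_rng(5)
ts = rng.normal(size=(200, 30))          # 200 timepoints x 30 ROIs (synthetic)
labels = np.repeat(["CCN", "SEN", "DMN"], 10)
corr = np.corrcoef(ts.T)                 # 30 x 30 connectivity matrix

for a in ("CCN", "SEN", "DMN"):
    for b in ("CCN", "SEN", "DMN"):
        block = corr[np.ix_(labels == a, labels == b)]
        if a == b:                       # within-network: drop self-correlations
            vals = block[~np.eye(block.shape[0], dtype=bool)]
        else:
            vals = block.ravel()
        print(f"{a}-{b}: mean r = {vals.mean():.3f}")
```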
Results
Several fronto-parietal regions (k > 57, p < 0.005) were identified in which individuals with SB demonstrated distinct patterns of connectivity within (in the CCN) and across networks (CCN-SEN and CCN-DMN). Connectivity with some of these same regions also distinguished the SB group when participants were re-scanned after 1–4 months. Extracted data defined SB group membership with good accuracy, sensitivity, and specificity (79–88%).
Conclusions
These results suggest that individuals with a history of SB in the context of mood disorders may show reliably distinct patterns of intrinsic network connectivity, even when compared to those with mood disorders without SB. Resting-state fMRI is a promising tool for identifying subtypes of patients with mood disorders who may be at risk for suicidal behavior.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that the Frascati criteria are too liberal, resulting in a high false-positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other commonly used methods for determining neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by the Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Method:
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing, structural magnetic resonance imaging, and magnetic resonance spectroscopy. Participants were first classified using Frascati versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS against the Meyer criteria, the same groupings were formed using GDS criteria in place of Frascati criteria.
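For context on the GDS method referenced above, a minimal sketch of the conventional algorithm: each demographically corrected T-score maps to a 0–5 deficit score (T ≥ 40 → 0, with each 5-point drop adding one deficit point down to T < 20 → 5), the scores are averaged, and a common cutoff of GDS ≥ 0.5 classifies impairment. The T-scores below are hypothetical.

```python
import numpy as np

def deficit_score(t):
    # T >= 40 -> 0 deficits; 35-39 -> 1; 30-34 -> 2; 25-29 -> 3; 20-24 -> 4; <20 -> 5
    edges = [20, 25, 30, 35, 40]
    return 5 - np.digitize(t, edges)

t_scores = np.array([52, 44, 38, 33, 41, 29, 47])   # hypothetical test battery
gds = deficit_score(t_scores).mean()
print(f"GDS = {gds:.2f} -> {'impaired' if gds >= 0.5 else 'unimpaired'}")
```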
Results:
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
Conclusions:
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased likelihood of SA. SA participants reported greater independence in everyday functioning, higher rates of employment, and better health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
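A sketch of the multinomial regression step, with synthetic stand-ins for the paper's predictors (age, verbal IQ, diabetes, and so on) and a three-level outcome of SA/CN/CI status.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Multinomial logistic regression predicting 3-level neurocognitive status.
# All predictors and labels below are synthetic placeholders.
rng = np.random.default_rng(6)
n = 700
X = np.column_stack([
    rng.integers(50, 65, n),       # age
    rng.normal(100, 15, n),        # verbal IQ
    rng.binomial(1, 0.2, n),       # diabetes (0/1)
]).astype(float)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize for stable fitting
y = rng.choice(["SA", "CN", "CI"], size=n, p=[0.17, 0.38, 0.45])

clf = LogisticRegression(max_iter=1000).fit(X, y)   # multinomial by default
print(dict(zip(clf.classes_, clf.coef_[:, 0])))     # per-class age coefficients
```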
The increased use of insecticide seed treatments in rice has raised many questions about the potential benefits of these products. In 2014 and 2015, a field experiment was conducted near Stuttgart and Lonoke, AR, to evaluate whether an insecticide seed treatment could lessen injury from acetolactate synthase (ALS)–inhibiting herbicides in imidazolinone-resistant (IR) rice. Two IR cultivars were tested (a hybrid, ‘CLXL745’, and an inbred, ‘CL152’), with and without an insecticide seed treatment (thiamethoxam). Four herbicide treatments were evaluated: a nontreated control, two applications of bispyribac-sodium (hereafter bispyribac), two applications of imazethapyr, and two applications of imazethapyr plus bispyribac. The first herbicide application was made to two- to three-leaf rice and the second immediately prior to flooding (one- to two-tiller rice). At both 2 and 4 wk after final treatment (WAFT), the sequential applications of imazethapyr or bispyribac plus imazethapyr were more injurious to CLXL745 than to CL152. This increased injury led to decreased groundcover 3 WAFT. Rice treated with thiamethoxam was less injured than nontreated rice and had improved groundcover and greater canopy heights. Even with up to 32% injury, the rice plants recovered by the end of the growing season, and yields within a cultivar were similar with and without a thiamethoxam seed treatment across all herbicide treatments. Based on these results, thiamethoxam can partially protect rice from injury caused by ALS-inhibiting herbicides as well as increase groundcover and canopy height; notably, the injury to rice never negatively affected yield.
Most agree that models of binary time-series-cross-sectional data in political science often possess unobserved unit-level heterogeneity. Despite this, there is no clear consensus on how best to account for these potential unit effects, and many of the issues involved appear to be misunderstood. For example, one oft-discussed concern with rare events data is the elimination of no-event units from the sample when estimating fixed effects models. Many argue that this is a reason to eschew fixed effects in favor of pooled or random effects models. We revisit this issue and clarify that the main concern with fixed effects models of rare events data is not inaccurate or inefficient coefficient estimation, but instead biased marginal effects. In short, evaluating only event-experiencing units gives an inaccurate estimate of the baseline risk, yielding inaccurate (often inflated) estimates of predictor effects. As a solution, we propose a penalized maximum likelihood fixed effects (PML-FE) estimator, which retains the complete sample by providing finite estimates of the fixed effects for each unit. We explore the small-sample performance of PML-FE versus common alternatives via Monte Carlo simulations, evaluating the accuracy of both parameter and effects estimates. Finally, we illustrate our method with a model of civil war onset.
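The core idea can be illustrated with a minimal Firth-penalized logit: the Jeffreys-prior penalty keeps fixed-effect estimates finite even for units that never experience an event, which is exactly the failure mode of ordinary MLE described above. This is a sketch under assumed data, not the paper's implementation; production analyses would use a vetted package.

```python
import numpy as np

def firth_logit(X, y, n_iter=50, tol=1e-8):
    """Firth-penalized logistic regression via Newton-Raphson. The Firth
    adjustment adds h*(1/2 - p) to the working residual, where h is the
    diagonal of the hat matrix; this yields finite estimates under separation."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        XtWX = X.T @ (W[:, None] * X)
        # h_i = W_i * x_i' (X'WX)^-1 x_i  (hat-matrix diagonal)
        h = np.einsum("ij,jk,ik->i", X, np.linalg.inv(XtWX), X) * W
        score = X.T @ (y - p + h * (0.5 - p))   # Firth-adjusted score
        step = np.linalg.solve(XtWX, score)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Toy rare-events panel: unit dummies (fixed effects) plus one covariate.
# Unit 0 never experiences an event, yet its fixed effect stays finite.
rng = np.random.default_rng(7)
n_units, t_len = 5, 40
x = rng.normal(size=n_units * t_len)
unit = np.repeat(np.arange(n_units), t_len)
y = rng.binomial(1, 0.05 + 0.05 * (x > 1)).astype(float)
y[unit == 0] = 0                                   # an all-zero ("no event") unit
X = np.column_stack([np.eye(n_units)[unit], x])    # unit dummies + covariate
print(firth_logit(X, y))
```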
The evolution of herbicide resistance is making it extremely difficult for US rice producers to use chemical control on weed species such as barnyardgrass and red rice. To combat herbicide resistance, it is imperative that alternative herbicide sites of action (SOAs) be incorporated into rice production whenever possible. There are currently no very-long-chain fatty acid–inhibiting herbicides (WSSA Group 15) labeled for use in US rice; however, pethoxamid is one such herbicide currently under development. If appropriate rice tolerance and weed control can be established, pethoxamid would represent a unique herbicide SOA for use in US rice. We conducted field trials near Stuttgart, AR, in 2015 and near Colt and Lonoke, AR, in 2016 to assess rice tolerance to pethoxamid and weed control from pethoxamid alone and in combination with other herbicides as a delayed preemergence (DPRE) application in drill-seeded rice. Pethoxamid was applied at 0, 420, or 560 g ai ha–1 alone and in combination with clomazone, imazethapyr, pendimethalin, and quinclorac. Minimal rice injury occurred with any treatment assessed. A reduction in rice shoot density and plant height compared to the nontreated control followed the use of pethoxamid; however, no decrease in yield resulted. Near Lonoke, the highest levels of barnyardgrass control followed the use of imazethapyr (91%) and quinclorac (89%), regardless of the presence of pethoxamid; however, pethoxamid applied at both rates in combination with clomazone and quinclorac increased barnyardgrass control compared to clomazone and quinclorac applied alone. Near Colt, barnyardgrass control of 92% and 96% resulted from pethoxamid alone, averaged over the high and low rates. Based on these data, rice can tolerate pethoxamid applied DPRE, and adequate levels of barnyardgrass control can be achieved at the rates evaluated within a program; hence, pethoxamid appears to be a viable option for use in rice to allow for increased rotation of herbicide SOAs to combat herbicide-resistant and difficult-to-control weeds.
Each year there are multiple reports of drift occurrences, and the majority of drift complaints in rice involve imazethapyr or glyphosate. In 2014 and 2015, multiple field experiments were conducted near Stuttgart, AR, and near Lonoke, AR, to evaluate whether insecticide seed treatments would reduce injury from glyphosate or imazethapyr drift or decrease the recovery time following exposure to a low rate of these herbicides. Study I was referred to as the “seed treatment study,” and Study II was the “drift timing study.” In the seed treatment study the conventional rice cultivar ‘Roy J’ was planted, and herbicide treatments included imazethapyr at 10.5 g ai ha–1, glyphosate at 126 g ae ha–1, or no herbicide. Each plot had a seed treatment of thiamethoxam, clothianidin, chlorantraniliprole, or no insecticide. The herbicides were applied at the two- to three-leaf growth stage. Crop injury was assessed 1, 3, and 5 wk after application. Averaged over site-years, thiamethoxam-treated rice had less injury than rice with no insecticide seed treatment at each rating, along with an increased yield. Clothianidin-treated rice also yielded more than rice without an insecticide seed treatment, but the reduction in injury for both herbicides was less pronounced than in the thiamethoxam-treated plots. Overall, chlorantraniliprole was generally the least effective of the three insecticides in reducing injury from either herbicide and in protecting rice yield potential. The drift timing study, conducted at Stuttgart, AR, was designed to determine whether damage to rice from glyphosate and imazethapyr was influenced by the timing of exposure (15, 30, or 45 d after planting) for thiamethoxam-treated and nontreated rice. There was an overall reduction in injury with the use of thiamethoxam, but the reduction was not dependent on the timing of the drift event. Reduced damage from physical drift of glyphosate and imazethapyr, along with increased yield relative to rice without an insecticide seed treatment, thus appears to be an added benefit of thiamethoxam seed treatment.
Pennsylvania smartweed [Persicaria pensylvanica (L.) M. Gómez] is a common weed of rice (Oryza sativa L.) in the midsouthern United States and has recently become a concern for farmers because of reduced tillage systems. Acetolactate synthase (ALS) inhibitors have been extensively used for controlling smartweeds in imidazolinone-resistant and conventional rice. In the present study, we confirmed resistance to commonly used ALS inhibitors in rice and characterized the underlying resistance mechanism in a P. pensylvanica biotype from southeast Missouri. A dose–response experiment was conducted in the greenhouse using bensulfuron-methyl, imazethapyr, and bispyribac-sodium to determine the resistance index (resistance/susceptibility [R/S]) based on GR50 estimates. The target-site ALS gene was amplified from R and S plants, and sequences were analyzed for mutations known to confer ALS-inhibitor resistance. The P. pensylvanica biotype in question was found to be resistant to bensulfuron-methyl (R/S=2,330), imazethapyr (R/S=12), and bispyribac-sodium (R/S=6). Sequencing of the ALS gene from R plants revealed two previously known mutations (Pro-197-Ser, Ala-122-Ser) conferring resistance to sulfonylureas and imidazolinones. This is the first report of ALS-inhibitor resistance in P. pensylvanica.
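The R/S resistance index reported above comes from GR50 estimates. A minimal sketch of that step, fitting a log-logistic dose-response curve to growth data for resistant (R) and susceptible (S) biotypes; the doses and responses below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# 3-parameter log-logistic dose-response model: growth falls from `upper`
# toward zero, with gr50 the dose giving a 50% growth reduction.
def log_logistic(dose, upper, gr50, slope):
    return upper / (1 + (dose / gr50) ** slope)

doses = np.array([0.1, 1, 10, 100, 1000, 10000], dtype=float)
growth_S = np.array([98, 90, 55, 20, 6, 2], dtype=float)    # susceptible biotype
growth_R = np.array([99, 97, 92, 80, 50, 18], dtype=float)  # resistant biotype

(_, gr50_S, _), _ = curve_fit(log_logistic, doses, growth_S, p0=[100, 10, 1])
(_, gr50_R, _), _ = curve_fit(log_logistic, doses, growth_R, p0=[100, 1000, 1])
print(f"R/S = {gr50_R / gr50_S:.0f}")   # resistance index from GR50 estimates
```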
Herbicide resistance to several of the most common weed species in US rice production, such as barnyardgrass and red rice, has made weed control extremely difficult with available herbicide options. No very-long-chain fatty acid–inhibiting herbicides are labeled for use in US rice; however, pethoxamid is one such herbicide under development for soil-applied use to control grasses and small-seeded broadleaves in rice and various row crops. Field trials were conducted in 2015 and 2016 near Stuttgart, AR, for rice tolerance and in 2016 near Colt, AR, and Lonoke, AR, for weed control with pethoxamid-containing rice herbicide programs. Pethoxamid was applied alone and in a program at 420 and 560 g ai ha–1 with other herbicides labeled in rice, including clomazone, quinclorac, propanil, imazethapyr, and carfentrazone applied POST. Injury of less than 10% was observed for all treatments 2 wk after application in 2015 and 2016, except for pethoxamid at 420 g ha–1 plus clomazone applied to one-leaf rice. Rice injury dissipated to less than 5% for all treatments by 4 wk after flood establishment. Barnyardgrass was controlled 95% or more near Colt and 93% or more near Lonoke by herbicide programs including clomazone PRE followed by pethoxamid plus quinclorac or imazethapyr at the three- to four-leaf rice stage. Considering the minimal injury and high levels of barnyardgrass control associated with pethoxamid-containing weed control programs, pethoxamid provides a unique and effective site of action for use in US rice production.
Treatment for hoarding disorder is typically performed by mental health professionals, potentially limiting access to care in underserved areas.
Aims
We aimed to conduct a non-inferiority trial of group peer-facilitated therapy (G-PFT) and group psychologist-led cognitive–behavioural therapy (G-CBT).
Method
We randomised 323 adults with hoarding disorder to 15 weeks of G-PFT or 16 weeks of G-CBT and assessed them at baseline, post-treatment and longitudinally (≥3 months post-treatment: mean 14.4 months, range 3–25). Predictors of treatment response were examined.
Results
G-PFT (effect size 1.20) was as effective as G-CBT (effect size 1.21; between-group difference 1.82 points, t = −1.71, d.f. = 245, P = 0.04). Greater homework completion and ongoing help from family and friends predicted lower severity scores at longitudinal follow-up (t = 2.79, d.f. = 175, P = 0.006 and t = 2.89, d.f. = 175, P = 0.004, respectively).
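A sketch of how headline numbers like these are computed: a within-group effect size (standardized pre-to-post change) for each arm and a between-group t-test on post-treatment severity. All scores below are synthetic stand-ins for symptom-severity data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
pre = rng.normal(60, 10, 150)            # baseline severity (synthetic)
post_pft = rng.normal(48, 10, 150)       # peer-facilitated arm, post-treatment
post_cbt = rng.normal(47.5, 10, 150)     # psychologist-led arm, post-treatment

# Within-group effect size: mean change over the pooled SD of pre/post scores
d = (pre.mean() - post_pft.mean()) / np.sqrt(
    (pre.var(ddof=1) + post_pft.var(ddof=1)) / 2
)
# Between-group comparison of post-treatment severity
t, p = stats.ttest_ind(post_pft, post_cbt)
print(f"effect size d = {d:.2f}; between-group t = {t:.2f}, p = {p:.3f}")
```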
Conclusions
Peer-led groups were as effective as psychologist-led groups, providing a novel treatment avenue for individuals without access to mental health professionals.
Declaration of interest
C.A.M. has received grant funding from the National Institutes of Health (NIH) and travel reimbursement and speakers’ honoraria from the Tourette Association of America (TAA), as well as honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. K.D. receives research support from the NIH and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. R.S.M. receives research support from the National Institute of Mental Health, National Institute of Aging, the Hillblom Foundation, Janssen Pharmaceuticals (research grant) and the Alzheimer's Association. R.S.M. has also received travel support from the National Institute of Mental Health for Workshop participation. J.Y.T. receives research support from the NIH, Patient-Centered Outcomes Research Institute and the California Tobacco Related Research Program, and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. All other authors report no conflicts of interest.