This systematic literature review aimed to provide an overview of the characteristics and methods used in studies applying the disability-adjusted life years (DALY) concept to infectious diseases within European Union (EU)/European Economic Area (EEA)/European Free Trade Association (EFTA) countries and the United Kingdom. Electronic databases and grey literature were searched for articles reporting the assessment of DALYs and their components. We considered studies in which researchers performed DALY calculations using primary epidemiological data input sources. We screened 3053 studies, of which 2948 were excluded and 105 met our inclusion criteria. Of these studies, 22 were multi-country and 83 were single-country studies, of which 46 were from the Netherlands. Food- and water-borne diseases were the most frequently studied infectious diseases. The number of burden of infectious disease studies published between 2015 and 2022 was 1.6 times higher than the number published between 2000 and 2014. Almost all studies (97%) estimated DALYs using the incidence- and pathogen-based approach and without social weighting functions; however, there was less methodological consensus with regard to the disability weights and life tables that were applied. The number of burden of infectious disease studies undertaken across Europe has increased over time. Development and use of guidelines will promote the conduct of burden of infectious disease studies and facilitate comparability of their results.
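For orientation, the incidence- and pathogen-based approach mentioned above sums years lived with disability (YLD) and years of life lost (YLL) per incident case: DALY = YLD + YLL. The sketch below illustrates that arithmetic only; every input (case counts, disability weights, durations, deaths, and residual life expectancy) is hypothetical and is not drawn from any of the reviewed studies.

    # Minimal, illustrative DALY calculation (incidence- and pathogen-based approach,
    # no social weighting). All numbers below are hypothetical.

    # Health outcomes of one pathogen: (incident cases, disability weight, duration in years)
    outcomes = [
        (10_000, 0.05, 0.02),  # hypothetical mild illness
        (200, 0.20, 0.50),     # hypothetical severe complication
    ]
    deaths = 5                          # hypothetical number of deaths
    residual_life_expectancy = 35.0     # hypothetical remaining years at age of death

    yld = sum(cases * weight * duration for cases, weight, duration in outcomes)
    yll = deaths * residual_life_expectancy
    daly = yld + yll
    print(f"YLD = {yld:.1f}, YLL = {yll:.1f}, DALY = {daly:.1f}")

The choice of disability weights and of the life table supplying residual life expectancy is precisely where the review found the least methodological consensus.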
Inter- and intragenerational relationships are known to be important in maintaining the wellbeing of older people. A key aspect of these relationships is the exchange of both emotional and instrumental social support. However, relatively little is known about how this exchange of support changes in the context of widespread disruption. The COVID-19 pandemic provides an opportunity to examine how older people's family relationships are impacted by such social change. The present qualitative study explores how older people in the United Kingdom experienced changes in inter- and intragenerational support during the COVID-19 pandemic. Participants (N = 33) were recruited through a large-scale nationally representative survey (https://www.sheffield.ac.uk/psychology-consortium-covid19). We asked how life had been pre-pandemic, how they experienced the first national lockdown and what the future might hold in store. The data were analysed using constructivist grounded theory. This paper focuses on the importance of family relationships and how they changed as a consequence of the pandemic. We found that the family support system had been interrupted, that there were changes in the methods of support and that feelings of belonging were challenged. We argue that families were brought into disequilibrium through changes in the exchange of inter- and intragenerational support. The important role of grandchildren for older adults was striking and challenged by the pandemic. The significance of social connectedness and support within the family had not changed during the pandemic, but it could no longer be lived in the same way. The desire to be close to family members and to support them conflicted with the risk of pandemic infection. Our study found support for the COVID-19 Social Connectivity Paradox: the need for social connectedness whilst maintaining social distance. This challenged family equilibrium, wellbeing and quality of life in older people.
Background: Antibiotics are frequently prescribed, and overprescribed, at hospital discharge, leading to adverse events and patient harm. Our understanding of how to optimize prescribing at discharge is limited. Recently, we published the ROAD (Reducing Overuse of Antibiotics at Discharge) Home Framework, which identified potential strategies to improve antibiotic prescribing at discharge across 3 tiers: Tier 1, critical infrastructure; Tier 2, broad inpatient interventions; and Tier 3, discharge-specific strategies. Here, we used the ROAD Home Framework to assess the association of stewardship strategies with antibiotic overuse at discharge and to describe pathways toward improved discharge prescribing. Methods: In fall 2019, we surveyed 39 Michigan hospitals on their antibiotic stewardship strategies. For patients hospitalized at participating hospitals from July 1, 2017, through July 30, 2019, and treated for community-acquired pneumonia (CAP) or urinary tract infection (UTI), we assessed the association of reported strategies with days of antibiotic overuse at discharge. Days of antibiotic overuse at discharge were defined based on national guidelines and included unnecessary therapy, excess duration, and suboptimal fluoroquinolone use. We evaluated the association of stewardship strategies with days of discharge antibiotic overuse in 2 ways: (1) all stewardship strategies were assumed to have equal weight, and (2) strategies were weighted using the ROAD Home Framework, with tier 3 (discharge-specific) strategies receiving the highest weight. Results: Overall, 39 hospitals with 20,444 patients (56.5% CAP; 43.5% UTI) were included. The survey response rate was 100% (39 of 39). Hospitals reported a median of 12 (IQR, 9–14) of 33 possible stewardship strategies (Fig. 1). On bivariable analyses, review of antibiotics prior to discharge was the only strategy consistently associated with lower antibiotic overuse at discharge (aIRR, 0.543; 95% CI, 0.335–0.878). On multivariable analysis, weighting by ROAD Home tier predicted antibiotic overuse at discharge for both CAP and UTI. For both diseases combined, having more weighted strategies was associated with lower antibiotic overuse at discharge (aIRR per weighted intervention, 0.957; 95% CI, 0.927–0.987). Discharge-specific stewardship strategies were associated with a 12.4% relative decrease in antibiotic overuse days at discharge. Based on these findings, 3 pathways emerged to improve antibiotic use at discharge (Fig. 2): inpatient-focused strategies, “doing it all,” and discharge-focused strategies. Conclusions: The more stewardship strategies a hospital reported, the lower its antibiotic overuse at discharge. However, different pathways to improve discharge antibiotic use exist, and discharge stewardship strategies should therefore be tailored. Specifically, hospitals with limited stewardship resources and infrastructure should consider implementing a discharge-specific strategy straightaway. In contrast, hospitals that already have substantial inpatient infrastructure may benefit from proactively incorporating discharge into their existing strategies.
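To put the reported effect sizes in perspective, and assuming for illustration only that a strategy's weight equals its ROAD Home tier (the abstract does not state the exact weighting scheme), the adjusted IRR of 0.957 per weighted intervention compounds multiplicatively:

    # Illustrative arithmetic only: compound the reported aIRR per weighted intervention,
    # assuming (hypothetically) that a strategy's weight equals its ROAD Home tier.
    airr_per_weighted_intervention = 0.957
    for tier in (1, 2, 3):
        relative_rate = airr_per_weighted_intervention ** tier
        print(f"Tier {tier} strategy: ~{(1 - relative_rate) * 100:.1f}% relative decrease in overuse days")

Under that assumption, a single tier 3 (discharge-specific) strategy corresponds to roughly a 12.4% relative decrease, consistent with the figure reported above.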
Background: Nursing home (NH) residents and staff were at high risk for COVID-19 early in the pandemic; several studies estimated the seroprevalence of infection among NH staff to be 3-fold higher in CNAs and nurses than in other staff. Risk mitigation added in fall 2020 included systematic testing of residents and staff (and furlough if positive) to reduce transmission risk. We estimated risks for SARS-CoV-2 infection among NH staff during the first winter surge, before widespread vaccination. Methods: Between February and May 2021, voluntary serologic testing was performed on NH staff who were seronegative for SARS-CoV-2 in late fall 2020 (during a previous serology study at 14 Georgia NHs). An exposure assessment at the second time point covered the prior 3 months of job activities, community exposures, and self-reported COVID-19 vaccination, including very recent vaccination (≤4 weeks). Risk factors for seroconversion were estimated by job type using multivariable logistic regression, accounting for interval community incidence and interval change in resident infections per bed. Results: Among 203 eligible staff, 72 (35.5%) had evidence of interval seroconversion (Fig. 1). Among 80 unvaccinated staff, interval infection was significantly higher among CNAs and nurses (aOR, 4.9; 95% CI, 1.4–20.7) than among other staff, after adjusting for race, interval community incidence, and facility infections. This risk persisted but was attenuated in the full study cohort, which included those with very recent vaccination (aOR, 1.8; 95% CI, 0.9–3.7). Conclusions: Midway through the first year of the pandemic, NH staff with close or frequent resident contact remained at increased risk for infection despite enhanced infection prevention efforts. Mitigation strategies, prior to vaccination, did not eliminate occupational risk for infection. Vaccine utilization is critical to eliminating occupational risk among frontline healthcare providers.
This article investigates two current incarnations of ‘lo-fi’ music and questions the extent to which these subgenres are actually low in fidelity. In essence, mainstream ‘hi-fi’ productions use similar effects, such as filtering to sound like a radio or adding noise to sound like a vinyl record. To understand lo-fi today, this article explores music by a lo-fi hip-hop producer and a lo-fi ambient producer, drawing upon the analytical methods of Alan Moore and Dennis Smalley. The first is Glimlip, one of the many anonymous producers behind the popular Lofi Girl YouTube streams; the second is Amulets, an ambient musician known for using hacked and looped cassette tapes. Analyses of their music demonstrate a level of care in production that goes against the idea that lo-fi is primitive or naive.
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
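As a rough sketch of the analytic approach described in the Methods above (multivariable logistic regression of seroconversion on occupational factors), the example below uses entirely simulated data and hypothetical variable names; it is not the study's code, dataset, or covariate set.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 300

    # Simulated data only: three hypothetical occupational exposures and a ~9% seroconversion
    # rate with no true occupational effect built in (echoing the null finding above in spirit).
    df = pd.DataFrame({
        "bedside_gt_50pct": rng.integers(0, 2, n),  # >50% of a typical shift at the bedside
        "covid_unit":       rng.integers(0, 2, n),  # worked on a COVID-19 unit
        "agp_exposure":     rng.integers(0, 2, n),  # performed/present for AGPs
    })
    df["seroconverted"] = rng.binomial(1, 0.09, n)

    X = sm.add_constant(df[["bedside_gt_50pct", "covid_unit", "agp_exposure"]])
    fit = sm.Logit(df["seroconverted"], X).fit(disp=False)

    # Exponentiated coefficients give odds ratios; conf_int() gives the 95% CIs on the log scale.
    summary = pd.concat([np.exp(fit.params).rename("OR"), np.exp(fit.conf_int())], axis=1)
    summary.columns = ["OR", "2.5%", "97.5%"]
    print(summary)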
To estimate prior severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among skilled nursing facility (SNF) staff in the state of Georgia and to identify risk factors for seropositivity as of fall 2020.
Design:
Baseline survey and seroprevalence assessment of the ongoing longitudinal Coronavirus Disease 2019 (COVID-19) Prevention in Nursing Homes study.
Setting:
The study included 14 SNFs in the state of Georgia.
Participants:
In total, 792 SNF staff employed by or contracted with participating SNFs were included in this study. The analysis included 749 participants with SARS-CoV-2 serostatus results who provided age, sex, and complete survey information.
Methods:
We estimated unadjusted odds ratios (ORs) and 95% confidence intervals (95% CIs) for potential risk factors and SARS-CoV-2 serostatus. We estimated adjusted ORs using a logistic regression model including age, sex, community case rate, SNF resident infection rate, working at other facilities, and job role.
Results:
Staff working in high-infection SNFs were twice as likely (unadjusted OR, 2.08; 95% CI, 1.45–3.00) to be seropositive as those in low-infection SNFs. Certified nursing assistants and nurses were approximately 3 times more likely to be seropositive than administrative, pharmacy, or nonresident care staff (unadjusted ORs, 2.93 [95% CI, 1.58–5.78] and 3.08 [95% CI, 1.66–6.07], respectively). Logistic regression yielded similar adjusted ORs.
Conclusions:
Working at high-infection SNFs was a risk factor for SARS-CoV-2 seropositivity. Even after accounting for resident infections, certified nursing assistants and nurses had a 3-fold higher risk of SARS-CoV-2 seropositivity than nonclinical staff. This knowledge can guide prioritized implementation of safer ways for caregivers to provide necessary care to SNF residents.
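For readers less familiar with the unadjusted odds ratios quoted above, the calculation reduces to a 2×2 table of job role by serostatus. The counts below are invented purely to show the arithmetic and the usual log-scale confidence interval; they are not the study's data.

    from math import exp, log, sqrt

    # Hypothetical 2x2 table: rows = job role, columns = SARS-CoV-2 serostatus.
    #                                seropositive  seronegative
    exposed_pos, exposed_neg     = 40, 60      # e.g. certified nursing assistants
    unexposed_pos, unexposed_neg = 30, 135     # e.g. administrative/other staff

    or_unadj = (exposed_pos * unexposed_neg) / (exposed_neg * unexposed_pos)

    # 95% CI via the standard normal approximation on the log-OR scale.
    se_log_or = sqrt(1/exposed_pos + 1/exposed_neg + 1/unexposed_pos + 1/unexposed_neg)
    lo = exp(log(or_unadj) - 1.96 * se_log_or)
    hi = exp(log(or_unadj) + 1.96 * se_log_or)

    print(f"Unadjusted OR = {or_unadj:.2f} (95% CI, {lo:.2f}-{hi:.2f})")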
This chapter synthesises insights from the Deep Decarbonisation Pathways Project (DDPP), which provided detailed analysis of how 16 countries representing three-quarters of global emissions can transition to very low-carbon economies. The four ‘pillars’ of decarbonisation are identified as: achieving low- or zero-carbon electricity supply; electrification and fuel switching in transport, industry and housing; ambitious energy efficiency improvements; and reducing non-energy emissions. The chapter focuses on decarbonisation scenarios for Australia. It shows that electricity supply can be readily decarbonised and greatly expanded to cater for electrification of transport, industry and buildings. Remaining emissions, principally from industry and agriculture, could be fully compensated through land-based carbon sequestration. The analysis shows that such decarbonisation would be consistent with continued growth in GDP and trade, and would require very little change in the structure of Australia’s economy. Australia is rich in renewable energy potential, which could enable new industries such as energy-intensive manufacturing for export.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
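As a minimal illustration of how a polygenic score (PGS) is constructed and related to AAO, the sketch below uses simulated genotypes, arbitrary per-variant weights, and a built-in effect chosen only to echo the reported −0.34 years per s.d.; it is not the study's pipeline, which would use GWAS summary statistics and dedicated scoring methods.

    import numpy as np

    rng = np.random.default_rng(1)
    n_people, n_snps = 500, 1000

    # Simulated allele dosages (0/1/2) and hypothetical per-variant GWAS effect weights.
    dosages = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)
    weights = rng.normal(0, 0.01, size=n_snps)

    # PGS = weighted sum of dosages, then standardized to mean 0, s.d. 1.
    pgs = dosages @ weights
    pgs = (pgs - pgs.mean()) / pgs.std()

    # Simulated age at onset with an arbitrary earlier onset per s.d. of PGS.
    aao = 25 - 0.34 * pgs + rng.normal(0, 8, size=n_people)

    # Per-s.d. association of the PGS with AAO (simple least-squares slope, in years).
    beta = np.polyfit(pgs, aao, 1)[0]
    print(f"Estimated beta: {beta:.2f} years per s.d. of PGS")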
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
The emotion of pride appears to be a neurocognitive guidance system to capitalize on opportunities to become more highly valued and respected by others. Whereas the inputs and the outputs of pride are relatively well understood, little is known about how the pride system matches inputs to outputs. How does pride work? Here we evaluate the hypothesis that pride magnitude matches the various outputs it controls to the present activating conditions – the precise degree to which others would value the focal individual if the individual achieved a particular achievement. Operating in this manner would allow the pride system to balance the competing demands of effectiveness and economy, to avoid the dual costs of under-deploying and over-deploying its outputs. To test this hypothesis, we measured people's responses regarding each of 25 socially valued traits. We observed the predicted magnitude matchings. The intensities of the pride feeling and of various motivations of pride (communicating the achievement, demanding better treatment, investing in the valued trait and pursuing new challenges) vary in proportion: (a) to one another; and (b) to the degree to which audiences value each achievement. These patterns of magnitude matching were observed both within and between the USA and India. These findings suggest that pride works cost-effectively, promoting the pursuit of achievements and facilitating the gains from others’ valuations that make those achievements worth pursuing.
We explore marine reservoir effects (MREs) in seal bones from the northern Bering and Chukchi Seas regions. Ringed and bearded seals have served as dietary staples in human populations along the coasts of Arctic northeast Asia and North America for several millennia. Radiocarbon (14C) dates on seal bones and terrestrial materials (caribou, plant seeds, wood, and wood charcoal) were compared from archaeological sites in the Bering Strait region of northwestern Alaska to assess MREs in these sea mammals over time. We also compared these results to 14C dates on modern seal specimens collected in AD 1932 and 1946 from the Bering Sea region. Our paired archaeological samples were recovered from late Holocene archaeological features, including floors from dwellings and cache pits, that date between 1600 and 130 cal BP. 14C dates on seal bones from the northern Bering and Chukchi Seas show differences [R(t)] of 800 ± 140 years from their terrestrial counterparts, and deviations of 404 ± 112 years (ΔR) from the marine calibration curve.
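A minimal sketch of the arithmetic behind these quantities, with invented numbers rather than the paper's measurements: R(t) is the difference between a marine sample's 14C age and that of its contemporaneous terrestrial counterpart, while ΔR is the sample's deviation from the marine calibration curve at the same calendar date.

    import numpy as np

    # Hypothetical paired measurements from one archaeological feature (14C yr BP).
    seal_bone_age, seal_bone_err       = 2250, 30   # marine sample
    caribou_bone_age, caribou_bone_err = 1460, 25   # terrestrial counterpart
    marine_curve_age, marine_curve_err = 1850, 40   # marine calibration curve at the same calendar date

    # R(t): marine vs. contemporaneous terrestrial 14C age, with errors added in quadrature.
    r_t = seal_bone_age - caribou_bone_age
    r_t_err = np.hypot(seal_bone_err, caribou_bone_err)

    # Delta-R: marine sample vs. the marine calibration curve.
    delta_r = seal_bone_age - marine_curve_age
    delta_r_err = np.hypot(seal_bone_err, marine_curve_err)

    print(f"R(t) = {r_t} +/- {r_t_err:.0f} 14C yr")
    print(f"Delta-R = {delta_r} +/- {delta_r_err:.0f} 14C yr")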
Goosegrass control options in bermudagrass are limited. Topramezone is one option that offers excellent control of mature goosegrass, but application to bermudagrass results in unacceptable symptoms of bleaching and necrosis typical of hydroxyphenylpyruvate dioxygenase inhibitors. Previous research has shown that adding chelated iron reduced the phytotoxicity of topramezone without reducing the efficacy of the herbicide, resulting in safening when applied to bermudagrass. Our objective was to examine additional iron sources to determine whether similar safening effects occur with other sources. Field trials were conducted in the summers of 2016 to 2018 (Auburn University). Mixtures of topramezone and methylated seed oil were combined with six different commercial iron sources, including sodium ferric ethylenediamine di-o-hydroxyphenyl-acetate (FeEDDHA), ferrous diethylenetriamine pentaacetic acid (FeDTPA), iron citrate, FeSO4, and a combination of iron oxide/sucrate/sulfate, some of which contained nitrogen. Bermudagrass necrosis and bleaching symptoms were visually rated on a 0% to 100% scale. Reflectance (normalized difference vegetation index) and clipping yield measurements were also collected. Application of FeDTPA and FeSO4 reduced symptoms of bleaching and necrosis when applied with topramezone. Other treatments that contained nitrogen did not reduce injury but did reduce bermudagrass recovery time following the appearance of necrosis. Inclusion of small amounts of nitrogen often negated the safening effects of FeSO4. The iron oxide/sucrate/sulfate product had no effect on bleaching or necrosis. Data suggest that the iron source had a differential effect on bleaching and necrosis reduction when applied in combination with topramezone to bermudagrass. Overall, FeSO4 and FeDTPA safened topramezone the most on bermudagrass.
Predictors of new-onset bipolar disorder (BD) or psychotic disorder (PD) have been proposed on the basis of retrospective or prospective studies of ‘at-risk’ cohorts. Few studies have compared concurrently or longitudinally factors associated with the onset of BD or PDs in youth presenting to early intervention services. We aimed to identify clinical predictors of the onset of full-threshold (FT) BD or PD in this population.
Method
Multi-state Markov modelling was used to assess the relationships between baseline characteristics and the likelihood of the onset of FT BD or PD in youth (aged 12–30) presenting to mental health services.
Results
Of 2330 individuals assessed longitudinally, 4.3% (n = 100) met criteria for new-onset FT BD and 2.2% (n = 51) met criteria for a new-onset FT PD. The emergence of FT BD was associated with older age, lower social and occupational functioning, mania-like experiences (MLE), suicide attempts, reduced incidence of physical illness, childhood-onset depression, and childhood-onset anxiety. The emergence of a PD was associated with older age, male sex, psychosis-like experiences (PLE), suicide attempts, stimulant use, and childhood-onset depression.
Conclusions
Identifying risk factors for the onset of either BD or PDs in young people presenting to early intervention services is assisted not only by the increased focus on MLE and PLE, but also by recognising the predictive significance of poorer social function, childhood-onset anxiety and mood disorders, and suicide attempts prior to the time of entry to services. Secondary prevention may be enhanced by greater attention to those risk factors that are modifiable or shared by both illness trajectories.
To use a meta-analysis to identify risk factors for asymptomatic Clostridioides difficile colonization among hospitalized adults, which may enable early identification of colonized patients at risk of spreading C. difficile.
Design:
Meta-analysis and systematic review.
Methods:
We systematically searched MEDLINE, Scopus, Web of Science, and EMBASE from January 1, 1975, to February 15, 2020, for articles related to C. difficile colonization among hospitalized adults. Studies with multivariable analyses evaluating risk factors for asymptomatic colonization were eligible.
Results:
Among 5,506 studies identified in the search, 19 studies met the inclusion criteria. Included studies reported 20,334 adult patients, of whom 1,588 were asymptomatically colonized with C. difficile. Factors associated with an increased risk of colonization were hospitalization in the previous 6 months (OR, 2.18; 95% CI, 1.86–2.56; P < .001), use of gastric acid suppression therapy within the previous 8 weeks (OR, 1.42; 95% CI, 1.17–1.73; P < .001), tube feeding (OR, 2.02; 95% CI, 1.06–3.85; P = .03), and corticosteroid use in the previous 8 weeks (OR, 1.58; 95% CI, 1.14–2.17; P = .006). Receipt of antibiotics in the previous 3 months (OR, 1.37; 95% CI, 0.94–2.01; P = .10) was not significantly associated with risk of colonization.
Conclusions:
C. difficile colonization was significantly associated with previous hospitalization, gastric acid suppression, tube feeding, and corticosteroid use. Recognition of these risk factors may assist in identifying asymptomatic carriers of C. difficile and taking appropriate measures to reduce transmission.
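To illustrate the kind of pooling that produces summary odds ratios like those above (the study-level estimates below are invented, not the review's data, and the review's actual software and model are not specified here), a DerSimonian-Laird random-effects combination on the log-OR scale looks like this:

    import numpy as np

    # Hypothetical per-study odds ratios and 95% CIs for one risk factor.
    ors = np.array([2.4, 1.8, 2.6, 1.9])
    ci_low = np.array([1.5, 1.1, 1.6, 1.2])
    ci_high = np.array([3.8, 2.9, 4.2, 3.0])

    # Work on the log-OR scale; recover standard errors from the CI width.
    log_or = np.log(ors)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)

    # DerSimonian-Laird between-study variance (tau^2), then random-effects weights.
    w_fixed = 1 / se**2
    q = np.sum(w_fixed * (log_or - np.sum(w_fixed * log_or) / np.sum(w_fixed))**2)
    tau2 = max(0.0, (q - (len(ors) - 1)) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))
    w = 1 / (se**2 + tau2)

    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))
    print(f"Pooled OR = {np.exp(pooled):.2f} "
          f"(95% CI, {np.exp(pooled - 1.96*pooled_se):.2f}-{np.exp(pooled + 1.96*pooled_se):.2f})")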
POST goosegrass and other grassy weed control in bermudagrass is problematic. Fewer herbicides that can control goosegrass are available due to regulatory pressure and herbicide resistance. Alternative herbicide options that offer effective control are needed. Previous research demonstrates that topramezone controls goosegrass, crabgrass, and other weed species; however, injury to bermudagrass may be unacceptable. The objective of this research was to evaluate the safening potential of topramezone combinations with different additives on bermudagrass. Field trials were conducted at Auburn University during summer and fall from 2015 to 2018 and 2017 to 2018, respectively. Treatments included topramezone and methylated seed oil applied in combination with five different additives: triclopyr, green turf pigment, green turf paint, ammonium sulfate, and chelated iron. Bermudagrass bleaching and necrosis symptoms were visually rated. Normalized difference vegetation index measurements and clipping yield data were also collected. Topramezone plus chelated iron and topramezone plus triclopyr reduced bleaching the most; however, the combination of topramezone plus triclopyr resulted in necrosis that outweighed the reductions in bleaching. Masking agents such as green turf paint and green turf pigment were ineffective in reducing injury when applied with topramezone. The combination of topramezone plus ammonium sulfate should be avoided because of the high level of necrosis. Topramezone-associated bleaching symptoms were transient, lasting 7 to 14 d on average. Findings from this research suggest that chelated iron added to topramezone and methylated seed oil mixtures acted as a safener on bermudagrass.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
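To give a concrete sense of what calibration involves (a deliberately simplified sketch: the curve segment below is invented and roughly linear, whereas real work would use the published IntCal20, SHCal20, or Marine20 data files or established calibration software), a measured 14C age is converted to a calendar-age probability distribution by comparing it with the curve at every calendar year:

    import numpy as np

    # Hypothetical calibration-curve segment: calendar age (cal BP), curve 14C age, curve 1-sigma error.
    cal_bp = np.arange(2800, 3201)                  # 2800-3200 cal BP, 1-year steps
    curve_c14 = 2600 + 0.9 * (cal_bp - 2800)        # invented, roughly linear segment
    curve_err = np.full_like(cal_bp, 15, dtype=float)

    # A hypothetical measured 14C age and its laboratory error.
    sample_c14, sample_err = 2850, 30

    # Probability of each calendar year given the measurement (normal likelihood, flat prior).
    sigma = np.sqrt(sample_err**2 + curve_err**2)
    like = np.exp(-0.5 * ((sample_c14 - curve_c14) / sigma) ** 2) / sigma
    post = like / like.sum()

    # Report an approximate 95% highest-density range from the sorted posterior.
    order = np.argsort(post)[::-1]
    in_hdi = np.zeros_like(post, dtype=bool)
    in_hdi[order[np.cumsum(post[order]) <= 0.95]] = True
    print(f"~95% calendar-age range: {cal_bp[in_hdi].min()}-{cal_bp[in_hdi].max()} cal BP")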
The purpose of this study was to examine the effectiveness, satisfaction, and acceptance of a low-cost Lombard-response (LR) device in a group of individuals with Parkinson’s disease (IWPD) and their communication partners (CPs).
Method:
Sixteen IWPD with hypophonia and their CPs participated in the study. The IWPD wore an LR device that included a small MP3 player (Sony Walkman) and headphones playing a multi-talker noise audio file at 80 dB during lab-based speech tasks and during their daily conversational speech over a 2-week device trial period. Outcome measures included average conversational speech intensity and scores on a questionnaire related to speech impairment, communication effectiveness, and device satisfaction.
Results:
Conversational speech intensity of the IWPD increased by 7 to 10 dB with the LR device. Following the 2-week trial period, eight of the IWPD (50%) gave the LR device moderate-to-high satisfaction and effectiveness ratings and decided to purchase the device for long-term daily use. At the 4-month follow-up, none of the IWPD were still using the LR device. Device rejection was related to discomfort (loudness), headaches, interference with cognition, and difficulty controlling the device.
Conclusion:
Short-term acceptance of and satisfaction with the LR device were moderate, but long-term acceptance, beyond 4 months, was absent. Future studies are required to determine whether other types of low-cost LR devices can be developed that improve long-term efficacy and device acceptance in IWPD with hypophonia.
Identifying early risk factors for the development of social anxiety symptoms has important translational implications. Accurately identifying which children are at the highest risk is of critical importance, especially if we can identify risk early in development. We examined continued risk for social anxiety symptoms at the transition to adolescence in a community sample of children (n = 112) that had been observed for high fearfulness at age 2 and tracked for social anxiety symptoms from preschool through age 6. In our previous studies, we found that a pattern of dysregulated fear (DF), characterized by high fear in low-threat contexts, predicted social anxiety symptoms at ages 3, 4, 5, and 6 years across two samples. In the current study, we re-evaluated these children at 11–13 years of age using parent and child reports of social anxiety symptoms, parental monitoring, and peer relationship quality. DF scores uniquely predicted adolescents’ social anxiety symptoms beyond the prediction made by more proximal behavioral measures (e.g., kindergarten social withdrawal) and concurrent environmental risk factors (e.g., parental monitoring, peer relationships). Implications for early detection, prevention, and intervention are discussed.