We present the analysis of global sympagic primary production (PP) from 300 years of pre-industrial and historical simulations of the E3SMv1.1-BGC model. The model includes a novel, eight-element sea ice biogeochemical component, MPAS-Seaice zbgc, which is resolved in three spatial dimensions and uses a vertical transport scheme based on internal brine dynamics. Modeled ice algal chlorophyll-a concentrations and column-integrated values are broadly consistent with observations, though chl-a profile fractions indicate that upper ice communities of the Southern Ocean are underestimated. Simulations of polar integrated sea ice PP support the lower bound in published estimates for both polar regions, with mean values of 7.5 TgC/a in the Arctic and 15.5 TgC/a in the Southern Ocean. However, comparisons of the polar climate state with observations, using a maximal bound for ice algal growth rates, suggest that the Arctic lower bound is a significant underestimation driven by biases in ocean surface nitrate, and that correction of these biases supports as much as 60.7 TgC/a of net Arctic PP. Simulated Southern Ocean sympagic PP is predominantly light-limited, and regional patterns, particularly in the coastal high production band, are found to be negatively correlated with snow thickness.
Early life adversity (ELA) has been associated with inflammation and immunosenescence, as well as hyporeactivity of the HPA axis. Because the immune system and the HPA axis are tightly intertwined around the glucocorticoid receptor (GR), we examined peripheral GR functionality in the EpiPath cohort among participants who either had been exposed to ELA (separation from parents and/or institutionalization followed by adoption; n = 40) or had been reared by their biological parents (n = 72).
Expression of the strict GR target genes FKBP5 and GILZ as well as total and 1F and 1H GR transcripts were similar between groups. Furthermore, there were no differences in GR sensitivity, examined by the effects of dexamethasone on IL6 production in LPS-stimulated whole blood. Although we did not find differences in methylation at the GR 1F exon or promoter region, we identified a region of the GR 1H promoter (CpG 1-9) that showed lower methylation levels in ELA.
Our results suggest that peripheral GR signaling was unperturbed in our cohort and the observed immune phenotype does not appear to be secondary to an altered GR response to the perturbed HPA axis and glucocorticoid (GC) profile, although we are limited in our measures of GR activity and time points.
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
On October 7, 2016, Hurricane Matthew traveled along the coasts of Florida, Georgia, and South Carolina causing flooding and power outages. The Georgia Department of Public Health (DPH) developed the Web-based Responder Safety, Tracking, and Resilience (R-STaR) system to monitor the health and safety of public health responders and to inform disaster response planning for Hurricane Matthew. Using R-STaR, responders (n = 126) were e-mailed a daily survey while deployed to document injuries or harmful exposures, and a post-deployment survey on their health and satisfaction with using R-STaR. DPH epidemiologists contacted responders reporting injuries or exposures to determine the need for medical care. Frequencies were tabulated for quantitative survey responses, and qualitative data were summarized into key themes. Five percent (6/126) of responders reported injuries, and 81% (43/53) found R-STaR easy to use. Suggestions for R-STaR improvement included improving accessibility using mobile platforms and conducting pre-event training of responders on R-STaR. Lessons learned from R-STaR development and evaluation can inform the development and improvement of responder health surveillance systems at other local and state health departments and disaster and emergency response agencies. (Disaster Med Public Health Preparedness. 2019;13:74–81).
Non-suicidal self-injury (NSSI) prospectively predicts suicidal thoughts and behaviors in civilian populations. Despite high rates of suicide among US military members, little is known about the prevalence and course of NSSI, or how NSSI relates to suicidal thoughts and behaviors, in military personnel.
We conducted secondary analyses of two representative surveys of active-duty soldiers (N = 21 449) and newly enlisted soldiers (N = 38 507) from the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS).
The lifetime prevalence of NSSI is 6.3% (1.2% 12-month prevalence) in active-duty soldiers and 7.9% (1.3% 12-month prevalence) in new soldiers. Demographic risk factors for lifetime NSSI include female sex, younger age, non-Hispanic white ethnicity, never having married, and lower educational attainment. The association of NSSI with temporally primary internalizing and externalizing disorders varies by service history (new v. active-duty soldiers) and gender (men v. women). In both active-duty and new soldiers, NSSI is associated with increased odds of subsequent onset of suicidal ideation [adjusted odds ratio (OR) = 1.66–1.81] and suicide attempts (adjusted OR = 2.02–2.43), although not with the transition from ideation to attempt (adjusted OR = 0.92–1.36). Soldiers with a history of NSSI are more likely to have made multiple suicide attempts, compared with soldiers without NSSI.
NSSI is prevalent among US Army soldiers and is associated with significantly increased odds of later suicidal thoughts and behaviors, even after NSSI has resolved. Suicide risk assessments in military populations should screen for history of NSSI.
Introduction: Deep vein thrombosis (DVT) can lead to significant morbidity and mortality if not diagnosed and treated promptly. Currently, few methods aside from venous duplex scanning can rule out DVT in patients presenting to the Emergency Department (ED). Current screening tools, including the use of the subjective Wells score, frequently lead to unnecessary investigations and anticoagulation. In this study, we sought to determine whether two-site compression point-of-care ultrasound (POCUS) combined with a negative age-adjusted D-dimer test can accurately rule out DVT in ED patients irrespective of the modified Wells score. Methods: This is a single-center, prospective observational study in the ED of the Jewish General Hospital in Montreal. We are recruiting a convenience sample of patients presenting to the ED with symptoms suggestive of DVT. All enrolled patients are risk-stratified using the modified Wells criteria for DVT, then undergo two-site compression POCUS, and testing for age-adjusted D-dimer. Patients with DVT unlikely according to the modified Wells score, a negative POCUS and a negative age-adjusted D-dimer are discharged home and receive a three-month phone follow-up. Patients with DVT likely according to the modified Wells score, a positive POCUS or a positive age-adjusted D-dimer undergo a venous duplex scan. A true negative DVT is defined as either a negative venous duplex scan or a negative follow-up phone questionnaire for patients who were sent home without a venous duplex scan. Results: Of the 42 patients recruited thus far, the mean age is 56 years and 42.8% are male. Twelve (28.6%) patients had DVT unlikely as per the modified Wells score, a negative POCUS and a negative age-adjusted D-dimer and were discharged home. None of these patients developed a DVT on three-month follow-up. Thirty patients (71.4%) had either a DVT likely as per the modified Wells score, a positive POCUS or a positive age-adjusted D-dimer and underwent a venous duplex scan.
Of those, six patients had a confirmed DVT (three proximal and three distal). POCUS detected all proximal DVTs, while combined POCUS and age-adjusted D-dimer detected all proximal and distal DVTs. None of the patients with a negative POCUS and a negative age-adjusted D-dimer were found to have a DVT. Conclusion: Two-site compression POCUS combined with a negative age-adjusted D-dimer test appears to accurately rule out DVT in ED patients without the need for a follow-up venous duplex scan. Using this approach would obviate the need to calculate the Wells score and reduce the need for radiology-performed venous duplex scans for many patients.
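The discharge pathway described above can be sketched as a simple decision function. This is a hypothetical illustration only; the age-adjusted D-dimer cutoff used here (age × 10 µg/L FEU for patients over 50) is a commonly used convention that the abstract does not itself specify.

```python
def dvt_ruled_out(wells_likely: bool, pocus_positive: bool,
                  d_dimer_ug_per_l: float, age: int) -> bool:
    """Sketch of the study's rule-out pathway: a patient is discharged
    without a duplex scan only if all three screens are negative."""
    # Assumed convention (not stated in the abstract): age-adjusted
    # D-dimer cutoff is age * 10 ug/L FEU above age 50, else 500 ug/L.
    cutoff = age * 10 if age > 50 else 500
    d_dimer_negative = d_dimer_ug_per_l < cutoff
    return (not wells_likely) and (not pocus_positive) and d_dimer_negative

# A 70-year-old with a D-dimer of 600 ug/L falls below the adjusted
# cutoff of 700, so all three negative screens permit discharge.
print(dvt_ruled_out(wells_likely=False, pocus_positive=False,
                    d_dimer_ug_per_l=600, age=70))  # True
```

Any single positive screen (likely Wells score, positive POCUS, or positive age-adjusted D-dimer) routes the patient to a formal venous duplex scan instead.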
Schizophrenia (SZ) is a severe neuropsychiatric disorder associated with disrupted connectivity within the thalamic-cortico-cerebellar network. Resting-state functional connectivity studies have reported thalamic hypoconnectivity with the cerebellum and prefrontal cortex as well as thalamic hyperconnectivity with sensory cortical regions in SZ patients compared with healthy comparison participants (HCs). However, fundamental questions remain regarding the clinical significance of these connectivity abnormalities.
Resting state seed-based functional connectivity was used to investigate thalamus to whole brain connectivity using multi-site data including 183 SZ patients and 178 matched HCs. Statistical significance was based on a voxel-level FWE-corrected height threshold of p < 0.001. The relationships between positive and negative symptoms of SZ and regions of the brain demonstrating group differences in thalamic connectivity were examined.
HC and SZ participants both demonstrated widespread positive connectivity between the thalamus and cortical regions. Compared with HCs, SZ patients had reduced thalamic connectivity with bilateral cerebellum and anterior cingulate cortex. In contrast, SZ patients had greater thalamic connectivity with multiple sensory-motor regions, including bilateral pre- and post-central gyrus, middle/inferior occipital gyrus, and middle/superior temporal gyrus. Thalamus to middle temporal gyrus connectivity was positively correlated with hallucinations and delusions, while thalamus to cerebellar connectivity was negatively correlated with delusions and bizarre behavior.
Thalamic hyperconnectivity with sensory regions and hypoconnectivity with cerebellar regions in combination with their relationship to clinical features of SZ suggest that thalamic dysconnectivity may be a core neurobiological feature of SZ that underpins positive symptoms.
Our understanding of the complex relationship between schizophrenia symptomatology and etiological factors can be improved by studying brain-based correlates of schizophrenia. Research showed that impairments in value processing and executive functioning, which have been associated with prefrontal brain areas [particularly the medial orbitofrontal cortex (MOFC)], are linked to negative symptoms. Here we tested the hypothesis that MOFC thickness is associated with negative symptom severity.
This study included 1985 individuals with schizophrenia from 17 research groups around the world contributing to the ENIGMA Schizophrenia Working Group. Cortical thickness values were obtained from T1-weighted structural brain scans using FreeSurfer. A meta-analysis across sites was conducted over effect sizes from a model predicting cortical thickness by negative symptom score (harmonized Scale for the Assessment of Negative Symptoms or Positive and Negative Syndrome Scale scores).
Meta-analytical results showed that left, but not right, MOFC thickness was significantly associated with negative symptom severity (βstd = −0.075; p = 0.019) after accounting for age, gender, and site. This effect remained significant (p = 0.036) in a model including overall illness severity. Covarying for duration of illness, age of onset, antipsychotic medication or handedness weakened the association of negative symptoms with left MOFC thickness. In a secondary analysis including 10 other prefrontal regions, further associations emerged in the left lateral orbitofrontal gyrus and pars opercularis.
Using an unusually large cohort and a meta-analytical approach, our findings point towards a link between prefrontal thinning and negative symptom severity in schizophrenia. This finding provides further insight into the relationship between structural brain abnormalities and negative symptoms in schizophrenia.
Introduction: Undiagnosed deep vein thrombosis (DVT) can lead to significant morbidity and mortality, including death from DVT-associated massive pulmonary embolism (PE). While several validated clinical prediction rules, blood tests and imaging modalities exist to investigate a potential DVT, there is currently a lack of rapid, accessible and reliable methods to exclude the possibility of DVT without resorting to formal venous duplex scanning. Currently, the use in the ED of a validated clinical prediction rule combined with either a high-sensitivity D-dimer test or ultrasonography of the lower extremities has a poor predictive value, as 75-90% of patients suspected of DVT have a negative formal venous duplex scan. Compression bedside ultrasound has, however, recently been shown to be a safe, rapid and accurate method for the diagnosis of proximal DVT in the emergency department, with high sensitivity and specificity (combined sensitivity and specificity of 96.1% and 96.8%, respectively [1]). Research question: In the present study, we will primarily assess whether two-site compression POCUS combined with a negative age-adjusted D-dimer test can accurately rule out DVT in ED patients regardless of the Wells criteria. Methods: This is a single-center, prospective, observational study carried out over one year in the Emergency Department of the Jewish General Hospital in Montreal, Quebec. We aim to enroll a convenience sample of 475 patients aged 18 years and older presenting to the ED with symptoms suggestive of a DVT. All enrolled patients will receive the standard of care required for a lower leg DVT presentation. After calculating patients' DVT risk using the modified Wells criteria, all patients will undergo POCUS for DVT followed by a D-dimer test. Based on their results, patients will either undergo formal duplex scanning, or will be discharged without further testing and receive a three-month phone follow-up.
A true negative lower leg DVT will be defined as follows: (1) a negative follow-up phone questionnaire for patients who were sent home with no formal duplex venous scanning; (2) a negative formal duplex venous scan for patients who were deemed likely to have lower leg DVT using the Wells score but had a negative D-dimer and POCUS. The age-adjusted D-dimer was added to account for below-knee DVT and to avoid the need for patients to return for a follow-up duplex study in one week. To estimate our technique's sensitivity with a 4% margin of error and 95% confidence intervals, 92 confirmed DVT patients are needed. We expect to recruit a total of 475 patients within a one-year period at the JGH (95 DVT-positive patients and 380 DVT-negative patients). Impact: The use of compression bedside ultrasound with a negative age-adjusted D-dimer test to rule out DVT in the ED may accelerate decisions regarding patient disposition and significantly decrease the length of patient stay in the ED. In addition, it may help avoid unnecessary medical interventions and diagnostic tests, thus representing potential quality-of-care and cost-saving improvements as well.
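The quoted requirement of 92 confirmed DVT patients is consistent with the standard normal-approximation sample-size formula for estimating a proportion (here, sensitivity) to within a given margin of error. A quick check, assuming an expected sensitivity of about 96% as cited earlier in the abstract:

```python
import math

def sample_size_for_proportion(expected_p: float, margin: float,
                               z: float = 1.96) -> float:
    """n = z^2 * p * (1 - p) / d^2 -- patients needed to estimate a
    proportion p to within +/- d at ~95% confidence (z = 1.96)."""
    return z**2 * expected_p * (1 - expected_p) / margin**2

# Expected sensitivity ~0.96 (the abstract cites 96.1%), margin 0.04:
n = sample_size_for_proportion(0.96, 0.04)
print(round(n))  # -> 92 confirmed-DVT patients

# At the anticipated ~20% DVT prevalence (95 of 475), total recruitment:
print(math.ceil(round(n) / 0.20))  # -> 460, within the planned 475
```

The planned 475-patient sample therefore comfortably covers the 92 confirmed cases needed, given the expected prevalence.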
Introduction: The Institute of Medicine (IOM) has recommended that high-quality, evidence-based guidelines be developed for emergency medical services (EMS). The National Association of EMS Physicians (NAEMSP) has outlined a strategy that will see this task fulfilled, consisting of multiple working groups focused on all aspects of guideline development and implementation. A first step, and our objective, was a cataloguing and appraisal of the current guidelines targeting EMS providers. Methods: A systematic search of the literature was conducted in MEDLINE (1175), EMBASE (519), PubMed (14), Trip (416), and guidelines.gov (64) through May 1, 2016. Two independent reviewers screened titles for relevance to prehospital care, and then abstracts for essential guideline features, including a systematic review, a grading system, and an association between level of evidence and strength of recommendation. All disagreements were moderated by a third party. Citations meeting inclusion criteria were appraised with the AGREE II tool, which looks at six different domains of guideline quality, containing a total of 23 items rated from 1 to 7. Each guideline was appraised by three separate reviewers, and composite scores were calculated by averaging the scaled domain totals. Results: After primary (kappa 97%) and secondary (kappa 93%) screening, 49 guidelines were retained for full review. Only three guidelines obtained a score of >90%, the topics of which included aeromedical transport, analgesia in trauma, and resuscitation of avalanche victims. Only two guidelines scored between 80% and 90%, the topics of which included stroke and pediatric seizure management. One guideline, splinting in an austere environment, scored between 70% and 80%. Nine guidelines scored between 60% and 70%, the topics of which included ischemic stroke, cardiovascular life support, hemorrhage control, intubation, triage, hypothermia, and fibrinolytic use. 
Of the remaining guidelines, 14 scored between 50% and 60%, and 20 obtained a score of <50%. Conclusion: There are few high-quality, evidence-based guidelines in EMS. Of those that are published, the majority fail to meet established quality measures. Although a lack of randomized controlled trials (RCTs) conducted in the prehospital field continues to limit guideline development, suboptimal methodology is also commonplace within the existing literature.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder telescope. BETA is the first aperture synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations that have been made to the traditional calibration and imaging procedures in order to allow BETA to function as a multi-beam aperture synthesis telescope. We describe the commissioning of the instrument and present details of BETA’s performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science that it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final Australian Square Kilometre Array Pathfinder telescope.
To limit tail biting incidence, most pig producers in Europe tail dock their piglets. This is despite EU Council Directive 2008/120/EC banning routine tail docking and allowing it only as a last resort. The paper aims to understand what it takes to fulfil the intentions of the Directive by examining economic results of four management and housing scenarios, and by discussing their consequences for animal welfare in the light of legal and ethical considerations. The four scenarios compared are: ‘Standard Docked’, a conventional housing scenario with tail docking meeting the recommendations for Danish production (0.7 m2/pig); ‘Standard Undocked’, which is the same as ‘Standard Docked’ but with no tail docking; and ‘Efficient Undocked’ and ‘Enhanced Undocked’, which have increased solid floor area (0.9 and 1.0 m2/pig, respectively), provision of loose manipulable materials (100 and 200 g of straw per pig per day) and no tail docking. A decision tree model based on data from Danish and Finnish pig production suggests that Standard Docked provides the highest economic gross margin with the least tail biting. Given our assumptions, Enhanced Undocked is the least economic, although Efficient Undocked is better economically, and both result in a lower incidence of tail biting than Standard Undocked but higher than Standard Docked. For a pig, being bitten is worse for welfare (repeated pain, risk of infections) than being docked, but to compare welfare consequences at a farm level means considering the number of affected pigs. Because of the high levels of biting in Standard Undocked, it has on average inferior welfare to Standard Docked, whereas the comparison of Standard Docked and Enhanced (or Efficient) Undocked is more difficult. In Enhanced (or Efficient) Undocked, more pigs than in Standard Docked suffer from being tail bitten, whereas all the pigs avoid the acute pain of docking endured by the pigs in Standard Docked.
We illustrate and discuss this ethical balance using numbers derived from the above-mentioned data. We discuss our results in the light of the EU Directive and its adoption and enforcement by Member States. Widespread use of tail docking seems to be accepted, mainly because the alternative steps that producers are required to take before resorting to it are not specified in detail. By tail docking, producers are acting in their own best interests. We suggest that for the practice of tail docking to be terminated in a way that benefits animal welfare, changes in the way pigs are housed and managed may first be required.
It is well known that web-based interventions can be effective treatments for depression. However, dropout rates in web-based interventions are typically high, especially in self-guided web-based interventions. Rigorous empirical evidence regarding factors influencing dropout in self-guided web-based interventions is lacking due to small study sample sizes. In this paper we examined predictors of dropout in an individual patient data meta-analysis to gain a better understanding of who may benefit from these interventions.
A comprehensive literature search for all randomized controlled trials (RCTs) of psychotherapy for adults with depression from 2006 to January 2013 was conducted. Next, we approached authors to collect the primary data of the selected studies. Predictors of dropout, such as socio-demographic, clinical, and intervention characteristics were examined.
Data from 2705 participants across ten RCTs of self-guided web-based interventions for depression were analysed. The multivariate analysis indicated that male gender [relative risk (RR) 1.08], lower educational level (primary education, RR 1.26) and co-morbid anxiety symptoms (RR 1.18) significantly increased the risk of dropping out, while for every additional 4 years of age, the risk of dropping out significantly decreased (RR 0.94).
Dropout can be predicted by several variables and is not randomly distributed. This knowledge may inform tailoring of online self-help interventions to prevent dropout in identified groups at risk.
Postnatal depression affects about 10–15% of women in the year after giving birth. Many women and healthcare professionals would like an effective and accessible non-pharmacological treatment for postnatal depression.
Women who fulfilled the International Classification of Diseases (ICD)-10 criteria for major depression in the first 6 months postnatally were randomized to receive usual care plus a facilitated exercise intervention or usual care only. The intervention involved two face-to-face consultations and two telephone support calls with a physical activity facilitator over 6 months to support participants to engage in regular exercise. The primary outcome was symptoms of depression using the Edinburgh Postnatal Depression Scale (EPDS) at 6 months post-randomization. Secondary outcomes included EPDS score as a binary variable (recovered and improved) at 6 and 12 months post-randomization.
A total of 146 women were potentially eligible and 94 were randomized. Of these, 34% reported thoughts of self-harming at baseline. After adjusting for baseline EPDS, analyses revealed a −2.04 mean difference in EPDS score, favouring the exercise group [95% confidence interval (CI) −4.11 to 0.03, p = 0.05]. When also adjusting for pre-specified demographic variables the effect was larger and statistically significant (mean difference = −2.26, 95% CI −4.36 to −0.16, p = 0.03). Based on EPDS score a larger proportion of the intervention group was recovered (46.5% v. 23.8%, p = 0.03) compared with usual care at 6 months follow-up.
This trial shows that an exercise intervention involving encouragement to exercise and to seek social support for exercising may be an effective treatment for women with postnatal depression, including those with thoughts of self-harming.
This paper describes the system architecture of a newly constructed radio telescope – the Boolardy Engineering Test Array, which is a prototype of the Australian Square Kilometre Array Pathfinder telescope. Phased array feed technology is used to form multiple simultaneous beams per antenna, providing astronomers with unprecedented survey speed. The test array described here is a six-antenna interferometer, fitted with prototype signal processing hardware capable of forming at least nine dual-polarisation beams simultaneously, allowing several square degrees to be imaged in a single pointed observation. The main purpose of the test array is to develop beamforming and wide-field calibration methods for use with the full telescope, but it will also be capable of limited early science demonstrations.
Tail biting is a serious animal welfare and economic problem in pig production. Tail docking, which reduces but does not eliminate tail biting, remains widespread. However, in the EU tail docking may not be used routinely, and some ‘alternative’ forms of pig production and certain countries do not allow tail docking at all. Against this background, using a novel approach focusing on research where tail injuries were quantified, we review the measures that can be used to control tail biting in pigs without tail docking. Using this strict criterion, there was good evidence that manipulable substrates and feeder space affect damaging tail biting. Only epidemiological evidence was available for effects of temperature and season, and the effect of stocking density was unclear. Studies suggest that group size has little effect, and the effects of nutrition, disease and breed require further investigation. The review identifies a number of knowledge gaps and promising avenues for future research into prevention and mitigation. We illustrate the diversity of hypotheses concerning how different proposed risk factors might increase tail biting through their effect on each other or on the proposed underlying processes of tail biting. A quantitative comparison of the efficacy of different methods of provision of manipulable materials, and a review of current practices in countries and assurance schemes where tail docking is banned, both suggest that daily provision of small quantities of destructible, manipulable natural materials can be of considerable benefit. Further comparative research is needed into materials, such as ropes, which are compatible with slatted floors. Also, materials which double as fuel for anaerobic digesters could be utilised. As well as optimising housing and management to reduce risk, it is important to detect and treat tail biting as soon as it occurs. 
Early warning signs before the first bloody tails appear, such as pigs holding their tails tucked under, could in future be automatically detected using precision livestock farming methods enabling earlier reaction and prevention of tail damage. However, there is a lack of scientific studies on how best to respond to outbreaks: the effectiveness of, for example, removing biters and/or bitten pigs, increasing enrichment, or applying substances to tails should be investigated. Finally, some breeding companies are exploring options for reducing the genetic propensity to tail bite. If these various approaches to reduce tail biting are implemented we propose that the need for tail docking will be reduced.
Soprano Emma Juch (1860–1939), famous in the 1880s and 1890s, combined singing in concerts and festivals with a short English-language operatic career. Because Juch exemplifies a typical prima donna of the late nineteenth century, her life provides a perspective on the American cultural landscape that a focus on star performers cannot capture. Like all female singers, she had to negotiate between competing stereotypes about divas and the nineteenth-century distrust of women who led public lives. In response to these pressures, she constructed an image of a vigorous American singer who nevertheless understood her expected role in society. During the Gilded Age, opera's place in American culture was changing. Foreign-language opera became increasingly associated with wealth, the highest performance quality, and sometimes even cultural and moral uplift, whereas English-language opera suggested popular entertainment for the middle class and mediocre performance standards. The American Opera Company and Juch's own Emma Juch English Grand Opera Company attempted to fight against these assumptions and to establish opera in English, performed by native singers, as an important component of a distinctly American musical tradition. These efforts were unsuccessful, however, and Juch's career, which began with great promise, lost momentum after her opera troupe folded; she slid into obscurity.