Identify which NIH Toolbox Cognition Battery (NIHTB-CB) subtest(s) best differentiate healthy controls (HC) from those with amnestic mild cognitive impairment (aMCI), and compare the discriminant accuracy of a model using a priori “Norm Adjusted” scores against a model using “Unadjusted” standard scores with age, sex, race/ethnicity, and education controlled for within the model. Racial differences were also examined.
Methods:
Participants were Black/African American (B/AA) and White consensus-confirmed adults (HC = 96; aMCI = 62), 60–85 years old, who completed the NIHTB-CB for tablet. Discriminant function analysis (DFA) was used in the total sample and separately for B/AA (n = 80) and White participants (n = 78).
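The abstract does not include analysis code, but the core DFA step is straightforward to sketch. Below is a minimal, hypothetical illustration in Python using scikit-learn's linear discriminant analysis on synthetic stand-in data; the group sizes match the abstract, but the subtest names and score distributions are assumptions, not the study's data:

```python
import numpy as np
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
subtests = ["picture_sequence_memory", "flanker", "list_sorting",
            "pattern_comparison", "picture_vocabulary", "oral_reading"]

# Synthetic stand-in scores: 96 HC and 62 aMCI, with a lower episodic-memory
# mean in the aMCI group (illustration only, not the study's data).
hc = pd.DataFrame(rng.normal(100, 15, (96, 6)), columns=subtests)
amci = pd.DataFrame(rng.normal(100, 15, (62, 6)), columns=subtests)
amci["picture_sequence_memory"] -= 15

X = pd.concat([hc, amci], ignore_index=True)
y = np.r_[np.zeros(len(hc)), np.ones(len(amci))]  # 0 = HC, 1 = aMCI

dfa = LinearDiscriminantAnalysis().fit(X, y)
for name, coef in zip(subtests, dfa.coef_[0]):
    print(f"{name}: {coef:+.3f}")                # discriminant coefficients
print("discriminant accuracy:", dfa.score(X, y))
```

In a sketch like this, the subtest with the largest absolute coefficient plays the role of the "highest loading" measure discussed in the results.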
Results:
Picture Sequence Memory (an episodic memory task) was the highest loading coefficient across all DFA models. When stratified by race, differences were noted in the pattern of the highest loading coefficients within the DFAs. However, the overall discriminant accuracy of the DFA models in identifying HCs and those with aMCI did not differ significantly by race (B/AA, White) or model/score type (Norm Adjusted versus Unadjusted).
Conclusions:
Racial differences were noted despite the use of norm-adjusted scores or demographic covariates, highlighting the importance of including underrepresented groups in research. While the models were fairly accurate at identifying consensus-confirmed HCs, they were less accurate at identifying White participants with an aMCI diagnosis. In clinical settings, further work is needed to optimize computerized batteries, and use of the NIHTB-CB norm-adjusted scores is recommended. In research settings, demographically corrected scores or within-model correction is suggested.
Young, older, pregnant, and immunocompromised (YOPI) people are most vulnerable to foodborne illnesses due to impaired or underdeveloped immune systems(1). There is a lack of information regarding how YOPI groups access, receive or use information about food safety, what influences their food safety behaviour, and their preferences for receiving food safety advice. The objective of this research was to develop a better understanding of how YOPI consumers in New Zealand access and use food safety information, the types and sources of food safety information used, how information and advice are obtained, and how these influence their decision-making about food safety practices and related behaviours. Research questions were guided by a rapid review of the literature. Twenty qualitative focus groups (each comprising young, older, pregnant, or immunocompromised individuals), based in one of three locations in New Zealand, were conducted. This was complemented with data from health care providers from relevant sectors (nutritionists, dietitians, aged care providers, cancer nurses, Well Child Tamariki Ora providers, and midwives). Recruitment included a focus on ethnic groups (Māori and Pasifika) to ensure that diverse experiences and perspectives were represented in the research and to reflect NZFS’s interest in developing fit-for-purpose messages and resources for these YOPI populations. Thematic and segmentation analysis was conducted to understand current food safety behaviours and how best to communicate food safety matters. Typologies of participants were developed by grouping participants based on common features: attitudes, beliefs, and experiences. The research revealed that most participants are comfortable with their food safety practices and reported habitual behaviours. Many YOPI did not perceive themselves to be at greater risk of foodborne illness, particularly older people. A key finding was that access to information does not necessarily lead to behaviour change. Groups undergoing periods of change (immunocompromised, pregnant, and young) were more likely to seek additional information. Families and health professionals are trusted sources of information, with all groups reporting some use of the internet as an information source. An individual’s risk perception was the main motivating factor for obtaining and following advice. Habit, cost of food, and lack of information were key barriers to obtaining or acting on information, and pregnant people additionally reported social pressures as a reason not to obtain or act on relevant advice. In general, there are three key types of food-safety messaging all groups would like to receive: situation-specific advice; information received alongside other key information (e.g., starting solids); and general information for the whole population. Gaining insights into YOPI preferences on food safety matters can aid the development of appropriate methods for communicating and engaging with vulnerable people about food safety risks and impacts.
Dietary therapies have revolutionised treatment for irritable bowel syndrome (IBS). However, response rates to the diet with the highest evidence of efficacy (the low FODMAP diet) remain at 50-75%, suggesting other potential drivers of symptom onset. A low food chemical elimination-rechallenge diet targeting bioactive food chemicals (including salicylates, amines, glutamate and other additives) is commonly applied in Australia in patients exhibiting both gastrointestinal and extra-intestinal symptoms. One key food chemical, salicylate, has been shown to elicit symptoms in IBS patients with aspirin-sensitivity(1), and 77% of IBS patients have reported that amine-rich foods trigger symptoms(2). However, data supporting the full low food chemical diet are scant, and safety concerns exist because its restrictive nature can cause nutritional deficiencies and disordered eating. This cross-sectional survey aimed to evaluate the frequency of co-existing extra-intestinal symptoms, as well as explore patient perceptions and use of the low chemical diet in those with IBS and healthy controls. Participants with IBS (IBS-Severity Scoring System (IBS-SSS) >75) and healthy controls (not meeting Rome IV and IBS-SSS ≤75) were recruited via online advertisement. Validated questionnaires were used to assess gastrointestinal symptoms (IBS-SSS), extra-intestinal symptoms (extended PHQ-12), nutrient intake (Comprehensive Nutritional Assessment Tool) and food additive intake (IBD-Food additive questionnaire). Additional questionnaires assessed use of dietary therapies with a specific focus on food chemicals. Data were analysed using independent samples t-tests and chi-square tests. A total of 204 IBS participants (IBS-SSS, 277 ± 79) and 22 healthy controls (IBS-SSS, 36 ± 28; p<0.01) completed the study. IBS participants were more likely to report extra-intestinal symptoms including headaches (p<0.01), migraines (p = 0.03), fatigue (p<0.01), difficulty sleeping (p = 0.03), rhinitis (p = 0.02), urticaria (p = 0.04) and mood disturbance (p<0.01). IBS participants were more likely to report at least one food chemical as a trigger for gastrointestinal (38% vs 13%, p = 0.03) and/or extra-intestinal (30% vs 9%, p = 0.04) symptoms. In the IBS group, the most common suspected dietary triggers for gastrointestinal symptoms were salicylates (19%), followed by MSG (17%) and artificial colours (14%); for extra-intestinal symptoms, MSG (15%) was most common, followed by amines (14%) and sulphites (12%). There was no significant difference in consumption of ultra-processed, additive-containing foods. Twenty-one (10%) IBS participants were following a low chemical diet, with dietary advice provided by a dietitian (n = 13), general practitioner (n = 6), gastroenterologist (n = 6), naturopath (n = 3), or family/friend (n = 4), and/or the diet was self-initiated (n = 7). Fourteen of the 21 (67%) reported following both a low food chemical and a low FODMAP diet. Patients with IBS are more likely to report extra-intestinal symptoms compared with healthy controls. Despite limited evidence, a low food chemical diet is utilised to manage both gastrointestinal and extra-intestinal symptoms. Of concern, many respondents following a low food chemical diet reported also following a low FODMAP diet, which may have implications for nutritional adequacy.
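As context for the categorical comparisons above (e.g., 38% vs 13% reporting a food chemical trigger), a chi-square test on a 2x2 contingency table is the standard computation. A minimal sketch follows, with counts reconstructed from the rounded percentages and group sizes, so it is illustrative rather than a reanalysis; with cell counts this small, Fisher's exact test may be preferable.

```python
# Chi-square test of independence for a 2x2 table, mirroring comparisons such
# as 38% vs 13% reporting a food chemical trigger. Counts are reconstructed
# from the rounded percentages (204 IBS, 22 controls) and are illustrative.
from scipy.stats import chi2_contingency

table = [[78, 204 - 78],   # IBS: trigger reported / not reported
         [3, 22 - 3]]      # healthy controls
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```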
We consider planar flow involving two viscous fluids in a porous medium. One fluid is injected through a line source at the origin and moves radially outwards, pushing the second, ambient fluid outwards. There is an interface between the two fluids and if the inner injected fluid is of lower viscosity, the interface is unstable to small disturbances and radially directed unstable Saffman–Taylor fingers are produced. A linearized theory is presented and is compared with nonlinear results obtained using a numerical spectral method. An additional theory is also discussed, in which the sharp interface is replaced with a narrow diffuse interfacial region. We show that the nonlinear results are in close agreement with the linearized theory for small-amplitude disturbances at early times, but that large-amplitude fingers develop at later times and can even detach completely from the initial injection region.
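For readers new to the linearized theory, the classical dispersion relation for radial fingering under Darcy flow (cf. Paterson 1981) captures the mechanism described above. In a representative form, under our own notational assumptions rather than the paper's, with injection rate Q, interface radius R(t), inner and outer viscosities μ1 and μ2, permeability k, and surface tension γ, the amplitude ε_n of the nth azimuthal mode grows at the rate:

```latex
\frac{\dot{\varepsilon}_n}{\varepsilon_n}
  = \frac{Q}{2\pi R^{2}}\left[\, n\,\frac{\mu_2-\mu_1}{\mu_1+\mu_2} - 1 \right]
  - \frac{\gamma\, k\, n\,(n^{2}-1)}{(\mu_1+\mu_2)\, R^{3}}
```

The first term drives instability when the injected fluid is less viscous (μ1 < μ2), while the surface tension term stabilizes short wavelengths (large n), which is why fingers emerge at intermediate mode numbers before nonlinear effects take over.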
In research, and particularly clinical trials, it is important to identify persons at high risk for developing Alzheimer’s Disease (AD), such as those with Mild Cognitive Impairment (MCI). However, not all persons with this diagnosis have a high risk of AD, as MCI can be broken down further into amnestic MCI (aMCI), which carries a high risk specifically for AD, and non-amnestic MCI (naMCI), which predominantly carries risk for other dementias. People with aMCI differ from healthy controls and those with naMCI largely on memory tasks, as memory impairment is the hallmark criterion for an amnestic diagnosis. Given the growing use of the NIH Toolbox Cognition Battery in research trials, this project investigated which Toolbox Cognition measures best differentiated aMCI from naMCI and from persons with normal cognition.
Participants and Methods:
A retrospective data analysis was conducted investigating performance on NIH Toolbox Cognition tasks among 199 participants enrolled in the Michigan Alzheimer’s Disease Research Center. All participants were over age 50 (51-89 years, M=70.64) and had a diagnosis of aMCI (N=74), naMCI (N=24), or Normal Cognition (N=101). Potential demographic differences were investigated using chi-square tests and ANOVAs. A repeated-measures general linear model was used to examine potential group differences in Toolbox Cognition performance, covarying for age, which differed significantly between aMCI and Normal participants. Linear regression was used to determine which cognitive abilities, as measured by the Uniform Data Set-3 (UDS3), might contribute to Toolbox differences noted between the naMCI and aMCI groups.
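As an illustration of the age-covaried group comparison described here, the following hypothetical Python sketch fits an ANCOVA-style general linear model with statsmodels; the data frame, column names, and effect sizes are synthetic assumptions, not the study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 199
df = pd.DataFrame({
    "group": rng.choice(["aMCI", "naMCI", "Normal"], size=n, p=[0.37, 0.12, 0.51]),
    "age": rng.uniform(51, 89, size=n),
})
# Synthetic outcome with a lower memory score for aMCI (illustration only).
df["toolbox_memory"] = 100 - 10 * (df["group"] == "aMCI") + rng.normal(0, 15, n)

# Group effect on Toolbox performance, covarying for age (ANCOVA-style GLM).
model = smf.ols("toolbox_memory ~ C(group) + age", data=df).fit()
print(model.summary().tables[1])
```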
Results:
As expected, aMCI had lower Toolbox memory scores compared to naMCI (p=0.007) and Normals (p<0.001). Interestingly, naMCI had lower Oral Reading scores than both aMCI (p=0.008) and Normals (p<0.001). There were no other Toolbox performance differences between the MCI groups. Performance on the following UDS3 measures explained 19.4% of the variance in Oral Reading scores: Benson delayed recall (inverse relationship), and backward digit span and phonemic fluency (positive relationships).
Conclusions:
In this study, Toolbox Picture Sequence Memory and Oral Reading scores differentiated the aMCI and naMCI groups. While the difference in memory was expected, it was surprising that the naMCI group performed worse than the aMCI and normal groups on the Toolbox Oral Reading task, a task presumed to reflect crystallized abilities resistant to cognitive decline. Results suggest that Oral Reading is primarily positively associated with working memory and executive tasks from the UDS3, but negatively associated with visual memory. It is possible that the Oral Reading subtest is sensitive to domains of deficit aside from memory that can best distinguish aMCI from naMCI. A better understanding of the underlying features of the Oral Reading task will assist in better characterizing the deficit patterns seen in naMCI, making selection of aMCI participants for clinical trials more effective.
Herbicide-resistant annual bluegrass (Poa annua L.) has become a problem in non-arable land areas. In arable fields, P. annua is frequently of lower priority in weed control programs due to the variety of control options available and its relatively modest impact on crop yield compared with other species. In Ireland, postemergence herbicides are not primarily intended for P. annua control, but some herbicides, including the acetolactate synthase (ALS) inhibitor mesosulfuron-methyl + iodosulfuron-methyl, exhibit P. annua activity. In this study, a suspected P. annua population (POAAN-R) that survived mesosulfuron-methyl + iodosulfuron-methyl at 0.75 of the field recommended rate was sampled from a wheat (Triticum aestivum L.) field in County Dublin, Ireland. Single-dose testing confirmed that the suspected POAAN-R had evolved resistance to mesosulfuron-methyl + iodosulfuron-methyl and, additionally, to pyroxsulam (not registered in Ireland for P. annua control), but was sensitive to clethodim, glyphosate, pendimethalin, and flufenacet. Dose–response experiments indicated that POAAN-R was more resistant (GR50 resistance index) to both mesosulfuron-methyl + iodosulfuron-methyl (47.8 times) and pyroxsulam (38.0 times) than the sensitive POAAN-S, and this resistance was associated with a mutation at Trp-574 in the ALS protein. Malathion (a cytochrome P450 [P450] inhibitor) pretreatment did not reverse POAAN-R resistance to mesosulfuron-methyl + iodosulfuron-methyl or pyroxsulam at the field rate or above. The natural inherent mutation at Ile-1781 in the acetyl-CoA carboxylase protein had no effect on either POAAN-R or POAAN-S sensitivity to clethodim. The glyphosate sensitivity of POAAN-R also corresponded with no known mutation in the 5-enolpyruvylshikimate-3-phosphate synthase protein. Based on field histories, poor early-season weed control coupled with intensive use of mesosulfuron-methyl + iodosulfuron-methyl (often at reduced rates) has unintentionally selected for ALS inhibitor–resistant POAAN-R. This is the first report to characterize resistance in P. annua to the ALS-inhibiting herbicides mesosulfuron-methyl + iodosulfuron-methyl and pyroxsulam in an arable setting. There is an opportunity to effectively control POAAN-R with herbicides, but this requires a wide-ranging and varied approach, coupled with cultural/nonchemical practices.
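For readers unfamiliar with the GR50 resistance index used above: GR50 is the dose reducing growth by 50%, typically estimated from a log-logistic dose-response fit, and the index is the ratio of the resistant to the sensitive population's GR50. A minimal Python sketch on synthetic data follows; the dose levels and biomass values are assumptions for illustration, not the study's measurements:

```python
# Log-logistic dose-response fit to estimate GR50 and the resistance
# index GR50_R / GR50_S. Data are synthetic illustrations.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, d, b, gr50):
    """Three-parameter log-logistic: upper limit d, slope b, inflection GR50."""
    return d / (1.0 + (dose / gr50) ** b)

doses = np.array([0.5, 1, 2, 4, 8, 16, 32, 64.0])  # rates relative to field rate
biomass_s = log_logistic(doses, 100, 1.5, 1.2) + np.random.default_rng(2).normal(0, 3, doses.size)
biomass_r = log_logistic(doses, 100, 1.5, 55.0) + np.random.default_rng(3).normal(0, 3, doses.size)

(_, _, gr50_s), _ = curve_fit(log_logistic, doses, biomass_s, p0=[100, 1, 1])
(_, _, gr50_r), _ = curve_fit(log_logistic, doses, biomass_r, p0=[100, 1, 10])
print(f"GR50 (S) = {gr50_s:.2f}, GR50 (R) = {gr50_r:.2f}, RI = {gr50_r / gr50_s:.1f}")
```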
Clinical trials face many challenges with meeting projected enrollment and retention goals. A study’s recruitment materials and messaging convey necessary key information and therefore serve as a critical first impression with potential participants. Yet study teams often lack the resources and skills needed to develop engaging, culturally tailored, and professional-looking recruitment materials. To address this gap, the Recruitment Innovation Center recently developed a Recruitment & Retention Materials Content and Design Toolkit, which offers research teams guidance, actionable tips, resources, and customizable templates for creating trial-specific study materials. This paper seeks to describe the creation and contents of this new toolkit.
Over the past 2 decades, several categorizations have been proposed for the abnormalities of the aortic root. These schemes have mostly been devoid of input from specialists in congenital cardiac disease. The aim of this review is to provide a classification, from the perspective of these specialists, based on an understanding of normal and abnormal morphogenesis and anatomy, with emphasis placed on the features of clinical and surgical relevance. We contend that the description of the congenitally malformed aortic root is simplified when approached in a fashion that recognizes the normal root to be made up of 3 leaflets, supported by their own sinuses, with the sinuses themselves separated by the interleaflet triangles. The malformed root, usually found in the setting of 3 sinuses, can also be found with 2 sinuses, and very rarely with 4 sinuses. This permits description of trisinuate, bisinuate, and quadrisinuate variants, respectively. This feature then provides the basis for classification of the anatomical and functional number of leaflets present. By offering standardized terms and definitions, we submit that our classification will be suitable for those working in all cardiac specialties, whether pediatric or adult. It is of equal value in the settings of acquired or congenital cardiac disease. Our recommendations will serve to amend and/or add to the existing International Paediatric and Congenital Cardiac Code, along with the eleventh iteration of the International Classification of Diseases provided by the World Health Organization.
Background: Our aim was to develop a National Quality Indicators Set for the Care of Adults Hospitalized for Neurological Problems, to serve as a foundation for building regional or national quality initiatives in Canadian neurology centres. Methods: We used a national eDelphi process to develop a suite of quality indicators and a parallel process of surveys and patient focus groups to identify patient priorities. Canadian content and methodology experts were invited to participate. To be included, >70% of participants had to rate an item as critical and <15% had to rate it as not important. Two rounds of surveys and consensus meetings were used to identify and rank indicators, followed by national consultation with members of the Canadian Neurological Society. Results: In total, 38 neurologists and methodologists and 56 patients/caregivers participated in this project. An initial list of 91 possible quality indicators was narrowed to 40 indicators across multiple categories of neurological conditions. Twenty-one patient priorities were identified. Conclusions: This quality indicators suite can be used regionally or nationally to drive improvement initiatives for inpatient neurology care. In addition, we identified multiple opportunities for further research where evidence was lacking or patient and provider priorities did not align.
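The stated inclusion rule lends itself to a simple check; here is a hypothetical sketch of applying it (indicator names and rating proportions are invented for illustration):

```python
# Applying the stated consensus rule: retain an item if >70% of participants
# rate it critical and <15% rate it not important. Values are invented.
ratings = {
    "indicator A": {"critical": 0.82, "not_important": 0.05},
    "indicator B": {"critical": 0.64, "not_important": 0.20},
}

for item, r in ratings.items():
    keep = r["critical"] > 0.70 and r["not_important"] < 0.15
    print(item, "->", "retain" if keep else "drop")
```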
Ocean-driven melt of Antarctic ice shelves is an important control on mass loss from the ice sheet, but it is complex to study due to significant spatial and temporal variability in melt rates. Here we assess the strengths and weaknesses of satellite and field-based observations as tools for testing models of ice-shelf melt. We discuss how the complementary use of field, satellite and model data can be a powerful but underutilised tool for studying melt processes. Finally, we identify some community initiatives working to collate and publish coordinated melt rate datasets, which can be used in future for validating satellite-derived maps of melt and evaluating processes in numerical simulations.
Neurological involvement associated with SARS-CoV-2 infection is increasingly recognized. However, the specific characteristics and prevalence in pediatric patients remain unclear. The objective of this study was to describe the neurological involvement in a multinational cohort of hospitalized pediatric patients with SARS-CoV-2.
Methods:
This was a multicenter observational study of children <18 years of age with confirmed SARS-CoV-2 infection, or with multisystem inflammatory syndrome in children (MIS-C) and laboratory evidence of SARS-CoV-2 infection, admitted to 15 tertiary hospitals/healthcare centers in Canada, Costa Rica, and Iran from February 2020 to May 2021. Descriptive statistical analyses were performed, and logistic regression was used to identify factors associated with neurological involvement.
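The odds ratios and confidence intervals reported below are exponentiated logistic regression coefficients. A minimal, hypothetical statsmodels sketch of that computation on synthetic data follows; the variable names are stand-ins for the study's predictors:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 697
df = pd.DataFrame({
    "neuro": rng.integers(0, 2, n),   # neurological signs/symptoms (0/1)
    "icu": rng.integers(0, 2, n),
    "mis_c": rng.integers(0, 2, n),
    "fever": rng.integers(0, 2, n),
})

fit = smf.logit("neuro ~ icu + mis_c + fever", data=df).fit(disp=0)
ors = np.exp(fit.params)     # odds ratios
ci = np.exp(fit.conf_int())  # 95% confidence intervals
print(pd.concat([ors.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```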
Results:
One hundred forty-seven (21%) of 697 hospitalized children with SARS-CoV-2 infection had neurological signs/symptoms. Headache (n = 103), encephalopathy (n = 28), and seizures (n = 30) were the most frequently reported. Neurological signs/symptoms were significantly associated with ICU admission (OR: 1.71, 95% CI: 1.15–2.55; p = 0.008), satisfaction of MIS-C criteria (OR: 3.71, 95% CI: 2.46–5.59; p < 0.001), fever during hospitalization (OR: 2.15, 95% CI: 1.46–3.15; p < 0.001), and gastrointestinal involvement (OR: 2.31, 95% CI: 1.58–3.40; p < 0.001). Non-headache neurological manifestations were significantly associated with ICU admission (OR: 1.92, 95% CI: 1.08–3.42; p = 0.026), underlying neurological disorders (OR: 2.98, 95% CI: 1.49–5.97, p = 0.002), and a history of fever prior to hospital admission (OR: 2.76, 95% CI: 1.58–4.82; p < 0.001).
Discussion:
In this study, approximately 21% of hospitalized children with SARS-CoV-2 infection had neurological signs/symptoms. Future studies should focus on pathogenesis and long-term outcomes in these children.
Virtual reality has emerged as a unique educational modality for medical trainees. However, incorporation of virtual reality curricula into formal training programmes has been limited. We describe a multi-centre effort to develop, implement, and evaluate the efficacy of a virtual reality curriculum for residents participating in paediatric cardiology rotations.
Methods:
A virtual reality software program (“The Stanford Virtual Heart”) was utilised. Users are placed “inside the heart” and explore non-traditional views of cardiac anatomy. Modules for six common congenital heart lesions were developed, including narrative scripts. A prospective case–control study was performed involving three large paediatric residency programmes. From July 2018 to June 2019, trainees participating in an outpatient cardiology rotation completed a 27-question, validated assessment tool. From July 2019 to February 2020, trainees completed the virtual reality curriculum and assessment tool during their cardiology rotation. Qualitative feedback on the virtual reality experience was also gathered. Intervention and control group performances were compared using univariate analyses.
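The abstract does not specify the test used; for a comparison of mean scores like the one reported in the results below, a two-sample (Welch's) t-test is a common choice. The sketch draws synthetic scores at the group sizes, means, and SDs reported in the results, so it is illustrative only:

```python
# Welch's two-sample t-test comparing assessment scores, with synthetic data
# drawn at the reported group sizes, means, and SDs (illustration only).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
control = rng.normal(18.8, 3.8, 80)        # control group (n = 80)
intervention = rng.normal(20.4, 2.9, 52)   # intervention group (n = 52)
t, p = ttest_ind(intervention, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```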
Results:
There were 80 trainees in the control group and 52 in the intervention group. Trainees in the intervention group achieved higher scores on the assessment (20.4 ± 2.9 versus 18.8 ± 3.8 out of 27 questions answered correctly, p = 0.01). Further analysis showed significant improvement in the intervention group for questions specifically testing visuospatial concepts. In total, 100% of users recommended integration of the programme into the residency curriculum.
Conclusions:
Virtual reality is an effective and well-received adjunct to clinical curricula for residents participating in paediatric cardiology rotations. Our results support continued virtual reality use and expansion to include other trainees.
Understanding how cardiovascular structure and physiology guide management is critically important in paediatric cardiology. However, few validated educational tools are available to assess trainee knowledge. To address this deficit, paediatric cardiologists and fellows from four institutions collaborated to develop a multimedia assessment tool for use with medical students and paediatric residents. This tool was developed in support of a novel 3-dimensional virtual reality curriculum created by our group.
Methods:
Educational domains were identified, and questions were iteratively developed by a group of clinicians from multiple centres to assess understanding of key concepts. To evaluate content validity, content experts completed the assessment and reviewed items, rating item relevance to educational domains using a 4-point Likert scale. An item-level content validity index was calculated for each question, and a scale-level content validity index was calculated for the assessment tool, with scores of ≥0.78 and ≥0.90, respectively, representing excellent content validity.
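The content validity indices described here are simple proportions: the item-level CVI is the share of experts rating an item 3 or 4 on the 4-point relevance scale, and the scale-level CVI (by the averaging method) is the mean of the item-level values. A small sketch with invented ratings:

```python
# Item-level CVI (I-CVI) and scale-level CVI (S-CVI, averaging method).
# Rows = items, columns = experts; ratings are invented for illustration.
import numpy as np

ratings = np.array([
    [4, 4, 3, 4, 4],
    [3, 4, 4, 2, 4],
    [2, 3, 2, 3, 2],
])

i_cvi = (ratings >= 3).mean(axis=1)  # proportion of experts rating 3 or 4
s_cvi = i_cvi.mean()                 # average of the item-level indices
print("I-CVI:", i_cvi, " S-CVI:", round(s_cvi, 2))
```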
Results:
The mean content expert assessment score was 92% (range 88–97%). Two questions were answered correctly by ≤50% of content experts. The item-level content validity index for 29 out of 32 questions was ≥0.78, and the scale-level content validity index was 0.92. Qualitative feedback included suggestions for future improvement. Questions with ≤50% content expert agreement and item-level content validity index scores <0.78 were removed, yielding a 27-question assessment tool.
Conclusions:
We describe a multi-centre effort to create and validate a multimedia assessment tool which may be implemented within paediatric trainee cardiology curricula. Future efforts may focus on content refinement and expansion to include additional educational domains.
Approximately 10% of patients report allergies to penicillin, yet >90% of these allergies are not clinically significant. Patients reporting penicillin allergies are often treated with second-line, non–β-lactam antibiotics that are typically broader spectrum and more toxic. Orders for β-lactam antibiotics for these patients trigger interruptive alerts, even when electronic health record (EHR) data indicate prior β-lactam exposure.
Objective:
To describe the rate at which interruptive penicillin allergy alerts display for patients who have previously had a β-lactam exposure.
Design:
Retrospective EHR review from January 2013 through June 2018.
Setting:
A nonprofit health system including 1 large tertiary-care medical center, a smaller associated hospital, 2 emergency departments, and ~250 outpatient clinics.
Participants:
All patients with EHR-documented penicillin allergies.
Methods:
We examined interruptive penicillin allergy alerts and identified the number and percentage of alerts that displayed for patients with a prior administration of a penicillin-class or other β-lactam antibiotic.
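In implementation terms, this measurement reduces to joining alert events against prior medication administrations per patient. A hypothetical pandas sketch follows; the table and column names are invented, not the health system's schema:

```python
import pandas as pd

# Hypothetical alert and medication-administration extracts.
alerts = pd.DataFrame({
    "patient_id": [1, 1, 2, 3],
    "alert_time": pd.to_datetime(["2015-03-01", "2016-07-12",
                                  "2014-05-20", "2017-01-05"]),
})
admins = pd.DataFrame({
    "patient_id": [1, 3],
    "admin_time": pd.to_datetime(["2014-02-11", "2017-06-30"]),
    "drug_class": ["penicillin", "cephalosporin"],
})

# For each alert, was there any beta-lactam administration before it fired?
merged = alerts.merge(admins, on="patient_id", how="left")
merged["prior"] = merged["admin_time"] < merged["alert_time"]
prior_any = merged.groupby(["patient_id", "alert_time"])["prior"].any()
print(f"{100 * prior_any.mean():.0f}% of alerts had a prior beta-lactam exposure")
```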
Results:
Of 115,081 allergy alerts that displayed during the study period, 8% were displayed for patients who had an inpatient administration of a penicillin antibiotic after the allergy was noted, and 49% were displayed for patients with a prior inpatient administration of any β-lactam.
Conclusions:
Many interruptive penicillin allergy alerts display for patients who would likely tolerate a penicillin, and half of all alerts display for patients who would likely tolerate another β-lactam.
Cognitive developmental research continues to shift from a mechanistic paradigm toward a more contextualized approach, especially in the search to uncover contextual factors that may play a role in cognitive development (see Rogoff, Dahl, & Callahan, 2018). This is certainly the case in the memory literature, where there exists rich documentation of children’s memory skills, but less research on the origins of mnemonic strategies and how they are supported by contextual aspects of children’s everyday lives. This chapter builds on the existing literature on children’s deliberate memory and strategy use and highlights one exemplar of this shift, namely the evolution of a program of research by Ornstein, Coffman, and colleagues: the Classroom Memory Study. This collaborative work began as an effort to characterize children’s changing skills over time while simultaneously identifying mechanisms in the elementary classroom context that may underlie children’s developing strategies for remembering. It has since evolved to include an examination of other cognitive outcomes, as well as experimental manipulations that can inform teacher interventions to facilitate children’s cognitive growth.
Higher consumption of ‘ultra-processed’ (UP) foods has been linked to adverse health outcomes. The present paper aims to characterise percentage energy from UP foods by participant socio-economic status (SES), diet quality, self-reported food expenditure and energy-adjusted diet cost. Participants in the population-based Seattle Obesity Study III (n 755), conducted in WA in 2016–2017, completed socio-demographic and food expenditure surveys and the FFQ. Education and residential property values were measures of SES. Retail prices of FFQ component foods (n 378) were used to estimate individual-level diet cost. The Healthy Eating Index (HEI-2015) and Nutrient Rich Food Index 9.3 (NRF9.3) were measures of diet quality. UP foods were identified following the NOVA classification. Multivariable linear regressions were used to test associations between percentage energy from UP foods and socio-demographics, two estimates of food spending, and diet quality measures. Higher percentage energy from UP foods was associated with higher energy density and lower HEI-2015 and NRF9.3 scores. The bottom decile of diet cost ($216·4/month) was associated with 67·5 % energy from UP foods; the top decile ($369·9/month) was associated with only 48·7 % energy from UP foods. Percentage energy from UP foods was inversely linked to food expenditures and diet cost. In multivariate analysis, percentage energy from UP foods was predicted by lower food expenditures, diet cost, and education, adjusting for covariates. Percentage energy from UP foods was linked to lower food spending and lower SES. Efforts to reduce UP food consumption, an increasingly common policy measure, need to take affordability, food expenditures and diet costs into account.
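The exposure variable here, percentage energy from UP foods, is computed by tagging FFQ line items with a NOVA class and summing energy. A hypothetical pandas sketch follows; the column names and values are invented for illustration:

```python
# Percentage energy from ultra-processed (NOVA class 4) foods per participant,
# from FFQ line items. Column names and values are invented.
import pandas as pd

ffq = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2],
    "kcal": [500, 300, 1200, 900, 600],
    "nova": [4, 1, 4, 1, 4],
})

ffq["up_kcal"] = ffq["kcal"].where(ffq["nova"] == 4, 0)
totals = ffq.groupby("participant")[["up_kcal", "kcal"]].sum()
pct_up = 100 * totals["up_kcal"] / totals["kcal"]
print(pct_up)  # percentage energy from UP foods per participant
```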
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity would change the expected detection rate for post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
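As background to these sensitivity and event-rate claims, detectability is governed by the standard matched-filter signal-to-noise ratio, which weights the signal spectrum by the detector noise power spectral density S_n(f); lowering S_n(f) in the 2–4 kHz band therefore directly raises post-merger detection rates:

```latex
\rho^{2} = 4 \int_{0}^{\infty} \frac{|\tilde{h}(f)|^{2}}{S_n(f)}\, \mathrm{d}f
```

Here h̃(f) is the frequency-domain signal, so a post-merger remnant whose power is concentrated above 1 kHz gains signal-to-noise in direct proportion to the noise suppression achieved in that band.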