Nasal septoplasty is one of the most commonly performed procedures within ENT. Nasal obstruction secondary to a deviated nasal septum is the primary indication for functional septoplasty. Since the coronavirus disease 2019 pandemic, waiting lists have lengthened considerably. This study assessed patients on the waiting list for septoplasty and/or inferior turbinate reduction surgery using the Nasal Obstruction Symptom Evaluation instrument.
Method
Patients on our waiting list for septoplasty and/or inferior turbinate reduction surgery were reviewed using a validated patient-reported outcome measure tool to assess symptom severity.
Results
Eighty-six of 88 patients (98 per cent) had Nasal Obstruction Symptom Evaluation scores of 30 or more. Of these, 78 patients (89 per cent) met the threshold for at least 'severe' nasal obstruction and 50 (57 per cent) for 'extreme' nasal obstruction. Two patients scored less than 30 and were classified as having non-significant nasal obstruction.
Conclusion
The Nasal Obstruction Symptom Evaluation instrument is a quick and easy way to validate septoplasty waiting lists. In this study, two patients were identified who no longer required surgery.
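For illustration, a short Python sketch of how NOSE scores (0–100 scale) might be triaged is given below. The ≥30 cut-off for clinically significant obstruction is the one used in this study; the moderate/severe/extreme bands follow a commonly cited severity classification and are an assumption here rather than values reported in this abstract.

def classify_nose(score):
    """Triage a NOSE score (0-100).
    <30 is treated as non-significant obstruction, as in this study;
    the remaining bands are assumed from a commonly cited classification."""
    if score < 30:
        return "non-significant"
    if score <= 50:
        return "moderate"
    if score <= 75:
        return "severe"
    return "extreme"

# Example: a score of 80 or more would be flagged as 'extreme'
print(classify_nose(80))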
White matter hyperintensity (WMH) burden is greater, has a frontal-temporal distribution, and is associated with proxies of exposure to repetitive head impacts (RHI) in former American football players. These findings suggest that in the context of RHI, WMH might have unique etiologies that extend beyond those of vascular risk factors and normal aging processes. The objective of this study was to evaluate the correlates of WMH in former elite American football players. We examined markers of amyloid, tau, neurodegeneration, inflammation, axonal injury, and vascular health and their relationships to WMH. A group of age-matched asymptomatic men without a history of RHI was included to determine the specificity of the relationships observed in the former football players.
Participants and Methods:
240 male participants aged 45-74 (60 unexposed asymptomatic men, 60 former college football players, and 120 former professional football players) underwent semi-structured clinical interviews, magnetic resonance imaging (structural T1, T2 FLAIR, and diffusion tensor imaging [DTI]), and lumbar puncture to collect cerebrospinal fluid (CSF) biomarkers as part of the DIAGNOSE CTE Research Project. Total WMH lesion volumes (TLV) were estimated using the Lesion Prediction Algorithm from the Lesion Segmentation Toolbox. Structural equation modeling, using Full-Information Maximum Likelihood (FIML) to account for missing values, examined the associations between log-TLV and the following variables: total cortical thickness, whole-brain average fractional anisotropy (FA), CSF amyloid β42, CSF p-tau181, CSF sTREM2 (a marker of microglial activation), CSF neurofilament light (NfL), and the revised Framingham stroke risk profile (rFSRP). Covariates included age, race, education, APOE ε4 carrier status, and evaluation site. Bootstrapped 95% confidence intervals assessed statistical significance. Models were fit separately for the football players (college and professional players pooled; n=180) and for the unexposed men (n=60). Because of the difference in sample sizes, estimates were compared descriptively and considered different if the percent change between them exceeded 10%.
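For readers unfamiliar with percentile bootstrap confidence intervals, the Python sketch below illustrates the general idea on a simple ordinary least squares slope; it is not the structural equation model with FIML used in the study, and the inputs are arbitrary arrays supplied by the caller.

import numpy as np

def bootstrap_slope_ci(x, y, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the slope of a simple OLS fit of y on x.
    Illustrative only: the study used structural equation models with FIML,
    not simple OLS."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    rng = np.random.default_rng(seed)
    n = len(x)
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)                # resample cases with replacement
        X = np.column_stack([np.ones(n), x[idx]])  # intercept + predictor
        beta, *_ = np.linalg.lstsq(X, y[idx], rcond=None)
        slopes[b] = beta[1]
    lo, hi = np.quantile(slopes, [alpha / 2, 1 - alpha / 2])
    return lo, hi  # the association is deemed significant if the CI excludes 0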
Results:
In the former football players (mean age=57.2 years, 34% Black, 29% APOE ε4 carriers), reduced cortical thickness (B=-0.25, 95% CI [-0.45, -0.08]), lower average FA (B=-0.27, 95% CI [-0.41, -0.12]), higher p-tau181 (B=0.17, 95% CI [0.02, 0.43]), and higher rFSRP scores (B=0.27, 95% CI [0.08, 0.42]) were associated with greater log-TLV. Compared with the unexposed men, substantial differences in estimates were observed for rFSRP (B_control=0.02, B_football=0.27, 994% difference), average FA (B_control=-0.03, B_football=-0.27, 802% difference), and p-tau181 (B_control=-0.31, B_football=0.17, -155% difference). In the former football players, rFSRP showed a stronger positive association and average FA a stronger negative association with WMH than in the unexposed men. The effect of WMH on cortical thickness was similar between the two groups (B_control=-0.27, B_football=-0.25, 7% difference).
Conclusions:
These results suggest that the risk factors and biological correlates of WMH differ between former American football players and asymptomatic individuals unexposed to RHI. In addition to vascular risk factors, white matter integrity on DTI showed a stronger relationship with WMH burden in the former football players. FLAIR WMH serves as a promising measure to further investigate the late multifactorial pathologies of RHI.
The relationship between costs and health benefits of branded pharmaceuticals remains controversial. This paper examines the incremental costs incurred for incremental health benefits gained from the largest available sample of cost-effectiveness studies of branded drugs in the USA, the 1994–2015 Tufts Registry of Cost-Effectiveness Analyses. Earlier studies used small, specialized samples of drugs. We use linear regression analysis to estimate the association in those studies between additional quality-adjusted life years (QALYs) and incremental pharmaceutical costs. The preferred sample uses 476 studies involving branded pharmaceuticals with both higher costs and increased effectiveness compared to the previous standard of care. Regressions of costs on QALYs imply that an additional QALY is associated, on average, with a $28,561 increase in cost (95% CI, $18,853–$38,270). This regression explains 20% of the variation in sample costs. In this analytical sample, a share of the variation in the cost of pharmaceuticals is, therefore, not random but rather associated with variation in QALYs; prices are to some extent “value-based.” Our results are robust to varying sample inclusion criteria and to the funding source. In subgroup analyses, the highest cost per QALY was $44,367 (95% CI, $35,373–$53,361). Costs of pharmaceuticals in this data set are, on average, lower than common estimates of the monetary value of a QALY to American consumers. As in other studies, we find that sellers of patent-protected beneficial new technology appear to capture only a fraction of the benefits provided.
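A minimal Python sketch of this kind of regression follows: ordinary least squares of incremental cost on incremental QALYs. The arrays below are illustrative placeholders, not data from the Tufts Registry, so the slope and R-squared they produce are not the study's estimates.

import numpy as np

# Placeholder data: incremental QALYs and incremental costs per study
# (illustrative values only, not Tufts Registry data).
incremental_qalys = np.array([0.1, 0.4, 0.8, 1.5, 2.0])
incremental_cost = np.array([4_000, 13_000, 22_000, 45_000, 58_000])

# Ordinary least squares: cost = intercept + slope * QALYs
X = np.column_stack([np.ones_like(incremental_qalys), incremental_qalys])
beta, *_ = np.linalg.lstsq(X, incremental_cost, rcond=None)
slope = beta[1]  # average additional cost per additional QALY

residual_ss = np.sum((incremental_cost - X @ beta) ** 2)
total_ss = np.sum((incremental_cost - incremental_cost.mean()) ** 2)
r_squared = 1 - residual_ss / total_ss

print(f"cost per QALY ~ ${slope:,.0f}; share of cost variation explained: {r_squared:.2f}")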
Ketamine, an N-methyl-D-aspartate receptor antagonist, has been “repurposed” as a rapid-acting antidepressant for treatment-resistant depression (TRD). The S-enantiomer of ketamine, “esketamine,” was FDA approved for TRD and for depressive symptoms in adults with major depressive disorder with suicidal ideation or behavior. Intravenous (IV) ketamine, although less expensive, is often not covered by insurance, whereas intranasal (IN) esketamine, although covered by insurance, can be expensive. There is a paucity of data comparing the efficacy of subanesthetic IV ketamine and IN esketamine for TRD in real-world settings. We therefore compared the efficacy of, and the number of treatments required to achieve remission or response with, repeated subanesthetic IV ketamine and IN esketamine among patients with TRD.
Methods
This was an observational study of consenting adults (≥18 years) with TRD who had received up to 6 IV ketamine infusions (0.5 mg/kg, infused over 40 minutes) or up to 8 IN esketamine (56/84 mg) treatments for TRD at the Mayo Clinic Depression Center. Depression symptoms were measured with the self-report 16-Item Quick Inventory of Depressive Symptomatology (QIDS-SR-16) before and 24 hours after ketamine/esketamine treatment. Remission and response were defined as a QIDS-SR-16 score ≤5 and a ≥50% reduction in QIDS-SR-16 score, respectively. Continuous variables are reported as means ± SD and categorical variables as counts and percentages. The Wilcoxon rank-sum test was used to compare continuous variables; chi-square and Fisher’s exact tests were used to compare categorical variables. The number of treatments to remission/response was calculated.
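As a minimal illustration of these outcome definitions, the Python sketch below flags response and remission from a pair of QIDS-SR-16 scores; the function name and example values are ours, not part of the study's analysis.

def qids_outcomes(baseline_score, post_score):
    """Acute-phase outcome flags from QIDS-SR-16 scores.
    Response: >=50% reduction from baseline; remission: post-treatment score <=5
    (definitions as stated in the Methods above)."""
    response = (baseline_score - post_score) / baseline_score >= 0.5
    remission = post_score <= 5
    return {"response": response, "remission": remission}

# Example: baseline 18, post-treatment 6 -> response without remission
print(qids_outcomes(18, 6))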
Results
Sixty-three adults with TRD were included; they were middle-aged (47.0 ± 12.1 years) and predominantly female (65%), and 76% (n = 48) received IV ketamine while 24% (n = 15) received IN esketamine. The mean (± SE) change in QIDS-SR-16 score was −8.7 ± 0.7 (P < .001), a significant reduction (improvement) from baseline (mean ± SD = 17.6 ± 3.7). Overall remission and response rates in the acute phase were 36.5% and 55.6%, respectively. Response (56.3% vs 53.3%) and remission rates (39.6% vs 26.7%) were similar between patients who received IV ketamine and those who received IN esketamine (P > .05). The mean numbers of treatments required to achieve response (2.5 ± 1.6 vs 4.6 ± 2.1) and remission (2.4 ± 1.3 vs 6.3 ± 2.4) were significantly lower among patients who received IV ketamine than among those who received IN esketamine (P < .005). Most patients tolerated both treatments well.
Conclusion
Intravenous ketamine and intranasal esketamine showed similar response and remission rates in patients with TRD, but the number of treatments required to achieve response/remission was significantly lower with IV ketamine than with IN esketamine. These findings need to be confirmed in a randomized controlled trial comparing the two treatments.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shattering at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter their seed before harvest risk escaping both early-season management and HWSC.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
In 2019, a 42-year-old African man who worked as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
Surgery for congenital heart disease (CHD) has been slow to develop in parts of the former Soviet Union. The impact of an 8-year surgical assistance programme between an emerging centre and a multi-disciplinary international team of healthcare professionals from developed cardiac programmes is analysed and presented.
Material and methods
The international paediatric assistance programme included five main components – intermittent clinical visits to the site annually, medical education, biomedical engineering support, nurse empowerment, and team-based practice development. Data were analysed from visiting teams and local databases before and since commencement of assistance in 2007 (era A: 2000–2007; era B: 2008–2015). The following variables were compared between periods: annual case volume, operative mortality, case complexity based on Risk Adjustment for Congenital Heart Surgery (RACHS-1), and RACHS-adjusted standardised mortality ratio.
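To make the RACHS-adjusted standardised mortality ratio concrete, here is a minimal Python sketch under its usual definition (observed deaths divided by the deaths expected from category-specific benchmark rates); the case counts and benchmark rates below are placeholders, not the programme's data or reference rates.

def rachs_adjusted_smr(cases_by_category, observed_deaths, benchmark_mortality):
    """Standardised mortality ratio adjusted for RACHS-1 case mix.
    Expected deaths = sum over categories of (cases in category x benchmark rate)."""
    expected_deaths = sum(
        n_cases * benchmark_mortality[cat]
        for cat, n_cases in cases_by_category.items()
    )
    return observed_deaths / expected_deaths

# Placeholder example: 100 operations spread over RACHS-1 categories 1-4,
# with assumed benchmark rates (not values from this study).
cases = {1: 40, 2: 30, 3: 20, 4: 10}
benchmarks = {1: 0.004, 2: 0.017, 3: 0.035, 4: 0.09}
print(round(rachs_adjusted_smr(cases, 5, benchmarks), 2))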
Results
A total of 154 RACHS-classifiable operations were performed during era A, with a mean annual case volume by local surgeons of 19.3 (95% confidence interval 14.3–24.2), an operative mortality of 4.6%, and a standardised mortality ratio of 2.1. In era B, surgical volume increased to a mean of 103.1 annual cases (95% confidence interval 69.1–137.2, p<0.0001). There was a non-significant (p=0.84) increase in operative mortality (5.7%) but a decrease in the standardised mortality ratio (1.2), owing to an increase in case complexity. In era B, the proportion of local surgeon-led operations during visits from the international team increased from 0% (0/27) in 2008 to 98% (58/59) in the final year of analysis.
Conclusions
The model of assistance described in this report led to improved risk-adjusted mortality, increased case volume and complexity, and greater independent operating by local surgeons.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
Flumioxazin is a protoporphyrinogen oxidase inhibitor with potential for POST annual bluegrass control and PRE smooth crabgrass control in bermudagrass. However, flumioxazin applications are often less effective in winter, compared with fall, because of reduced efficacy on mature annual bluegrass. The objective of this research was to evaluate tank-mixtures of flumioxazin with six other herbicide mechanisms of action for POST annual bluegrass control in late winter and residual smooth crabgrass control. Flumioxazin at 0 or 0.42 kg ai ha−1 was evaluated in combination with flazasulfuron at 0.05 kg ai ha−1, glufosinate at 1.26 kg ai ha−1, glyphosate at 0.42 kg ae ha−1, mesotrione at 0.28 kg ai ha−1, pronamide at 1.68 kg ai ha−1, or simazine at 1.12 kg ai ha−1. Flumioxazin alone controlled annual bluegrass 61 to 70% at 8 wk after treatment (WAT) in three experiments from 2012 to 2014 in central Georgia. Flumioxazin tank-mixed with flazasulfuron, glyphosate, glufosinate, pronamide, and simazine provided good (80 to 89%) to excellent (> 90%) control of annual bluegrass at 8 WAT in 2 of 3 yr. These tank-mixtures were also more effective than flumioxazin alone in 2 of 3 yr, and control was greater than or equal to that of the tank-mix partners applied alone. Treatments that included flumioxazin provided excellent (≥ 90%) control of smooth crabgrass at 6 mo after treatment in all 3 yr. Overall, tank-mixing flumioxazin with other herbicide chemistries may improve POST annual bluegrass control, compared with the individual herbicides applied alone, and effectively control smooth crabgrass in bermudagrass.