We establish a theory of noncommutative (NC) functions on a class of von Neumann algebras with a particular direct sum property, e.g., $B({\mathcal H})$. In contrast to the theory’s origins, we do not rely on results from the matricial case. We prove that the $k^{\mathrm{th}}$ directional derivative of any NC function at a scalar point is a $k$-linear homogeneous polynomial in its directions. Consequences include the fact that NC functions defined on domains containing scalar points can be uniformly approximated by free polynomials, as well as realization formulas for NC functions bounded on particular sets, e.g., the NC polydisk and the NC row ball.
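For orientation, the expansion underlying these statements can be sketched as follows; the notation is ours and is only schematic, not the paper's.

```latex
% Schematic Taylor--Taylor expansion of an NC function f at a scalar point \lambda
% (notation illustrative, not taken from the paper under discussion).
\[
  f(\lambda 1 + H) \;=\; \sum_{k \ge 0} \Delta^k f(\lambda)\,[\,\underbrace{H,\dots,H}_{k}\,],
\]
% where each \Delta^k f(\lambda) acts k-linearly on its directions, so the kth term is a
% homogeneous free polynomial of degree k in H; truncating the sum yields the
% free-polynomial approximants referred to in the abstract.
```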
The coronavirus disease 2019 pandemic caused substantial changes to healthcare delivery and antibiotic prescribing beginning in March 2020. To assess pandemic impact on Clostridioides difficile infection (CDI) rates, we described patients and trends in facility-level incidence, testing rates, and percent positivity during 2019–2020 in a large cohort of US hospitals.
Methods:
We estimated and compared rates of community-onset CDI (CO-CDI) per 10,000 discharges, hospital-onset CDI (HO-CDI) per 10,000 patient days, and C. difficile testing rates per 10,000 discharges in 2019 and 2020. We calculated percent positivity as the number of inpatients diagnosed with CDI over the total number of discharges with a test for C. difficile. We used an interrupted time series (ITS) design with negative binomial and logistic regression models to describe level and trend changes in rates and percent positivity before and after March 2020.
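As a rough illustration of the rate model described above (not the authors' code; the input file and column names are hypothetical), an ITS negative binomial regression with a patient-day offset could be fit along these lines:

```python
# Sketch of an interrupted time series (ITS) negative binomial model for monthly
# HO-CDI counts with patient days as an offset. Column names ("ho_cdi_cases",
# "patient_days", "month_index", "covid_start_index") are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_facility_cdi.csv")  # hypothetical input file

# Indicator for the post-pandemic-declaration period and the post-period trend term.
df["post_covid"] = (df["month_index"] >= df["covid_start_index"]).astype(int)
df["months_since_covid"] = np.maximum(0, df["month_index"] - df["covid_start_index"])

model = smf.glm(
    "ho_cdi_cases ~ month_index + post_covid + months_since_covid",
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["patient_days"]),
).fit()

# exp(coefficients) give rate ratios: baseline monthly trend, level shift at
# March 2020, and change in monthly trend after March 2020.
print(np.exp(model.params))
```

The percent-positivity analysis described above would use an analogous logistic specification in place of the negative binomial one.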
Results:
In pairwise comparisons, overall CO-CDI rates decreased from 20.0 to 15.8 per 10,000 discharges between 2019 and 2020 (P < .0001). HO-CDI rates did not change. Using ITS, we detected decreasing monthly trends in CO-CDI (−1% per month, P = .0036) and HO-CDI incidence (−1% per month, P < .0001) during the baseline period, prior to the COVID-19 pandemic declaration. We detected no change in monthly trends for CO-CDI or HO-CDI incidence or percent positivity after March 2020 compared with the baseline period.
Conclusions:
Although CDI rates were already on a slight downward trajectory before March 2020, no significant change in trend occurred during the COVID-19 pandemic despite changes in infection control practices, antibiotic use, and healthcare delivery.
Background: Previously, we reported decreasing postadmission urine-culture rates in hospitalized patients between 2012 and 2017, indicating a possible decrease in hospital-onset urinary tract infections or changes in diagnostic practices in acute-care hospitals (ACHs). In this study, we re-evaluated the trends using more recent data from 2017–2020 to assess whether new trends in hospital urine-culturing practices had emerged. Methods: We conducted a longitudinal analysis of monthly urine-culture rates using microbiology data from 355 ACHs participating in the Premier Healthcare Database in 2017–2020. All cultures from the urinary tract collected on or before day 3 were defined as admission urine cultures, and those collected on day 4 or later were defined as postadmission urine cultures. We included discharges from months in which a hospital reported at least 1 urine culture with microbiology and antimicrobial susceptibility test results. Annual admission and postadmission urine-culture rates were estimated using generalized estimating equation (GEE) models with a negative binomial distribution, accounting for hospital-level clustering and adjusting for hospital bed size, teaching status, urban–rural designation, discharge month, and census division. The estimated rate for each year (2018, 2019, and 2020) was compared with the previous year’s estimated rate using rate ratios (RRs) and 95% confidence intervals (CIs) generated through the multivariable GEE models. Results: From 2017 to 2020, we included 8.7 million discharges and 1,943,540 urine cultures, of which 299,013 (15.4%) were postadmission urine cultures. In 2017–2020, unadjusted admission culture rates were 20.0, 19.6, 17.9, and 18.2 per 100 discharges, respectively; similarly, unadjusted postadmission urine-culture rates were 8.6, 7.8, 7.0, and 7.5 per 1,000 patient days. In the multivariable analysis, adjusting for hospital characteristics, no significant changes in admission urine-culture rates were detected during 2017–2019; however, in 2020, admission urine-culture rates increased 6% compared to 2019 (RR, 1.06; 95% CI, 1.02–1.09) (Fig. 1). Postadmission urine-culture rates decreased 4% in 2018 compared to 2017 (RR, 0.96; 95% CI, 0.91–0.99) and 8% in 2019 compared to 2018 (RR, 0.92; 95% CI, 0.87–0.96). In 2020, postadmission urine-culture rates increased 10% compared to 2019 (RR, 1.10; 95% CI, 1.06–1.14) (Fig. 2). Factors significantly associated with postadmission urine-culture rates included discharge month and hospital bed size. For admission urine cultures, discharge month was the only significant factor. Conclusions: Between 2017 and 2019, postadmission urine-culture rates continued a decreasing trend, while admission culture rates remained unchanged. However, in 2020, both admission and postadmission urine-culture rates increased significantly compared to 2019.
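For readers less familiar with this modeling setup, a minimal sketch of a negative binomial GEE of the kind described (illustrative only; the input file and column names are hypothetical, and the covariate list is abbreviated relative to the study's full adjustment set):

```python
# Sketch of a negative binomial GEE for postadmission urine-culture rates with
# hospital-level clustering. File and column names are hypothetical and the
# covariates are abbreviated; this is not the study's code.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("hospital_month_urine_cultures.csv")  # hypothetical input

model = sm.GEE.from_formula(
    "postadmission_cultures ~ C(year) + C(bed_size) + C(teaching) + C(census_division)",
    groups="hospital_id",
    data=df,
    family=sm.families.NegativeBinomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
    offset=np.log(df["patient_days"]),
).fit()

# exp(coefficients) give rate ratios (e.g., each year versus the reference year),
# with 95% confidence intervals from the fitted model.
rr = np.exp(model.params).rename("RR")
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([rr, ci], axis=1))
```

Admission urine-culture rates would be modeled analogously, with discharges rather than patient days as the offset denominator.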
Alle Verbesserung im Politischen soll von Veredlung des Charakters ausgehen.
—Schiller, Ästhetische Briefe, 9. Br.
[All improvement in the political sphere is to proceed from the ennobling of character.]
ONE CAN SCARCELY conceive of two authors who were seemingly more diametrically opposed. Christoph Martin Wieland (1733–1813) had everything Heinrich von Kleist (1777–1811) wanted: family, farm, fame, financial security. Composed and orderly, Wieland hardly traveled anywhere, except in his imagination. He moved to Weimar in 1772 and remained there and in its environs until his death in 1813, the head of a large family. He was one of the most celebrated and influential German writers of the Classical era. Kleist was nowhere at home and was not aligned with any particular literary school. He was constantly on the move, desirous of the fame and acceptance that eluded him, tumbling from one crisis to another: a Kantian crisis, a creative crisis, an existential crisis. Although he penned innovative works of all kinds, he had little success in his lifetime. They met once. Wieland invited Kleist into his home for six weeks in the winter of 1803 and offered solace as well as recognition of his dramatic talent when others in Weimar ignored him. Wieland provided an account of that sojourn in a letter dated April 10, 1804, which gave an early account of Kleist's “erratic” behavior.
The contrast of the old and the new in the chapter's title is multivalent. It refers to the age difference between Wieland and Kleist, the difference in their writing styles, and the difference in their aesthetic concepts. It also refers to the difference in their reactions to the French Revolution (1789–99) and its ultimate aftermath, the Napoleonic Wars (1803–15). Wieland reacted primarily to the former, Kleist to the latter. Wieland was a well-informed, astute observer. Although engaged, Kleist often misjudged situations. The Parteigeist (partisanship) in the chapter's title refers to the attitudinal change occasioned by the Reign of Terror (1793–94). Wieland represents the “old” view of Enlightenment cosmopolitanism and is viewed as a forerunner of the ideals of the European Union, whereas Kleist embodies the “new” view of the end of Enlightenment optimism and can be seen as a precursor of a twentieth-century irrational approach to political questions (Craig, 61, 66, 75).
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. Combining these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
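For readers unfamiliar with leave-site-out cross-validation, the general shape of such an analysis can be sketched as follows; this is not the ConLi+Gen code, and the file, feature names, and hyperparameters are placeholders:

```python
# Sketch of leave-site-out cross-validation for lithium-response regression using
# clinical features plus polygenic risk scores (PRS). File name, column names, and
# hyperparameters are placeholders, not those of the ConLi+Gen analysis.
import pandas as pd
from sklearn.linear_model import Ridge, ElasticNet
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict
from sklearn.metrics import r2_score

df = pd.read_csv("conligen_training.csv")                # hypothetical training set
y = df["lithium_response"]                               # continuous response score
groups = df["study_site"]                                # ten sites -> ten folds
X = df.drop(columns=["lithium_response", "study_site"])  # clinical + PRS columns

models = {
    "ridge": Ridge(alpha=1.0),
    "elastic_net": ElasticNet(alpha=0.1, l1_ratio=0.5),
    "random_forest": RandomForestRegressor(n_estimators=500, random_state=0),
}

# Each model is trained on nine sites and evaluated on the held-out tenth site.
for name, est in models.items():
    pred = cross_val_predict(est, X, y, groups=groups, cv=LeaveOneGroupOut())
    print(f"{name}: leave-site-out R^2 = {r2_score(y, pred):.3f}")
```

The independent test set and the a priori genomic stratification step reported in the Results are omitted from this sketch.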
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future, this approach may help identify which patients are most likely to respond to lithium treatment.
We join Eduardo Bonilla-Silva’s structural theory of the racialized U.S. social system with a situational methodology developed by Arthur L. Stinchcombe and Erving Goffman to analyze how law works as a mechanism that connects formal legal equality with legal cynicism. The data for this analysis come from the trial of a Chicago police detective, Jon Burge, who as leader of an infamous torture squad escaped criminal charges for more than thirty years. Burge was finally charged with perjury and obstruction of justice, charges that obscured and perpetuated the larger structural reality of a code of silence that enabled the racist torture of more than a hundred Black men. This case study demonstrates how the non-transparency of courtroom sidebars plays an important role in perpetuating systemic features of American criminal injustice: a code of silence, racist discrimination, and legal cynicism.
To compare lithium prescribing practices for adults aged 65 years and over in a Psychiatry of Old Age (POA) Service in the North-West of Ireland against best practice guidelines.
Method
A review of the literature informed the development of audit standards for lithium prescribing. These included the National Institute for Health and Care Excellence (NICE) 2014 guidelines, the British National Formulary (2019), and the Maudsley Prescribing Guidelines (2018). Data were collected retrospectively, using an audit-specific data collection tool, from the clinical files of patients on the POA team caseload who were aged 65 years or more and had been prescribed lithium over the past year.
Result
At the time of the audit in February 2020, 18 patients were prescribed lithium; 67% were female and the average age was 74.6 years. Of those prescribed lithium, 50% (n = 9) had a diagnosis of depression, 44% (n = 8) had bipolar affective disorder (BPAD), and 6% (n = 1) had schizoaffective disorder.
78% (n = 14) of patients were on track to meet, or had already met, the NICE standard of 3-monthly serum lithium level monitoring. Lithium levels were checked on average 4.5 times in the past year, the average lithium level across the group was 0.61 mmol/L, and 39% (n = 7) had a lithium level within the recommended therapeutic range (0.6–0.8 mmol/L).
83% (n = 15) of patients met the NICE standard of 3-monthly renal function tests, a thyroid function test was performed in 89% (n = 16), and at least one serum calcium level was documented in 63% (n = 15). Taking into consideration the most recent blood test results, 100% (n = 18) had abnormal renal function, 78% (n = 7) had abnormal thyroid function, and 60% (n = 9) had abnormal serum calcium.
Half (n = 9) were initiated on lithium by the POA service and, of these, 56% (n = 5) had documented renal impairment prior to initiation. Of the patients already on long-term lithium therapy at the time of referral (n = 9), almost half (n = 4) had a documented history of lithium toxicity.
Conclusion
The results of this audit highlight room for improvement in the lithium monitoring of older adults attending the POA service. Furthermore, all patients prescribed lithium had impaired renal function, half had abnormal calcium, and two-fifths had abnormal thyroid function. This is an important finding given the association between comorbid kidney disease and increased risk of inpatient death among patients admitted to hospital with COVID-19.
Our findings highlight the need for 3-monthly renal function monitoring in older adults prescribed lithium, given the additive adverse effects of increasing age and lithium on the kidney. Close working with specialised renal services, to provide timely advice on renal management for patients with renal impairment who are prescribed lithium, is important to minimise adverse patient outcomes.
We present continuous estimates of snow and firn density, layer depth and accumulation from a multi-channel, multi-offset, ground-penetrating radar traverse. Our method uses the electromagnetic velocity, estimated from waveform travel-times measured at common midpoints between sources and receivers. Previously, common-midpoint radar experiments on ice sheets had been limited to point observations. We completed radar velocity analysis in the upper ~2 m to estimate the surface and average snow density of the Greenland Ice Sheet. We parameterized the Herron and Langway (1980) firn density and age model using the radar-derived snow density, radar-derived surface mass balance (2015–2017) and reanalysis-derived temperature data. We applied structure-oriented filtering to the radar image along constant-age horizons and increased the depth at which horizons could be reliably interpreted. We reconstructed the historical instantaneous surface mass balance, which we averaged into annual and multidecadal products along a 78 km traverse for the period 1984–2017. We found good agreement between our physically constrained parameterization and a firn core collected from the dry snow accumulation zone, and gained insights into the spatial correlation of surface snow density.
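As context for the parameterization mentioned above, the Herron and Langway (1980) model treats firn densification as a two-stage process driven by accumulation and temperature; in schematic form (our paraphrase of the commonly quoted formulation, with the published rate constants omitted):

```latex
% Schematic form of the Herron & Langway (1980) densification model; rate
% constants and activation energies are omitted here -- see the original paper.
\[
  \frac{d\rho}{dt} \;=\; k_i(T)\, A^{a_i}\,\bigl(\rho_{\mathrm{ice}} - \rho\bigr),
  \qquad
  k_i(T) \;=\; k_{0,i}\,\exp\!\left(-\frac{E_i}{RT}\right),
\]
% where \rho is firn density, A the accumulation rate, T the firn temperature, and
% the stage index i switches at the critical density (about 550 kg m^-3), with
% a_1 = 1 in stage 1 and a_2 = 1/2 in stage 2. In the traverse described above, the
% radar-derived surface density and surface mass balance supply \rho at the surface
% and A, and reanalysis data supply T.
```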
Background: In recent years, the historic declines in the incidence of methicillin-resistant Staphylococcus aureus (MRSA) bloodstream infections (BSIs) in the United States have slowed. We examined trends in the incidence of community-onset (CO) MRSA BSIs among hospitalized persons with and without substance-use diagnoses. Methods: Using data from >200 US hospitals reporting to the Premier Healthcare Database (PHD) during 2012–2017, we conducted a retrospective study among hospitalized persons aged ≥18 years. MRSA BSIs with substance use were defined as hospitalizations having both a blood culture positive for MRSA and at least 1 International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) or ICD-10-CM diagnostic code for substance use including opioids, cocaine, amphetamines, or other substances (excluding cannabis, alcohol, and nicotine). MRSA BSIs were considered community onset when a positive blood culture was collected within 3 days of admission. We assessed annual trends and described characteristics of CO MRSA BSI hospitalizations, stratified by substance use. Results: Of 20,049 MRSA BSIs from 2012 to 2017, 17,634 (88%) were CO. Overall, MRSA BSI incidence decreased 7%, from 178.5 to 166.2 per 100,000 hospitalizations during the study period; however, CO MRSA BSI rates remained stable (152.7 to 149.9 per 100,000 hospitalizations). Among CO MRSA BSIs, 1,838 (10%) were BSIs with substance-use diagnoses; the incidence of CO MRSA BSIs with substance use increased 236% (from 8.2 to 27.6 per 100,000 hospitalizations) and represented a greater proportion of the CO MRSA rate over the study period (Fig. 1). The incidence of CO MRSA BSIs without substance use decreased 15% (from 144.5 to 122.4 per 100,000 hospitalizations). Patients with CO MRSA BSIs and substance-use diagnoses were younger (median, 40 vs 65 years), more likely to be female (50% vs 40%) and white (79% vs 69%), and more likely to leave against medical advice (15% vs 1%). Among patients not leaving against medical advice, CO BSI patients with substance-use diagnoses had longer lengths of stay (median, 11 vs 9 days), lower in-hospital mortality (9% vs 14%), and higher hospitalization costs (median, $22,912 vs $17,468) compared to patients without substance-use diagnoses. Conclusions: Although the overall CO MRSA BSI rate remained unchanged from 2012 to 2017, infections with substance-use diagnoses increased >3-fold, and infections without substance-use diagnoses decreased. These data suggest that the emergence of MRSA associated with substance-use diagnoses threatens potential progress in reducing the incidence of CO MRSA infections. Additional strategies may be needed to prevent MRSA BSI in patients with substance-use diagnoses and to maintain national progress in the reduction of MRSA infections overall.
Background: Microbiology data are utilized to quantify epidemiology and trends in pathogens, antimicrobial resistance, and bloodstream infections. Understanding variability and trends in rates of hospital-level blood culture utilization may be important for interpreting these findings. Methods: We used clinical microbiology results and discharge data to identify monthly blood culture rates from US hospitals participating in the Premier Healthcare Database during 2012–2017. We included all discharges from months in which a hospital reported at least 1 blood culture with microbiology and antimicrobial susceptibility results. Blood cultures drawn on or before day 3 were defined as admission cultures (ACs); blood cultures collected after day 3 were defined as postadmission cultures (PACs). The AC rate was defined as the proportion of all hospitalizations with an AC. The PAC rate was defined as the number of days with a PAC among all patient days. Generalized estimating equation regression models that accounted for hospital-level clustering with an exchangeable correlation matrix were used to measure associations of monthly rates with hospital bed size, teaching status, urban–rural designation, region, month, and year. The AC rates were modeled using logistic regression, and the PAC rates were modeled using a Poisson distribution. Results: We included 11.7 million hospitalizations from 259 hospitals, accounting for nearly 52 million patient days. The median annual hospital-level AC rate was 27.1%, with interhospital variation ranging from 21.1% (quartile 1) to 35.2% (quartile 3) (Fig. 1). Multivariable models revealed no significant trend over time (P = .74) but did show statistically significant associations of AC rates with month (P < .001) and region (P = .003); associations with teaching status (P = .063) and urban–rural designation (P = .083) approached statistical significance. There was no association with bed size (P = .38). The median annual hospital-level PAC rate was 11.1 per 1,000 patient days, and interhospital variability ranged from 7.6 (quartile 1) to 15.2 (quartile 3) (Fig. 2). Multivariable models of PAC rates showed no significant trends over time (P = .12). We found associations of PAC rates with month (P = .016), bed size (P = .030), and teaching status (P = .040). PAC rates were not associated with urban–rural designation (P = .52) or region (P = .29). Conclusions: Blood culture utilization rates in this large cohort of hospitals were unchanged between 2012 and 2017, though substantial interhospital variability was detected. Although both AC and PAC rates vary by time of year and potentially by teaching status, AC rates vary by geographic characteristics whereas PAC rates vary by bed size. These factors are important to consider when comparing rates of bloodstream infections by hospital.
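To make the two utilization measures concrete, a toy sketch of how AC and PAC rates could be computed from discharge-level data (the data frame and column names here are invented for illustration and do not reflect the Premier Healthcare Database schema):

```python
# Toy sketch of computing admission-culture (AC) and postadmission-culture (PAC)
# rates from discharge-level data. Data and column names are invented.
import pandas as pd

discharges = pd.DataFrame({
    "hospital_id": ["A", "A", "A", "B", "B"],
    "los_days": [4, 10, 3, 7, 12],
    # hospital days on which blood cultures were drawn for each hospitalization
    "culture_days": [[1], [2, 6, 8], [], [5], [1, 4]],
})

# AC: any culture drawn on or before day 3; PAC days: distinct days with a culture after day 3.
discharges["ac"] = discharges["culture_days"].apply(lambda d: any(x <= 3 for x in d))
discharges["pac_days"] = discharges["culture_days"].apply(lambda d: len({x for x in d if x > 3}))

summary = discharges.groupby("hospital_id").agg(
    ac_rate=("ac", "mean"),            # proportion of hospitalizations with an AC
    pac_days=("pac_days", "sum"),
    patient_days=("los_days", "sum"),
)
# PAC rate: days with a postadmission culture per 1,000 patient days
summary["pac_rate_per_1000_pd"] = 1000 * summary["pac_days"] / summary["patient_days"]
print(summary)
```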
Hen Harrier Circus cyaneus and Short-eared Owl Asio flammeus are open-country birds of prey with overlapping distributions. Although both species face similar conservation threats across their ranges, work to date has largely been undertaken at a national scale with few attempts to collate and assess factors relevant to their conservation at an international scale. Here we use an expert knowledge approach to evaluate the impact of conservation threats and the effectiveness of conservation strategies for each species across Europe. We report results of responses to a questionnaire from 23 Hen Harrier experts from nine countries and 12 Short-eared Owl experts from six countries. The majority of responses for both species reported declines in breeding numbers. The perceived impact of threats was broadly similar for both species: ecological factors (predation, extreme weather and prey availability), changes in land use (habitat loss and agricultural intensification) and indirect persecution (accidental nest destruction) were considered to be the greatest threats to breeding Hen Harrier and Short-eared Owl. Short-eared Owl experts also highlighted lack of knowledge and difficulties associated with monitoring as a major conservation challenge. Despite broad-scale similarities, geographical variation was also apparent in the perceived importance of conservation threats, with some threats (such as direct persecution, large-scale afforestation or habitat degradation) requiring country-specific actions. Implementation of different conservation strategies also varied between countries, with the designation of protected areas reported as the most widespread conservation strategy adopted, followed by species and habitat management. However, protected areas (including species-specific protected areas) were perceived to be less effective than active management of species and habitats. These findings highlight the overlap between the conservation requirements of these two species, and the need for collaborative international research and conservation approaches that prioritise pro-active conservation strategies subject to continued assessment and with specific conservation goals.