Understanding spatial variation in origination and extinction can help to unravel the mechanisms underlying macroevolutionary patterns. Although methods have been developed for estimating global origination and extinction rates from the fossil record, no framework exists for applying these methods to restricted spatial regions. Here, we test the efficacy of three metrics for regional analysis, using simulated fossil occurrences. These metrics are then applied to the marine invertebrate record of the Permian and Triassic to examine variation in extinction and origination rates across latitudes. Extinction and origination rates were generally uniform across latitudes for these time intervals, including during the Capitanian and Permian–Triassic mass extinctions. The small magnitude of this variation, combined with the possibility of its attribution to sampling bias, cautions against linking any observed differences to contrasting evolutionary dynamics. Our results indicate that origination and extinction levels were more variable across clades than across latitudes.
The 2020 presidential election brought expanded vote-by-mail opportunities, a rise in attacks on this process’s integrity, and the implementation of novel programs such as California’s Where’s My Ballot? system to ensure confidence in mail balloting. Can heightening awareness of this ballot-tracking system and other election protections alleviate fraud concerns and raise turnout? We assess whether messages reinforcing election integrity increased participation in the 2020 election through a large-scale voter mobilization field experiment. California registrants were mailed a letter that described either existing safeguards to prevent vote-by-mail fraud or the ability to track one’s ballot and ensure that it was counted. Analysis of state voter records reveals that neither message increased turnout over a simple election reminder or even no contact, even among subgroups where larger effects might be expected. In the context of a high-profile, high-turnout presidential election, assurances about ballot and electoral integrity did not increase turnout.
This article argues that marriage is a divine institution that pre-dates the state, and marriages are supernaturally effected by God consequent on the exchange of marital consent by the parties, whether or not the state recognises them as marriages. In fact, taking note of, and legislating about, marriage thus properly conceived is not within the state's remit. Despite this, the law in England and Wales is involved with the institution of marriage in three main ways: (1) it purports to define marriage, and its entry and exit conditions; (2) it passes laws affording or denying certain legal benefits or penalties on the basis of marital status; and (3) it registers marriages, and in practice imposes or denies the benefits or penalties just mentioned on the basis of registration of marriage, or lack of it. The supernatural action on God's part of creating marriages is not a fit subject for such involvement on the state's part. The underlying exchange of marital consent by the parties is, by contrast, within the state's sphere of competence, but it is argued that the state should be tracking a broader category of relationships than just those involving the exchange of marital consent. It is suggested that all marriage law should be repealed, and replaced by an Australian-style law of de facto relationships. If the law deals with de facto relationships there is no need for it to be involved with the institution of marriage as well, and that institution can be left to flourish outside the state's grasp. The article goes on to respond to some possible objections.
Patients presenting to hospital with suspected coronavirus disease 2019 (COVID-19), based on clinical symptoms, are routinely placed in a cohort together until polymerase chain reaction (PCR) test results are available. This procedure leads to delays in transfers to definitive areas and high nosocomial transmission rates. FebriDx is a finger-prick point-of-care test (PoCT) that detects an antiviral host response and has a high negative predictive value for COVID-19. We sought to determine the clinical impact of using FebriDx for COVID-19 triage in the emergency department (ED).
We undertook a retrospective observational study evaluating the real-world clinical impact of FebriDx as part of an ED COVID-19 triage algorithm.
Emergency department of a university teaching hospital.
Patients presenting with symptoms suggestive of COVID-19, placed in a cohort in a ‘high-risk’ area, were tested using FebriDx. Patients without a detectable antiviral host response were then moved to a lower-risk area.
Between September 22, 2020, and January 7, 2021, 1,321 patients were tested using FebriDx, and 1,104 (84%) did not have a detectable antiviral host response. Among 1,104 patients, 865 (78%) were moved to a lower-risk area within the ED. The median times spent in a high-risk area were 52 minutes (interquartile range [IQR], 34–92) for FebriDx-negative patients and 203 minutes (IQR, 142–255) for FebriDx-positive patients (difference of −134 minutes; 95% CI, −144 to −122; P < .0001). The negative predictive value of FebriDx for the identification of COVID-19 was 96% (661 of 690; 95% CI, 94%–97%).
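The reported negative predictive value and its confidence interval can be reproduced from the counts given above. A minimal sketch, assuming a Wilson score interval (the abstract does not state which interval method the authors used):

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# 661 of 690 FebriDx-negative patients were also SARS-CoV-2 PCR negative.
npv = 661 / 690
lo, hi = wilson_ci(661, 690)
print(f"NPV = {npv:.0%}, 95% CI {lo:.0%}-{hi:.0%}")  # NPV = 96%, 95% CI 94%-97%
```

The Wilson interval recovers the 94%–97% range quoted in the abstract.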
FebriDx improved the triage of patients with suspected COVID-19 and reduced the time that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR-negative patients spent in a high-risk area alongside SARS-CoV-2–positive patients.
In my 2002 piece ‘The Meaning of Life’ I argued that Life, meaning the sum of the lives of all living things, had a meaning if and only if it had been purposefully brought about by a designer or creator. Michael Hauskeller has recently criticized this argument, responding that this sense of ‘meaning’ is not the one in view when we are discussing ‘the meaning of life’. In this piece I respond to Hauskeller's argument, and, while I stand by my 2002 argument in terms of one meaning of ‘meaning’, I admit that it does not apply to the different question of what makes a life meaningful. I assert that glorifying God is the activity that contributes the most meaningfulness to a life, though I deny that this is the only activity that can contribute meaningfulness to a life. This makes me, in terms due to Thaddeus Metz, a moderate supernaturalist rather than an extreme supernaturalist. Despite this distinction, Metz has argued in this volume that moderate supernaturalism is vulnerable to the same objection as in his view defeats extreme supernaturalism, and I close by responding to this argument.
Our understanding of major depression is complicated by substantial heterogeneity in disease presentation, which can be disentangled by data-driven analyses of depressive symptom dimensions. We aimed to determine the clinical portrait of such symptom dimensions among individuals in the community.
This cross-sectional study consisted of 25 261 self-reported White UK Biobank participants with major depression. Nine questions from the UK Biobank Mental Health Questionnaire encompassing depressive symptoms were decomposed into underlying factors or ‘symptom dimensions’ via factor analysis, which were then tested for association with psychiatric diagnoses and polygenic risk scores for major depressive disorder (MDD), bipolar disorder and schizophrenia. Replication was performed among 655 self-reported non-White participants, across sexes, and among 7190 individuals with an ICD-10 code for MDD from linked inpatient or primary care records.
Four broad symptom dimensions were identified, encompassing negative cognition, functional impairment, insomnia and atypical symptoms. These dimensions replicated across ancestries, sexes and individuals with inpatient or primary care MDD diagnoses, and were also consistent among 43 090 self-reported White participants with undiagnosed self-reported depression. Every dimension was associated with increased risk of nearly every psychiatric diagnosis and polygenic risk score. However, while certain psychiatric diagnoses were disproportionately associated with specific symptom dimensions, the three polygenic risk scores did not show the same specificity of associations.
An analysis of questionnaire data from a large community-based cohort reveals four replicable symptom dimensions of depression with distinct clinical, but not genetic, correlates.
To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption.
A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting.
A 399-bed, tertiary-care, Veterans’ Affairs medical center.
Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP).
Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018.
Following the implementation of cascade reporting, mean monthly meropenem (P =.005) and piperacillin/tazobactam (P = .002) consumption decreased and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not significantly change.
Ciprofloxacin consumption significantly decreased after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.
Background: Updated IDSA-SHEA guidelines recommend different diagnostic approaches to C. difficile depending on whether there are pre-agreed institutional criteria for patient stool submission. If stool submission criteria are in place, nucleic acid amplification testing (NAAT) alone may be used. If not, a multistep algorithm is suggested, incorporating various combinations of toxin enzyme immunoassay (EIA), glutamate dehydrogenase (GDH), and NAAT, with discordant results adjudicated by NAAT. At our institution, we developed a multistep algorithm leading with NAAT with reflex to EIA for toxin testing if NAAT is positive. This algorithm resulted in a significant proportion of patients with discordant results (NAAT positive and toxin EIA negative) that some experts have categorized as possible carriers or C. difficile colonized. In this study, we describe the impact of a multistep algorithm on hospital-onset, community-onset, and healthcare-facility–associated C. difficile infection (HO-CDI, CO-CDI, and HCFA-CDI, respectively) rates and the management of possible carriers. Methods: The study setting was a 399-bed, tertiary-care VA Medical Center in Richmond, Virginia. A retrospective chart review was conducted. The multistep C. difficile testing algorithm was implemented June 4, 2019 (Fig. 1). C. difficile testing results and possible carriers were reviewed for the 5 months before and 4 months after implementation (January 2019 to September 2019). Results: In total, 587 NAATs were performed in the inpatient and outpatient setting (mean, 58.7 per month). Overall, 123 NAATs (21%) were positive: 59 in the preintervention period and 63 in the postintervention period. In the postintervention period, 23 positive NAATs (26%) had a positive toxin EIA. Based on LabID events, the mean rate of HO+CO+HCFA CDI cases per 10,000 bed days of care (BDOC) decreased significantly from 9.49 in the preintervention period to 1.15 in the postintervention period (P = .019) (Fig. 2).
Also, 9 of the possible carriers (22%) were treated for CDI based on high clinical suspicion, and 6 of the possible carriers (14%) had a previous history of CDI. Of these, 5 (83%) were treated for CDI. In addition, 1 patient (2%) converted from possible carrier to positive toxin EIA within 14 days. The infectious diseases team was consulted for 11 possible carriers (27%). Conclusions: Implementation of a 2-step C. difficile algorithm leading with NAAT was associated with a lower rate of HO+CO+HCFA CDI per 10,000 BDOC. A considerable proportion (22%) of possible carriers were treated for CDI but did not count as LabID events. Only 2% of the possible carriers in our study converted to a positive toxin EIA.
While medical nutrition therapy is an essential part of the care for critically ill patients, uncertainty exists about the right form, dosage, timing and route in relation to the phases of critical illness. As enteral nutrition (EN) is often withheld or interrupted during the intensive care unit (ICU) stay, combined EN and parenteral nutrition (PN) may represent an effective and safe option to achieve energy and protein goals as recommended by international guidelines. We hypothesise that critically ill patients at high nutritional risk may benefit from such a combined approach during their ICU stay. Therefore, we aim to test if an early combination of EN and high-protein PN (EN+PN) is effective in reaching energy and protein goals in patients at high nutritional risk, while avoiding overfeeding. This approach will be tested in the here-presented EFFORTcombo trial. Nutritionally high-risk ICU patients will be randomised to either high-protein (≥2·2 g/kg per d) or low-protein (≤1·2 g/kg per d) nutrition. In the high-protein group, the patients will receive EN+PN; in the low-protein group, patients will be given EN alone. EN will be started in accordance with international guidelines in both groups. Efforts will be made to reach nutrition goals within 48–96 h. The efficacy of the proposed nutritional strategy will be tested as an innovative approach by functional outcomes at ICU and hospital discharge, as well as at a 6-month follow-up.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
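The thresholds above imply a simple per-field cost trade-off: manual abstraction costs coordinator time, while registry linkage replaces that time with a fixed cost. A rough sketch of that intuition (the hourly rate and this two-term model are illustrative assumptions, not the authors' Markov model):

```python
def manual_abstraction_cost(n_elements, seconds_per_field, hourly_rate=30.0):
    """Coordinator cost of manually abstracting every data field (hypothetical model)."""
    return n_elements * seconds_per_field / 3600.0 * hourly_rate

def registry_saving(n_elements, seconds_per_field, linkage_cost, hourly_rate=30.0):
    """Net saving from pulling fields out of a registry instead of abstracting them."""
    return manual_abstraction_cost(n_elements, seconds_per_field, hourly_rate) - linkage_cost

# At the reported thresholds (3,768 fields at 3.4 s/field), manual abstraction
# alone corresponds to roughly 3.6 coordinator-hours of effort.
hours = 3768 * 3.4 / 3600
print(f"{hours:.1f} coordinator-hours")
```

Even this toy model shows why savings scale with patient count and fields per patient: both multiply the abstraction term while the linkage term stays comparatively flat.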
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential costs savings.
Major depressive disorder and neuroticism (Neu) share a large genetic basis. We sought to determine whether this shared basis could be decomposed to identify genetic factors that are specific to depression.
We analysed summary statistics from genome-wide association studies (GWAS) of depression (from the Psychiatric Genomics Consortium, 23andMe and UK Biobank) and compared them with GWAS of Neu (from UK Biobank). First, we used a pairwise GWAS analysis to classify variants as associated with only depression, with only Neu or with both. Second, we estimated partial genetic correlations to test whether depression's genetic link with other phenotypes was explained by shared overlap with Neu.
We found evidence that most genomic regions (25/37) associated with depression are likely to be shared with Neu. The overlapping common genetic variance of depression and Neu was genetically correlated primarily with psychiatric disorders. We found that the genetic contributions to depression that were not shared with Neu were positively correlated with metabolic phenotypes and cardiovascular disease, and negatively correlated with the personality trait conscientiousness. After removing shared genetic overlap with Neu, depression still had a specific association with schizophrenia, bipolar disorder, coronary artery disease and age of first birth. Independent of depression, Neu had specific genetic correlates in ulcerative colitis, pubertal growth, anorexia and education.
Our findings demonstrate that, while genetic risk factors for depression are largely shared with Neu, there are also non-Neu-related features of depression that may be useful for further patient or phenotypic stratification.
The pore structure of vapour-deposited amorphous solid water (ASW) is poorly understood, despite its importance to fundamental processes such as grain chemistry, cooling of star-forming regions, and planet formation. We studied structural changes of vapour-deposited D2O on intra-molecular to 30 nm length scales at temperatures ranging from 18 to 180 K and observed enhanced mobility from 100 to 150 K. An Arrhenius-type model describes the loss of surface area and porosity with a common set of kinetic parameters. The low activation energy (428 K) is commensurate with van der Waals forces between nm-scale substructures in the ice. Our findings imply that water porosity will always change with time, even at low temperatures.
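An activation energy quoted in kelvin is E_a/k_B, so the Arrhenius factor reduces to exp(−428 K / T). A minimal sketch of the temperature dependence this implies (the unknown prefactor cancels in the ratio; the two temperatures are taken from the mobility window reported above):

```python
from math import exp

E_A_OVER_KB = 428.0  # activation energy expressed as a temperature (K), from the abstract

def relative_rate(T, T_ref=100.0):
    """Arrhenius rate at temperature T relative to the rate at T_ref.

    k(T) = A * exp(-E_a / (k_B * T)); the prefactor A cancels in the ratio.
    """
    return exp(-E_A_OVER_KB / T) / exp(-E_A_OVER_KB / T_ref)

speedup = relative_rate(150.0)
print(f"{speedup:.1f}x faster at 150 K than at 100 K")  # ~4.2x
```

The shallow barrier means the rate ratio across the 100–150 K window is only a factor of a few, consistent with slow but unavoidable pore evolution even at low temperatures.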
To examine the feasibility of using social media to assess the consumer nutrition environment by comparing sentiment expressed in Yelp reviews with information obtained from a direct observation audit instrument for grocery stores.
Trained raters used the Nutrition Environment Measures Survey in Stores (NEMS-S) in 100 grocery stores from July 2015 to March 2016. Yelp reviews were available for sixty-nine of these stores and were retrieved in February 2017 using the Yelp Application Program Interface. A sentiment analysis was conducted to quantify the perceptions of the consumer nutrition environment in the review text. Pearson correlation coefficients (ρ) were used to compare NEMS-S scores with Yelp review text on food availability, quality, price and shopping experience.
Detroit, Michigan, USA.
Yelp reviews contained more comments about food availability and the overall shopping experience than food price and food quality. Negative sentiment about food prices in Yelp review text and the number of dollar signs on Yelp were positively correlated with observed food prices in stores (ρ=0·413 and 0·462, respectively). Stores with greater food availability were rated as more expensive on Yelp. Other aspects of the food store environment (e.g. overall quality and shopping experience) were captured only in Yelp.
While Yelp cannot replace in-person audits for collecting detailed information on the availability, quality and cost of specific food items, Yelp holds promise as a cost-effective means to gather information on the overall cost, quality and experience of food stores, which may be relevant for nutrition outcomes.
The ventricular assist device is being increasingly used as a “bridge-to-transplant” option in children with heart failure who have failed medical management. Care for this medically complex population must be optimised, including through concomitant pharmacotherapy. Pharmacokinetic/pharmacodynamic alterations affecting pharmacotherapy are increasingly discovered in children supported with extracorporeal membrane oxygenation, another form of mechanical circulatory support. Similarities between extracorporeal membrane oxygenation and ventricular assist devices support the hypothesis that similar alterations may exist in ventricular assist device-supported patients. We conducted a literature review to assess the current data available on pharmacokinetics/pharmacodynamics in children with ventricular assist devices. We found two adult and no paediatric pharmacokinetic/pharmacodynamic studies in ventricular assist device-supported patients. While mechanisms may be partially extrapolated from children supported with extracorporeal membrane oxygenation, dedicated investigation of the paediatric ventricular assist device population is crucial given the inherent differences between the two forms of mechanical circulatory support, and pathophysiology that is unique to these patients. Commonly used drugs such as anticoagulants and antibiotics have narrow therapeutic windows with devastating consequences if under-dosed or over-dosed. Clinical studies are urgently needed to improve outcomes and maximise the potential of ventricular assist devices in this vulnerable population.
The target article explores the role of food insecurity as a contemporary risk factor for human overweight and obesity. The authors provide conditional support for the insurance hypothesis among adult women from high-income countries. We consider the potential contribution of additional factors in producing variation in adiposity patterns between species and across human contexts.
We investigated whether a higher number of fast-food outlets in an individual’s home neighbourhood is associated with increased prevalence of type 2 diabetes mellitus and related risk factors, including obesity.
Three UK-based diabetes screening studies (one general population, two high-risk populations) conducted between 2004 and 2011. The primary outcome was screen-detected type 2 diabetes. Secondary outcomes were risk factors for type 2 diabetes.
In total 10 461 participants (mean age 59 years; 53 % male; 21 % non-White ethnicity).
There was a higher number of neighbourhood (500 m radius from home postcode) fast-food outlets among non-White ethnic groups (P<0·001) and in socially deprived areas (P<0·001). After adjustment (social deprivation, urban/rural, ethnicity, age, sex), a greater number of fast-food outlets was associated with significantly increased odds for diabetes (OR=1·02; 95 % CI 1·00, 1·04) and obesity (OR=1·02; 95 % CI 1·00, 1·03). This suggests that for every additional two outlets per neighbourhood, we would expect one additional diabetes case, assuming a causal relationship between the fast-food outlets and diabetes.
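Because the outcome is modelled on the odds scale, translating an OR of 1·02 per outlet into extra cases requires an assumed baseline prevalence. A sketch of that conversion (the 5% baseline is an assumption for illustration, not a figure from the study):

```python
def shifted_prevalence(p0, odds_ratio, delta):
    """Prevalence after applying an odds ratio `delta` times to baseline prevalence p0."""
    odds = p0 / (1 - p0) * odds_ratio ** delta
    return odds / (1 + odds)

# Two extra fast-food outlets at OR = 1.02 per outlet, against an assumed
# 5% baseline screen-detected diabetes prevalence.
p1 = shifted_prevalence(0.05, 1.02, 2)
print(f"prevalence shifts from 5.00% to {p1:.2%}")
```

At low prevalence the odds ratio approximates the risk ratio, so two extra outlets move an assumed 5% baseline to roughly 5.2%, a small per-neighbourhood shift that nonetheless accumulates across a large population.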
These results suggest that increased exposure to fast-food outlets is associated with increased risk of type 2 diabetes and obesity, which has implications for diabetes prevention at a public health level and for those granting planning permission to new fast-food outlets.