Whole-genome sequencing (WGS) has traditionally been used in infection prevention to confirm or refute the presence of an outbreak after it has occurred. As the cost of WGS has decreased, an increasing number of institutions have been utilizing WGS-based surveillance, and machine learning and statistical modeling have also been used to supplement infection prevention practice. We systematically reviewed the use of WGS surveillance and machine learning to detect and investigate outbreaks in healthcare settings.
Methods:
We performed a PubMed search using separate terms for WGS surveillance and/or machine-learning technologies for infection prevention through March 15, 2021.
Results:
Of 767 studies returned using the WGS search terms, 42 articles were included for review. Only 2 studies (4.8%) were performed in real time, and 39 (92.9%) studied only 1 pathogen. Nearly all studies (n = 41, 97.6%) found genetic relatedness between some isolates collected. Across all studies, 525 outbreaks were detected among 2,837 related isolates (average, 5.4 isolates per outbreak). Also, 35 studies (83.3%) only utilized geotemporal clustering to identify outbreak transmission routes. Of 21 studies identified using the machine-learning search terms, 4 were included for review. In each study, machine learning aided outbreak investigations by complementing methods to gather epidemiologic data and automating identification of transmission pathways.
Conclusions:
WGS surveillance is an emerging method that can enhance outbreak detection. Machine learning has the potential to identify novel routes of pathogen transmission. Broader incorporation of WGS surveillance into infection prevention practice has the potential to transform the detection and control of healthcare outbreaks.
Relapse and recurrence of depression are common, contributing to the overall burden of depression globally. Accurate prediction of relapse or recurrence while patients are well would allow the identification of high-risk individuals and may effectively guide the allocation of interventions to prevent relapse and recurrence.
Aims
To review prognostic models developed to predict the risk of relapse, recurrence, sustained remission, or recovery in adults with remitted major depressive disorder.
Method
We searched the Cochrane Library (current issue); Ovid MEDLINE (1946 onwards); Ovid Embase (1980 onwards); Ovid PsycINFO (1806 onwards); and Web of Science (1900 onwards) up to May 2021. We included development and external validation studies of multivariable prognostic models. We assessed risk of bias of included studies using the Prediction model risk of bias assessment tool (PROBAST).
Results
We identified 12 eligible prognostic model studies (11 unique prognostic models): 8 model development-only studies, 3 model development and external validation studies, and 1 external validation-only study. Multiple estimates of performance measures were not available, so meta-analysis was not possible. Eleven of the 12 included studies were assessed as being at high overall risk of bias, and none examined clinical utility.
Conclusions
Due to high risk of bias of the included studies, poor predictive performance and limited external validation of the models identified, presently available clinical prediction models for relapse and recurrence of depression are not yet sufficiently developed for deploying in clinical settings. There is a need for improved prognosis research in this clinical area and future studies should conform to best practice methodological and reporting guidelines.
Metabolites produced by microbial fermentation in the human intestine, especially short-chain fatty acids (SCFAs), are known to play important roles in colonic and systemic health. Our aim here was to advance our understanding of how and why their concentrations and proportions vary between individuals. We have analysed faecal concentrations of microbial fermentation acids from 10 human volunteer studies, involving 163 subjects, conducted at the Rowett Institute, Aberdeen, UK over a 7-year period. In baseline samples, the % butyrate was significantly higher, whilst % iso-butyrate and % iso-valerate were significantly lower, with increasing total SCFA concentration. The decreasing proportions of iso-butyrate and iso-valerate, derived from amino acid fermentation, suggest that fibre intake was mainly responsible for increased SCFA concentrations. We propose that the increase in % butyrate among faecal SCFA is largely driven by a decrease in colonic pH resulting from higher SCFA concentrations. Consistent with this, both total SCFA and % butyrate increased significantly with decreasing pH across five studies for which faecal pH measurements were available. Colonic pH influences butyrate production through altering the stoichiometry of butyrate formation by butyrate-producing species, resulting in increased acetate uptake and butyrate formation, and facilitating increased relative abundance of butyrate-producing species (notably Roseburia and Eubacterium rectale).
The critical period for weed control (CPWC) adds value to integrated weed management by identifying the period during which weeds need to be controlled to avoid yield losses exceeding a defined threshold. However, the traditional application of the CPWC does not identify the timing of control needed for weeds that emerge late in the critical period. In this study, CPWC models were developed from field data in high-yielding cotton crops during three summer seasons from 2005 to 2008, using the mimic weed, common sunflower, at densities of two to 20 plants per square meter. Common sunflower plants were introduced at up to 450 growing degree days (GDD) after crop planting and removed at successive 200 GDD intervals after introduction. The CPWC models were described using extended Gompertz and logistic functions that included weed density, time of weed introduction, and time of weed removal (logistic function only) in the relationships. The resulting models defined the CPWC for late-emerging weeds, identifying a period after weed emergence before weed control was required to prevent yield loss exceeding the yield-loss threshold. When weeds emerged in sufficient numbers toward the end of the critical period, the model predicted that crop yield loss resulting from competition by these weeds would not exceed the yield-loss threshold until well after the end of the CPWC. These findings support the traditional practice of ensuring weeds are controlled before crop canopy closure, with later weed control inputs used as required.
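The shape of a CPWC model of this kind can be sketched in a few lines: relative yield declines with later weed removal following a Gompertz-type curve, with weed density entering as a covariate. The coefficients below are illustrative placeholders, not the fitted values from the study.

```python
import math

def relative_yield(t_removal, density, a=100.0, b=0.02, k=0.004):
    """Relative cotton yield (%) as a function of the time of weed removal
    (t_removal, in growing degree days after crop planting) and weed
    density (plants per square meter).

    A minimal extended-Gompertz sketch in the spirit of the CPWC models
    described above; coefficients a, b, and k are illustrative only.
    """
    return a * math.exp(-b * density * math.exp(k * t_removal))

def critical_removal_time(density, threshold=0.99, b=0.02, k=0.004):
    """Latest removal time (GDD) at which relative yield still meets the
    yield-loss threshold (e.g., threshold=0.99 for a 1% acceptable loss).
    Obtained by inverting the Gompertz expression above.
    """
    return math.log(-math.log(threshold) / (b * density)) / k
```

A negative critical removal time indicates that, under these illustrative coefficients, even weeds present from planting exceed the yield-loss threshold, i.e., the critical period spans the whole season, as was found for mungbean in a related study below.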
Recent X-ray observations by Jiang et al. have identified an active galactic nucleus (AGN) in the bulgeless spiral galaxy NGC 3319, located just $14.3\pm 1.1$ Mpc away, and suggest the presence of an intermediate-mass black hole (IMBH; $10^2\leq M_\bullet/\textrm{M}_{\odot}\leq 10^5$) if the Eddington ratios are as high as 3 to $3\times10^{-3}$. In an effort to refine the black hole mass for this (currently) rare class of object, we have explored multiple black hole mass scaling relations, such as those involving the (not previously used) velocity dispersion, logarithmic spiral arm pitch angle, total galaxy stellar mass, nuclear star cluster mass, rotational velocity, and colour of NGC 3319, to obtain 10 mass estimates of differing accuracy. We have calculated a mass of $3.14_{-2.20}^{+7.02}\times10^4\,\textrm{M}_\odot$, with a confidence of 84% that it is $\leq 10^5\,\textrm{M}_\odot$, based on the combined probability density function from seven of these individual estimates. Our conservative approach excluded two black hole mass estimates (via the nuclear star cluster mass and via the fundamental plane of black hole activity, which only applies to black holes with low accretion rates) that were upper limits of ${\sim}10^5\,{\textrm M}_{\odot}$, and it did not use the $M_\bullet$–$L_{\textrm{2-10\,keV}}$ relation’s prediction of ${\sim}10^5\,{\textrm M}_{\odot}$.
. This target provides an exceptional opportunity to study an IMBH in AGN mode and advance our demographic knowledge of black holes. Furthermore, we introduce our novel method of meta-analysis as a beneficial technique for identifying new IMBH candidates by quantifying the probability that a galaxy possesses an IMBH.
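The meta-analysis step described above, combining several independent mass estimates into one probability density function and reading off the probability that the mass lies below $10^5\,\textrm{M}_\odot$, can be illustrated with a toy calculation. Each estimate is modelled as a Gaussian in $\log_{10}(M_\bullet/\textrm{M}_\odot)$ and the independent densities are multiplied and renormalized. The seven (mean, scatter) pairs below are hypothetical placeholders, not the paper's actual estimates.

```python
import math

def gaussian(x, mu, sigma):
    """Normal probability density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical log10-mass estimates (mean, intrinsic scatter) standing in
# for the seven scaling-relation estimates used in the study.
estimates = [(4.4, 0.5), (4.6, 0.6), (4.3, 0.7), (4.7, 0.5),
             (4.5, 0.8), (4.2, 0.6), (4.8, 0.7)]

# Evaluate the combined (product) PDF on a grid of log10(M/Msun).
grid = [2.0 + 0.001 * i for i in range(6001)]  # log10 M from 2 to 8
pdf = []
for x in grid:
    logp = sum(math.log(gaussian(x, mu, s)) for mu, s in estimates)
    pdf.append(math.exp(logp))
total = sum(pdf)
pdf = [p / total for p in pdf]

# Probability that M <= 1e5 Msun, i.e., log10 M <= 5.
p_below = sum(p for x, p in zip(grid, pdf) if x <= 5.0)
```

Because the product of Gaussians is itself Gaussian, the combined estimate is an inverse-variance-weighted mean with a tighter scatter than any single relation, which is what makes the combined confidence statement possible.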
The Subglacial Antarctic Lakes Scientific Access (SALSA) Project accessed Mercer Subglacial Lake using environmentally clean hot-water drilling to examine interactions among ice, water, sediment, rock, microbes and carbon reservoirs within the lake water column and underlying sediments. A ~0.4 m diameter borehole was melted through 1087 m of ice and maintained over ~10 days, allowing observation of ice properties and collection of water and sediment with various tools. Over this period, SALSA collected: 60 L of lake water and 10 L of deep borehole water; microbes >0.2 μm in diameter from in situ filtration of ~100 L of lake water; 10 multicores 0.32–0.49 m long; 1.0 and 1.76 m long gravity cores; three conductivity–temperature–depth profiles of borehole and lake water; five discrete depth current meter measurements in the lake; and images of ice, the lake water–ice interface and lake sediments. Temperature and conductivity data showed the hydrodynamic character of water mixing between the borehole and lake after entry. Models simulating melting of the ~6 m thick basal accreted ice layer imply that debris fall-out through the ~15 m water column to the lake sediments from borehole melting had little effect on the stratigraphy of surficial sediment cores.
The primary aim of this study was to assess the epidemiology of carbapenem-resistant Acinetobacter baumannii (CRAB) for 9 months following a regional outbreak with this organism. We also aimed to determine the differential positivity rate from different body sites and characterize the longitudinal changes of surveillance test results among CRAB patients.
Design:
Observational study.
Setting:
A 607-bed tertiary-care teaching hospital in Milwaukee, Wisconsin.
Patients:
Any patient admitted from postacute care facilities and any patient housed in the same inpatient unit as a positive CRAB patient.
Methods:
Participants underwent CRAB surveillance cultures from tracheostomy secretions, skin, and stool from December 5, 2018, to September 6, 2019. Cultures were performed using a validated, qualitative culture method, and final bacterial identification was performed using mass spectrometry.
Results:
In total, 682 patients were tested for CRAB, of whom 16 (2.3%) were positive. Of the 16 CRAB-positive patients, 14 (87.5%) were residents from postacute care facilities and 11 (68.8%) were African American. Among positive patients, the positivity rates by body site were 38% (6 of 16) for tracheal aspirations, 56% (9 of 16) for skin, and 82% (13 of 16) for stool.
Conclusions:
Residents from postacute care facilities were more frequently colonized by CRAB than patients admitted from home. Stool had the highest yield for identification of CRAB.
Throughout the Ediacaran Period, variable water-column redox conditions persisted along productive ocean margins due to a complex interplay between nutrient supply and oceanographic restriction. These changing conditions are considered to have influenced early faunal evolution, with marine anoxia potentially inhibiting the development of the ecological niches necessary for aerobic life forms. To understand this link between oxygenation and evolution, the combined geochemical and palaeontological study of marine sediments is preferable. Located in the Yangtze Gorges region of southern China, lagoonal black shales at Miaohe preserve alga and putative metazoans, including Eoandromeda, a candidate total-group ctenophore, thereby providing one example of where integrated study is possible. We present a multi-proxy investigation into water-column redox variability during deposition of these shales (c. 560–551 Ma). For this interval, reactive iron partitioning indicates persistent water-column anoxia, while trace metal enrichments and other geochemical data suggest temporal fluctuations between ferruginous, euxinic and rare suboxic conditions. Although trace metal and total organic carbon values imply extensive basin restriction, sustained trace metal enrichment and δ15Nsed data indicate periodic access to open-ocean inventories across a shallow-marine sill. Lastly, δ13Corg values of between −35‰ and −40‰ allow at least partial correlation of the shales at Miaohe with Member IV of the Doushantuo Formation. This study provides evidence for fluctuating redox conditions in the lagoonal area of the Yangtze platform during late Ediacaran time. If these low-oxygen environments were regionally characteristic, then the restriction of aerobic fauna to isolated environments can be inferred.
Clarifying the relationship between depression symptoms and cardiometabolic and related health could clarify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
Methods
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Results
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
Conclusions
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Errors inherent in self-reported measures of energy intake (EI) are substantial and well documented, but correlates of misreporting remain unclear. Therefore, potential predictors of misreporting were examined. In Study One, fifty-nine individuals (BMI = 26·1 (sd 3·8) kg/m2, age = 42·7 (sd 13·6) years, females = 29) completed a 14-d stay in a residential feeding behaviour suite where eating behaviour was continuously monitored. In Study Two, 182 individuals (BMI = 25·7 (sd 3·9) kg/m2, age = 42·4 (sd 12·2) years, females = 96) completed two consecutive days in a residential feeding suite and five consecutive days at home. Misreporting was directly quantified by comparing covertly measured laboratory weighed intakes (LWI) with self-reported EI (weighed dietary record (WDR), 24-h recall, 7-d diet history, FFQ). Personal (age, sex and %body fat) and psychological traits (personality, social desirability, body image, intelligence quotient and eating behaviour) were used as predictors of misreporting. In Study One, those with lower psychoticism (P = 0·009), openness to experience (P = 0·006) and higher agreeableness (P = 0·038) reduced EI on days participants knew EI was being measured to a greater extent than on covert days. Isolated associations existed between personality traits (psychoticism and openness to experience), eating behaviour (emotional eating) and differences between the LWI and self-reported EI, but these were inconsistent between dietary assessment techniques and typically became non-significant after accounting for multiplicity of comparisons. In Study Two, sex was associated with differences between LWI and the WDR (P = 0·009), 24-h recall (P = 0·002) and diet history (P = 0·050) in the laboratory, but not home environment. Personal and psychological correlates of misreporting identified displayed no clear pattern across studies or dietary assessment techniques and had little utility in predicting misreporting.
The association between Clostridioides difficile colonization and C. difficile infection (CDI) is unknown in solid-organ transplant (SOT) patients. We examined C. difficile colonization and healthcare-associated exposures as risk factors for development of CDI in SOT patients.
Methods:
The retrospective study cohort included all consecutive SOT patients with at least 1 screening test between May 2017 and April 2018. CDI was defined as the presence of diarrhea (without laxatives), a positive C. difficile clinical test, and the use of C. difficile-directed antimicrobial therapy as ordered by managing clinicians. In addition to demographic variables, exposures to antimicrobials, immunosuppressants, and gastric acid suppressants were evaluated from the time of first screening test to the time of CDI, death, or final discharge.
Results:
Of the 348 SOT patients included in our study, 33 (9.5%) were colonized with toxigenic C. difficile. In total, 11 patients (3.2%) developed CDI. Only C. difficile colonization (odds ratio [OR], 13.52; 95% CI, 3.46–52.83; P = .0002), age (OR, 1.09; CI, 1.02–1.17; P = .0135), and hospital days (OR, 1.05; 95% CI, 1.02–1.08; P = .0017) were independently associated with CDI.
Conclusions:
Although CDI was more frequent in C. difficile colonized SOT patients, the overall incidence of CDI was low in this cohort.
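The odds ratios reported above come from a multivariable model; for a single binary exposure, the standard Wald construction of an odds ratio and its 95% confidence interval from a 2×2 table is a useful reference point. The counts in the example are hypothetical, chosen only to be consistent in scale with the cohort described, and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
        a = exposed with event,    b = exposed without event
        c = unexposed with event,  d = unexposed without event
    SE of log(OR) is sqrt(1/a + 1/b + 1/c + 1/d).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: CDI among colonized vs non-colonized patients.
or_, lo, hi = odds_ratio_ci(5, 28, 6, 309)
```

Note that a Wald interval is only an approximation and becomes unreliable when any cell count is small; multivariable logistic regression, as used in the study, adjusts the estimate for covariates such as age and hospital days.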
Glyphosate-tolerant and glyphosate-resistant weeds are becoming increasingly problematic in cotton fields in Australia, necessitating a return from a glyphosate dominated system to a more integrated approach to weed management. The development of an integrated weed management system can be facilitated by identifying the critical period for weed control (CPWC), a model that enables cotton growers to optimize the timing of their weed control inputs. Using data from field studies conducted from 2003 to 2015, CPWC models using extended functions, including weed biomass in the relationships, were developed for the mimic weeds, common sunflower and Japanese millet, in high-yielding, fully irrigated cotton. A multispecies CPWC model was developed after combining these data sets with data for mungbean in irrigated cotton, using weed height and weed biomass as descriptors in the models. Comparison of observed and predicted relative cotton-lint yields from the multispecies CPWC model demonstrated that the model reasonably described the competition from these three very different mimic weeds, opening the possibility for cotton growers to use a multispecies CPWC model in their production systems.
Is support for democracy in the United States robust enough to deter undemocratic behavior by elected politicians? We develop a model of the public as a democratic check and evaluate it using two empirical strategies: an original, nationally representative candidate-choice experiment in which some politicians take positions that violate key democratic principles, and a natural experiment that occurred during Montana’s 2017 special election for the U.S. House. Our research design allows us to infer Americans’ willingness to trade off democratic principles against other valid but potentially conflicting considerations such as political ideology, partisan loyalty, and policy preferences. We find the U.S. public’s viability as a democratic check to be strikingly limited: only a small fraction of Americans prioritize democratic principles in their electoral choices, and their tendency to do so decreases with several measures of polarization, including the strength of partisanship, policy extremism, and candidate platform divergence. Our findings echo classic arguments about the importance of political moderation and cross-cutting cleavages for democratic stability and highlight the dangers that polarization represents for democracy.
Research using the critical period for weed control (CPWC) has shown that high-yielding cotton crops are very sensitive to competition from grasses and large broadleaf weeds, but the CPWC has not been defined for smaller broadleaf weeds in Australian cotton. Field studies were conducted over five seasons from 2003 to 2015 to determine the CPWC for smaller broadleaf weeds, using mungbean as a mimic weed. Mungbean was planted at densities of 1, 3, 6, 15, 30, and 60 plants m−2 with or after cotton emergence and added and removed at approximately 0, 150, 300, 450, 600, 750, and 900 degree days of crop growth (GDD). Mungbean competed strongly with cotton, with season-long interference; 60 mungbean plants m−2 resulted in an 84% reduction in cotton yield. A dynamic CPWC function was developed for densities of 1 to 60 mungbean plants m−2 using extended Gompertz and exponential curves including weed density as a covariate. Using a 1% yield-loss threshold, the CPWC defined by these curves extended for the full growing season of the crop at all weed densities. The minimum yield loss from a single weed control input was 35% at the highest weed density of 60 mungbean plants m−2. The relationship for the critical time of weed removal was further improved by substituting weed biomass for weed density in the relationship.
Opioid antagonists may mitigate medication-associated weight gain and/or metabolic dysregulation. ENLIGHTEN-2 evaluated a combination of olanzapine and the opioid antagonist samidorphan (OLZ/SAM) vs olanzapine for effects on weight gain and metabolic parameters over 24 weeks in adults with stable schizophrenia.
METHODS:
This phase 3, double-blind study (ClinicalTrials.gov: NCT02694328) enrolled adults aged 18–55 years with stable schizophrenia, randomized 1:1 to once-daily OLZ/SAM or olanzapine. Co-primary endpoints were percent change from baseline in body weight and proportion of patients with ≥10% weight gain at week 24. Waist circumference and fasting metabolic parameters were also measured. Completers could enter a 52-week open-label safety extension.
RESULTS:
561 patients were randomized: 550 were dosed, 538 had ≥1 post-baseline weight assessment, and 352 (64%) completed; 10.9% discontinued due to AEs. At week 24, least squares mean (SE) percent weight change from baseline was 4.21 (0.68)% with OLZ/SAM and 6.59 (0.67)% with olanzapine (difference, −2.38 [0.76]%; P=0.003). Fewer patients treated with OLZ/SAM (17.8%) had ≥10% weight gain vs olanzapine (29.8%; odds ratio=0.50; P=0.003). The change from baseline in waist circumference was significantly smaller with OLZ/SAM (P<0.001). Common AEs (≥10%) with OLZ/SAM and olanzapine were weight increased (24.8%, 36.2%), somnolence (21.2%, 18.1%), dry mouth (12.8%, 8.0%), and increased appetite (10.9%, 12.3%), respectively. Metabolic parameter changes were generally small and remained stable with long-term OLZ/SAM treatment.
DISCUSSION:
OLZ/SAM treatment limited weight gain associated with olanzapine. Metabolic parameter changes were generally small, similar between groups over 24 weeks, and remained stable over an additional 52 weeks of open-label OLZ/SAM treatment.
A combination of olanzapine and samidorphan (OLZ/SAM) is in development for schizophrenia to provide the efficacy of olanzapine while mitigating olanzapine-associated weight gain. The objective of this phase 1 exploratory study was to assess metabolic treatment effects of OLZ/SAM.
Methods:
Healthy, non-obese adults (18–40 years) were randomized 2:2:1 to once-daily OLZ/SAM, olanzapine, or placebo for 21 days. Assessments included oral glucose tolerance test (OGTT), hyperinsulinemic-euglycemic clamp, weight gain, and adverse event (AE) monitoring. Treatment effects were estimated with analysis of covariance.
Results:
Sixty subjects were randomized (OLZ/SAM, n=24; olanzapine, n=24; placebo, n=12); 19 (79.2%), 22 (91.7%), and 11 (91.7%), respectively, completed the study. In the OGTT, olanzapine led to significant hyperinsulinemia (P<0.0001) and significantly reduced insulin sensitivity (2-hour Matsuda index) at day 19 vs baseline (P=0.0012), changes not observed with OLZ/SAM. No significant between-group differences were observed for change from baseline in clamp-derived insulin sensitivity index at day 21. Least squares mean weight change from baseline was similar with OLZ/SAM (3.16 kg) and olanzapine (2.87 kg); both were significantly higher than placebo (0.57 kg; both P<0.01). Caloric intake significantly decreased from baseline to day 22 with OLZ/SAM (P=0.015) but not with olanzapine or placebo. Forty-nine subjects (81.7%) experienced ≥1 AE (OLZ/SAM, 87.5%; olanzapine, 79.2%; placebo, 75.0%).
Conclusions:
In this exploratory study, hyperinsulinemia and decreased insulin sensitivity were observed in the OGTT with olanzapine but not with OLZ/SAM or placebo. Clamp-derived insulin sensitivity index and weight changes were similar with OLZ/SAM and olanzapine in healthy subjects during the 3-week study.
From 2008, the UK’s National Diet and Nutrition Survey (NDNS) changed the method of dietary data collection from a 7-d weighed diary to a 4-d unweighed diary, partly to reduce participant burden. This study aimed to test whether self-reported energy intake changed significantly over the 4-d recording period of the NDNS rolling programme. Analyses used data from the NDNS years 1 (2008/2009) to 8 (2015/2016) inclusive, from participants aged 13 years and older. Dietary records from participants who reported unusual amounts of food and drink consumed on one or more days were excluded, leaving 6932 participants. Mean daily energy intake was 7107 kJ (1698 kcal), and there was a significant decrease of 164 kJ (39 kcal) between days 1 and 4 (P < 0·001). There was no significant interaction of sex or low-energy reporter status (estimated from the ratio of reported energy intake:BMR) with the change in reported energy intake. The decrease in reported energy intake on day 4 compared with day 1 was greater (P < 0·019) for adults with higher BMI (>30 kg/m2) than it was for leaner adults. Reported energy intake decreased over the 4-d recording period of the NDNS rolling programme suggesting that participants change their diet more, or report less completely, with successive days of recording their diet. The size of the effect was relatively minor, however.
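The low-energy reporter status mentioned above is estimated from the ratio of reported energy intake to basal metabolic rate. A minimal sketch of such a screen is below; the 1.1 cutoff is a commonly used lower bound for this kind of Goldberg-style check and is an assumption here, as the NDNS's exact cutoffs may differ.

```python
def low_energy_reporter(reported_ei_mj, bmr_mj, cutoff=1.1):
    """Flag a dietary record as a likely under-report when reported energy
    intake (MJ/day) falls below cutoff x estimated BMR (MJ/day).

    A Goldberg-style screen; cutoff=1.1 is an illustrative assumption,
    not necessarily the threshold used by the NDNS.
    """
    return reported_ei_mj / bmr_mj < cutoff

# Example: the mean reported intake above (7107 kJ = 7.107 MJ/day)
# against an assumed BMR of 6.5 MJ/day.
flag = low_energy_reporter(7.107, 6.5)
```

The idea behind the screen is physiological plausibility: habitual intake sustained below roughly BMR is incompatible with energy balance in a weight-stable person, so such records likely reflect incomplete reporting rather than true intake.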
Our multi-component photometric decomposition of the largest galaxy sample to date with dynamically-measured black hole masses nearly doubles the number of such galaxies. We have discovered substantially modified scaling relations between the black hole mass and the host galaxy properties, including the spheroid (bulge) stellar mass, the total galaxy stellar mass, and the central stellar velocity dispersion. These refinements partly arose because we were able to explore the scaling relations for various sub-populations of galaxies built by different physical processes, as traced by the presence of a disk, early-type versus late-type galaxies, or a Sérsic versus core-Sérsic spheroid light profile. The new relations appear fundamentally linked with the evolutionary paths followed by galaxies, and they have ramifications for simulations and formation theories involving both quenching and accretion.