Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
Objective:
To compare 2 methods of communicating polymerase chain reaction (PCR) blood-culture results: active approach utilizing on-call personnel versus passive approach utilizing notifications in the electronic health record (EHR).
Design:
Retrospective observational study.
Setting:
A tertiary-care academic medical center.
Patients:
Adult patients hospitalized with ≥1 positive blood culture containing a gram-positive organism identified by PCR between October 2014 and January 2018.
Methods:
The standard protocol for reporting PCR results at baseline included a laboratory technician calling the patient’s nurse, who would report the critical result to the medical provider. The active intervention group consisted of an on-call pager system utilizing trained pharmacy residents, whereas the passive intervention group combined standard protocol with real-time in-basket notifications to pharmacists in the EHR.
Results:
Of 209 patients, 105, 61, and 43 patients were in the control, active, and passive groups, respectively. Median time to optimal therapy was shorter in the active group compared to the passive group and control (23.4 hours vs 42.2 hours vs 45.9 hours, respectively; P = .028). De-escalation occurred 12 hours sooner in the active group. In the contaminant group, empiric antibiotics were discontinued faster in the active group (0 hours) than in the control group and the passive group (17.7 vs 7.2 hours; P = .007). Time to active therapy and days of therapy were similar.
Conclusions:
A passive, electronic method of reporting PCR results to pharmacists was not as effective in optimizing stewardship metrics as an active, real-time method utilizing pharmacy residents. Further studies are needed to determine the optimal method of communicating time-sensitive information.
Anorexia nervosa (AN) is a psychiatric disorder with complex etiology, with a significant portion of disease risk imparted by genetics. Traditional genome-wide association studies (GWAS) produce principal evidence for the association of genetic variants with disease. Transcriptomic imputation (TI) allows for the translation of those variants into regulatory mechanisms, which can then be used to assess the functional outcome of genetically regulated gene expression (GReX) in a broader setting through the use of phenome-wide association studies (pheWASs) in large and diverse clinical biobank populations with electronic health record phenotypes.
Methods
Here, we applied TI using S-PrediXcan to translate the most recent PGC-ED AN GWAS findings into AN-GReX. For significant genes, we imputed AN-GReX in the Mount Sinai BioMe™ Biobank and performed pheWASs on over 2000 outcomes to test the clinical consequences of aberrant expression of these genes. We performed a secondary analysis to assess the impact of body mass index (BMI) and sex on AN-GReX clinical associations.
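For readers unfamiliar with the pheWAS step, the sketch below illustrates the general idea (it is a minimal illustration, not the authors' pipeline): for each EHR-derived binary phenotype, the imputed GReX for a gene is tested as a predictor of case status in a covariate-adjusted logistic regression, with correction for the number of phenotypes tested. Column names such as "grex", "age" and "sex" are hypothetical placeholders.

```python
# Illustrative pheWAS sketch (not the authors' pipeline): for each binary
# EHR-derived phenotype, test whether imputed GReX for one gene predicts
# case status after adjusting for covariates, then correct for multiple tests.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

def phewas(df: pd.DataFrame, phenotypes: list, covariates: list) -> pd.DataFrame:
    records = []
    # Design matrix: imputed GReX plus covariates (e.g. age, sex) and intercept
    X = sm.add_constant(df[["grex"] + covariates])
    for pheno in phenotypes:
        y = df[pheno]                      # 1 = case, 0 = control
        fit = sm.Logit(y, X).fit(disp=0)   # logistic regression of phenotype on GReX
        records.append({"phenotype": pheno,
                        "beta": fit.params["grex"],
                        "p": fit.pvalues["grex"]})
    out = pd.DataFrame(records)
    # Bonferroni correction across all tested phenotypes
    out["p_adj"] = multipletests(out["p"], method="bonferroni")[1]
    return out
```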
Results
Our S-PrediXcan analysis identified 53 genes associated with AN, including what is, to our knowledge, the first genetic association of AN with the major histocompatibility complex. AN-GReX was associated with autoimmune, metabolic, and gastrointestinal diagnoses in our biobank cohort, as well as measures of cholesterol, medications, substance use, and pain. Additionally, our analyses showed moderation of AN-GReX associations with measures of cholesterol and substance use by BMI, and moderation of AN-GReX associations with celiac disease by sex.
Conclusions
Our BMI-stratified results provide potential avenues of functional mechanism for AN-genes to investigate further.
Abdominal ultrasonography is an extremely valuable diagnostic tool for all perioperative physicians. While the FAST exam was designed for use in patients with blunt abdominal trauma, its principles are applicable in a wide variety of perioperative settings and can be used to narrow the differential diagnosis in unstable patients. Aortic ultrasound is easy to perform and rapidly confirms or rules out the presence of abdominal aortic aneurysm or dissection. Other uses include gallbladder imaging and evaluation for free intra-peritoneal air. Perioperative and intensive care unit patients will benefit from point-of-care ultrasound, including detailed examination of the abdominal cavity.
Revolving Door Syndrome usually corresponds to what might be called the “hospital multiple readmissions phenomenon”. Beyond the consequences for patients and their families, frequent readmissions also heavily increase healthcare costs and cause burnout among medical and paramedical staff.
The objective of this study was to describe the characteristics of Tunisian patients with “revolving door” syndrome and to identify factors associated with short-term readmissions in a sample of Tunisian patients with schizophrenia.
The authors conducted a retrospective study of 50 patients with schizophrenia or schizo-affective disorders from November to February 2009 in Razi Hospital's “Psychiatry C” service. Patients included in the study had been hospitalized at least 4 times. The patients were analyzed for socio-demographic characteristics, total readmissions and the number of admissions in the last year. Their medication adherence was evaluated with the MARS (Medication Adherence Rating Scale) and their insight with the Q8 scale.
The sample was composed of 50 patients with schizophrenia or schizo-affective disorders according to DSM-IV, aged over 18 years, who had been hospitalized more than 4 times.
The sample was composed of 80% men. 74% of the sample was single and 66% were living with their parents. 88% were unemployed. 54% of patients showed poor medication adherence and 88% lacked insight.
The authors found that the typical Tunisian revolving door patient is a single, unemployed man, living with his parents, with poor medication adherence and a lack of insight.
For life insurers in the United Kingdom (UK), the risk margin is one of the most controversial aspects of the Solvency II regime which came into force in 2016.
The risk margin is the difference between the technical provisions and the best estimate liabilities. The technical provisions are intended to be market-consistent, and so are defined as the amount required to be paid to transfer the business to another undertaking. In practice, the technical provisions cannot be directly calculated, and so the risk margin must be determined using a proxy method; the method chosen for Solvency II is known as the cost-of-capital method.
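As background for the criticisms that follow, the cost-of-capital method can be summarised, in broad terms, as projecting the Solvency Capital Requirement (SCR) in respect of non-hedgeable risks over the run-off of the business, charging the prescribed cost-of-capital rate (CoC, 6% per annum under Solvency II) on each projected amount, and discounting at risk-free rates:

\[
\text{Risk Margin} = \text{CoC} \times \sum_{t \ge 0} \frac{\mathrm{SCR}(t)}{(1 + r_{t+1})^{t+1}}
\]

The summation over the full run-off helps explain why the risk margin for long-duration annuity business is both large and sensitive to the interest rates used.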
Following the implementation of Solvency II, the risk margin came under considerable criticism for being too large and too sensitive to interest rate movements. These criticisms are particularly valid for annuity business in the UK – such business is of great significance to the system for retirement provision. A further criticism is that mitigation of the impact of the risk margin has led to an increase in reinsurance of longevity risks, particularly to overseas reinsurers.
This criticism has led to political interest, and the risk margin was a major element of the Treasury Committee inquiry into EU Insurance Regulation.
The working party was set up in response to this criticism. Our brief is to consider both the overall purpose of the risk margin for life insurers and solutions to the current problems, having regard to the possibility of post-Brexit flexibility.
We have concluded that a risk margin in some form is necessary, although its size depends on the level of security desired, and so is primarily a political question.
We have reviewed possible alternatives to the current risk margin, both within the existing cost-of-capital methodology and across a wide range of other approaches.
We believe that requirements for the risk margin will depend on future circumstances, in particular relating to Brexit, and we have identified a number of possible changes to methodology which should be considered, depending on circumstances.
Recent work has implicated one type of horizontal strabismus (exotropia) as a risk factor for schizophrenia. This new insight raises questions about a potential common developmental origin of the two diseases. Seasonality of births is well established for schizophrenia. Seasonal factors such as light exposure affect eye growth and can cause vision abnormalities, but little is known about seasonality of births in strabismus. We examined birth seasonality in people with horizontal strabismus in a retrospective study in Washoe County, Nevada, and re-examined similar previously obtained data from Osaka, Japan. We then compared seasonal patterns of births between strabismus, refractive error, schizophrenia and congenital toxoplasmosis. Patients with esotropia had a significant seasonality of births, with a deficit in March, then increasing to an excess in September, while patients with exotropia had a distinctly different pattern, with an excess of births in July, gradually decreasing to a deficit in November. These seasonalities were statistically significant with either χ2 or Kolmogorov–Smirnov-type statistics. The birth seasonality of esotropia resembled that for hyperopia, with an increase in amplitude, while the seasonality for myopia involved a phase-shift. There was no correlation between seasonality of births between strabismus and congenital toxoplasmosis. The pattern of an excess of summer births for people with exotropia was remarkably similar to the well-established birth seasonality of one schizophrenia subtype, the deficit syndrome, but not schizophrenia as a whole. This suggests a testable hypothesis: that exotropia may be a risk factor primarily for the deficit type of schizophrenia.
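As an illustration of the kind of seasonality test mentioned above (a minimal sketch, not the authors' exact analysis), monthly birth counts can be compared against expectations proportional to month length using a chi-square test; the counts below are made-up placeholder values.

```python
# Minimal sketch of a birth-seasonality test: compare observed birth counts
# by month against expectations proportional to month length.
# The observed counts are placeholder values, not study data.
import numpy as np
from scipy.stats import chisquare

observed = np.array([30, 22, 18, 25, 28, 27, 35, 31, 33, 26, 20, 24])  # Jan..Dec
days = np.array([31, 28.25, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])   # month lengths
expected = observed.sum() * days / days.sum()   # uniform daily birth rate

stat, p = chisquare(observed, f_exp=expected)   # 11 degrees of freedom
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
```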
Centenarians have survived into very late life, but whether they reach very old age in good health remains unclear. The purpose of this study was to compare the cardiovascular health status and cognitive functioning of centenarians in the United States with centenarians in Japan.
Design, Setting, and Participants:
This cross-national design compared centenarians from the United States and Japan. The sample of U.S. centenarians was recruited from the Georgia Centenarian Study and included 287 centenarians. The sample of Japanese centenarians was recruited from the Tokyo Centenarian Study and included 304 centenarians.
Measurements:
Cognitive functioning was assessed with a mental status questionnaire, and cardiovascular disease by a health history assessment, blood pressure, and selected blood parameters.
Results:
The results suggest that Tokyo centenarians had fewer disease experiences and lower BMI values than Georgia centenarians, but blood pressure was higher among the Japanese centenarians. Japanese centenarians also had lower levels of hemoglobin, and Georgia centenarians had higher levels of C-reactive protein. The positive association of hypertension and albumin levels with cognitive functioning and the negative association of stroke occurrence with cognitive functioning were replicated in both countries. Differential effects were obtained for heart problems, BMI, and C-reactive protein (with positive effects for Tokyo centenarians, except for C-reactive protein).
Conclusion:
For extremely old individuals, some markers of cardiovascular disease are replicable across countries, whereas differential effects on cardiovascular health also need to be considered.
Sugarbeet, grown for biofuel, is being considered as an alternate cool-season crop in the southeastern United States. Previous research identified ethofumesate PRE and phenmedipham + desmedipham POST as herbicides that controlled troublesome cool-season weeds in the region, specifically cutleaf evening-primrose. Research trials were conducted from 2014 through 2016 to evaluate an integrated system of sweep cultivation and reduced rates of ethofumesate PRE and/or phenmedipham+desmedipham POST for weed control in sugarbeet grown for biofuel. There were no interactions between the main effects of cultivation and herbicides for control of cutleaf evening-primrose and other cool-season species in two out of three years. Cultivation improved control of cool-season weeds, but the effect was largely independent of control provided by herbicides. Of the herbicide combinations evaluated, the best overall cool-season weed control was from systems that included either a 1/2X or 1X rate of phenmedipham+desmedipham POST. Either rate of ethofumesate PRE was less effective than phenmedipham+desmedipham POST. Despite improved cool-season weed control, sugarbeet yield was not affected by cultivation each year of the study. Sugarbeet yields were greater when treated with any herbicide combination that included either a 1/2X or 1X rate of phenmedipham+desmedipham POST compared with either rate of ethofumesate PRE alone or the nontreated control. These results indicate that cultivation has a very limited role in sugarbeet grown for biofuel. The premise of effective weed control based on an integration of cultivation and reduced herbicide rates does not appear to be viable for sugarbeet grown for biofuel.
Sugarbeet, grown for biofuel, is being considered as an alternate cool-season crop in the southeastern U.S. coastal plain. Typically, the crop would be seeded in the autumn, then grow through the winter and be harvested the following spring. Labels for herbicides registered for use on sugarbeet grown in the traditional sugarbeet production regions do not list any of the cool-season weeds common in the southeastern United States. Field trials were initiated near Ty Ty, GA, to evaluate all possible combinations of ethofumesate applied PRE, phenmedipham+desmedipham applied POST, clopyralid POST, and triflusulfuron POST for cool-season weed control in sugarbeet. Phenmedipham+desmedipham alone and in combination with clopyralid and/or triflusulfuron effectively controlled cutleaf eveningprimrose, lesser swinecress, henbit, and corn spurry when applied to seedling weeds. Ethofumesate PRE alone was not as effective in controlling cool-season weeds compared to treatments containing phenmedipham+desmedipham POST. However, ethofumesate PRE applied sequentially with phenmedipham+desmedipham POST improved weed control consistency. Clopyralid and/or triflusulfuron alone did not adequately control cutleaf eveningprimrose. Triflusulfuron alone effectively controlled wild radish. In the 2013–2014 and 2014–2015 seasons, December-applied POST herbicides did not injure sugarbeet. However, in the 2015–2016 season POST herbicides were applied in late October. On the day of treatment, the maximum temperature was 25.4 C, which exceeded the established upper temperature limit of 22 C for safe application of phenmedipham+desmedipham, and sugarbeet plants were severely injured. In the southeastern United States, temperatures frequently exceed 22 C in early autumn, which may limit phenmedipham+desmedipham use for controlling troublesome cool-season weeds of sugarbeet in the region. Weed control options need to be expanded to compensate for this limitation.
Introduction: Emerging evidence suggests a heightened interest in healthy behaviour changes, including smoking cessation, at the beginning of the week. Evidence from Google searches, quitlines, and cessation websites shows greater information-seeking and interest in early-week quitting.
Aims: This pilot assesses the comparative effectiveness of a smoking cessation intervention that encourages participants to use Mondays as a day to quit or recommit to quitting smoking.
Methods: We partnered with existing smoking cessation group programs to conduct a quasi-experimental, pre–post study. Both comparison and intervention groups received the same standard-care curriculum from program instructors. Intervention group participants received Monday materials including a wallet card and a mantra card during enrolment. On Mondays, intervention participants received an emailed tip-of-the-week and were encouraged to quit or recommit to quitting. Quit buddies were recommended in both groups, but intervention participants were encouraged to check-in with quit buddies on Mondays. The outcomes of smoking abstinence, number and length of quit attempts, and self-efficacy were assessed at the final program session and three months later.
Results: At the last session, intervention group participants who were still smoking had a higher self-efficacy of quitting in the future, rated their programs as more helpful in quitting smoking, and were more likely to rate quit buddies as very helpful. Differences in self-efficacy were no longer observed at the second follow-up. No differences were observed between intervention and standard group participants in abstinence, number of quits, length of quits, or self-efficacy of staying quit at either follow-up.
Conclusions: Encouraging results from this pilot study indicate that further research is needed to explore how Monday messaging may improve smoking cessation programs.
Field studies were conducted at Tifton, GA, and Gainesville, FL, to quantify the phytotoxicity to peanut of endothall formulation, rate, and time of application in a weed-free experiment. Peanut plants treated with the mono(N,N-dimethylalkylamine) salt of endothall (DMAA endothall) were more necrotic than those treated with the dipotassium salt of endothall (DP endothall), though necrosis was temporary. Injury from DMAA endothall at rates of 0.6 to 1.1 kg ai/ha was similar to that from the standard treatment of bentazon plus paraquat for most parameters. Peanut plants treated with the highest rate of DMAA endothall (4.5 kg/ha) were more necrotic and took longer to recover than those treated with lower rates. The highest rate of DP endothall (4.5 kg ai/ha) stunted peanut more than any DMAA endothall treatment. However, lower rates of DP endothall (0.6 to 2.2 kg/ha) were generally less injurious than DMAA endothall at equivalent rates. Peanut yields were not affected by either formulation of endothall at 0.6 to 1.1 kg/ha applied from vegetative emergence through 4 wk after emergence.
Previous reports have suggested that bentazon [3-(1-methylethyl)-(1H)-2,1,3-benzothiadiazin-4(3H)-one 2,2-dioxide] tolerance among soybean genotypes is the result of differential translocation or metabolism. The basis for tolerance was reexamined using susceptible and tolerant genotypes. Tolerant genotypes (‘Hill’ and ‘Clark 63’) were found to tolerate 100- to 300-fold more bentazon than susceptible genotypes (‘L78–3263’, ‘Hurrelbrink’, and ‘PI 229.342’). Minor differences in absorption and translocation occurred among the genotypes but they did not correlate with tolerance. Tolerant genotypes metabolized 80 to 90% of absorbed bentazon within 24 h, while susceptible genotypes metabolized only 10 to 15%. Two major metabolites, the glycosyl conjugates of 6- and 8-hydroxybentazon, were formed in tolerant genotypes. Susceptible genotypes did not form the hydroxybentazon conjugates but instead produced relatively low levels of two unidentified metabolites. It is concluded that differential bentazon tolerance among soybean genotypes is linked to the ability to form both the 6- and 8-hydroxybentazon conjugates.
Benefits of reduced tillage and diverse crop rotations include reversing soil C loss and improving soil quality and function. However, adoption of these strategies is lagging, particularly in the Upper Midwest, due to a perception that reduced tillage lowers crop yields. Therefore, an 8-year comparison of these conservation systems with a conventional, tilled, 2-year rotation system was conducted to evaluate effects on yields, system productivity (measured with potential gross returns) and weed seed densities. This study compared conventional moldboard plow + chisel till (CT) to reduced strip-tillage + no-tillage (ST), each with a 2-year (2y) or 4-year (4y) crop rotation, abbreviated as CT-2y, CT-4y, ST-2y and ST-4y. The 2y rotation was corn (Zea mays L.) and soybean (Glycine max [L.] Merr.); the 4y rotation was corn, soybean, spring wheat (Triticum aestivum L.) underseeded with alfalfa (Medicago sativa L.) and alfalfa. Only corn grain yield was significantly influenced by tillage strategy; CT systems yielded more than ST systems, regardless of rotation. Soybean grain yields were similar among CT-2y, CT-4y and ST-4y, and lowest in ST-2y. Yields of wheat and alfalfa were the same under both tillage strategies. Weed seed densities were higher in wheat and alfalfa, followed by corn then soybean, but were not influenced by tillage or rotation, nor universally negatively correlated with yield. Due to greater corn yields, overall system productivity was highest in CT-2y, the same between CT-4y and ST-2y, and lowest in ST-4y. Within years, the productivity of CT-2y differed from only one other system at a time in 3 of 8 years and matched that of all systems in another 3 of 8 years. Additionally, the similarity of productivity among three of four systems in 6 of 8 years indicates that reduced tillage and diverse rotations have potential for adoption. Results support the need for research on a rotational tillage strategy, i.e., moldboard plowing before corn, to improve overall productivity if using ST before soybean, wheat and alfalfa.