Abdominal ultrasonography is an extremely valuable diagnostic tool for all perioperative physicians. While the FAST exam was designed for use in patients with blunt abdominal trauma, its principles are applicable in a wide variety of perioperative settings and can be used to narrow the differential diagnosis in unstable patients. Aortic ultrasound is easy to perform and rapidly confirms or rules out the presence of abdominal aortic aneurysm or dissection. Other uses include gallbladder imaging and evaluation for free intra-peritoneal air. Perioperative and intensive care unit patients will benefit from point-of-care ultrasound, including detailed examination of the abdominal cavity.
“Revolving door syndrome” usually corresponds to what might be called the hospital multiple-readmissions phenomenon. Beyond the consequences for patients and their families, frequent readmissions also substantially increase healthcare costs and contribute to burnout among medical and paramedical staff.
The objective of this study was to identify the characteristics of, and the factors associated with, short-term readmissions in a sample of Tunisian patients with schizophrenia presenting “revolving door” syndrome.
The authors conducted a retrospective study of 50 patients with schizophrenia or schizo-affective disorders from November to February 2009 in Razi Hospital's “Psychiatry C” service. Patients included in the study had been hospitalized at least 4 times. The patients were analyzed for socio-demographic characteristics, total readmissions and the number of admissions in the last year. Their medication adherence was evaluated with the MARS (Medication Adherence Rating Scale) and their insight with the Q8 scale.
The sample was composed of 50 patients with schizophrenia or schizo-affective disorders according to DSM-IV, aged over 18 years, who had been hospitalized more than 4 times.
Men made up 80% of the sample; 74% of patients were single, 66% lived with their parents, and 88% were unemployed. 54% of patients showed poor medication adherence and 88% lacked insight.
The authors found that the typical Tunisian revolving-door patient is a single, unemployed man who lives with his parents and shows poor medication adherence and a lack of insight.
For life insurers in the United Kingdom (UK), the risk margin is one of the most controversial aspects of the Solvency II regime which came into force in 2016.
The risk margin is the difference between the technical provisions and the best estimate liabilities. The technical provisions are intended to be market-consistent, and so are defined as the amount required to be paid to transfer the business to another undertaking. In practice, the technical provisions cannot be directly calculated, and so the risk margin must be determined using a proxy method; the method chosen for Solvency II is known as the cost-of-capital method.
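In outline, the cost-of-capital method values the risk margin as the cost of holding regulatory capital against the transferred business over its run-off. A sketch of the standard Solvency II formulation:

```latex
\mathrm{RM} \;=\; \mathrm{CoC} \sum_{t \geq 0} \frac{\mathrm{SCR}(t)}{\bigl(1 + r(t+1)\bigr)^{\,t+1}}
```

where $\mathrm{SCR}(t)$ is the projected Solvency Capital Requirement of the reference undertaking at time $t$, $r(t+1)$ is the relevant risk-free rate, and $\mathrm{CoC}$ is the prescribed cost-of-capital rate (6% under the current regime). The discounting of a long run-off of projected capital requirements is what makes the margin sensitive to interest rates, particularly for long-duration annuity business.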
Following the implementation of Solvency II, the risk margin came under considerable criticism for being too large and too sensitive to interest rate movements. These criticisms are particularly valid for annuity business in the UK – such business is of great significance to the system for retirement provision. A further criticism is that mitigation of the impact of the risk margin has led to an increase in reinsurance of longevity risks, particularly to overseas reinsurers.
This criticism has led to political interest, and the risk margin was a major element of the Treasury Committee inquiry into EU Insurance Regulation.
The working party was set up in response to this criticism. Our brief is to consider both the overall purpose of the risk margin for life insurers and solutions to the current problems, having regard to the possibility of post-Brexit flexibility.
We have concluded that a risk margin in some form is necessary, although its size depends on the level of security desired, and so is primarily a political question.
We have reviewed possible alternatives to the current risk margin, both within the existing cost-of-capital methodology and considering a wide range of alternatives.
We believe that requirements for the risk margin will depend on future circumstances, in particular relating to Brexit, and we have identified a number of possible changes to methodology which should be considered, depending on circumstances.
Recent work has implicated one type of horizontal strabismus (exotropia) as a risk factor for schizophrenia. This new insight raises questions about a potential common developmental origin of the two diseases. Seasonality of births is well established for schizophrenia. Seasonal factors such as light exposure affect eye growth and can cause vision abnormalities, but little is known about seasonality of births in strabismus. We examined birth seasonality in people with horizontal strabismus in a retrospective study in Washoe County, Nevada, and re-examined similar previously obtained data from Osaka, Japan. We then compared seasonal patterns of births between strabismus, refractive error, schizophrenia and congenital toxoplasmosis. Patients with esotropia had a significant seasonality of births, with a deficit in March, then increasing to an excess in September, while patients with exotropia had a distinctly different pattern, with an excess of births in July, gradually decreasing to a deficit in November. These seasonalities were statistically significant with either χ2 or Kolmogorov–Smirnov-type statistics. The birth seasonality of esotropia resembled that for hyperopia, with an increase in amplitude, while the seasonality for myopia involved a phase-shift. There was no correlation between seasonality of births between strabismus and congenital toxoplasmosis. The pattern of an excess of summer births for people with exotropia was remarkably similar to the well-established birth seasonality of one schizophrenia subtype, the deficit syndrome, but not schizophrenia as a whole. This suggests a testable hypothesis: that exotropia may be a risk factor primarily for the deficit type of schizophrenia.
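The seasonality analyses described above rest on comparing observed monthly birth counts against a uniform expectation. A minimal sketch of the χ² goodness-of-fit calculation, using entirely hypothetical counts (not the Washoe County or Osaka data):

```python
# Minimal chi-square goodness-of-fit statistic for birth seasonality.
# The monthly counts below are hypothetical illustration data,
# NOT the figures from the study described above.

def chi_square_uniform(counts):
    """Chi-square statistic of observed counts against a uniform expectation."""
    total = sum(counts)
    expected = total / len(counts)
    return sum((obs - expected) ** 2 / expected for obs in counts)

# Hypothetical monthly birth counts, January through December
births = [30, 25, 20, 18, 22, 28, 35, 33, 31, 26, 21, 19]
chi2 = chi_square_uniform(births)

# With 12 months there are 11 degrees of freedom; the statistic would be
# compared against the chi-square critical value (19.68 at p = 0.05).
print(f"chi-square = {chi2:.3f}")
```

A real analysis would typically also adjust the expected counts for month length and for secular trends in birth rates before testing for seasonality.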
Centenarians have survived into very late life, but whether they reach very old age in good health remains unclear. The purpose of this study was to compare the cardiovascular health status and cognitive functioning of centenarians in the United States with centenarians in Japan.
Design, Setting, and Participants:
This cross-national design compared centenarians from the United States and Japan. The sample of U.S. centenarians was recruited from the Georgia Centenarian Study and included 287 centenarians. The sample of Japanese centenarians was recruited from the Tokyo Centenarian Study and included 304 centenarians.
Cognitive functioning was assessed with a mental status questionnaire, and cardiovascular disease by a health history assessment, blood pressure, and selected blood parameters.
The results suggest that Tokyo centenarians had lower rates of disease and lower BMI values than Georgia centenarians, but blood pressure was higher among the Japanese centenarians. Lower levels of hemoglobin in Japanese centenarians and higher levels of C-reactive protein in Georgia centenarians were also found. The positive association of hypertension and albumin levels with cognitive functioning and the negative association of stroke occurrence with cognitive functioning were replicated in both countries. Differential effects were obtained for heart problems, BMI, and C-reactive protein (with positive effects for Tokyo centenarians, except for C-reactive protein).
For extremely old individuals, some markers of cardiovascular disease are replicable across countries, whereas differential country-specific effects on cardiovascular health also need to be considered.
Sugarbeet, grown for biofuel, is being considered as an alternate cool-season crop in the southeastern United States. Previous research identified ethofumesate PRE and phenmedipham + desmedipham POST as herbicides that controlled troublesome cool-season weeds in the region, specifically cutleaf evening-primrose. Research trials were conducted from 2014 through 2016 to evaluate an integrated system of sweep cultivation and reduced rates of ethofumesate PRE and/or phenmedipham+desmedipham POST for weed control in sugarbeet grown for biofuel. There were no interactions between the main effects of cultivation and herbicides for control of cutleaf evening-primrose and other cool-season species in two out of three years. Cultivation improved control of cool-season weeds, but the effect was largely independent of control provided by herbicides. Of the herbicide combinations evaluated, the best overall cool-season weed control was from systems that included either a 1/2X or 1X rate of phenmedipham+desmedipham POST. Either rate of ethofumesate PRE was less effective than phenmedipham+desmedipham POST. Despite improved cool-season weed control, sugarbeet yield was not affected by cultivation each year of the study. Sugarbeet yields were greater when treated with any herbicide combination that included either a 1/2X or 1X rate of phenmedipham+desmedipham POST compared with either rate of ethofumesate PRE alone or the nontreated control. These results indicate that cultivation has a very limited role in sugarbeet grown for biofuel. The premise of effective weed control based on an integration of cultivation and reduced herbicide rates does not appear to be viable for sugarbeet grown for biofuel.
Sugarbeet, grown for biofuel, is being considered as an alternate cool-season crop in the southeastern U.S. coastal plain. Typically, the crop would be seeded in the autumn, then grow through the winter and be harvested the following spring. Labels for herbicides registered for use on sugarbeet grown in the traditional sugarbeet production regions do not list any of the cool-season weeds common in the southeastern United States. Field trials were initiated near Ty Ty, GA, to evaluate all possible combinations of ethofumesate applied PRE, phenmedipham+desmedipham applied POST, clopyralid POST, and triflusulfuron POST for cool-season weed control in sugarbeet. Phenmedipham+desmedipham alone and in combination with clopyralid and/or triflusulfuron effectively controlled cutleaf eveningprimrose, lesser swinecress, henbit, and corn spurry when applied to seedling weeds. Ethofumesate PRE alone was not as effective in controlling cool-season weeds compared to treatments containing phenmedipham+desmedipham POST. However, ethofumesate PRE applied sequentially with phenmedipham+desmedipham POST improved weed control consistency. Clopyralid and/or triflusulfuron alone did not adequately control cutleaf eveningprimrose. Triflusulfuron alone effectively controlled wild radish. In the 2013–2014 and 2014–2015 seasons, December-applied POST herbicides did not injure sugarbeet. However, in the 2015–2016 season POST herbicides were applied in late October. On the day of treatment, the maximum temperature was 25.4 C, which exceeded the established upper temperature limit of 22 C for safe application of phenmedipham+desmedipham, and sugarbeet plants were severely injured. In the southeastern United States, temperatures frequently exceed 22 C in early autumn, which may limit phenmedipham+desmedipham use for controlling troublesome cool-season weeds of sugarbeet in the region. Weed control options need to be expanded to compensate for this limitation.
Introduction: Emerging evidence suggests a heightened interest in healthy behaviour changes, including smoking cessation, at the beginning of the week. Evidence from Google searches, quitlines, and cessation websites shows greater information-seeking and interest in early-week quitting.
Aims: This pilot assesses the comparative effectiveness of a smoking cessation intervention that encourages participants to use Mondays as a day to quit or recommit to quitting smoking.
Methods: We partnered with existing smoking cessation group programs to conduct a quasi-experimental, pre–post study. Both comparison and intervention groups received the same standard-care curriculum from program instructors. Intervention group participants received Monday materials including a wallet card and a mantra card during enrolment. On Mondays, intervention participants received an emailed tip-of-the-week and were encouraged to quit or recommit to quitting. Quit buddies were recommended in both groups, but intervention participants were encouraged to check-in with quit buddies on Mondays. The outcomes of smoking abstinence, number and length of quit attempts, and self-efficacy were assessed at the final program session and three months later.
Results: At the last session, intervention group participants who were still smoking had a higher self-efficacy of quitting in the future, rated their programs as more helpful in quitting smoking, and were more likely to rate quit buddies as very helpful. Differences in self-efficacy were no longer observed at the second follow-up. No differences were observed between intervention and standard group participants in abstinence, number of quits, length of quits, or self-efficacy of staying quit at either follow-up.
Conclusions: Encouraging results from this pilot study indicate that further research is needed to explore how Monday messaging may improve smoking cessation programs.
Benefits of reduced tillage and diverse crop rotations include reversing soil C loss, and improving soil quality and function. However, adoption of these strategies is lagging, particularly in the Upper Midwest, due to a perception that reduced tillage lowers crop yields. Therefore, an 8-year comparison of these conservation systems with a conventional, tilled, 2-year rotation system was conducted to evaluate effects on yields, system productivity (measured with potential gross returns) and weed seed densities. This study compared conventional moldboard plow + chisel till (CT) to reduced strip-tillage + no-tillage (ST), each with a 2-year (2y) or 4-year (4y) crop rotation, abbreviated as CT-2y, CT-4y, ST-2y and ST-4y. The 2y rotation was corn (Zea mays L.) and soybean (Glycine max [L.] Merr.); the 4y rotation was corn, soybean, spring wheat (Triticum aestivum L.) underseeded with alfalfa (Medicago sativa L.) and alfalfa. Only corn grain was significantly influenced by tillage strategy; CT systems yielded more than ST systems, regardless of rotation. Soybean grain yields were similar among CT-2y, CT-4y, ST-4y and lowest in the ST-2y. Yields of wheat and alfalfa were the same under both tillage strategies. Weed seed densities were higher in wheat and alfalfa, followed by corn then soybean, but were not influenced by tillage or rotation, nor universally negatively correlated to yield. Due to greater corn yields, overall system productivity was highest in CT-2y, the same between CT-4y and ST-2y, and lowest in ST-4y. Within years, productivity of CT-2y was different from only one other system at a time in 3 of 8 years and had the same productivity as all systems in another 3 of 8 years. Additionally, the similarity of productivity among three of four systems in 6 of 8 years indicated reduced tillage and diverse rotations have potential for adoption. 
Results support the need for research on a rotational tillage strategy, i.e., moldboard plowing before corn, to improve overall productivity if using ST before soybean, wheat and alfalfa.
Tillage is decreasing globally due to recognized benefits of fuel savings and improved soil health in the absence of disturbance. However, a perceived inability to control weeds effectively and economically hinders no-till adoption in organic production systems in the Upper Midwest, USA. A strip-tillage (ST) strategy was explored as an intermediate approach to reducing fuel use and soil disturbance while still controlling weeds. An 8-year comparison was made between two tillage approaches, one primarily using ST and the other using a combination of conventional plow, disk and chisel tillage [conventional tillage (CT)]. Additionally, two rotation schemes were explored within each tillage system: a 2-year rotation (2y) of corn (Zea mays L.) and soybean (Glycine max [L.] Merr.) with a winter rye (Secale cereale L.) cover crop; and a 4-year rotation (4y) of corn, soybean, spring wheat (Triticum aestivum L.) underseeded with alfalfa (Medicago sativa L.), and a second year of alfalfa. These treatments resulted in comparison of four main management systems: CT-2y, CT-4y, ST-2y and ST-4y, each also managed under fertilized and non-fertilized conditions. Yields, whole-system productivity (evaluated with potential gross returns), and weed seed densities (first 4 years) were measured. Across years, yields of corn, soybean and wheat were greater by 34% or more under CT than ST, but alfalfa yields were the same. Within tillage strategies, corn yields were the same in 2y and 4y rotations, but soybean yields, only under ST, were 29% lower in the fertilized 4y than 2y rotation. In the ST-4y system, yields of corn and soybean were the same in fertilized and non-fertilized treatments. Over the entire rotation, system productivity was highest in the fertilized CT-2y system, but the same among fertilized ST-4y, and non-fertilized ST-2y, ST-4y, and CT-4y systems.
Over the first 4 years, total weed seed density increased comparatively more under ST than CT, and was negatively correlated to corn yields in fertilized CT systems and soybean yields in the fertilized ST-2y system. These results indicated ST compromised productivity, in part due to insufficient weed control, but also due to reduced nutrient availability. ST and diverse rotations may yet be viable options given that overall productivity of fertilized ST-2y and CT-4y systems was within 70% of that in the fertilized CT-2y system. Closing the yield gap between ST and CT would benefit from future research focused on organic weed and nutrient management, particularly for corn.
Numerous factors influence late-life depressive symptoms in adults, many not thoroughly characterized. We addressed whether genetic and environmental influences on depressive symptoms differed by age, sex, and physical illness.
The analysis sample included 24 436 twins aged 40–90 years drawn from the Interplay of Genes and Environment across Multiple Studies (IGEMS) Consortium. Biometric analyses tested age, sex, and physical illness moderation of genetic and environmental variance in depressive symptoms.
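Biometric moderation analyses of this kind are commonly specified with a moderated ACE decomposition; as a sketch of the standard formulation (Purcell-style moderation, not necessarily the exact model fitted here), the phenotypic variance conditional on a moderator $M$ (age, sex, or physical illness level) is

```latex
\mathrm{Var}(P \mid M) \;=\; (a_0 + a_1 M)^2 \;+\; (c_0 + c_1 M)^2 \;+\; (e_0 + e_1 M)^2
```

where $a$, $c$ and $e$ are the additive genetic, shared environmental and nonshared environmental path coefficients, and nonzero $a_1$, $c_1$ or $e_1$ indicates that the corresponding variance component changes with the moderator.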
Women reported greater depressive symptoms than men. After age 60, there was an accelerating increase in depressive symptom scores with age, but this did not appreciably affect genetic and environmental variances. Overlap in genetic influences between physical illness and depressive symptoms was greater in men than in women. Additionally, in men extent of overlap was greater with worse physical illness (the genetic correlation ranged from near 0.00 for the least physical illness to nearly 0.60 with physical illness 2 s.d. above the mean). For men and women, the same environmental factors that influenced depressive symptoms also influenced physical illness.
Findings suggested that genetic factors play a larger part in the association between depressive symptoms and physical illness for men than for women. For both sexes, across all ages, physical illness may similarly trigger social and health limitations that contribute to depressive symptoms.