Although in vitro fertilization (IVF) was developed for the treatment of tubal infertility, it soon became apparent that human IVF had many other applications, such as male factor subfertility, unexplained subfertility, and restoring fertility in women without functioning ovaries using ovum or embryo donation. Although ovum donation was originally used to treat women with Turner’s syndrome, it has also been successfully applied over the last 35 years to women with other causes of premature ovarian insufficiency (POI). The concept of gamete donation is not new: donor insemination (DI) has been utilized for several decades, initially with fresh sperm and subsequently with stored frozen sperm. In DI the woman’s partner becomes the social father but is not the genetic father, whereas in oocyte donation the woman who has the child is the birth and social mother but not the genetic mother. Although the child is not genetically linked to her, egg donation allows the patient to carry and deliver her husband’s genetic child.
Recent studies have tried to find a reliable way of predicting the development of Alzheimer’s Disease (AD) among patients with mild cognitive impairment (MCI), often focusing on olfactory dysfunction or semantic memory. Our study aimed to validate these findings while also comparing the predictive accuracy of olfactory and semantic assessments for this purpose.
Six hundred fifty patients (median age 68, 58% female), including controls and patients with subjective cognitive decline (SCD), non-amnestic MCI (naMCI), amnestic MCI (aMCI), and AD, were tested for olfactory dysfunction by means of odor identification testing and for semantic memory. Of those 650 patients, 120 participants with SCD, naMCI, or aMCI at baseline underwent a follow-up examination after two years on average. Of these 120 patients, 12% had developed AD at follow-up (converters), while 88% had not (non-converters).
Analysis showed a significant difference only for initial olfactory identification between converters and non-converters. Sensitivity of impairment of olfactory identification for AD prediction was low at 46.2%, although specificity was high at 81.9%. Semantic memory impairment at baseline was not significantly related to AD conversion, although, when naming objects, significant differences were found between AD patients and all other groups and between naMCI and aMCI patients compared to controls and SCD patients.
Objective olfactory assessments are promising instruments for predicting conversion to AD among MCI patients. However, given their low sensitivity despite high specificity, combining them with other neuropsychological tests might improve predictive accuracy. Further longitudinal studies with more participants are required to investigate the usefulness of semantic memory tests in this context.
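The reported sensitivity and specificity can be reproduced from a standard confusion matrix. A minimal sketch, using hypothetical counts (not taken from the study) chosen only to show how figures near 46.2% and 81.9% arise:

```python
# Hypothetical confusion matrix for baseline olfactory impairment as a
# predictor of AD conversion (counts are illustrative, not the study's data).
tp = 6    # converters correctly flagged by impaired olfactory identification
fn = 7    # converters missed
tn = 86   # non-converters correctly cleared
fp = 19   # non-converters incorrectly flagged

sensitivity = tp / (tp + fn)   # proportion of converters detected
specificity = tn / (tn + fp)   # proportion of non-converters cleared

print(f"sensitivity = {sensitivity:.1%}")  # 46.2%
print(f"specificity = {specificity:.1%}")  # 81.9%
```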
It has been well established that milk yield is affected both by milking frequency and by the removal of residual milk, but the influence of a combination of these factors is unclear. In this study, four mid-lactation cows were used in a 4 × 4 Latin square design to test the hypothesis that the effects of more frequent milking and residual milk removal on milk yield and composition are additive and alter milk fatty acid composition. Treatments comprised twice or four times daily milking, with or without residual milk removal, over a 96 h interval preceded by a 2 d pretreatment period and followed by an 8 d washout in each 14 d experimental period. Milk was sampled at each milking for the analysis of gross composition and somatic cell count (SCC). Samples of available and residual milk taken at the last milking of each treatment period were submitted for fatty acid composition analysis. Increases in milking frequency and residual milk removal, alone or in combination, had no effect on milk yield or on the secretion of lactose and protein in milk. However, residual milk removal during more frequent milking increased milk fat yield. Milking treatments had no major influence on the fatty acid composition of available milk, but resulted in rather small changes in the relative abundance of specific fatty acids, with no evidence that the additive effects of treatments were due to higher utilisation of preformed fatty acids relative to de novo fatty acid synthesis. For all treatments, the fat composition of available and residual milk was rather similar, indicating a highly uniform fatty acid composition of milk fat within the mammary gland.
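A 4 × 4 Latin square balances the design so that each cow receives every treatment and each treatment appears once in every period. A minimal sketch of the simplest (cyclic) construction, with hypothetical treatment labels for the 2 × 2 factorial of milking frequency and residual milk removal:

```python
# Four treatments: milking frequency (2x or 4x daily) crossed with
# residual milk (RM) removal; labels are illustrative.
treatments = ["2x", "2x+RM", "4x", "4x+RM"]
n = len(treatments)

# Cyclic 4 x 4 Latin square: rows = cows, columns = experimental periods.
square = [[treatments[(cow + period) % n] for period in range(n)]
          for cow in range(n)]

for row in square:
    print(row)

# Every row (cow) and every column (period) contains each treatment once.
assert all(sorted(row) == sorted(treatments) for row in square)
assert all(sorted(col) == sorted(treatments) for col in zip(*square))
```

Note that a simple cyclic square does not balance first-order carryover effects; crossover experiments often use a Williams design for that, but the row/column balance illustrated here is the defining Latin-square property.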
Objectives: The Priorities and Evaluation Committee (PEC) bases its funding recommendations for new cancer drugs in British Columbia, Canada, on both clinical and economic evidence; the British Columbia Ministry of Health makes the final funding decisions. We assessed the association between the cost-effectiveness of cancer drugs considered from 1998 to 2008 and the subsequent funding decisions.
Methods: All proposals submitted to the PEC between 1998 and 2008 were reviewed, and the association between cost-effectiveness and funding decisions was examined by (i) using logistic regression to test the hypothesis that interventions with higher incremental cost-effectiveness ratios (ICERs) have a lower probability of receiving a positive funding decision and (ii) using parametric and nonparametric tests to determine if a statistically significant difference exists between the mean cost-effectiveness of funded versus not funded proposals. A sub-analysis was conducted to determine if the findings varied across different outcome measures.
Results: Of the 149 proposals reviewed, 78 reported cost-effectiveness using various outcome measures. In the proposals that used life-years gained as the outcome (n = 22), a statistically significant difference of nearly $115,000 was observed between the mean ICERs for funded proposals ($42,006) and for unfunded proposals ($156,967). An odds ratio indicating higher ICERs have a lower probability of being funded was also found to be statistically significant (p < .05).
Conclusions: Economic evidence appears to have played a role in British Columbia cancer funding decisions from 1998 to 2008; other decision-making criteria may also play an important role in recommendations and subsequent funding decisions.
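The logistic-regression reading of the result can be sketched directly: if higher ICERs reduce the odds of funding, the fitted coefficient on the ICER is negative, and exp(beta) gives the odds ratio per unit increase. A minimal illustration with hypothetical coefficients (not the study's estimates):

```python
import math

# Hypothetical fitted model: log-odds of funding = b0 + b1 * (ICER / $10,000).
b0 = 1.5    # intercept (illustrative)
b1 = -0.25  # slope per $10,000 (illustrative; negative = higher ICER, lower odds)

def funding_probability(icer_dollars):
    """Logistic model: probability of a positive funding decision."""
    log_odds = b0 + b1 * (icer_dollars / 10_000)
    return 1 / (1 + math.exp(-log_odds))

# Odds ratio per $10,000 increase in the ICER.
odds_ratio = math.exp(b1)
print(f"OR per $10,000 increase: {odds_ratio:.2f}")   # < 1: funding less likely
print(f"P(fund | ICER = $42,006):  {funding_probability(42_006):.2f}")
print(f"P(fund | ICER = $156,967): {funding_probability(156_967):.2f}")
```

Under any such model with a negative slope, a proposal at the funded-group mean ICER ($42,006) receives a higher predicted funding probability than one at the unfunded-group mean ($156,967), which is the direction of the association the study reports.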
Catheter-associated urinary tract infections (CAUTIs) are among the most common hospital-acquired infections (HAIs). Reducing CAUTI rates has become a major focus of attention due to increasing public health concerns and reimbursement implications.
To implement and describe a multifaceted intervention to decrease CAUTIs in our ICUs with an emphasis on indications for obtaining a urine culture.
A project team composed of all critical care disciplines was assembled to address an institutional goal of decreasing CAUTIs. Interventions implemented between year 1 and year 2 included protocols recommended by the Centers for Disease Control and Prevention for placement, maintenance, and removal of catheters. Leaders from all critical care disciplines agreed to align routine culturing practice with American College of Critical Care Medicine (ACCCM) and Infectious Disease Society of America (IDSA) guidelines for evaluating a fever in a critically ill patient. Surveillance data for CAUTI and hospital-acquired bloodstream infection (HABSI) were recorded prospectively according to National Healthcare Safety Network (NHSN) protocols. Device utilization ratios (DURs), rates of CAUTI, HABSI, and urine cultures were calculated and compared.
The CAUTI rate decreased from 3.0 per 1,000 catheter days in 2013 to 1.9 in 2014. The DUR was 0.7 in 2013 and 0.68 in 2014. The HABSI rates per 1,000 patient days decreased from 2.8 in 2013 to 2.4 in 2014.
Effectively reducing ICU CAUTI rates requires a multifaceted and collaborative approach; stewardship of culturing was a key and safe component of our successful reduction efforts.
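The surveillance metrics follow standard NHSN arithmetic: infection rates are expressed per 1,000 device days, and the device utilization ratio (DUR) is device days divided by patient days. A minimal sketch with hypothetical counts chosen only to match the reported 2013 figures (the abstract gives rates, not raw counts):

```python
# Hypothetical surveillance counts (illustrative only).
cauti_events = 21
catheter_days = 7_000
patient_days = 10_000

# NHSN-style metrics.
cauti_rate = cauti_events / catheter_days * 1_000   # per 1,000 catheter days
dur = catheter_days / patient_days                  # device utilization ratio

print(f"CAUTI rate: {cauti_rate:.1f} per 1,000 catheter days")  # 3.0
print(f"DUR: {dur:.2f}")                                        # 0.70
```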
Various medications and devices are available for facilitation of emergent endotracheal intubations (EETIs). The objective of this study was to survey which medications and devices are being utilized for intubation by Canadian physicians.
A clinical scenario-based survey was developed to determine which medications physicians would administer to facilitate EETI, their first choice of intubation device, and backup strategy should their first choice fail. The survey was distributed to Canadian emergency medicine (EM) and intensive care unit (ICU) physicians using web-based and postal methods. Physicians were asked questions based on three scenarios (trauma; pneumonia; heart failure) and responded using a 5-point scale ranging from “always” to “never” to capture usual practice.
The survey response rate was 50.2% (882/1,758). Most physicians indicated a Macintosh blade with direct laryngoscopy would “always/often” be their first choice of intubation device in the three scenarios (mean 85% [79%-89%]) followed by video laryngoscopy (mean 37% [30%-49%]). The most common backup device chosen was an extraglottic device (mean 59% [56%-60%]). The medications most physicians would “always/often” administer were fentanyl (mean 45% [42%-51%]) and etomidate (mean 38% [25%-50%]). EM physicians were more likely than ICU physicians to paralyze patients for EETI (adjusted odds ratio 3.40; 95% CI 2.90-4.00).
Most EM and ICU physicians utilize direct laryngoscopy with a Macintosh blade as a primary device for EETI and an extraglottic device as a backup strategy. This survey highlights variation in Canadian practice patterns for some aspects of intubation in critically ill patients.
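The adjusted odds ratio contrasting EM and ICU physicians comes from a regression model, but the unadjusted version can be sketched from a 2 × 2 table with a Wald confidence interval on the log scale. The counts below are hypothetical, not the survey's data:

```python
import math

# Hypothetical 2 x 2 table: rows = specialty, columns = paralyzed for EETI (yes/no).
em_yes, em_no = 320, 130     # EM physicians (illustrative counts)
icu_yes, icu_no = 180, 250   # ICU physicians (illustrative counts)

# Unadjusted odds ratio and 95% Wald confidence interval.
odds_ratio = (em_yes * icu_no) / (em_no * icu_yes)
se_log_or = math.sqrt(1/em_yes + 1/em_no + 1/icu_yes + 1/icu_no)
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
```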
Because depressive illness is recurrent, recurrence prevention should be a mainstay for reducing its burden on society. One way to reach this goal is to identify malleable risk factors. The ability to attenuate sadness/dysphoria (mood repair) and parasympathetic nervous system functioning, indexed as respiratory sinus arrhythmia (RSA), are impaired during depression and after it has remitted. The present study therefore tested the hypothesis that these two constructs also may mirror risk factors for a recurrent major depressive episode (MDE).
At time 1 (T1), 178 adolescents whose last MDE had remitted, together with their parents, reported on depression and mood repair; youths’ RSA at rest and in response to sad mood induction also was assessed. MDE recurrence was monitored until time 2 (T2), up to 2 years later. Mood repair at T1 (modeled as a latent construct), along with resting RSA and the RSA response to sadness induction (RSA profile), served to predict onset of the first recurrent MDE by T2.
Consistent with expectations, maladaptive mood repair predicted recurrent MDE, above and beyond T1 depression symptoms. Further, atypical RSA profiles at T1 were associated with high levels of maladaptive mood repair, which, in turn, predicted increased risk of recurrent MDE. Thus, maladaptive mood repair mediated the effects of atypical RSA on risk of MDE recurrence.
This study documented that a combination of behavioral and physiological risk factors predicted MDE recurrence in a previously clinically referred sample of adolescents with depression histories. Because mood repair and RSA are malleable, both could be targeted for modification to reduce the risk of recurrent depression in youths.
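The mediation claim (atypical RSA → maladaptive mood repair → recurrence) is conventionally quantified as the product of the two path coefficients, with the direct path estimated alongside. A minimal sketch with hypothetical standardized coefficients, illustrative only:

```python
# Hypothetical standardized path coefficients (not the study's estimates).
a = 0.30          # atypical RSA profile -> maladaptive mood repair
b = 0.40          # maladaptive mood repair -> risk of recurrent MDE
c_prime = 0.05    # direct path: RSA -> recurrence, controlling for mood repair

indirect_effect = a * b            # effect transmitted through mood repair
total_effect = c_prime + a * b     # direct plus mediated effect

print(f"indirect effect a*b = {indirect_effect:.2f}")  # 0.12
print(f"total effect        = {total_effect:.2f}")     # 0.17
```

A mediation pattern like the one reported corresponds to a substantial indirect effect (a*b) with a small direct effect (c'), meaning most of the RSA-recurrence association runs through mood repair.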
Behavioural and cardiac responses of multiparous dairy cows (n=24) during milking in a 2×4 stall herringbone milking system were evaluated in this study. Heart rate (HR), the parasympathetic tone index of heart rate variability (the high frequency component, HF) and the sympathovagal balance indicator LF/HF (the ratio of the low frequency (LF) and HF components) were analysed. Measurement periods were established as follows: (1) standing calm (baseline), (2) udder preparation, (3) milking, (4) waiting after milking in the milking stall and (5) at night (2 h after milking). Step behaviour was recorded and calculated per minute for the three phases of the milking process (udder preparation, milking and waiting after milking). HR was higher during udder preparation and milking compared with baseline (P=0.03 and P=0.027, respectively). HF was significantly lower than baseline levels during waiting in the milking stall after milking (P=0.009); however, it did not differ from baseline during udder preparation, milking or 2 h after milking (P>0.05 in each case). LF/HF during the three phases of the milking process differed neither from baseline levels nor between phases. Steps occurred more often during waiting after milking than during udder preparation (P=0.042) or during milking (P=0.017). Our results suggest that the milking procedure itself was not stressful for these animals. After milking (following removal of the last teat cup and before leaving the milking stall), both decreased parasympathetic tone (lower HF) and an increased stepping rate indicated that this phase is a sensitive period for the animals.
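The spectral indices used here come from standard heart rate variability analysis: HF is the power in the 0.15–0.4 Hz band, LF the power in 0.04–0.15 Hz, and LF/HF their ratio. A minimal sketch, assuming a power spectral density of the RR-interval series has already been estimated on a frequency grid (all values are illustrative):

```python
# Illustrative PSD of an RR-interval series: (frequency Hz, power ms^2/Hz) pairs.
psd = [(0.02, 400.0), (0.06, 900.0), (0.10, 700.0), (0.14, 500.0),
       (0.20, 600.0), (0.28, 800.0), (0.36, 300.0)]
df = 0.04  # frequency resolution of the grid (Hz)

# Standard HRV bands: LF = 0.04-0.15 Hz, HF = 0.15-0.40 Hz.
lf = sum(p for f, p in psd if 0.04 <= f < 0.15) * df
hf = sum(p for f, p in psd if 0.15 <= f < 0.40) * df

lf_hf_ratio = lf / hf  # sympathovagal balance indicator
print(f"LF = {lf:.1f} ms^2, HF = {hf:.1f} ms^2, LF/HF = {lf_hf_ratio:.2f}")
```

Under this reading, a drop in HF (as observed while the cows waited after milking) lowers parasympathetic tone and, other things equal, raises LF/HF.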