A cohort study was performed from January 2014 to December 2016 in a Brazilian neonatal intensive care unit, including neonates at high risk of infection and death. We estimated bloodstream infection (BSI) incidence and conducted a survival analysis, considering the time to death and to the first episode of BSI as outcomes, comparing very low birth weight (VLBW) neonates with the remaining neonates. An extended Cox model was fitted and the hazard ratio (HR) was calculated for different time periods. The study included 1560 neonates; the incidence and incidence density of BSI were 22% and 18.6 per 1000 central venous catheter-days, respectively. Considering VLBW neonates as the reference group, the HR for time to death was 4.06 (95% CI 2.75–6.00, P < 0.01) from day 0 to 60, and for time to the first episode of BSI it was 1.76 (95% CI 1.31–2.36, P < 0.01) from day 0 to 36. With the heavier neonates group as reference, the HR for time to the first episode of BSI was 2.94 (95% CI 1.92–4.34, P < 0.01) from day 37 to 90. Late-onset neonatal sepsis prevention measures should consider how risk varies over time according to neonates' birth weight.
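The incidence density quoted above (BSIs per 1000 central venous catheter-days) is a simple rate. As a minimal sketch, with hypothetical counts chosen only to reproduce the reported figure:

```python
def incidence_density(n_events, device_days, per=1000):
    """Events per `per` device-days (here, BSIs per 1000 CVC-days)."""
    return n_events / device_days * per

# Hypothetical counts (not from the study) chosen so the result matches
# the reported rate of 18.6 BSIs per 1000 central venous catheter-days.
rate = incidence_density(465, 25_000)
print(round(rate, 1))  # 18.6
```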
Natalizumab is an efficacious disease-modifying therapy (DMT) for relapsing-remitting multiple sclerosis (RRMS), but its use is often limited by the risk of progressive multifocal leukoencephalopathy. We describe the clinical course of RRMS patients switched from natalizumab to another DMT. We identified all RRMS patients treated with natalizumab for ≥3 months who were JC virus antibody-positive and switched to another DMT. Overall, 84 individuals switched DMT, with 57 (68%) beginning fingolimod. On fingolimod, survival without a relapse was 74% (55.8–85.6%) at 36 months and survival without disease progression was 78% (62.6–87.6%) at 36 months. In conclusion, fingolimod is an effective therapy post-natalizumab.
Before an intervention is publicly funded within the United Kingdom, its cost-effectiveness is assessed by the National Institute for Health and Care Excellence (NICE). The efficacy of an intervention across patients' lifetimes often has a substantial influence on the cost-effectiveness analysis but is associated with large uncertainties. We reviewed committee documents containing company submissions and evidence review group (ERG) reports to establish the methods used when extrapolating survival data, whether these adhered to NICE Technical Support Document (TSD) 14, and how uncertainty was addressed.
A systematic search was completed on the NHS Evidence Search webpage limited to single technology appraisals of cancer interventions published in 2017, with information obtained from the NICE Web site.
Twenty-eight appraisals were identified, covering twenty-two interventions across eighteen diseases. Every economic model used parametric curves to model survival. All submissions used goodness-of-fit statistics and the plausibility of extrapolations when selecting a parametric curve. Twenty-five submissions considered alternative parametric curves in scenario analyses. Six submissions reported including the parameters of the survival curves in the probabilistic sensitivity analysis. ERGs agreed with the company's choice of parametric curve in nine appraisals, and with all major survival-related assumptions in two appraisals.
TSD 14 on survival extrapolation was followed in all appraisals. Despite this, the choice of parametric curve remains subjective, and recent developments in Bayesian approaches to extrapolation have yet to be implemented. More precise guidance on the selection of curves and the modelling of uncertainty may reduce subjectivity, accelerating the appraisal process.
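The TSD 14 workflow described above (fit candidate parametric curves by maximum likelihood, compare goodness-of-fit, extrapolate beyond follow-up) can be sketched minimally. The data below are simulated and hypothetical, not from any appraisal; only two of TSD 14's candidate curves (exponential and Weibull) are compared, via AIC:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated trial: true Weibull(shape=1.5, scale=10) event times,
# administratively right-censored at t=12 (all values hypothetical).
t_true = rng.weibull(1.5, 500) * 10.0
event = (t_true <= 12.0).astype(float)   # 1 = event observed, 0 = censored
time = np.minimum(t_true, 12.0)

def weibull_negll(params):
    k, lam = np.exp(params)  # log-parameterisation keeps k, lam > 0
    logh = np.log(k / lam) + (k - 1) * np.log(time / lam)  # log hazard
    cumh = (time / lam) ** k                               # cumulative hazard
    return -(event * logh - cumh).sum()

fit = minimize(weibull_negll, x0=[0.0, np.log(time.mean())])
k_hat, lam_hat = np.exp(fit.x)

# The exponential model is the Weibull with k = 1; its MLE is closed form.
rate = event.sum() / time.sum()
exp_negll = -(event * np.log(rate) - rate * time).sum()

aic_weibull = 2 * 2 + 2 * fit.fun
aic_exp = 2 * 1 + 2 * exp_negll

# Extrapolated survival beyond the observed follow-up, e.g. at t = 24:
s24 = np.exp(-(24.0 / lam_hat) ** k_hat)
```

Here the Weibull should be preferred on AIC because the data were generated with shape 1.5; in a real appraisal the plausibility of the extrapolated tail matters as much as the fit statistics.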
Cow routines and behavioral responses are altered substantially following the installation of robotic milking systems. The present study was designed to analyze the effect that switching from a milking parlor to an automatic milking system (AMS) had on the culling rate (due to various causes) of dairy cattle. For this purpose, culling records and causes for culling were tracked in 23 dairy farms in the Galicia region (NW Spain). The animals in these farms were monitored for 5 years. For the present study, that length of time was divided into three stages: the 2 years before switching from a milking parlor to AMS (stage 1), the first year following the implementation of AMS (stage 2), and the second and third years after implementation (stage 3). Cox models for survival analysis were used to estimate the time to culling due to different causes during stage 1 in relation to stages 2 and 3. The data indicated that the risk of loss due to death or emergency slaughter decreased significantly following the installation of AMS. In contrast, the risk of culling due to low production, udder problems, infertility or lameness increased significantly. Low-production cows (such as cows in advanced lactation due to infertility) or sick cows (such as mastitic or lame cows) presumably have a noticeable effect on both the performance and the amortization of the cost of AMS, which in turn would lead to a higher probability of culling than in conventional systems.
Hepatitis E virus genotype 1 (HEV G1) is an important cause of morbidity and mortality in Africa and Asia. HEV G1's natural history, including the incubation period, remains poorly understood, hindering surveillance efforts and effective control. Using individual-level data from 85 travel-related HEV G1 cases in England and Wales, we estimate the incubation period distribution using survival analysis methods, which allow for appropriate inference when only time ranges, rather than exact times are known for the exposure to HEV and symptom onset. We estimated a 29.8-day (95% confidence interval (CI) 24.1–36.0) median incubation period with 5% of people expected to develop symptoms within 14.3 days (95% CI 10.1–21.7) and 95% within 61.9 days (95% CI 47.4–74.4) of exposure. These estimates can help refine clinical case definitions and inform the design of disease burden and intervention studies.
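The approach described above, in which only time windows for exposure and onset are known, is in essence interval-censored maximum likelihood. The paper's exact model is not reproduced here; as a hedged sketch, a lognormal incubation distribution is fitted to simulated interval data (all numbers hypothetical):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(1)
n = 200

# Hypothetical incubation times from a lognormal with median 30 days;
# only an interval [L, R] containing each true time is observed,
# mimicking exposure/onset windows rather than exact dates.
true_t = np.exp(rng.normal(np.log(30.0), 0.4, n))
width = rng.uniform(2.0, 14.0, n)
L = np.maximum(true_t - rng.uniform(0.0, 1.0, n) * width, 0.01)
R = L + width  # each interval still contains the true time

def negll(params):
    s, scale = np.exp(params)  # log-parameterisation keeps both positive
    # Interval-censored likelihood: P(L < T <= R) = F(R) - F(L)
    p = lognorm.cdf(R, s, scale=scale) - lognorm.cdf(L, s, scale=scale)
    return -np.log(np.clip(p, 1e-12, None)).sum()

fit = minimize(negll, x0=[np.log(0.5), np.log(20.0)])
s_hat, scale_hat = np.exp(fit.x)

median = scale_hat  # the lognormal median equals its scale parameter
p95 = lognorm.ppf(0.95, s_hat, scale=scale_hat)  # 95% of cases symptomatic by this day
```

The same likelihood structure generalises to other distributions (gamma, Weibull) and underlies nonparametric alternatives such as the Turnbull estimator.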
This article examines the survival rates of cooperatives in the French wine industry. Traditional theory claims that cooperatives are inefficient and consequently are prone to failure, but recent literature suggests a higher resilience. Can cooperatives cope better? We find that French wine cooperatives survive longer than corporations. This result is robust in semi-parametric and parametric models, even when we control for mergers and acquisitions. The higher survival rate of wine cooperatives seems to be associated with their ability to pass changes in their business environments on to their members. (JEL Classifications: C41, G30, Q13)
Obesity is a major risk factor for osteoarthritis (OA) whilst there is some evidence that diabetes also increases risk. Metformin is a common oral treatment for those with diabetes.
The aim was to investigate whether metformin reduces the risk of OA.
This was a cohort study set within the Consultations in Primary Care Archive, comprising 3217 patients with type 2 diabetes. Patients at 13 general practices with recorded type 2 diabetes in the baseline period (2002–2003) and no prior record of OA were identified. Exposure was a prescription for metformin. The outcome was an OA record during follow-up. Cox proportional hazards models with a Gamma frailty term were fitted, adjusted for age, gender, deprivation, and comorbidity.
There was no association between prescribed metformin treatment at baseline and OA (adjusted HR: 1.02, 95% CI: 0.91, 1.15). A similar non-significant association was found when allowing the exposure status of metformin prescription to vary over time.
The aim of the present study was to evaluate the prediction ability of models that treat longevity phenotypes as uncensored or censored in Nellore cattle. Longevity was defined as the difference between the date of the last weaned calf and the cow's birth date. Information was available for 77 353 females: 61 097 cows with uncensored phenotypic records and 16 256 cows with censored records. These data were analyzed using three different models: (1) a Gaussian linear model (LM), in which only uncensored records were considered; and two models that consider both uncensored and censored records: (2) a censored Gaussian linear model (CLM); and (3) a Weibull frailty hazard model (WM). For the comparison of model prediction ability, the data set was randomly divided into training and validation sets, containing 80% and 20% of the records, respectively. Ten repetitions were considered, applying the following restrictions: (a) at least three animals per contemporary group in the training set; and (b) sires with more than 10 progenies with uncensored records (352 sires) should have daughters in both the training and validation sets. The variance components estimated using the whole data set in each model were used as true values in the prediction of breeding values of the animals in the training set. The WM showed the best prediction ability, providing the lowest average χ2 and the highest number of sets in which a model had the smallest χ2 statistic. The CLM and LM showed prediction abilities 2.6% and 3.7% less efficient than WM, respectively. In addition, the accuracies of sire breeding values for LM and CLM were lower than those obtained for WM.
The percentages of bulls in common, considering only the 10% of sires with the highest breeding values, were around 75% for LM–CLM and 54% for LM–WM when all sires were considered, and 75% for both LM–CLM and LM–WM when only sires with more than 10 progenies with uncensored records were taken into account. These results indicate reranking of animals in terms of genetic merit between LM, CLM and WM. The model that excluded censored longevity records from the analysis showed the lowest prediction ability. The WM provides the best predictive performance; therefore, this model would be recommended for the genetic evaluation of longevity in this population.
This article is concerned with explaining why peace endures in countries that have experienced a civil armed conflict. We use a mixed methods approach by evaluating six case studies (Burundi, East Timor, El Salvador, Liberia, Nepal, Sierra Leone) and survival analysis that allows us to consider 205 peace episodes since 1990. We find that it is difficult to explain why peace endures using statistical analysis but there is some indication that conflict termination is important in post-conflict stabilisation: negotiated settlements are more likely to break down than military victories. We also consider the impact of UN peacekeeping operations on the duration of peace but find little evidence of their contribution. However, in situations where UN peacekeeping operations are deployed in support of negotiated settlements they do seem to contribute to peace stabilisation.
Objectives: Treatment switching refers to the situation in a randomized controlled trial where patients switch from their randomly assigned treatment onto an alternative. Often, switching is from the control group onto the experimental treatment. In this instance, a standard intention-to-treat analysis does not identify the true comparative effectiveness of the treatments under investigation. We aim to describe statistical methods for adjusting for treatment switching in a comprehensible way for nonstatisticians, and to summarize views on these methods expressed by stakeholders at the 2014 Adelaide International Workshop on Treatment Switching in Clinical Trials.
Methods: We describe three statistical methods used to adjust for treatment switching: marginal structural models, two-stage adjustment, and rank preserving structural failure time models. We draw upon discussion heard at the Adelaide International Workshop to explore the views of stakeholders on the acceptability of these methods.
Results: Stakeholders noted that adjustment methods are based on assumptions, the validity of which may often be questionable. There was disagreement on the acceptability of adjustment methods, but consensus that when these are used, they should be justified rigorously. The utility of adjustment methods depends upon the decision being made and the processes used by the decision-maker.
Conclusions: Treatment switching makes estimating the true comparative effect of a new treatment challenging, yet many decision-makers have reservations about adjustment methods. These reservations, and how they affect the utility of adjustment methods, require further exploration. Further technical work is required to develop adjustment methods that meet real-world needs and to enhance their acceptability to decision-makers.
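Of the three methods described, the rank preserving structural failure time model (RPSFTM) is perhaps the simplest to sketch: observed time is split into time on and off treatment, and g-estimation searches for the log acceleration factor ψ that balances counterfactual untreated times across the randomised arms. A toy illustration with simulated data and no censoring (all numbers hypothetical; a real implementation must handle recensoring and use a proper rank test rather than a difference in means):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
psi_true = -0.7  # treatment multiplies remaining lifetime by e^{-psi} ≈ 2

u = rng.exponential(10.0, n)   # counterfactual untreated survival times
arm = rng.integers(0, 2, n)    # 1 = experimental, 0 = control
t_switch = u / 2.0             # controls switch halfway through their untreated time

t_obs = np.where(
    arm == 1,
    u * np.exp(-psi_true),                          # treated from randomisation
    t_switch + (u - t_switch) * np.exp(-psi_true),  # control, switches mid-way
)
t_on = np.where(arm == 1, t_obs, t_obs - t_switch)  # time spent on treatment
t_off = t_obs - t_on

def z(psi):
    """Between-arm difference in mean counterfactual untreated time.

    G-estimation picks the psi that zeroes this statistic: at the true psi,
    randomisation makes the untreated-time distributions identical by arm."""
    u_hat = t_off + t_on * np.exp(psi)
    return u_hat[arm == 1].mean() - u_hat[arm == 0].mean()

# z(psi) is monotone in psi here, so a grid search finds the unique root.
grid = np.linspace(-2.0, 2.0, 4001)
psi_hat = grid[np.argmin(np.abs([z(p) for p in grid]))]
```

The recovered ψ̂ should sit close to the true value of −0.7, illustrating why the method's validity rests on the randomisation assumption and the common-treatment-effect assumption that stakeholders questioned.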
Seed mass is an important plant functional trait linked to germination. For instance, higher-mass seeds often display greater germination than lower-mass seeds when exposed to non-stressful conditions. Yet knowledge of germination dynamics for different mass-based seed fractions following exposure to abiotic stress is lacking. Here, we assess the germination response of relatively fresh, mass-separated Rudbeckia mollis (Asteraceae) seeds to various simulated seasonal temperatures, supra-optimal temperatures and increasing ageing stress duration. Air density separation yielded three mass-based classes: light (393 ± 35 μg), intermediate (423 ± 29 μg) and heavy (474 ± 38 μg). Water uptake kinetics indicated that imbibition (0–6 h) and germination lag (6–24 h) were independent of seed mass. Similarly, germination and viability loss of fresh seeds following exposure to seasonal and supra-optimal constant temperatures were independent of mass. However, seed mass influenced germination following increasing ageing stress, with light seeds germinating to a significantly greater extent than intermediate or heavy seeds. For example, final germination per cent in light-class seeds was about 1.7 times greater than in intermediate or heavy seeds after 20 d of saturated salt accelerated ageing (SSAA). Seeds stored for 1 year in the laboratory displayed mass-dependent germination patterns similar to seeds following SSAA. Mass-independent germination responses may be a strategy to maintain an annual life history in otherwise difficult environments while R. mollis seeds are relatively fresh. However, differences in germination response between aged and unaged seeds suggest that mass-dependent viability loss may occur in R. mollis.
Invasive pneumococcal disease (IPD), caused by infection with Streptococcus pneumoniae, has a substantial global burden. There are over 90 known serotypes of S. pneumoniae, with a considerable body of evidence supporting serotype-specific mortality rates immediately following IPD. This is the first study to consider the association between serotype and longer-term mortality following IPD. Using enhanced surveillance data from the North East of England, we assessed both the short-term (30-day) and longer-term (≤7 years) independent adjusted associations between individual serotypes and mortality following IPD diagnosis using logistic regression and extended Cox proportional hazards models. Of the 1316 cases included in the analysis, 243 [18.5%, 95% confidence interval (CI) 16.4–20.7] died within 30 days of diagnosis. Four serotypes (3, 6A, 9N, 19F) were significantly associated with overall increased 30-day mortality. Effects were observable only for older adults (≥60 years). After extension of the window to 12 months and 36 months, one serotype (19F) was associated with significantly increased mortality at 12 months, but no individual serotypes were associated with increased mortality at 36 months. Two serotypes had statistically significant hazard ratios (HR) for longer-term mortality: serotype 1 for reduced mortality (HR 0.51, 95% CI 0.30–0.86) and serotype 9N for increased mortality (HR 2.30, 95% CI 1.29–4.37). The association with serotype 9N was no longer observed after limiting the survival analysis to an observation period starting 30 days after diagnosis. This study supports the evidence for associations between serotype and short-term (30-day) mortality following IPD and provides the first evidence of statistically significant associations between individual serotypes and longer-term variation in mortality following IPD.
Timely recognition and treatment of mental disorders with an onset in childhood and adolescence is paramount, as these are characterized by greater severity and longer persistence than disorders with an onset in adulthood. Studies examining time-to-treatment, also referred to as treatment delay, duration of untreated illness or latency to treatment, and defined as the time between disorder onset and initial treatment contact, are sparse and all based on adult samples. The aim of this study was to describe time-to-treatment and its correlates for any health care professional (any care) and secondary mental health care (secondary care), for a broad range of mental disorders, in adolescents.
Data from the Dutch community-based cohort study TRacking Adolescents’ Individual Lives Survey (TRAILS; N = 2230) were used. The Composite International Diagnostic Interview (CIDI) was administered to assess DSM-IV disorders, the age of onset, and the age of initial treatment contact with any health care professional in 1584 adolescents aged 18–20 years. In total, 43% of the adolescents (n = 675) were diagnosed with a lifetime DSM-IV disorder. The age of initial treatment contact with secondary care was based on administrative records from 321 adolescents without a disorder onset before the age of 10. Descriptive statistics, cumulative lifetime probability plots, and Cox regression analyses were used to analyze time-to-treatment.
The proportion of adolescents who reported lifetime treatment contact with any care varied from 15% for alcohol dependence to 82% for dysthymia. Regarding secondary care, proportions of lifetime treatment contact were lower for mood disorders and higher for substance dependence. Time-to-treatment for any care varied considerably between and within diagnostic classes. The probability of lifetime treatment contact for mood disorders was above 90%, whereas for other mental disorders this was substantially lower. An earlier age of onset predicted a longer, and the presence of a co-morbid mood disorder predicted a shorter time-to-treatment in general. Disorder severity predicted a shorter time-to-treatment for any care, but not for secondary care. Time-to-treatment for secondary care was shorter for adolescents from low and middle socioeconomic background than for adolescents from a high socioeconomic background.
Although the time-to-treatment was shorter for adolescents than for adults, it was still substantial, and the overall patterns were remarkably similar to those found in adults. Efforts to reduce time-to-treatment should therefore be aimed at children and adolescents. Future research should address mechanisms underlying time-to-treatment and its consequences for early-onset disorders in particular.
Merkel cell carcinoma is a rare, aggressive cutaneous neuroendocrine malignancy. This study investigated whether patients with Merkel cell carcinoma of the head and neck had poorer outcomes than patients with Merkel cell carcinoma located elsewhere.
A retrospective study was performed of patients with Merkel cell carcinoma treated at the Jewish General Hospital in Montréal, Canada, from 1993 to 2013. Associations between clinicopathological characteristics and disease-free and disease-specific survival rates were examined according to the Kaplan–Meier method.
Twenty-seven patients were identified. Although basic clinicopathological characteristics and treatments were similar between head and neck and non-head and neck Merkel cell carcinoma groups, disease-free and disease-specific survival rates were significantly lower in the head and neck Merkel cell carcinoma group (log-rank test; p = 0.043 and p = 0.001, respectively). Mortality was mainly due to distant metastasis.
Patients with head and neck Merkel cell carcinoma had poorer survival rates than patients with non-head and neck Merkel cell carcinoma in our study. The tendency to obtain close margins, a less predictable metastatic pattern, and/or intrinsic tumour factors related to the head and neck may explain this discrepancy.
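The Kaplan–Meier method used in the survival comparisons above has a compact form: at each distinct event time t, the survival estimate is multiplied by (1 − d_t/n_t), where d_t is the number of events at t and n_t the number still at risk. A minimal sketch (follow-up times below are illustrative only, not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up time for each patient
    events : 1 if the event (e.g. recurrence or death) occurred, 0 if censored
    Returns a list of (time, S(t)) steps at each event time.
    """
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)    # events at time t
        n_t = sum(1 for tt, _ in data if tt >= t)  # number still at risk at t
        if d > 0:
            s *= 1 - d / n_t
            curve.append((t, s))
        while i < len(data) and data[i][0] == t:   # skip past this time point
            i += 1
    return curve

# Worked example: hypothetical follow-up times in months (0 = censored).
curve = kaplan_meier([3, 5, 5, 8, 12, 16], [1, 1, 0, 1, 0, 1])
# Steps: S(3)=5/6, S(5)=2/3, S(8)=4/9, S(16)=0
```

Comparing two such curves, as done above for head and neck versus other sites, is then the job of the log-rank test, which weighs observed against expected events at each step.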
There is limited information available regarding the benefits and outcomes of resection of pulmonary metastases arising from head and neck cancers.
A retrospective review was performed of 21 patients who underwent resection of pulmonary metastases of primary head and neck malignancies at Hamamatsu University Hospital. Clinical staging, treatment methods, pathological subtype (particularly squamous cell carcinoma), disease-free interval and overall survival were evaluated.
The 5- and 10-year overall survival rates of the study participants were 67.0 per cent and 55.0 per cent, respectively, as determined by the Kaplan–Meier method. The prognosis for patients with a disease-free interval of less than 24 months was poorer than for those with a disease-free interval of greater than 24 months (p = 0.0234).
Patients with short disease-free intervals, and possibly those who are older than 60 years, should be categorised as having severe disease. However, pulmonary metastases from head and neck malignancies are potentially curable by surgical resection.
Fracking is a controversial practice but is thriving in many areas. We combine a comprehensive data set on local bans and moratoria in the state of New York with local-level census data and spatial characteristics in a spatial econometric analysis of local fracking policies. Some factors, including location in the Utica shale, proportion of registered Democrats, and education level, increase the probability of restrictions on fracking. Extent of local land development, location in highly productive petroleum areas, and number of extant oil and gas wells are among factors that have a negative impact on the likelihood of a ban or moratorium.
Depression is known to run in families, but the effects of parental history of other psychiatric diagnoses on depression rates are less well studied. Few studies have examined the impact of parental psychopathology on depression rates in older age groups.
We established a population-based cohort including all individuals born in Denmark after 1954 and alive on their 10th birthday (N = 2 976 264). Exposure variables were maternal and paternal history of schizophrenia, bipolar disorder, depression, anxiety or ‘other’ psychiatric diagnoses. Incidence rate ratios (IRRs) were estimated using Poisson regression.
Parental history of any psychiatric diagnosis increased incidence rates of outpatient (maternal: IRR 1.88, p < 0.0001; paternal: IRR 1.68, p < 0.0001) and inpatient (maternal: IRR 1.99, p < 0.0001; paternal: IRR 1.83, p < 0.0001) depression relative to no parental history. IRRs for parental history of non-affective disorders remained relatively stable across age groups, while IRRs for parental affective disorders (unipolar or bipolar) decreased with age from 2.29–3.96 in the youngest age group to 1.53–1.90 in the oldest group. IRR estimates for all parental diagnoses were similar among individuals aged ≥41 years (IRR range 1.51–1.90).
Parental history of any psychiatric diagnosis is associated with increased incidence rates of unipolar depression. In younger age groups, parental history of affective diagnoses is more strongly associated with rates of unipolar depression than non-affective diagnoses; however, this distinction disappears after age 40, suggesting that parental psychopathology in general, rather than any one disorder, confers risk for depression in middle life.
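The IRRs above come from Poisson regression, but the underlying quantity is simply a ratio of incidence rates. A minimal sketch with a Wald confidence interval on the log scale (the counts and person-years below are hypothetical, not the study's):

```python
import math

def irr_ci(d_exposed, py_exposed, d_unexposed, py_unexposed, z=1.96):
    """Incidence rate ratio with a Wald 95% CI on the log scale."""
    irr = (d_exposed / py_exposed) / (d_unexposed / py_unexposed)
    se = math.sqrt(1 / d_exposed + 1 / d_unexposed)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical example: 150 depression cases over 40 000 person-years among
# those with a parental history vs 400 cases over 200 000 person-years without.
irr, lo, hi = irr_ci(150, 40_000, 400, 200_000)
```

Poisson regression generalises this two-group calculation, allowing adjustment for age group and other covariates as in the analysis above.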
Background: Recurrence after intracranial aneurysm coiling is a highly prevalent outcome that is yet to be fully understood. We investigated clinical, radiological and procedural factors associated with major recurrence of coiled intracranial aneurysms.
Methods: We retrospectively analyzed prospectively collected coiling data (2003–12). We recorded the characteristics of aneurysms, patients and interventional techniques, and pre-discharge and follow-up angiographic occlusion. The Raymond-Roy classification was used; major recurrence was defined as a change from class I or II to class III, an increase in a class III remnant, or any recurrence requiring retreatment. Risk factors associated with major recurrence were identified using a univariate Cox proportional hazards model followed by multivariate regression analysis of covariates with P < 0.1.
Results: A total of 467 aneurysms were treated in 435 patients: 283 (65%) harbored acutely ruptured aneurysms, 44 (10.1%) patients died before discharge and 33 (7.6%) were lost to follow-up. A total of 1367 angiographic follow-up studies (range: 1–108 months; median [interquartile range (IQR)]: 37 [14–62]) were performed in 384 (82.2%) aneurysms. Major recurrence occurred in 98 (21%) aneurysms after a median of 6 (IQR 3.5–22.5) months. Multivariate analysis (358 patients with 384 aneurysms) revealed the following risk factors for major recurrence: age > 65 years (hazard ratio (HR): 1.61; P = 0.04), male sex (HR: 2.13; P < 0.01), hypercholesterolemia (HR: 1.65; P = 0.03), neck size ≥ 4 mm (HR: 1.79; P = 0.01), dome size ≥ 7 mm (HR: 2.44; P < 0.01), non-stent-assisted coiling (HR: 2.87; P = 0.01), and baseline class III (HR: 2.18; P < 0.01).
Conclusion: Approximately one fifth of the coiled intracranial aneurysms showed major recurrence. Modifiable factors for major recurrence were the choice of a stent-assisted technique and confirmation of adequate baseline occlusion (class I/II) in the first coiling procedure.
The usefulness of the age-adjusted Charlson Comorbidity Index (ACCI) as a gauge of the impact of comorbidity on survival is known in the geriatric population. In palliative care, there is little research studying the correlation between comorbidity and survival in the advanced stages of oncological disease. The aim of our study was to explore the impact of comorbidity, measured with the ACCI, on survival in our patients. Our hypothesis was that higher ACCI scores would be associated with lower survival rates after the first visit.
We conducted a prospective observational study over one year. Patients were attended by palliative home care teams. The main variables were: survival from metastatic disease after the first visit and ACCI score on the first visit. We also employed a descriptive analysis and a Kaplan–Meier survival analysis, including different ranges of ACCI scores.
The final sample included 66 subjects. The typical patient was a 76-year-old man with lung cancer who had received chemotherapy. The overall average ACCI score was 10.45. Significant differences were found between the different locations of metastatic disease (greater survival in breast, ovary, and prostate cancer; p = 0.005) and some treatments (hormone therapy and radiotherapy; p = 0.001 for each), but not in survival from the first visit. We found lower survival among lung cancer patients with higher comorbidity (ACCI ≥ 11, p = 0.047), with no differences for other primary locations or overall values.
Significance of results:
The data show that comorbidity measured by the ACCI may be an interesting prognostic factor during the late stages of disease, as we have found in lung cancer. More research is certainly needed.