This chapter provides an overview of the care and evaluation of patients undergoing laryngotracheal reconstruction. The authors present the evaluation process, grading, and considerations surrounding patients with subglottic stenosis. The anesthetic considerations for these procedures, including the postoperative transfer of care to the critical care team, are presented.
The present study examines characteristics of those who benefited from a dietary Fe intervention comprising salt double-fortified with iodine and Fe (DFS).
Data from a randomized controlled trial were analysed to identify predictors of improved Fe status and of resolution of Fe deficiency (serum ferritin (sFt) < 12 μg/l) and low body Fe (BI < 0·0 mg/kg), using non-parametric estimations and binomial regression models.
A tea estate in West Bengal, India.
Female tea pluckers, aged 18–55 years.
Consuming DFS significantly (P = 0·01) predicted resolution of Fe deficiency (relative risk (RR) = 2·31) and of low BI (RR = 2·78) compared with consuming iodized salt. Baseline sFt (β = –0·32 (se 0·03), P < 0·001) and treatment group (β = 0·13 (se 0·03), P < 0·001) significantly predicted change in sFt. The interaction of baseline BI with treatment group (β = –0·11 (se 0·06), P = 0·08) predicted the change in BI. DFS did not significantly predict change in Hb and marginally predicted resolution of anaemia (Hb < 120 g/l).
Baseline Fe status, as assessed by sFt and BI, and consumption of DFS predict change in Fe status and resolution of Fe deficiency and low BI. Anaemia prevalence and Hb level, although simple and inexpensive to measure, may not be adequate to predict resolution of Fe deficiency in response to an intervention of DFS in similar populations with high prevalence of Fe deficiency and multiple nutritional causes of anaemia. These findings will guide appropriate targeting of future interventions.
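A relative risk of the kind reported above can be computed directly from a 2 × 2 table of outcome by treatment group. The sketch below is a minimal illustration with hypothetical counts (not the trial's data), using a Wald confidence interval on the log scale:

```python
import numpy as np

# Hypothetical 2x2 table (not the trial's data):
# rows = treatment group, columns = outcome (Fe deficiency resolved yes/no)
resolved_dfs, not_resolved_dfs = 55, 45      # DFS group
resolved_is,  not_resolved_is  = 25, 75      # iodised-salt group

risk_dfs = resolved_dfs / (resolved_dfs + not_resolved_dfs)
risk_is  = resolved_is  / (resolved_is  + not_resolved_is)
rr = risk_dfs / risk_is                      # relative risk

# Wald 95% CI for the RR on the log scale
se_log_rr = np.sqrt(1/resolved_dfs - 1/(resolved_dfs + not_resolved_dfs)
                    + 1/resolved_is - 1/(resolved_is + not_resolved_is))
ci = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```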
This analysis was conducted to evaluate the evidence of the efficacy of iron biofortification interventions on iron status and functional outcomes. Iron deficiency is a major public health problem worldwide, with a disproportionate impact on women and young children, particularly those living in resource-limited settings. Biofortification, or the enhancing of micronutrient content in staple crops, is a promising and sustainable agriculture-based approach to improve nutritional status. Previous randomised efficacy trials and meta-analyses have demonstrated that iron-biofortification interventions improved iron biomarkers; however, no systematic reviews to date have examined the efficacy of biofortification interventions on health outcomes. We conducted a systematic review of the efficacy of iron-biofortified staple crops on iron status and functional outcomes: cognitive function (e.g. attention, memory) and physical performance. Five studies from three randomised efficacy trials (i.e. rice, pearl millet, beans) conducted in the Philippines, India and Rwanda were identified for inclusion in this review. Iron status (Hb, serum ferritin, soluble transferrin receptor, total body iron, α-1-acid glycoprotein) was measured at baseline and endline in each trial; two studies reported cognitive outcomes, and no studies reported other functional outcomes. Meta-analyses were conducted using DerSimonian and Laird random-effects methods. Iron-biofortified crop interventions significantly improved cognitive performance in attention and memory domains, compared with conventional crops. There were no significant effects on categorical outcomes such as iron deficiency or anaemia. Further studies are needed to determine the efficacy of iron-biofortified staple crops on human health, including additional functional outcomes and other high-risk populations.
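The DerSimonian–Laird random-effects method mentioned above pools study-level effect sizes with weights that incorporate between-study heterogeneity. A minimal sketch, using made-up effect estimates rather than the review's data:

```python
import numpy as np

# Hypothetical study-level effects (e.g. standardised mean differences) and variances
y = np.array([0.10, 0.45, 0.80, 0.30])    # effect estimates
v = np.array([0.02, 0.03, 0.05, 0.04])    # within-study variances

w_fe = 1 / v                              # fixed-effect (inverse-variance) weights
theta_fe = np.sum(w_fe * y) / np.sum(w_fe)

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2
Q = np.sum(w_fe * (y - theta_fe) ** 2)
df = len(y) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)))
i2 = max(0.0, (Q - df) / Q) * 100         # I^2 heterogeneity statistic (%)

# Random-effects pooled estimate and 95% CI
w_re = 1 / (v + tau2)
theta_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
ci = theta_re + np.array([-1.96, 1.96]) * se_re
print(f"pooled effect = {theta_re:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}), "
      f"tau^2 = {tau2:.3f}, I^2 = {i2:.0f}%")
```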
Under the European Union’s Solvency II regulations, insurance firms are required to use a one-year VaR (Value at Risk) approach. This involves a one-year projection of the balance sheet and requires sufficient capital to be solvent in 99.5% of outcomes. The Solvency II Internal Model risk calibrations require annual changes in market indices/term structures to estimate the risk distribution for each of the Internal Model risk drivers. This presents a significant challenge for calibrators in terms of:
Robustness of the calibration that is relevant to the current market regimes and at the same time able to represent the historically observed worst crisis;
Stability of the calibration model year on year with arrival of new information.
The above points need careful consideration to avoid credibility issues with the Solvency Capital Requirement (SCR) calculation, in that the results are subject to high levels of uncertainty.
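As a toy illustration of the 99.5% one-year VaR requirement described above, the capital requirement for a single risk driver is essentially a high percentile of the distribution of annual losses. This is a sketch on simulated data with an arbitrarily assumed normal distribution, not a calibrated Internal Model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distribution of one-year changes in own funds (negative = loss)
annual_changes = rng.normal(loc=0.04, scale=0.12, size=100_000)

# 99.5% VaR: the loss exceeded in only 0.5% of simulated one-year outcomes
var_995 = -np.percentile(annual_changes, 0.5)
print(f"99.5% one-year VaR = {var_995:.1%} of the exposure")
```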
For market risks, common industry practice to compensate for the limited number of historic annual data points is to use overlapping annual changes. Overlapping changes are dependent on each other, and this dependence can cause issues in estimation, statistical testing, and communication of uncertainty levels around risk calibrations.
This paper discusses the issues with the use of overlapping data when producing risk calibrations for an Internal Model. A comparison of the overlapping data approach with the alternative non-overlapping data approach is presented, covering the bias and mean squared error of the first four cumulants under four different statistical models. For some statistical models it is found that overlapping data can be used, with bias corrections, to obtain results as unbiased as those from non-overlapping data but with significantly lower mean squared errors. For more complex statistical models (e.g. GARCH) it is found that published bias corrections for non-overlapping and overlapping datasets do not result in unbiased cumulant estimates and/or lead to increased variance of the process.
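The sketch below illustrates the overlapping-data issue under the simplest setting one might consider (i.i.d. normal monthly returns, an assumption for illustration only): it builds overlapping and non-overlapping annual changes from a monthly series and compares the bias and mean squared error of the estimated annual variance (the second cumulant) by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, mu_m, sigma_m = 25, 0.005, 0.04      # hypothetical monthly return model
true_annual_var = 12 * sigma_m ** 2           # under i.i.d. monthly returns

est_overlap, est_nonoverlap = [], []
for _ in range(5_000):                        # Monte Carlo replications
    monthly = rng.normal(mu_m, sigma_m, size=12 * n_years)
    # Overlapping annual changes: rolling 12-month sums (one per month)
    annual_overlap = np.convolve(monthly, np.ones(12), mode="valid")
    # Non-overlapping annual changes: one per calendar year
    annual_nonoverlap = monthly.reshape(n_years, 12).sum(axis=1)
    est_overlap.append(np.var(annual_overlap, ddof=1))
    est_nonoverlap.append(np.var(annual_nonoverlap, ddof=1))

for name, est in [("overlapping", est_overlap), ("non-overlapping", est_nonoverlap)]:
    est = np.asarray(est)
    bias = est.mean() - true_annual_var
    mse = np.mean((est - true_annual_var) ** 2)
    print(f"{name:>15}: bias = {bias:+.5f}, MSE = {mse:.6f}")
```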
In order to test the goodness of fit of probability distributions to the datasets, it is common to use statistical tests. Most of these tests are not valid when applied to overlapping data, as overlapping observations breach the independence assumption underlying them. We present and test an adjustment to one of these tests (the Kolmogorov–Smirnov goodness-of-fit test) to allow for overlapping data.
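To see why the standard tests break down, one can check the size of a naive Kolmogorov–Smirnov test applied to overlapping annual changes when the null model is in fact true: because the observations are not independent, the empirical rejection rate drifts well away from the nominal 5%. A minimal simulation sketch under illustrative assumptions only (this is not the adjustment proposed in the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_years, mu_m, sigma_m = 25, 0.005, 0.04          # hypothetical monthly model
mu_a, sigma_a = 12 * mu_m, np.sqrt(12) * sigma_m  # implied annual distribution

rejections, n_sims = 0, 2_000
for _ in range(n_sims):
    monthly = rng.normal(mu_m, sigma_m, size=12 * n_years)
    annual_overlap = np.convolve(monthly, np.ones(12), mode="valid")
    # Naive KS test of the (correct) annual distribution on overlapping data
    _, p = stats.kstest(annual_overlap, "norm", args=(mu_a, sigma_a))
    if p < 0.05:
        rejections += 1

print(f"empirical rejection rate under the null: {rejections / n_sims:.1%} "
      f"(nominal 5%)")
```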
Finally, we explore methods of converting “high”-frequency data (e.g. monthly) to “low”-frequency data (e.g. annual). As an alternative to using overlapping data, a statistical model can be fitted to monthly data and then aggregated over 12 time steps to model annual returns. A number of methods are available for this aggregation; we examine two of the widely used approaches, one of which is sketched below.
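One of the simpler aggregation approaches is to fit a model at the monthly frequency and then simulate it over 12 time steps to obtain the annual distribution. The sketch below does this by Monte Carlo for a hypothetical AR(1) monthly return model; the parameters are illustrative assumptions, not a recommended calibration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical AR(1) model fitted to monthly returns: r_t = c + phi * r_{t-1} + eps_t
c, phi, sigma_eps = 0.004, 0.2, 0.035

n_sims = 50_000
annual_returns = np.empty(n_sims)
for i in range(n_sims):
    r_prev = c / (1 - phi)                 # start at the stationary mean
    total = 0.0
    for _ in range(12):                    # aggregate 12 monthly steps into one year
        r_prev = c + phi * r_prev + rng.normal(0.0, sigma_eps)
        total += r_prev
    annual_returns[i] = total

# Annual risk distribution implied by the monthly model
print(f"mean = {annual_returns.mean():.3f}, sd = {annual_returns.std():.3f}, "
      f"0.5th percentile = {np.percentile(annual_returns, 0.5):.3f}")
```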
We report results of studies relating to the development of an emerging nanostructured molybdenum trioxide (nMoO3)-based biocompatible label-free biosensing platform for breast cancer detection. The structure and morphology of the synthesized nMoO3 nanorods were investigated by XRD, SEM, X-ray photoelectron spectroscopy, and TEM. This biocompatible one-dimensional (1D) nMoO3-based biosensing platform exhibited high sensitivity (0.904 µA mL/ng/cm2), a wide linear detection range (2.5–110 ng/mL), and a lower detection limit of 2.47 ng/mL toward human epidermal growth factor receptor-2 detection. The results obtained using this sensor platform on serum samples of breast cancer patients were validated using ELISA.
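Sensitivity and detection-limit figures of this kind are typically derived from a linear calibration of sensor response against analyte concentration. The sketch below shows one common convention (slope normalised by electrode area; LOD = 3 × standard deviation of the blank divided by the slope) on made-up calibration data; the exact procedure and parameters used in the study may differ.

```python
import numpy as np

# Hypothetical calibration data: HER-2 concentration (ng/mL) vs peak current (µA)
conc = np.array([2.5, 10, 25, 50, 75, 110])
current = np.array([11.2, 13.1, 16.8, 22.9, 29.0, 37.5])
electrode_area_cm2 = 0.25                        # assumed working-electrode area
blank_sd = 0.20                                  # assumed std dev of blank response (µA)

slope, intercept = np.polyfit(conc, current, 1)  # linear calibration fit
sensitivity = slope / electrode_area_cm2         # µA mL/ng/cm^2
lod = 3 * blank_sd / slope                       # detection limit (ng/mL)

print(f"sensitivity = {sensitivity:.3f} µA mL/ng/cm², LOD = {lod:.2f} ng/mL")
```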
To assess the feasibility of non-contrast T2-weighted magnetic resonance imaging as compared to T1-weighted post-contrast magnetic resonance imaging for detecting acoustic neuroma growth.
Adult patients with acoustic neuroma who underwent at least three magnetic resonance imaging scans of the internal auditory canals with and without contrast in the past nine years were identified. T1- and T2-weighted images were reviewed by three neuroradiologists, and tumour size was measured. Accuracy of the measurements on T2-weighted images was defined as a difference of less than or equal to 2 mm from the measurement on T1-weighted images.
A total of 107 magnetic resonance imaging scans of 26 patients were reviewed. Measurements on T2-weighted magnetic resonance imaging scans were 88 per cent accurate. Measurements on T2-weighted images differed from measurements on T1-weighted images by an average of 1.27 mm, or 10.4 per cent of the total size. The specificity of T2-weighted images was 88.2 per cent and the sensitivity was 77.8 per cent.
T2-weighted sequences are fairly accurate for measuring acoustic neuroma size and identifying growth, provided the caveats associated with tumour characteristics and location are kept in mind.
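A minimal sketch of how the agreement and growth-detection metrics above might be computed, using hypothetical paired measurements with T1-weighted imaging taken as the reference standard; the ≥2 mm growth cut-off below is an illustrative assumption, not a definition from the study:

```python
import numpy as np

# Hypothetical tumour sizes (mm) measured on paired scans for the same patients
t1 = np.array([8.0, 12.5, 15.0, 6.0, 20.0, 9.5])   # reference (post-contrast T1)
t2 = np.array([8.5, 11.0, 16.8, 6.4, 22.5, 9.0])   # non-contrast T2

# Accuracy: T2 measurement within 2 mm of the T1 measurement
accuracy = np.mean(np.abs(t2 - t1) <= 2.0)

# Growth detection vs an earlier scan (growth = increase of >= 2 mm, an assumed cut-off)
t1_prev = np.array([7.5, 12.4, 12.0, 5.8, 16.0, 9.4])
t2_prev = np.array([8.0, 11.2, 13.5, 6.1, 21.0, 9.1])
growth_ref = (t1 - t1_prev) >= 2.0                  # reference standard
growth_t2 = (t2 - t2_prev) >= 2.0

sensitivity = np.mean(growth_t2[growth_ref])        # true growth correctly detected
specificity = np.mean(~growth_t2[~growth_ref])      # stable tumours correctly classified
print(f"accuracy = {accuracy:.0%}, sensitivity = {sensitivity:.0%}, "
      f"specificity = {specificity:.0%}")
```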
Mixing matrices quantify how people with similar or different characteristics make contact with each other, creating potential for disease transmission. Little empirical data on mixing patterns among persons who inject drugs (PWID) are available to inform models of blood-borne disease such as HIV and hepatitis C virus. Egocentric drug network data provided by PWID in Baltimore, Maryland between 2005 and 2007 were used to characterise drug equipment-sharing patterns according to age, race and gender. Black PWID and PWID who were single (i.e. no stable sexual partner) self-reported larger equipment-sharing networks than their white and non-single counterparts. We also found evidence of assortative mixing according to age, gender and race, though to a slightly lesser degree in the case of gender. Highly assortative mixing according to race and gender highlights the existence of demographically isolated clusters, for whom generalised treatment interventions may have limited benefits unless targeted directly. These findings provide novel insights into mixing patterns of PWID for which little empirical data are available. The age-specific assortativity we observed is also significant in light of its role as a key driver of transmission for other pathogens such as influenza and tuberculosis.
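A mixing matrix of the kind described above can be tabulated directly from ego–alter pairs, and the degree of assortativity summarised with Newman's assortativity coefficient (r = 1 for perfectly assortative mixing, 0 for proportionate mixing). A minimal sketch on invented pairs, not the Baltimore data:

```python
import numpy as np
import pandas as pd

# Hypothetical ego-alter equipment-sharing pairs labelled by a characteristic (e.g. race)
pairs = pd.DataFrame({
    "ego":   ["black", "black", "white", "white", "black", "white", "black", "white"],
    "alter": ["black", "black", "white", "black", "black", "white", "white", "white"],
})

# Mixing matrix: proportion of all reported contacts in each ego-by-alter cell
e = pd.crosstab(pairs["ego"], pairs["alter"], normalize=True).to_numpy()

# Newman's assortativity coefficient for a discrete characteristic
a, b = e.sum(axis=1), e.sum(axis=0)          # marginal distributions
r = (np.trace(e) - a @ b) / (1 - a @ b)
print(f"assortativity coefficient r = {r:.2f}")
```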
Community-led total sanitation (CLTS) is an intervention that strives to end the practice of open defaecation. This study measured the effectiveness of CLTS in Nyando District by examining the association between community open defaecation-free (ODF) status and childhood diarrhoeal illness. A cross-sectional study design was used among households with children ⩽5 years old to ascertain information on acute diarrhoea in the past year (outcome), sanitation and health behaviours. Water testing was conducted to determine Escherichia coli and turbidity levels for 55 water sources. Data were obtained from 210 parents or caregivers from an ODF community and 216 parents or caregivers in a non-ODF community. Participants in the non-ODF community reported a non-significant 16% increased risk of diarrhoea compared with participants from the ODF community. Children's HIV positivity (adjusted prevalence ratio (aPR) = 2.29; 95% CI 2.07–2.53), unsafe child stool disposal (aPR = 1.92; 95% CI 1.74–2.12) and low household income (aPR = 1.93; 95% CI 1.46–2.56) were associated with diarrhoea in the non-ODF community. A higher proportion of drinking-water sources in the ODF location were contaminated with E. coli than in the non-ODF location (76.7% vs. 60%). Diarrhoeal disease rates in children ⩽5 years old did not differ by whether a latrine intervention was implemented. The water sampling findings suggest that poor water safety may have limited the effectiveness of CLTS in reducing childhood diarrhoea. Improved water treatment practices, safe stool disposal and education may strengthen the CLTS intervention in ODF communities and thereby reduce the risk of childhood diarrhoea.
Calling in staff and preparing the operating room for an urgent surgical procedure is a significant draw on hospital resources and disrupts the care of other patients. It has been common practice to treat open fractures on an urgent basis. Health technology assessment (HTA) methods can be applied to examine this prioritization of care, just as they are applied to the acquisition of drugs and devices.
Our center completed a rapid systematic review of guidelines, systematic reviews, and primary clinical evidence on urgent surgical debridement and stabilization of open fractures of long bones (“urgent” being defined as within six hours of the injury), compared with surgical debridement and reduction performed at a later time point. Meta-analyses were performed for infection and non-union outcomes, and the GRADE system was used to assess the strength of evidence for each conclusion.
We found no published clinical guidelines for the urgency of treating open fractures. A good systematic review on the topic was published in 2012. We found six cohort studies published since completion of the earlier review. The summary odds ratio for any infection in patients with later treatment was 0.97 (95% confidence interval (CI) 0.78–1.22, sixteen studies, 3,615 patients) and for deep or “major” infections was 1.00 (95% CI 0.74–1.34, nine studies, 2,013 patients). The summary odds ratio of non-union with later treatment was 0.95 (95% CI 0.65–1.41, six studies, 1,308 patients). There was no significant heterogeneity in any of the results (I-squared = 0 percent) and no apparent trends in the results as a function of study size or publication date. We graded the strength of each of the conclusions as very low because they were based on cohort studies where the treating physician could elect immediate treatment for patients with severe soft-tissue injuries or patients at risk of complications. This raises the risk of spectrum bias.
Default urgent scheduling of patients with open fractures for surgical debridement and stabilization does not appear to reduce the risk of infection or fracture non-union. Based on this information, our surgery department managers no longer schedule patients with open fractures for immediate surgery unless there are specific circumstances necessitating it.
Predicting recurrent Clostridium difficile infection (rCDI) remains difficult. We employed a retrospective cohort design. Granular electronic medical record (EMR) data had been collected from patients hospitalized at 21 Kaiser Permanente Northern California hospitals. The derivation dataset (2007–2013) included data from 9,386 patients who experienced incident CDI (iCDI) and 1,311 who experienced their first CDI recurrence (rCDI). The validation dataset (2014) included data from 1,865 patients who experienced incident CDI and 144 who experienced rCDI. Using multiple techniques, including machine learning, we evaluated more than 150 potential predictors. Our final analyses evaluated 3 models with varying degrees of complexity and 1 previously published model.
Despite having a large multicenter cohort and access to granular EMR data (e.g. vital signs and laboratory test results), none of the models discriminated well (c statistics, 0.591–0.605), had good calibration, or had good explanatory power.
Our ability to predict rCDI remains limited. Given currently available EMR technology, improvements in prediction will require incorporating new variables because currently available data elements lack adequate explanatory power.
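Discrimination and calibration of a binary risk model of this kind are commonly summarised with the c statistic (ROC AUC) and a comparison of predicted and observed risk. A minimal sketch on simulated data (not the Kaiser Permanente cohort), using scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(4)

# Simulated predictors and a weakly informative recurrence outcome (illustrative only)
n = 5_000
X = rng.normal(size=(n, 5))
logit = -2.0 + 0.3 * X[:, 0] + 0.2 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Split into derivation and validation sets, as in a derivation/validation design
X_dev, X_val, y_dev, y_val = X[:4_000], X[4_000:], y[:4_000], y[4_000:]
model = LogisticRegression().fit(X_dev, y_dev)
pred = model.predict_proba(X_val)[:, 1]

c_statistic = roc_auc_score(y_val, pred)          # discrimination
brier = brier_score_loss(y_val, pred)             # overall accuracy of predicted risks
calibration_in_large = pred.mean() - y_val.mean() # crude calibration-in-the-large
print(f"c = {c_statistic:.3f}, Brier = {brier:.3f}, "
      f"mean predicted - observed = {calibration_in_large:+.4f}")
```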
Patient experience is becoming a central focus of healthcare. A broad range of studies on how to increase patient satisfaction ratings exists; however, they lack the specificity to adequately guide physicians and hospitals on how to improve patient experience. The objective of this study was to define the aspects of patient experience within paediatric cardiology practices that can serve as predictors of excellent patient satisfaction. From 1 January, 2013 to 28 February, 2015 (26 months), outpatients who visited paediatric cardiologists were asked to complete a 39-question patient satisfaction survey about their experience, administered by Press Ganey, an independent provider of patient satisfaction surveys. Participants rated their experience on a 1–5 Likert scale: a score of 1 indicated a “poor” experience, whereas a score of 5 indicated a “very good” experience. This retrospective study of 2468 responses determined that cheerfulness of the practice (r=0.85, p<0.001), a cohesive staff (r=0.83, p<0.001), and a care provider explaining problems and conditions (r=0.81, p<0.001) were key aspects of a paediatric cardiologist’s practice that can be used as predictors of overall patient satisfaction. Awareness of how doctors can personalise a patient’s experience is vital to achieving greater patient satisfaction and, ultimately, better patient outcomes.
There has been a drop in clinical research in India following stringent conditions put in place by the Indian Supreme Court in 2013. The Court's orders came in the wake of irregularities highlighted in the conduct of clinical trials in the country. This paper highlights the steps taken by the Indian regulator, the Central Drugs Standard Control Organisation to comply with these directions. These are of three kinds: strengthening regulatory institutions, protecting participant safety and creating regulatory certainty for sponsors and investigators. Examples include the large-scale training of Ethics Committees, framing detailed guidelines on compensation and audiovisual recording of the informed consent process, as well as reducing the time taken to process applications. It is expected that these measures will inspire confidence for the much-needed resumption of clinical research.
Most research on interventions to counter stigma and discrimination has focused on short-term outcomes and has been conducted in high-income settings. This review aimed to synthesise what is known globally about effective interventions to reduce mental illness-based stigma and discrimination, in relation first to effectiveness in the medium and long term (minimum 4 weeks), and second to interventions in low- and middle-income countries (LMICs).
We searched six databases from 1980 to 2013 and conducted a multi-language Google search for quantitative studies addressing the research questions. Effect sizes were calculated from eligible studies where possible, and narrative syntheses conducted. Subgroup analysis compared interventions with and without social contact.
Eighty studies (n = 422 653) were included in the review. For studies with medium or long-term follow-up (72, of which 21 had calculable effect sizes), median standardised mean differences were 0.54 for knowledge and −0.26 for stigmatising attitudes. Those containing social contact (direct or indirect) were not more effective than those without. The 11 LMIC studies were all from middle-income countries. Effect sizes were rarely calculable for behavioural outcomes or in LMIC studies.
There is modest evidence for the effectiveness of anti-stigma interventions beyond 4 weeks' follow-up in terms of increasing knowledge and reducing stigmatising attitudes. Evidence does not support the view that social contact is the more effective type of intervention for improving attitudes in the medium to long term. Methodologically strong research is needed on which to base decisions on investment in anti-stigma interventions.
The contribution of subsidized food commodities to total food consumption is unknown. We estimated the proportion of individual energy intake from food commodities receiving the largest subsidies from 1995 to 2010 (corn, soyabeans, wheat, rice, sorghum, dairy and livestock).
Integrating information from three federal databases (MyPyramid Equivalents, Food Intakes Converted to Retail Commodities, and What We Eat in America) with data from the 2001–2006 National Health and Nutrition Examination Surveys, we computed a Subsidy Score representing the percentage of total energy intake from subsidized commodities. We examined the score’s distribution and the probability of having a ‘high’ (≥70th percentile) v. ‘low’ (≤30th percentile) score, across the population and subgroups, using multivariate logistic regression.
Community-dwelling adults in the USA.
Participants (n 11 811) aged 18–64 years.
Median Subsidy Score was 56·7 % (interquartile range 47·2–65·4 %). Younger, less-educated, poorer and Mexican American participants had higher scores. After controlling for covariates, age, education and income remained independently associated with the score: compared with individuals aged 55–64 years, individuals aged 18–24 years had a 50 % higher probability of having a high score (P<0·0001). Individuals reporting less than high-school education had a 21 % higher probability of having a high score than individuals reporting college completion or higher (P=0·003); individuals in the lowest tertile of income had an 11 % higher probability of having a high score compared with individuals in the highest tertile (P=0·02).
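A minimal sketch of the kind of computation described above: a per-person Subsidy Score as the share of energy from subsidized commodities, dichotomised at the 70th/30th percentiles and modelled with logistic regression. All variable names and data below are hypothetical, not the NHANES-linked commodity data used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Hypothetical per-person intake data (kcal) - not the NHANES-linked commodity data
n = 2_000
df = pd.DataFrame({
    "energy_subsidized": rng.gamma(shape=5, scale=250, size=n),
    "energy_total": rng.gamma(shape=10, scale=250, size=n),
    "age": rng.integers(18, 65, size=n),
    "income_tertile": rng.integers(1, 4, size=n),
})
df["energy_total"] += df["energy_subsidized"]

# Subsidy Score: percentage of total energy intake from subsidized commodities
df["subsidy_score"] = 100 * df["energy_subsidized"] / df["energy_total"]

# 'High' (>= 70th percentile) vs 'low' (<= 30th percentile) score; middle group dropped
hi, lo = df["subsidy_score"].quantile([0.7, 0.3])
sub = df[(df["subsidy_score"] >= hi) | (df["subsidy_score"] <= lo)].copy()
sub["high_score"] = (sub["subsidy_score"] >= hi).astype(int)

# Probability of a high score as a function of covariates
fit = smf.logit("high_score ~ age + C(income_tertile)", data=sub).fit(disp=0)
print(fit.summary())
```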
Over 50 % of energy in US diets is derived from federally subsidized commodities.
This article describes a review undertaken in 2012–2013 by Nutrition and Dietetics, Flinders University, to assess the Indigenous health curriculum of the Bachelor of Nutrition and Dietetics (BND) and Master of Nutrition and Dietetics (MND). An action research framework was used to guide and inform inquiry. This involved four stages, each of which provided information to reach a final decision about how to progress. First, relevant information was collected to present to stakeholders. This included identification of acknowledged curriculum frameworks, a review of other accredited nutrition and dietetics courses in Australia, a review of Indigenous health topics at Flinders University, including liaison with the Poche Centre for Indigenous Health and Well-Being (Indigenous health teaching and research unit), and a review of BND and MND current curriculum related to Indigenous health. Second, input was sought from stakeholders. This involved a workshop with practising dietitians and nutritionists from South Australia and the Northern Territory and discussions with Flinders University Nutrition and Dietetics academic staff. Third, a new curriculum was developed. Nine areas were identified for this curriculum, including reflexivity, approach and role, history and health status, worldview, beliefs and values, systems and structures, relationship building and communication, food and food choice, appreciating and understanding diversity, and nutrition issues and health status. Fourth, a final outcome was achieved, which was the decision to introduce a core, semester-long Indigenous health topic for BND students. A secondary outcome was strengthening of Indigenous health teaching across the BND and MND. The process and findings will be useful to other university courses looking to assess and expand their Indigenous health curriculum.