This study aimed to describe the microscopic over-under cartilage tympanoplasty technique, provide hearing results and detail clinically significant complications.
This was a retrospective case series chart review study of over-under cartilage tympanoplasty procedures performed by the senior author between January 2015 and January 2019 at three tertiary care centres. Cases were excluded for previous or intra-operative cholesteatoma, if a mastoidectomy was performed during the procedure or if ossiculoplasty was performed. Hearing results and complications were obtained.
Sixty-eight tympanoplasty procedures met the inclusion criteria. The median age was 13 years (range, 3–71 years). The mean improvement in pure tone average was 6 dB (95 per cent confidence interval 4–9 dB; p < 0.0001). The overall perforation closure rate was 97 per cent (n = 66). Revision surgery was recommended for a total of 6 cases (9 per cent) including 2 post-operative perforations, 1 case of middle-ear cholesteatoma and 3 cases of external auditory canal scarring.
Over-under cartilage tympanoplasty is effective at improving clinically meaningful hearing with a low rate of post-operative complications.
The aim of this study was to describe the sensitivity of various C-reactive protein (CRP) cut-off values to identify patients requiring magnetic resonance imaging evaluation for pyogenic spinal infection among emergency department (ED) adults presenting with neck or back pain.
We prospectively enrolled a convenience series of adults presenting to a community ED with neck or back pain in whom ED providers had concern for pyogenic spinal infection in a derivation cohort from 2004 to 2010 and a validation cohort from 2010 to 2018. The validation cohort included only patients with pyogenic spinal infection. We analysed diagnostic test characteristics of various CRP cut-off values.
We enrolled 232 patients and analysed 201 patients. The median age was 55 years, 43.8% were male, 4.0% had history of intravenous drug use, and 20.9% had recent spinal surgery. In the derivation cohort, 38 (23.9%) of 159 patients had pyogenic spinal infection. Derivation sensitivity and specificity of CRP cut-off values were > 3.5 mg/L (100%, 24.8%), > 10 mg/L (100%, 41.3%), > 30 mg/L (100%, 61.2%), and > 50 mg/L (89.5%, 69.4%). Validation sensitivities of CRP cut-off values were > 3.5 mg/L (97.6%), > 10 mg/L (97.6%), > 30 mg/L (90.4%), and > 50 mg/L (85.7%).
CRP cut-offs beyond the upper limit of normal had high sensitivity for pyogenic spinal infection in this adult ED population. Elevated CRP cut-off values of 10 mg/L and 30 mg/L require validation in other settings.
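The diagnostic test characteristics reported above follow directly from applying each cut-off rule to the cohort. As a minimal sketch of that computation (the CRP values below are invented for illustration and are not the study's data):

```python
import numpy as np

def sens_spec(values, has_disease, cutoff):
    """Sensitivity and specificity of the decision rule `value > cutoff`.

    values      : biomarker measurements (e.g. CRP in mg/L)
    has_disease : boolean array, True if the patient has the condition
    """
    v = np.asarray(values, dtype=float)
    d = np.asarray(has_disease, dtype=bool)
    positive = v > cutoff
    sensitivity = np.mean(positive[d])    # true positives / all diseased
    specificity = np.mean(~positive[~d])  # true negatives / all non-diseased
    return sensitivity, specificity

# Hypothetical CRP values (mg/L) and infection status, for illustration only.
crp     = [2.0, 4.0, 12.0, 35.0, 60.0, 1.0, 8.0, 55.0]
disease = [False, False, True, True, True, False, False, True]
sens, spec = sens_spec(crp, disease, cutoff=30.0)
```

Lowering the cut-off trades specificity for sensitivity, which is why the derivation cohort retains 100% sensitivity at the 3.5, 10 and 30 mg/L thresholds but not at 50 mg/L.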
Psychological stress is associated with accelerated cellular aging and increased risk for aging-related diseases, but the underlying molecular mechanisms are unclear.
We examined the effect of stress on a DNA methylation age predictor that was shown to correlate strongly with chronological age across human tissues (Horvath 2013). Genome-wide DNA methylation was measured in peripheral blood using the 450K Illumina array in three independent cohorts: the Grady Trauma Project/GTP (N=366); a panic disorder case/control sample recruited at the Max Planck Institute of Psychiatry/MPI-P (N=318); and the Conte Center for the Psychobiology of Early-Life Trauma/Conte (N=42). Age acceleration was calculated by subtracting chronological age from age predicted by DNA methylation. Psychiatric symptomatology and stressors were assessed using standard questionnaires.
DNA methylation age strongly correlated with chronological age in all samples (r=0.9, p=2.5×10^-133). Cumulative lifetime stress but not childhood or current stress predicted age acceleration in GTP (p=0.012) and MPI-P (p=0.021). Moreover, epigenetic age acceleration predicted depression (GTP: p=0.002; Conte: p=0.014) and panic disorder (p=0.007). In secondary analyses, we examined the effect of lifetime stress on individual CpGs of the DNA methylation age predictor. After correcting for multiple comparisons, we identified in both GTP and MPI-P a stress-regulated CpG near MCAM, a gene implicated in aging-related diseases, including cardiovascular disease and cancers.
Cumulative lifetime stress, but not childhood or current stress, and psychiatric phenotypes are associated with accelerated epigenetic aging. Our findings may explain the accelerated cellular aging and increased disease risk associated with chronic stress and psychiatric disorders.
Associations between leptin and ghrelin plasma levels and alcohol craving have been reported in a few studies, but these studies have failed to differentiate this correlation from the alcohol withdrawal state.
To examine this correlation in a different population, and to study it with respect to the hyper-excitable state of alcohol withdrawal.
To study levels of leptin and ghrelin in relation to alcohol withdrawal and craving.
Twenty-five inpatients fulfilling the alcohol dependence criteria were assessed for alcohol withdrawal symptoms and craving. Leptin and ghrelin levels were measured on day 1, at the end of the first week and at the end of the third week after stopping alcohol. Withdrawal was assessed using the CIWA-A at day 1 and day 7; craving was assessed using the Penn craving scale at the end of weeks 1 and 3. The control group consisted of 15 first-degree relatives who did not take alcohol.
It was found that leptin [t (38) = 2.95, P = 0.005] and ghrelin [t (38) = 2.56, P = 0.015] levels were significantly higher in alcohol-dependent patients. Hormone levels had no significant correlation with alcohol withdrawal scores but showed a positive correlation with craving scores after abstinence.
Leptin and ghrelin, known for balancing the body's energy homeostasis, also seem to play a role in the pathways of drug dependence and craving. This relation is independent of the stress hormone axis, as leptin and ghrelin levels are not correlated with withdrawal scores, an indicator of stress hormone axis activation during alcohol withdrawal.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Short-term peripheral venous catheter–related bloodstream infection (PVCR-BSI) rates have not been systematically studied in resource-limited countries, and data on their incidence by number of device days are not available.
This was a prospective surveillance study of PVCR-BSI conducted from September 1, 2013, to May 31, 2019, in 727 intensive care units (ICUs) by members of the International Nosocomial Infection Control Consortium (INICC), from 268 hospitals in 141 cities across 42 countries in the Africa, Americas, Eastern Mediterranean, Europe, South East Asia, and Western Pacific regions. We applied the definitions and criteria of the CDC NHSN, the methodology of the INICC, and the INICC Surveillance Online System software.
We followed 149,609 ICU patients for 731,135 bed days and 743,508 short-term peripheral venous catheter (PVC) days. We identified 1,789 PVCR-BSIs, for an overall rate of 2.41 per 1,000 PVC days. Mortality was 6.67% in patients with a PVC but without PVCR-BSI and 18% in patients with a PVC and PVCR-BSI. The length of stay was 4.83 days in patients with a PVC but without PVCR-BSI and 9.85 days in patients with a PVC and PVCR-BSI. Among these infections, the microorganism profile showed 58% gram-negative bacteria: Escherichia coli (16%), Klebsiella spp (11%), Pseudomonas aeruginosa (6%), Enterobacter spp (4%), and others (20%) including Serratia marcescens. Staphylococcus aureus was the predominant gram-positive bacterium (12%).
PVCR-BSI rates in INICC ICUs were much higher than rates published from industrialized countries. Infection prevention programs must be implemented to reduce the incidence of PVCR-BSIs in resource-limited countries.
This chapter provides an overview of the care and evaluation of patients undergoing laryngotracheal reconstruction. The authors present the evaluation process, grading, and considerations surrounding patients with subglottic stenosis. The anesthetic considerations for these procedures, including the postoperative transfer of care to the critical care team, are presented.
The present study examines characteristics of those who benefited from a dietary Fe intervention comprising salt double-fortified with iodine and Fe (DFS).
Data from a randomized controlled trial were analysed to identify predictors of improved Fe status and resolution of Fe deficiency (serum ferritin (sFt) < 12 μg/l) and low body Fe (body Fe (BI) < 0·0 mg/kg) using non-parametric estimations and binomial regression models.
A tea estate in West Bengal, India.
Female tea pluckers, aged 18–55 years.
Consuming DFS significantly (P = 0·01) predicted resolution of Fe deficiency (relative risk (RR) = 2·31) and of low BI (RR = 2·78) compared with consuming iodized salt. Baseline sFt (β = –0·32 (se 0·03), P < 0·001) and treatment group (β = 0·13 (se 0·03), P < 0·001) significantly predicted change in sFt. The interaction of baseline BI with treatment group (β = –0·11 (se 0·06), P = 0·08) predicted the change in BI. DFS did not significantly predict change in Hb and marginally predicted resolution of anaemia (Hb < 120 g/l).
Baseline Fe status, as assessed by sFt and BI, and consumption of DFS predict change in Fe status and resolution of Fe deficiency and low BI. Anaemia prevalence and Hb level, although simple and inexpensive to measure, may not be adequate to predict resolution of Fe deficiency in response to an intervention of DFS in similar populations with high prevalence of Fe deficiency and multiple nutritional causes of anaemia. These findings will guide appropriate targeting of future interventions.
This analysis was conducted to evaluate the evidence of the efficacy of iron biofortification interventions on iron status and functional outcomes. Iron deficiency is a major public health problem worldwide, with a disproportionate impact on women and young children, particularly those living in resource-limited settings. Biofortification, or the enhancing of micronutrient content in staple crops, is a promising and sustainable agriculture-based approach to improve nutritional status. Previous randomised efficacy trials and meta-analyses have demonstrated that iron-biofortification interventions improved iron biomarkers; however, no systematic reviews to date have examined the efficacy of biofortification interventions on health outcomes. We conducted a systematic review of the efficacy of iron-biofortified staple crops on iron status and functional outcomes: cognitive function (e.g. attention, memory) and physical performance. Five studies from three randomised efficacy trials (i.e. rice, pearl millet, beans) conducted in the Philippines, India and Rwanda were identified for inclusion in this review. Iron status (Hb, serum ferritin, soluble transferrin receptor, total body iron, α-1-acid glycoprotein) was measured at baseline and endline in each trial; two studies reported cognitive outcomes, and no studies reported other functional outcomes. Meta-analyses were conducted using DerSimonian and Laird random-effects methods. Iron-biofortified crop interventions significantly improved cognitive performance in attention and memory domains, compared with conventional crops. There were no significant effects on categorical outcomes such as iron deficiency or anaemia. Further studies are needed to determine the efficacy of iron-biofortified staple crops on human health, including additional functional outcomes and other high-risk populations.
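The DerSimonian and Laird random-effects method named above can be sketched in a few lines. This is a generic illustration of the estimator with hypothetical effect sizes, not the review's data:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.

    effects   : per-study effect sizes (e.g. standardised mean differences)
    variances : per-study within-study variances
    Returns (pooled_effect, pooled_se, tau2), where tau2 is the
    method-of-moments estimate of between-study variance.
    """
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled mean
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # truncated at zero
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical effect sizes and variances from three trials.
pooled, se, tau2 = dersimonian_laird([0.2, 0.5, 0.8], [0.04, 0.04, 0.04])
```

When the between-study heterogeneity estimate tau² is positive, the random-effects standard error is wider than the fixed-effect one, reflecting variation across crops and settings of the kind pooled in this review.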
Under the European Union’s Solvency II regulations, insurance firms are required to use a one-year VaR (Value at Risk) approach. This involves a one-year projection of the balance sheet and requires sufficient capital to be solvent in 99.5% of outcomes. The Solvency II Internal Model risk calibrations require annual changes in market indices/term structure for the estimation of the risk distribution for each of the Internal Model risk drivers. This presents a significant challenge for calibrators in terms of:
Robustness of the calibration that is relevant to the current market regimes and at the same time able to represent the historically observed worst crisis;
Stability of the calibration model year on year with arrival of new information.
The above points need careful consideration to avoid credibility issues with the Solvency Capital Requirement (SCR) calculation, in that the results are subject to high levels of uncertainty.
For market risks, common industry practice to compensate for the limited number of historic annual data points is to use overlapping annual changes. Overlapping changes are dependent on each other, and this dependence can cause issues in estimation, statistical testing, and communication of uncertainty levels around risk calibrations.
This paper discusses the issues with the use of overlapping data when producing risk calibrations for an Internal Model. A comparison of the overlapping data approach with the alternative non-overlapping data approach is presented. A comparison is made of the bias and mean squared error of the first four cumulants under four different statistical models. For some statistical models it is found that overlapping data can be used with bias corrections to obtain similarly unbiased results as non-overlapping data, but with significantly lower mean squared errors. For more complex statistical models (e.g. GARCH) it is found that published bias corrections for non-overlapping and overlapping datasets do not result in unbiased cumulant estimates and/or lead to increased variance of the process.
In order to test the goodness of fit of probability distributions to the datasets, it is common to use statistical tests. Most of these tests do not function when using overlapping data, as overlapping data breach the independence assumption underlying most statistical tests. We present and test an adjustment to one of the statistical tests (the Kolmogorov–Smirnov goodness-of-fit test) to allow for overlapping data.
Finally, we explore the methods of converting “high”-frequency (e.g. monthly data) to “low”-frequency data (e.g. annual data). This is an alternative methodology to using overlapping data, and the approach of fitting a statistical model to monthly data and then using the monthly model aggregated over 12 time steps to model annual returns is explored. There are a number of methods available for this approach. We explore two of the widely used approaches for aggregating the time series.
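The distinction between overlapping and non-overlapping annual changes can be made concrete with a short sketch. The series below is illustrative (a constant 1% monthly log-return), not market data; the point is that overlapping changes yield roughly twelve times as many observations, but those observations are serially dependent:

```python
import numpy as np

def annual_changes(monthly_log_levels, overlapping=True):
    """Annual (12-month) changes from a monthly log-level series.

    overlapping=True  : one annual change starting at every month
                        (observations share months, so they are dependent).
    overlapping=False : one change per disjoint 12-month block
                        (fewer, but independent, observations).
    """
    x = np.asarray(monthly_log_levels, dtype=float)
    if overlapping:
        return x[12:] - x[:-12]
    n_blocks = (len(x) - 1) // 12
    block_ends = np.arange(n_blocks + 1) * 12
    return np.diff(x[block_ends])

# Illustrative series: 10 years of month-end log levels, 1% monthly return.
levels = 0.01 * np.arange(121)
ov = annual_changes(levels, overlapping=True)    # 109 dependent changes
nv = annual_changes(levels, overlapping=False)   # 10 independent changes
```

The alternative explored in the paper, fitting a model to the monthly data and aggregating it over 12 time steps, avoids this dependence altogether at the cost of requiring a monthly model whose 12-step aggregate matches the annual dynamics.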
We report results of studies relating to the development of an emerging nanostructured molybdenum trioxide (nMoO3)-based biocompatible label-free biosensing platform for breast cancer detection. The structure and morphology of the synthesized nMoO3 nanorods were investigated by XRD, SEM, X-ray photoelectron spectroscopy, and TEM. This biocompatible one-dimensional (1D) nMoO3-based biosensing platform exhibited high sensitivity (0.904 µA mL/ng/cm2), a wide linear detection range (2.5–110 ng/mL), and a lower detection limit of 2.47 ng/mL toward human epidermal growth factor receptor-2 detection. The results obtained using this sensor platform on serum samples of breast cancer patients were validated using ELISA.
To assess the feasibility of non-contrast T2-weighted magnetic resonance imaging as compared to T1-weighted post-contrast magnetic resonance imaging for detecting acoustic neuroma growth.
Adult patients with acoustic neuroma who underwent at least three magnetic resonance imaging scans of the internal auditory canals with and without contrast in the past nine years were identified. T1- and T2-weighted images were reviewed by three neuroradiologists, and tumour size was measured. Accuracy of the measurements on T2-weighted images was defined as a difference of less than or equal to 2 mm from the measurement on T1-weighted images.
A total of 107 magnetic resonance imaging scans of 26 patients were reviewed. Measurements on T2-weighted magnetic resonance imaging scans were 88 per cent accurate. Measurements on T2-weighted images differed from measurements on T1-weighted images by an average of 1.27 mm, or 10.4 per cent of the total size. The specificity of T2-weighted images was 88.2 per cent and the sensitivity was 77.8 per cent.
T2-weighted sequences are fairly accurate for measuring acoustic neuroma size and identifying growth, provided the caveats associated with tumour characteristics and location are kept in mind.
Mixing matrices quantify how people with similar or different characteristics make contact with each other, creating potential for disease transmission. Little empirical data on mixing patterns among persons who inject drugs (PWID) are available to inform models of blood-borne disease such as HIV and hepatitis C virus. Egocentric drug network data provided by PWID in Baltimore, Maryland between 2005 and 2007 were used to characterise drug equipment-sharing patterns according to age, race and gender. Black PWID and PWID who were single (i.e. no stable sexual partner) self-reported larger equipment-sharing networks than their white and non-single counterparts. We also found evidence of assortative mixing according to age, gender and race, though to a slightly lesser degree in the case of gender. Highly assortative mixing according to race and gender highlights the existence of demographically isolated clusters, for whom generalised treatment interventions may have limited benefits unless targeted directly. These findings provide novel insights into mixing patterns of PWID for which little empirical data are available. The age-specific assortativity we observed is also significant in light of its role as a key driver of transmission for other pathogens such as influenza and tuberculosis.
Community-led total sanitation (CLTS) is an intervention that strives to end the practice of open defaecation. This study measured the effectiveness of CLTS in Nyando District by examining the association between community open defaecation-free (ODF) status and childhood diarrhoeal illness. A cross-sectional study design was used among households with children ⩽5 years old to ascertain information on acute diarrhoea in the past year (outcome), sanitation and health behaviours. Water testing was conducted to determine Escherichia coli and turbidity levels for 55 water sources. Data were obtained from 210 parents or caregivers from an ODF community and 216 parents or caregivers in a non-ODF community. The non-ODF participants reported a non-significant 16% increased risk of diarrhoea compared with the participants from the ODF community. Children's HIV positivity (adjusted prevalence ratio (aPR) = 2.29; 95% CI 2.07–2.53), unsafe child stool disposal (aPR = 1.92; 95% CI 1.74–2.12) and low household income (aPR = 1.93; 95% CI 1.46–2.56) were associated with diarrhoea in the non-ODF community. The ODF location had a higher percentage of E. coli in the drinking water compared with the non-ODF location (76.7% vs. 60%). Diarrhoeal disease rates in children ⩽5 years old did not differ by whether a latrine intervention was implemented. The water sampling findings suggest that poor water safety may have limited the effectiveness of CLTS in reducing childhood diarrhoea. Improved water treatment practices, safe stool disposal and education may improve the CLTS intervention in ODF communities and thereby reduce the risk of childhood diarrhoea.
Calling in staff and preparing the operating room for an urgent surgical procedure is a significant draw on hospital resources and disrupts care of other patients. It has been common practice to treat open fractures on an urgent basis. HTA methods can be applied to examine this prioritization of care, just like they can be applied to the acquisition of drugs and devices.
Our center completed a rapid systematic review of guidelines, systematic reviews, and primary clinical evidence on urgent surgical debridement and stabilization of open fractures of long bones (“urgent” being defined as within six hours of the injury), compared to surgical debridement and reduction performed at a later time point. Meta-analyses were performed for infection and non-union outcomes, and the GRADE system was used to assess the strength of evidence for each conclusion.
We found no published clinical guidelines for the urgency of treating open fractures. A good systematic review on the topic was published in 2012. We found six cohort studies published since completion of the earlier review. The summary odds ratio for any infection in patients with later treatment was 0.97 (95% confidence interval (CI) 0.78–1.22, sixteen studies, 3,615 patients) and for deep or “major” infections was 1.00 (95% CI 0.74–1.34, nine studies, 2,013 patients). The summary odds ratio of non-union with later treatment was 0.95 (95% CI 0.65–1.41, six studies, 1,308 patients). There was no significant heterogeneity in any of the results (I-squared = 0 percent) and no apparent trends in the results as a function of study size or publication date. We graded the strength of each of the conclusions as very low because they were based on cohort studies where the treating physician could elect immediate treatment for patients with severe soft-tissue injuries or patients at risk of complications. This raises the risk of spectrum bias.
Default urgent scheduling of patients with open fractures for surgical debridement and stabilization does not appear to reduce the risk of infection or fracture non-union. Based on this information, our surgery department managers no longer schedule patients with open fractures for immediate surgery unless there are specific circumstances necessitating it.
Predicting recurrent Clostridium difficile infection (rCDI) remains difficult. We employed a retrospective cohort design. Granular electronic medical record (EMR) data had been collected from patients hospitalized at 21 Kaiser Permanente Northern California hospitals. The derivation dataset (2007–2013) included data from 9,386 patients who experienced incident CDI (iCDI) and 1,311 who experienced their first CDI recurrence (rCDI). The validation dataset (2014) included data from 1,865 patients who experienced incident CDI and 144 who experienced rCDI. Using multiple techniques, including machine learning, we evaluated more than 150 potential predictors. Our final analyses evaluated 3 models with varying degrees of complexity and 1 previously published model.
Despite having a large multicenter cohort and access to granular EMR data (eg, vital signs, and laboratory test results), none of the models discriminated well (c statistics, 0.591–0.605), had good calibration, or had good explanatory power.
Our ability to predict rCDI remains limited. Given currently available EMR technology, improvements in prediction will require incorporating new variables because currently available data elements lack adequate explanatory power.
Patient experience is becoming a central focus of healthcare. A broad range of studies on how to increase patient satisfaction ratings exists; however, they lack the specificity to adequately guide physicians and hospitals on how to improve patient experience. The objective of this study was to define the aspects of patient experience within paediatric cardiologist practices that can serve as predictors of excellent patient satisfaction. From 1 January, 2013 to 28 February, 2015 (26 months), outpatients who visited paediatric cardiologists were asked to complete a 39-question patient satisfaction survey regarding their experience. Surveys were collected by Press Ganey, an independent provider of patient satisfaction surveys. Participants were asked to rate their experience on a 1–5 Likert scale: a score of 1 demonstrated a “poor” experience, whereas a score of 5 demonstrated a “very good” experience. This retrospective study of 2468 responses determined that cheerfulness of the practice (r=0.85, p<0.001), a cohesive staff (r=0.83, p<0.001), and a care provider explaining problems and conditions (r=0.81, p<0.001) were key aspects of a paediatric cardiologist’s practice that can be used as predictors of overall patient satisfaction. Awareness of how doctors can personalise a patient’s experience is vital to achieve greater patient satisfaction and, ultimately, better patient outcomes.
There has been a drop in clinical research in India following stringent conditions put in place by the Indian Supreme Court in 2013. The Court's orders came in the wake of irregularities highlighted in the conduct of clinical trials in the country. This paper highlights the steps taken by the Indian regulator, the Central Drugs Standard Control Organisation to comply with these directions. These are of three kinds: strengthening regulatory institutions, protecting participant safety and creating regulatory certainty for sponsors and investigators. Examples include the large-scale training of Ethics Committees, framing detailed guidelines on compensation and audiovisual recording of the informed consent process, as well as reducing the time taken to process applications. It is expected that these measures will inspire confidence for the much-needed resumption of clinical research.