Post-tonsillectomy bleeding is the most frequent complication of tonsillectomy. Inherited platelet function disorders have an estimated prevalence of 1 per cent. Any association between post-tonsillectomy bleeds and undiagnosed inherited platelet function disorders has not been investigated before.
Objectives
To assess the prevalence of inherited platelet function disorders in a cohort of post-tonsillectomy bleed patients.
Methods
An observational cohort study was conducted using hospital digital records. Platelet function analyser 100 ('PFA-100') closure time was measured in post-tonsillectomy bleed patients who presented to hospital.
Results
Between 2013 and 2017, 9 of 91 post-tonsillectomy bleed patients who underwent platelet function analyser 100 testing (9.89 per cent) had positive results. Five patients (5.49 per cent) had undiagnosed inherited platelet function disorders. Four patients had false positive results secondary to a non-steroidal anti-inflammatory drug effect, proven by repeat testing six weeks later off medication (specificity, 95.3 per cent). The false negative rate was 0 per cent.
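As a quick check of the figures above, the minimal Python sketch below reproduces the reported specificity and prevalence from the stated counts; the variable names are ours, and the counts are taken directly from this abstract.

```python
# Counts taken from the abstract above.
tested = 91              # post-tonsillectomy bleed patients given PFA-100 testing
positives = 9            # positive PFA-100 results
true_positives = 5       # confirmed inherited platelet function disorders
false_positives = 4      # NSAID effect; normal on repeat testing off medication
true_negatives = tested - positives   # the false negative rate was 0 per cent

specificity = true_negatives / (true_negatives + false_positives)  # TN / (TN + FP)
prevalence = true_positives / tested

print(f"specificity = {specificity:.1%}")  # -> 95.3%
print(f"prevalence  = {prevalence:.2%}")   # -> 5.49%
```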
Conclusion
The prevalence of inherited platelet function disorders in our post-tonsillectomy bleed cohort is five-fold higher than in the general population. Platelet function analyser 100 testing at presentation with a post-tonsillectomy bleed allows patients' inherited platelet function disorders to be identified and managed.
OBJECTIVES/SPECIFIC AIMS:
Objectives and goals of this study will be to: (1) compare fecal microbiota and fecal organic acids in irritable bowel syndrome (IBS) patients and controls and (2) investigate the association between colonic transit and fecal microbiota in IBS patients and controls.
METHODS/STUDY POPULATION:
We propose an investigation of fecal organic acids, colonic transit and fecal microbiota in 36 IBS patients and 18 healthy controls. The target population will be adults ages 18–65 years meeting Rome IV criteria for IBS (both diarrhea- and constipation-predominant, IBS-D and IBS-C) and asymptomatic controls. Exclusion criteria are: (a) history of microscopic colitis, inflammatory bowel disease, celiac disease, visceral cancer, chronic infectious disease, immunodeficiency, uncontrolled thyroid disease, liver disease, or elevated AST/ALT >2.0× the upper limit of normal; (b) prior radiation therapy of the abdomen or abdominal surgeries, with the exception of appendectomy or cholecystectomy >6 months before study initiation; (c) ingestion of prescription, over-the-counter, or herbal medications affecting gastrointestinal transit or study interpretation within 6 months of study initiation for controls or within 2 days before study initiation for IBS patients; (d) pregnancy; (e) antibiotic usage within 3 months before study participation; (f) prebiotic or probiotic usage within the 2 weeks before study initiation; (g) tobacco use. Primary outcomes will be fecal bile acid excretion and profile, short-chain fatty acid (SCFA) excretion and profile, colonic transit, and fecal microbiota. Secondary outcomes will be stool characteristics based on responses to validated bowel diaries. Stool samples will be collected from participants during the last 2 days of a 4-day 100 g fat diet, split into 3 aliquots for fecal microbiota, SCFA, and bile acid analysis, and frozen. Frozen aliquots will be shipped to the Metabolite Profiling Facility at Purdue University and the Mayo Clinic Department of Laboratory Medicine and Pathology for SCFA and bile acid measurements, respectively. Analysis of fecal microbiota will be performed in the research laboratory of Dr David Nelson in collaboration with bioinformatics expertise affiliated with the Nelson lab. Colonic transit time will be measured with the previously validated radio-opaque marker method. Generalized linear models will be used as the analysis framework for comparing study endpoints among groups (a minimal sketch follows this abstract).
RESULTS/ANTICIPATED RESULTS:
This study seeks to examine the innovative concept that specific microbial signatures are associated with increased fecal excretion of organic acids, providing unique insights into a potential mechanistic link between altered intraluminal organic acids and fecal microbiota.
DISCUSSION/SIGNIFICANCE OF IMPACT:
Results may lead to the development of targets for novel therapies and diagnostic biomarkers for IBS, emphasizing the role of the fecal metabolome.
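The protocol names generalized linear models as its analysis framework. Below is a minimal Python sketch of that comparison, assuming a per-subject table with a group label and one endpoint; the file name, column names, and the Gamma/log-link choice are our illustrative assumptions, not specified by the protocol.

```python
# Hypothetical sketch: GLM comparing one fecal endpoint across groups
# (IBS-D, IBS-C, controls). Data file and column names are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("fecal_endpoints.csv")  # assumed columns: group, scfa_total

# A Gamma GLM with log link is one reasonable family for positive, skewed
# excretion data; the protocol does not specify the error structure.
fit = smf.glm("scfa_total ~ C(group)", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(fit.summary())  # group coefficients test endpoint differences
```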
Accurate crop varietal identification is the backbone of any high-quality assessment of outcomes and impacts. Sweetpotato (Ipomoea batatas) varieties have important nutritional differences, and there is strong interest in identifying nutritionally superior varieties for dissemination. In agricultural household surveys, such information is often collected from the farmer's self-report. In this article, we present the results of a data capture experiment on sweetpotato varietal identification in southern Ethiopia. Three household-based methods of identifying varietal adoption were tested against the benchmark of DNA fingerprinting: (A) elicitation from farmers with basic questions about the most widely planted variety; (B) farmer elicitation on five sweetpotato phenotypic attributes using a visual-aid protocol; and (C) enumerator observation of the same five phenotypic attributes, using the visual-aid protocol, during a field visit. In total, 20% of farmers identified a variety as improved when it was in fact local, and 19% identified a variety as local when it was in fact improved. The variety names given by farmers delivered inconsistent and inaccurate varietal identities. The visual-aid protocols employed in methods B and C performed better than method A, but greatly underestimated adoption relative to the DNA fingerprinting method. Our results suggest that estimates of the adoption of improved varieties based on farmer self-reports are questionable, and point towards wider use of DNA fingerprinting in adoption and impact assessments.
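To make the quoted misclassification rates concrete, here is a small sketch of how self-reported status can be scored against the DNA-fingerprint benchmark; the example records are mock data, not the Ethiopian survey.

```python
# Mock scoring of self-reports against the DNA benchmark ("improved"/"local").
self_report = ["improved", "local", "improved", "local", "improved"]
dna_truth   = ["local",    "local", "improved", "improved", "improved"]

n = len(self_report)
said_improved_was_local = sum(s == "improved" and d == "local"
                              for s, d in zip(self_report, dna_truth)) / n
said_local_was_improved = sum(s == "local" and d == "improved"
                              for s, d in zip(self_report, dna_truth)) / n

print(f"{said_improved_was_local:.0%} called improved but local")  # cf. 20%
print(f"{said_local_was_improved:.0%} called local but improved")  # cf. 19%
```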
A Health Technology Assessment (HTA) systematic review of treat-to-target (TTT) studies (n = 16) in rheumatoid arthritis (RA) was undertaken, in which studies were grouped as: TTT versus usual care, trials comparing different targets, or trials comparing different treatment protocols. To our knowledge, this was the first RA TTT review in which studies were grouped in this way. We wanted to determine whether our approach had been adopted in reviews of hypertension, hyperlipidaemia or diabetes.
METHODS:
We searched MEDLINE for systematic reviews (SRs) of TTT studies in hypertension, hyperlipidaemia or diabetes.
RESULTS:
Twelve SRs were included: eight in diabetes and four in hypertension; none were in hyperlipidaemia. The diabetes SRs evaluated different insulin regimens (n = 3), non-insulin medications (n = 1), any antidiabetic treatment (n = 2), metformin monotherapy versus combination therapy (n = 1), and tight versus conventional glucose control (n = 1). The metformin review grouped studies by outcome, whereas all other diabetes SRs grouped studies by treatment. Two hypertension SRs evaluated the effects of any treatment on two blood pressure targets, whereas one evaluated the effects of two different treatment regimens on the same blood pressure target. No SR in hypertension or diabetes included a mix of TTT versus usual care, same treatment protocol with different targets, and/or different treatment protocols with the same target study designs.
CONCLUSIONS:
In RA, TTT does not refer to a single concept but to a range of different approaches to the treatment of patients, and the evidence reflects this. Whilst our approach to grouping RA TTT studies in a review was novel, it made synthesizing the evidence and drawing general conclusions complex. We did not identify any TTT reviews in hypertension or diabetes that included a mix of the TTT approaches we identified in RA. At present, a comparison of the strengths and limitations of our TTT review study grouping with reviews in hypertension, hyperlipidaemia or diabetes cannot be made.
There are 341,000 patients in the United States who are dependent on routine dialysis for survival. Recent large-scale disasters have emphasized the importance of disaster preparedness for people with chronic disease, including support for dialysis units. Contingency plans for staffing are important for providing continuity of care for a technically challenging procedure such as dialysis. PReparing Emergency Personnel in Dialysis (PREP-D) is a just-in-time training program designed to train individuals with minimal familiarity with the basic steps of dialysis to support routine dialysis staff during a disaster.
Methods
A 5-module educational program was developed through a collaborative, multidisciplinary effort. A pilot study testing the program was performed using 20 nontechnician dialysis facility employees and 20 clinical-year medical students as subjects.
Results
When comparing pretest and posttest scores, the entire study population showed a mean improvement of 28.9%, with dialysis facility employees and medical students showing improvements of 21.8% and 36.4%, respectively (P < .05 for all comparisons).
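A minimal sketch of the underlying comparison, assuming paired pretest/posttest percentage scores; the values below are mock data, and scipy's paired t-test stands in for whatever test the study actually used.

```python
# Mock paired pre/post comparison; scores are illustrative percentages.
from scipy import stats

pretest  = [55, 60, 48, 62, 51]
posttest = [82, 88, 79, 90, 80]

gains = [post - pre for pre, post in zip(pretest, posttest)]
mean_gain = sum(gains) / len(gains)
t_stat, p_value = stats.ttest_rel(posttest, pretest)  # paired comparison

print(f"mean improvement = {mean_gain:.1f} points (p = {p_value:.4f})")
```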
Conclusions
PREP-D participants demonstrated improved test scores when taught in a just-in-time training format. The knowledge gained by using the PREP-D program during a staffing shortage may allow for continuity of care for critical services such as dialysis during a disaster. (Disaster Med Public Health Preparedness. 2013;7:272-277).
Parkinson's disease (PD), the most common neurodegenerative movement disorder, has traditionally been considered a "classic" basal ganglia disease, as the most obvious pathology is seen in the dopaminergic cells in the substantia nigra pars compacta. Nevertheless, recent discoveries of anatomical connections linking the basal ganglia and the cerebellum have led to a re-examination of the role of the cerebellum in the pathophysiology of PD. This review summarizes how the cerebellum may explain many curious features of PD: the significant variation in disease progression between individuals; why the severity of the dopaminergic deficit correlates with many features of PD, such as bradykinesia, but not with tremor; and why PD subjects with a tremor-predominant presentation tend to have a more benign prognosis. It is clear that the cerebellum participates in compensatory mechanisms associated with the disease and must be considered an essential contributor to the overall pathophysiology of PD.
Four pedons on each of four drift sheets in the Lake Wellman area of the Darwin Mountains were sampled for chemical and microbial analyses. The four drifts, Hatherton, Britannia, Danum, and Isca, ranged from early Holocene (10 ka) to mid-Quaternary (c. 900 ka). Soil properties, namely weathering stage, salt stage, and the depths of staining, visible salts, ghosts, and coherence, increase with drift age. The landforms contain primarily high-centred polygons with windblown snow in the troughs. The soils are dominantly complexes of Typic Haplorthels and Typic Haploturbels. The soils were dry and alkaline, with low levels of organic carbon, nitrogen and phosphorus. Electrical conductivity was high, accompanied by high levels of water-soluble anions and cations (especially calcium and sulphate in older soils). Soil microbial biomass, measured as phospholipid fatty acids, and numbers of culturable heterotrophic microbes were low, with the highest levels detected in the less developed soils of the Hatherton drift. The microbial community structure of the Hatherton soil also differed from that of the Britannia, Danum and Isca soils. Ordination revealed that soil microbial community structure was influenced by soil development and organic carbon.
Objective.
To evaluate whether longitudinal insurer claims data allow reliable identification of elevated hospital surgical site infection (SSI) rates.
Design.
We conducted a retrospective cohort study of Medicare beneficiaries who underwent coronary artery bypass grafting (CABG) in US hospitals performing at least 80 procedures in 2005. Hospitals were assigned to deciles by using case mix–adjusted probabilities of having an SSI-related inpatient or outpatient claim code within 60 days of surgery. We then reviewed medical records of randomly selected patients to assess whether chart-confirmed SSI risk was higher in hospitals in the worst deciles compared with the best deciles.
Participants.
Fee-for-service Medicare beneficiaries who underwent CABG in these hospitals in 2005.
Results.
We evaluated 114,673 patients who underwent CABG in 671 hospitals. In the best decile, 7.8% (958/12,307) of patients had an SSI-related code, compared with 24.8% (2,747/11,068) in the worst decile (P<.001). Medical record review confirmed SSI in 40% (388/980) of those with SSI-related codes. In the best decile, the chart-confirmed annual SSI rate was 3.2%, compared with 9.4% in the worst decile, with an adjusted odds ratio of SSI of 2.7 (confidence interval, 2.2–3.3; P<.001) for CABG performed in a worst-decile hospital compared with a best-decile hospital.
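For readers unfamiliar with the statistic, the sketch below shows how a crude odds ratio is computed from a 2 × 2 table of chart-confirmed SSIs. The counts are hypothetical, and the study's OR of 2.7 is case-mix adjusted, so this unadjusted calculation is only illustrative.

```python
# Hypothetical 2x2 table: chart-confirmed SSI by hospital decile.
worst_ssi, worst_clean = 94, 906   # worst-decile patients with/without SSI
best_ssi,  best_clean  = 32, 968   # best-decile patients with/without SSI

odds_worst = worst_ssi / worst_clean
odds_best  = best_ssi / best_clean
crude_or = odds_worst / odds_best

print(f"crude OR = {crude_or:.2f}")  # ~3.1 with these mock counts
```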
Conclusions.
Claims data can identify groups of hospitals with unusually high or low post-CABG SSI rates. Assessment of claims is more reproducible and efficient than current surveillance methods. This example of secondary use of routinely recorded electronic health information to assess quality of care can identify hospitals that may benefit from prevention programs.
Objective.
To outline methods for deriving and validating intensive care unit (ICU) antimicrobial utilization (AU) measures from computerized data and to describe programming problems that emerged.
Design.
Retrospective evaluation of computerized pharmacy and administrative data.
Setting.
ICUs from 4 academic medical centers over 36 months.
Interventions.
Investigators separately developed and validated programming code to report AU measures in selected ICUs. Use of antibacterial and antifungal drugs for systemic administration was categorized and expressed as antimicrobial-days (each day that each antimicrobial drug was given to each patient) and patient-days receiving antimicrobials (each day that any antimicrobial drug was given to each patient). Monthly rates were compiled and analyzed centrally, with ICU patient-days as the denominator. Results were validated against data collected from manual review of medical records. Frequent discussion among investigators aided identification and correction of programming problems.
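The two numerators defined above differ only in whether each drug-day or each patient-day is counted once. The minimal Python sketch below makes the distinction explicit under assumed (patient, date, drug) administration records; the records and the denominator are mock values.

```python
# Mock (patient, date, drug) administration records.
records = [
    ("pt1", "2006-03-01", "vancomycin"),
    ("pt1", "2006-03-01", "cefepime"),     # same day, second drug
    ("pt1", "2006-03-02", "vancomycin"),
    ("pt2", "2006-03-01", "fluconazole"),
]

# Antimicrobial-days: each day that each antimicrobial was given to each patient.
antimicrobial_days = len({(pt, day, drug) for pt, day, drug in records})   # 4

# Patient-days receiving antimicrobials: each day that any antimicrobial was given.
patient_days_on_therapy = len({(pt, day) for pt, day, _ in records})       # 3

icu_patient_days = 500   # mock denominator from administrative data
rate = 1000 * antimicrobial_days / icu_patient_days
print(antimicrobial_days, patient_days_on_therapy, rate)   # 4 3 8.0
```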
Results.
AU data were successfully programmed through an iterative process of computer code revision. After major programming errors were identified and resolved, comparison of computerized patient-level data with data collected by manual review of medical records revealed discrepancies in antimicrobial-days and patient-days receiving antimicrobials ranging from less than 1% to 17.7%. The hospital whose numerator data were derived from electronic records of medication administration had the least discrepant results.
Conclusions.
Computerized AU measures can be derived feasibly, but threats to validity must be sought out and corrected. The magnitude of discrepancies between computerized AU data and a gold standard based on manual review of medical records varies, with electronic records of medication administration providing maximal accuracy.
Proton radiography using laser-driven sources has been developed as a diagnostic since the beginning of the decade, and applied successfully to a range of experimental situations. Multi-MeV protons driven from thin foils via the Target Normal Sheath Acceleration mechanism offer, under optimal conditions, the possibility of probing laser-plasma interactions, and of detecting electric and magnetic fields as well as plasma density gradients, with ~ps temporal resolution and ~5–10 µm spatial resolution. In view of these advantages, the use of proton radiography as a diagnostic in experiments of relevance to Inertial Confinement Fusion is currently being considered in the main fusion laboratories. This paper discusses recent advances in the application of laser-driven radiography to experiments of relevance to Inertial Confinement Fusion. In particular, we discuss radiography of hohlraum and gasbag targets following the interaction of intense ns pulses. These experiments were carried out at the HELEN laser facility at AWE (UK), and proved the suitability of this diagnostic for studying, with unprecedented detail, laser-plasma interaction mechanisms of high relevance to Inertial Confinement Fusion. Non-linear solitary structures of relevance to space physics, namely phase-space electron holes, have also been highlighted by the measurements. These measurements are discussed and compared to existing models.
The incidence of surgical site infection (SSI) after hysterectomy ranges widely from 2% to 21%. A specific risk stratification index could help to predict more accurately the risk of incisional SSI following abdominal hysterectomy and would help determine the reasons for the wide range of reported SSI rates in individual studies. To increase our understanding of the risk factors needed to build a specific risk stratification index, we performed a retrospective multihospital analysis of risk factors for SSI after abdominal hysterectomy.
Methods.
Retrospective case-control study of 545 abdominal and 275 vaginal hysterectomies from July 1, 2003, to June 30, 2005, at 4 institutions. SSIs were defined by using Centers for Disease Control and Prevention/National Nosocomial Infections Surveillance criteria. Independent risk factors for abdominal hysterectomy were identified by using logistic regression.
Results.
There were 13 deep incisional, 53 superficial incisional, and 18 organ-space SSIs after abdominal hysterectomy and 14 organ-space SSIs after vaginal hysterectomy. Because risk factors for organ-space SSI were different according to univariate analysis, we focused further analyses on incisional SSI after abdominal hysterectomy. The maximum serum glucose level within 5 days after operation was highest in patients with deep incisional SSI, lower in patients with superficial incisional SSI, and lowest in uninfected patients (median, 189, 156, and 141 mg/dL, respectively; P = .005). Independent risk factors for incisional SSI included blood transfusion (odds ratio [OR], 2.4) and morbid obesity (body mass index [BMI], >35; OR, 5.7). Duration of operation greater than the 75th percentile (OR, 1.7), obesity (BMI, 30–35; OR, 3.0), and lack of private health insurance (OR, 1.7) were marginally associated with increased odds of SSI.
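A hedged sketch of the kind of model behind these estimates: logistic regression of incisional SSI on candidate risk factors, with exponentiated coefficients read as odds ratios. The file and variable names are assumptions, not the study's actual dataset.

```python
# Hypothetical logistic-regression sketch for incisional SSI risk factors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hysterectomy_cases.csv")  # assumed per-patient table

fit = smf.logit("incisional_ssi ~ transfusion + C(bmi_category) + long_operation",
                data=df).fit()
print(np.exp(fit.params))        # odds ratios (cf. OR 2.4 for transfusion)
print(np.exp(fit.conf_int()))    # 95% confidence intervals
```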
Conclusions.
Incisional SSI after abdominal hysterectomy was associated with increased BMI and blood transfusion. Longer duration of operation and lack of private health insurance were marginally associated with SSI.
Lexical semantic classes of verbs play an important role in structuring complex predicate information in a lexicon, thereby avoiding redundancy and enabling generalizations across semantically similar verbs with respect to their usage. Such classes, however, require many person-years of expert effort to create manually, and methods are needed for automatically assigning verbs to appropriate classes. In this work, we develop and evaluate a feature space to support the automatic assignment of verbs into a well-known lexical semantic classification that is frequently used in natural language processing. The feature space is general – applicable to any class distinctions within the target classification; broad – tapping into a variety of semantic features of the classes; and inexpensive – requiring no more than a POS tagger and chunker. We perform experiments using support vector machines (SVMs) with the proposed feature space, demonstrating a reduction in error rate ranging from 48% to 88% over a chance baseline accuracy, across classification tasks of varying difficulty. In particular, we attain performance comparable to or better than that of feature sets manually selected for the particular tasks. Our results show that the approach is generally applicable, and reduces the need for resource-intensive linguistic analysis for each new classification task. We also perform a wide range of experiments to determine the most informative features in the feature space, finding that simple, easily extractable features suffice for good verb classification performance.
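As a concrete illustration of the setup, here is a minimal scikit-learn sketch of SVM classification over shallow, cheaply extracted verb features. The features and class labels are invented stand-ins; the paper's actual feature space is built from POS-tagger and chunker output.

```python
# Toy SVM verb classifier over a dictionary feature space (mock features).
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Hypothetical per-verb features, e.g. relative frequencies of shallow
# syntactic contexts extracted with a POS tagger and chunker.
X = [{"transitive": 0.8, "pp_attach": 0.1, "passive": 0.3},
     {"transitive": 0.2, "pp_attach": 0.6, "passive": 0.1},
     {"transitive": 0.7, "pp_attach": 0.2, "passive": 0.4}]
y = ["change-of-state", "motion", "change-of-state"]  # mock class labels

clf = make_pipeline(DictVectorizer(), SVC(kernel="linear"))
clf.fit(X, y)
print(clf.predict([{"transitive": 0.75, "pp_attach": 0.15, "passive": 0.35}]))
```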
Objective:
To measure infection rates in a regional cohort of long-term-care facilities (LTCFs) using standard surveillance methods and to analyze different methods for interfacility comparisons.
Setting:
Seventeen LTCFs in Idaho.
Design:
Prospective, active surveillance for LTCF-acquired infections using standard definitions and case-finding methods was conducted from July 2001 to June 2002. All surveillance data were combined and individual facility performance was compared with the aggregate employing a variety of statistical and graphic methods.
Results:
The surveillance data set consisted of 472,019 resident-days of care with 1,717 total infections, for a pooled mean rate of 3.64 infections per 1,000 resident-days. Specific infections included respiratory (828; rate, 1.75), skin and soft tissue (520; rate, 1.10), urinary tract (282; rate, 0.60), gastrointestinal (77; rate, 0.16), unexplained febrile illnesses (6; rate, 0.01), and bloodstream (4; rate, 0.01). Initially, methods adopted from the National Nosocomial Infections Surveillance System were used, comparing individual rates with pooled means and percentiles of the distribution. A more sensitive method appeared to be the detection of statistically significant deviations (based on chi-square analysis) of individual facility rates from the aggregate of all other facilities. One promising method employed statistical process control charts (U charts) adjusted to compare individual rates with aggregate monthly rates, providing simultaneous visual and statistical comparisons. Small-multiples graphs were useful in providing images valid for rapid concurrent comparison of all facilities.
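A minimal sketch of the U-chart comparison described above, assuming Poisson-type control limits around the aggregate rate; the aggregate counts come from this abstract, while the facility's monthly figures are mock values.

```python
# U-chart style check of one facility-month against the aggregate rate.
import math

aggregate_infections, aggregate_days = 1717, 472_019
u_bar = aggregate_infections / aggregate_days     # 3.64 per 1,000 resident-days

facility_days = 3_200        # mock resident-days for one facility-month
sigma = math.sqrt(u_bar / facility_days)
ucl = u_bar + 3 * sigma      # upper control limit
lcl = max(0.0, u_bar - 3 * sigma)

facility_rate = 18 / facility_days   # mock: 18 infections that month
print(f"rate={1000*facility_rate:.2f}, UCL={1000*ucl:.2f}, "
      f"flag={facility_rate > ucl}")  # per 1,000 resident-days
```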
Conclusion:
Interfacility comparisons have been demonstrated to be valuable for hospital infection control programs, but have not been studied extensively in LTCFs.
A 3-yr study was conducted in Wheatland County, Alberta, to determine whether the agronomic practices of growers influenced the occurrence of herbicide resistance in wild oat. Wild oat seeds were collected in 33 fields in 1997 and in 31 fields in each of 1998 and 1999 (one field per grower). Seedlings were screened for resistance to two acetyl-CoA carboxylase (ACCase) inhibitors; to imazamethabenz, an acetolactate synthase (ALS) inhibitor; and to triallate, a thiocarbamate herbicide. A questionnaire on herbicide resistance awareness and management practices was completed by each grower. Both ACCase and ALS inhibitor resistance in wild oat were linked to a lack of crop rotation diversity. In addition, ALS inhibitor–resistant wild oat was associated with conservation-tillage systems and recent use of herbicides with that mode of action. Results of this study suggest that timely tillage and the inclusion of fall-seeded and perennial forage crops in rotations will effectively slow the selection of resistance in this grass species.
Nineteenth-century negatives and positives in the collections of the National Museums of Scotland (NMS) and the National Galleries of Scotland (NGS) were analysed non-destructively to identify the techniques used in their manufacture. Modern positive and negative images prepared using known nineteenth-century processes were also analysed for comparison. Air-path energy dispersive X-ray fluorescence analysis and controlled-pressure scanning electron microscopy with energy dispersive microanalysis enabled the images to be divided into groups based on their levels of bromine, iodine and silver, from which the likely processes used were inferred. An early group of positives was probably sensitised with either silver chloride or silver bromide and fixed with potassium bromide. However, most positives were probably sensitised with silver chloride and fixed with sodium thiosulphate. Most negatives were probably sensitised with silver iodide and fixed with potassium bromide (predominant), sodium thiosulphate or potassium iodide. Cobalt and arsenic are present owing to the use of smalt in the production of white paper. Copper and zinc, observed as small blue spots, are attributed to fragments of brass buttons left in the rags used in paper production. The presence of iron, sometimes visible as orange spots, may be from rust from the paper-making machines.