This study reports an early economic evaluation to inform the translation into clinical practice of a spectroscopic liquid biopsy for the detection of brain cancer. Two specific aims are (1) to update an existing economic model with results from a prospective study of diagnostic accuracy and (2) to explore the potential of brain tumor-type predictions to affect patient outcomes and healthcare costs.
A cost-effectiveness analysis of the use of spectroscopic liquid biopsy in primary and secondary care settings, as well as a cost–consequence analysis of the addition of tumor-type predictions, was conducted from a UK NHS perspective. Decision tree models were constructed to represent simplified diagnostic pathways. Test diagnostic accuracy parameters were based on a prospective validation study. Four price points (GBP 50–200, EUR 57–228) for the test were considered.
In both settings, the use of liquid biopsy produced QALY gains. In primary care, at test costs below GBP 100 (EUR 114), testing was cost saving. At GBP 100 (EUR 114) per test, the ICER was GBP 13,279 (EUR 15,145), whereas at GBP 200 (EUR 228), the ICER was GBP 78,300 (EUR 89,301). In secondary care, the ICER ranged from GBP 11,360 (EUR 12,956) to GBP 43,870 (EUR 50,034) across the range of test costs.
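As background on the metric used above, an incremental cost-effectiveness ratio (ICER) is the difference in expected costs between two strategies divided by the difference in expected QALYs. The following is a minimal sketch of the calculation; the per-patient values are purely hypothetical placeholders, not figures from the study's model.

```python
# Minimal ICER illustration (all numbers hypothetical, not study data).
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    delta_cost = cost_new - cost_old
    delta_qaly = qaly_new - qaly_old
    if delta_cost <= 0 and delta_qaly >= 0:
        return "dominant (cheaper and more effective)"
    return delta_cost / delta_qaly

# Hypothetical per-patient expected values: testing vs. no-testing pathway.
print(icer(cost_new=10_500, cost_old=10_000, qaly_new=5.04, qaly_old=5.00))
# -> 12500.0 (GBP per QALY gained)
```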
The results demonstrate the potential for the technology to be cost-effective in both primary and secondary care settings. Additional studies of test use in routine primary care practice are needed to resolve the remaining areas of uncertainty: prevalence in this patient population and referral behavior.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
Background: Advances in surgical leads have been thought to potentially enable improved low-back pain relief using spinal cord stimulation (SCS). A recently introduced 32-contact surgical lead, which couples multiple independent current control with anatomically based neural targeting stimulation algorithms, allows for patient-specific programming optimization. We present a real-world study of this surgical lead. Methods: A multi-center, consecutive, observational study of a new 32-contact surgical lead was carried out using the Precision Spectra SCS System (Boston Scientific) in 100 subjects out to 12 months post-implant. We examined procedural information, programming parameters, and clinical outcomes including pain reduction (numerical rating scale, NRS), activities of daily living, and change in pain medications. Results: Surgical lead placement was distributed between T7 and L2, with most placed at the top of T9 (26%). A mean reduction of 5.1 points (SD 2.15, p<0.001) in overall pain was observed, from 7.8 at baseline to 2.6. A subset of subjects reporting low-back pain only exhibited a mean decrease of 6.0 points (SD 2.12, p<0.001), from 8.3 at baseline to 2.2. Of these, 83.1% of subjects showed ≥50% back pain reduction. Increases in activities of daily living and reductions in pain medication usage were also observed in the majority of subjects. Conclusions: Subjects implanted with a 32-contact surgical lead using a neural targeting algorithm demonstrated significant low-back pain reduction.
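The ≥50% responder criterion reported above is the conventional outcome measure in SCS trials. The sketch below, using made-up NRS scores rather than study data, shows how per-subject percent pain reduction and the responder rate are typically computed.

```python
# Hypothetical baseline/follow-up NRS pain scores (0-10); not study data.
baseline = [8, 9, 7, 8, 6]
followup = [2, 3, 6, 1, 2]

# Fractional reduction per subject; a "responder" improves by >=50%.
reductions = [(b - f) / b for b, f in zip(baseline, followup)]
responders = sum(r >= 0.5 for r in reductions)

mean_drop = sum(b - f for b, f in zip(baseline, followup)) / len(baseline)
print(f"mean reduction: {mean_drop:.1f} points")                       # 4.8
print(f"responder rate: {100 * responders / len(baseline):.0f}%")      # 80%
```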
Objectives: Individuals with major depressive disorder (MDD) demonstrate poorer learning and memory skills relative to never-depressed comparisons (NDC). Previous studies report decreased volume and disrupted function of frontal lobes and hippocampi in MDD during memory challenge. However, it has been difficult to dissociate contributions of short-term memory and executive functioning to memory difficulties from those that might be attributable to long-term memory deficits. Methods: Adult males (MDD, n=19; NDC, n=22) and females (MDD, n=23; NDC, n=19) performed the Semantic List Learning Task (SLLT) during functional magnetic resonance imaging. The SLLT Encoding condition consists of 15 lists, each containing 14 words. After each list, a Distractor condition occurs, followed by cued Silent Rehearsal instructions. Post-scan recall and recognition were collected. Groups were compared using block (Encoding-Silent Rehearsal) and event-related (Words Recalled) models. Results: MDD displayed lower recall relative to NDC. NDC displayed greater activation in several temporal, frontal, and parietal regions, for both Encoding-Silent Rehearsal and the Words Recalled analyses. Groups also differed in activation patterns in regions of the Papez circuit in planned analyses. The majority of activation differences were not related to performance, presence of medications, presence of comorbid anxiety disorder, or decreased gray matter volume in MDD. Conclusions: Adults with MDD exhibit memory difficulties during a task designed to reduce the contribution of individual variability from short-term memory and executive functioning processes, in parallel with decreased activation in memory and executive functioning circuits. Ecologically valid long-term memory tasks are imperative for uncovering neural correlates of memory performance deficits in adults with MDD. (JINS, 2016, 22, 412–425)
Coconut, Cocos nucifera L., is a tree that is cultivated to provide a large number of products, although it is mainly grown for its nutritional and medicinal values. Coconut oil, derived from the coconut fruit, has been recognised historically as containing high levels of saturated fat; however, closer scrutiny suggests that coconut should be regarded more favourably. Unlike most other dietary fats that are high in long-chain fatty acids, coconut oil comprises medium-chain fatty acids (MCFA). MCFA are unique in that they are easily absorbed and metabolised by the liver, and can be converted to ketones. Ketone bodies are an important alternative energy source in the brain, and may be beneficial to people who are developing, or already have, memory impairment, as in Alzheimer's disease (AD). Coconut is classified as a highly nutritious ‘functional food’. It is rich in dietary fibre, vitamins and minerals; notably, however, evidence is mounting to support the concept that coconut may be beneficial in the treatment of obesity, dyslipidaemia, elevated LDL, insulin resistance and hypertension; these are risk factors for CVD and type 2 diabetes, and also for AD. In addition, phenolic compounds and hormones (cytokinins) found in coconut may assist in preventing the aggregation of amyloid-β peptide, potentially inhibiting a key step in the pathogenesis of AD. The purpose of the present review was to explore the literature related to coconut, outlining the known mechanistic physiology, and to discuss the potential role of coconut supplementation as a therapeutic option in the prevention and management of AD.
A new technique for the preparation of heavily cracked, heavily damaged, brittle materials for examination in a transmission electron microscope (TEM) is described in detail. In this study, cross-sectional TEM samples were prepared from indented silicon carbide (SiC) bulk ceramics, although this technique could also be applied to other brittle and/or multiphase materials. During TEM sample preparation, milling-induced damage must be minimized, since, when studying deformation mechanisms, it would be difficult to distinguish deformation-induced cracking from cracking introduced during sample preparation. The samples were prepared using a site-specific, two-step ion milling sequence accompanied by epoxy vacuum infiltration into the cracks. This technique allows the heavily cracked, brittle ceramic material to stay intact during sample preparation and also helps preserve the true microstructure of the cracked area underneath the indent. Some preliminary TEM results are given and discussed with regard to deformation studies in ceramic materials. This sample preparation technique could be applied to other cracked and/or heavily damaged materials, including geological materials, archaeological materials, fatigued materials, and corrosion samples.
Field experiments were conducted between 2009 and 2011 in Ireland to compare the effects of soil tillage systems on the grain yield, nitrogen use efficiency (NUE) and nitrogen (N) uptake patterns of spring barley (Hordeum vulgare) in a cool Atlantic climate. The four tillage treatments comprised conventional tillage in spring (CT), reduced tillage in autumn (RT A), reduced tillage in spring (RT S) and reduced tillage in autumn and spring (RT A+S). Each tillage system was evaluated with five levels of fertilizer N (0, 75, 105, 135 and 165 kg N/ha). Grain yield varied between years but CT had a significantly higher mean yield over the three years than the RT systems. There was no significant difference between the three RT systems. Tillage system had no significant effect on the grain yield response to fertilizer N. As a result of the higher yields achieved, the CT system had a higher NUE than the RT systems at all N rates. There was no significant difference in NUE between the three RT systems. Conventional tillage had significantly higher nitrogen uptake efficiency (NUpE) than RT A and a significantly higher nitrogen utilization efficiency (NUtE) than all three RT systems. Crop N uptake followed a similar pattern each year. Large amounts of N were accumulated during the vegetative growth stages while N was lost after anthesis. Increased N rates had a positive effect on N uptake in the early growth stages but tended to promote N loss later in the season. The CT system had the highest N uptake in the initial growth stages but its rate of uptake diminished at a faster rate than the RT systems as the season progressed. Tillage system had an inconsistent effect on crop N content during the later growth stages. On the basis of these results it is concluded that the use of non-inversion tillage systems for spring barley establishment in a cool oceanic climate remains challenging and in certain conditions may result in a reduction in NUE and lower and more variable grain yields than conventional plough-based systems.
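The efficiency terms used above are conventionally decomposed as NUpE = crop N uptake / N supply and NUtE = grain yield / crop N uptake, so that NUE = NUpE × NUtE = grain yield per unit N supply. The sketch below applies these standard definitions with made-up values; the study's exact definitions may differ, for example in whether soil mineral N is included in the supply term.

```python
# Standard NUE decomposition with hypothetical values; not trial data.
def nue_components(grain_yield, n_uptake, n_supply):
    """Return (NUpE, NUtE, NUE) for a plot.

    grain_yield: kg grain/ha
    n_uptake:    kg N/ha taken up by the crop
    n_supply:    kg N/ha available (fertilizer N, plus soil N if measured)
    """
    nupe = n_uptake / n_supply       # uptake efficiency
    nute = grain_yield / n_uptake    # utilization efficiency
    return nupe, nute, nupe * nute   # NUE = grain yield per unit N supply

print(nue_components(grain_yield=6500, n_uptake=120, n_supply=135))
# -> (~0.89, ~54.2, ~48.1 kg grain per kg N supplied)
```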
Vitamin D deficiency is emerging worldwide and many studies now suggest its role in the development of several chronic diseases. Due to the low level of vitamin D naturally occurring in food, there is a need for supplementation and use of vitamin D-enhanced products. The aim of the present study was to determine if daily consumption of vitamin D2-enhanced mushrooms increased vitamin D status in free-living healthy adults or affected markers of the metabolic syndrome. A total of ninety volunteers (aged 40–65 years) were randomly assigned to one of two 4-week studies: a mushroom study (15 µg vitamin D2 or placebo mushroom powder) and a capsule study (15 µg vitamin D3 or placebo capsules). Consumption of vitamin D2-enhanced mushrooms increased serum 25-hydroxyvitamin D2 (25(OH)D2) by 128 % from baseline (3·9 (sd 1·9) nmol/l; P < 0·05). Serum 25(OH)D3 increased significantly in the vitamin D3 capsule group (a 55 % increase from a baseline of 44·0 (sd 17·1) nmol/l; P < 0·05). Vitamin D status (25(OH)D) was affected only in the vitamin D3 group. Plasminogen activator inhibitor-1 was lowered by vitamin D2 intake. Vitamin D2 from enhanced mushrooms was bioavailable and increased serum 25(OH)D2 concentration with no significant effect on 25(OH)D3 or total 25(OH)D.
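One point of arithmetic worth making explicit: a "128 % increase from baseline" means the follow-up value is baseline × 2.28, not 128 % of baseline. A quick check using only the baseline reported above:

```python
# Percent-change arithmetic for the reported 25(OH)D2 baseline of 3.9 nmol/l.
baseline = 3.9            # nmol/l, as reported
increase = 1.28           # a 128 % increase from baseline
followup = baseline * (1 + increase)
print(f"implied follow-up 25(OH)D2: {followup:.1f} nmol/l")  # ~8.9 nmol/l
```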
Several neuroimaging studies have investigated brain grey matter in people with body dysmorphic disorder (BDD), showing possible abnormalities in the limbic system, orbitofrontal cortex, caudate nuclei and temporal lobes. This study takes these findings forward by investigating white matter properties in BDD compared with controls using diffusion tensor imaging. It was hypothesized that the BDD sample would have widespread significantly reduced white matter connectivity as characterized by fractional anisotropy (FA).
A total of 20 participants with BDD and 20 healthy controls matched on age, gender and handedness underwent diffusion tensor imaging. FA, a measure of the directional coherence of water diffusion within a voxel, was compared between groups on a voxel-by-voxel basis across the brain using tract-based spatial statistics within the FSL package.
Results showed that, compared with healthy controls, BDD patients demonstrated significantly lower FA (p < 0.05) in most major white matter tracts throughout the brain, including in the superior longitudinal fasciculus, inferior fronto-occipital fasciculus and corpus callosum. Lower FA levels could be accounted for by increased radial diffusivity as characterized by eigenvalues 2 and 3. No area of higher FA was found in BDD.
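For reference, FA and radial diffusivity are standard functions of the diffusion tensor eigenvalues (λ1 ≥ λ2 ≥ λ3). The sketch below shows the usual formulas with an arbitrary illustrative tensor; the eigenvalues are not data from this study.

```python
import math

def fa_and_rd(l1, l2, l3):
    """Fractional anisotropy and radial diffusivity from tensor eigenvalues."""
    md = (l1 + l2 + l3) / 3                       # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)               # 0 = isotropic, 1 = fully anisotropic
    rd = (l2 + l3) / 2                            # radial diffusivity (eigenvalues 2 and 3)
    return fa, rd

# Illustrative eigenvalues in 10^-3 mm^2/s (typical white-matter magnitudes).
print(fa_and_rd(1.7, 0.4, 0.3))                   # FA ~0.76, RD 0.35
```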
This study provided the first evidence of compromised white matter integrity within BDD patients. This suggests that there are inefficient connections between different brain areas, which may explain the cognitive and emotion regulation deficits within BDD patients.
To estimate the proportion of healthcare-associated infections (HAIs) in US hospitals that are “reasonably preventable,” along with their related mortality and costs.
To estimate preventability of catheter-associated bloodstream infections (CABSIs), catheter-associated urinary tract infections (CAUTIs), surgical site infections (SSIs), and ventilator-associated pneumonia (VAP), we used a federally sponsored systematic review of interventions to reduce HAIs. Ranges of preventability included the lowest and highest risk reductions reported by US studies of “moderate” to “good” quality published in the last 10 years. We used the most recently published national data to determine the annual incidence of HAIs and associated mortality. To estimate incremental cost of HAIs, we performed a systematic review, which included costs from studies in general US patient populations. To calculate ranges for the annual number of preventable infections and deaths and annual costs, we multiplied our infection, mortality, and cost figures by our ranges of preventability for each HAI.
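The final step described above is a straightforward multiplication of incidence, mortality, and cost figures by preventability fractions. The sketch below shows the shape of that calculation; all figures are placeholders, not the review's actual estimates.

```python
# Shape of the preventability calculation; all figures are placeholders.
hais = {
    # name: (annual cases, annual deaths, cost per case USD, (low, high) preventable fraction)
    "CAUTI": (400_000, 9_000, 1_000, (0.65, 0.70)),
    "VAP":   (150_000, 20_000, 20_000, (0.55, 0.55)),
}

for name, (cases, deaths, cost, (lo, hi)) in hais.items():
    print(f"{name}: preventable cases {cases*lo:,.0f}-{cases*hi:,.0f}, "
          f"deaths {deaths*lo:,.0f}-{deaths*hi:,.0f}, "
          f"cost USD {cases*lo*cost:,.0f}-{cases*hi*cost:,.0f}")
```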
As many as 65%–70% of cases of CABSI and CAUTI and 55% of cases of VAP and SSI may be preventable with current evidence-based strategies. CAUTI may be the most preventable HAI. CABSI has the highest number of preventable deaths, followed by VAP. CABSI also has the highest cost impact; costs due to preventable cases of VAP, CAUTI, and SSI are likely less.
Our findings suggest that 100% prevention of HAIs may not be attainable with current evidence-based prevention strategies; however, comprehensive implementation of such strategies could prevent hundreds of thousands of HAIs and save tens of thousands of lives and billions of dollars.
Objectives: Health technology assessment (HTA) programs influence practice on a broad scale through reimbursement decisions or national guidelines. Hospital-based HTA programs inform clinical decisions at the local level. Typically, they do this by adapting general HTA to their local setting, or by creating new HTA. However, unlike payer-based HTA organizations, hospital-based HTA organizations can also integrate local data into their reports.
Methods: We describe two examples of local data integrated into hospital-based HTA. In the first, qualitative data were used to select a new cardiac catheterization lab. In the second, quantitative data were used to inform a decision on whether to continue telemedicine services to critical care units. Local evidence sources included equipment service records and interviews with physicians, technicians, and administrative staff in the first example, and the hospital's administrative and claims databases in the second example.
Results: In each case, there was little evidence from the peer-reviewed literature that could be applied to the decision. In the first example, staffing patterns and local preferences had considerable bearing on technology choices. In the second example, local outcomes data from administrative records were decisive.
Conclusions: Hospital-based HTA using local data can fill gaps in the published evidence, and also improve the generalizability of evidence to the local setting. To take advantage of local evidence, health systems should encourage the development of hospital-based HTA centers, seek out local preference data, and maintain databases of patient outcomes and utilization of services.