Intravenous tissue-type plasminogen activator (IVtPA) is a proven treatment for acute ischemic stroke; however, diabetes mellitus (DM) and previous cerebral infarction (PCI) have been considered relative contraindications to thrombolysis within the 3–4.5 h window.
The study aimed to determine the safety and efficacy of IVtPA among diabetic patients with PCI presenting with acute ischemic stroke.
Studies evaluating the outcome of IVtPA in terms of symptomatic intracerebral hemorrhage (sICH), functional outcome on the modified Rankin Scale, and death among diabetic patients with PCI presenting with acute ischemic stroke within the 3–4.5 h window were systematically searched up to July 2019. Screening and eligibility criteria were applied. Risk of bias was evaluated using the Newcastle–Ottawa Scale. Odds ratios (ORs) with 95% confidence intervals (CIs) were used to compare measures of treatment effect. The Mantel–Haenszel method and a random-effects model were employed.
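The Mantel–Haenszel method mentioned above pools stratum-specific 2×2 tables into a single odds ratio. As a minimal sketch of the pooling step (fixed-effect form only; the counts below are hypothetical and are not the study's data):

```python
# Minimal sketch of the Mantel–Haenszel pooled odds ratio (fixed-effect form).
# The 2x2 counts are hypothetical, NOT the data of the studies reviewed here.

def mantel_haenszel_or(tables):
    """Pool odds ratios across strata (e.g. studies).

    Each table is (a, b, c, d):
      a = events in exposed,   b = non-events in exposed,
      c = events in controls,  d = non-events in controls.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two hypothetical registry strata
tables = [
    (15, 485, 20, 980),
    (30, 970, 25, 1475),
]
pooled_or = mantel_haenszel_or(tables)
```

A random-effects pooled estimate, as also used in the review, would additionally weight each stratum by a between-study variance component.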
Four registry-based studies with a total of 44,572 patients were included in the quantitative synthesis. Administering IVtPA to DM+/PCI+ patients did not significantly increase the rate of sICH (OR, 1.09; 95% CI, 0.88–1.36) compared with patients without both conditions. However, mortality was significantly higher (OR, 1.81; 95% CI, 1.60–2.06) in the DM+/PCI+ group. Conversely, among those who survived, DM+/PCI+ patients were more functionally independent at 3 months (OR, 0.76; 95% CI, 0.61–0.94).
Limited evidence suggests that thrombolysis in DM+/PCI+ patients does not result in significantly higher incidence of sICH and may improve functional independence. However, the significantly higher mortality in this group warrants an assessment of the individualized risk–benefit ratio in the use of IVtPA.
Criminal law and criminal justice are becoming increasingly globalised. In open societies, the era in which individual jurisdictions developed their own codes, statutes and systems of justice with no regard to other systems and countries is long over. There is a growing desire to develop common approaches to common problems and to learn from the diversity of current practice in different countries. This development has been reinforced by the internationalisation of criminal justice in international and mixed criminal tribunals. However, attempts at trans-jurisdictional discourse are often hampered by mutual misunderstanding. Some problems are linguistic: although English is the new lingua franca in international and comparative criminal law, not all foundational concepts of criminal law and justice originate in the English-speaking world; some of them are rooted in civil law jurisdictions, such as France, Germany and Italy. The translation of these concepts into English is thus subject to ambiguity and potential error: the same term may have different meanings in different legal contexts. As a consequence, critical and theoretical discussions too often take place within the different legal traditions rather than between them: Anglo-American scholars talk to each other, as do those taught in Continental European criminal law traditions; too rarely do they engage seriously with each other across these jurisdictional borders.
A defendant’s prior crimes affect decision-making throughout the criminal process, from decisions taken by the police, prosecutors and investigating magistrates (bail), through to prison and parole authorities considering whether to release prisoners. It is at sentencing, however, that criminal history has the greatest impact on decisions and the lives of defendants. Of all the aggravating factors, a criminal record is the most commonly invoked, the most powerful and also the most controversial. In general, people with prior convictions are treated more harshly in all criminal justice systems, civil and common law. This near-universal sentencing policy is variously described as a Recidivist Sentencing Premium, a Prior Record Enhancement, or Criminal History Enhancement; the German term is Strafschärfung für Rückfalltäter or, more briefly, Rückfallschärfung. The penologist Nigel Walker referred to prior convictions as ‘the most obvious example of aggravation’ and Hessick and Hessick described the recidivist sentencing premium as ‘one punishment issue on which everyone seems to agree’. In this chapter, we argue that it is neither as obvious nor as consensual as these quotes suggest. Other authors seem closer to the truth when they describe ‘the controversial question of sentencing repeat offenders’.
Sunflower protein is not used in human nutrition despite a relatively good amino acid composition. Moreover, the bioavailability of sunflower isolate has never been measured in humans. The goal of this work was to determine the ileal digestibility of protein and amino acids from a sunflower isolate in healthy volunteers and to test the newly developed dual-isotope method.
Materials and methods:
Eight healthy volunteers were equipped with a naso-ileal tube. Over four hours, they received twelve doses of biscuits containing, in total, 25 g of 15N-labelled sunflower protein isolate, together with 400 mg of a mixture of free 13C amino acids incorporated in chocolate. Polyethylene glycol was perfused as a non-absorbable marker, and ileal contents were collected for 8 hours after ingestion of the first meal. True ileal digestibility was measured by assessing nitrogen and carbon content as well as 15N and 13C enrichments by EA-IRMS. Amino acid digestibility was determined by measuring 15N and 13C enrichments by GC-C-IRMS and amino acid quantities by UPLC. Blood was collected for 8 h to determine 15N and 13C enrichments by GC-C-IRMS.
The ileal flow rate was 2.7 ± 0.5 mL/min (mean ± SD). On average, 53.1 ± 12.0 mmol of exogenous nitrogen was recovered during the eight hours of the experiment, yielding an ileal digestibility of the sunflower isolate of 85.6 ± 2.6% of ingested nitrogen. 13C amino acids were also recovered at the ileal level; the mixture of free 13C amino acids showed an ileal digestibility of 94.9 ± 0.9%. Analyses of ileal indispensable amino acid digestibility and DIAAS are in progress.
Ileal digestibility of sunflower isolate incorporated in toasted biscuits was lower than the value found for a raw isolate in a rat model (94.5%). The study revealed that 5% of the free amino acids were not absorbed by the end of the ileum. Amino acid digestibility analyses will complete the study, allowing evaluation of the DIAAS of sunflower isolate and comparison of values obtained with the standard method and with the dual-isotope method.
Attempts at trans-jurisdictional debate and agreement are often beset by mutual misunderstanding. Professionals and academics engaged in comparative criminal law sometimes use the same terms with different meanings or different terms which mean the same thing. Although English is the new lingua franca in international and comparative criminal law, there are many ambiguities and uncertainties with regard to foundational criminal law and criminal justice concepts. However, there are greater similarities among diverse systems of criminal law and justice than is commonly realised. This book explores the foundational principles and concepts that underpin the different domestic systems. It focuses on the Germanic and several principal Anglo-American jurisdictions, which are employed as examples of the wider common law–civil law divide.
This chapter outlines the Capacity Market (CM) which exists in Great Britain (GB). It covers some of the context and history of the mechanism including how the CM was designed, the process required for State aid approval and the legislation that underpins it. It then briefly refers to the economic theory behind capacity mechanisms and reviews how the CM in GB works in practice. Finally, it analyses the results seen so far in GB and some of the future challenges as the energy market continues to evolve.
CONTEXT AND HISTORY
CHANGES TO THE ENERGY SYSTEM
Great Britain, like many countries in Europe, is facing significant changes in its electricity system. Many traditional generators, such as large coal and nuclear plants, are coming to the natural end of their lives. Nor is this a purely recent phenomenon: 11.5 GW of coal and oil capacity closed before 2015 due to the implementation of the EU Large Combustion Plant Directive. At least a further 8 GW of capacity is expected to close by 2030 as plants become uneconomic, leading to a greater need for investment in replacement capacity. In addition, the power sector must make significant reductions in its carbon intensity in order to meet legally binding emissions targets. The Government's most recent Carbon Budget set a target of reducing carbon emissions by 57% by 2030, compared to 1990 levels.
Intermittent generating technologies such as solar and wind are an increasingly significant part of the electricity system, with almost 30 GW installed by the end of 2017. Renewable sources provided almost a quarter of total UK electricity generation in 2016, compared to only 5% in 2006. This creates an increasing need for flexibility and for backup generation to cover periods when intermittent technologies are not delivering.
The way the system is being used is also changing. New technologies such as electricity storage are being deployed on a commercial scale as costs reduce. And more generators are now locating closer to customers and connecting directly to the distribution system. All of these changes are creating new challenges for the market, system operators, regulators and government.
Treatment for hoarding disorder is typically performed by mental health professionals, potentially limiting access to care in underserved areas.
We aimed to conduct a non-inferiority trial of group peer-facilitated therapy (G-PFT) and group psychologist-led cognitive–behavioural therapy (G-CBT).
We randomised 323 adults with hoarding disorder to 15 weeks of G-PFT or 16 weeks of G-CBT and assessed them at baseline, post-treatment and longitudinally (≥3 months post-treatment: mean 14.4 months, range 3–25). Predictors of treatment response were examined.
G-PFT (effect size 1.20) was as effective as G-CBT (effect size 1.21; between-group difference 1.82 points, t = −1.71, d.f. = 245, P = 0.04). Greater homework completion and ongoing help from family and friends predicted lower severity scores at longitudinal follow-up (t = 2.79, d.f. = 175, P = 0.006; t = 2.89, d.f. = 175, P = 0.004).
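The effect sizes quoted above are standardized mean differences. As a hedged sketch of one common way to compute such a measure, here is a pooled-SD Cohen's d applied to hypothetical pre/post severity scores (the trial's exact effect-size formula is not specified here):

```python
import math

# Hedged sketch: pooled-SD Cohen's d for two sets of severity scores.
# The scores below are hypothetical, not the trial's data.

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled sample SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

pre = [60, 55, 70, 65, 50]    # hypothetical baseline severity scores
post = [50, 48, 60, 52, 45]   # hypothetical post-treatment scores
d = cohens_d(pre, post)
```

An effect size above 1, as reported for both arms, corresponds to a mean improvement exceeding one pooled standard deviation.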
Peer-led groups were as effective as psychologist-led groups, providing a novel treatment avenue for individuals without access to mental health professionals.
Declaration of interest
C.A.M. has received grant funding from the National Institutes of Health (NIH) and travel reimbursement and speakers’ honoraria from the Tourette Association of America (TAA), as well as honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. K.D. receives research support from the NIH and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. R.S.M. receives research support from the National Institute of Mental Health, National Institute of Aging, the Hillblom Foundation, Janssen Pharmaceuticals (research grant) and the Alzheimer's Association. R.S.M. has also received travel support from the National Institute of Mental Health for Workshop participation. J.Y.T. receives research support from the NIH, Patient-Centered Outcomes Research Institute and the California Tobacco Related Research Program, and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. All other authors report no conflicts of interest.
Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
To examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS.
American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
Of 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal procedure Current Procedural Terminology (CPT) code. Cancer was defined by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
The primary outcome of interest was 30-day SSI rate.
A total of 652 SSIs (11.06%) were reported among 5,893 CRSs. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
Patients with disseminated cancer are at higher risk for developing SSI. ASA score ≥3, COPD, and longer duration of surgery predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) in generating risk-adjusted outcomes.
An evidence-based emergency department (ED) atrial fibrillation and flutter (AFF) pathway was developed to improve care. The primary objective was to measure rates of new anticoagulation (AC) on ED discharge for AFF patients who were not correctly anticoagulated at presentation.
This is a pre-post evaluation from April to December 2013 measuring the impact of our pathway on rates of new AC and other performance measures in patients with uncomplicated AFF solely managed by emergency physicians. A standardized chart review identified demographics, comorbidities, and ED treatments. The primary outcome was the rate of new AC. Secondary outcomes were ED length of stay (LOS), referrals to AFF clinic, ED revisit rates, and 30-day rates of return visits for congestive heart failure (CHF), stroke, major bleeding, and death.
A total of 301 ED AFF patients were included (129 pre-pathway [PRE]; 172 post-pathway [POST]); baseline demographics were similar between groups. The rates of AC at ED presentation were 18.6% (PRE) and 19.7% (POST). The rates of new AC on ED discharge were 48.6% PRE (95% confidence interval [CI] 42.1%–55.1%) and 70.2% POST (62.1%–78.3%), a difference of 20.6% (p<0.01; 15.1–26.3). Median ED LOS decreased from 262 to 218 minutes, a reduction of 44 minutes (p<0.03; 36.2–51.8). Thirty-day rates of ED revisits for CHF decreased from 13.2% to 2.3% (difference 10.9%; p<0.01; 8.1%–13.7%), and rates of other measures were similar.
The evidence-based pathway led to an improvement in the rate of patients with new AC upon discharge, a reduction in ED LOS, and decreased revisit rates for CHF.
The San Francisco Fire Department’s (SFFD; San Francisco, California USA) Homeless Outreach and Medical Emergency (HOME) Team is the United States’ first Emergency Medical Services (EMS)-based outreach effort using a specially trained paramedic to redirect frequent users of EMS to other types of services. The effectiveness of this program at reducing repeat use of emergency services during the first seven months of the team’s existence was examined.
A retrospective analysis of EMS use frequency and demographic characteristics of frequent users was conducted. Clients who used emergency services at least four times per month from March 2004 through May 2005 were contacted for intervention. Patterns for each frequent user before and after intervention were analyzed. Changes in EMS use during the 15-month study interval were the primary outcome measure.
A total of 59 clients were included. The target population had a median age of 55.1 years and was 68% male. Additionally, 38.0% of the target population was homeless, 43.4% had no primary care, 88.9% had a substance abuse disorder at the time of contact, and 83.0% had a history of psychiatric disorder. The HOME Team undertook 320 distinct contacts with 65 frequent users during the study period. The average EMS use prior to HOME Team contact was 18.72 responses per month (SD=19.40); after the first contact with the HOME Team, use dropped to 8.61 (SD=10.84), P<.001.
Frequent users of EMS suffer from disproportionate comorbidities, particularly substance abuse and psychiatric disorders. This population responds well to the intervention of a specially trained paramedic as measured by EMS usage.
Tangherlini N, Villar J, Brown J, Rodriguez RM, Yeh C, Friedman BT, Wada P. The HOME Team: Evaluating the Effect of an EMS-based Outreach Team to Decrease the Frequency of 911 Use Among High Utilizers of EMS. Prehosp Disaster Med. 2016;31(6):603–607.
A total of 149 children, who spent an average of 13.8 months in Russian institutions, were transferred to Russian families of relatives and nonrelatives at an average age of 24.7 months. After residing in these families for at least 1 year (average = 43.2 months), parents reported on their attachment, indiscriminately friendly behavior, social–emotional competencies, problem behaviors, and effortful control when they were 1.5–10.7 years of age. They were compared to a sample of 83 Russian parents of noninstitutionalized children, whom they had reared from birth. Generally, institutionalized children were rated similarly to parent-reared children on most measures, consistent with substantial catch-up growth typically displayed by children after transitioning to families. However, institutionalized children were rated more poorly than parent-reared children on certain competencies in early childhood and some attentional skills. There were relatively few systematic differences associated with age at family placement or whether the families were relatives or nonrelatives. Russian parent-reared children were rated as having more problem behaviors than the US standardization sample, which raises cautions about using standards cross-culturally.
The objective of the present study was to investigate associations between sugar intake and overweight using dietary biomarkers in the Norfolk cohort of the European Prospective Investigation into Cancer and Nutrition (EPIC-Norfolk).
Prospective cohort study.
EPIC-Norfolk in the UK, recruitment between 1993 and 1997.
Men and women (n 1734) aged 39–77 years. Sucrose intake was assessed using 7 d diet diaries. Baseline spot urine samples were analysed for sucrose by GC-MS. Sucrose concentration adjusted by specific gravity was used as a biomarker for intake. Regression analyses were used to investigate associations between sucrose intake and risk of BMI>25·0 kg/m2 after three years of follow-up.
After three years of follow-up, mean BMI was 26·8 kg/m2. Self-reported sucrose intake was significantly positively associated with the biomarker. Associations between the biomarker and BMI were positive (β=0·25; 95 % CI 0·08, 0·43), while they were inverse when using self-reported dietary data (β=−1·40; 95 % CI −1·81, −0·99). The age- and sex-adjusted OR for BMI>25·0 kg/m2 in participants in the fifth v. first quintile was 1·54 (95 % CI 1·12, 2·12; Ptrend=0·003) when using biomarker and 0·56 (95 % CI 0·40, 0·77; Ptrend<0·001) with self-reported dietary data.
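The quintile odds ratios reported above can be obtained from a 2×2 table with a log-scale (Woolf) confidence interval. As a minimal sketch with hypothetical counts (not the EPIC-Norfolk data):

```python
import math

# Hedged sketch: odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table.
# The counts below are hypothetical, NOT the EPIC-Norfolk data.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI.

    a/b = outcome / no outcome in the exposed group (e.g. top quintile),
    c/d = outcome / no outcome in the reference group (bottom quintile).
    """
    or_point = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_point) - z * se_log)
    upper = math.exp(math.log(or_point) + z * se_log)
    return or_point, lower, upper

# Hypothetical: BMI > 25 vs not, top vs bottom biomarker quintile
or_point, lower, upper = odds_ratio_ci(120, 80, 90, 110)
```

The study's reported ORs were additionally adjusted for age and sex, which requires a logistic regression model rather than this crude 2×2 calculation.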
Our results suggest that sucrose measured by objective biomarker but not self-reported sucrose intake is positively associated with BMI. Future studies should consider the use of objective biomarkers of sucrose intake.