The occurrence of nonlocal objects, raw materials, and ideas in the southwestern United States (U.S. SW) has long been recognized as evidence of interaction between prehispanic peoples of this region and those of greater Mesoamerica. Although many archaeologists have analyzed the directionality and potential means by which these objects and concepts moved across the landscape, few have assessed the degree to which Mesoamerican practices and traditional assemblages remained intact as the artifacts and ideas moved farther from their places of origin. The current study analyzes the distribution and deposition of blue-green stone mosaics, a craft technology that was well established in Mesoamerica by the Late Preclassic period (300 BC–AD 250) and spread to the U.S. SW by the start of the Hohokam Pioneer period (AD 475). We assess the spatial distribution, contextual deposition, and morphology of mosaics at sites within Hohokam Canal System 2, located in the Phoenix Basin of Arizona. We use these data to infer mosaics’ social value and function within Hohokam social structure. Analyses suggest that, although the technology of mosaic making may have originated in Mesoamerica, the contexts and ways in which mosaics were used in the Hohokam regional system were decidedly Hohokam.
The blockchain industry has recently broken through into the general public’s consciousness. Gone are the days when blockchain projects were solely the interest of computer programmers, libertarians, and anti-government activists. Discussion of the industry now graces the pages of the New York Times and the Wall Street Journal, and the nascent industry is regularly covered by television news programs such as CNBC’s Fast Money. The majority of this attention has been directed to price increases in cryptocurrencies, such as Bitcoin, but a new vehicle for raising capital – known as an initial coin offering, or ICO – has also fueled public enthusiasm. All of this excitement and curiosity has made it harder and harder for lawyers to ignore the industry. As such, it is beneficial for lawyers to gain a high-level understanding of what the blockchain industry is and how it makes technologies like cryptocurrencies possible.
Benzodiazepine (BZD) prescription rates have increased over the past decade in the United States. Available literature indicates that sociodemographic factors may influence diagnostic patterns and/or prescription behaviour. Herein, the aim of this study is to determine whether the gender of the prescriber and/or patient influences BZD prescription.
Cross-sectional study using data from the Florida Medicaid Managed Medical Assistance Program from January 1, 2018 to December 31, 2018. Eligible recipients were aged 18 to 64, inclusive, were enrolled in the Florida Medicaid plan for at least 1 day, and were dually eligible. Recipients either had a serious mental illness (SMI) or had anxiety without SMI.
A total of 125 463 cases were identified (i.e., received a BZD or non-BZD prescription). The interaction of patient and prescriber gender was not significant, F(1, 125 459) = 0.105, P = 0.745, partial η² < 0.001. The relative risk (RR) of male prescribers prescribing a BZD compared to female prescribers was 1.540, 95% confidence interval (CI) [1.513, 1.567], whereas the RR of male patients being prescribed a BZD compared to female patients was 1.16, 95% CI [1.14, 1.18]. The main effects of patient and prescriber gender were statistically significant, F(1, 125 459) = 188.232, P < 0.001, partial η² = 0.001 and F(1, 125 459) = 349.704, P < 0.001, partial η² = 0.013, respectively.
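For readers who want to see the mechanics behind these figures, the sketch below computes a relative risk and its 95% Wald confidence interval from a 2×2 table. The counts are hypothetical placeholders (the abstract does not report the underlying table); only the formulas are standard.

```python
import math

# Hypothetical 2x2 table (NOT the study's data):
# rows = prescriber gender, columns = BZD vs. non-BZD prescription.
a, b = 4_800, 15_200    # male prescribers:   BZD, non-BZD
c, d = 15_600, 60_400   # female prescribers: BZD, non-BZD

p_male = a / (a + b)      # proportion of male-prescriber scripts that are BZDs
p_female = c / (c + d)
rr = p_male / p_female    # relative risk

# 95% Wald CI, constructed on the log scale.
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```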
Male prescribers are more likely to prescribe BZDs, and male patients are more likely to receive BZDs. Further studies are required to characterize the factors underlying these prescriber- and patient-gender differences.
To identify risk factors associated with progression to severe disease and with mortality among patients hospitalized with SARS-CoV-2 infection in the Southeast US.
Design, Setting, and Participants
Multicenter, retrospective cohort including 502 adults hospitalized with laboratory-confirmed COVID-19 between March 1, 2020 and May 8, 2020 within one of 15 participating hospitals in 5 health systems across 5 states in the Southeast US.
The study outcomes were hospital mortality and severe disease, defined as a composite of intensive care unit admission or requirement of mechanical ventilation.
A total of 502 patients were included, and the majority (476/502, 95%) had clinically evaluable outcomes. Hospital mortality was 16% (76/476), while 35% (177/502) required ICU admission, and 18% (91/502) required mechanical ventilation. By both univariate and adjusted multivariate analysis, hospital mortality was independently associated with age (adjusted odds ratio [aOR] 2.03 for each decade increase, 95% CI 1.56-2.69), male sex (aOR 2.44, 95% CI: 1.34-4.59), and cardiovascular disease (aOR 2.16, 95% CI: 1.15-4.09). As with mortality, risk of severe disease was independently associated with age (aOR 1.17 for each decade increase, 95% CI: 1.00-1.37), male sex (aOR 2.34, 95% CI 1.54-3.60), and cardiovascular disease (aOR 1.77, 95% CI 1.09-2.85).
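The adjusted odds ratios above are the exponentiated coefficients of a multivariable logistic regression. As a minimal sketch of that workflow, the following fits such a model on synthetic data with statsmodels; the variable names and effect sizes are invented for illustration, not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
# Synthetic cohort; 'decade' is age in decades, so its aOR is per 10-year increase.
df = pd.DataFrame({
    "decade": rng.normal(6.5, 1.5, n),
    "male": rng.integers(0, 2, n),
    "cvd": rng.integers(0, 2, n),
})
lin = -6 + 0.7 * df["decade"] + 0.9 * df["male"] + 0.8 * df["cvd"]
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("died ~ decade + male + cvd", data=df).fit(disp=False)
aor = np.exp(fit.params)      # adjusted odds ratios
ci = np.exp(fit.conf_int())   # 95% CIs, also on the OR scale
print(pd.concat([aor.rename("aOR"), ci], axis=1))
```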
In an adjusted multivariate analysis, advanced age, male sex, and cardiovascular disease increased risk of severe disease and mortality in patients with COVID-19 in the Southeast US. In-hospital mortality risk doubled with each subsequent decade of life.
The recently discovered massive and stockwork sulphide mineralization of Semblana-Rosa Magra and Monte Branco, situated ESE of the Neves–Corvo volcanogenic massive sulphide (VMS) deposit in the Iberian Pyrite Belt (IPB), is presented. The geological setting and tectonic model are discussed based on proxies such as palynostratigraphy and U–Pb zircon geochronology. The mineralization is found within the Lower sequence of the IPB Volcano-Sedimentary Complex (VSC), which includes felsic volcanic rocks (rhyolites) with U–Pb zircon ages of 359.6 ± 1.6 Ma, and black shales of the Neves Formation of late Strunian age. The massive sulphides are enveloped by these shales, implying that felsic volcanism, mineralization and shale sedimentation were essentially coeval. This circumstance is considered highly prospective, as it represents an important exploration vector for targeting VMS mineralization across the IPB in areas where the Lower VSC sequence is present. The Upper VSC sequence, with siliciclastic and volcanogenic sedimentary rocks of middle–late Visean age, shows no massive mineralization, but late Tournaisian (350.9 ± 2.3 Ma) volcanism with disseminated sulphides was also identified. Nevertheless, stratigraphic palynological gaps were found within the Strunian and Tournaisian sediments, between the Lower and Upper VSC sequences, reflecting probable erosion and uplift mechanisms linked with extensional tectonics. The Semblana and Monte Branco deposits and the Rosa Magra stockwork are enclosed by tectonic sheets that dismembered the VSC sequence into a fold-and-thrust tectonic complex characteristic of the NE Neves–Corvo region. The methodologies used allow a geological comparison between Neves–Corvo and other IPB mine regions such as Lousal–Caveira, Herrerias, Tharsis and Aznalcollar.
To evaluate changes in antimicrobial use during COVID-19 and after implementation of a multispecialty COVID-19 clinical guidance team, compared with pre–COVID-19 antimicrobial use.
Retrospective observational study.
Tertiary-care academic medical center.
Internal medicine and medical intensive care unit (MICU) provider teams and hospitalized COVID-19 patients.
We performed difference-in-differences analyses of antibiotic days of therapy per 1,000 patient days present (DOT) for internal medicine and MICU teams treating COVID-19 patients versus teams that did not, across 3 periods: before COVID-19, the initial COVID-19 period, and after implementation of a multispecialty COVID-19 clinical guidance team that included daily, patient-specific antimicrobial stewardship recommendations. Patient characteristics associated with antibiotic DOT were evaluated using multivariable Poisson regression.
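To make the difference-in-differences approach concrete, here is a minimal sketch using a Poisson model with a team-type × period interaction and log(days present) as an offset. All counts below are simulated and the column names are invented; this illustrates the general technique, not the study’s actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated team-week data: antibiotic DOT counts with a days-present exposure.
rng = np.random.default_rng(1)
rows = []
for covid_team in (0, 1):
    for period in ("pre", "initial", "intervention"):
        for _ in range(20):  # 20 team-weeks per cell
            days = int(rng.integers(800, 1200))
            rate = 0.45 + (0.15 if (covid_team and period == "initial") else 0.0)
            rows.append((covid_team, period, rng.poisson(rate * days), days))
df = pd.DataFrame(rows, columns=["covid_team", "period", "dot", "days"])

# The interaction terms are the difference-in-differences estimates:
# exponentiated, they are rate ratios for each period relative to 'pre'.
fit = smf.poisson(
    "dot ~ covid_team * C(period, Treatment('pre'))",
    data=df, offset=np.log(df["days"]),
).fit(disp=False)
print(np.exp(fit.params).round(3))
```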
In the initial COVID-19 period, compared to the pre–COVID-19 period, internal medicine and MICU teams increased weekly antimicrobial use by 145.3 DOT (95% CI, 35.1–255.5) and 204.0 DOT (95% CI, −16.9 to 424.8), respectively, compared to non–COVID-19 teams. In the intervention period, internal medicine and MICU COVID-19 teams had significant weekly decreases of 362.3 DOT (95% CI, −443.3 to −281.2) and 226.3 DOT (95% CI, −381.2 to −71.3), respectively. Of 131 patients hospitalized with COVID-19, 86 (65.6%) received antibiotics; no specific patient factors were significantly associated with an expected change in antibiotic days.
Antimicrobial use initially increased for COVID-19 patient care teams compared to pre–COVID-19 levels but significantly decreased after implementation of a multispecialty clinical guidance team, which may be an effective strategy to reduce unnecessary antimicrobial use.
Background: Detection of unusual carbapenemase-producing organisms (CPOs) in a healthcare facility may signify broader regional spread. During investigation of a VIM-producing Pseudomonas aeruginosa (VIM-CRPA) outbreak in a long-term acute-care hospital in central Florida, enhanced surveillance identified VIM-CRPA from multiple facilities, denoting potential regional emergence. We evaluated infection control and performed screening for CPOs in skilled nursing facilities (SNFs) across the region to identify potential CPO reservoirs and improve practices.

Methods: All SNFs in 2 central Florida counties were offered a facility-wide point-prevalence survey (PPS) for CPOs and a nonregulatory infection control consultation. PPSs were conducted using a PCR-based screening method; specimens with a carbapenemase gene detected were cultured to identify the organisms. Infection control assessments focused on direct observations of hand hygiene (HH), environmental cleaning, and the sink splash zone. Thoroughness of environmental cleaning was evaluated using fluorescent markers applied to 6 standardized high-touch surfaces in at least 2 rooms per facility.

Results: Overall, 21 (48%) SNFs in the 2-county region participated; 18 conducted a PPS. Bed size ranged from 40 to 391, 5 (24%) facilities were ventilator-capable SNFs (vSNFs), and 12 had short-stay inpatient rehabilitation units. Of 1,338 residents approached, 649 agreed to rectal screening, and 14 (2.2%) carried CPOs. CPO-colonized residents were from the ventilator-capable units of 3 vSNFs (KPC-CRE, n = 7; KPC-CRPA, n = 1) and from the short-stay units of 2 additional facilities (VIM-CRPA, n = 5; KPC-CRE, n = 1). Among the 5 facilities where CPO colonization was identified, the prevalence ranged from 1.1% in a short-stay unit to 16.1% in a ventilator unit. All facilities had access to soap and water in resident bathrooms; 14 (67%) had alcohol-based hand rubs accessible. Overall, mean facility HH adherence was 52% (range, 37%–66%; mean observations per facility, 106) (Fig. 1). We observed the use of non–EPA-registered disinfectants and cross contamination from dirty to clean areas during environmental cleaning; the overall surface cleaning rate was 46% (n = 178 rooms), and only 1 room had all 6 markers removed. Resident supplies were frequently stored in the sink splash zone.

Conclusions: A regional assessment conducted in response to the emergence of VIM-CRPA identified a relatively low CPO prevalence at participating SNFs; CPOs were primarily identified in vSNFs and among short-stay residents. Across facilities, we observed low adherence to core infection control practices that could facilitate the spread of CPOs and other resistant organisms. In this region, targeting ventilator and short-stay units of SNFs for surveillance and infection control efforts may have the greatest prevention impact.
Advanced cancer patients who are parents of minor children experience heightened psychosocial distress. Oncology social workers (OSWs) are essential providers of psychosocial support to parents with advanced cancer. Yet, little is known about the experiences and approaches of OSWs in addressing these patients’ unique needs. The purpose of this study was to characterize the attitudes, practice behaviors, and training experiences of OSWs who provide psychosocial care for advanced cancer patients with minor children.
Forty-one OSWs participated in a cross-sectional survey addressing multiple facets of their psychosocial care for parents with advanced cancer. The five assessed domains of psychosocial support were communication support, emotional support, household support, illness and treatment decision-making support, and end-of-life planning.
Participants reported greatest confidence in counseling patients on communication with children about illness and providing support to co-parents about parenting concerns. OSWs reported less confidence in counseling parents on end-of-life issues and assisting families with non-traditional household structures. The majority of participants reported needing more time in their clinical practice to sufficiently address parents’ psychosocial needs. Nearly 90% of participants were interested in receiving further training on the care of parents with advanced cancer.
Significance of results
To improve the care of parents with advanced cancer, it is critical to understand how the psychosocial oncology workforce perceives its clinical practice needs. Study findings suggest an opportunity for enhanced training, particularly with respect to end-of-life needs and in response to the changing household structure of American families.
Modified Blalock–Taussig shunt thrombosis is a life-threatening event. We describe an extremely rare catheter-induced shunt thrombosis in an infant with complex CHD and its successful treatment utilising a single low dose of local recombinant tissue plasminogen activator in conjunction with balloon angioplasty.
The United Nations 2030 Agenda for Sustainable Development sets a framework of universal Sustainable Development Goals (SDGs) to address challenges to society and the planet. Island invasive species eradications have well-documented benefits that clearly align with biodiversity conservation-related SDGs, yet the value of this conservation action for socioeconomic benefits is less clear. We examine the potential for island invasive vertebrate eradications to have ecological and socioeconomic benefits. Specifically, we examine: (1) how SDGs may have been achieved through past eradications; and (2) how planned future eradications align with SDGs and associated targets. We found invasive vertebrate eradication to align with 13 SDGs and 42 associated targets encompassing marine and terrestrial biodiversity conservation, promotion of local and global partnerships, economic development, climate change mitigation, human health and sanitation, and sustainable production and consumption. Past eradications on 794 islands aligned with a median of 17 targets (range 13–38) by island. Potential future eradications on 292 highly biodiverse islands could align with a median of 25 SDG targets (range 15–39) by island. This analysis enables the global community to explicitly describe the contributions that invasive vertebrate management on islands can make towards implementing the global sustainable development agenda.
Cryoplanation terraces are prominent but enigmatic landforms found in present and past periglacial environments. Geomorphologists have debated for more than a century over the processes involved in the formation of these elevated, step-like, bedrock features. Presented here are the first numerical surface exposure ages and scarp retreat rates from cryoplanation terraces in the Yukon-Tanana Upland (YTU) in Alaska, part of unglaciated eastern Beringia, obtained from terrestrial cosmogenic nuclides (TCN) in surface boulders. The ages comprise six 10Be TCN ages from two terrace treads near Eagle Summit and six 36Cl ages from two treads on Mt. Fairplay. Based on these exposure ages, scarps at both locations were last actively eroding from 49 to 22.4 ka. Both locations exhibit time-transgressive development, particularly near scarp-tread junctions. Boulder exposure ages and the distances between sampled boulder locations were used to estimate scarp retreat rates of 0.11 to 0.56 cm/yr. The numerical exposure ages presented here demonstrate that the cryoplanation terraces in the YTU are diachronous surfaces that were actively eroding during multiple cold intervals. With these results, hypotheses for cryoplanation terrace formation are discussed and evaluated for the YTU, including those based on geologic structure, nivation, and the influence of permafrost.
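The retreat-rate estimate itself reduces to simple arithmetic: the distance between two dated boulders divided by the difference in their exposure ages. A worked example with invented numbers (chosen only to land near the reported lower bound of 0.11 cm/yr):

```python
# Hypothetical boulder pair on a single tread (not the paper's measurements):
# the boulder closer to the scarp was exposed by erosion more recently.
age_far_ka, age_near_ka = 49.0, 22.4   # exposure ages in thousands of years
distance_m = 30.0                      # distance between the two boulders

rate_cm_per_yr = (distance_m * 100) / ((age_far_ka - age_near_ka) * 1_000)
print(f"scarp retreat rate ~ {rate_cm_per_yr:.2f} cm/yr")  # ~0.11 cm/yr
```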
We have developed ultra-high-risk criteria for bipolar affective disorder (bipolar at-risk; BAR), which include general criteria, such as being in the peak age range for the onset of the disorder, and a combination of specific criteria, including sub-threshold mania, depressive symptoms, cyclothymic features and genetic risk. In the current study, the predictive and discriminant validity of these criteria were tested in help-seeking adolescents and young adults.
This medical file-audit study was conducted at ORYGEN Youth Health (OYH), a public mental health program for young people aged between 15 and 24 years and living in metropolitan Melbourne, Australia. The BAR criteria were applied to the intake assessments of all non-psychotic patients who were being treated at OYH on 31 January 2008. All entries were then checked for conversion criteria. Hypomania/mania-related additions or alterations to existing treatments, or initiation of new treatment by the treating psychiatrist, served as conversion criteria to mania.
The BAR criteria were applied to 173 intake assessments. Of these, 22 patients (12.7%) met BAR criteria. The follow-up period of the sample was 265.5 days on average (SD 214.7). There were significantly more cases in the BAR group (22.7%, n = 5) than in the non-BAR group (0.7%, n = 1) who met conversion criteria (p < .001).
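The abstract does not name the test used, but the comparison can be reconstructed from the reported counts: 5 of 22 BAR patients converted versus 1 of 151 (173 − 22) non-BAR patients. Fisher's exact test on that table, sketched below, reproduces p < .001.

```python
from scipy.stats import fisher_exact

# Rows: BAR vs. non-BAR group; columns: converted vs. did not convert.
# Counts reconstructed from the abstract: 5/22 (22.7%) vs. 1/151 (0.7%).
table = [[5, 17],
         [1, 150]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.5f}")  # p < .001
```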
These findings support the notion that people who develop a first episode of mania can be identified during the prodromal phase. The proposed criteria need further evaluation in prospective clinical trials.
Individuals with Generalised Anxiety Disorder (GAD) have an attentional bias towards threatening information. It is not known whether this results from facilitated engagement with threat (faster orientation towards it) or delayed disengagement from threat (slower shifting of attention away). Recent research has developed a new methodology designed to modify attentional disengagement from threat.
Using this paradigm, the present study assessed the causal role of attentional disengagement from threat and its impact on worry.
Twenty-four university students scoring below 56 on the Penn State Worry Questionnaire were randomly assigned to either threat disengagement training or non-threat disengagement training. Training was assessed using threat and non-threat test-trials. All participants then completed a novel worry task assessing the tendency, ability and persistence of worry. The hypothesis was that training to disengage from threat rather than non-threat stimuli would affect the tendency, ability or persistence of worry.
Accuracy and test-trial reaction-time data indicated that the disengagement training was successful: compared to the non-threat disengagement group, the threat disengagement group had faster reaction times on non-threat valence test-trials, experienced marginally (non-significantly) fewer negative intrusions during active worry, and found it significantly more difficult to worry when required to engage solely with worry, without interruption, in the worry task.
It is possible to manipulate attentional bias to disengage from threat information, leading to fewer negative thought intrusions during active worry and increased difficulty in engaging solely with worry, thus suggesting that impaired disengagement has a causal role in the ability to worry.
Two randomized, controlled trials of L-methylfolate augmentation of SSRIs for major depressive disorder (MDD) were conducted using a novel study design (sequential parallel comparison design, SPCD).
To evaluate the efficacy of L-methylfolate augmentation using the Hamilton Depression Rating Scale.
In study one (TRD-1), 148 outpatients with SSRI-resistant MDD were enrolled in a 60-day SPCD study, divided into two 30-day periods (phases 1 and 2). Patients were randomized 2:3:3 to receive L-methylfolate (7.5 mg/d in phase 1, 15 mg/d in phase 2), placebo in phase 1 followed by L-methylfolate 7.5 mg/d in phase 2, or placebo for both phases. Study two (TRD-2) involved 75 patients and was identical in design to TRD-1 except for the dose of L-methylfolate (15 mg/d only).
In the TRD-1 study, L-methylfolate 7.5 mg/d was not found to be more effective than placebo. In phase 1 of the TRD-2 study, 37% of patients on L-methylfolate 15 mg/d responded versus 18% of patients on placebo, while in phase 2, among placebo non-responders, the response rates were 28% on L-methylfolate 15 mg/d and 9.5% on placebo. When phases 1 and 2 were pooled according to the SPCD model, the difference in response rates was statistically significant in favor of L-methylfolate (p = 0.0399). Rates of spontaneously reported adverse events (AEs) and of study discontinuation appeared comparable between L-methylfolate and placebo in both studies.
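For readers unfamiliar with SPCD pooling, the usual analysis combines the phase 1 treatment–placebo difference (all randomized patients) with the phase 2 difference (placebo non-responders re-randomized) in a weighted z-statistic. The sketch below shows that construction; the response rates come from the TRD-2 numbers above, but the per-arm sample sizes and the weight w are assumptions for illustration, since the abstract does not report them.

```python
import math

def diff_and_var(p_drug, n_drug, p_pbo, n_pbo):
    """Treatment-placebo difference in response rates and its variance."""
    diff = p_drug - p_pbo
    var = p_drug * (1 - p_drug) / n_drug + p_pbo * (1 - p_pbo) / n_pbo
    return diff, var

# Phase 1 (all randomized) and phase 2 (placebo non-responders); the ns are
# invented placeholders consistent with ~75 patients randomized 2:3:3.
d1, v1 = diff_and_var(0.37, 19, 0.18, 56)
d2, v2 = diff_and_var(0.28, 21, 0.095, 21)

w = 0.5  # weight on phase 1 vs. phase 2 -- a pre-specified design choice
z = (w * d1 + (1 - w) * d2) / math.sqrt(w**2 * v1 + (1 - w)**2 * v2)
print(f"pooled z = {z:.2f}")  # two-sided p from the standard normal
```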
These studies suggest that L-methylfolate 15 mg/d may be a safe and effective augmentation strategy for inadequate response to SSRIs.
Many institutions are attempting to implement patient-reported outcome (PRO) measures. Because PROs often change clinical workflows significantly for patients and providers, implementation choices can have major impact. While various implementation guides exist, a stepwise list of decision points covering the full implementation process and drawing explicitly on a sociotechnical conceptual framework does not exist.
To facilitate real-world implementation of PROs in electronic health records (EHRs) for use in clinical practice, members of the EHR Access to Seamless Integration of Patient-Reported Outcomes Measurement Information System (PROMIS) Consortium developed structured PRO implementation planning tools. Each institution pilot tested the tools. Joint meetings led to the identification of critical sociotechnical success factors.
Three tools were developed and tested: (1) a PRO Planning Guide summarizes the empirical knowledge and guidance about PRO implementation in routine clinical care; (2) a Decision Log allows decision tracking; and (3) an Implementation Plan Template simplifies creation of a sharable implementation plan. Seven lessons learned during implementation underscore the iterative nature of planning and the importance of the clinician champion, as well as the need to understand aims, manage implementation barriers, minimize disruption, provide ample discussion time, and continuously engage key stakeholders.
Highly structured planning tools, informed by a sociotechnical perspective, enabled the construction of clear, clinic-specific plans. By developing and testing three reusable tools (freely available for immediate use), our project addressed the need for consolidated guidance and created new materials for PRO implementation planning. We identified seven important lessons that, while common to technology implementation, are especially critical in PRO implementation.
To determine whether the Society for Healthcare Epidemiology of America (SHEA) and the Infectious Diseases Society of America (IDSA) Clostridioides difficile infection (CDI) severity criteria adequately predict poor outcomes.
Retrospective validation study.
Setting and participants:
Patients with CDI in the Veterans’ Affairs Health System from January 1, 2006, to December 31, 2016.
For the 2010 criteria, patients with leukocytosis or a serum creatinine (SCr) value ≥1.5 times the baseline were classified as severe. For the 2018 criteria, patients with leukocytosis or an SCr value ≥1.5 mg/dL were classified as severe. Poor outcomes were defined as hospital or intensive care admission within 7 days of diagnosis, colectomy within 14 days, or 30-day all-cause mortality; they were modeled as a function of the 2010 and 2018 criteria separately using logistic regression.
We analyzed data from 86,112 episodes of CDI. Severity was unclassifiable in a large proportion of episodes diagnosed in subacute care (2010, 58.8%; 2018, 49.2%). Sensitivity ranged from 0.48 for subacute care using 2010 criteria to 0.73 for acute care using 2018 criteria. Areas under the curve were poor and similar (0.60 for subacute care and 0.57 for acute care) for both versions, but negative predictive values were >0.80.
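These metrics all follow from cross-tabulating the binary severity classification against the composite outcome; note that for a single binary classifier, the AUC reduces to the average of sensitivity and specificity. A sketch on synthetic data (prevalence and classification probabilities are invented, chosen merely to echo the pattern of poor AUC but high NPV):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(2)
n = 10_000
poor_outcome = (rng.random(n) < 0.15).astype(int)  # synthetic outcome prevalence
# Synthetic 'severe' flags, only loosely related to the outcome.
severe = (rng.random(n) < np.where(poor_outcome == 1, 0.70, 0.45)).astype(int)

tn, fp, fn, tp = confusion_matrix(poor_outcome, severe).ravel()
sensitivity = tp / (tp + fn)
npv = tn / (tn + fn)
auc = roc_auc_score(poor_outcome, severe)  # = (sensitivity + specificity) / 2
print(f"sensitivity={sensitivity:.2f}, NPV={npv:.2f}, AUC={auc:.2f}")
```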
Model performances across care settings and criteria versions were generally poor but had reasonably high negative predictive value. Many patients in the subacute-care setting, an increasing fraction of CDI cases, could not be classified. More work is needed to develop criteria to identify patients at risk of poor outcomes.
Many clinical psychologists either work with children who may be eligible for educational accommodations and special education or else work with adults who have received such services. This chapter provides an overview of assessment issues in educational settings, with a focus on K-12 schooling. We review the legal framework for special education assessment, before considering two controversial issues that interact with that framework: multitiered systems of support that delay comprehensive assessments and the question of whether students with ethnic minority backgrounds are overidentified as having disabilities. We then turn to the assessment of learning disabilities, the largest special education category; we review and evaluate major approaches to learning disability identification. Our next topic is the use of assessment data to determine which students need accommodations on classroom and high-stakes tests. Finally, we discuss an emerging issue: the measurement of effort and motivation that students exhibit during testing in educational settings.
Disturbed sleep and activity are prominent features of bipolar disorder type I (BP-I). However, the relationship of sleep and activity characteristics to brain structure and behavior in euthymic BP-I patients and their non-BP-I relatives is unknown. Additionally, underlying genetic relationships between these traits have not been investigated.
Relationships between sleep and activity phenotypes, assessed using actigraphy, with structural neuroimaging (brain) and cognitive and temperament (behavior) phenotypes were investigated in 558 euthymic individuals from multi-generational pedigrees including at least one member with BP-I. Genetic correlations between actigraphy-brain and actigraphy-behavior associations were assessed, and bivariate linkage analysis was conducted for trait pairs with evidence of shared genetic influences.
More physical activity and longer awake time were significantly associated with increased brain volumes and cortical thickness, better performance on neurocognitive measures of long-term memory and executive function, and less extreme scores on measures of temperament (impulsivity, cyclothymia). These associations did not differ between BP-I patients and their non-BP-I relatives. For nine activity-brain or activity-behavior pairs there was evidence for shared genetic influence (genetic correlations); of these pairs, a suggestive bivariate quantitative trait locus on chromosome 7 for wake duration and verbal working memory was identified.
Our findings indicate that increased physical activity and more adequate sleep are associated with increased brain size, better cognitive function and more stable temperament in BP-I patients and their non-BP-I relatives. Additionally, we found evidence for pleiotropy of several actigraphy-behavior and actigraphy-brain phenotypes, suggesting a shared genetic basis for these traits.
Concurrent chemotherapy with radiotherapy is the standard treatment for locoregionally advanced nasopharyngeal cancer. Cetuximab can be used in the treatment of head and neck squamous cell carcinoma. However, the randomised studies that led to approval for its use in this setting excluded nasopharyngeal cancer. In the context of limited data for the use of cetuximab in nasopharyngeal cancer in the medical literature, this review aimed to summarise the current evidence for its use in both primary and recurrent or metastatic disease.
A literature search was performed using the keywords ‘nasopharyngeal neoplasm’, ‘cetuximab’ and ‘Erbitux’.
Twenty studies were included. There were no randomised phase III trials, but there were nine phase II trials. The use of cetuximab in the treatment of nasopharyngeal carcinoma has been tested in various settings, including in combination with induction chemotherapy and concurrent chemoradiotherapy, and in the palliative setting.
There is no evidence of benefit from the addition of cetuximab to standard management protocols, and there is some evidence of increased toxicity. There is more promise for its use in metastatic or locally recurrent settings. This review draws together the existing evidence and could provide a focus for future studies.