Given the hierarchical nature and structure of field schools, enrolled students are particularly susceptible to harassment and assault. In 2018, the National Academies of Sciences, Engineering, and Medicine (NASEM) released recommendations to help prevent sexual harassment and assault of women in academia. Although these recommendations are specific to higher education and exclusive to women, some can be modified and applied to the context of archaeological field schools. We review NASEM's recommendations, with particular attention to those applicable to the field school setting, and provide suggestions for making field schools safer and more inclusive learning environments for all students. Although we present recommendations for practices that can be implemented at field schools, additional research is needed to understand how sexual harassment occurs in these settings and how implementing these recommendations can make learning safer.
Although no pharmacological treatment has proved to be highly effective for reducing cocaine dependence, several medications have been tested over the last decade and have shown promising efficacy. Modafinil (Provigil), known as a treatment for daytime sleepiness, and topiramate (Topamax), an anti-epileptic medication also prescribed for migraine, have been shown to be effective in controlled clinical trials. We have recently started a major study utilizing positron emission tomography (PET) brain imaging to monitor the progress of pharmacotherapy with modafinil or topiramate in cocaine-dependent and methadone-maintained cocaine-dependent patients. Patients will be assessed before treatment, and again after 4 weeks of pharmacotherapy. The aims of the project are to study the effects of the two medications on cocaine dependence and craving, and on dopamine binding in the brain. At each assessment session, patients will undergo PET with [11C]raclopride to image the dopamine receptor DRD2. To trigger craving, patients will then be exposed to a videotape showing cocaine use; a questionnaire will be used to record their subjective responses, and a second PET scan will be performed with [18F]fluorodeoxyglucose (FDG) to image cerebral glucose metabolism during craving. This protocol was designed to enable us to study changes resulting from pharmacotherapy on dopamine binding in the brain, and on craving as reflected both in subjective measures and regional cerebral glucose metabolism. In addition, we will investigate the association between subjective measures of craving for cocaine and the level of dopamine DRD2 receptor occupancy in the brain before and after treatment. Notwithstanding the complexity of the clinical and therapeutic reality characterizing cocaine dependence, we hope to present preliminary evidence for the relative efficacy of these two promising medications in the treatment of cocaine dependence.
This evidence could also elucidate the brain mechanisms underlying cocaine craving and dependence in cocaine-dependent patients.
Every year, 4,000 to 5,000 adolescents reside in Quebec Youth Protection Centers (YPCs). Many of these youths engage in risky behaviours or have mental health issues that put them at risk for sexually transmitted infections (STIs).
To document the prevalence of STIs (chlamydia and gonorrhoea) among adolescents aged 14-17 years entering Quebec residential YPCs and to identify associated risk factors.
In 2008–2009, adolescents residing in six YPCs completed a questionnaire covering sexual and substance use behaviours, as well as other health issues affecting their well-being. Urine samples were collected to test for Chlamydia trachomatis (CTGI) and Neisseria gonorrhoeae (NGGI) genital infections.
Among 578 participants aged 14-17 years, 89% were sexually active. Risk behaviours included early sexual initiation (66% before age 14), multiple partners (median: girls 5, boys 8), group sex (girls 38%, boys 43%), and sex for money or goods (girls 27%, boys 8%). Half of sexual relations occurred under the influence of drugs or alcohol. Regular substance use (three or more times weekly) was: tobacco, 75.0%; cannabis, 63.1%; alcohol, 24.2%; amphetamines, 16.7%; and cocaine, 7.4%. Prevalence of CTGI was 9.3% in girls and 1.9% in boys; NGGI, 1.7% in girls and 0% in boys. In multivariate analysis, factors significantly associated with chlamydia infection among girls were hospitalisation for alcohol intoxication or a history of suicidal ideation with a plan.
Serious alcohol misuse and mental distress were significantly associated with STIs among adolescents. Mental health professionals are encouraged to provide sexual health and substance use counselling to adolescent patients, given the tightly interwoven relationship between mental distress and the risk of sexually transmitted infections.
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine if oxygen administration in AMI impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration with no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate at 2-11 months), infarct size, and major adverse cardiac events (MACE). Risk of bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and a total of 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, providing low quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found for MACE or infarct size (very low quality evidence). Normoxia was defined as oxygen saturation measured via pulse oximetry of ≥90% in one recent study and ≥94% in another. Conclusion: We found low and very low quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear benefit or harm for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
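The review pooled studies with RevMan 5. As a rough illustration of the inverse-variance pooling such software performs, the sketch below pools risk ratios across hypothetical studies; the event counts are invented for demonstration and are not data from this review.

```python
import math

# Hypothetical per-study counts: (events_treated, n_treated, events_control, n_control).
studies = [(10, 100, 12, 100), (5, 80, 7, 80), (20, 200, 22, 200)]

log_rrs, weights = [], []
for a, n1, c, n2 in studies:
    log_rr = math.log((a / n1) / (c / n2))  # log risk ratio for this study
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # approximate variance of the log RR
    log_rrs.append(log_rr)
    weights.append(1 / var)                 # inverse-variance weight

# Fixed-effect pooled estimate: weighted mean of log risk ratios.
pooled_log_rr = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_rr = math.exp(pooled_log_rr)
print(f"pooled risk ratio = {pooled_rr:.2f}")
```

This is the fixed-effect (inverse-variance) method; a random-effects analysis would additionally estimate between-study heterogeneity.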
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine if opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central and clinicaltrials.gov for relevant randomized controlled trials and observational studies comparing opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand searched for additional citations. Risk of bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and a total of 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low quality evidence indicates that opioids increase infarct size. Low quality evidence also shows reduced serum levels of the active metabolites of P2Y12 inhibitors (e.g., clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours post-administration, alongside an increase in vomiting.
Conclusion: We found low and very low quality evidence that the administration of opioids in STEMI may be adversely associated with vomiting and with some surrogate outcomes, including increased infarct size, reduced serum levels of active P2Y12 inhibitor metabolites, and increased platelet activity. We found no clear benefit or harm on patient-oriented clinical outcomes, including mortality.
Objective: To conduct a formative evaluation of a transitional intervention for family caregivers, with assessment of feasibility, acceptability, appropriateness, and potential benefits. Methods: The intervention aimed to provide emotional support, information on community resources, and information and support for development of coping skills for the caregivers of patients aged 65 and older who were to be discharged home from an acute medical hospital admission. We used a one-group, pre- and three-month post-test study design. Results: Ninety-one patient-caregiver dyads were recruited. Of these, 63 caregivers (69%) received all five planned intervention sessions, while 60 (66%) completed the post-test. There were significant reductions in caregiver anxiety and depression following the intervention, and high rates of satisfaction. Discussion: This transitional intervention should be further evaluated, preferably with a control group, either as a stand-alone intervention or as one component of a comprehensive transitional intervention for older patients and their caregivers.
The LIGO/Virgo detections of gravitational waves from merging black holes of ≃30 solar masses suggest progenitor stars of low metallicity (Z/Z⊙ ≲ 0.3). In this talk I will provide constraints on where the progenitors of GW150914 and GW170104 may have formed, based on advanced models of galaxy formation and evolution combined with binary population synthesis models. First, I will combine estimates of galaxy properties (star-forming gas metallicity, star formation rate and merger rate) across cosmic time to predict the low-redshift BBH merger rate as a function of present-day host galaxy mass, formation redshift of the progenitor system and different progenitor metallicities. I will show that the signal is dominated by binaries formed at the peak of star formation in massive galaxies and by binaries formed recently in dwarf galaxies. Then, I will present what very high resolution hydrodynamic simulations of different galaxy types can tell us about their black hole populations.
We performed a spatial-temporal analysis to assess household risk factors for Ebola virus disease (Ebola) in a remote, severely affected village. We defined a household as a family's shared living space and a case-household as a household with at least one resident who became a suspect, probable, or confirmed Ebola case from 1 August 2014 to 10 October 2014. We used Geographic Information System (GIS) software to calculate inter-household distances, performed space-time cluster analyses, and developed Generalized Estimating Equations (GEE). Village X consisted of 64 households; 42% of households became case-households over the observation period. Two significant space-time clusters occurred among households in the village; temporal effects outweighed spatial effects. GEE demonstrated that the odds of becoming a case-household increased by 4·0% for each additional person per household (P < 0·02) and 2·6% per day (P < 0·07). An increasing number of persons per household, and to a lesser extent, the passage of time after onset of the outbreak were risk factors for household Ebola acquisition, emphasizing the importance of prompt public health interventions that prioritize the most populated households. Using GIS with GEE can reveal complex spatial-temporal risk factors, which can inform prioritization of response activities in future outbreaks.
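The reported GEE effect sizes translate into per-unit odds ratios that compound multiplicatively (assuming a logistic link, as is typical for binary GEE outcomes). A small sketch using the figures from the abstract; the five-resident/30-day scenario is illustrative, not from the study.

```python
# Odds ratios implied by the reported GEE estimates.
or_per_person = 1.040  # +4.0% odds of becoming a case-household per additional resident
or_per_day = 1.026     # +2.6% odds per day after outbreak onset

# Illustrative scenario: a household with 5 more residents than another,
# assessed 30 days later, has compounded odds of:
combined_or = or_per_person ** 5 * or_per_day ** 30
print(f"combined odds ratio ≈ {combined_or:.2f}")
```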
Doveweed is a problematic weed of lawns and sod production, as well as golf course roughs, fairways, and tees. End-user reports of selective POST control options are inconsistent and control is often short-lived. In addition, inconsistent control with non-selective herbicides such as glyphosate is common. The goals of this research were to: (1) evaluate selective POST doveweed control options in ‘Tifway’ hybrid bermudagrass turf; (2) compare the efficacy of single vs. sequential applications of selective POST herbicides; (3) quantify doveweed tolerance to glyphosate; and (4) quantify recovery of foliar-applied glyphosate following treatment with a 14C-glyphosate solution. A single application of sulfentrazone + metsulfuron; thiencarbazone + iodosulfuron + dicamba or 2,4-D + MCPP + dicamba + carfentrazone; or thiencarbazone + foramsulfuron + halosulfuron provided >60% control 2 weeks after initial treatment (WAIT). A second application of these treatments 3 WAIT improved control 6 WAIT. Two applications of 2,4-D + MCPP + dicamba + carfentrazone or thiencarbazone + foramsulfuron + halosulfuron provided ~80% control 6 WAIT. Doveweed was tolerant to glyphosate applied at rates up to 5.68 kg ae ha−1. Absorption of 14C-glyphosate was compared between doveweed with the cuticle intact, doveweed with a disturbed cuticle, and smooth crabgrass. 14C-glyphosate recovery from the leaf surface of doveweed plants with an intact cuticle was 93.6%. In comparison, 14C-glyphosate recovery from the leaf surfaces of doveweed plants with a disrupted cuticle and of crabgrass plants was 79.1% and 70.5%, respectively.
Introduction: Intra-articular steroid injection (IASI) is commonly used in the emergency department for management of osteoarthritis (OA) symptoms. Hip IASI carries risks, such as avascular necrosis, and there is currently no reliable way to predict the long-term response of a patient’s OA to IASI. Ultrasound (US) conveniently assesses for active arthropathy by detecting effusion-synovitis, and x-ray (XR) is useful for visualizing bone-related changes. We investigated the extent to which a response to hip IASI could be predicted from baseline OA patient clinical and physical features alongside US and XR imaging features. Methods: 97 consenting patients with symptomatic hip OA presenting for hip IASI were evaluated at baseline (XR and US) and again 8 weeks after IASI (US only). Self-reported pain (WOMAC) and hip range of motion (ROM) were measured at baseline and follow-up. On US images we quantified joint effusion and synovial thickening, i.e., “effusion-synovitis”, by the bone-capsule distance (BCD) at the apex of the femoral head from outer femoral cortex to outer synovium. On XR, we measured minimum joint space width (cm) and Kellgren-Lawrence (K-L) grade for osteophytes and sclerotic changes. Results: Our 97 patients (43 female) were aged 28-87 years (mean 59 ± 13 years); K-L grades averaged 2.5 ± 1.5, and US BCD averaged 5.9 ± 2.0 mm. We performed multiple linear regression using age, sex, BMI, ROM of hip flexion, US BCD, radiographic joint space width and K-L grade against the dependent variable, change in WOMAC pain subscore (R=0.587, P=0.002). We compared the response predicted by this model to the actual change in WOMAC pain. At a threshold value of -20% for minimal clinically important difference, 35/97 patients were responders, and a 2x2 table gave 67% overall model predictive accuracy, 61% sensitivity, and 71% specificity. The likelihood ratio for a positive response (LR+) was 2.13.
Conclusion: Combining radiographic information on structural damage, US information on active arthropathy, and demographics correctly predicted about two-thirds of the patients that would benefit from IASI after 8 weeks. A patient with hip OA that met our model criteria was more than twice as likely to respond to IASI. With further model refinement, effective, personalized evidence-based management of symptomatic hip OA is possible using XR and hip US, which could both be performed during an ER visit.
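The reported likelihood ratio and overall accuracy follow directly from the sensitivity, specificity, and responder prevalence. A quick check using the rounded percentages in the abstract; small differences from the reported 2.13 and 67% reflect rounding of the inputs.

```python
# Figures reported in the abstract (rounded).
sensitivity = 0.61
specificity = 0.71
responders, n = 35, 97

# Likelihood ratio for a positive model prediction.
lr_plus = sensitivity / (1 - specificity)

# Overall accuracy, weighting by responder prevalence.
accuracy = sensitivity * responders / n + specificity * (n - responders) / n

print(f"LR+ ≈ {lr_plus:.2f}, accuracy ≈ {accuracy:.0%}")
```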
Pertussis epidemics have displayed substantial spatial heterogeneity in countries with high socioeconomic conditions and high vaccine coverage. This study aims to investigate the relationship between socio-environmental factors and the spatio-temporal variation underlying pertussis infection. We obtained daily case numbers of pertussis notifications from Queensland Health, Australia, by postal area, for the period January 2006 to December 2012. A Bayesian spatio-temporal model was used to quantify the relationship between monthly pertussis incidence and socio-environmental factors. The socio-environmental factors included monthly mean minimum temperature (MIT), monthly mean vapour pressure (VAP), Queensland school calendar pattern (SCP), and socioeconomic index for area (SEIFA). An increase in pertussis incidence was observed from 2006 to 2010 and a slight decrease from 2011 to 2012. Spatial analyses showed pertussis incidence across Queensland postal areas to be low and more spatially homogeneous during 2006–2008; incidence was higher and more spatially heterogeneous after 2009. The results also showed that the average decrease in monthly pertussis incidence was 3·1% [95% credible interval (CrI) 1·3–4·8] for each 1 °C increase in monthly MIT, while the average increases in monthly pertussis incidence were 6·2% (95% CrI 0·4–12·4) during SCP periods and 2% (95% CrI 1–3) for each 10-unit increase in SEIFA. This study demonstrated that pertussis transmission is significantly associated with MIT, SEIFA, and SCP. Mapping derived from this work highlights the potential for future investigation and areas for focusing future control strategies.
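In a log-linear model of this kind, a reported percent change per unit of a covariate corresponds to an exponentiated regression coefficient, so effects compound multiplicatively over multiple units. A sketch using the reported -3.1% per 1 °C of MIT; the 3 °C extrapolation is illustrative, not a result from the study.

```python
import math

# The reported -3.1% change per 1 deg C corresponds to a log rate ratio:
beta = math.log(1 - 3.1 / 100)  # per 1 deg C increase in monthly MIT

# Effects compound multiplicatively, so a month 3 deg C warmer implies:
pct_3c = (math.exp(3 * beta) - 1) * 100
print(f"{pct_3c:.1f}% change in incidence")
```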
We examined functional outcomes and quality of life of whole brain radiotherapy (WBRT) with integrated fractionated stereotactic radiotherapy boost (FSRT) for brain metastases treatment. Methods: Eighty-seven people with 1-3 brain metastases were enrolled on this Phase II trial of WBRT (30 Gy/10) + simultaneous FSRT (60 Gy/10). Results: Mean (min-max) baseline KPS, Mini-Mental Status Exam (MMSE) and FACT-Br quality of life scores were 83 (70-100), 28 (21-30) and 143 (98-153), respectively. Lower baseline MMSE (but not KPS or FACT-Br) was associated with worse survival after adjusting for age, number of metastases, and primary and extra-cranial disease status. Crude rates of deterioration (>10-point decrease from baseline for KPS and FACT-Br; MMSE fall to <27) ranged from 26-38% for KPS, 32-59% for FACT-Br and 0-16% for MMSE, depending on the time point assessed, with higher rates generally noted at earlier time points (<6 months post-treatment). Using a linear mixed models analysis, significant declines from baseline were noted for KPS and FACT-Br (largest effects at 6 weeks to 3 months), with no significant change in MMSE. Conclusions: The effects on function and quality of life of this integrated treatment of WBRT + simultaneous FSRT were similar to those of other published series combining WBRT + SRS.
Tropical signalgrass (TSG) has become a serious weed problem in tropical and subtropical regions such as Florida in recent years, in association with the ban on organic arsenical herbicide use in turf. The purpose of this research was to identify alternative POST herbicides that control TSG. Two field experiments were conducted in bermudagrass golf course fairways in south and central Florida in 2014 and 2015. Several herbicide treatments not containing organic arsenicals controlled TSG. In the first experiment, treatments containing amicarbazone alone and in combination with other herbicides provided > 97% TSG control 12 wk after initial treatment (WAIT) in 2014 and 2015. These included a single application of amicarbazone at 0.49 kg ai ha−1, or sequential applications of amicarbazone at 0.25 kg ha−1 in combination with foramsulfuron at 0.04 kg ai ha−1, sulfentrazone + imazethapyr at 0.25 kg ai ha−1, thiencarbazone + foramsulfuron + halosulfuron at 0.14 kg ai ha−1, and thiencarbazone + iodosulfuron + dicamba at 0.18 kg ai/ae ha−1. In the second experiment, sequential applications of thiencarbazone + foramsulfuron + halosulfuron at 0.14 kg ha−1 in combination with either quinclorac at 0.84 kg ai ha−1 or metribuzin at 0.28 kg ai ha−1 provided ≥ 85% TSG control 12 WAIT in both years.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension. Current established protocols (RUSH, ACES, etc.) were developed by expert user opinion rather than objective, prospective data. We wished to use reported disease incidence to develop an informed approach to PoCUS in hypotension using a “4 F’s” approach: Fluid; Form; Function; Filling. Methods: We summarized the incidence of PoCUS findings from an international multicentre RCT and, using a modified Delphi approach incorporating these data, obtained the input of 24 international experts associated with five professional organizations led by the International Federation of Emergency Medicine. The modified Delphi tool was developed to reach an international consensus on how to integrate PoCUS for hypotensive emergency department patients. Results: Rates of abnormal PoCUS findings from 151 patients with undifferentiated hypotension included left ventricular dynamic changes (43%), IVC abnormalities (27%), pericardial effusion (16%), and pleural fluid (8%). Abdominal pathology was rare (fluid 5%, AAA 2%). After two rounds of the survey, using majority consensus, agreement was reached on a SHoC-hypotension protocol comprising: A. Core: 1. Cardiac views (sub-xiphoid and parasternal windows for pericardial fluid, cardiac form and ventricular function); 2. Lung views for pleural fluid and B-lines for filling status; and 3. IVC views for filling status; B. Supplementary: Additional cardiac views; and C. Additional views (when indicated) including peritoneal fluid, aorta, pelvic for IUP, and proximal leg veins for DVT. Conclusion: An international consensus process based on prospectively collected disease incidence has led to a proposed SHoC-hypotension PoCUS protocol comprising a stepwise clinical-indication based approach of Core, Supplementary and Additional PoCUS views.
Introduction: Point of care ultrasound (PoCUS) provides invaluable information during resuscitation efforts in cardiac arrest by determining presence/absence of cardiac activity and identifying reversible causes such as pericardial tamponade. There is no agreed guideline on how to safely and effectively incorporate PoCUS into the advanced cardiac life support (ACLS) algorithm. We consider that a consensus-based priority checklist using a “4 F’s” approach (Fluid; Form; Function; Filling) would provide a better algorithm during ACLS. Methods: The ultrasound subcommittee of the Australasian College for Emergency Medicine (ACEM) drafted a checklist incorporating PoCUS into the ACLS algorithm. This was further developed using the input of 24 international experts associated with five professional organizations led by the International Federation of Emergency Medicine. A modified Delphi tool was developed to reach an international consensus on how to integrate ultrasound into cardiac arrest algorithms for emergency department patients. Results: Consensus was reached following 3 rounds. The agreed protocol focuses on the timing of PoCUS as well as the specific clinical questions. Core cardiac windows performed during the rhythm check pause in chest compressions are the sub-xiphoid and parasternal cardiac views. Either view should be used to detect pericardial fluid, as well as examining ventricular form (e.g. right heart strain) and function (e.g. asystole versus organized cardiac activity). Supplementary views include lung views (for absent lung sliding in pneumothorax and for pleural fluid), and IVC views for filling. Additional ultrasound applications are for endotracheal tube confirmation, proximal leg veins for DVT, or for sources of blood loss (AAA, peritoneal/pelvic fluid). Conclusion: The authors hope that this process will lead to a consensus-based SHoC-cardiac arrest guideline on incorporating PoCUS into the ACLS algorithm.
Patients with psychosis display the so-called ‘Jumping to Conclusions’ bias (JTC) – a tendency for hasty decision-making in probabilistic reasoning tasks. So far, only a few studies have evaluated the JTC bias in ‘at-risk mental state’ (ARMS) patients, specifically in ARMS samples fulfilling ‘ultra-high risk’ (UHR) criteria, thus not allowing for comparisons between different ARMS subgroups.
In the framework of the PREVENT (secondary prevention of schizophrenia) study, a JTC task was applied to 188 patients either fulfilling UHR criteria or presenting with cognitive basic symptoms (BS). Similar data were available for 30 healthy control participants matched for age, gender, education and premorbid verbal intelligence. ARMS patients were identified by the Structured Interview for Prodromal Symptoms (SIPS) and the Schizophrenia Proneness Instrument – Adult Version (SPI-A).
The mean number of draws to decision (DTD) significantly differed between ARMS subgroups: UHR patients made significantly fewer draws before reaching a decision than ARMS patients with only cognitive BS. Furthermore, UHR patients tended to fulfil behavioural criteria for JTC more often than BS patients. In a secondary analysis, ARMS patients were much hastier in their decision-making than controls. In patients, DTD was moderately associated with positive and negative symptoms as well as disorganization and excitement.
Our data indicate an enhanced JTC bias in the UHR group compared to ARMS patients with only cognitive BS. This underscores the importance of reasoning deficits within cognitive theories of developing psychosis. Interactions with the liability to psychotic transition and with therapeutic interventions should be unravelled in longitudinal studies.
Attention deficit hyperactivity disorder (ADHD) is overrepresented in prison, making it imperative to identify a screening tool that can be quickly applied to efficiently detect the disorder. We explored the discrimination ability of a widely used ADHD screen, the Barkley Adult ADHD Rating Scale (BAARS-IV), against a clinical diagnostic interview. A brief version of the screen was then developed in order to simplify its use in the prison context, and maximize its diagnostic properties.
A cross-sectional study of 390 male prison inmates was performed in the UK; all participants were screened and then interviewed using the Diagnostic Interview for ADHD in Adults 2.0 (DIVA-2).
A total of 47 (12.1%) inmates screened positive for ADHD using the full BAARS-IV, and 96 (24.6%) were clinically diagnosed, for a sensitivity of 37.9% and a specificity of 96.3%. Our models identified the six items that most predicted ADHD diagnosis, with adjusted odds ratios ranging from 2.66 to 4.58. Sensitivity, specificity and accuracy were 0.82, 0.84 and 0.84, respectively, for the developed brief scale, and 0.71, 0.85 and 0.81 for its validation. Weighted probability scores produced an area under the curve of 0.89 for development, and 0.82 for validation of the brief scale.
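The reported sensitivity and specificity can be approximately reconstructed from the stated totals (n=390, 96 diagnosed, 47 screened positive). In the sketch below the 2x2 cell counts are inferred for illustration, not taken from the paper; the small discrepancy with the reported 37.9% sensitivity likely reflects rounding or missing data.

```python
# Inferred 2x2 counts (approximate reconstruction, not the paper's table).
tp, fn = 36, 60   # clinically diagnosed: 96 (36 of whom screened positive)
fp, tn = 11, 283  # not diagnosed: 294

sensitivity = tp / (tp + fn)  # proportion of diagnosed cases caught by the screen
specificity = tn / (tn + fp)  # proportion of non-cases correctly screened negative
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```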
The original BAARS-IV performed poorly at identifying prison inmates with ADHD. Our developed brief scale substantially improved diagnostic accuracy. The brief screening instrument has great potential to be used as an accurate and resource-effective tool to screen young people and adults for likely ADHD in the criminal justice system.