Alcohol use in autism spectrum disorder (ASD) is under-researched. Previous reviews have explored substance use as a whole, but this approach neglects characteristics unique to individual substances. Alcohol use in non-clinical samples is associated with diverse responses. To advance practice and policy, an improved understanding of alcohol use among people with ASD is crucial to meeting individual needs.
Aims
This was a narrative systematic review of the current literature on the association between alcohol use and ASD, focusing on aetiology (biological, psychological, social and environmental risk factors) and implications (consequences and protective factors) of alcohol use in autistic populations who utilise clinical services. We sought to identify priority research questions and offer policy and practice recommendations.
Method
PROSPERO Registration: CRD42023430291. The search was conducted across five databases: CINAHL, EMBASE, MEDLINE, PsycINFO and Global Health. Included studies explored alcohol use and ASD within clinical samples.
Results
A total of 22 studies were included in the final review. The pooled prevalence of alcohol use disorder in ASD was 1.6% in large population registers and 16.1% in clinical settings. Four components were identified as possible aetiological risk factors: age, co-occurring conditions, gender and genetics. We identified ten implications of co-occurring alcohol use disorder in ASD, summarised as a concept map.
Conclusion
Emerging trends in the literature suggest direction and principles for research and practice. Future studies should use a standardised methodological approach, including psychometrically validated instruments and representative samples, to inform policy and improve the experience for autistic populations with co-occurring alcohol use.
Control over the legislative messaging agenda has important political, electoral and policy consequences. Existing models of congressional agenda-setting suggest that national polarization drives the agenda. At the same time, models of home style and formal models of leadership hypothesize that legislators shift their messaging as they balance coordination and information problems. We say the coordination problem dominates when conditions give legislators an incentive to agree on the same message rather than fail to reach consensus. Conversely, the information problem dominates in circumstances where legislators prefer to say nothing at all rather than reach consensus on the wrong political message. Formal theories predict that when coordination problems are pressing, legislative members follow the policy positions of party leaders. When their party’s information problem is acute, party members instead rely on the wisdom of the caucus to set the party’s agenda. To test these theories, we analyze the Twitter accounts of U.S. House members with a Joint Sentiment Topic model, generating a new understanding of House leadership power. Our analyses reveal complex leader-follower relationships. Party leaders possess the power to substantially affect the propensity of rank-and-file members to discuss topics, and these effects are especially pronounced when the coordination problem dominates. That said, when the underlying politics are unclear, rank-and-file members exert influence on the discussion of a topic because the information problem is more acute. At the same time and for these uncertain topics, leadership influence decreases, consistent with theory. We show these results are robust to the underlying dynamics of contemporary political discussion and context, including leading explanations for party leadership power, such as national polarization.
This editorial considers the value and nature of academic psychiatry by asking what defines the specialty and psychiatrists as academics. We frame academic psychiatry as a way of thinking that benefits clinical services and discuss how to inspire the next generation of academics.
Medically assisted alcohol withdrawal (MAAW) is increasingly undertaken on acute adult psychiatric wards.
Aims
Comparison of the quality of MAAW between acute adult wards and specialist addictions units in mental health services.
Method
Clinical audit conducted by the Prescribing Observatory for Mental Health (POMH). Information on MAAW was collected from clinical records using a bespoke data collection tool.
Results
Forty-five National Health Service (NHS) mental health trusts/healthcare organisations submitted data relating to the treatment of 908 patients undergoing MAAW on an acute adult ward or psychiatric intensive care unit (PICU) and 347 admitted to a specialist NHS addictions unit. MAAW had been overseen by an addiction specialist in 33 (4%) of the patients on an acute adult ward/PICU. A comprehensive alcohol history, measurement of breath alcohol, full screening for Wernicke's encephalopathy, use of parenteral thiamine, prescription of medications for relapse prevention (such as acamprosate) and referral for specialist continuing care of alcohol-related problems following discharge were all more commonly documented when care was provided on a specialist unit or when there was specialist addictions management on an acute ward.
Conclusions
The findings suggest that the quality of care provided for medically assisted withdrawal from alcohol, including the use of evidence-based interventions, is better when clinicians with specialist addictions training are involved. This has implications for future quality improvement in the provision of MAAW in acute adult mental health settings.
Work is among the most important influences on safety, health and wellbeing, both as a threat to health and as a source of resources that support health. However, the nature and pace of changes to the modern workplace present significant challenges to researchers seeking to understand the health implications of these changes, as well as to government and organizational leaders seeking to craft appropriate policy solutions. This chapter has three goals: (1) to provide an overview of occupational health psychology and describe the NIOSH concept of work organization in terms of its implications for occupational health, (2) to present the Job Demands–Resources model as a theoretical framework accounting for the effects of work organization on employee health, and (3) to describe the health implications of several key trends in the nature of work organization, including employment relationships, work schedules, technology, lean production, and safety and wellness interventions.
Background: Electroconvulsive therapy (ECT) involves the induction of a generalized seizure with an electrical current and has been used worldwide to treat medically refractory psychiatric illness. Here we describe a patient with no prior history of or risk factors for epilepsy who developed temporal lobe epilepsy after chronic treatment with ECT. Methods: A 16-year-old right-handed boy with severe refractory depression received ECT every 10 days for 8 months. Six months into his ECT treatment, the patient developed seizures and was admitted to a pediatric epilepsy monitoring unit. Results: Initial clinical events included lightheadedness, diaphoresis, and nausea with associated kaleidoscopic vision changes. Seizures had progressed to confusion, fear and paranoia by the time the patient was admitted for monitoring. Long-term video EEG captured many focal seizures with impaired awareness, arising from both temporal lobes. MRI was normal. ECT was terminated and the patient was started on carbamazepine. He has been seizure free for the past 2 years on medication. Conclusions: We present a rare case of a patient with no prior risk factors for epilepsy who developed temporal lobe epilepsy after chronic ECT treatment. Although ECT is an indispensable treatment for many medically refractory psychiatric illnesses, we suggest caution in young patients undergoing ECT.
A new species, Contarinia brassicola Sinclair (Diptera: Cecidomyiidae), which induces flower galls on canola (Brassica napus Linnaeus and Brassica rapa Linnaeus (Brassicaceae)), is described from Saskatchewan and Alberta, Canada. Larvae develop in the flowers of canola, causing them to swell and preventing flower opening, pod formation, and seed set. Mature larvae exit the galls, fall to the soil, and form cocoons. Depending on conditions, larvae either pupate and eclose in the same calendar year or enter facultative diapause and emerge the following year. At least two generations of C. brassicola occur each year. Adults emerge from overwintering cocoons in the spring and lay eggs on developing canola flower buds. The galls produced by C. brassicola were previously attributed to the swede midge, Contarinia nasturtii (Kieffer), in Saskatchewan; here, we compare the two species and list several characters that differentiate them.
Indigenous women and children experience some of the most profound health disparities globally. These disparities are grounded in historical and contemporary trauma secondary to colonial atrocities perpetuated by settler society. The health disparities that exist for chronic diseases may have their origins in early-life exposures that Indigenous women and children face. Mechanistically, there is evidence that these adverse exposures epigenetically modify genes associated with cardiometabolic disease risk. Interventions designed to support a resilient pregnancy and first 1000 days of life should abrogate disparities in early-life socioeconomic status. Breastfeeding, prenatal care and early child education are key targets for governments and health care providers to start addressing current health disparities in cardiometabolic diseases among Indigenous youth. Programmes grounded in cultural safety and co-developed with communities have successfully reduced health disparities. More programmes of this kind are needed to reduce inequities in cardiometabolic diseases among Indigenous women and children worldwide.
The particle size of the forage has been proposed as a key factor in ensuring healthy rumen function and maintaining dairy cow performance, but little work has been conducted on ryegrass silage (GS). To determine the effect of chop length of GS and of the GS:maize silage (MS) ratio on the performance, reticular pH, metabolism and eating behaviour of dairy cows, 16 multiparous Holstein-Friesian cows were used in a 4×4 Latin square design with four periods, each of 28 days' duration. Ryegrass was harvested and ensiled at two mean chop lengths (short and long) and included at two ratios of GS:MS (100:0 or 40:60 on a dry matter (DM) basis). The forages were fed in mixed rations to produce four isonitrogenous and isoenergetic diets: long chop GS, short chop GS, long chop GS and MS, and short chop GS and MS. The DM intake (DMI) was 3.2 kg/day higher (P<0.001) when cows were fed the MS-based than the GS-based diets. The short chop length GS also resulted in a 0.9 kg DM/day higher (P<0.05) DMI compared with the long chop length. When fed the GS:MS-based diets, cows produced 2.4 kg/day more (P<0.001) milk than when fed diets containing GS only. There was an interaction (P<0.05) between chop length and forage ratio for milk yield, with a short chop length GS increasing yield in cows fed the GS-based but not the MS-based diets. An interaction for DM and organic matter digestibility was also observed (P<0.05), where a short chop length GS increased digestibility in cows fed the GS-based diets but had little effect when the MS-based diets were fed. When fed the MS-based diets, cows spent longer at reticular pH levels below pH 6.2 and pH 6.5 (P<0.01), but chop length had little effect. Cows fed the MS-based diets had a higher (P<0.05) milk fat concentration of C18:2n-6 and total polyunsaturated fatty acids compared with cows fed the GS-only diets. In conclusion, GS chop length had little effect on reticular pH; a longer chop length reduced DMI and milk yield but had little effect on milk fat yield. Including MS reduced reticular pH but increased DMI and milk performance irrespective of the GS chop length.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool in identifying patients with sepsis, and to describe and compare the characteristics of patients with an emergency department (ED) diagnosis of sepsis who are transported by paramedics. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection, fever and/or history of fever, and 2 or more signs of hypoperfusion (e.g. SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of 38.3°C (101°F) who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive statistics, t-tests, and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared to an ED diagnosis of sepsis. The predictive validity of the RPPEO tool was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared to those without, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001). 44 (18.6%) patients met the RPPEO sepsis notification tool criteria and, of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values of the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
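For readers tracing the reported predictive values, they follow from a standard 2×2 diagnostic table built from the counts given in the abstract (236 patients, 34 ED sepsis diagnoses, 44 tool-positive, 12 of whom had sepsis). The short Python sketch below is illustrative only and assumes those counts are exhaustive; it is not taken from the study's analysis code.

# Illustrative reconstruction of the 2x2 table from the abstract's counts.
tp = 12                  # tool positive, ED sepsis diagnosis
fp = 44 - tp             # tool positive, no sepsis diagnosis -> 32
fn = 34 - tp             # tool negative, sepsis diagnosis    -> 22
tn = 236 - tp - fp - fn  # tool negative, no sepsis diagnosis -> 170

sensitivity = tp / (tp + fn)   # 12/34   ~ 35.3%
specificity = tn / (tn + fp)   # 170/202 ~ 84.2%
ppv = tp / (tp + fp)           # 12/44   ~ 27.3%
npv = tn / (tn + fn)           # 170/192 ~ 88.5%

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")

Running this reproduces the figures quoted in the Results, which is a quick way to check that the four reported metrics are internally consistent with the raw counts.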
Established methods of recruiting population controls for case–control studies to investigate gastrointestinal disease outbreaks can be time-consuming, resulting in delays in identifying the source or vehicle of infection. After an initial evaluation of online market research panel members as controls in a case–control study investigating a Salmonella outbreak in 2013, this method was applied in four further studies in the UK between 2014 and 2016. We used data from all five studies, together with interviews with members of each outbreak control team and the market research panel provider, to review operational issues, evaluate the risk of bias in this approach and consider methods to reduce confounding and bias. The investigators of each outbreak reported likely time and cost savings from using market research controls. There were systematic differences between case and control groups in some studies, but no evidence that conclusions about the likely source or vehicle of infection were incorrect. The potential selection biases introduced by this sampling frame and the low response rate remain unclear. Methods that might reduce confounding and some bias should be balanced against concerns about overmatching. Further evaluation of this approach, using comparisons with traditional methods and population-based exposure survey data, is recommended.
Studies were conducted to determine lethal temperatures for Cyperus esculentus and Cyperus rotundus tubers using diurnal oscillations in soil temperature with maxima of 40, 45, 50, and 55 C and a minimum of 26 C. Growth of Cyperus spp. plants was faster at 40 C than at a constant temperature of 26 C. The 45 C treatment delayed Cyperus spp. emergence but was not lethal to tubers. Tuber mortality was 100% for both Cyperus spp. with the 50 and 55 C temperature regimes. Soil solarization with thermal-infrared-retentive (TIR) films resulted in higher soil temperatures than with a 30-μm low-density polyethylene (LDPE) clear film. With TIR films, greater proportions of emerged C. rotundus plants were killed by foliar scorching, and 6 wk of soil solarization was more effective at reducing C. rotundus density than with the LDPE film. Four weeks after film removal, the lowest level of control was obtained with the LDPE film. For C. rotundus tubers planted 5 and 10 cm deep, 62% control was obtained with the LDPE film, and it decreased to 32% with a 15-cm planting depth. The best residual control was 95 and 92% with the 75- and 100-μm TIR films, respectively. With the TIR films, there was no significant change in C. rotundus control with planting depth.
Soil solarization is a process of soil disinfestation that involves solar heating of moist soil covered with transparent polyethylene film. This nonchemical approach to controlling soil-borne pests is being investigated as an alternative to methyl bromide fumigation. Summer solarization controlled annual weed species and suppressed purple nutsedge. Although some nutsedge tubers sprouted despite the solarization treatment, the resulting shoots were almost always trapped under the clear solarization film. Conversely, in rows that were mulched with black film, nutsedge rhizomes punctured the film so that leaf expansion occurred above the film. In controlled pot experiments conducted in darkness, yellow nutsedge rhizomes readily penetrated 19- and 30-μm clear films as effectively as opaque films. Thicker clear films and bubble film reduced nutsedge penetration. In the greenhouse and laboratory, nutsedge penetration of transparent polyethylene film was inversely related to irradiance levels when the film was in direct contact with the soil. However, when there was a 5- to 10-mm space between the soil and the film, the lowest irradiance level (30 μmol m−2 s−1) was as effective as 320 μmol m−2 s−1 in reducing penetration by purple nutsedge. The film penetration by nutsedge rhizomes appears to be linked to a light-dependent morphological change from rhizome growth to leaf development, which occurs before film penetration with clear mulch and after film penetration with opaque mulch. The alternate sprouting and foliar scorching of nutsedge shoots trapped under clear films could potentially be exploited to deplete nutsedge tubers that occur at soil depths that do not develop lethal temperatures under soil solarization.
Soybean [Glycine max (L.) Merr.] cultivars ‘UFV1’ and ‘UFV2’ grown at Viçosa and Florestal, Brazil, and ‘Bonus’ and ‘Wells’ at Urbana, Illinois, were sprayed at growth stages R5.5 to R6 (full-pod) or R7 (50% defoliation) with the desiccant/herbicides glyphosate [N-(phosphonomethyl)glycine], paraquat (1,1′-dimethyl-4,4′-bipyridinium ion), or sodium chlorate:sodium borate (50:50, w/v). Desiccation of plants by paraquat significantly reduced seed weight and germination at all locations and increased the incidence of Alternaria and Phomopsis spp. at Urbana. Analysis of the combined data from the Brazilian locations showed a significant decrease in seed germination for all treatments except paraquat sprayed on UFV2 at R7 and sodium chlorate:sodium borate sprayed on UFV1 at R7. Herbicide application at R7 did not result in consistent increases in seedborne Fusarium or Phomopsis spp. at any Brazilian location. Rainfall and temperature at seed maturation were more important variables in pod-to-seed infection by these fungi than increased rates of tissue senescence caused by the desiccants.
Introduction: In Ottawa, STEMI patients are transported directly for percutaneous coronary intervention (PCI) by advanced care paramedics (ACPs) or primary care paramedics (PCPs), or are transferred from a PCP to an ACP crew (ACP-intercept). PCPs have a limited skill set to address complications during transport. The objective of this study was to determine what clinically important events (CIEs) occurred in STEMI patients transported for primary PCI by a PCP crew, and what proportion of such events could only be treated under ACP protocols. Methods: We conducted a health record review of STEMI patients transported for primary PCI from Jan 1, 2011 to Dec 21, 2015. Ottawa has a single PCI centre and its EMS system employs both PCP and ACP paramedics. We identified consecutive STEMI bypass patients transported by PCP-only and ACP-intercept crews using the dispatch database. A data extraction form was piloted and used to extract patient demographics, transport times, the primary outcomes (CIEs and interventions performed during transport) and the secondary outcomes (hospital diagnosis and mortality). CIEs were reviewed by two investigators to determine whether they would be treated differently under ACP protocols. We present descriptive statistics. Results: We identified 967 STEMI bypass cases, of which 214 (118 PCP-only and 96 ACP-intercept) met all inclusion criteria. Characteristics were: mean age 61.4 years, 78% male, 31.8% anterior and 44.4% inferior infarcts, mean response time 6 min, total paramedic contact time 29 min, and, in cases of ACP-intercept, 7 min of PCP-only contact time. A CIE occurred in 127 (59%) of cases: SBP<90 mmHg 26.2%, HR<60 30.4%, HR>100 20.6%, malignant arrhythmias 7.5%, altered mental status 6.5%, airway intervention 2.3%; 2 patients (0.9%) arrested, and both survived. Of the CIEs identified, 54 (42.5%) could be addressed differently under ACP vs PCP protocols (25.2% of total cases). The majority related to fluid boluses for hypotension (44 cases; 35% of CIEs). Within the ACP-intercept group, ACPs intervened for 51.6% of CIEs. There were 6 in-hospital deaths (2.8%), with no difference by transport crew type. Conclusion: CIEs were common in STEMI bypass patients; however, only a minority of these CIEs would be addressed differently under ACP protocols compared with PCP protocols. The vast majority of CIEs appeared to be transient and of limited clinical significance.
Introduction: The Canadian C-Spine Rule (CCR) was validated for use by emergency physicians and triage nurses to determine the need for radiography in alert and stable Emergency Department trauma patients. It was subsequently modified and validated for use by paramedics in 1,949 patients. The prehospital CCR calls for evaluation of active neck rotation if patients have none of 3 high-risk criteria and at least 1 of 4 low-risk criteria. This study evaluated the impact and safety of implementation of the CCR by paramedics. Methods: This single-centre prospective cohort implementation study took place in Ottawa, Canada. Advanced and primary care paramedics received online and in-person training on the CCR, allowing them to use the CCR to evaluate eligible patients and selectively transport them without immobilization. We evaluated all consecutive eligible adult patients (GCS 15, stable vital signs) at risk for neck injury. Paramedics were required to complete a standardized study data form for each eligible patient evaluated. Study staff reviewed paramedic documentation and the corresponding hospital records and diagnostic imaging reports. We followed all patients without initial radiologic evaluation for 30 days for referral to our spine service or a subsequent visit with radiologic evaluation. Analyses included sensitivity, specificity, kappa coefficient, t-test, and descriptive statistics with 95% CIs. Results: The 4,034 patients enrolled between Jan. 2011 and Aug. 2015 had the following characteristics: mean age 43 (range 16-99), female 53.3%, motor vehicle collision 51.9%, fall 23.8%, admitted to hospital 7.0%, acute c-spine injury 0.8%, and clinically important c-spine injury 0.3%. For the 11 clinically important injuries, the CCR had a sensitivity of 91% (95% CI 58-100%) and a specificity of 67% (95% CI 65-68%). Kappa agreement for interpretation of the CCR between paramedics and study investigators was 0.94 (95% CI 0.92-0.95). Paramedics were comfortable or very comfortable using the CCR in 89.8% of cases. Mean scene time was 3 min (15.6%) shorter for patients not immobilized (17 min vs. 20 min; p=0.0001). A total of 2,569 (63.7%) immobilizations were safely avoided using the CCR. Conclusion: Paramedics could safely and accurately apply the CCR to low-risk trauma patients. This had a significant impact on scene times and the number of prehospital immobilizations.
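For context on the reported agreement statistic, Cohen's kappa adjusts raw percentage agreement between two raters for the agreement expected by chance. The Python sketch below is a generic illustration of the calculation on a 2×2 agreement table; the counts are hypothetical, chosen only to show the arithmetic, and are not the study's data.

# Cohen's kappa for two raters (e.g. paramedic vs study investigator
# interpretations of the CCR). Counts are hypothetical, for illustration only.
both_pos, r1_only, r2_only, both_neg = 1900, 30, 20, 2084
n = both_pos + r1_only + r2_only + both_neg
p_observed = (both_pos + both_neg) / n                      # raw agreement
p_chance = ((both_pos + r1_only) * (both_pos + r2_only)     # chance agreement
            + (both_neg + r2_only) * (both_neg + r1_only)) / n**2
kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"kappa = {kappa:.2f}")

A kappa near 1 (such as the 0.94 reported above) indicates that paramedics and investigators interpreted the rule almost identically after accounting for chance agreement.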