Background: Electroconvulsive therapy (ECT) involves the induction of a generalized seizure with an electrical current and has been used worldwide to treat medically refractory psychiatric illness. Here we describe a patient with no prior history of or risk factors for epilepsy who developed temporal lobe epilepsy after chronic ECT treatment. Methods: A 16-year-old right-handed boy with severe refractory depression received ECT treatment every 10 days for 8 months. Six months into his ECT treatment, the patient developed seizures and was admitted to a pediatric epilepsy monitoring unit. Results: Initial clinical events included lightheadedness, diaphoresis, and nausea with associated kaleidoscopic vision changes. Seizures had progressed to confusion, fear and paranoia by the time the patient was admitted for monitoring. Long-term video EEG captured many focal seizures with impaired awareness, arising from both temporal lobes. MRI was normal. ECT was terminated and the patient was started on carbamazepine. He has been seizure free for the past 2 years on medication. Conclusions: While rare, we present a case of a patient with no prior risk factors for epilepsy who developed temporal lobe epilepsy after chronic ECT treatment. Although ECT is an indispensable treatment for many medically refractory psychiatric illnesses, we suggest caution in young patients undergoing ECT.
Background: There are few published reports on the safety and efficacy of stereoelectroencephalography (SEEG) in the presurgical evaluation of pediatric drug-resistant epilepsy. Our objective was to describe institutional experience with pediatric SEEG in terms of (1) insertional complications, (2) identification of the epileptogenic zone and (3) seizure outcome following SEEG-tailored resections. Methods: Retrospective review of 29 pediatric drug-resistant epilepsy patients who underwent presurgical SEEG between 2005 and 2018. Results: 29 pediatric SEEG patients (15 male; 12.4 ± 4.6 years old) were included in this study with mean follow-up of 6.0 ± 4.1 years. SEEG-related complications occurred in 1/29 (3%): neurogenic pulmonary edema. A total of 190 multi-contact electrodes (mean of 7.0 ± 2.5 per patient) were implanted across 30 insertions, which captured 437 electrographic seizures (mean 17.5 ± 27.6 per patient). The most common rationale for SEEG was normal MRI with surface EEG that failed to identify the EZ (16/29; 55%). SEEG-tailored resections were performed in 24/29 (83%). Engel I outcome was achieved following resection in 19/24 cases (79%) with 5.9 ± 4.0 years of post-operative follow-up. Conclusions: Stereoelectroencephalography in the presurgical evaluation of pediatric drug-resistant epilepsy is a safe and effective way to identify the epileptogenic zone, permitting SEEG-tailored resection.
Delays in triage processes in the emergency department (ED) can compromise patient safety. The aim of this study was to provide proof-of-concept that a self-check-in kiosk could decrease the time needed to identify ambulatory patients arriving in the ED. We compared the use of a novel automated self-check-in kiosk for identifying patients on ED arrival with routine nurse-initiated patient identification.
We performed a prospective trial with random weekly allocation to intervention or control processes during a 10-week study period. During intervention weeks, patients used a self-check-in kiosk to self-identify on arrival. This electronically alerted triage nurses to patient arrival times and primary complaint before triage. During control weeks, kiosks were unavailable and patients were identified using routine nurse-initiated triage. The primary outcome was time-to-first-identification, defined as the interval between ED arrival and identification in the hospital system.
Median (interquartile range) time-to-first-identification was 1.4 minutes (1.0–2.08) for intervention patients and 9 minutes (5–18) for control patients. Regression analysis revealed that the adjusted time-to-first-identification was 13.6 minutes (95% confidence interval 12.8–14.5) faster for the intervention group.
A self-check-in kiosk significantly reduced the time-to-first-identification for ambulatory patients arriving in the ED.
Introduction: Trauma and injury play a significant role in the population's burden of disease. Limited research exists evaluating the role of trauma bypass protocols. The objective of this study was to assess the impact and effectiveness of a newly introduced prehospital field trauma triage (FTT) standard, allowing paramedics to bypass a closer hospital and transport directly to a trauma centre (TC), provided transport times were within 30 minutes. Methods: We conducted a 12-month multi-centred health record review of paramedic call reports and emergency department health records following the implementation of the 4-step FTT standard (step 1: vital signs and level of consciousness; step 2: anatomical injury; step 3: mechanism; step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported urgently to hospital who met one of the 4 steps of the FTT standard, allowing for bypass consideration. We developed and piloted a standardized data collection tool and obtained consensus on all data definitions. The primary outcome was the rate of appropriate triage to a TC, defined as any of the following: injury severity score ≥12, admission to an intensive care unit, non-orthopedic operation, or death. We report descriptive and univariate analyses where appropriate. Results: 570 adult patients were included with the following characteristics: mean age 48.8, male 68.9%, attended by an Advanced Care Paramedic 71.8%, mechanisms of injury: MVC 20.2%, falls 29.6%, stab wounds 10.5%, median initial GCS 14, mean initial BP 132, prehospital fluid administered 26.8%, prehospital intubation 3.5%, transported to a TC 74.6%. Of those transported to a TC, 308 (72.5%) had bypassed a closer hospital prior to TC arrival. Of those that bypassed a closer hospital, 136 (44.2%) were determined to be “appropriate triage to TC”. 
Bypassed patients more often met step 1 or step 2 of the standard (186, 66.9%) than step 3 or step 4 (122, 39.6%). An appropriate triage to TC occurred in 104 (55.9%) patients who met step 1 or 2 and in 32 (26.2%) patients who met step 3 or 4 of the FTT standard. Conclusion: The FTT standard can identify patients who should be bypassed and transported to a TC. However, this comes at the cost of poor sensitivity, potentially burdening the system. More work is needed to develop an FTT standard that will assist paramedics in appropriately identifying patients who require a trauma centre.
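As a quick arithmetic check, the stepwise appropriate-triage proportions reported in this abstract can be reproduced from the raw counts. This is a minimal sketch; the variable names are illustrative and not taken from the study.

```python
# Counts reported in the abstract: of 308 bypassed patients,
# 186 met step 1 or 2 (104 appropriately triaged) and
# 122 met step 3 or 4 (32 appropriately triaged).
step12_total, step12_appropriate = 186, 104
step34_total, step34_appropriate = 122, 32
bypassed_total = 308

# The two appropriate-triage subgroups sum to the reported 136.
assert step12_appropriate + step34_appropriate == 136

rate12 = step12_appropriate / step12_total   # appropriate triage, steps 1-2
rate34 = step34_appropriate / step34_total   # appropriate triage, steps 3-4
overall = 136 / bypassed_total               # overall appropriate triage

print(f"steps 1-2: {rate12:.1%}, steps 3-4: {rate34:.1%}, overall: {overall:.1%}")
```

Running this recovers the reported 55.9%, 26.2% and 44.2% figures.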
A new species, Contarinia brassicola Sinclair (Diptera: Cecidomyiidae), which induces flower galls on canola (Brassica napus Linnaeus and Brassica rapa Linnaeus (Brassicaceae)), is described from Saskatchewan and Alberta, Canada. Larvae develop in the flowers of canola, which causes swelling and prevents opening, pod formation, and seed set. Mature larvae exit the galls, fall to the soil, and form cocoons. Depending on conditions, larvae will either pupate and eclose in the same calendar year or enter facultative diapause and emerge the following year. At least two generations of C. brassicola occur each year. Adults emerge from overwintering cocoons in the spring and lay eggs on developing canola flower buds. The galls produced by C. brassicola were previously attributed to the swede midge, Contarinia nasturtii (Kieffer) in Saskatchewan; here, we compare and list several characters to differentiate the two species.
Vitamin B12 is synthesised in the rumen from cobalt (Co) and has a major role in metabolism in the peri-parturient period, although few studies have evaluated the effect of the dietary inclusion of Co, vitamin B12 or injecting vitamin B12 on the metabolism, health and performance of high yielding dairy cows. A total of 56 Holstein-Friesian dairy cows received one of four treatments from 8 weeks before calving to 8 weeks post-calving: C, no added Co; DC, additional 0.2 mg Co/kg dry matter (DM); DB, additional 0.68 mg vitamin B12/kg DM; IB, intra-muscular injection of vitamin B12 to supply 0.71 mg/cow per day prepartum and 1.42 mg/cow per day post-partum. The basal and lactation rations both contained 0.21 mg Co/kg DM. Cows were weighed and condition scored at drying off, 4 weeks before calving, within 24 h of calving and at 2, 4 and 8 weeks post-calving, with blood samples collected at drying off, 2 weeks pre-calving, calving and 2, 4 and 8 weeks post-calving. Liver biopsy samples were collected from all animals at drying off and 4 weeks post-calving. Live weight changed with time, but there was no effect of treatment (P>0.05), whereas cows receiving IB had the lowest mean body condition score and DB the highest (P<0.05). There was no effect of treatment on post-partum DM intake, milk yield or milk fat concentration (P>0.05), with mean values of 21.6 kg/day, 39.6 kg/day and 40.4 g/kg, respectively. Cows receiving IB had a higher plasma vitamin B12 concentration than those receiving any of the other treatments (P<0.001), but there was no effect (P>0.05) of treatment on homocysteine or succinate concentrations, although mean plasma methylmalonic acid concentrations were lower (P=0.019) for cows receiving IB than for Control cows. Plasma β-hydroxybutyrate concentrations increased sharply at calving followed by a decline, but there was no effect of treatment. Similarly, there was no effect (P>0.05) of treatment on plasma non-esterified fatty acids or glucose. 
Whole tract digestibility of DM and fibre measured at week 7 of lactation were similar between treatments, and there was little effect of treatment on the milk fatty acid profile except for C15:0, which was lower in cows receiving DC than IB (P<0.05). It is concluded that a basal dietary concentration of 0.21 mg Co/kg DM is sufficient to meet the requirements of high yielding dairy cows during the transition period, and there is little benefit from additional Co or vitamin B12.
Indigenous women and children experience some of the most profound health disparities globally. These disparities are grounded in historical and contemporary trauma secondary to colonial atrocities perpetuated by settler society. The health disparities that exist for chronic diseases may have their origins in early-life exposures that Indigenous women and children face. Mechanistically, there is evidence that these adverse exposures epigenetically modify genes associated with cardiometabolic disease risk. Interventions designed to support a resilient pregnancy and first 1000 days of life should abrogate disparities in early-life socioeconomic status. Breastfeeding, prenatal care and early child education are key targets for governments and health care providers to start addressing current health disparities in cardiometabolic diseases among Indigenous youth. Programmes grounded in cultural safety and co-developed with communities have successfully reduced health disparities. More works of this kind are needed to reduce inequities in cardiometabolic diseases among Indigenous women and children worldwide.
Neothaumalea atlantica, new genus, new species (Diptera: Thaumaleidae), is described from the state of Santa Catarina in southern Brazil. This represents the first thaumaleid collected east of the Andes mountain range. The egg, larva, pupa, and both adult sexes are described and illustrated, a distribution map is presented, and phylogenetic affinities are discussed. A key to the genera of South America is also provided.
Background: Temporal lobe epilepsy (TLE) accounts for approximately 20% of pediatric epilepsy cases. Of those, many are considered medically intractable and require surgical interventions. In this study, we hypothesized that mesial temporal sclerosis (MTS) was less common in patients who had undergone surgery for intractable pediatric TLE than in adult series. We further hypothesized that there was a radiological and pathological discordance in identifying the cause of pediatric TLE. Methods: We retrospectively reviewed the charts of pediatric patients with TLE who had undergone surgical treatments as part of the University of Alberta’s Comprehensive Epilepsy Program between 1988 and 2018. Along with preoperative magnetic resonance imaging (MRI) reports, post-surgical pathology results and seizure outcomes were studied. Results: Of the 83 pediatric patients who had undergone temporal lobe epilepsy surgery, 28% had tumors, 22% had dual pathologies, 18% had MTS, 11% had focal cortical dysplasia, and 22% had other pathologies. In addition, for 36% of these patients, discordance between their pre-surgical MRI reports and post-surgical pathology reports was found. Conclusions: This was one of the largest retrospective cohort studies of pediatric patients who had undergone surgery for intractable TLE. This study showed that tumors, and not MTS, were the most common pathology in surgical pediatric TLE.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool to identify patients with sepsis, and to describe and compare the characteristics of paramedic-transported patients with an emergency department (ED) diagnosis of sepsis. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection; fever and/or history of fever; and 2 or more signs of hypoperfusion (e.g., SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of 38.3°C (101°F), who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive statistics, t-tests, and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared to an ED diagnosis of sepsis. The predictive validity of the RPPEO tool was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared to those without, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001). 
44 patients (18.6%) met the criteria of the RPPEO sepsis notification tool and, of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values for the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
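The four predictive values reported above follow from a standard 2×2 contingency table, which can be reconstructed from the counts given in the abstract (236 transported patients, 34 with an ED diagnosis of sepsis, 44 flagged by the tool, 12 of whom had sepsis). A minimal worked example, with illustrative variable names:

```python
# Reconstruct the 2x2 table from counts reported in the abstract.
total = 236          # patients meeting inclusion criteria
ed_sepsis = 34       # ED diagnosis of sepsis
tool_positive = 44   # met the RPPEO notification tool

tp = 12                      # tool positive AND ED sepsis (12/44 reported)
fp = tool_positive - tp      # 32 flagged without ED sepsis
fn = ed_sepsis - tp          # 22 sepsis cases missed by the tool
tn = total - tp - fp - fn    # 170 correctly not flagged

sensitivity = tp / (tp + fn)   # proportion of sepsis cases flagged
specificity = tn / (tn + fp)   # proportion of non-sepsis cases not flagged
ppv = tp / (tp + fp)           # probability of sepsis given a flag
npv = tn / (tn + fn)           # probability of no sepsis given no flag

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```

Running this reproduces the reported 35.3%, 84.2%, 27.3% and 88.5%, confirming the four figures are internally consistent.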
Prolonged periods of stress have been associated with impaired immune function; the experiment reported here investigates a potential link between level of metabolic load and immune function in lactating dairy cattle. A group of 111 Holstein-Friesian dairy cows was used. The cows belonged to one of two genetic lines: a selection line (S) with high genetic merit for fat plus protein yield and an unselected control line (C). The cows were offered one of two silage-based total mixed diets containing either 200 g (LC) or 450 g (HC) of concentrate per kg dry matter. The combination of genetic selection and food gave four groups: S-LC, S-HC, C-LC and C-HC. All cows were inoculated with a live attenuated BHV-1 vaccine soon after parturition and the primary antibody response in whey was monitored. The number of BHV-1 antibody positive cows was not significantly different between the four groups, but the initial antibody response was lower in cows of high genetic merit which were given a low concentrate diet. Statistical analysis demonstrated that the contribution of diet to this effect was highly significant. One year later, again after parturition, the experiment was repeated, this time using serum as the test sample. The average antibody response of the BHV-1 antibody positive cows was not significantly different between the four groups, but the number of antibody positive cows was group-dependent. In conclusion, diet type but not genetic merit for high fat plus protein yield made a highly significant contribution to the antibody response of dairy cows to BHV-1 vaccination, both initially and a year later.
Established methods of recruiting population controls for case–control studies to investigate gastrointestinal disease outbreaks can be time consuming, resulting in delays in identifying the source or vehicle of infection. After an initial evaluation of using online market research panel members as controls in a case–control study to investigate a Salmonella outbreak in 2013, this method was applied in four further studies in the UK between 2014 and 2016. We used data from all five studies and interviews with members of each outbreak control team and market research panel provider to review operational issues, evaluate risk of bias in this approach and consider methods to reduce confounding and bias. The investigators of each outbreak reported likely time and cost savings from using market research controls. There were systematic differences between case and control groups in some studies but no evidence that conclusions on the likely source or vehicle of infection were incorrect. Potential selection biases introduced by using this sampling frame and the low response rate are unclear. Methods that might reduce confounding and some bias should be balanced with concerns for overmatching. Further evaluation of this approach using comparisons with traditional methods and population-based exposure survey data is recommended.
The objective of this paper is to present a qualitative study of introducing substance misuse screening using the Screening Brief Intervention and Referral to Treatment (SBIRT) model, in primary care in Abu Dhabi.
Substance misuse in the UAE is an increasing problem. However religious beliefs and fear of legal consequences have prevented this topic from being openly discussed, risk levels identified through screening and treatment options offered.
A controlled trial was undertaken which included a qualitative process study which is reported here. Qualitative interviews with primary care physicians from two intervention clinics were undertaken to explore their views, experiences and attitudes towards substance misuse management in their clinic. Physicians were trained on SBIRT and on the research project process and documentation. At completion of the project, 10 months after the training, physicians (n=17) were invited to participate in an interview to explore their experiences of training and implementation of SBIRT. Interviews were recorded and transcribed. Inductive thematic coding was applied.
In total, 11 physicians were interviewed and three main themes emerged: (1) the SBIRT screening project, (2) cultural issues and (3) patient follow-up. Findings revealed a general willingness toward the concept of screening and delivering brief interventions in primary care, although increased workload and uncertainties about remuneration for the service may be a barrier to future implementation. There was a perceived problem of substance misuse that was not currently being met and a strong perception that patients were not willing to reveal substance use due to cultural barriers and fear of police involvement. This qualitative process evaluation provided essential insight into implementing SBIRT in the Middle East. In conclusion, despite physician willingness and a clinical need for a substance misuse care pathway, the reluctance among patients to admit to substance use in this culture needs to be addressed to enable successful implementation.
Trans-10, cis-12 CLA is produced as an intermediary during the biohydrogenation of linoleic acid (C18:2 n-6) in the rumen and has been shown to be a potent inhibitor of milk fat synthesis in ruminants. The production of trans-10, cis-12 CLA in the rumen is affected by dietary concentrate:forage ratio (Kucuk et al., 2001), rumen pH and the amount and source of linoleic acid in the diet. However, the interaction between oil source, carbohydrate source and pH on the production of trans-10, cis-12 CLA is unclear (Beam et al., 2000). The objectives of the current study were to determine the effects of oil source, carbohydrate source and pH on the biohydrogenation of linoleic acid and production of trans-10, cis-12 CLA in vitro.
Trans-10, cis-12 conjugated linoleic acid (CLA), a biohydrogenation intermediate produced in the rumen, is a potent inhibitor of milk fat synthesis. Data from a number of studies in which various doses of trans-10, cis-12 CLA have been abomasally infused demonstrate a curvilinear relationship between the percent reduction in milk fat yield and both the dose of trans-10, cis-12 CLA infused and the milk fat content of trans-10, cis-12 CLA. In addition to a reduction in milk fat output, under some circumstances an increase in milk yield and milk protein output is observed. To date, there has been no examination of the effects of trans-10, cis-12 CLA on milk fat synthesis in lactating sheep. The current study was therefore designed to determine whether trans-10, cis-12 CLA would inhibit milk fat synthesis in lactating sheep. To test the effectiveness of trans-10, cis-12 CLA in inhibiting milk fat synthesis, we used a lipid-encapsulated trans-10, cis-12 CLA supplement (LE-CLA) as a means to provide the trans-10, cis-12 CLA isomer post-ruminally.
Previous work has shown that processing whole crop wheat (WCW) at harvest increases starch digestibility (Jackson et al., 2002). However, no effect was seen in terms of milk yield. It has been suggested that the provision of a sugar source might utilise the high rumen ammonia levels found in animals receiving urea-treated whole crop wheat (Abdalla et al., 1999). Sources such as lactose have also been shown to reduce rumen protozoa numbers, increase bacterial protein supply and result in a more stable rumen pH, particularly with high starch diets (Hussain and Miller, 1999). Additionally, to date, processed whole crop wheat has not been evaluated against other alternative forages. The objective of the current experiment was, therefore, to compare processed urea-treated whole crop wheat with maize silage and to determine the effects of carbohydrate supplementation of whole crop wheat on intake and milk production in dairy cows.