Background: Prior to the pandemic, telemedicine use was limited and sparsely funded within Ontario. During the pandemic, a shift in clinical recommendations and government funding models promoted telemedicine. We aim to highlight both quantitative and qualitative aspects of the patient and provider experience over 2.5 years within a Canadian Pediatric Neurology clinic. The main objectives of the study are to assess the safety, efficiency, and convenience of telemedicine. Methods: A REDCap survey was sent to all patients with a telemedicine appointment from March 2020 to September 2022 and to all Pediatric Neurology providers. The survey included 5-point Likert scale questions, open-ended questions, and patient characteristics. Results: Responses were received from 272 patients and 7 providers. 91% of patients and all providers were satisfied with telemedicine. 95% of patients and all providers felt they received or were able to provide safe, adequate care. 90% of patients and all providers reported that telemedicine was more convenient. 87% of patients and all providers were interested in future appointments via telemedicine. Conclusions: Our survey shows patients and providers had highly positive experiences with telemedicine, reporting that care was adequate, safe, and more convenient. These data support incorporating telemedicine into future care and advocate that Canadian regulations and billing codes continue to support telemedicine.
Background: Currently, there are no standardized approaches to care or evaluation for tone dysfunction in Canada. The study authors hypothesize that there is significant practice variation across the country. This environmental scan aimed to describe the current practice for management of patients with hypertonia across Canada. Methods: A web-based survey was developed by the authors with a multi-disciplinary approach and sent to representative rehabilitation sites in each province. All statistical analyses were performed using the R statistical software version 4.0. Results: CP was found to be the most common diagnosis for tone dysfunction, with 58% (7/12) of sites diagnosing greater than 20 new patients per year. All 12 sites offered oral baclofen and gabapentin, and 92% of sites offered trihexyphenidyl. Botulinum toxin injections were offered at 50% of sites. Upper and lower extremity procedures were offered in 83% of the sites. In 8 of 12 sites (67%), patients were seen within a formal multidisciplinary clinic to manage hypertonia. Conclusions: The information gained from this study provides some insight into the current practice across Canada for children with hypertonia. This study may assist in the development of a national, standardized strategy for tone management, potentially facilitating more equitable access to care for patients.
Background: The Epilepsy Monitoring Unit (EMU) plays a crucial role in a patient’s diagnosis and management for seizures and epilepsy. The duration of stay required to obtain adequate information is not clear, especially in the pediatric population. In this study, we examine whether a one to four day length of stay in the EMU is sufficient to obtain the necessary information. Methods: We conducted a retrospective review of 522 admissions (2014-2021), including any patient admitted to CHEO’s EMU for any length of time. Results: The average admission was 1.75 days, with 35.7% of patients requiring repeat EMU visits. Through binary logistic regression, we show that a previous diagnosis of refractory seizures increases the chance of readmission to the EMU. However, a diagnosis of refractory seizures is also associated with a higher chance of achieving admission goals. Other factors, including seizure type, weaning of medications, goals of admission, age, and gender, had no influence on the likelihood of readmission or of achieving admission goals. Conclusions: This study indicates that a short admission for EMU monitoring is sufficient to capture enough data to achieve admission goals in the pediatric population.
A series of measures of calf welfare was developed through a process of expert consultation. A welfare assessment of group-housed calves was carried out on 53 UK dairy farms during the winter of 2000/01. The assessment used animal-based measures including direct observation of the calves and examination of their health history through a review of farm records. The findings from this were compiled into a profile of calf welfare which outlined the range of results for each measure used. The results fell into the three categories of respiratory health, nutrition and general appearance. A broad range of results was found across the farms visited for the measures in each of these categories. Some farms performed well for all measures taken, and no farms performed consistently badly across all aspects of calf welfare. The majority of farms combined aspects of both good and poor welfare performance.
This paper describes an approach to assessing the overall welfare of cows on dairy farms. Veterinary and behaviour experts were shown results for ten selected welfare parameters for 25 pairs of dairy farms paired for farm assurance status but with similar geographical location and husbandry system. From this information alone they were asked to state which farms had better welfare. Overall, there were no significant differences between the conclusions of veterinary and behaviour experts. There was a significant relationship between the proportion of experts rating a farm as poorer and the measured difference in the number of cows with lameness or rising restrictions between the paired farms. There were no significant relationships between the expert decisions and differences in milk yield, flight distance, swollen hocks, mastitis incidence, dystocia level, conception rates, prevalence of thin cows and proportion of cows with dirty udders. Clearly, experts rate lameness and discomfort as highly important indices of poor welfare in dairy cows.
A Delphi technique was used to gather the opinions of animal welfare experts on the most appropriate measures for welfare assessment of farm animals. The experts were asked to consider measures that were directed towards the animal (animal-based), rather than measurement of their environment. This systematic approach was designed to achieve a degree of consensus of opinion between a large number of experts. Two rounds of postal questionnaires were targeted at people with expertise in one or more of the species of interest. The respondents suggested measures based upon observations of health status, behaviour, and examination of records. These measures reflect the animal's welfare state — in other words, how the animal is coping within the environment and husbandry system in which it lives. The measures for cattle, pigs and laying hens were categorised into 22, 23 and 28 aspects, respectively, with the highest ranking of importance being given to observation of lameness in dairy cattle and pigs and to observation of feather condition in laying hens. This Delphi study was the basis for the development of a series of protocols to assess the welfare state of dairy cattle, pigs and laying hens.
To evaluate the effectiveness of an automated hand hygiene compliance system (AHHCS) audible alert and vibration for increasing hand hygiene compliance.
In a nonrandomized, before-and-after, quasi-experimental study, an AHHCS was implemented in several inpatient units. Over a 51-day period, the system’s real-time audible alert was turned on, off, and back on. Overall hand hygiene compliance was compared between days with activated and deactivated alerts and vibration.
This study was conducted at a level 1 trauma center, a regional academic health system with 1,564 beds.
The AHHCS was implemented in 9 inpatient units: 3 adult medical-surgical step-down units, and 6 adult intensive care units. The AHHCS badges were assigned to patient care assistants, registered nurses, physical therapists, occupational therapists, speech therapists, respiratory therapists, and physicians.
In the 9 inpatient units, selected healthcare staff were issued wearable badges that detected entry into and exit from a patient room. The audible alert was turned on for 16 days, turned off for 17 days, and then turned back on for 18 days, for a total of 51 days.
Utilization of the AHHCS real-time audible alert reminder resulted in sustained hand hygiene (HH) compliance of ≥90%. When the alert and vibration were deactivated, HH compliance dropped to an average of 74% (range, 62%–78%). Once the alert resumed, HH compliance returned to ≥90%.
Utilization of an AHHCS with real-time reminder audible alerts may be an effective method to increase healthcare worker HH compliance to ≥90%. Users of AHHCSs should consider the use of real-time reminders to improve HH compliance.
Plans for allocation of scarce life-sustaining resources during the coronavirus disease 2019 (COVID-19) pandemic often include triage teams, but operational details are lacking, including what patient information is needed to make triage decisions.
A Delphi study among Washington state disaster preparedness experts was performed to develop a list of patient information items needed for triage team decision-making during the COVID-19 pandemic. Experts proposed and rated their agreement with candidate information items during asynchronous Delphi rounds. Consensus was defined as ≥80% agreement. Qualitative analysis was used to describe considerations arising in this deliberation. A timed simulation was performed to evaluate feasibility of data collection from the electronic health record.
Over 3 asynchronous Delphi rounds, 50 experts reached consensus on 24 patient information items, including patients’ age, severe or end-stage comorbidities, the reason for and timing of admission, measures of acute respiratory failure, and clinical trajectory. Experts weighed complex considerations around how information items could support effective prognostication, consistency, accuracy, minimizing bias, and operationalizability of the triage process. Data collection took a median of 227 seconds (interquartile range = 205, 298) per patient.
Experts achieved consensus on patient information items that were necessary and appropriate for informing triage teams during the COVID-19 pandemic.
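The ≥80% agreement rule used to declare consensus in the Delphi rounds above can be sketched in a few lines. This is an illustrative sketch only; the function name and the expert counts below are invented for demonstration and are not the study's code.

```python
# Hypothetical sketch of a Delphi consensus check: an information item
# reaches consensus when at least `threshold` of experts agree with it.

def reached_consensus(ratings, threshold=0.80):
    """ratings: list of booleans, True if an expert agreed with the item."""
    if not ratings:
        return False
    return sum(ratings) / len(ratings) >= threshold

# 42 of 50 experts agree -> 84% -> consensus reached
print(reached_consensus([True] * 42 + [False] * 8))   # True
# 39 of 50 experts agree -> 78% -> below the 80% threshold
print(reached_consensus([True] * 39 + [False] * 11))  # False
```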
Background: The International League Against Epilepsy recommends that patients with drug resistant epilepsy (DRE) be referred for surgical evaluation; however, prior literature suggests this is an underutilized intervention. This study captures practices of North American pediatric neurologists regarding the management of DRE and factors which may promote or limit referrals for epilepsy surgical evaluation. Methods: A REDCap survey was distributed via the Child Neurology Society mailing list to pediatric neurologists practicing in North America. “R” was used to conduct data analyses. Ethics approval from the CHEO REB was granted prior to the start of data collection. Results: 102 pediatric neurologists responded, 77% of whom currently practice in the United States. 73% of respondents reported they would refer a patient for surgical consultation after two failed medications. Of all potential predictors tested in a logistic regression model, low referral volume was the only predictor of whether participants refer patients after more than three failed medications. Conclusions: Pediatric neurologists demonstrate fair knowledge of formal recommendations to refer patients for surgical evaluation after two failed medication trials. Other modifiable factors reported, especially family perceptions of epilepsy surgery, should be prioritized when developing tools to enhance effective referrals and increase utilization of epilepsy surgery in the management of pediatric DRE.
Digitization and the release of public records on the Internet have expanded the reach and uses of criminal record data in the United States. This study analyzes the types and volume of personally identifiable data released on the Internet via two hundred public governmental websites for law enforcement, criminal courts, corrections, and criminal record repositories in each state. We find that public disclosures often include information valuable to the personal data economy, including the full name, birthdate, home address, and physical characteristics of arrestees, detainees, and defendants. Using administrative data, we also estimate the volume of data disclosed online. Our findings highlight the mass dissemination of pre-conviction data: every year, over ten million arrests, 4.5 million mug shots, and 14.7 million criminal court proceedings are digitally released at no cost. Post-conviction, approximately 6.5 million current and former prisoners and 12.5 million people with a felony conviction have a record on the Internet. While justified through public records laws, such broad disclosures reveal an imbalance between the “transparency” of data releases that facilitate monitoring of state action and those that facilitate monitoring individual people. The results show how the criminal legal system increasingly distributes Internet privacy violations and community surveillance as part of contemporary punishment.
In order to maximize the utility of future studies of trilobite ontogeny, we propose a set of standard practices that relate to the collection, nomenclature, description, depiction, and interpretation of ontogenetic series inferred from articulated specimens belonging to individual species. In some cases, these suggestions may also apply to ontogenetic studies of other fossilized taxa.
To make a power spectrum (PS) detection of the 21-cm signal from the Epoch of Reionisation (EoR), one must avoid/subtract bright foreground sources. Sources such as Fornax A present a modelling challenge due to spatial structures spanning from arc seconds up to a degree. We compare modelling with multi-scale (MS) CLEAN components to ‘shapelets’, an alternative set of basis functions. We introduce a new image-based shapelet modelling package, SHAMFI. We also introduce a new CUDA simulation code (WODEN) to generate point source, Gaussian, and shapelet components into visibilities. We test performance by modelling a simulation of Fornax A, peeling the model from simulated visibilities, and producing a residual PS. We find the shapelet method consistently subtracts large-angular-scale emission well, even when the angular resolution of the data is changed. We find that when increasing the angular resolution of the data, the MS CLEAN model worsens at large angular scales. When testing on real Murchison Widefield Array data, the expected improvement is not seen because of other dominating systematics still present. Through further simulation, we find the expected differences to be lower than obtainable through current processing pipelines. We conclude shapelets are worthwhile for subtracting extended galaxies, and may prove essential for an EoR detection in the future, once other systematics have been addressed.
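The shapelet basis referred to above is, in one dimension, a Gauss-Hermite function with a characteristic scale β (Refregier 2003); packages such as SHAMFI fit the 2-D analogue to images of extended sources. The sketch below is an assumed illustration of the 1-D basis, not SHAMFI code:

```python
import math

# Hedged sketch of the 1-D Cartesian shapelet basis function
# phi_n(x; beta): a Hermite polynomial times a Gaussian envelope,
# with beta setting the angular scale of the model.

def hermite(n, u):
    """Physicists' Hermite polynomial H_n(u) via the standard recurrence."""
    h_prev, h = 1.0, 2.0 * u
    if n == 0:
        return h_prev
    for k in range(1, n):
        h_prev, h = h, 2.0 * u * h - 2.0 * k * h_prev
    return h

def shapelet_basis(n, x, beta):
    """Dimensional 1-D shapelet basis function phi_n(x; beta)."""
    norm = (2.0 ** n * math.factorial(n) * math.sqrt(math.pi) * beta) ** -0.5
    u = x / beta
    return norm * hermite(n, u) * math.exp(-0.5 * u * u)

# n = 0 is simply a normalised Gaussian of width beta
print(round(shapelet_basis(0, 0.0, 1.0), 4))  # pi**-0.25, i.e. 0.7511
```

A source model is then a linear sum of such basis functions, so fitting reduces to solving for the coefficients; this compactness at large angular scales is what makes shapelets attractive for extended sources like Fornax A.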
With most mental health disorders emerging in the later teenage years, university students are arguably an at-risk population with increased mental health support needs. This population is characterised by important, life-changing transitions (moving away from home, friends and family) and new potential stressors (including increased academic pressures and relational challenges). Research to examine determinants of mental health help-seeking behaviours in university students is needed to ensure emotional health needs are being met at this critical time.
To examine levels of psychological distress and mental health help-seeking behaviours in a sample of UK university students. By identifying factors associated with help seeking, we can better understand the mental health needs of this population and inform support provision.
This study draws on data from the social and emotional well-being in university students (SoWise) study, an online survey which aimed to examine risk and resilience for social and emotional well-being in young people attending a UK university.
Whole sample analysis (n = 461) showed help seeking was significantly associated with psychological distress, current life stressors and anxious attachment and not associated with perceived mental health stigma. Sub-group analysis (n = 171) suggests being female and older significantly predicted help seeking in students with mild/moderate psychological distress.
Younger males with mild/moderate psychological distress are less likely to seek mental health support and represent an “invisible” at-risk group. Results also suggest that global anti-stigma campaigns in universities may not prove effective in encouraging help seeking.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Provisia™ rice was developed recently by the BASF Corporation for control of grass weeds and is complementary to existing Clearfield® technology. Our previous research showed that resistance of Provisia™ rice to the acetyl coenzyme-A carboxylase herbicide quizalofop-p-ethyl (QPE) in laboratory and greenhouse environments is governed by a single dominant Mendelian gene. However, these results may not be consistent in different populations or field environments. Therefore, the first objective of the current research is to determine the inheritance of resistance to QPE in rice using different segregating populations evaluated under U.S. field environments. The second objective is to evaluate the response of QPE-resistant breeding lines to various herbicide concentrations at two U.S. locations. Chi-square tests of 12 F2 populations evaluated in Louisiana during 2014 and 2015 indicated that QPE seedling resistance at 240 g ai ha−1 was governed by a single dominant Mendelian gene with no observable maternal effects. Similar results were obtained in five F3 populations derived from the aforementioned F2 populations. Allele-specific SNP markers for QPE resistance also followed Mendelian segregation in the five F2 populations. For the second objective, six QPE-resistant inbred lines showed transient leaf injury at 1× (120 g ai ha−1) or 2× (240 g ai ha−1) field rates 7 and 21 d after treatment (DAT). However, a trend of reduced injury (recovery) from 7 through 33 DAT was observed for all breeding material. No differences in grain yield were found between untreated QPE-resistant lines and those treated with 1× or 2× QPE field rate. Single gene inheritance and good levels of QPE herbicide field resistance in different genetic populations suggest feasibility for rapid and effective development of new QPE-resistant varieties and effective stewardship of the Provisia™ technology.
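The chi-square tests of a single dominant gene model described above amount to testing each F2 population against a 3:1 resistant:susceptible ratio. A minimal sketch of that goodness-of-fit calculation follows; the counts are invented for illustration and are not the study's data.

```python
# Illustrative chi-square goodness-of-fit test for a 3:1 segregation
# ratio in an F2 population (single dominant Mendelian gene model).

def chi_square_3_to_1(resistant, susceptible):
    n = resistant + susceptible
    exp_r, exp_s = 0.75 * n, 0.25 * n
    return ((resistant - exp_r) ** 2 / exp_r
            + (susceptible - exp_s) ** 2 / exp_s)

chi2 = chi_square_3_to_1(70, 30)  # expected 75:25 for n = 100
print(round(chi2, 3))             # 1.333
# df = 1; the critical value at alpha = 0.05 is 3.841, so the 3:1
# single-gene model is not rejected for these hypothetical counts.
print(chi2 < 3.841)               # True
```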
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
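At its core, the direct bispectrum estimate described above averages the triple product of visibilities measured on a closed triangle of baselines. The schematic sketch below illustrates that averaging step only; the visibility values are invented, and a real estimator would also average over redundant triangles and handle noise bias, which this sketch omits.

```python
# Schematic direct bispectrum estimate from visibilities on a closed
# triangle of baselines (b1 + b2 + b3 = 0), averaged over time samples.

def direct_bispectrum(v1, v2, v3):
    """Average the triple product V(b1)·V(b2)·V(b3) over time samples."""
    assert len(v1) == len(v2) == len(v3)
    total = sum(a * b * c for a, b, c in zip(v1, v2, v3))
    return total / len(v1)

# Two invented time samples of complex visibilities on one triangle:
v1 = [1 + 1j, 2 + 0j]
v2 = [1 - 1j, 1 + 1j]
v3 = [2 + 0j, 1 + 0j]
print(direct_bispectrum(v1, v2, v3))  # (3+1j)
```

The MWA Phase II compact layout makes this estimator natural because its redundant tile spacings supply many identical copies of each triangle configuration to average over.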
Background: Biallelic variants in POLR1C are associated with POLR3-related leukodystrophy (POLR3-HLD), or 4H leukodystrophy (Hypomyelination, Hypodontia, Hypogonadotropic Hypogonadism), and Treacher Collins syndrome (TCS). The clinical spectrum of POLR3-HLD caused by variants in this gene has not been described. Methods: A cross-sectional observational study involving 25 centers worldwide was conducted between 2016 and 2018. The clinical, radiologic and molecular features of 23 unreported and previously reported cases of POLR3-HLD caused by POLR1C variants were reviewed. Results: Most participants presented between birth and age 6 years with motor difficulties. Neurological deterioration was seen during childhood, suggesting a more severe phenotype than previously described. The dental, ocular and endocrine features often seen in POLR3-HLD were not invariably present. Five patients (22%) had a combination of hypomyelinating leukodystrophy and abnormal craniofacial development, including one individual with clear TCS features. Several cases did not exhibit all the typical radiologic characteristics of POLR3-HLD. A total of 29 different pathogenic variants in POLR1C were identified, including 13 new disease-causing variants. Conclusions: Based on the largest cohort of patients to date, these results suggest novel characteristics of POLR1C-related disorder, with a spectrum of clinical involvement characterized by hypomyelinating leukodystrophy with or without abnormal craniofacial development reminiscent of TCS.
Background: The classic ketogenic diet is the main non-pharmacological treatment for refractory epilepsy; however, adherence is often challenging. The low glycemic index diet (LGID) is less strict, almost equally effective, and associated with improved adherence. Little is known about the quality of life of children treated with the LGID. The objective of this study was to explore changes in the quality of life of children with epilepsy transitioning to the LGID. Methods: Patients on the LGID and their parents completed Pediatric Quality of Life Epilepsy Module questionnaires: one while on the LGID, and one retrospectively for the time prior to starting the LGID. Results: Data were collected from five children aged 3-13 and their parents. Complete seizure control was seen in two children, >50% seizure reduction in one, and no change in two children. Parent-reported quality of life while on the LGID increased for two participants but decreased in all child self-reports. Conclusions: Although the LGID led to improved seizure control in three out of five patients, the child-reported quality of life decreased in all children. Larger prospective studies are warranted to reliably assess the impact of the LGID on the quality of life in children with epilepsy.
Background: Cannabis has been shown to be an effective therapy for epilepsy in children with Dravet and Lennox-Gastaut syndrome. Despite the fact that many pediatric epilepsy patients across Canada are currently being treated with cannabis, little is known about pediatric neurologists’ attitudes towards it. Methods: A 26-item online survey was distributed to 148 pediatric neurologists across Canada. Results: 56/148 neurologists responded and reported that over 600 children with epilepsy are currently taking cannabinoids. 34% of neurologists authorized cannabis to children, 38% referred children for authorization, and 29% did not authorize or refer their patients. Of those neurologists who referred, 76% referred to a community-based non-neurologist. The majority of physicians authorized cannabis to patients with Dravet syndrome (68%) and Lennox-Gastaut syndrome (64%). Cannabis was never authorized as a first-line treatment. 54% of neurologists stated that their patients were taking CBD alone, despite this option not being available in Canada. All physicians reported having at least one hesitation regarding cannabis, the most common ones being poor evidence (66%), poor quality control (52%), and cost (50%). Conclusions: The majority of Canadian pediatric neurologists use cannabis as a treatment for epilepsy in children. However, there appear to be knowledge gaps and hesitations.
Background: Seizure monitoring via amplitude-integrated EEG (aEEG) is standard of care in many NICUs; however, conventional EEG (cEEG) is the gold standard for seizure detection. We compared the diagnostic yield of aEEG interpreted at the bedside, aEEG interpreted by an expert, and cEEG. Methods: Neonates received aEEG and cEEG in parallel. Clinical events and aEEG were interpreted at bedside and subsequently independently analyzed by experienced neonatology and neurology readers. Sensitivity and specificity of bedside aEEG as compared to expert aEEG interpretation and cEEG were evaluated. Results: Thirteen neonates were monitored for an average duration of 33 hours (range 15-94). Fourteen seizure-like events were detected by clinical observation, and 12 others by bedside aEEG analysis. None of the bedside aEEG events were confirmed as seizures on cEEG. Expert aEEG interpretation had a sensitivity of 13% with 46% specificity for individual seizure detection (not adjusting for patient differences), and a sensitivity of 50% with 46% specificity for detecting patients with seizures. Conclusions: Real-world bedside aEEG monitoring failed to detect seizures evidenced via cEEG, while misclassifying other events as seizures. Even post-hoc expert aEEG interpretation provided limited sensitivity and specificity. Considering the poor sensitivity and specificity of bedside aEEG interpretation, combined monitoring may provide limited clinical benefit.
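The sensitivity and specificity figures above follow from standard confusion-matrix arithmetic against the cEEG gold standard. A minimal sketch is given below; the counts are invented for illustration and are not taken from the study.

```python
# Minimal sensitivity/specificity calculation against a gold standard
# (here, cEEG-confirmed seizures). Counts below are hypothetical.

def sensitivity(tp, fn):
    """Fraction of true seizures that were detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-seizure events correctly classified."""
    return tn / (tn + fp)

# e.g. 1 true positive, 7 false negatives -> sensitivity 0.125 (~13%)
print(sensitivity(tp=1, fn=7))            # 0.125
# e.g. 6 true negatives, 7 false positives -> specificity ~0.46
print(round(specificity(tn=6, fp=7), 2))  # 0.46
```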
We provide the first in situ measurements of antenna element beam shapes of the Murchison Widefield Array. Most current processing pipelines use an assumed beam shape, which can cause absolute and relative flux density errors and polarisation ‘leakage’. Understanding the primary beam is then of paramount importance, especially for sensitive experiments such as a measurement of the 21-cm line from the epoch of reionisation, where the calibration requirements are so extreme that tile to tile beam variations may affect our ability to make a detection. Measuring the primary beam shape from visibilities is challenging, as multiple instrumental, atmospheric, and astrophysical factors contribute to uncertainties in the data. Building on the methods of Neben et al. [Radio Sci., 50, 614], we tap directly into the receiving elements of the telescope before any digitisation or correlation of the signal. Using ORBCOMM satellite passes we are able to produce all-sky maps for four separate tiles in the XX polarisation. We find good agreement with the beam model of Sokolowski et al. [2017, PASA, 34, e062], and clearly observe the effects of a missing dipole from a tile in one of our beam maps. We end by motivating and outlining additional on-site experiments.