We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
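As a rough illustration of the direct estimator described above (not the MWA EoR pipeline itself), the sketch below averages the triple product of visibilities around closed baseline triangles; the array shape, the noise-only test data, and the omission of normalisation constants are assumptions made only for the example.

```python
# Minimal sketch of a direct bispectrum estimate from redundant triangles of
# antenna tiles. `vis` is assumed to hold one complex visibility per triangle
# side per time step, with the three baselines forming a closed triangle
# (u1 + u2 + u3 = 0); normalisation constants are omitted.
import numpy as np

def direct_bispectrum(vis: np.ndarray) -> complex:
    """vis shape: (n_triangles, 3, n_times) -> mean of V(u1) * V(u2) * V(u3)."""
    triple_product = vis[:, 0, :] * vis[:, 1, :] * vis[:, 2, :]
    return triple_product.mean()

# Noise-only example: the estimate should scatter around zero.
rng = np.random.default_rng(0)
noise = rng.normal(size=(100, 3, 50)) + 1j * rng.normal(size=(100, 3, 50))
print(direct_bispectrum(noise))
```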
Background: Biallelic variants in POLR1C are associated with POLR3-related leukodystrophy (POLR3-HLD), or 4H leukodystrophy (Hypomyelination, Hypodontia, Hypogonadotropic Hypogonadism), and Treacher Collins syndrome (TCS). The clinical spectrum of POLR3-HLD caused by variants in this gene has not been described. Methods: A cross-sectional observational study involving 25 centers worldwide was conducted between 2016 and 2018. The clinical, radiologic and molecular features of 23 unreported and previously reported cases of POLR3-HLD caused by POLR1C variants were reviewed. Results: Most participants presented between birth and age 6 years with motor difficulties. Neurological deterioration was seen during childhood, suggesting a more severe phenotype than previously described. The dental, ocular and endocrine features often seen in POLR3-HLD were not invariably present. Five patients (22%) had a combination of hypomyelinating leukodystrophy and abnormal craniofacial development, including one individual with clear TCS features. Several cases did not exhibit all the typical radiologic characteristics of POLR3-HLD. A total of 29 different pathogenic variants in POLR1C were identified, including 13 new disease-causing variants. Conclusions: Based on the largest cohort of patients to date, these results suggest novel characteristics of POLR1C-related disorder, with a spectrum of clinical involvement characterized by hypomyelinating leukodystrophy with or without abnormal craniofacial development reminiscent of TCS.
Background: Seizure monitoring via amplitude-integrated EEG (aEEG) is standard of care in many NICUs; however, conventional EEG (cEEG) is the gold standard for seizure detection. We compared the diagnostic yield of aEEG interpreted at the bedside, aEEG interpreted by an expert, and cEEG. Methods: Neonates received aEEG and cEEG in parallel. Clinical events and aEEG were interpreted at the bedside and subsequently independently analyzed by experienced neonatology and neurology readers. Sensitivity and specificity of bedside aEEG as compared to expert aEEG interpretation and cEEG were evaluated. Results: Thirteen neonates were monitored for an average duration of 33 hours (range 15-94). Fourteen seizure-like events were detected by clinical observation, and 12 others by bedside aEEG analysis. None of the bedside aEEG events were confirmed as seizures on cEEG. Expert aEEG interpretation had a sensitivity of 13% with 46% specificity for individual seizure detection (not adjusting for patient differences), and a sensitivity of 50% with 46% specificity for detecting patients with seizures. Conclusions: Real-world bedside aEEG monitoring failed to detect the seizures identified on cEEG, while misclassifying other events as seizures. Even post-hoc expert aEEG interpretation provided limited sensitivity and specificity. Considering the poor sensitivity and specificity of bedside aEEG interpretation, combined monitoring may provide limited clinical benefit.
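For readers unfamiliar with the metrics quoted above, the short sketch below shows how sensitivity and specificity are computed against a cEEG gold standard; the counts are arbitrary placeholders, not the study data.

```python
# Sensitivity/specificity of a seizure detector (e.g. bedside aEEG) judged
# against cEEG as the gold standard. Counts below are placeholders only.
def sensitivity_specificity(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    sensitivity = tp / (tp + fn)  # true seizures correctly detected
    specificity = tn / (tn + fp)  # non-seizure events correctly rejected
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=3, fp=5, tn=10, fn=4)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```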
Background: The classic ketogenic diet is the main non-pharmacological treatment for refractory epilepsy; however, adherence is often challenging. The low glycemic index diet (LGID) is less strict, almost equally effective, and associated with improved adherence. Little is known about the quality of life of children treated with the LGID. The objective of this study was to explore changes in the quality of life of children with epilepsy transitioning to the LGID. Methods: Patients on the LGID and their parents completed Pediatric Quality of Life Epilepsy Module questionnaires: one while on the LGID, and one retrospectively for the period prior to starting the LGID. Results: Data were collected from five children aged 3-13 years and their parents. Complete seizure control was seen in two children, >50% seizure reduction in one, and no change in two children. Parent-reported quality of life while on the LGID increased for two participants but decreased in all child self-reports. Conclusions: Although the LGID led to improved seizure control in three out of five patients, the child-reported quality of life decreased in all children. Larger prospective studies are warranted to reliably assess the impact of the LGID on the quality of life in children with epilepsy.
Background: Emergency Department (ED) communication between patients and clinicians is fraught with challenges. A local survey of 65 ED patients revealed low patient satisfaction with ED communication and resultant patient anxiety. Aim Statement: To increase patient satisfaction with ED communication and to decrease patient anxiety related to lack of ED visit information (primary aims), and to decrease clinician-perceived patient interruptions (secondary aim), each by one point on a 5-point Likert scale over a six-month period. Measures & Design: We performed wide stakeholder engagement, surveyed patients and clinicians, and conducted a patient focus group. An inductive analysis followed by a yield-feasibility-effort grid led to three interventions, introduced through sequential and additive Plan-Do-Study-Act (PDSA) cycles. PDSA 1: a clinician communication tool (Acknowledge-Empathize-Inform [AEI] tool), based on survey themes and a literature review, introduced through a multi-modal education approach. PDSA 2: patient information pamphlets developed with stakeholder input. PDSA 3: a new waiting room TV screen with various informational ED-specific videos. Measures were collected through anonymous surveys: the primary aims towards the end of the patient's ED stay, and the secondary aim at the end of the clinician's shift. We used Statistical Process Control (SPC) charts with the usual special cause variation rules. Two-tailed Mann-Whitney tests were used to assess the statistical significance of differences between means (significance: p < 0.05). Evaluation/Results: Over five months, 232 patient and 104 clinician surveys were collected. Wait times, ED processes, timing of typical steps, and directions were reported as the most important communication gaps, and these were addressed in the interventions. Patient satisfaction improved from 3.28 (5 being best, all means; n = 65) to 4.15 (n = 59, p < 0.0001). Patient anxiety improved from 2.96 (1 being best; n = 65) to 2.31 (n = 59, p < 0.01). Clinician-perceived interruptions went from 4.33 (1 being best; n = 30) to 4.18 (n = 11, p = 0.98). SPC charts using Likert scales did not show special cause variation. Discussion/Impact: A sequential, additive approach using pragmatic, low-cost interventions based on both clinician and patient input led to increased patient satisfaction with communication and decreased patient anxiety due to lack of ED visit information after the PDSA cycles. These approaches could easily be replicated in other EDs to improve the patient experience.
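The comparison of survey means described above can be reproduced in outline with a two-tailed Mann-Whitney U test; the scores below are invented placeholders on a 1-5 Likert scale, not the study's survey responses.

```python
# Two-tailed Mann-Whitney U test between baseline and post-intervention
# Likert responses (placeholder data only).
from scipy.stats import mannwhitneyu

baseline = [3, 4, 3, 2, 5, 3, 4, 2, 3, 4]   # e.g. satisfaction before the PDSA cycles
follow_up = [4, 5, 4, 4, 5, 3, 5, 4, 4, 5]  # e.g. satisfaction after the PDSA cycles

u_stat, p_value = mannwhitneyu(baseline, follow_up, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")   # p < 0.05 taken as significant
```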
We provide the first in situ measurements of antenna element beam shapes of the Murchison Widefield Array. Most current processing pipelines use an assumed beam shape, which can cause absolute and relative flux density errors and polarisation ‘leakage’. Understanding the primary beam is therefore of paramount importance, especially for sensitive experiments such as a measurement of the 21-cm line from the epoch of reionisation, where the calibration requirements are so extreme that tile-to-tile beam variations may affect our ability to make a detection. Measuring the primary beam shape from visibilities is challenging, as multiple instrumental, atmospheric, and astrophysical factors contribute to uncertainties in the data. Building on the methods of Neben et al. [Radio Sci., 50, 614], we tap directly into the receiving elements of the telescope before any digitisation or correlation of the signal. Using ORBCOMM satellite passes, we are able to produce all-sky maps for four separate tiles in the XX polarisation. We find good agreement with the beam model of Sokolowski et al. [2017, PASA, 34, e062], and clearly observe the effects of a missing dipole from a tile in one of our beam maps. We end by motivating and outlining additional on-site experiments.
We describe the motivation and design details of the ‘Phase II’ upgrade of the Murchison Widefield Array radio telescope. The expansion doubles to 256 the number of antenna tiles deployed in the array. The new antenna tiles enhance the capabilities of the Murchison Widefield Array in several key science areas. Seventy-two of the new tiles are deployed in a regular configuration near the existing array core. These new tiles enhance the surface brightness sensitivity of the array and will improve the ability of the Murchison Widefield Array to estimate the slope of the Epoch of Reionisation power spectrum by a factor of ∼3.5. The remaining 56 tiles are deployed on long baselines, doubling the maximum baseline of the array and improving the array u, v coverage. The improved imaging capabilities will provide an order of magnitude improvement in the noise floor of Murchison Widefield Array continuum images. The upgrade retains all of the features that have underpinned the Murchison Widefield Array’s success (large field of view, snapshot image quality, and pointing agility) and boosts the scientific potential with enhanced imaging capabilities and by enabling new calibration strategies.
Objective: To describe the modification and validation of an existing instrument, the Environment and Policy Assessment and Observation (EPAO), to better capture provider feeding practices.
Design: Modifications to the EPAO were made; validity was assessed through expert review, and the instrument was pilot tested and then used to collect follow-up data during a two-day home visit within an ongoing cluster-randomized trial. Exploratory factor analysis investigated the underlying factor structure of the feeding practices. To test the predictive validity of the factors, multilevel mixed models examined associations between the factors and children's diet quality as captured by the Healthy Eating Index-2010 (HEI-2010) score (measured via the Dietary Observation in Childcare Protocol); a schematic sketch of this type of analysis follows the abstract.
Setting: Family childcare homes (FCCH) in Rhode Island and North Carolina, USA.
Participants: The modified EPAO was pilot tested with fifty-three FCCH and then used to collect data in 133 FCCH.
Results: The final three-factor solution (‘coercive control and indulgent feeding practices’, ‘autonomy support practices’, ‘negative role modelling’) captured 43 % of total variance. In multilevel mixed models adjusted for covariates, ‘autonomy support practices’ was positively associated with children’s diet quality. A 1-unit increase in the use of ‘autonomy support practices’ was associated with a 9·4-unit increase in child HEI-2010 score (P=0·001).
Conclusions: Similar to the parenting literature, constructs which describe coercive controlling practices and those which describe autonomy-supportive practices emerged. Given that diets of pre-schoolers in the USA remain suboptimal, teaching childcare providers about supportive feeding practices may help improve children’s diet quality.
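The factor-analysis and mixed-model workflow outlined in the preceding abstract could be sketched as follows; the column names (`hei2010`, `home_id`, the item data frame) and the use of scikit-learn and statsmodels are assumptions made for illustration, not the authors' code.

```python
# Schematic EFA + multilevel model analysis: extract three factors from the
# observed feeding-practice items, then relate a provider-level factor score
# to each child's HEI-2010 score with a random intercept per childcare home.
# All variable names are hypothetical.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
import statsmodels.formula.api as smf

def run_analysis(items: pd.DataFrame, children: pd.DataFrame):
    # Step 1: three-factor solution for the feeding-practice items
    fa = FactorAnalysis(n_components=3, random_state=0)
    factor_scores = pd.DataFrame(
        fa.fit_transform(items),
        index=items.index,  # index assumed to be the childcare home id
        columns=["coercive_indulgent", "autonomy_support", "negative_modelling"],
    )
    # Step 2: attach each home's factor scores to its children, then fit a
    # mixed model with a random intercept per family childcare home
    children = children.join(factor_scores, on="home_id")
    model = smf.mixedlm("hei2010 ~ autonomy_support", data=children,
                        groups=children["home_id"])
    return model.fit()
```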
In Cameroon, there is a national programme engaged in the control of schistosomiasis and soil-transmitted helminthiasis. In certain locations, the programme is transitioning from morbidity control towards local interruption of parasite transmission. The volcanic crater lake villages of Barombi Mbo and Barombi Kotto are well-known transmission foci and are excellent context-specific locations to assess appropriate disease control interventions. Most recently they have served as exemplars of expanded access to deworming medications and increased environmental surveillance. In this paper, we review infection dynamics through time, beginning with data from 1953, and comment on the short- and long-term success of disease control. We show how intensification of local control is needed to push towards elimination and that further environmental surveillance, with targeted snail control, is needed to consolidate gains in preventive chemotherapy as well as empower local communities to take ownership of interventions.
Human fascioliasis infection sources are analysed for the first time in light of the new worldwide scenario of this disease. These infection sources include foods, water and combinations of both. Ingestion of freshwater wild plants is the main source, with watercress and, secondarily, other vegetables involved. The problem of vegetables sold in uncontrolled urban markets is discussed. A distinction is made between infection sources involving freshwater cultivated plants, terrestrial wild plants, and terrestrial cultivated plants. The risks posed by traditional local dishes made from sylvatic plants and by raw liver ingestion are considered. Drinking of contaminated water, beverages and juices, ingestion of dishes and soups, and washing of vegetables, fruits, tubers and kitchen utensils with contaminated water are increasingly involved. Three methods to assess infection sources are noted: detection of metacercariae attached to plants or floating in freshwater, anamnesis in individual patients, and questionnaire surveys in endemic areas. The infectivity of metacercariae is reviewed both under field conditions and experimentally under the effects of physicochemical agents. Individual and general preventive measures appear to be more complicated than those considered in the past. The high diversity of infection sources and their heterogeneity in different countries underlie the large epidemiological heterogeneity of human fascioliasis throughout the world.
Outbreaks of Old World cutaneous leishmaniasis (CL) have significantly increased due to the conflicts in the Middle East, with most of the cases occurring in resource-limited areas such as refugee settlements. The standard methods of diagnosis include microscopy and parasite culture, which have several limitations. To address the growing need for a CL diagnostic that is field applicable, we have identified five candidate neoglycoproteins (NGPs): Galα (NGP3B), Galα(1,3)Galα (NGP17B), Galα(1,3)Galβ (NGP9B), Galα(1,6)[Galα(1,2)]Galβ (NGP11B), and Galα(1,3)Galβ(1,4)Glcβ (NGP1B) that are differentially recognized in sera from individuals with Leishmania major infection as compared with sera from heterologous controls. These candidates contain terminal, non-reducing α-galactopyranosyl (α-Gal) residues, which are known to be potent immunogens in humans. Logistic regression models found that NGP3B retained the best diagnostic potential (area under the receiver-operating characteristic curve = 0.8). Our data add to the growing body of work demonstrating the exploitability of the human anti-α-Gal response in CL diagnosis.
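A minimal sketch of the kind of evaluation reported above (logistic regression plus ROC analysis) is given below; the simulated reactivity values and the use of scikit-learn are assumptions for illustration, not the authors' pipeline or data.

```python
# Fit a logistic regression on antibody reactivity to a single NGP and
# summarise its diagnostic performance with the ROC AUC (simulated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
cases = rng.normal(1.0, 1.0, 40)      # simulated reactivity, L. major sera
controls = rng.normal(0.0, 1.0, 40)   # simulated reactivity, heterologous controls
x = np.concatenate([cases, controls]).reshape(-1, 1)
y = np.concatenate([np.ones(40), np.zeros(40)])

clf = LogisticRegression().fit(x, y)
auc = roc_auc_score(y, clf.predict_proba(x)[:, 1])
print(f"ROC AUC = {auc:.2f}")   # an AUC near 0.8 would mirror the reported NGP3B value
```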
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees to the area began to amass. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
The degree of processing of protein-rich feeds affects their physical properties. Seeds which are less comminuted, whether cracked or rolled, may have properties which make their behaviour, in the rumen and postruminally, distinct from that of finely ground material, and which may therefore alter their performance as feed proteins. The use of lupin seeds as a replacement for soya in ruminant diets has been demonstrated (Moss et al., 1997). This project aimed to assess whether the processing of lupin seeds, by either hammer milling or rolling, affected the performance of young cattle fed the seed as their principal source of protein.
Many cases of food-borne illness in the UK are related to the consumption of contaminated meat products. This has highlighted the importance of adopting hygienic procedures throughout the meat production chain, including the farm environment (Pennington, 2000). Many factors are known to affect the hygienic condition of finished cattle (Davies et al., 2000) and various husbandry practices may be used to improve cleanliness at slaughter. Feed withdrawal, for example, may be used to reduce faecal output and improve the visible cleanliness of hides. However, the extent to which this impacts upon microbiological contamination of the hide, and its effects on pathogen levels following transport to the abattoir remain to be determined. This study investigated the interactive effects of feeding a straw-only diet prior to transport and journey time on the microbiological status of cattle faeces and hides.
The ingestion of creep feed by piglets depends upon their displaying suitable exploratory behaviours away from the sow's udder. However, sow's milk has an excellent feed conversion ratio (FCR), and it may be argued that, where the udder is available, exploration away from it is a failure to optimise the use of available food resources. A corollary to this is that the piglets displaying exploratory behaviour away from the sow's udder will be those for which further udder massage shows no net gain, i.e. those already exploiting the maximum potential of the udder and those for whom the udder offers very little benefit. The former group will be characterised by heavier weights and faster growth rates; the latter group will be characterised by lighter weights and slower growth rates. It may be assumed that animals exist on a continuum between these two states. In commercial conditions the latter group is typically fostered off or culled. This project tested the hypothesis that, in healthy litters (thus excluding the latter group), exploratory behaviour away from the udder will be displayed in proportion to the weight of the piglet.
Escherichia coli O157 are zoonotic bacteria for which cattle are an important reservoir. Prevalence estimates for E. coli O157 in British cattle for human consumption are over 10 years old. A new baseline is needed to inform current human health risk. The British E. coli O157 in Cattle Study (BECS) ran between September 2014 and November 2015 on 270 farms across Scotland and England & Wales. This is the first study to be conducted contemporaneously across Great Britain, thus enabling comparison between Scotland and England & Wales. Herd-level prevalence estimates for E. coli O157 did not differ significantly for Scotland (0·236, 95% CI 0·166–0·325) and England & Wales (0·213, 95% CI 0·156–0·283) (P = 0·65). The majority of isolates were verocytotoxin positive. A higher proportion of samples from Scotland were in the super-shedder category, though there was no difference between the surveys in the likelihood of a positive farm having at least one super-shedder sample. E. coli O157 continues to be common in British beef cattle, reaffirming public health policy that contact with cattle and their environments is a potential infection source.
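The herd-level prevalence figures quoted above are proportions with 95% confidence intervals; the sketch below shows that calculation with placeholder counts (the published estimates will additionally reflect the survey design and within-farm sampling, so this is illustrative only).

```python
# Herd-level prevalence with a 95% Wilson confidence interval
# (placeholder counts, not the BECS data).
from statsmodels.stats.proportion import proportion_confint

positive_farms, sampled_farms = 30, 130
prevalence = positive_farms / sampled_farms
low, high = proportion_confint(positive_farms, sampled_farms,
                               alpha=0.05, method="wilson")
print(f"prevalence = {prevalence:.3f} (95% CI {low:.3f}-{high:.3f})")
```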
OBJECTIVES/SPECIFIC AIMS: Sudden death in the young (SDY) occurs in people between 1 and 40 years of age who do not have a known premortem risk factor for early death. Cardiovascular diseases account for the majority of causes of SDY. Sequencing of genes associated with congenital arrhythmia susceptibility and familial cardiomyopathy reveals pathogenic variants in 30% of postmortem cases (often called “molecular autopsy”). However, better data are needed to determine the prevalence of phenotype and genotype abnormalities in surviving relatives. METHODS/STUDY POPULATION: A retrospective cohort study was performed at a tertiary pediatric center including all subjects with a family history of SDY. Cases were identified using ICD-9 codes (798.1 or .9, V17.41, V17.49, V19.8, V61.07), search of cardiology databases, and by recursive identification of all family members of a subject. Phenotype data was independently reviewed by a pediatric cardiologist. Genotype results were available when obtained by the original treating physician. RESULTS/ANTICIPATED RESULTS: Cardiac evaluations were performed in 279 subjects from 175 families, of whom 117 subjects (42%) were first-degree relatives of the proband. Mean age of the subject at time of evaluation was 9 years (SD 5.9). Most probands were over 18 years at the time of SDY: 1–4 years of age (9%); 5–12 (5%); 13–17 (16%); 18–24 (18%); 25–40 (42%). A final diagnosis was determined in 55 families (20%), and a variant in a gene potentially causative of SDY was discovered in 20/55 (36%) of those families. Variants were classified as 50% pathogenic/likely pathogenic, 50% variants of unknown significance. Cardiac testing (ECG, echo, EST, signal averaged ECG, cardiac MRI, or EP study) was abnormal in 124/279 subjects (44%). Among those with abnormal studies, 57/124 (46%) were from a family where a final diagnosis could be determined (LQT 43%, HCM 21%, ARVC 4%, other cardiomyopathy 19%, WPW 5%, CPVT 2%). However, 67/279 of total subjects (24%) had at least 1 abnormal study and a final diagnosis was not determined in the family. DISCUSSION/SIGNIFICANCE OF IMPACT: An abnormal phenotype is common among relatives referred for cardiac evaluation after SDY. While testing identifies a family diagnosis in 20% of families, many patients have abnormal cardiac testing and no clear diagnosis can be made. An improved postmortem protocol for phenotype testing in relatives of a SDY victim and improved postmortem genetic testing may lead to a higher diagnosis rate and improved risk determination in surviving family members.
The thermal decomposition of mill scale, and the effect of mill scale addition on the formation and decomposition of Silico-Ferrite of Calcium and Aluminium (SFCA) and SFCA-I iron ore sinter bonding phases, have been investigated using in situ X-ray diffraction. Application of the external standard method of quantitative phase analysis to the in situ data collected during decomposition of the mill scale highlighted the applicability of this method for determining the nature and abundance of amorphous material in a mineral sample. Increasing the mill scale addition from 2.6 to 10.6 and to 21.2 wt% in an otherwise synthetic sinter mixture designed to form SFCA did not significantly affect the thermal stability ranges of SFCA-I or SFCA, nor did it significantly affect the amounts of SFCA and SFCA-I that formed. This was attributed to the low impurity (i.e. Mn, Mg) concentration in the mill scale, and also to the transformation to hematite, during heating, of the wüstite and magnetite present in the mill scale, with the hematite then available for reaction to form SFCA and SFCA-I.
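The external standard method referred to above converts Rietveld scale factors into absolute weight fractions, so that any mass not assigned to a crystalline phase can be attributed to amorphous material. The sketch below assumes the usual K-factor relation W_j = S_j (ZMV)_j mu*_m / K; all numerical values are placeholders and this is not the study's calibration.

```python
# External-standard (K-factor) quantitative phase analysis: absolute weight
# fraction of phase j from its Rietveld scale factor S_j, its ZMV term
# (formula units x formula mass x unit-cell volume), the sample's mass
# attenuation coefficient, and an instrument constant K determined from a
# standard. All numbers below are placeholders.
def absolute_weight_fraction(scale_factor, zmv, sample_mu_mass, k_instrument):
    return scale_factor * zmv * sample_mu_mass / k_instrument

phases = {
    "hematite":  absolute_weight_fraction(1.2e-6, 9.1e5, 214.0, 4.0e2),
    "magnetite": absolute_weight_fraction(0.4e-6, 1.1e6, 214.0, 4.0e2),
}
amorphous = 1.0 - sum(phases.values())   # mass unaccounted for by crystalline phases
print(phases, f"amorphous fraction = {amorphous:.2f}")
```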
The electrochemical behaviour of a number of Pb-based anode alloys, under simulated electrowinning conditions in a 1.6 M H2SO4 electrolyte at 45 °C, was studied. Specifically, the evolution of PbO2 and PbSO4 surface layers was investigated by quantitative in situ synchrotron X-ray diffraction (S-XRD) and subsequent Rietveld-based quantitative phase analysis (QPA). In the context of seeking new anode alloys, this research shows that the industry-standard Pb-0.08Ca-1.52Sn (wt%) anode, when exposed to a galvanostatic current and intermittent power interruptions, exhibited poor electrochemical performance relative to selected custom Pb-based binary alloys: Pb–0.73Mg, Pb–5.05Ag, Pb–0.07Rh, and Pb–1.4Zn (wt%). The in situ S-XRD measurements and subsequent QPA indicated that this was linked to a lower proportion of β-PbO2, relative to PbSO4, on the Pb-0.08Ca-1.52Sn alloy at all stages of the electrochemical cycling. The best-performing alloy, in terms of minimising the overpotential during normal electrowinning operation and minimising the deleterious effects of repeated power interruptions (both of which are significant factors in energy consumption), was determined to be Pb–0.07Rh.