Fasciola jacksoni is a significant contributor to the ill health and mortality of Asian elephants, particularly those in Sri Lanka. Despite the impact of fascioliasis on elephant populations, it is a neglected veterinary disease with limited taxonomic understanding. Molecular characterization and phylogenetic analysis of F. jacksoni were carried out to evaluate its suggested basal position in the Fasciolidae. Adult worms were collected during post-mortem examination of elephants, and eggs were collected from living elephants in national parks across Sri Lanka. Using the mitochondrial genes NADH dehydrogenase subunit 1 (nad1) and cytochrome oxidase subunit 1 (cox1), and a partial 28S ribosomal DNA (28S rDNA), DNA sequences were generated from the F. jacksoni adult and egg material. Maximum likelihood (ML) phylogenetic analyses did not resolve F. jacksoni as basal to the Fasciolidae. Furthermore, the ML analyses showed that the genus Fasciola was not monophyletic and that F. jacksoni was a sister species to the deer liver fluke Fascioloides magna. A clear framework is required to determine the taxonomic status of F. jacksoni, and the current study provides the first detailed application of molecular techniques to material from multiple hosts across Sri Lanka, with the production of reference DNA sequences for this important parasite.
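The sequence comparisons underlying such phylogenetic analyses start from pairwise distances between aligned sequences. A minimal sketch of the simplest such measure, the p-distance, using toy fragments (hypothetical sequences, not the actual nad1/cox1 data):

```python
def p_distance(seq_a: str, seq_b: str) -> float:
    """Proportion of differing sites between two aligned sequences,
    ignoring alignment gaps ('-')."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != '-' and b != '-']
    if not pairs:
        return 0.0
    return sum(a != b for a, b in pairs) / len(pairs)

# Toy aligned fragments (illustrative only, not real fluke sequences)
fj = "ATGGCTTACCTA"
fm = "ATGGCATACCTT"
print(p_distance(fj, fm))  # 2 mismatches over 12 sites
```

Real analyses use model-corrected distances and full ML tree searches, but this is the raw signal they build on.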
We study the mixing dynamics of solute blobs in the flow through saturated heterogeneous porous media. As the solute plume is advected through a heterogeneous porous medium, it undergoes a series of deformations that determine its mixing with the ambient fluid through diffusion. Key questions concern the relation between spatial disorder and mixing dynamics, and the effect of the initial solute distribution. To address these questions, we formulate the advection–diffusion problem in a coordinate system that moves and rotates along streamlines of the steady flow field. The impact of the medium heterogeneity is quantified systematically within a stochastic modelling approach. For a simple shear flow, the maximum concentration of a blob decays asymptotically as
. For heterogeneous porous media, the mixing of the solute blob is determined by the random sampling of flow and deformation heterogeneity along trajectories, a mechanism different from persistent shear. We derive explicit perturbation theory expressions for stretching-enhanced solute mixing that relate the medium structure and mixing behaviour. The solution is valid for moderate heterogeneity. The random sampling of shear along trajectories leads to a
decay of the maximum concentration as opposed to an equivalent homogeneous medium, for which it decays as
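For context, the contrast between persistent shear and plain diffusion can be made concrete with the classical moment solution for a point release in a simple shear flow $u_x = \gamma y$ with diffusivity $D$ (a textbook calculation, not the authors' derivation, and not necessarily the exact expressions elided above):

```latex
\begin{aligned}
\sigma_{yy}(t) &= 2Dt, \qquad
\sigma_{xy}(t) = \gamma D t^{2}, \qquad
\sigma_{xx}(t) = 2Dt + \tfrac{2}{3}\gamma^{2} D t^{3},\\
\det \Sigma(t) &= \sigma_{xx}\sigma_{yy} - \sigma_{xy}^{2}
              = 4D^{2}t^{2} + \tfrac{1}{3}\gamma^{2}D^{2}t^{4},\\
c_{\max}(t) &\propto \frac{1}{\sqrt{\det \Sigma(t)}}
             \sim t^{-2} \qquad (\gamma t \gg 1),
\end{aligned}
```

compared with $c_{\max} \sim t^{-1}$ for pure diffusion in two dimensions.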
Introduction: The prevalence and incidence of delirium in older patients admitted to acute and long-term care facilities range between 9.6% and 89%, but little is known about incident delirium in the emergency department (ED). Literature regarding the incidence of delirium in the ED and its potential impacts on hospital length of stay (LOS), functional status and unplanned ED readmissions is scant, and its consequences have yet to be clearly identified in order to orient modern acute medical care. Methods: This study is part of the multicenter prospective cohort INDEED study. Three Canadian EDs completed the two-year prospective study (March-July 2015 and Feb-May 2016). Patients aged 65 years and older, initially free of delirium, with an ED stay of 8 hours or more were followed up to 24 hours after ward admission. Patients were assessed twice daily by research assistants (RAs) during their entire ED stay and for up to 24 hours on the hospital ward. The primary outcome of this study was incident delirium in the ED or within 24 hours of ward admission. Functional and cognitive status were assessed using the validated Older Americans' Resources and Services and Telephone Interview for Cognitive Status-modified tools. The Confusion Assessment Method (CAM) was used to detect incident delirium. ED and hospital administrative data were collected. Inter-observer agreement was assessed among RAs. Results: Incidence of delirium did not differ between sites, between phases, or between times from one site to another. Across all phases, between 7% and 11% of patients experienced an ED-related incident delirious episode. Differences were seen in ED LOS between sites in non-delirious patients, and also between some sites for delirious participants (p<0.05). Only one site showed a difference in ED LOS between its delirious and non-delirious patients, respectively 52.1 and 40.1 hours (p<0.05). There was also a difference between sites in the time between arrival at the ED and the onset of delirium (p=0.003). Kappa statistics were computed to measure inter-rater reliability of the CAM. Based on an alpha of 5%, 138 patients would allow 80% power for an estimated overall incidence proportion of 15% with 5% precision. Other predictive delirium variables, such as cognitive status, environmental factors, functional status, comorbidities, physiological status, and ED and hospital length of stay, were similar between sites and phases. Conclusion: The fact that the incidence of delirium was the same across all sites, despite differences in ED LOS and time periods, suggests that many other modifiable and non-modifiable factors influence the incidence of ED-induced delirium. Emergency physicians should concentrate on making the ED environment more senior-friendly.
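The inter-rater reliability reported for the CAM is typically summarised with Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch of the computation (illustrative ratings, not the study data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same subjects:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_expected = sum(
        (rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical delirium-present (1) / absent (0) calls by two RAs
a = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
b = [1, 0, 0, 0, 0, 0, 0, 1, 0, 0]
print(round(cohens_kappa(a, b), 3))  # → 0.737
```

Here 9/10 raw agreement shrinks to kappa ≈ 0.74 once the high base rate of "no delirium" calls is accounted for.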
Introduction: It is documented that physicians and nurses fail to detect delirium in more than half of cases in various clinical settings, which can have serious consequences for seniors and for the health care system. The present study aimed to describe the rate of documented incident delirium by health professionals (HPs) in 5 Canadian emergency departments (EDs). Methods: This study is part of the multicenter prospective cohort INDEED study. Patients aged 65 years and older, initially free of delirium, with an ED stay of 8 hours or more were followed up to 24 hours after ward admission. Delirium status was assessed twice daily by trained research assistants (RAs) using the Confusion Assessment Method (CAM). Patient charts were reviewed to assess detection of delirium by HPs, who had no specific routine for detecting delirious ED patients. Inter-observer agreement was assessed among RAs. Detection by RAs and HPs was compared using univariate analyses. Results: Among the 652 included patients, 66 developed delirium as evaluated by the RAs with the CAM. Of those 66 patients, only 10 cases of delirium (15.2%) were documented in the patient's medical file by an HP; 54 patients (81.8%) with a CAM positive for delirium were not recorded by HPs, and 2 had incomplete charts. The delirium index was significantly higher in the HP-reported group than in the non-reported group, respectively 7.1 and 4.5 (p<0.05). Other predictive delirium variables, such as cognitive status, functional status, comorbidities, physiological status, and ED and hospital length of stay, were similar between groups. Conclusion: Health professionals missed 81.8% of potentially delirious ED patients in comparison with routine structured screening for delirium; they tended to identify the patients with more severe symptoms. Our study points to the need to better identify elders at risk of developing delirium and the need for fast, reliable tools to improve screening for this disorder.
Introduction: Head injury is a common presentation to all emergency departments. Previous research has shown that such injuries may be complicated by delayed intracranial hemorrhage (D-ICH) after the initial scan is negative. Exposure to anticoagulant or anti-platelet (ACAP) medications may be a risk factor for D-ICH. We conducted a systematic review and meta-analysis to determine the incidence of delayed traumatic intracranial hemorrhage in patients taking anticoagulants, anti-platelets or both. Methods: The literature search was conducted in March 2017 with an update in April 2017. Keyword and MeSH terms were used to search OVID Medline, Embase and the Cochrane database, as well as grey literature sources. All cohort and experimental studies were eligible for selection. Inclusion criteria were pre-injury exposure to oral anticoagulant and/or anti-platelet medication and a negative initial CT scan of the brain (CT1). The primary outcome was delayed intracranial hemorrhage present on repeat CT scan (CT2) within 48 hours of presentation. Only patients who were rescanned or, at a minimum, observed were included. Clinically significant D-ICH were those that required neurosurgery, caused death or necessitated a change in management strategy, such as admission. Results: Fifteen primary studies were ultimately identified, comprising a total of 3801 patients, of whom 2111 had a control CT scan. Thirty-nine cases of D-ICH were identified, with the incidence of D-ICH calculated to be 1.31% (95% CI [0.56, 2.27]). No more than 12 of these patients had a clinically significant D-ICH, representing 0.09% (95% CI [0.00, 0.31]); 10 of them were on warfarin and two on aspirin. Three deaths were recorded and three patients needed neurosurgery. Conclusion: The relatively low incidence suggests that repeat CT should not be mandatory for patients without ICH on first CT. This is further supported by the negligibly low rate of clinically significant D-ICH.
Evidence-based assessments should be utilised to indicate the appropriate discharge plan, with further research required to guide the balance between clinical observation and repeat CT.
Influenza epidemics are monitored using influenza-like illness (ILI) data reported by health-care professionals. Timely detection of the onset of epidemics is often performed by applying a statistical method to weekly ILI incidence estimates, with a large range of methods used worldwide. However, performance evaluation and comparison of these algorithms are hindered by: (1) the absence of a gold standard regarding influenza epidemic periods and (2) the absence of consensual evaluation criteria. To date, performance evaluation metrics have been based only on sensitivity, specificity and timeliness of detection, since definitions for time-repeated measurements such as weekly epidemic detection are not clear. We aimed to evaluate several epidemic detection methods by comparing their alerts to a gold standard determined by international expert consensus. We introduced new performance metrics that meet an important objective of influenza surveillance in temperate countries: accurately detecting the start of the single epidemic period each year. Evaluations are presented using ILI incidence in France between 1995 and 2011. We found that the two performance metrics defined allowed discrimination between epidemic detection methods. In the context of performance evaluation, metrics other than the standard ones could better meet the needs of real-time influenza surveillance.
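The alert-versus-gold-standard comparison described can be sketched in a few lines. This is an illustrative sketch with synthetic weekly series and made-up metric names, not the French ILI data or the authors' exact metrics:

```python
def evaluate_alerts(gold, alerts):
    """Compare weekly alert flags to a gold-standard epidemic indicator.

    gold, alerts: lists of 0/1 per week. Returns the detection delay
    (weeks from true epidemic onset to the first alert raised inside the
    epidemic, None if the epidemic was missed) and the number of false
    alarms raised outside the epidemic period."""
    onset = gold.index(1)
    delay = None
    for week, (g, a) in enumerate(zip(gold, alerts)):
        if g and a:
            delay = week - onset
            break
    false_alarms = sum(a and not g for g, a in zip(gold, alerts))
    return delay, false_alarms

# Synthetic season: epidemic spans weeks 3-7; one premature alert
gold   = [0, 0, 0, 1, 1, 1, 1, 1, 0, 0]
alerts = [0, 1, 0, 0, 1, 1, 1, 0, 0, 0]
print(evaluate_alerts(gold, alerts))  # (1, 1): detected 1 week late, 1 false alarm
```

Aggregating such per-season delays and false-alarm counts over many seasons is what lets competing detection algorithms be ranked against a consensus gold standard.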
Central nervous system infections (CNSI) are a leading cause of death and long-term disability in children. Using ICD-10 data from 2005 to 2015 from three central hospitals in Ho Chi Minh City (HCMC), Vietnam, we used generalized additive mixed models (GAMMs) to examine the spatiotemporal distribution and the spatial and climatic risk factors of paediatric CNSI, excluding tuberculous meningitis, in this setting. From 2005 to 2015, there were 9469 cases of paediatric CNSI; 33% were ⩽1 year old at admission and were mainly diagnosed with presumed bacterial CNSI (BI) (79%); the remainder were >1 year old and mainly diagnosed with presumed non-bacterial CNSI (non-BI) (59%). The urban districts of HCMC in proximity to the hospitals, as well as some outer districts, had the highest incidences of BI and non-BI; BI incidence was higher in the dry season. Monthly BI incidence exhibited a significant decreasing trend over the study period. Both BI and non-BI were significantly associated with lags in monthly average temperature, rainfall and river water level. Our findings add new insights into this important group of infections in Vietnam and highlight where resources for the prevention and control of paediatric CNSI should be allocated.
Schizophrenia (SZ) and bipolar disorder (BD) are heritable, polygenic disorders with shared clinical and genetic components, suggesting a psychosis continuum. Cannabis use is a well-documented environmental risk factor in psychotic disorders. In the current study, we investigated the relationship between SZ genetic load and cannabis use before illness onset in SZ and BD spectrums. Since frequent early cannabis use (age <18 years) is believed to increase the risk of developing psychosis more than later use, follow-up analyses were conducted comparing early use to later use and no use.
We assigned a SZ-polygenic risk score (PGRS) to each individual in our independent sample (N = 381 SZ spectrum cases, 220 BD spectrum cases and 415 healthy controls), calculated from the results of the Psychiatric Genomics Consortium (PGC) SZ case–control study (N = 81 535). SZ-PGRS in patients who used cannabis weekly to daily in the period before first illness episode was compared with that of those who never or infrequently used cannabis.
Patients with weekly to daily cannabis use before illness onset had the highest SZ-PGRS (p = 0.02, Cohen's d = 0.33). The largest difference was found between patients with daily or weekly cannabis use before illness onset <18 years of age and patients with no or infrequent use of cannabis (p = 0.003, Cohen's d = 0.42).
Our study supports an association between high SZ-PGRS and frequent cannabis use before illness onset in psychosis continuum disorders.
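A polygenic risk score of the kind used here is, schematically, a count of risk alleles weighted by each variant's effect size from a discovery GWAS. A minimal sketch with made-up SNP dosages and odds ratios (the actual PGC effect sizes are not reproduced here):

```python
import math

def polygenic_risk_score(dosages, odds_ratios):
    """Sum of risk-allele dosages (0, 1 or 2 per SNP) weighted by the
    log odds ratio for each SNP from a discovery GWAS."""
    return sum(d * math.log(orr) for d, orr in zip(dosages, odds_ratios))

# Hypothetical individual genotyped at 4 SNPs, with illustrative ORs
dosages = [2, 1, 0, 1]
odds_ratios = [1.10, 1.05, 1.20, 0.95]
print(round(polygenic_risk_score(dosages, odds_ratios), 4))  # → 0.1881
```

In practice scores are computed over many thousands of SNPs after pruning and p-value thresholding, then standardised before group comparisons such as the cannabis-use contrasts above.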
Deriving glacier outlines from satellite data has become increasingly popular in the past decade. In particular, when glacier outlines are used as a base for change assessment, it is important to know how accurate they are. Calculating the accuracy correctly is challenging, as appropriate reference data (e.g. from higher-resolution sensors) are seldom available. Moreover, after the required manual correction of the raw outlines (e.g. for debris cover), such a comparison would only reveal the accuracy of the analyst rather than of the algorithm applied. Here we compare outlines for clean and debris-covered glaciers, as derived from single and multiple digitizing by different or the same analysts on very high- (1 m) and medium-resolution (30 m) remote-sensing data, against each other and to glacier outlines derived from automated classification of Landsat Thematic Mapper data. Results show a high variability in the interpretation of debris-covered glacier parts, largely independent of the spatial resolution (area differences were up to 30%), and an overall good agreement for clean ice with sufficient contrast to the surrounding terrain (differences ∼5%). The differences of the automatically derived outlines from a reference value are as small as the standard deviation of the manual digitizations from several analysts. Based on these results, we conclude that automated mapping of clean ice is preferable to manual digitization and recommend using the latter method only for required corrections of incorrectly mapped glacier parts (e.g. debris cover, shadow).
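The area differences quoted can be reproduced from digitized outline vertices with nothing more than the shoelace formula. A minimal sketch with toy outlines (illustrative coordinates, not real glacier data):

```python
def polygon_area(vertices):
    """Unsigned area of a simple polygon via the shoelace formula.
    vertices: list of (x, y) in map units, in vertex order."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Two analysts' digitizations of the same (toy) glacier outline
outline_a = [(0, 0), (100, 0), (100, 80), (0, 80)]
outline_b = [(0, 0), (100, 0), (100, 76), (0, 76)]
area_a, area_b = polygon_area(outline_a), polygon_area(outline_b)
print(abs(area_a - area_b) / area_a)  # relative area difference: 0.05
```

Repeating this over many digitizations of the same glacier gives the spread of manual interpretations against which the automated classification is judged.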
In this proceeding, we show how observations of Solar System objects with Gaia can be used to test General Relativity and to constrain modified gravitational theories. The large number of Solar System objects observed, and the variety of their orbital parameters, combined with the impressive astrometric accuracy, will allow us to perform local tests of General Relativity. In this communication, we present a preliminary sensitivity study of the Gaia observations with respect to dynamical parameters such as the Sun's quadrupole moment, and to various extensions of General Relativity such as the parametrized post-Newtonian parameters, the fifth-force formalism and a violation of Lorentz symmetry parametrized within the Standard-Model Extension framework. We take into account the time sequences and the geometry of the observations that are particular to Gaia, for its nominal mission (5 years) and for an extended mission (10 years).
Accurate positional measurements of planets and satellites are used to improve our knowledge of their orbits and dynamics, and to infer the accuracy of the planet and satellite ephemerides. With the arrival of the Gaia-DR1 reference star catalog, and its complete release to follow, the formal accuracy of ground-based astrometric methods is no longer limited by the reference catalog used. Systematic and zonal errors of the reference stars are eliminated, and the astrometric process itself now dominates the error budget.
We present a set of algorithms for computing the apparent directions of planets, satellites and stars on any date to micro-arcsecond precision. The expressions are consistent with the ICRS reference system and define the transformation between theoretical reference data and ground-based astrometric observables.
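At the micro-arcsecond level, one of the dominant relativistic terms in such apparent-direction algorithms is gravitational light deflection by the Sun. Its leading-order size can be checked in a few lines (standard constants; grazing-incidence case):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
C = 2.998e8          # speed of light, m/s
R_SUN = 6.957e8      # solar radius, m

def deflection_arcsec(impact_parameter_m):
    """Leading-order GR light-deflection angle 4GM/(c^2 b), in arcsec."""
    angle_rad = 4 * G * M_SUN / (C**2 * impact_parameter_m)
    return math.degrees(angle_rad) * 3600

print(round(deflection_arcsec(R_SUN), 2))  # ≈ 1.75 arcsec at the solar limb
```

The deflection falls off as 1/b with impact parameter, so even light arriving at 90 degrees from the Sun is still bent by a few milliarcseconds, which is why the effect must be modelled for all observations, not just near-Sun ones.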
Because of its exceptional resolving power, Gaia should detect a few thousand gravitationally lensed systems, consisting of multiple images of background quasars. The estimated number of lensing phenomena in the sky, however, depends on the cosmological model considered. By taking into account the observational biases that will restrict the detection of lensed quasars, identifying them up to a given limiting magnitude will constrain the cosmological parameters.
We have investigated the known gravitationally lensed quasars present in the Gaia DR1 and found that a significant number of components of these systems have been measured and are present in the Gaia DR1 catalogue, although almost none of them have all their components detected. We additionally examined the immediate surroundings of QSOs from the large quasar catalogue LQAC3 and detected several configurations compatible with gravitational lensing phenomena. A more global strategy for systematically detecting potential candidates in the various releases of the Gaia catalogue is presented.
A general theory for predicting the distribution of scalar gradients (or concentration differences) in heterogeneous flows is proposed. The evolution of scalar fields is quantified from the analysis of the evolution of elementary lamellar structures, which naturally form under the stretching action of the flows. Spatial correlations in scalar fields, and concentration gradients, hence develop through diffusive aggregation of stretched lamellae. Concentration levels at neighbouring spatial locations result from a history of lamella aggregation, which is partly common to the two locations. Concentration differences eliminate this common part, and thus depend only on lamellae that have aggregated independently. Using this principle, we propose a theory which envisions concentration increments as the result of a deconstruction of the basic lamella assemblage. This framework provides analytical expressions for concentration increment probability density functions (PDFs) over any spatial increments for a range of flow systems, including turbulent flows and low-Reynolds-number porous media flows, for confined and dispersing mixtures. Through this deconstruction principle, scalar increment distributions reveal the elementary stretching and aggregation mechanisms building scalar fields.
Limiting the post-weaning intake of the young rabbit is known to improve its resistance to digestive disorders, whereas a degradation of its housing hygiene is assumed to have a negative impact on its health. This study aimed to provide insights into the mechanisms of digestive health preservation regarding both the host (growth and immune response) and its symbiotic digestive microbiota. A 2×2 factorial design from weaning (day 28) to day 64 was set up: ad libitum intake or restricted intake at 70% of ad libitum, and high v. low hygiene of housing (n=105 per group). At day 36 and day 45, 15 animals/group were subcutaneously immunized with ovalbumin (OVA) to assess their specific immune response. Blood was sampled at 36, 45, 57 and 64 days of age to determine total and anti-OVA immunoglobulin G (IgG) and haptoglobin levels. The cecal bacterial community was explored (18 per group) by 454 pyrosequencing of the 16S ribosomal RNA gene, whereas cecal pH, NH3 and volatile fatty acid (VFA) concentrations were measured to characterize fermentative activity. A 30% reduction in feed intake reduced growth by only 17% (P<0.001) and improved the feed conversion ratio by 15% (P<0.001), whereas the degradation of hygiene conditions slightly decreased feed intake in ad libitum-fed rabbits (−3.5%, P<0.02). As poor hygiene conditions did not affect weight gain, feed conversion was improved from day 42 (P<0.05). Restricted feeding led to lower mortality between day 28 and day 40 (P=0.047), whereas degraded hygiene conditions decreased overall morbidity (7.8% v. 16.6%; P<0.01). Both a reduced intake and low hygiene conditions of housing affected microbiota composition, especially dominant genera belonging to the Ruminococcaceae family (P<0.01). Moreover, low hygiene was associated with a higher Ruminococcaceae/Lachnospiraceae ratio (3.7 v. 2.4; P<0.05).
Cecal total VFA concentration and pH were respectively increased (+19%; P<0.001) and decreased (−0.1 pH unit; P<0.05) in feed-restricted rabbits. Neither specific anti-OVA IgG nor haptoglobin was affected by the treatments. Total IgG concentrations were highest in animals raised in poor hygiene conditions after 8 days of restriction, but decreased after 19 days of restriction in high hygiene conditions (−2.15%; P<0.05). In conclusion, the degradation of hygiene conditions failed to induce a systemic specific and inflammatory response in rabbits, but instead reduced morbidity. Our results suggest that microbiota composition could be a helpful source of biomarkers of digestive health.
The VIMOS VLT Deep Survey (VVDS) is underway to study the evolution of galaxies, large-scale structures and AGNs from the measurement of more than 100 000 spectra of faint objects. We present here the results from the first-epoch observations of more than 20 000 spectra. The main challenge of the program, the redshift measurements, is described, in particular entering the “redshift desert” in the range 1.5 < z < 3, for which only very weak features are detected in the observed wavelength range. The redshift distribution of a magnitude-limited sample brighter than I_AB = 24 is presented for the first time, showing a peak at a low redshift z ∼ 0.7 and a tail extending all the way above z = 4. The evolution of the luminosity function out to z = 1.5 is presented, with the LF of blue star-forming galaxies carrying most of the evolution, and L* changing by more than two magnitudes for this sub-sample.
Gamma-ray burst host galaxies are deficient in molecular gas, and show anomalous metal-poor regions close to GRB positions. Using recent Australia Telescope Compact Array (ATCA) H I observations, we show that they have substantial atomic gas reservoirs. This suggests that star formation in these galaxies may be fuelled by a recent inflow of metal-poor atomic gas. While this process is debated, it can happen in low-metallicity gas near the onset of star formation because gas cooling (necessary for star formation) is faster than the H I-to-H2 conversion.
Epidemiological studies have identified increased colorectal cancer (CRC) risk with high red meat (HRM) intakes, whereas dietary fibre intake appears to be protective. In the present study, we examined whether a HRM diet increased rectal O6-methyl-2′-deoxyguanosine (O6MeG) adduct levels in healthy human subjects, and whether butyrylated high-amylose maize starch (HAMSB) was protective. A group of twenty-three individuals consumed 300 g/d of cooked red meat without (HRM diet) or with 40 g/d of HAMSB (HRM+HAMSB diet) over 4-week periods separated by a 4-week washout in a randomised cross-over design. Stool and rectal biopsy samples were collected for biochemical, microbial and immunohistochemical analyses at baseline and at the end of each 4-week intervention period. The HRM diet increased rectal O6MeG adducts relative to its baseline by 21 % (P< 0·01), whereas the addition of HAMSB to the HRM diet prevented this increase. Epithelial proliferation increased with both the HRM (P< 0·001) and HRM+HAMSB (P< 0·05) diets when compared with their respective baseline levels, but was lower following the HRM+HAMSB diet compared with the HRM diet (P< 0·05). Relative to its baseline, the HRM+HAMSB diet increased the excretion of SCFA by over 20 % (P< 0·05) and increased the absolute abundances of the Clostridium coccoides group (P< 0·05), the Clostridium leptum group (P< 0·05), Lactobacillus spp. (P< 0·01), Parabacteroides distasonis (P< 0·001) and Ruminococcus bromii (P< 0·05), but lowered Ruminococcus torques (P< 0·05) and the proportions of Ruminococcus gnavus, Ruminococcus torques and Escherichia coli (P< 0·01). HRM consumption could increase the risk of CRC through increased formation of colorectal epithelial O6MeG adducts. HAMSB consumption prevented red meat-induced adduct formation, which may be associated with increased stool SCFA levels and/or changes in the microbiota composition.
The objective of the Apollon project is the generation of 10 PW peak power pulses of 15 fs at 1 shot/minute. In this paper the Apollon facility design, the technological challenges and the current progress of the project will be presented.
Antarctic and Southern Ocean science is vital to understanding natural variability, the processes that govern global change and the role of humans in the Earth and climate system. The potential for new knowledge to be gained from future Antarctic science is substantial. Therefore, the international Antarctic community came together to ‘scan the horizon’ to identify the highest priority scientific questions that researchers should aspire to answer in the next two decades and beyond. Wide consultation was a fundamental principle for the development of a collective, international view of the most important future directions in Antarctic science. From the many possibilities, the horizon scan identified 80 key scientific questions through structured debate, discussion, revision and voting. Questions were clustered into seven topics: i) Antarctic atmosphere and global connections, ii) Southern Ocean and sea ice in a warming world, iii) ice sheet and sea level, iv) the dynamic Earth, v) life on the precipice, vi) near-Earth space and beyond, and vii) human presence in Antarctica. Answering the questions identified by the horizon scan will require innovative experimental designs, novel applications of technology, invention of next-generation field and laboratory approaches, and expanded observing systems and networks. Unbiased, non-contaminating procedures will be required to retrieve the requisite air, biota, sediment, rock, ice and water samples. Sustained year-round access to Antarctica and the Southern Ocean will be essential to increase winter-time measurements. Improved models are needed that represent Antarctica and the Southern Ocean in the Earth System, and provide predictions at spatial and temporal resolutions useful for decision making. A co-ordinated portfolio of cross-disciplinary science, based on new models of international collaboration, will be essential as no scientist, programme or nation can realize these aspirations alone.