The study aimed to assess stunting, wasting and breast-feeding as correlates of body composition in Cambodian children. As part of a nutrition trial (ISRCTN19918531), fat mass (FM) and fat-free mass (FFM) were measured using 2H dilution at 6 and 15 months of age. Of 419 infants enrolled, 98 % were breastfed, 15 % stunted and 4 % wasted at 6 months. At 15 months, 78 % were breastfed, 24 % stunted and 11 % wasted. Those not breastfed had a lower fat mass index (FMI) at 6 months but not at 15 months. Stunted children had lower FM at 6 months and lower FFM at 6 and 15 months compared with children with length-for-age z ≥0. Stunting was not associated with the height-adjusted indices FMI or fat-free mass index (FFMI). Wasted children had lower FM, FFM, FMI and FFMI at 6 and 15 months compared with children with weight-for-length z (WLZ) ≥0. Generally, FFM and FFMI deficits increased with age, whereas FM and FMI deficits decreased, reflecting interactions between age and WLZ. For example, the FFM deficits were –0·99 (95 % CI –1·26, –0·72) kg at 6 months and –1·44 (95 % CI –1·69, –1·19) kg at 15 months (interaction, P<0·05), while the FMI deficits were –2·12 (95 % CI –2·53, –1·72) kg/m2 at 6 months and –1·32 (95 % CI –1·77, –0·87) kg/m2 at 15 months (interaction, P<0·05). This indicates that undernourished children preserve body fat at the expense of fat-free tissue, which may have long-term consequences for health and working capacity.
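The height-adjusted indices referenced in the abstract are simple ratios, analogous to BMI. A minimal sketch of the calculation, using illustrative values rather than study data:

```python
# Height-adjusted body-composition indices: mass (kg) divided by
# length (m) squared. Input values below are hypothetical, not
# taken from the study.

def fat_mass_index(fm_kg: float, length_m: float) -> float:
    """Fat mass index (FMI) in kg/m^2."""
    return fm_kg / length_m ** 2

def fat_free_mass_index(ffm_kg: float, length_m: float) -> float:
    """Fat-free mass index (FFMI) in kg/m^2."""
    return ffm_kg / length_m ** 2

# Hypothetical infant: FM 2.1 kg, FFM 6.3 kg, length 0.72 m
print(round(fat_mass_index(2.1, 0.72), 2))       # 4.05
print(round(fat_free_mass_index(6.3, 0.72), 2))  # 12.15
```

Dividing by length squared removes the mechanical dependence of FM and FFM on body size, which is why stunting can be associated with FM/FFM deficits but not with FMI/FFMI.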
Given the challenges in accurately identifying unexposed controls in case–control studies of diarrhoea, we examined diarrhoea incidence, subclinical enteric infections and growth stunting within a reference population in the Global Enteric Multicenter Study, Kenya site. Within ‘control’ children (0–59 months old without diarrhoea in the 7 days before enrolment, n = 2384), we examined surveys at enrolment and 60-day follow-up, stool at enrolment and a 14-day post-enrolment memory aid for diarrhoea incidence. At enrolment, 19% of controls had ⩾1 enteric pathogen associated with moderate-to-severe diarrhoea (‘MSD pathogens’) in stool; following enrolment, many reported diarrhoea (27% in 7 days, 39% in 14 days). Controls with and without reported diarrhoea had similar carriage of MSD pathogens at enrolment; however, controls reporting diarrhoea were more likely to report visiting a health facility for diarrhoea (27% vs. 7%) or fever (23% vs. 16%) at follow-up than controls without diarrhoea. Odds of stunting differed by both MSD and ‘any’ (including non-MSD pathogens) enteric pathogen carriage, but not diarrhoea, suggesting control classification may warrant modification when assessing long-term outcomes. High diarrhoea incidence following enrolment and prevalent carriage of enteric pathogens have implications for sequelae associated with subclinical enteric infections and for design and interpretation of case–control studies examining diarrhoea.
Emerging adulthood is a peak period of risk for alcohol and illicit drug use. Recent advances in psychiatric genetics suggest that the co-occurrence of substance use and psychopathology arises, in part, from a shared genetic etiology. We sought to extend this research by investigating the influence of genetic risk for schizophrenia on trajectories of four substance use behaviors as they occurred across emerging adulthood.
Young adult participants of non-Hispanic European descent provided DNA samples and completed daily reports of substance use for 1 month per year across 4 years (N = 30 085 observations of N = 342 participants). A schizophrenia polygenic score was included in two-level hierarchical linear models designed to test associations between genetic risk for schizophrenia, participant age, and four substance use phenotypes.
Participants with a greater schizophrenia polygenic score experienced greater age-related increases in the likelihood of using substances across emerging adulthood (p < 0.005). Additionally, our results suggest that the polygenic score was positively associated with participants’ overall likelihood to engage in illicit drug use but not alcohol-related substance use.
This study used a novel combination of polygenic prediction and intensive longitudinal methods to characterize the influence of genetic risk for schizophrenia on patterns of age-related change in substance use across emerging adulthood. Results suggest that genetic risk for schizophrenia has developmentally specific effects on substance use behaviors in a non-clinical population of young adults.
Toca 511 (vocimagene amiretrorepvec) is an investigational, conditionally lytic, retroviral replicating vector (RRV). RRVs selectively infect cancer cells because innate and adaptive immune response defects in cancers allow virus replication, and because virus integration into the genome requires cell division. Toca 511 spreads through tumors, stably delivering an optimized yeast cytosine deaminase gene that converts the prodrug Toca FC (an investigational, extended-release formulation of 5-FC) into 5-FU within the tumor microenvironment. 5-FU kills infected dividing cancer cells as well as surrounding tumor cells, myeloid-derived suppressor cells, and tumor-associated macrophages, resulting in long-term tumor immunity in preclinical models. Data from a Phase 1 resection trial showed six durable complete responses (CRs) and extended median overall survival (mOS) compared with historical controls. The FDA granted Breakthrough Therapy Designation for Toca 511 & Toca FC in the treatment of patients with recurrent high-grade glioma (rHGG). Toca 5 is an international, randomized, open-label Phase 3 trial (NCT02414165) of Toca 511 & Toca FC versus standard of care (SOC) in patients undergoing resection for a first or second recurrence of rHGG. Patients will be stratified by IDH1 status, Karnofsky performance status (KPS), and geographic region. The primary endpoint is overall survival (OS); secondary endpoints are durable response rate, durable clinical benefit rate, duration of durable response, and 12-month survival rate. Key inclusion criteria are histologically proven glioblastoma (GBM) or anaplastic astrocytoma (AA), tumor size ≥1 cm and ≤5 cm, and KPS ≥70. Immune monitoring and molecular profiling will be performed. Approximately 380 patients will be randomized. An independent data monitoring committee (IDMC) has been commissioned to review the safety and efficacy data, including two interim analyses. Enrollment is ongoing.
Introduction: Patients with heart failure (HF) experience frequent decompensation, necessitating multiple emergency department (ED) visits and hospitalizations. If patients are able to receive timely interventions and optimize self-management, recurrent ED visits may be reduced. In this feasibility study, we piloted the application of home telemonitoring to support the discharge of HF patients from hospital to home. We hypothesized that TEC4Home would decrease ED revisits and hospital admissions and improve patient health outcomes. Methods: Upon discharge from the ED or hospital, patients with HF received a blood pressure cuff, weight scale, pulse oximeter, and a touchscreen tablet. Participants submitted measurements and answered questions on the tablet about their HF symptoms daily for 60 days. Data were reviewed by a monitoring nurse. From November 2016 to July 2017, 69 participants were recruited from Vancouver General Hospital (VGH), St. Paul's Hospital (SPH) and Kelowna General Hospital (KGH). Participants completed pre-surveys at enrolment and post-surveys 30 days after monitoring finished. Administrative data related to ED visits and hospital admissions were reviewed. Interviews were conducted with the monitoring nurses to assess the impact of monitoring on patient health outcomes. Results: A preliminary analysis was conducted on a subsample of participants (n=22) enrolled across all 3 sites by March 31, 2017. At VGH and SPH (n=14), 25% fewer patients required an ED visit in the post-survey reporting period compared with the pre-survey period. During the monitoring period, the monitoring nurse observed seven ED visits that were likely avoided due to early intervention. In total, admissions were reduced by 20% and total hospital length of stay was reduced by 69%. At KGH (n=8), 43% fewer patients required an ED visit in the post-survey reporting period compared with the pre-survey period. Hospital admissions were reduced by 20% and total hospital length of stay was reduced by 50%.
Overall, TEC4Home participants from all sites showed a significant improvement in health-related quality of life and in self-care behaviour pre- to 90 days post-monitoring. A full analysis of the 69 patients will be complete in February 2018. Conclusion: Preliminary findings indicate that home telemonitoring for HF patients can decrease ED revisits and improve patient experience. The length of stay data may also suggest the potential for early discharge of ED patients with home telemonitoring to avoid or reduce hospitalization. A stepped-wedge randomized controlled trial of TEC4Home in 22 BC communities will be conducted in 2018 to generate evidence and scale up the service in urban, regional and rural communities. This work is submitted on behalf of the TEC4Home Healthcare Innovation Community.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
The overall objective of our work is to assess the relative contributions of plant enzymes and rumen microbes to rumen degradation of freshly-ingested herbage. In situ techniques have been used extensively to compare rumen degradation characteristics of feeds, though there remain technical problems associated with microbial contamination of residues after incubation. We hypothesised that techniques to study microbial contamination might also provide insights into microbial colonisation. Our earlier studies (Lee et al., 1999) identified distinctive odd-chain fatty acids that could be used as microbial markers. A dacron bag study was conducted to examine the influence of dacron bag rinsing techniques on DM disappearance and microbial contamination in residues from fresh grass, assessed using odd-chain fatty acids as markers.
We compared the impact of a commercial chlorination product (brand name Air RahMat) in stored drinking water to traditional boiling practices in Indonesia. We conducted a baseline survey of all households with children <5 years in four communities, made 11 subsequent weekly home visits to assess acceptability and use of water treatment methods, measured Escherichia coli concentration in stored water, and determined diarrhoea prevalence among children <5 years. Of 281 households surveyed, boiling (83%) and Air RahMat (7%) were the principal water treatment methods. Multivariable log-binomial regression analyses showed lower risk of E. coli in stored water treated with Air RahMat than boiling (risk ratio (RR) 0·75, 95% confidence interval (CI) 0·56–1·00). The risk of diarrhoea in children <5 years was lower among households using Air RahMat (RR 0·43, 95% CI 0·19–0·97) than boiling, and higher in households with E. coli concentrations of 1–1000 MPN/100 ml (RR 1·54, 95% CI 1·04–2·28) or >1000 MPN/100 ml (RR 1·86, 95% CI 1·09–3·19) in stored water than in households without detectable E. coli. Although results suggested that Air RahMat water treatment was associated with lower E. coli contamination and diarrhoeal rates among children <5 years than water treatment by boiling, Air RahMat use remained low.
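The risk ratios above came from multivariable log-binomial regression; for intuition, the crude (unadjusted) version of the same quantity is just a ratio of two proportions. A sketch with hypothetical counts, not the study's data:

```python
# Crude risk ratio (RR) and 95% CI from 2x2 counts. The study used
# multivariable log-binomial regression; this shows only the
# unadjusted calculation, with made-up counts.
import math

def risk_ratio(cases_exp: int, n_exp: int,
               cases_unexp: int, n_unexp: int) -> float:
    return (cases_exp / n_exp) / (cases_unexp / n_unexp)

def risk_ratio_ci95(cases_exp, n_exp, cases_unexp, n_unexp):
    rr = risk_ratio(cases_exp, n_exp, cases_unexp, n_unexp)
    # Standard error of log(RR) via the delta method
    se = math.sqrt(1 / cases_exp - 1 / n_exp
                   + 1 / cases_unexp - 1 / n_unexp)
    return rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

# Hypothetical: 15/100 treated vs 20/100 untreated stored-water samples
print(round(risk_ratio(15, 100, 20, 100), 2))  # 0.75
```

A CI whose upper bound touches 1·00, as in the E. coli result above, indicates a borderline-significant protective association.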
We sought to comprehensively assess the prevalence and outcomes of complications associated with Staphylococcus aureus bacteremia (SAB) in children. Secondarily, the prevalence of methicillin resistance and outcomes of complications from methicillin-resistant S. aureus (MRSA) vs. methicillin-susceptible S. aureus SAB were assessed. This is a single-center cross-sectional study of 376 patients ⩽18 years old with SAB in 1990–2014. Overall, 197 (52%) patients experienced complications, the most common being osteomyelitis (33%), skin and soft tissue infection (31%), and pneumonia (25%). Patients with complications were older (median 3 vs. 0·7 years, P = 0·05) and more had community-associated SAB (66% vs. 34%, P = 0·001). Fewer patients with complications had a SAB-related emergency department or hospital readmission (10% vs. 19%, P = 0·014). The prevalence of methicillin resistance increased from 1990–1999 to 2000–2009, but decreased in 2010–2014. Complicated MRSA bacteremia resulted in more intensive care unit admissions (66% vs. 47%, P = 0·03) and an increased likelihood of having ⩾2 foci (58% vs. 26%, P < 0·001). In multivariate analysis, community-associated SAB increased the risk of complications (odds ratio (OR) 1·82 (1·1–3·02), P = 0·021), whereas concurrent infections decreased it (OR 0·58 (0·34–0·97), P = 0·038). In conclusion, children with SAB should be carefully evaluated for complications. Methicillin resistance remains associated with poor outcomes but has decreased in overall prevalence.
Holstein-Friesian steer beef production is renowned globally as a secondary product of the milk industry. Grass feeding is a common practice in raising Holstein steers because of its low cost. Furthermore, grass feeding is an alternative way to produce beef with a balanced n-6 to n-3 fatty acid (FA) ratio. However, the performance and meat quality of Holstein-Friesian cattle are more likely to depend on a high-quality diet. The aim of this study was to determine whether two mixed diets, a corn-based total mixed ration (TMR) with winter ryegrass (Lolium perenne) or flaxseed oil-supplemented pellets with reed canary grass haylage (n-3 mix), provided benefits in carcass weight, meat quality and FA composition compared with feeding reed canary grass (Phalaris arundinacea) haylage alone. In all, fifteen 21-month-old Holstein-Friesian steers were randomly assigned to three group pens, allowed free access to water and fed the different experimental diets for 150 days. Blood samples were taken a week before slaughter. Carcass weight and meat quality were evaluated after slaughter. Plasma lipid levels and aspartate aminotransferase (AST), γ-glutamyl transpeptidase (GGT), creatine kinase (CK) and alkaline phosphatase (ALP) activities were determined. Diet did not affect plasma triglyceride levels or GGT activity. Plasma cholesterol levels, including low-density and high-density lipoproteins, were higher in both mixed-diet groups than in the haylage group. The highest activities of plasma AST, CK and ALP were observed in the haylage group, followed by the n-3 mix and TMR groups, respectively. Carcass weight was lower in the haylage group than in the other groups, and no differences were found between the TMR and n-3 mix groups. Although the n-3 mix-fed and haylage-fed beef provided a lower n-6 to n-3 FA ratio than TMR-fed beef, the roasted beef obtained from the TMR group was more acceptable, with better overall meat physicochemical properties and sensory scores.
According to daily cost, carcass weight and n-6 to n-3 FAs ratio, the finishing diet containing flaxseed oil-supplemented pellets and reed canary grass haylage at the as-fed ratio of 40 : 60 could be beneficial for the production of n-3-enriched beef.
Forest carbon sequestration plays an important role in reducing the build-up of greenhouse gases that are known to contribute to global climate change. However, private landowners will supply less carbon sequestration than would be socially desirable if they are unable to capture the economic value of sequestration. We examine the viability of offering landowners property tax subsidies for forest carbon sequestration (referred to as a ‘tax-based subsidy approach’). Waiving property taxes on forestland provides incentives for landowners to afforest non-forested land and/or sustain forests that are at risk of deforestation. We focus on 17 Tennessee counties and one Kentucky county, constituting one of 179 Bureau of Economic Analysis areas in the United States, as a case study. Higher forestland net return from waiving property taxes increases the share of forestland in the 18 counties, which in turn increases the accumulation of carbon in the forest ecosystem, suggesting that this is a viable approach. The annualized county-level cost of supplying forest carbon sequestration using a tax-based subsidy ranges between US$15.56 and US$563.58 per carbon tonne across the 18 counties. Relevant government agencies can use these estimates to target selected counties for more cost-effective adoption of the county-level tax-based subsidy approach.
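The cost metric reported above is a straightforward ratio: annualized property-tax revenue forgone divided by the additional carbon sequestered. A minimal sketch with hypothetical county numbers, not the study's estimates:

```python
# Annualized cost per tonne of carbon under a property-tax waiver:
# tax revenue forgone / additional carbon sequestered.
# Both inputs below are hypothetical.

def cost_per_tonne(annual_tax_forgone_usd: float,
                   annual_carbon_tonnes: float) -> float:
    """USD per tonne of carbon sequestered per year."""
    return annual_tax_forgone_usd / annual_carbon_tonnes

# Hypothetical county: $1.5M of waived taxes, 40,000 t of added carbon
print(cost_per_tonne(1_500_000, 40_000))  # 37.5
```

The wide range reported (US$15.56 to US$563.58 per tonne) reflects how much both terms of this ratio vary across counties.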
Universal screening for postpartum depression is recommended in many countries. Knowledge of whether the disclosure of depressive symptoms in the postpartum period differs across cultures could improve detection and provide new insights into the pathogenesis. Moreover, it is a necessary step to evaluate the universal use of screening instruments in research and clinical practice. In the current study we sought to assess whether the Edinburgh Postnatal Depression Scale (EPDS), the most widely used screening tool for postpartum depression, measures the same underlying construct across cultural groups in a large international dataset.
Ordinal regression and measurement invariance were used to explore the association between culture, operationalized as education, ethnicity/race and continent, and endorsement of depressive symptoms using the EPDS on 8209 new mothers from Europe and the USA.
Education, but not ethnicity/race, influenced the reporting of postpartum depression [difference between robust comparative fit indexes (∆*CFI) < 0.01]. The structure of EPDS responses significantly differed between Europe and the USA (∆*CFI > 0.01), but not between European countries (∆*CFI < 0.01).
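The ∆CFI cutoffs above follow the conventional rule that a drop of 0.01 or more in the comparative fit index between nested models signals non-invariance. A sketch of that decision rule (CFI values hypothetical):

```python
# Delta-CFI decision rule for measurement invariance: invariance is
# retained when the fit index drops by less than the threshold
# (conventionally 0.01) from the configural to the constrained
# model. CFI values below are hypothetical.

def measurement_invariant(cfi_configural: float,
                          cfi_constrained: float,
                          threshold: float = 0.01) -> bool:
    """True if the CFI drop stays below the non-invariance cutoff."""
    return (cfi_configural - cfi_constrained) < threshold

print(measurement_invariant(0.975, 0.971))  # True  (dCFI = 0.004)
print(measurement_invariant(0.975, 0.960))  # False (dCFI = 0.015)
```

Under this rule, ∆*CFI < 0.01 (as between European countries) means the EPDS behaves equivalently across groups, while ∆*CFI > 0.01 (Europe vs. the USA) means it does not.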
Investigators and clinicians should be aware of the potential differences in expression of phenotype of postpartum depression that women of different educational backgrounds may manifest. The increasing cultural heterogeneity of societies together with the tendency towards globalization requires a culturally sensitive approach to patients, research and policies, that takes into account, beyond rhetoric, the context of a person's experiences and the context in which the research is conducted.
The Antarctic Roadmap Challenges (ARC) project identified critical requirements to deliver high priority Antarctic research in the 21st century. The ARC project addressed the challenges of enabling technologies, facilitating access, providing logistics and infrastructure, and capitalizing on international co-operation. Technological requirements include: i) innovative automated in situ observing systems, sensors and interoperable platforms (including power demands), ii) realistic and holistic numerical models, iii) enhanced remote sensing and sensors, iv) expanded sample collection and retrieval technologies, and v) greater cyber-infrastructure to process ‘big data’ collection, transmission and analyses while promoting data accessibility. These technologies must be widely available, performance and reliability must be improved and technologies used elsewhere must be applied to the Antarctic. Considerable Antarctic research is field-based, making access to vital geographical targets essential. Future research will require continent- and ocean-wide environmentally responsible access to coastal and interior Antarctica and the Southern Ocean. Year-round access is indispensable. The cost of future Antarctic science is great but there are opportunities for all to participate commensurate with national resources, expertise and interests. The scope of future Antarctic research will necessitate enhanced and inventive interdisciplinary and international collaborations. The full promise of Antarctic science will only be realized if nations act together.
A new approach is proposed to analyze Bremsstrahlung X-rays that are emitted from laser-produced plasmas (LPP) and are measured by a stack type spectrometer. This new method is based on a spectral tomographic reconstruction concept with the variational principle for optimization, without referring to the electron energy distribution of a plasma. This approach is applied to the analysis of some experimental data obtained at a few major laser facilities to demonstrate the applicability of the method. Slope temperatures of X-rays from LPP are determined with a two-temperature model, showing different spectral characteristics of X-rays depending on laser properties used in the experiments.
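A common form for the two-temperature slope model mentioned above is a sum of two exponentials in photon energy; the sketch below assumes that form, with hypothetical amplitudes and temperatures rather than values from the experiments:

```python
# Two-temperature bremsstrahlung slope model (assumed form):
#   I(E) = A1*exp(-E/T1) + A2*exp(-E/T2)
# with E and the slope temperatures T1, T2 in keV. All parameter
# values here are hypothetical.
import math

def two_temp_intensity(e_kev: float, a1: float, t1_kev: float,
                       a2: float, t2_kev: float) -> float:
    return (a1 * math.exp(-e_kev / t1_kev)
            + a2 * math.exp(-e_kev / t2_kev))

# At E = 0 the model reduces to A1 + A2
print(two_temp_intensity(0.0, 1.0, 10.0, 2.0, 100.0))  # 3.0
```

On a log-intensity plot, each exponential appears as a straight line whose slope gives the corresponding temperature, which is why the fitted T1 and T2 are called slope temperatures.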
The Coma cluster is a rich cluster of galaxies nested in an even larger supercluster of galaxies. The plane of the supercluster appears to be defined by the Coma cluster itself and another galaxy cluster, Abell 1367, that lies about 40 Mpc (H0 = 75 km s−1 Mpc−1 (≡h75)) farther west (Tifft and Gregory 1976).
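Distances like the 40 Mpc quoted above scale inversely with the adopted Hubble constant via the Hubble law, d = v / H0. A minimal sketch using the H0 = 75 adopted in the text and an approximate recession velocity for Coma:

```python
# Hubble-law distance d = v / H0, with H0 = 75 km/s/Mpc as adopted
# in the text. The recession velocity is only an approximate value
# for the Coma cluster.

H0 = 75.0  # km s^-1 Mpc^-1

def hubble_distance_mpc(velocity_km_s: float) -> float:
    return velocity_km_s / H0

print(round(hubble_distance_mpc(6900.0), 1))  # 92.0 (Mpc)
```

The h75 notation flags this dependence: quoted distances can be rescaled for a different Hubble constant by multiplying by (75 / H0).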
A new series of detailed observations of the Coma cluster of galaxies has been undertaken with the aim of better specifying the intracluster gas component through high dynamic range, combined-telescope observations of the radio halo emission from the cluster, which detect both the cluster-scale and galaxy-scale emission. By combining these multi-frequency maps with a Faraday rotation probe experiment using cluster and background sources, and the published X-ray data, we have been able to estimate the intracluster magnetic field strength independently of the usual assumption of equipartition. The result is approximately 2 microgauss, and the tangling of the rms field occurs on an optical galaxy scale.
Construction of a new science complex in Osong, Cheongwon-gun, Korea, has allowed the investigation of 14 different Paleolithic localities, excavated during 2005–2007. Here, we investigated localities 1 and 12 of the Mansuri Paleolithic site to obtain chronological information using radiocarbon dating. At locality 1, soil deposition rates varied from 0.09 to 0.15 mm/yr over the period from 33 to 31 kyr BP. Locality 12 samples were more recent, with soil ages younger than 10 kyr BP, and had similar accumulation rates, averaging 0.11 mm/yr. Results for both soil and organic materials at this locality gave much younger ages at shallower depths than expected from the Korean Paleolithic cultural history of the region. These more recent deposits may therefore not be associated with the cultural layers and are interpreted as having been hydrologically modified following emplacement. 14C dates of the soil and organic materials at locality 12 confirm evidence for multiple human occupations throughout the last 9 kyr BP.
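A deposition rate like those quoted above is just sediment thickness divided by the elapsed time between two dated horizons. A sketch with an illustrative thickness, using the 33–31 kyr BP interval reported for locality 1:

```python
# Soil deposition rate between two radiocarbon-dated horizons:
# thickness of sediment / elapsed time. The 240 mm thickness is
# illustrative; the ages bracket the interval reported for locality 1.

def deposition_rate_mm_per_yr(thickness_mm: float,
                              age_old_yr: float,
                              age_young_yr: float) -> float:
    return thickness_mm / (age_old_yr - age_young_yr)

print(round(deposition_rate_mm_per_yr(240.0, 33_000, 31_000), 2))  # 0.12
```

A value in the 0.09–0.15 mm/yr range thus corresponds to roughly 180–300 mm of sediment accumulating over that 2,000-year interval.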
The relationship between temperature and the time required for collagenization was investigated using modern bone samples. Gelatinized samples of bone collagen were filtered to selectively collect different molecular weight fractions. The results of this study suggest that heating to 70 °C for a duration of 12 hr provides the optimal conditions for gelatinization.
The development of radiocarbon dating for degraded bone samples collected at Korean archaeological sites has been successful through the characterization of raw bone C/N ratios and application of an ultrafiltration method. The C/N ratios of raw bone samples were found to be inversely proportional to the carbon content and to the residue amount after gelatinization. We examined a few dozen Korean archaeological bone samples for this study. Well-preserved bone samples were found to be physically dense. The C/N ratios of Korean raw bone samples ranged from 3.4 to 74. We found that the C/N ratio of a degraded raw bone sample can be used to determine whether a 14C sample is acceptable for normal pretreatment processing and eventual dating. The results of this study show that even when the C/N ratio of a degraded raw bone sample is as high as 11, extraction of collagen for bone dating is feasible with a carefully designed ultrafiltration process. Our preliminary 14C dating results from a depth profile of Gunang-gul Cave, an archaeological site in Danyang, Korea, indicate that this site has been either geologically or anthropologically disturbed in the past, with 14C ages ranging from 28,910 ± 200 to 48,090 ± 1050 yr BP. The C/N ratios of the collagen samples from Gunang-gul were determined to be 3.2–3.6. Our study establishes a new guide for the pretreatment of degraded bone samples, such as those collected in Korea, for 14C dating.
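The C/N ratios used throughout as a quality screen are atomic ratios computed from weight-percent carbon and nitrogen. A minimal sketch with illustrative weight percentages (not measurements from this study):

```python
# Atomic C/N ratio from weight-percent C and N: each mass fraction
# is converted to moles before taking the ratio. Well-preserved
# collagen falls near 3.2; the inputs below are illustrative.

C_MOLAR_MASS = 12.011   # g/mol
N_MOLAR_MASS = 14.007   # g/mol

def atomic_cn_ratio(wt_pct_c: float, wt_pct_n: float) -> float:
    return (wt_pct_c / C_MOLAR_MASS) / (wt_pct_n / N_MOLAR_MASS)

print(round(atomic_cn_ratio(41.0, 15.0), 2))  # 3.19
```

Values far above the collagen range, such as the raw bone ratios up to 74 reported here, indicate carbon-rich contamination or collagen loss and flag samples needing the ultrafiltration treatment.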