The search for life in the Universe is a fundamental problem of astrobiology and modern science. The current progress in the detection of terrestrial-type exoplanets has opened a new avenue in the characterization of exoplanetary atmospheres and in the search for biosignatures of life with the upcoming ground-based and space missions. To specify the conditions favourable for the origin, development and sustainment of life as we know it in other worlds, we need to understand the nature of the global (astrospheric) and local (atmospheric and surface) environments of exoplanets in the habitable zones (HZs) around G-K-M dwarf stars including our young Sun. The global environment is shaped by disturbances propagating from the planet-hosting star in the form of stellar flares, coronal mass ejections, energetic particles and winds, collectively known as astrospheric space weather. Its characterization will help in understanding how an exoplanetary ecosystem interacts with its host star, as well as in specifying the physical, chemical and biochemical conditions that can be favourable and/or detrimental to planetary climate and habitability, along with the evolution of planetary internal dynamics over geological timescales. A key linkage of (astro)physical, chemical and geological processes can only be understood in the framework of interdisciplinary studies incorporating progress in heliophysics, astrophysics, planetary and Earth sciences. The assessment of the impacts of host stars on the climate and habitability of terrestrial (exo)planets will significantly expand the current definition of the HZ to the biogenic zone and provide new observational strategies for searching for signatures of life.
The major goal of this paper is to describe and discuss the current status and recent progress in this interdisciplinary field in light of presentations and discussions during the NASA Nexus for Exoplanetary System Science funded workshop ‘Exoplanetary Space Weather, Climate and Habitability’ and to provide a new roadmap for the future development of the emerging field of exoplanetary science and astrobiology.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
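The triple product underlying both estimators can be illustrated with a toy one-dimensional analogue (a sketch only; the actual MWA pipeline works on interferometric visibilities, not a time series, and the mode numbers and phases below are made up). For modes k1, k2 and k1+k2, the bispectrum B(k1, k2) = X(k1) X(k2) X*(k1+k2) closes the phase, so phase-coupled modes give a large, real, positive value, while uncoupled random phases average toward zero:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform; fine for short illustrative signals."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def bispectrum(x, k1, k2):
    """Bispectrum on a closed triangle of modes: B(k1,k2) = X(k1) X(k2) X*(k1+k2)."""
    X = dft(x)
    return X[k1] * X[k2] * X[(k1 + k2) % len(x)].conjugate()

# Signal with three phase-coupled modes: the phase of mode k1+k2 is the sum of
# the phases of modes k1 and k2, so the bispectrum phase closes to zero.
n, k1, k2, p1, p2 = 64, 5, 9, 1.0, 2.0
x = [math.cos(2 * math.pi * k1 * t / n + p1)
     + math.cos(2 * math.pi * k2 * t / n + p2)
     + math.cos(2 * math.pi * (k1 + k2) * t / n + p1 + p2) for t in range(n)]
b = bispectrum(x, k1, k2)  # real and positive for phase-coupled modes
```

In the interferometric direct estimate, the same closure property applies to the product of three visibilities measured around a closed (here, redundant) triangle of baselines, which is what makes the bispectrum robust to antenna-based phase errors.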
Laser interaction with an ultra-thin pre-structured target is investigated with the help of both two-dimensional and three-dimensional particle-in-cell simulations. With the existence of a periodic structure on the target surface, the laser seems to penetrate through the target at its fundamental frequency even if the plasma density of the target is much higher than the laser’s relativistically critical density. The particle-in-cell simulations show that the transmitted laser energy behind the pre-structured target is increased by about two orders of magnitude compared to that behind the flat target. Theoretical analyses show that the transmitted energy behind the pre-structured target is actually re-emitted by electron ‘islands’ formed by the surface plasma waves on the target surfaces. In other words, the radiation with the fundamental frequency is actually ‘surface emission’ on the target rear surface. Besides the intensity of the component with the fundamental frequency, the intensity of the high-order harmonics behind the pre-structured target is also much enhanced compared to that behind the flat target. The enhancement of the high-order harmonics is also related to the surface plasma waves generated on the target surfaces.
India has the second largest number of people with type 2 diabetes (T2D) globally. Epidemiological evidence indicates that consumption of white rice is positively associated with T2D risk, while intake of brown rice is inversely associated. Thus, we explored the effect of substituting brown rice for white rice on T2D risk factors among adults in urban South India. A total of 166 overweight (BMI ≥ 23 kg/m2) adults aged 25–65 years were enrolled in a randomised cross-over trial in Chennai, India. Interventions were a parboiled brown rice or white rice regimen providing two ad libitum meals/d, 6 d/week for 3 months with a 2-week washout period. Primary outcomes were blood glucose, insulin, glycosylated Hb (HbA1c), insulin resistance (homeostasis model assessment of insulin resistance) and lipids. High-sensitivity C-reactive protein (hs-CRP) was a secondary outcome. We did not observe significant between-group differences for primary outcomes among all participants. However, a significant reduction in HbA1c was observed in the brown rice group among participants with the metabolic syndrome (−0·18 (se 0·08) %) relative to those without the metabolic syndrome (0·05 (se 0·05) %) (P-for-heterogeneity = 0·02). Improvements in HbA1c, total and LDL-cholesterol were observed in the brown rice group among participants with a BMI ≥ 25 kg/m2 compared with those with a BMI < 25 kg/m2 (P-for-heterogeneity < 0·05). We observed a smaller increase in hs-CRP in the brown (0·03 (sd 2·12) mg/l) compared with white rice group (0·63 (sd 2·35) mg/l) (P = 0·04). In conclusion, substituting brown rice for white rice showed a potential benefit on HbA1c among participants with the metabolic syndrome and an elevated BMI. A small benefit on inflammation was also observed.
Previous studies have demonstrated that type 1 diabetes mellitus (T1DM) could be triggered by an early childhood infection. Whether maternal infection during pregnancy is associated with T1DM in offspring is unknown. Therefore, we aimed to study the association using a systematic review and meta-analysis. Eighteen studies including 4304 cases and 25 846 participants were enrolled in this meta-analysis. Odds ratios (ORs) and 95% confidence intervals (CIs) were synthesised using random-effects models. Subgroup analyses and sensitivity analyses were conducted to assess the robustness of associations. Overall, the pooled analysis yielded a statistically significant association between maternal infection during pregnancy and childhood T1DM (OR 1.31, 95% CI 1.07–1.62). Furthermore, six studies that tested maternal enterovirus infection showed a pooled OR of 1.54 (95% CI 1.05–2.27). Heterogeneity from different studies was evident (I2 = 70.1%, P < 0.001) and was mainly attributable to the different study designs, ascertaining methods and sample size among different studies. This study provides evidence for an association between maternal infection during pregnancy and childhood T1DM.
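Random-effects pooling of odds ratios, as described here, is commonly done with the DerSimonian-Laird estimator. A minimal sketch, using made-up study ORs and CIs rather than the 18 studies analysed in the paper:

```python
import math

def pooled_or_random_effects(ors, cis):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    ors: per-study odds ratios; cis: per-study (lower, upper) 95% CIs.
    Returns (pooled OR, lower 95% bound, upper 95% bound, I^2 in percent).
    """
    y = [math.log(o) for o in ors]                          # log odds ratios
    se = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in cis]
    w = [1.0 / s ** 2 for s in se]                          # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                           # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in se]                # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_mu = math.sqrt(1.0 / sum(wr))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0   # heterogeneity
    return (math.exp(mu), math.exp(mu - 1.96 * se_mu),
            math.exp(mu + 1.96 * se_mu), i2)

# Made-up study results for illustration (not the paper's data):
ors = [1.2, 1.6, 1.1, 1.9, 1.3]
cis = [(0.9, 1.6), (1.1, 2.3), (0.7, 1.7), (1.2, 3.0), (0.8, 2.1)]
pooled, lo, hi, i2 = pooled_or_random_effects(ors, cis)
```

The I² statistic computed alongside the pooled estimate is what the abstract reports as 70.1%: the share of total variability in effect estimates attributable to between-study heterogeneity rather than sampling error.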
OBJECTIVES/SPECIFIC AIMS: Background: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: Methods: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU (CAM-ICU) and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: Results: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age.
Peak delirium risk was for patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) than those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Conclusions: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shape risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
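The unadjusted catatonia-delirium association can be recovered directly from the 2x2 counts reported above (42 with both, 4 catatonia only, 58 delirium only, 32 with neither). This is a sketch of the crude odds ratio with a Woolf confidence interval, not the age-adjusted regression the authors fitted:

```python
import math

def odds_ratio_2x2(a, b, c, d):
    """Crude odds ratio with Woolf 95% CI for a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts from the cohort of 136 patients described above
# (exposure = catatonia, outcome = delirium):
or_, lo, hi = odds_ratio_2x2(42, 4, 58, 32)
```

The crude OR comes out well above 1 with a lower confidence bound above 1, consistent with the strong association the abstract reports before and after age adjustment.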
To explore the prevalence and drivers of hospital-level variability in antibiotic utilization among hematopoietic cell transplant (HCT) recipients to inform antimicrobial stewardship initiatives.
Retrospective cohort study using data merged from the Pediatric Health Information System and the Center for International Blood and Marrow Transplant Research.
The study included 27 transplant centers in freestanding children’s hospitals.
The primary outcome was days of broad-spectrum antibiotic use in the interval from day of HCT through neutrophil engraftment. Hospital antibiotic utilization rates were reported as days of therapy (DOTs) per 1,000 neutropenic days. Negative binomial regression was used to estimate hospital utilization rates, adjusting for patient covariates including demographics, transplant characteristics, and severity of illness. To better quantify the magnitude of hospital variation and to explore hospital-level drivers in addition to patient-level drivers of variation, mixed-effects negative binomial models were also constructed.
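The utilization rate itself is a simple normalization. A sketch with hypothetical counts (the DOT and neutropenic-day numbers below are illustrative, not from the study):

```python
def dot_rate(days_of_therapy, neutropenic_days):
    """Antibiotic utilization: days of therapy (DOTs) per 1,000 neutropenic days."""
    return 1000.0 * days_of_therapy / neutropenic_days

# Hypothetical hospital: 870 antipseudomonal DOTs over 1,450 neutropenic days
rate = dot_rate(870, 1450)  # -> 600.0 DOTs per 1,000 neutropenic days
```

Normalizing by neutropenic days rather than patient-days keeps hospitals comparable when their patients spend different lengths of time at risk.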
Adjusted hospital rates of antipseudomonal antibiotic use varied from 436 to 1121 DOTs per 1,000 neutropenic days, and rates of broad-spectrum, gram-positive antibiotic use varied from 153 to 728 DOTs per 1,000 neutropenic days. We detected variability by hospital in choice of antipseudomonal agent (ie, cephalosporins, penicillins, and carbapenems), but gram-positive coverage was primarily driven by vancomycin use. Considerable center-level variability remained even after controlling for additional hospital-level factors. Antibiotic use was not strongly associated with days of significant illness or mortality.
Among a homogenous population of children undergoing HCT for acute leukemia, both the quantity and spectrum of antibiotic exposure in the immediate posttransplant period varied widely. Antimicrobial stewardship initiatives can apply these data to optimize the use of antibiotics in transplant patients.
Influenza is a long-standing public health concern, but its transmission remains poorly understood. To better understand influenza transmission, we carried out a detailed modelling investigation of a nosocomial influenza outbreak in Hong Kong. We identified three hypothesised transmission modes between the index patient and other inpatients based on the long-range airborne and fomite routes. We considered three kinds of healthcare workers’ routine round pathways in 1140 scenarios with various values of important parameters. In each scenario, we used a multi-agent modelling framework to estimate the infection risk for each hypothesis and conducted least-squares fitting to evaluate the hypotheses by comparing the distribution of the infection risk with that of the attack rates. Amongst the hypotheses tested in the 1140 scenarios, the predictions of modes involving the long-range airborne route fitted better with the attack rates, and that of the two-route transmission mode had the best fit, with the long-range airborne route contributing about 94% and the fomite route contributing 6% to the infections. Under the assumed conditions, the influenza virus was likely to have spread via combined long-range airborne and fomite routes, with the former predominant and the latter minor.
Introduction: The proportion of Canadians receiving anticoagulation medication is increasing. Falls in the elderly are the most common cause of minor head injury and an increasing proportion of these patients are prescribed anticoagulation. Emergency department (ED) guidelines advise performing a CT head scan for all anticoagulated head-injured patients, but the risk of intracranial hemorrhage (ICH) after a minor head injury (patients who have a Glasgow Coma Scale (GCS) score of 15) is unclear. We conducted a systematic review and meta-analysis to determine the point incidence of ICH in anticoagulated ED patients presenting with a minor head injury. Methods: We systematically searched PubMed, EMBASE, the Cochrane database, DARE, Google Scholar and conference abstracts (May 2017). Experts were contacted. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) guidelines were followed, with two authors reviewing titles, four authors reviewing full text and four authors performing data extraction. We included all prospective studies recruiting consecutive anticoagulated ED patients presenting with a head injury. We obtained additional data from the authors of the included studies on the subset of GCS 15 patients. We performed a meta-analysis to estimate the point incidence of ICH among patients with a GCS score of 15 using a random effects model. Results: A total of five studies (and 4,080 GCS 15, anticoagulated patients) from the Netherlands, Italy, France, USA and UK were included in the analysis. One study contributed 2,871 patients. Direct oral anticoagulants were prescribed in only 60 (1.5%) patients. There was significant heterogeneity between studies with regard to mechanism of injury, CT scanning and follow-up method (I2 = 93%). The random effects pooled incidence of ICH was 8.9% (95% CI 5.0-13.8%). Conclusion: We found little data to reflect contemporary anticoagulant prescribing practice.
Around 9% of warfarinized patients with a minor head injury develop ICH. Future studies should evaluate the safety of selective CT head scanning in this population.
Children with CHD and acquired heart disease have unique, high-risk physiology. They may have a higher risk of adverse tracheal-intubation-associated events, as compared with children with non-cardiac disease.
Materials and methods
We sought to evaluate the occurrence of adverse tracheal-intubation-associated events in children with cardiac disease compared to children with non-cardiac disease. A retrospective analysis of tracheal intubations from 38 international paediatric ICUs was performed using the National Emergency Airway Registry for Children (NEAR4KIDS) quality improvement registry. The primary outcome was the occurrence of any tracheal-intubation-associated event. Secondary outcomes included the occurrence of severe tracheal-intubation-associated events, multiple intubation attempts, and oxygen desaturation.
A total of 8851 intubations were reported between July 2012 and March 2016. Cardiac patients were younger, more likely to have haemodynamic instability, and less likely to have respiratory failure as an indication. The overall frequency of tracheal-intubation-associated events was not different (cardiac: 17% versus non-cardiac: 16%, p=0.13), nor was the rate of severe tracheal-intubation-associated events (cardiac: 7% versus non-cardiac: 6%, p=0.11). Tracheal-intubation-associated cardiac arrest occurred more often in cardiac patients (2.80% versus 1.28%; p<0.001), even after adjusting for patient and provider differences (adjusted odds ratio 1.79; p=0.03). Multiple intubation attempts occurred less often in cardiac patients (p=0.04), and oxygen desaturations occurred more often, even after excluding patients with cyanotic heart disease.
The overall incidence of adverse tracheal-intubation-associated events in cardiac patients was not different from that in non-cardiac patients. However, the presence of a cardiac diagnosis was associated with a higher occurrence of both tracheal-intubation-associated cardiac arrest and oxygen desaturation.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
Plant height and lodging resistance can affect rice yield significantly, but these traits have always conflicted in crop cultivation and breeding. The current study aimed to establish a rapid and accurate plant type evaluation mechanism to provide a basis for breeding tall but lodging-resistant super rice varieties. A comprehensive approach integrating plant anatomy and histochemistry was used to investigate variations in flexural strength (a material property, defined as the stress in a material just before it yields in a flexure test) of the rice stem and the lodging index of 15 rice accessions at different growth stages to understand trends in these parameters and the potential factors influencing them. Rice stem anatomical structure was observed and the lignin content of the cell wall was determined at different developmental stages. Three rice lodging evaluation models were established using correlation analysis, multivariate regression and artificial radial basis function (RBF) neural network analysis, and the results were compared to identify the most suitable model for predicting optimal rice plant types. Among the three evaluation methods, the mean residual and relative prediction errors were lowest using the RBF network, indicating that it was highly accurate and robust and could be used to establish a mathematical model of the morphological characteristics and lodging resistance of rice to identify optimal varieties.
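As a sketch of the third modelling approach: an RBF model fits a weighted sum of Gaussian basis functions centred on the training points. The one-dimensional trait/lodging values below are made up for illustration, and a real analysis would use multivariate stem-trait inputs and regularization rather than exact interpolation:

```python
import math

def rbf_fit(xs, ys, gamma=1.0):
    """Fit an exact RBF interpolant f(x) = sum_j w_j * exp(-gamma * (x - x_j)^2)
    by solving the square system Phi w = y with Gaussian elimination."""
    n = len(xs)
    # kernel (design) matrix, augmented with the target column
    a = [[math.exp(-gamma * (xs[i] - xs[j]) ** 2) for j in range(n)] + [ys[i]]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))  # partial pivoting
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        w[r] = (a[r][n] - sum(a[r][c] * w[c] for c in range(r + 1, n))) / a[r][r]
    return lambda x: sum(wj * math.exp(-gamma * (x - xj) ** 2)
                         for wj, xj in zip(w, xs))

# Toy (hypothetical) trait -> lodging-index pairs:
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 0.8, 0.9, 0.1, -0.7]
f = rbf_fit(xs, ys)  # f reproduces each training pair exactly
```

The localized Gaussian bases are what let an RBF model capture the non-linear trait-lodging relationships that a single multivariate regression plane cannot.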
Nitrogen (N) application and irrigation to winter wheat may decrease leaf temperature and enhance photosynthesis: as a result, more photosynthates will be allocated to the grains, resulting in higher grain yields. To investigate this hypothesis, a 2-year field study was conducted with three levels of N fertilizer application (no fertilizer, N0; 240 kg N/ha, N1; 360 kg N/ha, N2) and two different water regimes (rainfed with no irrigation, R; irrigation at the over-wintering, stem elongation and grain filling stages, W). The results show that both N application and supplemental irrigation significantly increased grain yield with increases in both grain number/m2 and the 1000-grain weight, viz., WN2>WN1>WN0>RN2>RN1>RN0. In addition, application of N under both water regimes significantly increased flag leaf area, above-ground biomass and single stem productivity and decreased leaf temperature, which led to an increase in net photosynthesis rates and ribulose bisphosphate (RuBP) carboxylase activity. Moreover, analysis of the chlorophyll a fluorescence transient showed that N fertilizer application and supplemental irrigation significantly increased electron donor and acceptor performance of the photosystem II reaction centre.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
Seed shape (SS) affects the yield and appearance of soybean seeds significantly. However, little detailed information has been reported about the quantitative trait loci (QTL) affecting SS, especially SS components such as seed length (SL), seed width (SW) and seed thickness (ST), and their mutual ratios of length-to-width (SLW), length-to-thickness (SLT) and width-to-thickness (SWT). The aim of the present study was to identify QTL underlying SS components using 129 recombinant inbred lines derived from a cross between Dongnong46 and L-100. Phenotypic data were collected from this population after it was grown across nine environments. A total of 213 simple sequence repeat markers were used to construct the genetic linkage map, which covered approximately 3623·39 cM, with an average distance of 17·01 cM between markers. Five QTL were identified as being associated with SL, five with SW, three with ST, four with SLW, two with SLT and three with SWT. These QTL could explain 1·46–22·16% of the phenotypic variation in SS component traits. Nine QTL were identified in more than six tested environments: three for SL, two for SW, one for ST, two for SLW and one for SLT. These QTL have great potential value for marker-assisted selection of SS in soybean seeds.
Iron-deficiency anemia is a public health concern that frequently occurs in pregnant mammals and neonatal offspring. Ferrous N-carbamylglycinate chelate (Fe-CGly) is a newly designed iron fortifier with proven effects in iron-deficient rats and weanling piglets. However, the effects of this new compound on pregnant mammals are unknown. Therefore, this experiment was conducted to evaluate the effects of Fe-CGly on sow reproductive performance and iron status of both sows and neonatal piglets. A total of 40 Large White sows after second parity were randomly assigned to two groups (n=20). They received a diet including 80 mg Fe/kg as FeSO4 or Fe-CGly, respectively, from day 85 of gestation to parturition. The serum (day 110 of pregnancy) and placentas of sows were sampled. Litter size, mean weight of live born piglets, birth (live) litter weight, number of live born piglets, and the number of still-born piglets, mummies, and weak-born piglets were recorded. Once delivered, eight litters were randomly selected from the 20 litters per treatment, and one new-born male piglet (1.503±0.142 kg) from each selected litter was slaughtered within 3 h after birth, without colostrum ingestion. The serum, longissimus muscle, liver and kidneys of the piglets were collected. The iron status of the serum samples and the messenger RNA level of iron-related genes in the placenta, liver and kidney were analyzed. The results showed that litter weight of live born piglets was higher (P=0.030) in the Fe-CGly group (19.86 kg) than in the FeSO4 group (17.34 kg). Fe-CGly significantly increased placental iron concentration (P<0.05) of sows. It also significantly increased iron saturation and reduced the total iron-binding capacity of piglets (P<0.05) at birth. However, the results revealed that supplementation of Fe-CGly in sows reduced liver and kidney iron concentration of neonatal piglets (P<0.05), indicating decreased iron storage.
In addition, the concentration of iron in the colostrum was not significantly changed. Therefore, the present results suggested that replacement of maternal FeSO4 supplement with Fe-CGly in the late-gestating period for sows could improve litter birth weight, probably via enhanced iron transportation in the placenta.
Knowledge, attitudes and practices (KAP) of the population regarding severe fever with thrombocytopenia syndrome (SFTS) in endemic areas of Lu'an in China were assessed before and after an intervention programme. The pre-intervention phase was conducted using a sample of 425 participants from the 12 selected villages with the highest rates of endemic SFTS infection. A predesigned interview questionnaire was used to assess KAP. Subsequently, an intervention programme was designed and applied in the selected villages. KAP was re-assessed for each population in the selected villages using the same interview questionnaire. Two months after the start of the programme, 339 participants had completed the re-assessment survey. The impact of the intervention programme was evaluated using suitable statistical methods. A significant increase in the KAP and total KAP scores was noted following the intervention programme, and the proportions of respondents with correct knowledge, positive attitudes and effective practices toward SFTS increased significantly. The intervention programme was effective in improving the KAP level regarding SFTS in populations resident in endemic areas.
The ability of the aorta to buffer blood flow and provide diastolic perfusion (Windkessel function) is a determinant of cardiovascular health. We have reported cardiac dysfunction indicating downstream vascular abnormalities in young adult baboons who were intrauterine growth restricted (IUGR) at birth as a result of moderate maternal nutrient reduction. Using 3 T MRI, we examined IUGR offspring (eight male, eight female; 5.7 years; human equivalent 25 years) and age-matched controls (eight male, eight female; 5.6 years) to quantify distal descending aortic cross-section (AC) and distensibility (AD). ANOVA showed decreased IUGR AC/body surface area (0.9±0.05 cm2/m2 v. 1.2±0.06 cm2/m2, M±s.e.m., P<0.005) and AD (1.7±0.2 v. 4.0±0.5×10−3/mmHg, P<0.005) without sex difference or group-sex interaction, suggesting intrinsic vascular pathology and impaired development persisting in adulthood. Future studies should evaluate potential consequences of these changes on coronary perfusion, afterload and blood pressure.
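Distensibility as reported here is conventionally computed as the fractional lumen-area change per unit pulse pressure. A sketch with hypothetical areas and pressure chosen to reproduce a control-like AD of 4.0 x 10^-3 per mmHg:

```python
def aortic_distensibility(a_max, a_min, pulse_pressure):
    """Distensibility AD = (Amax - Amin) / (Amin * pulse pressure), per mmHg.
    a_max/a_min: systolic and diastolic lumen cross-sectional areas (same units)."""
    return (a_max - a_min) / (a_min * pulse_pressure)

# Hypothetical values: systolic area 4.2 cm2, diastolic area 3.5 cm2,
# pulse pressure 50 mmHg (illustrative, not the baboon measurements)
ad = aortic_distensibility(4.2, 3.5, 50.0)  # ~4.0e-3 per mmHg
```

Because areas enter only as a ratio, AD is independent of body size, which is why the abstract normalizes the cross-section (AC) by body surface area but reports AD directly.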