Background:
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipment of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
The complementary feeding period (6-23 months of age), when solid foods are introduced alongside breastmilk or infant formula, is the most significant dietary change a person will experience. The introduction of complementary foods is important to meet changing nutritional requirements(1). Despite the rising Asian population in New Zealand (NZ) and the importance of nutrition during the complementary feeding period, there is currently no research on Asian NZ infants’ micronutrient intakes from complementary foods. Complementary foods are a more easily modifiable component of the diet than breastmilk or other infant milk intake. This study aimed to compare the dietary intake of micronutrients from complementary foods of Asian and non-Asian infants in NZ, in a secondary analysis of the First Foods New Zealand cross-sectional study of infants (aged 7.0-9.9 months) in Dunedin and Auckland. 24-hour recall data were analysed using FoodFiles 10 software with the NZ food composition database FOODfiles 2018, supplemented with data for commercial complementary foods(2). The multiple source method was used to estimate usual dietary intake. Ethnicity was collected from the main study questionnaire, answered by the respondents (the infant’s parent/caregiver). Within the Asian NZ group, three Asian subgroups were identified – South East Asian, East Asian, and South Asian. The non-Asian group included all remaining participants of non-Asian ethnicities. Most nutrient reference values (NRVs)(3) available for the 7-12 month age group are for total intake from complementary foods and infant milks, so the adequacy of micronutrient intakes from complementary foods alone could not be determined. Vitamin A was the only micronutrient investigated in this analysis with an NRV available for complementary foods only, allowing conclusions about adequacy to be made. The Asian NZ group (n = 99) had lower mean group intakes than the non-Asian group (n = 526) for vitamin A (274µg vs. 329µg) and vitamin B12 (0.49µg vs. 0.65µg), and similar intakes for vitamin C (27.8mg vs. 28.5mg) and zinc (1.7mg vs. 1.9mg). Mean group iron intakes were the same for both groups (3.0mg). The adequate intake (AI) for vitamin A from complementary foods (244µg) was exceeded by the mean intakes of both groups, suggesting that vitamin A intakes were adequate. The complementary feeding period is a critical time for obtaining nutrients essential for development and growth. The results from this study indicate that Asian NZ infants have lower intakes of two of the micronutrients of interest than non-Asian infants in NZ. However, future research that includes infant milk intake in these groups is needed to understand total micronutrient intakes. Vitamin A intakes do appear to be adequate in NZ infants.
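The multiple source method mentioned above is a published statistical procedure for separating day-to-day variation from true between-infant variation in short-term intake data. As a rough illustration of the underlying idea only (not the MSM implementation, and with simulated numbers standing in for study data), a between/within variance decomposition can shrink each infant's observed mean toward the group mean:

```python
import numpy as np

# Simulated stand-in for 24-hour recall data: rows are infants, columns are
# recall days. Values are vitamin A intakes (ug/day); not study data.
rng = np.random.default_rng(42)
true_usual = rng.normal(300, 60, size=500)          # each infant's usual intake
recalls = true_usual[:, None] + rng.normal(0, 120, size=(500, 2))

person_means = recalls.mean(axis=1)
grand_mean = person_means.mean()
n_days = recalls.shape[1]

# Decompose observed variance into within-person (day-to-day) and
# between-person components.
s2_within = recalls.var(axis=1, ddof=1).mean()
s2_between = max(person_means.var(ddof=1) - s2_within / n_days, 0.0)

# Shrink observed means toward the group mean in proportion to how much of
# their variance is day-to-day noise, giving a usual-intake estimate.
shrinkage = s2_between / (s2_between + s2_within / n_days)
usual_intake = grand_mean + shrinkage * (person_means - grand_mean)
print(f"shrinkage factor: {shrinkage:.2f}")
```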
The prevalence of food allergies in New Zealand infants is unknown; however, it is thought to be similar to Australia, where the prevalence is over 10% of 1-year-olds(1). Current New Zealand recommendations for reducing the risk of food allergies are to: offer all infants major food allergens (in an age-appropriate texture) at the start of complementary feeding (around 6 months); ensure major allergens are given to all infants before 1 year; once a major allergen is tolerated, maintain tolerance by regularly (approximately twice a week) offering the allergen food; and continue breastfeeding while introducing complementary foods(2). To our knowledge, there is no research investigating whether parents follow these recommendations. Therefore, this study aimed to explore parental offering of major food allergens to infants during complementary feeding and parent-reported food allergies. The cross-sectional study included 625 parent-infant dyads from the multi-centred (Auckland and Dunedin) First Foods New Zealand study. Infants were 7-10 months of age, and participants were recruited in 2020-2022. This secondary analysis used a study questionnaire and 24-hour diet recall data. The questionnaire captured whether the infant was currently breastfed, whether major food allergens were offered to the infant, whether parents intended to avoid any foods during the first year of life, whether the infant had any known food allergies, and if so, how they were diagnosed. Consumption of major food allergens was assessed from the 24-hour diet recall data (2 days per infant). Questionnaire data indicated that all major food allergens had been offered to only 17% of infants aged 9-10 months. On the diet recall days, dairy (94.4%) and wheat (91.2%) were the most commonly consumed major food allergens. Breastfed infants (n = 414) were more likely to consume sesame than non-breastfed infants (n = 211) (48.8% vs 33.7%, p≤0.001). Overall, 12.6% of infants had a parent-reported food allergy, with egg allergy the most common (45.6% of parents who reported a food allergy). A symptomatic response after exposure was the most common diagnostic tool. In conclusion, only 17% of infants had been offered all major food allergens by 9-10 months of age. More guidance may be required to ensure current recommendations are followed and that all major food allergens are introduced by 1 year of age. These results provide critical insight into parents’ current practices, which is essential in determining whether more targeted advice regarding allergy prevention and diagnosis is required.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aimed to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. The study included 289 caregiver and child (1-3 years old) dyads from the same household in Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated, NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and the full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations, with statistical significance set at p < 0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat and 2.0 (0.6, 3.5) g/day more saturated fat than food secure children (p < 0.05). Severely food insecure children had lower protein intake (g/kg/day) than food secure children (p < 0.05). Compared with food secure children, moderately and severely food insecure children had lower fibre intakes, consuming 1.6 (0.3, 2.8) g/day and 2.6 (1.2, 4.0) g/day less fibre, respectively. Severely food insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common among children of Māori or Pacific ethnicity; living in areas of high deprivation; with a caregiver who was younger, not in paid employment, or had low educational attainment; living with ≥2 other children in the household; or living in a sole-parent household. Food insecure young NZ children consume a diet of lower nutritional quality in certain measures than their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
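The EAR cut-point method used here reduces to the fraction of children whose estimated usual intake falls below the EAR. A minimal sketch with made-up intakes and an illustrative cut-point (neither taken from the study):

```python
import numpy as np

# Hypothetical usual calcium intakes (mg/day) estimated by the multiple
# source method, plus an illustrative EAR value -- not study data.
usual_intake_mg = np.array([420.0, 510.0, 380.0, 610.0, 350.0, 700.0, 300.0])
ear_mg = 360.0

# EAR cut-point method: the prevalence of inadequacy is the proportion
# of children whose usual intake falls below the EAR.
prevalence_inadequate = np.mean(usual_intake_mg < ear_mg)
print(f"prevalence of inadequate intake: {prevalence_inadequate:.1%}")
```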
To investigate the symptoms of SARS-CoV-2 infection, their dynamics and their discriminatory power for the disease, using longitudinally, prospectively collected information reported at the time of occurrence, we analysed data from a large UK phase 3 COVID-19 vaccine clinical trial. The alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, at vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes within one individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion, and cough as significantly associated with SARS-CoV-2 infection. Symptom dynamics differed markedly between the two groups: in PCR-positive participants, symptoms started slowly, peaked later and lasted longer, whereas in PCR-negative participants they declined consistently, with, on average, fewer than 3 days of symptoms reported. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating low discriminatory power for early disease diagnosis.
Undesirable behaviours (UBs) are common in dogs and can jeopardise animal and human health, leading to dog abandonment and euthanasia. Dogs exhibiting UBs may have compromised welfare from the underlying emotional motivations for the behaviour (eg anxiety) or from the methods used by owners to resolve the problem (eg aversive techniques). The objective of this study was to estimate proportional mortality due to UBs, and risk factors for death due to UBs, including death from road traffic accidents, in dogs under three years of age attending primary-care veterinary practices in England from 2009 to 2014. Cases were identified by searching de-identified electronic patient records from primary-care veterinary practices participating in the VetCompass Programme. The findings highlight that dogs under three years of age are at a proportionately high risk of death due to UBs (33.7%) compared with other specific causes of death (eg gastrointestinal issues: 14.5%). Male dogs had 1.40 times the odds of death from a UB compared with females. Among male dogs for which information on the cause of death was available, the proportional mortality from UBs was 0.41. Neutered dogs had 1.94 times the odds of death due to a UB compared with entire dogs. Aggression was the most prevalent UB overall. Veterinarians had recommended referral in 10.3% of cases where dogs died due to exhibiting a UB, and had dispensed nutraceutical, pheromone or pharmacological treatment to 3.0% of the UB cases that died. This study shows that undesirable behaviours require better preventive measures and treatment, through further research and education of veterinarians, other professionals within the dog industry, and owners.
This research examines maternal smoking during pregnancy and risk for poorer executive function in siblings discordant for exposure. Data (N = 173 families) were drawn from the Missouri Mothers and Their Children study, a sample, identified using birth records (years 1998–2005), in which mothers changed smoking behavior between two pregnancies (Child 1 [older sibling]: mean age = 12.99 years; Child 2 [younger sibling]: mean age = 10.19 years). A sibling comparison approach was used, providing a robust test of the association between maternal smoking during pregnancy and different aspects of executive function in early-mid adolescence. Results suggested within-family (i.e., potentially causal) associations between maternal smoking during pregnancy and one working memory task (visual working memory) and one response inhibition task (color-word interference), with increased exposure associated with decreased performance. Maternal smoking during pregnancy was not associated with stop-signal reaction time, cognitive flexibility/set-shifting, or auditory working memory. Initial within-family associations between maternal smoking during pregnancy and visual working memory as well as color-word interference were fully attenuated in a model including child and familial covariates. These findings indicate that exposure to maternal smoking during pregnancy may be associated with poorer performance on some, but not all, skills assessed; however, familial transmission of risk for low executive function appears more important.
Little is known about Se intakes and status in very young New Zealand children. However, Se intakes below recommendations and lower Se status compared with international studies have been reported in New Zealand (particularly South Island) adults. The Baby-Led Introduction to SolidS (BLISS) randomised controlled trial compared a modified version of baby-led weaning (infants feed themselves rather than being spoon-fed), with traditional spoon-feeding (Control). Weighed 3-d diet records were collected and plasma Se concentration measured using inductively coupled plasma mass spectrometry (ICP-MS). In total, 101 (BLISS n 50, Control n 51) 12-month-old toddlers provided complete data. The OR of Se intakes below the estimated average requirement (EAR) was no different between BLISS and Control (OR: 0·89; 95 % CI 0·39, 2·03), and there was no difference in mean plasma Se concentration between groups (0·04 μmol/l; 95 % CI −0·03, 0·11). In an adjusted model, consuming breast milk was associated with lower plasma Se concentrations (–0·12 μmol/l; 95 % CI −0·19, −0·04). Of the food groups other than infant milk (breast milk or infant formula), ‘breads and cereals’ contributed the most to Se intakes (12 % of intake). In conclusion, Se intakes and plasma Se concentrations of 12-month-old New Zealand toddlers were no different between those who had followed a baby-led approach to complementary feeding and those who followed traditional spoon-feeding. However, more than half of toddlers had Se intakes below the EAR.
Previous research has shown that self-reports of the amount of social support are heritable. Using the Kessler perceived social support (KPSS) measure, we explored sex differences in the genetic and environmental contributions to individual differences. We did this separately for subscales that captured the perceived support from different members of the network (spouse, twin, children, parents, relatives, friends and confidant). Our sample comprised 7059 male, female and opposite-sex twin pairs aged 18−95 years from the Australian Twin Registry. We found tentative support for different genetic mechanisms in males and females for support from friends and for the average KPSS score across all subscales, but otherwise there were no sex differences. For each subscale alone, the additive genetic (A) and unique environment (E) effects were significant. By contrast, the covariation among the subscales was explained, in roughly equal parts, by A, E and the common environment, with effects of different support constellations plausibly accounting for the latter. A single genetic and common environment factor accounted for between half and three-quarters of the variance across the subscales in both males and females, suggesting little heterogeneity in the genetic and environmental etiology of the different support sources.
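For readers unfamiliar with the A/C/E notation, the classical twin design identifies these components from the monozygotic and dizygotic twin correlations; in the simplest Falconer-style form (a textbook identity, not the structural model actually fitted here):

```latex
r_{MZ} = a^{2} + c^{2}, \qquad r_{DZ} = \tfrac{1}{2}a^{2} + c^{2}
\;\;\Rightarrow\;\;
a^{2} = 2\,(r_{MZ} - r_{DZ}), \quad c^{2} = 2\,r_{DZ} - r_{MZ}, \quad e^{2} = 1 - r_{MZ},
```

where a², c² and e² are the proportions of variance attributable to additive genetic, common environmental and unique environmental influences, respectively.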
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r² = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h² ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
The Eating Assessment in Toddlers FFQ (EAT FFQ) has been shown to have good reliability and comparative validity for ranking nutrient intakes in young children. With the addition of food items (n 4), we aimed to re-assess the validity of the EAT FFQ and estimate calibration factors in a sub-sample of children (n 97) participating in the Growing Up Milk – Lite (GUMLi) randomised controlled trial (2015–2017). Participants completed the ninety-nine-item GUMLi EAT FFQ and record-assisted 24-h recalls (24HR) on two occasions. Energy and nutrient intakes were assessed at months 9 and 12 post-randomisation and calibration factors calculated to determine predicted estimates from the GUMLi EAT FFQ. Validity was assessed using Pearson correlation coefficients, weighted kappa (κ) and exact quartile categorisation. Calibration was calculated using linear regression models on 24HR, adjusted for sex and treatment group. Nutrient intakes were significantly correlated between the GUMLi EAT FFQ and 24HR at both time points. Energy-adjusted, de-attenuated Pearson correlations ranged from 0·3 (fibre) to 0·8 (Fe) at 9 months and from 0·3 (Ca) to 0·7 (Fe) at 12 months. Weighted κ for the quartiles ranged from 0·2 (Zn) to 0·6 (Fe) at 9 months and from 0·1 (total fat) to 0·5 (Fe) at 12 months. Exact agreement ranged from 30 to 74 %. Calibration factors predicted up to 56 % of the variation in the 24HR at 9 months and 44 % at 12 months. The GUMLi EAT FFQ remained a useful tool for ranking nutrient intakes, with estimated validity similar to that of other FFQs used in children under 2 years.
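The de-attenuated correlations reported here correct for random day-to-day error in the 24HR reference method; assuming the conventional correction used in FFQ validation studies (the abstract does not state the exact formula), the adjustment is:

```latex
r_{c} = r_{o}\sqrt{1 + \frac{\lambda}{n}}, \qquad \lambda = \frac{s^{2}_{w}}{s^{2}_{b}},
```

where r_o is the observed FFQ-24HR correlation, n the number of recall administrations per child (two here), and λ the ratio of within- to between-person variance in the 24HR.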
Gut microbiota data obtained by DNA sequencing are not only complex because of the number of taxa that may be detected within human cohorts, but also compositional because characteristics of the microbiota are described in relative terms (e.g., “relative abundance” of particular bacterial taxa expressed as a proportion of the total abundance of taxa). Nutrition researchers often use standard principal component analysis (PCA) to derive dietary patterns from complex food data, enabling each participant's diet to be described in terms of the extent to which it fits their cohort's dietary patterns. However, compositional PCA methods are not commonly used to describe patterns of microbiota in the way that dietary patterns are used to describe diets. This approach would be useful for identifying microbiota patterns that are associated with diet and body composition. The aim of this study is to use compositional PCA to describe gut microbiota profiles in 5-year-old children and explore associations between microbiota profiles, diet, body mass index (BMI) z-score, and fat mass index (FMI) z-score. This study uses cross-sectional data for 319 children who provided a faecal sample at 5 years of age. Their primary caregiver completed a 123-item quantitative food frequency questionnaire validated for foods of relevance to the gut microbiota. Body composition was determined using dual-energy x-ray absorptiometry, and BMI and FMI z-scores were calculated. Compositional PCA identified and described gut microbiota profiles at the genus level, and profiles were examined in relation to diet and body size. Three gut microbiota profiles were found. Profile 1 (positive loadings on Blautia and Bifidobacterium; negative loadings on Bacteroides) was not related to diet or body size. Profile 2 (positive loadings on Bacteroides; negative loadings on uncultured Christensenellaceae and Ruminococcaceae) was associated with a lower BMI z-score (r = -0.16, P = 0.003). Profile 3 (positive loadings on Faecalibacterium, Eubacterium and Roseburia) was associated with higher intakes of fibre (r = 0.15, P = 0.007); total (r = 0.15, P = 0.009) and insoluble (r = 0.13, P = 0.021) non-starch polysaccharides; protein (r = 0.12, P = 0.036); meat (r = 0.15, P = 0.010); and nuts, seeds and legumes (r = 0.11, P = 0.047). Further regression analyses found that profile 2 and profile 3 were independently associated with BMI z-score and diet, respectively. We encourage fellow researchers to use compositional PCA as a method for identifying further links between the gut, diet and obesity, and for developing the next generation of research in which the impact on body composition of dietary interventions that modify the gut microbiota is determined.
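Compositional PCA of the kind applied here is typically implemented as ordinary PCA on centred log-ratio (clr) transformed relative abundances; the sketch below uses simulated abundances and a small pseudocount for zeros (both assumptions, not the authors' exact pipeline):

```python
import numpy as np
from sklearn.decomposition import PCA

# rel_abund: children x genera matrix of relative abundances (rows sum to 1).
# Illustrative random data in place of the study's sequencing output.
rng = np.random.default_rng(0)
counts = rng.gamma(shape=2.0, scale=1.0, size=(319, 20))
rel_abund = counts / counts.sum(axis=1, keepdims=True)

# Centred log-ratio transform; a small pseudocount avoids log(0).
pseudo = rel_abund + 1e-6
clr = np.log(pseudo) - np.log(pseudo).mean(axis=1, keepdims=True)

# Ordinary PCA on the clr-transformed data yields compositional profiles:
# loadings show which genera define each profile (positive vs negative),
# and scores place each child along those profiles.
pca = PCA(n_components=3)
scores = pca.fit_transform(clr)
print(pca.explained_variance_ratio_)
```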
Estimating speciation and extinction rates is essential for understanding past and present biodiversity, but is challenging given the incompleteness of the rock and fossil records. Interest in this topic has led to a divergent suite of independent methods—paleontological estimates based on sampled stratigraphic ranges and phylogenetic estimates based on the observed branching times in a given phylogeny of living species. The fossilized birth–death (FBD) process is a model that explicitly recognizes that the branching events in a phylogenetic tree and sampled fossils were generated by the same underlying diversification process. A crucial advantage of this model is that it incorporates the possibility that some species may never be sampled. Here, we present an FBD model that estimates tree-wide diversification rates from stratigraphic range data when the underlying phylogeny of the fossil taxa may be unknown. The model can be applied when only occurrence data for taxonomically identified fossils are available, but still accounts for the incomplete phylogenetic structure of the data. We tested this new model using simulations and focused on how inferences are impacted by incomplete fossil recovery. We compared our approach with a phylogenetic model that does not incorporate incomplete species sampling and with three fossil-based alternatives for estimating diversification rates, including the widely implemented boundary-crosser and three-timer methods. The results of our simulations demonstrate that estimates under the FBD model are robust and more accurate than the alternative methods, particularly when fossil data are sparse, as the FBD model incorporates incomplete species sampling explicitly.
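For context, the boundary-crosser method referred to here estimates per-capita origination and extinction rates from counts of taxa crossing interval boundaries (Foote's formulation, stated for reference since the abstract does not reproduce it):

```latex
\hat{p} = -\frac{\ln\left(N_{bt}/N_{t}\right)}{\Delta t}, \qquad
\hat{q} = -\frac{\ln\left(N_{bt}/N_{b}\right)}{\Delta t},
```

where N_bt counts taxa ranging through the whole interval of duration Δt, and N_b and N_t count taxa crossing its bottom and top boundaries, respectively.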
The spread of the Zika virus (ZIKV) in the Americas led to large outbreaks across the region and most of the Southern hemisphere. Of greatest concern were complications following acute infection during pregnancy. At the beginning of the outbreak, the risk to unborn babies and their clinical presentation was unclear. This report describes the methods and results of the UK surveillance response to assess the risk of ZIKV to children born to returning travellers. Established surveillance systems operating within the UK – the paediatric and obstetric surveillance units for rare diseases, and national laboratory monitoring – enabled rapid assessment of this emerging public health threat. A combined total of 11 women experiencing adverse pregnancy outcomes after possible ZIKV exposure were reported by the three surveillance systems: five miscarriages, two intrauterine deaths and four children with clinical presentations potentially associated with ZIKV infection. Sixteen women were diagnosed with ZIKV during pregnancy in the UK. Amongst the offspring of these women, there was unequivocal laboratory evidence of infection in only one child. In the UK, the number and risk of congenital ZIKV infection for travellers returning from ZIKV-affected countries are very small.
OBJECTIVES/SPECIFIC AIMS: Preterm birth rates have been rising in the United States, and reducing preterm birth is a high-priority clinical and public health concern. There are no existing strategies to reduce preterm birth in nulliparous individuals. The present study aims to evaluate prenatal care as a protective factor for preterm birth in this population. METHODS/STUDY POPULATION: Missouri birth record data for child birth years 1993-2016 were used to create a sample of 325,088 singleton births to nulliparous women, themselves born in MO 1975-1985. Logistic regressions, stratified by maternal race (White, African-American, Asian, American Indian/Alaskan Native, Other), were used to predict preterm birth (< 37 weeks gestational age) as a function of 1) initiation of prenatal care by the end of the first trimester and 2) the Adequacy of Prenatal Care Utilization Index, with sociodemographic covariates of child birth year, maternal age, highest educational level, and marital status (a four-level variable: married yes/no, and partner named on birth record yes/no). Subsequent analyses will use this logistic regression to create a propensity score predicting smoking during pregnancy from birth record parental sociodemographic characteristics, stratified by maternal race. Primary analyses will focus on the role of prenatal care in predicting smoking during pregnancy and preterm birth risk within propensity score strata. Secondary analyses will consider the role of other risk factors, including maternal pre-pregnancy BMI and maternal DUI history, on preterm birth risk. RESULTS/ANTICIPATED RESULTS: Preliminary logistic regressions predicting preterm birth were analyzed, stratified by maternal race. In White mothers, preterm birth prevalence was 8.2%, and risk was significantly increased by maternal age ≤ 15 and ≥ 31, being unmarried, and receiving no prenatal care, yet was unaffected by timing of prenatal care initiation. For African-American mothers, preterm birth prevalence was 11.9%, and risk was significantly increased by being unmarried, by not initiating prenatal care by the end of the first trimester, and by receiving no prenatal care. Preliminary samples were too small for solid inferences for other races. Anticipated results are that, after propensity score matching, earlier initiation of prenatal care will show a modest protective effect on preterm birth, but other characteristics such as maternal cigarette smoking during pregnancy and DUI status will show stronger effects in predicting preterm birth risk. DISCUSSION/SIGNIFICANCE OF IMPACT: By evaluating the role of prenatal care initiation and delivery on preterm birth, this work provides an evidence base for prenatal care schedules and for understanding the interplay of sociodemographics, healthcare delivery, and individual characteristics in the context of preterm birth risk, with the potential to reduce negative health outcomes.
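The planned propensity-score step can be sketched as follows: fit a logistic regression for smoking during pregnancy on birth-record sociodemographics, then stratify on the fitted score before modelling preterm birth. A minimal illustration with synthetic data and simplified covariates (hypothetical throughout, not the Missouri birth records):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "maternal_age": rng.integers(15, 40, n),
    "married": rng.integers(0, 2, n),
    "education_years": rng.integers(8, 18, n),
})
# Synthetic exposure and outcome, for illustration only.
df["sdp"] = rng.binomial(1, 0.2, n)       # smoking during pregnancy
df["preterm"] = rng.binomial(1, 0.08, n)  # preterm birth

# Propensity score: probability of smoking during pregnancy given
# sociodemographic covariates from the birth record.
covars = ["maternal_age", "married", "education_years"]
ps_model = LogisticRegression().fit(df[covars], df["sdp"])
df["pscore"] = ps_model.predict_proba(df[covars])[:, 1]

# Stratify on the score and examine preterm birth risk within strata.
df["stratum"] = pd.qcut(df["pscore"], q=5, labels=False, duplicates="drop")
print(df.groupby("stratum")["preterm"].mean())
```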
Vulnerability to depression can be measured in different ways. Here we examine how genetic risk factors are interrelated for lifetime major depression (MD), self-reported current depressive symptoms, and the personality trait neuroticism.
Method
We obtained data from three population-based adult twin samples (Virginia n = 4672, Australia #1 n = 3598 and Australia #2 n = 1878) to which we fitted a common factor model where risk for ‘broadly defined depression’ was indexed by (i) lifetime MD assessed at personal interview, (ii) depressive symptoms, and (iii) neuroticism. We examined the proportion of genetic risk for MD deriving from the common factor v. specific to MD in each sample and then analyzed them jointly. Structural equation modeling was conducted in Mx.
Results
The best-fit models in all samples included additive genetic and unique environmental effects. The proportions of genetic effects unique to lifetime MD and not shared with the broad depression common factor were estimated at 77%, 61%, and 65% in the three samples, respectively. A cross-sample mega-analysis model fit well and estimated that 65% of the genetic risk for MD was unique.
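In generic common-factor notation (an illustration of the quantity being reported, since the Mx specification is not given in the abstract), the proportion of MD's genetic variance that is unique is:

```latex
\text{proportion unique} = \frac{a_{s}^{2}}{\lambda^{2}a_{c}^{2} + a_{s}^{2}},
```

where λ is MD's loading on the broad depression factor, a_c² the additive genetic variance of that factor, and a_s² the MD-specific additive genetic variance; the mega-analysis estimate of 65% corresponds to this ratio.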
Conclusion
A large proportion of genetic risk factors for lifetime MD was not, in the samples studied, captured by a common factor for broadly defined depression utilizing MD and self-report measures of current depressive symptoms and neuroticism. The genetic substrate for MD may reflect neurobiological processes underlying the episodic nature of its cognitive, motor and neurovegetative manifestations, which are not well indexed by current depressive symptoms or neuroticism.
OBJECTIVES/SPECIFIC AIMS: Smoking during pregnancy (SDP) is associated with negative health outcomes, both proximal (e.g., preterm labor, cardiovascular changes, low birth weight) and distal (e.g., increased child externalizing behaviors and attention deficit/hyperactivity disorder (ADHD) symptoms, increased risk of child smoking). As pregnancy provides a unique, strong incentive to quit smoking, investigating SDP allows analysis of individual factors predictive of recalcitrant smoking behaviors. A female twin-pair cohort provides a model system for characterizing genotype × environment interactions using statistical approaches. METHODS/STUDY POPULATION: Using women from the Missouri Adolescent Female Twin Study, parental report of twin ADHD inattentive and hyperactive symptoms at twin median age 15, and twin report of DSM-IV lifetime diagnosis of major depressive disorder and trauma exposure (physical assault and childhood sexual abuse), collected at median age 22, were merged with Missouri birth record data for enrolled twins, yielding 1553 individuals of European ancestry and 163 individuals of African-American ancestry for the final analyses. An SDP propensity score was calculated from sociodemographic variables (maternal age, marital status, educational attainment, first-born child) and used as a 6-level ordinal covariate in subsequent logistic regressions. RESULTS/ANTICIPATED RESULTS: For European ancestry individuals, parental report of hyperactive ADHD symptoms and exposure to childhood sexual abuse were predictive of SDP, while a lifetime diagnosis of major depressive disorder, parental report of inattentive ADHD symptoms, and exposure to assaultive trauma were not significantly predictive of future SDP. For African-American individuals, none of these variables significantly predicted future SDP. DISCUSSION/SIGNIFICANCE OF IMPACT: Understanding these risk mechanisms is important for clinical understanding of early predictors of SDP and for tailoring interventions to at-risk individuals. Ultimately, the focus of this research is to mitigate risk to pregnant smokers and their children. Additionally, the cohort-ecological approach shows how well research and administrative (vital record) data agree, allowing evaluation of whether administrative data improve prediction in research cohorts and, conversely, whether research data improve prediction over the standard sociodemographic variables available in administrative data.
Important Bird and Biodiversity Areas (IBAs) are sites identified as being globally important for the conservation of bird populations on the basis of an internationally agreed set of criteria. We present the first review of the development and spread of the IBA concept since it was launched by BirdLife International (then ICBP) in 1979 and examine some of the characteristics of the resulting inventory. Over 13,000 global and regional IBAs have so far been identified and documented in terrestrial, freshwater and marine ecosystems in almost all of the world’s countries and territories, making this the largest global network of sites of significance for biodiversity. IBAs have been identified using standardised, data-driven criteria that have been developed and applied at global and regional levels. These criteria capture multiple dimensions of a site’s significance for avian biodiversity and relate to populations of globally threatened species (68.6% of the 10,746 IBAs that meet global criteria), restricted-range species (25.4%), biome-restricted species (27.5%) and congregatory species (50.3%); many global IBAs (52.7%) trigger two or more of these criteria. IBAs range in size from < 1 km² to over 300,000 km² and have an approximately log-normal size distribution (median = 125.0 km², mean = 1,202.6 km²). They cover approximately 6.7% of the terrestrial, 1.6% of the marine and 3.1% of the total surface area of the Earth. The launch in 2016 of the KBA Global Standard, which aims to identify, document and conserve sites that contribute to the global persistence of wider biodiversity, and whose criteria for site identification build on those developed for IBAs, is a logical evolution of the IBA concept. The role of IBAs in conservation planning, policy and practice is reviewed elsewhere. Future technical priorities for the IBA initiative include completion of the global inventory, particularly in the marine environment, keeping the dataset up to date, and improving the systematic monitoring of these sites.
BirdLife International’s Important Bird and Biodiversity Areas (IBA) Programme has identified, documented and mapped over 13,000 sites of international importance for birds. IBAs have been influential with governments, multilateral agreements, businesses and others in: (1) informing governments’ efforts to expand protected area networks (in particular to meet their commitments through the Convention on Biological Diversity); (2) supporting the identification of Ecologically or Biologically Significant Areas (EBSAs) in the marine realm; (3) identifying Wetlands of International Importance under the Ramsar Convention; (4) identifying sites of importance for species under the Convention on Migratory Species and its sister agreements; (5) identifying Special Protected Areas under the EU Birds Directive; (6) applying the environmental safeguards of international finance institutions such as the International Finance Corporation; (7) supporting the private sector to manage environmental risk in its operations; and (8) helping donor organisations like the Critical Ecosystem Partnership Fund (CEPF) to prioritise investment in site-based conservation. The identification of IBAs (and IBAs in Danger: the most threatened of these) has also triggered conservation and management actions at site level, most notably by civil society organisations and local conservation groups. IBA data have therefore been widely used by stakeholders at different levels to help conserve a network of sites essential to maintaining the populations and habitats of birds as well as other biodiversity. The experience of IBA identification and conservation is shaping the design and implementation of the recently launched Key Biodiversity Areas (KBA) Partnership and programme, as IBAs form a core part of the KBA network.