We compare domestic architectural features in New England and the Maritime Peninsula to investigate the relationship between the adoption of horticulture and social and settlement change during the Woodland Period. Horticulture was not practiced on the Maritime Peninsula until after European contact, despite cultural and environmental similarity to New England. In New England, horticulture has been implicated in profound social and settlement changes. However, aggregated villages, a unit typically investigated for evidence of social change, have proven elusive in the archaeological record. We therefore compiled and analyzed a dataset of dwelling features instead of relying on identifiable villages. This novel quantitative approach uses dwelling feature shape and size as a proxy for social and settlement change, considering these changes at the scale of the house. We find that, during the Woodland Period, dwellings were overall slightly larger in New England than on the Maritime Peninsula, but size ranges heavily overlapped. After the introduction of horticulture, however, dwellings in New England grew larger overall and assumed bimodally distinct larger and smaller forms, which likely necessitated a restructuring of social and economic behavior. This pattern correlates maize horticulture with changes in social and economic lifestyle in Late Woodland New England.
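The bimodal size pattern reported here lends itself to a simple mixture-model check. The Python sketch below is purely illustrative: it fits one- and two-component Gaussian mixtures to hypothetical dwelling floor areas and compares them by BIC; the floor-area values and the choice of BIC are assumptions, not the study's actual procedure.

```python
# Illustrative sketch: testing for bimodality in dwelling floor areas.
# All data are synthetic placeholders; this is not the study's dataset.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical floor areas (m^2): a smaller and a larger dwelling mode.
areas = np.concatenate([rng.normal(12, 3, 80), rng.normal(40, 8, 40)])
X = areas.reshape(-1, 1)

# A lower BIC for the two-component fit is consistent with bimodally
# distinct size classes, as described for Late Woodland New England.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in (1, 2)}
print(bics)
```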
Buprenorphine/samidorphan (BUP/SAM), a combination of BUP (a µ-opioid receptor partial agonist and κ-antagonist) and SAM (a sublingually bioavailable µ-opioid antagonist), is an investigational opioid system modulator for depression. BUP/SAM has shown efficacy versus placebo as an adjunctive treatment for major depressive disorder (MDD) and a consistent safety profile in previously reported, placebo-controlled clinical studies.1,2
1. To characterize the safety profile following long-term treatment with BUP/SAM
2. To explore depression symptoms and remission rates in patients with MDD following long-term treatment with BUP/SAM
FORWARD-2 (Clinicaltrials.gov ID: NCT02141399) enrolled patients who had participated in 1 of 4 controlled studies as well as de novo patients. All patients had a confirmed diagnosis of MDD, had a history of inadequate response to standard antidepressant therapies (ADTs), and had been treated with an adequate dose of an established ADT for ≥8 weeks before BUP/SAM initiation. ADT dosage could be titrated, but the ADT could not be changed. During the study, patients received open-label, sublingual BUP/SAM 2 mg/2 mg as adjunctive treatment for up to 52 weeks. Safety (primary objective) was assessed via adverse events (AEs), vital signs, laboratory analytes, and electrocardiography. Suicidal ideation or behavior (SIB) was evaluated by the Columbia Suicide Severity Rating Scale. Abuse potential, dependence, and withdrawal were assessed by AEs and the Clinical Opiate Withdrawal Scale. Exploratory efficacy endpoints included mean Montgomery–Åsberg Depression Rating Scale (MADRS) scores and remission rate (MADRS ≤10).
Of 1454 total patients, 49% completed the 52-week study, 11% discontinued due to an AE, and 40% discontinued because of other reasons as of the interim data cutoff date (April 30, 2017). Most AEs were of mild/moderate severity. Serious AEs were reported in 3.2% of patients. AEs occurring in ≥10% of patients were nausea, headache, constipation, dizziness, and somnolence. There was no evidence of increased risk of SIB with BUP/SAM. Incidence of euphoria-related events was low (1.2%). After abrupt discontinuation of BUP/SAM, there was little evidence of withdrawal. BUP/SAM was not associated with meaningful changes in laboratory or metabolic parameters or in bodyweight. The mean MADRS score decreased from 22.9 (±9.7) at baseline to 9.8 (±8.8) after 52 weeks. The remission rate at 52 weeks was 52.5%.
Long-term treatment with BUP/SAM did not reveal any new safety findings and confirmed that the risk of abuse and dependence with BUP/SAM was low. BUP/SAM maintained an antidepressant effect for up to 52 weeks of treatment in patients with MDD.
Recovery of multidrug-resistant (MDR) Pseudomonas aeruginosa and Klebsiella pneumoniae from a cluster of patients in the medical intensive care unit (MICU) prompted an epidemiologic investigation for a common exposure.
Clinical and microbiologic data from MICU patients were retrospectively reviewed, MICU bronchoscopes underwent culturing and borescopy, and bronchoscope reprocessing procedures were reviewed. Bronchoscope and clinical MDR isolates epidemiologically linked to the cluster underwent molecular typing using pulsed-field gel electrophoresis (PFGE) followed by whole-genome sequencing.
Of the 33 case patients, 23 (70%) were exposed to a common bronchoscope (B1). Both MDR P. aeruginosa and K. pneumoniae were recovered from the bronchoscope’s lumen, and borescopy revealed a luminal defect. Molecular testing demonstrated genetic relatedness among case patient and B1 isolates, providing strong evidence for horizontal bacterial transmission. MDR organism (MDRO) recovery in 19 patients was ultimately linked to B1 exposure, and 10 of 19 patients were classified as belonging to an MDRO pseudo-outbreak.
Surveillance of bronchoscope-derived clinical culture data was important for early detection of this outbreak, and whole-genome sequencing was important for the confirmation of findings. Visualization of bronchoscope lumens to confirm integrity should be a critical component of device reprocessing.
Paleoecological data from the Quaternary Period (2.6 million years ago to present) provides an opportunity for educational outreach for the earth and biological sciences. Paleoecology data repositories serve as technical hubs and focal points within their disciplinary communities and so are uniquely situated to help produce teaching modules and engagement resources. The Neotoma Paleoecology Database provides support to educators from primary schools to graduate students. In collaboration with pedagogical experts, the Neotoma Paleoecology Database team has developed teaching modules and model workflows. Early education is centered on discovery; higher-level educational tools focus on illustrating best practices for technical tasks. Collaborations among pedagogic experts, technical experts and data stewards, centered around data resources such as Neotoma, provide an important role within research communities, and an important service to society, supporting best practices, translating current research advances to interested audiences, and communicating the importance of individual research disciplines.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. The facilitators and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
Structured, empirically supported psychological interventions are lacking for patients who require organ transplantation. This stage IA psychotherapy development project developed and tested the feasibility, acceptability, tolerability, and preliminary efficacy of an 8-week group cognitive behavioral stress management intervention adapted for patients with end-stage liver disease awaiting liver transplantation.
Twenty-nine English-speaking United Network for Organ Sharing–registered patients with end-stage liver disease from a single transplantation center enrolled in an 8-week, group cognitive-behavioral liver stress management and relaxation training intervention adapted for patients with end-stage liver disease. Patients completed pre- and postintervention surveys that included the Beck Depression Inventory II and the Beck Anxiety Inventory. Feasibility, acceptability, tolerability, and preliminary efficacy were assessed.
Attendance rate was 69.40%. Among participants who completed the postintervention survey, 100% rated the intervention as “good” to “excellent” at teaching them new skills to relax and to cope with stress, and 94.12% rated it as helpful in feeling supported while waiting for a liver transplant. No adverse events were recorded over the course of treatment. Attrition was 13.79%. Anxious and depressive symptoms were not statistically different after the intervention.
Significance of results
The liver stress management and relaxation training intervention is feasible, acceptable, and tolerable to end-stage liver disease patients within a transplant clinic setting. Anxious and depressive symptoms remained stable postintervention. Randomized controlled trials are needed to study the intervention's effectiveness in this population.
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from the models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d) on gestation day 9. Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/d norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
To determine if family childcare homes (FCCH) in Nebraska meet best practices for nutrition and screen time, and if focusing on nutrition and screen time policies and practices improves the FCCH environment.
A pre–post evaluation was conducted using the Go Nutrition and Physical Activity Self-Assessment for Childcare (Go NAP SACC).
FCCH in Nebraska, USA.
FCCH enrolled in the Child and Adult Care Food Program (CACFP; n 208) participated in a pre–post evaluation using Go NAP SACC.
At baseline, all FCCH met the minimum childcare standards for fifty-four of fifty-six practices in nutrition and screen time. After the intervention, FCCH demonstrated significant improvement in fourteen of the forty-four Child Nutrition items and eleven of the twelve Screen Time items. However, FCCH providers still did not meet all best practices post-intervention. Lowest scores were found in serving meals family-style, promoting visible support for healthy eating, planned nutrition education and written policy on child nutrition. For screen time, lowest scores were reported on the availability of television, offering families education on screen time and having a written policy on screen time.
FCCH in Nebraska were able to strengthen their policies and practices after utilizing Go NAP SACC. Continued professional development and participation in targeted interventions may assist programmes in sustaining improved practices and policies. Considering the varying standards and policies surrounding FCCH, future studies comparing the current findings with childcare centres and non-CACFP programmes are warranted.
OBJECTIVES/SPECIFIC AIMS: The objective of this study is to use machine learning techniques to generate maps of epithelium and lumen density in MRI space. METHODS/STUDY POPULATION: We prospectively recruited 39 patients undergoing prostatectomy for this institutional review board (IRB) approved study. Patients underwent MP-MRI before prostatectomy on a 3T field strength MRI scanner (General Electric, Waukesha, WI, USA) using an endorectal coil. MP-MRI included field-of-view optimized and constrained undistorted single shot (FOCUS) diffusion weighted imaging with 10 b-values (b=0, 10, 25, 50, 80, 100, 200, 500, 1000, and 2000), dynamic contrast enhanced imaging, and T2-weighted imaging. T2-weighted images were intensity normalized and apparent diffusion coefficient maps were calculated. The dynamic contrast enhanced data were used to calculate the percent change in signal intensity before and after contrast injection. All images were aligned to the T2-weighted image. Robotic prostatectomy was performed 2 weeks after image acquisition. Prostate samples were sliced using a 3D printed slicing jig matching the slice profile of the T2-weighted image. Whole mount samples at 10 μm thickness were taken, hematoxylin and eosin stained, digitized, and annotated by a board certified pathologist. A total of 210 slides were included in this study. Lumen and epithelium were automatically segmented using a custom algorithm written in MATLAB. The algorithm was validated by comparing manual to automatic segmentation on 18 samples. Slides were aligned with the T2-weighted image using a nonlinear control point warping technique. Lumen and epithelium density and the expert annotation were subsequently transformed into MRI space. Co-registration was validated by applying a known warp to tumor masks noted by the pathologist and control point warping the whole mount slide to match the transform. Overlap was measured using a DICE coefficient. A learning curve was generated to determine the optimal number of patients to train the algorithm on. A partial least squares (PLS) algorithm was trained on 150 random permutations of patients, incrementing the training set from 1 to 29 patients. Slides were stratified such that all slides from a single patient were in the same cohort. Three cohorts were generated, with tumor burden balanced across all cohorts. A PLS algorithm was trained on 2 independent training sets (cohorts 1 and 2) and applied to cohort 3. The input vector consisted of MRI values and the target variables were lumen and epithelium density. The algorithm was trained lesion-wise. Trained predictive cytological topography (PiCT) models were applied to the test cohort voxel-wise to generate 2 new image contrasts. Mean lesion values were compared among high grade, low grade, and healthy tissue using an ANOVA. An ROC analysis was performed lesion-wise on the test set. RESULTS/ANTICIPATED RESULTS: The segmentation accuracy validation revealed R=0.99 and R=0.72 (p<0.001) for lumen and epithelium, respectively. The co-registration accuracy revealed a 94.5% overlap. The learning curve stabilized at 10 patients with a root mean square error of 0.14; thus, the size of the 2 independent training cohorts was set to 10, leaving 19 patients for the test cohort. DISCUSSION/SIGNIFICANCE OF IMPACT: We present a technique for combining radiology and pathology with machine learning for generating PiCT maps of cellularity and lumen density in the prostate.
The voxel-wise approach to mapping cellular features generates 2 new interpretable image contrasts, which can potentially increase confidence in diagnosis or guide biopsy and radiation treatment.
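For readers wanting a concrete picture of the voxel-wise PLS step, the following Python sketch mirrors the described setup: an input matrix of MRI values per voxel and a two-column target of lumen and epithelium density. The feature dimensions, component count, and data are hypothetical placeholders, not the study's actual configuration.

```python
# Minimal sketch of voxel-wise PLS mapping from MRI features to
# lumen/epithelium density. Synthetic data; all shapes are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(5000, 12))        # per-voxel MRI values (e.g., T2, ADC, DCE)
Y_train = rng.uniform(0, 1, size=(5000, 2))  # targets: lumen and epithelium density

pls = PLSRegression(n_components=4)          # component count is illustrative
pls.fit(X_train, Y_train)

# Applying the trained model voxel-wise yields two new image contrasts:
# predicted lumen density and predicted epithelium density.
X_test = rng.normal(size=(5000, 12))
density_maps = pls.predict(X_test)           # shape: (n_voxels, 2)
```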
Field identification of ST-elevation myocardial infarction (STEMI) and advanced hospital notification decreases first-medical-contact-to-balloon (FMC2B) time. A recent study in this system found that electrocardiogram (ECG) transmission following a STEMI alert was frequently unsuccessful.
Instituting weekly test ECG transmissions from paramedic units to the hospital would increase successful transmission of ECGs and decrease FMC2B and door-to-balloon (D2B) times.
This was a natural experiment of consecutive patients with field-identified STEMI transported to a single percutaneous coronary intervention (PCI)-capable hospital in a regional STEMI system before and after implementation of scheduled test ECG transmissions. In November 2014, paramedic units began weekly test transmissions. The mobile intensive care nurse (MICN) confirmed the transmission, or if not received, contacted the paramedic unit and the department’s nurse educator to identify and resolve the problem. Per system-wide protocol, paramedics transmit all ECGs with interpretation of STEMI. Receiving hospitals submit patient data to a single registry as part of ongoing system quality improvement. The frequency of successful ECG transmission and time to intervention (FMC2B and D2B times) in the 18 months following implementation was compared to the 10 months prior. Post-implementation, the time the ECG transmission was received was also collected to determine the transmission gap time (time from ECG acquisition to ECG transmission received) and the advanced notification time (time from ECG transmission received to patient arrival).
There were 388 patients with field ECG interpretations of STEMI, 131 pre-intervention and 257 post-intervention. The frequency of successful transmission post-intervention was 73% compared to 64% prior; risk difference (RD)=9%; 95% CI, 1-18%. In the post-intervention period, the median FMC2B time was 79 minutes (inter-quartile range [IQR]=68-102) versus 86 minutes (IQR=71-108) pre-intervention (P=.3) and the median D2B time was 59 minutes (IQR=44-74) versus 60 minutes (IQR=53-88) pre-intervention (P=.2). The median transmission gap was three minutes (IQR=1-8) and median advanced notification time was 16 minutes (IQR=10-25).
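As a quick check of the transmission result, the risk difference is simply the post- minus pre-intervention success proportion. The sketch below back-calculates approximate counts from the reported percentages and applies a standard Wald interval; because the exact counts and CI method are not given here, the interval differs slightly from the published 1-18%.

```python
# Risk difference with a Wald 95% CI. Counts are back-calculated from
# the reported percentages, so results are approximate.
from math import sqrt

def risk_difference(x1, n1, x0, n0, z=1.96):
    p1, p0 = x1 / n1, x0 / n0
    rd = p1 - p0
    se = sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, (rd - z * se, rd + z * se)

# Post: ~73% of 257 transmissions successful; pre: ~64% of 131.
rd, ci = risk_difference(round(0.73 * 257), 257, round(0.64 * 131), 131)
print(f"RD = {rd:.0%}, 95% CI ({ci[0]:.0%}, {ci[1]:.0%})")  # RD ≈ 9%
```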
Implementation of weekly test ECG transmissions was associated with improvement in successful real-time transmissions from field to hospital, which provided a median advanced notification time of 16 minutes, but no decrease in FMC2B or D2B times.
The correlation between ATP concentration and bacterial burden in the patient care environment was assessed. These findings suggest that a correlation exists between ATP concentration and bacterial burden, and they generally support ATP technology manufacturer-recommended cutoff values. Despite relatively modest discriminative ability, this technology may serve as a useful proxy for cleanliness.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal, and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
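The core comparison here is a linear regression of education years on zygosity. A minimal Python sketch of that model, run separately by sex, is shown below; the column names and toy data are hypothetical, and the actual analyses additionally handled birth-cohort effects and the clustering of twins within pairs.

```python
# Illustrative OLS of education years on zygosity (DZ as reference),
# fit separately by sex. Data are toy placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "edu_years": [12, 14, 16, 11, 13, 15, 10, 17],
    "zygosity":  ["MZ", "DZ", "MZ", "DZ", "MZ", "DZ", "MZ", "DZ"],
    "sex":       ["M", "M", "M", "M", "F", "F", "F", "F"],
})

for sex, grp in df.groupby("sex"):
    # The coefficient on zygosity estimates the MZ-DZ difference in years.
    fit = smf.ols("edu_years ~ C(zygosity, Treatment('DZ'))", data=grp).fit()
    print(sex, dict(fit.params.round(2)))
```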
The purpose of this article was to examine the psychometric properties of the Crisis Counseling Assistance and Training Program (CCP) data collection instrument, the Individual/Family Encounter Log (IFEL). Data collected from disaster survivors included how they reacted to events in emotional, behavioral, physical, and cognitive domains. These domains are based on conceptual categorization of event reactions and allow CCP staff to provide survivors with referrals to appropriate behavioral health support resources, if warranted.
This study explored the factor structure of these survey items to determine how best to use the available information as a screen of disaster-related behavioral health indicators. Specifically, our first research question explored and confirmed the optimal factor structure of the event reaction items, and our second question examined whether the new factor structure was similar across disaster types: hurricanes, tornadoes, floods, and wildfires. Using a factor analytic technique, we tested whether our event reaction outcomes achieved consistent and reliable measurement across different disaster situations. Finally, we assessed how the new subscales were correlated with the type of risk to which CCP disaster survivors were exposed.
Our analyses revealed 3 factors: (1) depressive-like, (2) anxiety-like, and (3) somatic. In addition, we found that these factors were coherent for hurricanes, floods, and wildfires, although the basic factor structure was not equivalent for tornadoes.
Implications for use of the IFEL in disaster preparedness, response, and recovery are discussed. (Disaster Med Public Health Preparedness. 2016;10:822–831)
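To make the factor-analytic step concrete, here is a minimal Python sketch extracting three factors from item-level reaction data, echoing the three-factor solution reported above. The item matrix is a synthetic placeholder, and the real analyses involved confirmatory models and invariance testing across disaster types rather than this single exploratory fit.

```python
# Illustrative three-factor extraction from event-reaction items.
# Synthetic binary items; the actual IFEL analyses were more involved.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = rng.integers(0, 2, size=(500, 12)).astype(float)  # 12 reaction items

fa = FactorAnalysis(n_components=3, rotation="varimax").fit(items)
print(fa.components_.round(2))  # loadings: which items define each factor
```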
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin, respectively. The finished surveys cover the declination range between +10 and −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five-sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
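The variance contrast reported above (greater BMI variance in DZ than MZ twins) is the kind of result a homogeneity-of-variance test captures. A brief Python sketch with synthetic data follows; the means, SDs, and sample sizes are invented for illustration only.

```python
# Illustrative variance comparison of BMI between MZ and DZ twins
# using Levene's test. All values are synthetic placeholders.
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(0)
bmi_mz = rng.normal(17.0, 2.0, 1000)  # hypothetical childhood BMI, MZ twins
bmi_dz = rng.normal(17.2, 2.3, 1000)  # slightly larger spread for DZ twins

stat, p = levene(bmi_mz, bmi_dz)
print(f"Levene W = {stat:.2f}, p = {p:.3g}")
```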
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits, including whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual-level data on height and weight, including repeated measurements, birth-related traits, background variables, education and smoking. By the end of 2014, 48 projects had participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.