OBJECTIVES/GOALS: Contingency management (CM) procedures yield measurable reductions in cocaine use. This poster describes a trial aimed at using CM as a vehicle to demonstrate the biopsychosocial health benefits of reduced use, rather than total abstinence, the currently accepted metric of treatment efficacy. METHODS/STUDY POPULATION: In this 12-week, randomized controlled trial, CM was used to reduce cocaine use and to evaluate associated improvements in cardiovascular, immune, and psychosocial well-being. Adults aged 18 and older who sought treatment for cocaine use (N=127) were randomized in a 1:1:1 ratio to High Value ($55) or Low Value ($13) CM incentives for cocaine-negative urine samples, or to a non-contingent control group. Participants completed outpatient sessions three days per week across the 12-week intervention period, totaling 36 clinic visits, plus four post-treatment follow-up visits. During each visit, participants provided observed urine samples and completed several assays of biopsychosocial health. RESULTS/ANTICIPATED RESULTS: Preliminary findings from generalized linear mixed-effects modeling demonstrate the feasibility of the CM platform. Abstinence rates from cocaine use were significantly greater in the High Value group (47% negative; OR = 2.80; p = 0.01) relative to the Low Value (23% negative) and Control (24% negative) groups. In the planned primary analysis, the level of cocaine use reduction based on cocaine-negative urine samples will serve as the primary predictor of cardiovascular (e.g., endothelin-1 levels), immune (e.g., IL-10 levels), and psychosocial (e.g., Addiction Severity Index) outcomes using results from the fitted models. DISCUSSION/SIGNIFICANCE: This research will advance the field by prospectively and comprehensively demonstrating the beneficial effects of reduced cocaine use. These outcomes can, in turn, support the adoption of reduced cocaine use as a viable alternative endpoint in cocaine treatment trials.
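As an illustration of the analytic setup (not the authors' code), the sketch below fits a repeated-measures logistic model to visit-level urine results, using GEE with an exchangeable working correlation as a stand-in for the generalized linear mixed model; the data are simulated to loosely match the reported group rates, and all names and values are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.genmod.cov_struct import Exchangeable

rng = np.random.default_rng(0)
n, visits = 127, 36                       # participants and clinic visits
group = rng.choice(["High", "Low", "Control"], size=n)
p_neg = {"High": 0.47, "Low": 0.23, "Control": 0.24}   # reported rates

df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), visits),
    "group": np.repeat(group, visits),
})
df["negative"] = rng.binomial(1, df["group"].map(p_neg))

# Repeated-measures logistic model; exponentiated coefficients are ORs
# (with these rates, High vs Control comes out near the reported 2.8)
res = smf.gee("negative ~ C(group, Treatment('Control'))",
              groups="subject", data=df,
              family=sm.families.Binomial(),
              cov_struct=Exchangeable()).fit()
print(np.exp(res.params))
```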
Homeless shelter residents and staff may be at higher risk of SARS-CoV-2 infection. However, SARS-CoV-2 infection estimates in this population have relied on cross-sectional or outbreak investigation data. We conducted routine surveillance and outbreak testing in 23 homeless shelters in King County, Washington, to estimate the occurrence of laboratory-confirmed SARS-CoV-2 infection and associated risk factors during 1 January 2020–31 May 2021. Symptom surveys and nasal swabs were collected for SARS-CoV-2 testing by RT-PCR for residents aged ≥3 months and staff. We collected 12,915 specimens from 2,930 unique participants. We identified 4.74 (95% CI 4.00–5.58) SARS-CoV-2 infections per 100 individuals (residents: 4.96, 95% CI 4.12–5.91; staff: 3.86, 95% CI 2.43–5.79). Most infections were asymptomatic at the time of detection (74%) and detected during routine surveillance (73%). Outbreak testing yielded higher test positivity than routine surveillance (2.7% versus 0.9%). Among those infected, residents were less likely to report symptoms than staff. Participants who were vaccinated against seasonal influenza, and those who were current smokers, had lower odds of having an infection detected. Active surveillance that includes SARS-CoV-2 testing of all persons is essential for ascertaining the true burden of SARS-CoV-2 infections among residents and staff of congregate settings.
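The headline figure is a proportion with a confidence interval; the sketch below infers the infection count (139) from the reported 4.74 per 100 among 2,930 participants and approximately reproduces the published interval with a Clopper-Pearson (exact binomial) method, which is an assumption since the abstract does not name the method used.

```python
from statsmodels.stats.proportion import proportion_confint

infections, participants = 139, 2930   # count inferred from 4.74 per 100
lo, hi = proportion_confint(infections, participants, method="beta")
print(f"{100 * infections / participants:.2f} per 100 "
      f"(95% CI {100 * lo:.2f}-{100 * hi:.2f})")
# ~4.74 per 100 (95% CI ~4.00-5.59), close to the reported 4.00-5.58
```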
Dropout from psychotherapy for borderline personality disorder (BPD) is a notorious problem. We investigated whether treatment, treatment format, treatment setting, substance-use exclusion criteria, proportion of males, mean age, country, and other variables influenced dropout.
Methods
From PubMed, Embase, Cochrane, PsycINFO, and other sources, 111 studies (159 treatment arms, N = 9100) of psychotherapy for non-forensic adult patients with BPD were included. Dropout per quarter during one year of treatment was analyzed at the participant level with multilevel survival analysis, to handle multiple predictors, non-constant dropout risk over time, and censored data. Multiple imputation was used to estimate the quarter of dropout when unreported. Sensitivity analyses were performed by excluding DBT arms with deviating push-out rules.
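For readers unfamiliar with the design, the sketch below shows the core of a discrete-time (per-quarter) survival analysis: expand each participant into one row per quarter at risk and fit a logistic model for the dropout hazard. The data and arm labels are hypothetical, and this single-level model omits the study-level random effects and imputation of the published analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
arm = rng.choice(["schema", "DBT", "MBT"], size=n)
# quarter of dropout (1-4), or 0 if retained for the full year
drop_q = rng.choice([0, 1, 2, 3, 4], size=n, p=[0.6, 0.2, 0.1, 0.06, 0.04])

# Person-period expansion: one row per participant per quarter at risk
rows = []
for i in range(n):
    last = drop_q[i] if drop_q[i] else 4
    for q in range(1, last + 1):
        rows.append({"arm": arm[i], "quarter": q,
                     "event": int(q == drop_q[i])})
pp = pd.DataFrame(rows)

# Discrete-time hazard: dropout odds vary by quarter and by treatment arm
fit = smf.logit("event ~ C(quarter) + C(arm)", data=pp).fit(disp=0)
print(fit.summary())
```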
Results
Dropout was highest in the first quarter of treatment. Schema therapy had the lowest dropout overall, and mentalization-based treatment the lowest in the first two quarters. Community treatment by experts had the highest dropout. Moreover, individual therapy had the lowest dropout and group therapy the highest, with combined formats in between. Other variables, such as age or substance-use exclusion criteria, were not associated with dropout.
Conclusion
The findings do not support claims that all treatments are equal, and indicate that efforts to reduce dropout should focus on early stages of treatment and on group treatment.
OBJECTIVES/GOALS: The goal of this study was to understand the impact of a high-sodium diet on gene networks in the kidney that correlate with blood pressure (BP) in female primates, and to translate findings to women. METHODS/STUDY POPULATION: Sodium-naïve female baboons (n=7) were fed a low-sodium (LS) diet for 6 weeks followed by a high-sodium (HS) diet for 6 weeks. Sodium intake, serum 17 beta-estradiol, and ultrasound-guided kidney biopsies for RNA-Seq were collected at the end of each diet. Blood pressure was continuously measured for 64-hour periods throughout the study by implantable telemetry devices. Weighted gene co-expression network analysis (WGCNA) was performed on RNA-Seq data to identify transcripts correlated with BP on each diet. Network analysis was performed on transcripts highly correlated with BP, and in silico findings were validated by immunohistochemistry of kidney tissues. RESULTS/ANTICIPATED RESULTS: On the LS diet, Na+ intake and serum 17 beta-estradiol concentration correlated with BP. Cell-type composition of renal biopsies was consistent among all animals for both diets. Kidney transcriptomes differed by diet; unbiased WGCNA revealed modules of genes correlated with BP on the HS diet. Network analysis of module genes showed causal networks linking hormone receptors, proliferation and differentiation, methylation, hypoxia, insulin and lipid regulation, and inflammation as regulators underlying variation in BP on the HS diet. Our results show that variation in BP correlated with novel kidney gene networks with master regulators PPARG and MYC in female baboons on an HS diet. DISCUSSION/SIGNIFICANCE: Previous studies in primates to identify molecular networks dysregulated by an HS diet focused on males. Current clinical guidelines do not offer sex-specific treatment plans for sodium-sensitive hypertension. This study leveraged variation in BP as a first step to identify correlated kidney regulatory gene networks in female primates after an HS diet.
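A minimal sketch of the WGCNA-style step that links a gene module to BP: summarise the module by its eigengene (the first principal component of the module's standardised expression) and correlate it with BP. The data here are simulated, matched only in sample size (n=7); the eigengene's sign is arbitrary.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
bp = rng.normal(120, 10, size=7)                 # hypothetical BP values
module_expr = rng.normal(size=(7, 50)) + 0.1 * bp[:, None]  # 50-gene module

# Module eigengene = first PC of the standardised expression matrix
z = (module_expr - module_expr.mean(0)) / module_expr.std(0)
eigengene = PCA(n_components=1).fit_transform(z).ravel()
r, p = pearsonr(eigengene, bp)
print(f"module-BP correlation: r = {r:.2f}, p = {p:.3f}")
```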
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand, positioned to enable triangulation of observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the impressive capability of a networked approach to Space Surveillance and Tracking.
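The geometry behind triangulating such observations can be sketched as a least-squares intersection of lines of sight: with station positions o_i and unit pointing directions d_i, the point p minimising the summed squared perpendicular distances satisfies sum_i (I - d_i d_i^T) p = sum_i (I - d_i d_i^T) o_i. The toy example below (not the authors' pipeline, and with made-up station geometry) recovers a known target:

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares point closest to a set of lines of sight.

    origins: (k, 3) station positions; directions: (k, 3) unit vectors.
    Solves sum_i (I - d_i d_i^T) p = sum_i (I - d_i d_i^T) o_i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Toy check: three stations observing a target at (0, 0, 400) km
target = np.array([0.0, 0.0, 400.0])
origins = np.array([[0, 0, 0], [500, 0, 0], [0, 500, 0]], float)
directions = target - origins
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
print(triangulate(origins, directions))   # ~ [0, 0, 400]
```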
Parasites sometimes expand their host range and cause new disease aetiologies. Genetic changes can then occur due to host-specific adaptive alterations, particularly when parasites cross between evolutionarily distant hosts. Characterizing genetic variation in Cryptosporidium from humans and other animals may have important implications for understanding disease dynamics and transmission. We analyse sequences from four loci (gp60, HSP-70, COWP and actin) representing multiple Cryptosporidium species reported in humans. We predicted low genetic diversity in species that present unusual human infections, due to founder events and bottlenecks. High genetic diversity was observed in human isolates of Cryptosporidium meleagridis, Cryptosporidium cuniculus, Cryptosporidium hominis and Cryptosporidium parvum. Deviations from neutrality expectations, assessed using Tajima's D, were observed in C. cuniculus and C. meleagridis. The high genetic diversity in C. meleagridis and C. cuniculus did not match our expectations, but the deviations from neutrality indicate a recent decrease in genetic variability through a population bottleneck after an expansion event. Cryptosporidium hominis also showed a significant positive Tajima's D value, likely caused by recent population expansion of unusual genotypes in humans. These insights indicate that changes in genetic diversity can help us understand host-parasite adaptation and evolution.
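Tajima's D contrasts mean pairwise diversity (π) with the diversity expected from the number of segregating sites (S/a1); a significantly positive value, as reported here for C. hominis, reflects an excess of intermediate-frequency variants. Below is a self-contained sketch of the standard calculation (constants per Tajima 1989) on a toy 0/1 haplotype matrix; real input would be aligned sequences from the four loci.

```python
import numpy as np

def tajimas_d(haps):
    """Tajima's D from a 0/1 haplotype matrix (n sequences x L sites)."""
    n = haps.shape[0]
    p = haps.mean(axis=0)                       # per-site allele frequency
    seg = (p > 0) & (p < 1)
    S = int(seg.sum())                          # segregating sites
    pi = (2 * p * (1 - p) * n / (n - 1))[seg].sum()  # mean pairwise diffs

    i = np.arange(1, n)
    a1, a2 = (1 / i).sum(), (1 / i ** 2).sum()
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n ** 2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1, e2 = c1 / a1, c2 / (a1 ** 2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

rng = np.random.default_rng(3)
toy = rng.integers(0, 2, size=(20, 200))   # intermediate-frequency alleles
print(tajimas_d(toy))                      # positive D for such data
```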
We present 0.″2–0.″4 resolution ALMA images of the submillimeter dust continuum and the CO, H2O, and H2O+ line emission in a z = 3.63 strongly lensed dusty starburst. We construct the lens model for the system with an MCMC technique. While the average magnification for the dust continuum is about 11, the magnification of the line emission varies from 5 to 22 across the source, resolving the source down to sub-kpc scales. The ISM content reveals that it is a pre-coalescence major merger of two ultra-luminous infrared galaxies, both with large molecular gas reservoirs. The approaching galaxy in the south shows no apparent kinematic structure, with a half-light radius of 0.4 kpc, while the receding one resembles a 1.2 kpc rotating disk; the two are separated by a projected distance of 1.3 kpc. The distribution of dust and gas emission suggests a large amount of cold ISM concentrated in the interacting region.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Current established protocols (e.g. RUSH and ACES) were developed from expert user opinion rather than objective, prospective data. Recently the SHoC Protocol was published, recommending 3 core scans: cardiac, lung, and IVC, plus other scans when indicated clinically. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial, to assess whether the recommended 3 core SHoC protocol scans were chosen appropriately for this population. Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP<100 mmHg or shock index>1) who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details and study findings were collected prospectively. A threshold incidence for positive findings of 10% was established as significant for the purposes of assessing the appropriateness of the core recommendations. Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal, and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%). Conclusion: The 3 core SHoC Protocol recommendations included appropriate scans to detect all pathologies recorded at a rate of greater than 10 percent. The 3 most frequent findings were cardiac and IVC abnormalities, followed by lung. It is noted that peritoneal fluid was seen at a rate of 9%. Aortic aneurysms were rare. These data from the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 mmHg or shock index>1), randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data by Student's t-test and multilevel logistic regression (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately. Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefit with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
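The primary-outcome comparison reduces to a 2x2 table, and with identical arms the reported p-value follows immediately; a sketch using the counts from the abstract:

```python
from scipy.stats import fisher_exact

# 30-day/discharge mortality: 32 of 129 in each arm
table = [[32, 129 - 32],
         [32, 129 - 32]]
odds_ratio, p = fisher_exact(table)
print(p)   # 1.0, matching the reported p=1.00
```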
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons used a t-test as well as stratified binomial logistic regression to assess for any significant improvement in resuscitation among these outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95% CI 1365-1950) and PoCUS groups (1609 ml; 95% CI 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard care in hypotensive ED patients. No significant difference in fluid administered or in markers of resuscitation was found when comparing the use of a PoCUS protocol with standard care in the resuscitation of patients with undifferentiated hypotension.
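The fluid-volume comparison can be roughly reconstructed from the reported summaries alone, assuming about 129 patients per arm and normal-approximation CIs (both assumptions; neither is stated in this abstract):

```python
import numpy as np
from scipy.stats import ttest_ind_from_stats

n = 129                      # assumed per-arm size (not stated per arm here)
# Back out SDs from the reported 95% CIs: half-width = 1.96 * SD / sqrt(n)
sd_ctrl = (1950 - 1365) / 2 / 1.96 * np.sqrt(n)
sd_pocus = (1832 - 1385) / 2 / 1.96 * np.sqrt(n)
t, p = ttest_ind_from_stats(1658, sd_ctrl, n, 1609, sd_pocus, n)
print(f"t = {t:.2f}, p = {p:.2f}")   # p ~ 0.79, consistent with the abstract
```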
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes, including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 South African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group: 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00). Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians' perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
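The relative risk for change in perceived shock category follows directly from the 2x2 counts; the standard log-method (Katz) interval reproduces the reported values:

```python
import numpy as np

def risk_ratio(a, n1, b, n2, z=1.96):
    """Risk ratio with a log-method (Katz) 95% confidence interval."""
    rr = (a / n1) / (b / n2)
    se = np.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    return rr, rr * np.exp(-z * se), rr * np.exp(z * se)

# Perceived shock category changed: PoCUS 20/127 vs control 7/125
print(risk_ratio(20, 127, 7, 125))   # ~ (2.81, 1.23, 6.42) as reported
```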
The public health threat posed by zoonotic Plasmodium knowlesi appears to be growing: it is increasingly reported across South East Asia, and is the leading cause of malaria in Malaysian Borneo. Plasmodium knowlesi threatens progress towards malaria elimination as aspects of its transmission, such as spillover from wildlife reservoirs and reliance on outdoor-biting vectors, may limit the effectiveness of conventional methods of malaria control. The development of new quantitative approaches that address the ecological complexity of P. knowlesi, particularly through a focus on its primary reservoir hosts, will be required to control it. Here, we review what is known about P. knowlesi transmission, identify key knowledge gaps in the context of current approaches to transmission modelling, and discuss the integration of these approaches with clinical parasitology and geostatistical analysis. We highlight the need to incorporate the influences of fine-scale spatial variation, rapid changes to the landscape, and reservoir population and transmission dynamics. The proposed integrated approach would address the unique challenges posed by malaria as a zoonosis, aid the identification of transmission hotspots, provide insight into the mechanistic links between incidence and land use change and support the design of appropriate interventions.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupational stress widely considered responsible for the recent rise in the military suicide rate.
Method
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
Results
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than when never deployed or previously deployed (41.2 and 39.1/100 000 person-years, respectively), whereas the suicide rate of other soldiers was significantly higher when currently or previously deployed (20.2 and 22.4/100 000 person-years) than when never deployed (14.5/100 000 person-years). As a result, the adjusted suicide rate of infantrymen and combat engineers was most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of infantrymen and combat engineers by deployment status.
Conclusions
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
During improved oil recovery (IOR), gas may be introduced into a porous reservoir filled with surfactant solution in order to form foam. A model for the evolution of the resulting foam front known as ‘pressure-driven growth’ is analysed. An asymptotic solution of this model for long times is derived that shows that foam can propagate indefinitely into the reservoir without gravity override. Moreover, ‘pressure-driven growth’ is shown to correspond to a special case of the more general ‘viscous froth’ model. In particular, it is a singular limit of the viscous froth, corresponding to the elimination of a surface tension term, permitting sharp corners and kinks in the predicted shape of the front. Sharp corners tend to develop from concave regions of the front. The principal solution of interest has a convex front, however, so that although this solution itself has no sharp corners (except for some kinks that develop spuriously owing to errors in a numerical scheme), it is found nevertheless to exhibit milder singularities in front curvature, as the long-time asymptotic analytical solution makes clear. Numerical schemes for the evolving front shape which perform robustly (avoiding the development of spurious kinks) are also developed. Generalisations of this solution to geologically heterogeneous reservoirs should exhibit concavities and/or sharp corner singularities as an inherent part of their evolution: propagation of fronts containing such ‘inherent’ singularities can be readily incorporated into these numerical schemes.
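Purely as an illustration, the sketch below marches marker points under the dimensionless form commonly quoted for pressure-driven growth, in which each front element advances along the front normal at speed (1 - y)/s, with y the depth and s the path length that element has travelled (this model form is an assumption of the sketch, not taken from the abstract). A naive scheme like this is exactly the kind that can develop the spurious kinks mentioned above; the paper's robust schemes require more care.

```python
import numpy as np

m = 200
y = np.linspace(0.0, 0.99, m)    # depth; driving pressure vanishes at y = 1
s0 = 0.05
x = np.full(m, s0)               # front starts as a shallow vertical line
s = np.full(m, s0)               # path length travelled by each element

dt, t_end, t = 1e-4, 1.0, 0.0
while t < t_end:
    tx, ty = np.gradient(x), np.gradient(y)   # polyline tangents
    norm = np.hypot(tx, ty)
    nx, ny = ty / norm, -tx / norm            # unit normal, into reservoir
    speed = (1.0 - y) / s                     # assumed model form
    x += dt * speed * nx
    y += dt * speed * ny
    s += dt * speed
    y[0] = 0.0                                # apex slides along the top
    t += dt

# Top of the front outruns the bottom; with no surface-tension term
# (the viscous-froth limit discussed above), nothing smooths kinks.
print(f"x(top) = {x[0]:.3f}, x(bottom) = {x[-1]:.3f}")
```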
It has been postulated that aging is the consequence of an accelerated accumulation of somatic DNA mutations and that subsequent errors in the primary structure of proteins ultimately reach levels sufficient to affect organismal functions. The technical limitations of detecting somatic changes and the lack of insight into the minimum level of erroneous proteins needed to cause an error catastrophe have hampered any firm conclusions on these theories. In this study, we sequenced the whole genome from whole-blood DNA of two pairs of monozygotic (MZ) twins, 40 and 100 years old, on two independent next-generation sequencing (NGS) platforms (Illumina and Complete Genomics). Potentially discordant single-base substitutions supported by both platforms were validated extensively by Sanger, Roche 454, and Ion Torrent sequencing. We demonstrate that the genomes of the two twin pairs are germ-line identical between co-twins, and that the genomes of the 100-year-old MZ twins are distinguished by eight confirmed somatic single-base substitutions, five of which are within introns. Putative somatic variation between the 40-year-old twins was not confirmed in the validation phase. We conclude from this systematic effort that, by using two independent NGS platforms, somatic single-nucleotide substitutions can be detected, and that a century of life did not result in a large number of detectable somatic mutations in blood. The low number of somatic variants observed by using two NGS platforms might provide a framework for detecting disease-related somatic variants in phenotypically discordant MZ twins.
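The cross-platform filter described above amounts to intersecting candidate sets: a candidate somatic substitution proceeds to validation only if both platforms call it discordant between co-twins. A toy sketch with made-up coordinates:

```python
# Hypothetical candidate sites (chrom, pos, ref, alt); values are made up.
illumina_discordant = {("chr1", 1_234_567, "A", "G"),
                       ("chr7", 89_012, "C", "T")}
cg_discordant = {("chr1", 1_234_567, "A", "G"),
                 ("chr12", 4_567_890, "G", "T")}

# Only sites supported by both platforms go forward to validation
candidates = illumina_discordant & cg_discordant
print(candidates)
```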
Biological reference points (BRPs) in fisheries policy are typically sensitive to stock assessment model assumptions, increasing uncertainty in harvest decision-making and potentially blocking adoption of precautionary harvest policies. A collaborative management strategy evaluation approach and closed-loop simulation modelling were used to evaluate the expected economic and conservation performance of the sablefish (Anoplopoma fimbria) fishery in British Columbia (Canada) in the presence of uncertainty about BRPs. Comparison of two precautionary harvest control rules, each of which complied with biological conservation objectives and short-term economic objectives given by industry, suggested that both rules were likely to avert biomass decline below limit BRPs, even when stock biomass and production were persistently overestimated by stock assessment models. The slightly less conservative, industry-preferred harvest control rule also avoided short-term economic losses of c. CAN$ 2.7–10 million annually, or 10–50% of current landed value. Distinguishing between the role of BRPs in setting fishery conservation objectives and that of operational control points that define harvest control rules improved the flexibility of the sablefish management system, and has led to adoption of precautionary management procedures.
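A generic example of the kind of precautionary harvest control rule evaluated in such closed-loop simulations is a hockey-stick rule: no fishing below the limit reference point, a linear ramp between control points, and a cap above. The parameter values below are hypothetical, not the sablefish management procedure's actual control points.

```python
def hockey_stick_hcr(biomass, lrp, usr, f_max):
    """Harvest rate as a function of estimated biomass (generic sketch):
    F = 0 below the limit reference point (LRP), a linear ramp up to the
    upper stock reference (USR), and F = f_max above it."""
    if biomass <= lrp:
        return 0.0
    if biomass >= usr:
        return f_max
    return f_max * (biomass - lrp) / (usr - lrp)

# Hypothetical control points (as fractions of unfished biomass)
for b in (0.2, 0.5, 0.9):
    print(b, hockey_stick_hcr(b, lrp=0.4, usr=0.8, f_max=0.08))
```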
Pelagic ecosystems and their fisheries are of particular economic and social importance to the countries and territories of the Wider Caribbean. In some countries (e.g. Barbados, Grenada) commercial pelagic fisheries already contribute significantly to total landings and to foreign exchange earnings from seafood exports. Ports and postharvest facilities service the vessels, which range from artisanal canoes to industrial longliners, and their catch, which often reaches tourists as well as locals (Mahon and McConney 2004). In other places where the focus has previously been on inshore and demersal fisheries (e.g. Antigua and Barbuda, Belize) there is growing interest in the potential of pelagic fisheries development. This potential lies not only in commercial fisheries, but also in the high-revenue and conservation-aware recreational fisheries well established in a few locations (e.g. Puerto Rico, Costa Rica) and undertaken at a lower level in many others.
Underlying all of this is the complexity that arises because many of the valued pelagic species are migratory or highly migratory shared and straddling stocks, falling under the 1995 United Nations Fish Stocks Agreement and subject to several international instruments and management regimes, such as those of the International Commission for the Conservation of Atlantic Tunas (ICCAT). The web of linkages across Caribbean marine jurisdictions and organizations is complex (McConney et al. 2007). The related issues call for an ecosystem approach (McConney and Salas Chapter 7; Schuhmann et al. Chapter 8), and some progress has already been made at multiple levels (Fanning and Oxenford Chapter 16; Singh-Renton et al. Chapter 14).
This synthesis chapter presents the outputs of facilitated symposium sessions specifically related to achieving and implementing a shared vision for the pelagic ecosystem in marine ecosystem-based management (EBM) in the Wider Caribbean. The methodology was described in Chapter 1 of this volume. This chapter first describes a vision for the pelagic ecosystem and reports on the priorities assigned to the identified vision elements. It then addresses how the vision might be achieved, taking into account assisting factors (those that facilitate achievement) and resisting factors (those that inhibit achievement). The chapter concludes with guidance on the strategic direction needed to implement the vision, identifying specific actions to be undertaken for each of the vision elements.
In this article we present the protocol of the Birmingham Registry for Twin Heritability Studies (BiRTHS), which aims to establish a long-term prospective twin registry with twins identified from the antenatal period and subjected to detailed follow-up. We plan to investigate the concordance in anthropometrics and early childhood phenotypes between 66 monozygotic and 154 dizygotic twin pairs in the first 2 years of recruitment. In this project we plan to determine the relative contributions of heritability and environment to fetal growth, birth size, growth in infancy and development up to 2 years of age in an ethnically mixed population. Twins will be assessed with the Griffiths Mental Development Scales, which will enable us to obtain detailed information on development. As maternal depression may affect the twins' neurodevelopment, the Edinburgh Postnatal Depression Scale will be used at various stages during pregnancy and after delivery to assess maternal depressive symptoms. The increasing prevalence of obesity in both adults and children has raised concerns about the effect of maternal obesity in pregnancy on fetal growth. The prospective study design gives us the opportunity to obtain data on maternal nutrition (reflected by body mass index) and on the ante- and postnatal growth and development of twins.
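As a pointer to how such MZ/DZ comparisons translate into heritability estimates, the classical Falconer decomposition gives a quick first pass (the registry will presumably fit formal biometric models; the intraclass correlations below are hypothetical):

```python
# Falconer's formulas for a classical twin design:
#   h2 = 2 * (r_MZ - r_DZ)   additive genetic variance (heritability)
#   c2 = 2 * r_DZ - r_MZ     shared-environment variance
#   e2 = 1 - r_MZ            non-shared environment (plus measurement error)
r_mz, r_dz = 0.75, 0.50      # hypothetical intraclass correlations
h2 = 2 * (r_mz - r_dz)
c2 = 2 * r_dz - r_mz
e2 = 1 - r_mz
print(h2, c2, e2)            # 0.5, 0.25, 0.25
```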