Substantial clinical heterogeneity of major depressive disorder (MDD) suggests it may group together individuals with diverse aetiologies. Identifying distinct subtypes should lead to more effective diagnosis and treatment, while providing more useful targets for further research. Genetic and clinical overlap between MDD and schizophrenia (SCZ) suggests an MDD subtype may share underlying mechanisms with SCZ.
The present study investigated whether a neurobiologically distinct subtype of MDD could be identified by SCZ polygenic risk score (PRS). We explored interactive effects between SCZ PRS and MDD case/control status on a range of cortical, subcortical and white matter metrics among 2370 male and 2574 female UK Biobank participants.
There was a significant SCZ PRS by MDD interaction for rostral anterior cingulate cortex (RACC) thickness (β = 0.191, q = 0.043). This was driven by a positive association between SCZ PRS and RACC thickness among MDD cases (β = 0.098, p = 0.026), compared to a negative association among controls (β = −0.087, p = 0.002). MDD cases with low SCZ PRS showed thinner RACC, although the opposite difference for high-SCZ-PRS cases was not significant. There were nominal interactions for other brain metrics, but none remained significant after correcting for multiple comparisons.
Our significant results indicate that MDD case-control differences in RACC thickness vary as a function of SCZ PRS. Although this was not the case for most other brain measures assessed, our specific findings still provide some further evidence that MDD in the presence of high genetic risk for SCZ is subtly neurobiologically distinct from MDD in general.
Measles is a notifiable disease, but not everyone infected seeks care, nor is every consultation reported. We estimated the completeness of reporting during a measles outbreak in The Netherlands in 2013–2014. Children below 15 years of age in a low vaccination coverage community (n = 3422) received a questionnaire to identify measles cases. Cases found in the survey were matched with the register of notifiable diseases to estimate the completeness of reporting. Second, completeness of reporting was assessed by comparing the number of susceptible individuals prior to the outbreak with the number of reported cases in the surveyed community and on a national level.
We found 307 (15%) self-identified measles cases among 2077 returned questionnaires (61% response rate), of which 27 could be matched to a case reported to the national register; completeness of reporting was 8.8%. Based on the number of susceptible individuals and the number of reported cases in the surveyed community and at national level, the completeness of reporting was estimated to be 9.1% and 8.6%, respectively. The approaches gave almost identical estimates, which lends support to the credibility and validity of both. The size of the 2013–2014 outbreak approximated 31 400 measles infections.
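As an illustrative check, the matching-based arithmetic above can be reproduced in a few lines; the figures come from the abstract, while the helper function is ours, not the authors' code.

```python
# Illustrative check of the matching-based completeness estimate.
# Figures are taken from the abstract; the helper is not the study's code.

def completeness_of_reporting(matched_cases, survey_cases):
    """Fraction of survey-identified cases also found in the national register."""
    return matched_cases / survey_cases

survey_cases = 307  # self-identified measles cases in returned questionnaires
matched = 27        # of these, matched to the register of notifiable diseases

completeness = completeness_of_reporting(matched, survey_cases)
print(f"completeness of reporting: {completeness:.1%}")  # 8.8%
```

Scaling a reported case count by the reciprocal of this fraction is what yields outbreak-size estimates of the kind quoted above.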
We hypothesized that a computerized clinical decision support tool for Clostridium difficile testing would reduce unnecessary inpatient tests, resulting in fewer laboratory-identified events. Census-adjusted interrupted time-series analyses demonstrated significant reductions following this intervention: 41% fewer tests and 31% fewer hospital-onset C. difficile infection laboratory-identified events.
Plasmodium knowlesi has risen in importance as a zoonotic parasite that has been causing regular episodes of malaria throughout South East Asia. The P. knowlesi genome sequence generated in 2008 highlighted and confirmed many similarities and differences in Plasmodium species, including a global view of several multigene families, such as the large SICAvar multigene family encoding the variant antigens known as the schizont-infected cell agglutination proteins. However, repetitive DNA sequences are the bane of any genome project, and this and other Plasmodium genome projects have not been immune to the gaps, rearrangements and other pitfalls created by these genomic features. Today, long-read PacBio and chromatin conformation technologies are overcoming such obstacles. Here, based on the use of these technologies, we present a highly refined de novo P. knowlesi genome sequence of the Pk1(A+) clone. This sequence and annotation, referred to as the ‘MaHPIC Pk genome sequence’, includes manual annotation of the SICAvar gene family with 136 full-length members categorized as type I or II. This sequence provides a framework that will permit a better understanding of the SICAvar repertoire, selective pressures acting on this gene family and mechanisms of antigenic variation in this species and other pathogens.
Antigenic variation in malaria was discovered in Plasmodium knowlesi studies involving longitudinal infections of rhesus macaques (Macaca mulatta). The variant proteins, known as the P. knowlesi Schizont Infected Cell Agglutination (SICA) antigens and the P. falciparum Erythrocyte Membrane Protein 1 (PfEMP1) antigens, expressed by the SICAvar and var multigene families, respectively, have been studied for over 30 years. Expression of the SICA antigens in P. knowlesi requires a splenic component, and specific antibodies are necessary for variant antigen switch events in vivo. Outstanding questions revolve around the role of the spleen and the mechanisms by which the expression of these variant antigen families is regulated. Importantly, the longitudinal dynamics and molecular mechanisms that govern variant antigen expression can be studied with P. knowlesi infection of its mammalian and vector hosts. Synchronous infections can be initiated with established clones and studied at multi-omic levels, with the benefit of computational tools from systems biology that permit the integration of datasets and the design of explanatory, predictive mathematical models. Here we provide an historical account of this topic, while highlighting the potential for maximizing the use of P. knowlesi – macaque model systems and summarizing exciting new progress in this area of research.
In the face of shifting demographics and an increase in human longevity, it is important to examine carefully what is known about cognitive ageing, and to identify and promote possibly malleable lifestyle and health-related factors that might mitigate age-associated cognitive decline. The Lothian Birth Cohorts of 1921 (LBC1921, n = 550) and 1936 (LBC1936, n = 1091) are longitudinal studies of cognitive and brain ageing based in Scotland. Childhood IQ data are available for these participants, who were recruited in later life and then followed up regularly. This overview summarises some of the main LBC findings to date, illustrating the possible genetic and environmental contributions to cognitive function (level and change) and brain imaging biomarkers in later life. Key associations include genetic variation, health and fitness, psychosocial and lifestyle factors, and aspects of the brain's structure. It addresses some key methodological issues such as confounding by early-life intelligence and social factors and emphasises areas requiring further investigation. Overall, the findings that have emerged from the LBC studies highlight that there are multiple correlates of cognitive ability level in later life, many of which have small effects, that there are as yet few reliable predictors of cognitive change, and that not all of the correlates have independent additive associations. The concept of marginal gains, whereby there might be a cumulative effect of small incremental improvements across a wide range of lifestyle and health-related factors, may offer a useful way to think about and promote a multivariate recipe for healthy cognitive and brain ageing.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Current established protocols (e.g. RUSH and ACES) were developed from expert opinion rather than objective, prospective data. Recently the SHoC protocol was published, recommending three core scans (cardiac, lung and IVC), plus other scans when clinically indicated. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial, to assess whether the three recommended core SHoC protocol scans were chosen appropriately for this population. Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details and study findings were collected prospectively. A threshold incidence of 10% for positive findings was established as significant for the purposes of assessing the appropriateness of the core recommendations. Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%). Conclusion: The three core SHoC protocol scans appropriately detected all pathologies recorded at a rate greater than 10%. The three most frequent findings were cardiac and IVC abnormalities, followed by lung. Peritoneal fluid was seen at a rate of 9%, and aortic aneurysms were rare.
These data, from the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients, support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 or shock index>1), who were randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data using Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately.
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefits with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
Introduction: In Ottawa, STEMI patients are transported directly for percutaneous coronary intervention (PCI) by advanced care paramedic (ACP) or primary care paramedic (PCP) crews, or transferred from a PCP to an ACP crew (ACP-intercept). PCPs have a limited skill set to address complications during transport. The objective of this study was to determine what clinically important events (CIEs) occurred in STEMI patients transported for primary PCI by a PCP crew, and what proportion of such events could only be treated by ACP protocols. Methods: We conducted a health record review of STEMI patients transported for primary PCI from Jan 1, 2011 to Dec 21, 2015. Ottawa has a single PCI center, and its EMS system employs both PCP and ACP paramedics. We identified consecutive STEMI bypass patients transported by PCP-only and ACP-intercept crews using the dispatch database. A data extraction form was piloted and used to extract patient demographics, transport times, primary outcomes (CIEs and interventions performed during transport), and secondary outcomes (hospital diagnosis and mortality). CIEs were reviewed by two investigators to determine if they would be treated differently under ACP protocols. We present descriptive statistics. Results: We identified 967 STEMI bypass cases, of which 214 (118 PCP-only and 96 ACP-intercept) met all inclusion criteria. Characteristics were: mean age 61.4 years, 78% male, 31.8% anterior and 44.4% inferior infarcts, mean response time 6 min, total paramedic contact time 29 min, and, in cases of ACP-intercept, 7 min of PCP-only contact time. A CIE occurred in 127 (59%) of cases: SBP<90 mmHg 26.2%, HR<60 30.4%, HR>100 20.6%, malignant arrhythmias 7.5%, altered mental status 6.5%, airway intervention 2.3%; 2 patients (0.9%) arrested, and both survived. Of the CIEs identified, 54 (42.5%) could be addressed differently by ACP vs PCP protocols (25.2% of total cases). The majority related to fluid boluses for hypotension (44 cases; 35% of CIEs).
The rate of ACP intervention for CIEs within the ACP-intercept group was 51.6%. There were 6 in-hospital deaths (2.8%), with no difference by transport crew type. Conclusion: CIEs are common in STEMI bypass patients; however, a smaller proportion of these CIEs would be addressed differently by ACP protocols compared with PCP protocols. The vast majority of CIEs appeared to be transient and of limited clinical significance.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a shock index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED and changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95% CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients.
No significant difference in the volume of fluid administered or in markers of resuscitation was found when comparing the use of a PoCUS protocol with standard care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and to improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes, including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group: 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00).
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS changed physicians' perceived shock category; however, PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
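The relative-risk figures quoted in these trial abstracts can be reproduced with the standard log-scale normal approximation. The sketch below recomputes the RR for the change in perceived shock category (20/127 PoCUS vs. 7/125 control); the function is illustrative rather than the trial's analysis code, and small rounding differences from the published CI can arise from the exact CI method used.

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Risk ratio with a 95% CI from the log-scale normal approximation."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log_rr = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Change in perceived shock category: PoCUS 20/127 vs. control 7/125
rr, lo, hi = relative_risk(20, 127, 7, 125)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# close to the reported RR 2.81 (95% CI 1.23 to 6.42)
```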
The public health threat posed by zoonotic Plasmodium knowlesi appears to be growing: it is increasingly reported across South East Asia, and is the leading cause of malaria in Malaysian Borneo. Plasmodium knowlesi threatens progress towards malaria elimination as aspects of its transmission, such as spillover from wildlife reservoirs and reliance on outdoor-biting vectors, may limit the effectiveness of conventional methods of malaria control. The development of new quantitative approaches that address the ecological complexity of P. knowlesi, particularly through a focus on its primary reservoir hosts, will be required to control it. Here, we review what is known about P. knowlesi transmission, identify key knowledge gaps in the context of current approaches to transmission modelling, and discuss the integration of these approaches with clinical parasitology and geostatistical analysis. We highlight the need to incorporate the influences of fine-scale spatial variation, rapid changes to the landscape, and reservoir population and transmission dynamics. The proposed integrated approach would address the unique challenges posed by malaria as a zoonosis, aid the identification of transmission hotspots, provide insight into the mechanistic links between incidence and land use change and support the design of appropriate interventions.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate.
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years). As a result, the adjusted suicide rate of infantrymen and combat engineers was most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
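The per-100 000 person-year rates used throughout these Army STARRS abstracts follow directly from event counts and observation time. The sketch below shows the unit conversion, with the implied denominator back-calculated from the published headline figures (our arithmetic, not the study's dataset).

```python
def rate_per_100k_person_years(events, person_years):
    """Incidence rate expressed per 100 000 person-years."""
    return 1e5 * events / person_years

# 496 suicides at 22.4/100 000 person-years implies roughly
# 496 / (22.4 / 1e5), i.e. about 2.21 million person-years of observation.
implied_person_years = 496 / (22.4 / 1e5)
rate = rate_per_100k_person_years(496, implied_person_years)
print(f"{implied_person_years / 1e6:.2f} million person-years, rate {rate:.1f}/100 000")
```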
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) has found that the proportional elevation in the US Army enlisted soldier suicide rate during deployment (compared with the never-deployed or previously deployed) is significantly higher among women than men, raising the possibility of gender differences in the adverse psychological effects of deployment.
Person-month survival models based on a consolidated administrative database for active duty enlisted Regular Army soldiers in 2004–2009 (n = 975 057) were used to characterize the gender × deployment interaction predicting suicide. Four explanatory hypotheses were explored involving the proportion of females in each soldier's occupation, the proportion of same-gender soldiers in each soldier's unit, whether the soldier reported sexual assault victimization in the previous 12 months, and the soldier's pre-deployment history of treated mental/behavioral disorders.
The suicide rate of currently deployed women (14.0/100 000 person-years) was 3.1–3.5 times the rates of other (i.e. never-deployed/previously deployed) women. The suicide rate of currently deployed men (22.6/100 000 person-years) was 0.9–1.2 times the rates of other men. The adjusted (for time trends, sociodemographics, and Army career variables) female:male odds ratio comparing the suicide rates of currently deployed v. other women v. men was 2.8 (95% confidence interval 1.1–6.8), became 2.4 after excluding soldiers with Direct Combat Arms occupations, and remained elevated (in the range 1.9–2.8) after adjusting for the hypothesized explanatory variables.
These results are valuable in excluding otherwise plausible hypotheses for the elevated suicide rate of deployed women and point to the importance of expanding future research on the psychological challenges of deployment for women.
We describe the efficacy of enhanced infection control measures, including those recommended in the Centers for Disease Control and Prevention’s 2012 carbapenem-resistant Enterobacteriaceae (CRE) toolkit, to control concurrent outbreaks of carbapenemase-producing Enterobacteriaceae (CPE) and extensively drug-resistant Acinetobacter baumannii (XDR-AB).
Before-after intervention study.
Fifteen-bed surgical trauma intensive care unit (ICU).
We investigated the impact of enhanced infection control measures in response to clusters of CPE and XDR-AB infections in an ICU from April 2009 to March 2010. Polymerase chain reaction was used to detect the presence of blaKPC and resistance plasmids in CRE. Pulsed-field gel electrophoresis was performed to assess XDR-AB clonality. Enhanced infection-control measures were implemented in response to ongoing transmission of CPE and a new outbreak of XDR-AB. Efficacy was evaluated by comparing the incidence rate (IR) of CPE and XDR-AB before and after the implementation of these measures.
The IR of CPE for the 12 months before the implementation of enhanced measures was 7.77 cases per 1,000 patient-days, whereas the IR of XDR-AB for the 3 months before implementation was 6.79 cases per 1,000 patient-days. All examined CPE shared endemic blaKPC resistance plasmids, and 6 of the 7 XDR-AB isolates were clonal. Following institution of enhanced infection control measures, the CPE IR decreased to 1.22 cases per 1,000 patient-days (P = .001), and no more cases of XDR-AB were identified.
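The incidence rates above are expressed per 1,000 patient-days. A minimal sketch of that unit and of the pre/post rate ratio, using only the published rates (the example case counts are hypothetical and labeled as such):

```python
def incidence_rate_per_1000(cases, patient_days):
    """Cases per 1,000 patient-days, the unit used in this study."""
    return 1000 * cases / patient_days

# Hypothetical illustration of the unit: 3 cases over 1,500 patient-days.
assert incidence_rate_per_1000(3, 1500) == 2.0

# Rate ratio from the published pre-/post-intervention CPE rates:
ir_before, ir_after = 7.77, 1.22  # cases per 1,000 patient-days
rate_ratio = ir_after / ir_before
print(f"rate ratio {rate_ratio:.2f}, roughly {1 - rate_ratio:.0%} reduction in CPE incidence")
```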
Use of infection control measures described in the Centers for Disease Control and Prevention’s 2012 CRE toolkit was associated with a reduction in the IR of CPE and an interruption in XDR-AB transmission.
During improved oil recovery (IOR), gas may be introduced into a porous reservoir filled with surfactant solution in order to form foam. A model for the evolution of the resulting foam front known as ‘pressure-driven growth’ is analysed. An asymptotic solution of this model for long times is derived that shows that foam can propagate indefinitely into the reservoir without gravity override. Moreover, ‘pressure-driven growth’ is shown to correspond to a special case of the more general ‘viscous froth’ model. In particular, it is a singular limit of the viscous froth, corresponding to the elimination of a surface tension term, permitting sharp corners and kinks in the predicted shape of the front. Sharp corners tend to develop from concave regions of the front. The principal solution of interest has a convex front, however, so that although this solution itself has no sharp corners (except for some kinks that develop spuriously owing to errors in a numerical scheme), it is found nevertheless to exhibit milder singularities in front curvature, as the long-time asymptotic analytical solution makes clear. Numerical schemes for the evolving front shape which perform robustly (avoiding the development of spurious kinks) are also developed. Generalisations of this solution to geologically heterogeneous reservoirs should exhibit concavities and/or sharp corner singularities as an inherent part of their evolution: propagation of fronts containing such ‘inherent’ singularities can be readily incorporated into these numerical schemes.
The US Army suicide rate has increased sharply in recent years. Identifying significant predictors of Army suicides in Army and Department of Defense (DoD) administrative records might help focus prevention efforts and guide intervention content. Previous studies of administrative data, although documenting significant predictors, were based on limited samples and models. A career history perspective is used here to develop more textured models.
The analysis was carried out as part of the Historical Administrative Data Study (HADS) of the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). De-identified data were combined across numerous Army and DoD administrative data systems for all Regular Army soldiers on active duty in 2004–2009. Multivariate associations of sociodemographics and Army career variables with suicide were examined in subgroups defined by time in service, rank and deployment history.
Several novel results were found that could have intervention implications. The most notable of these were significantly elevated suicide rates (69.6–80.0 suicides per 100 000 person-years compared with 18.5 suicides per 100 000 person-years in the total Army) among enlisted soldiers deployed either during their first year of service or with less than expected (based on time in service) junior enlisted rank; a substantially greater rise in suicide among women than men during deployment; and a protective effect of marriage against suicide only during deployment.
A career history approach produces several actionable insights missed in less textured analyses of administrative data predictors. Expansion of analyses to a richer set of predictors might help refine understanding of intervention implications.
It has been postulated that aging is the consequence of an accelerated accumulation of somatic DNA mutations and that subsequent errors in the primary structure of proteins ultimately reach levels sufficient to affect organismal functions. The technical limitations of detecting somatic changes and the lack of insight about the minimum level of erroneous proteins to cause an error catastrophe hampered any firm conclusions on these theories. In this study, we sequenced the whole genome of DNA in whole blood of two pairs of monozygotic (MZ) twins, 40 and 100 years old, by two independent next-generation sequencing (NGS) platforms (Illumina and Complete Genomics). Potentially discordant single-base substitutions supported by both platforms were validated extensively by Sanger, Roche 454, and Ion Torrent sequencing. We demonstrate that the genomes of the two twin pairs are germ-line identical between co-twins, and that the genomes of the 100-year-old MZ twins are discerned by eight confirmed somatic single-base substitutions, five of which are within introns. Putative somatic variation between the 40-year-old twins was not confirmed in the validation phase. We conclude from this systematic effort that by using two independent NGS platforms, somatic single nucleotide substitutions can be detected, and that a century of life did not result in a large number of detectable somatic mutations in blood. The low number of somatic variants observed by using two NGS platforms might provide a framework for detecting disease-related somatic variants in phenotypically discordant MZ twins.
Biological reference points (BRPs) in fisheries policy are typically sensitive to stock assessment model assumptions, thus increasing uncertainty in harvest decision-making and potentially blocking adoption of precautionary harvest policies. A collaborative management strategy evaluation approach and closed-loop simulation modelling was used to evaluate expected fishery economic and conservation performance of the sablefish (Anoplopoma fimbria) fishery in British Columbia (Canada), in the presence of uncertainty about BRPs. Comparison of models derived using two precautionary harvest control rules, which each complied with biological conservation objectives and short-term economic objectives given by industry, suggested that both rules were likely to avert biomass decline below limit BRPs, even when stock biomass and production were persistently overestimated by stock assessment models. The slightly less conservative, industry-preferred harvest control rule also avoided short-term economic losses of c. CAN$ 2.7–10 million annually, or 10–50% of current landed value. Distinguishing between the role of BRPs in setting fishery conservation objectives and operational control points that define harvest control rules improved the flexibility of the sablefish management system, and has led to adoption of precautionary management procedures.