White kidney bean extract (WKBE) is a nutraceutical often advocated as an anti-obesity agent. The main proposed mechanism for these effects is alpha-amylase inhibition, thereby slowing carbohydrate digestion and absorption. Thus, it is possible that WKBE could affect the gut microbiota and modulate gut health. We investigated the effects of supplementing 20 healthy adults with WKBE for 1 week in a randomised, placebo-controlled crossover trial on the composition of the gut microbiota, gastrointestinal (GI) inflammation (faecal calprotectin), GI symptoms, and stool habits. We conducted in vitro experiments and used a gut model system to explore potential inhibition of alpha-amylase. We gained qualitative insight into participant experiences of using WKBE via focus groups. WKBE supplementation decreased the relative abundance of Bacteroidetes and increased that of Firmicutes; however, there were no significant differences in post-intervention gut microbiota measurements between the WKBE and control arms. There were no significant effects on GI inflammation, symptoms related to constipation, or stool consistency or frequency. Our in vitro and gut model system analyses showed no effects of WKBE on alpha-amylase activity. Our findings suggest that WKBE may modulate the gut microbiota in healthy adults; however, the underlying mechanism is unlikely to be active-site inhibition of alpha-amylase.
The dominant images of the First World War are of a land war fought in the trenches of north-western Europe. Yet the war was ‘as much a war of competing blockades … as of competing armies’.1 These blockades aimed at interdicting enemy powers’ seaborne trade and the shipping that carried it, which the ‘first wave of globalization’ had placed at the heart of the economies of Europe and North America.2 No belligerent power was more dependent on trade and shipping than Great Britain, and it would therefore be difficult to overstate their importance to the British and Allied war effort. Moreover, shipping, ‘the world’s key industry’, as one recent study has termed it, was the one major British industry situated on the front line, directly and constantly exposed to enemy action.3
Parasites have the power to impose significant regulatory pressures on host populations, making evolutionary patterns of host switching by parasites salient to a range of contemporary ecological issues. However, relatively little is known about the colonization of new hosts by parasitic, commensal and mutualistic eukaryotes of metazoans. As ubiquitous symbionts of coelomate animals, Blastocystis spp. represent excellent candidate organisms for the study of evolutionary patterns of host switching by protists. Here, we apply a big-data phylogenetic approach using archival sequence data to assess the relative roles of several host-associated traits in shaping the evolutionary history of the Blastocystis species-complex within an ecological framework. Patterns of host usage were principally determined by geographic location and shared environments of hosts, suggesting that weight of exposure (i.e. propagule pressure) represents the primary force for colonization of new hosts within the Blastocystis species-complex. While Blastocystis lineages showed a propensity to recolonize the same host taxa, these taxa were often evolutionarily unrelated, suggesting that historical contingency and retention of previous adaptations by the parasite were more important to host switching than host phylogeny. Ultimately, our findings highlight the ability of ecological theory (i.e. ‘ecological fitting’) to explain host switching and host specificity within the Blastocystis species-complex.
In April 2019, the U.S. Fish and Wildlife Service (USFWS) released its recovery plan for the jaguar Panthera onca after several decades of discussion, litigation and controversy about the status of the species in the USA. The USFWS estimated that potential habitat, south of the Interstate-10 highway in Arizona and New Mexico, had a carrying capacity of c. six jaguars, and so focused its recovery programme on areas south of the USA–Mexico border. Here we present a systematic review of the modelling and assessment efforts over the last 25 years, with a focus on areas north of Interstate-10 in Arizona and New Mexico, outside the recovery unit considered by the USFWS. Despite differences in data inputs, methods, and analytical extent, the nine previous studies found support for potential suitable jaguar habitat in the central mountain ranges of Arizona and New Mexico. Applying slightly modified versions of the USFWS model and recalculating an Arizona-focused model over both states provided additional confirmation. Extending the area of consideration also substantially raised the carrying capacity of habitats in Arizona and New Mexico, from six to 90 or 151 adult jaguars, using the modified USFWS models. This review demonstrates the crucial ways in which choosing the extent of analysis influences the conclusions of a conservation plan. More importantly, it opens a new opportunity for jaguar conservation in North America that could help address threats from habitat losses, climate change and border infrastructure.
The recipients of NIH’s Clinical and Translational Science Awards (CTSA) have worked for over a decade to build informatics infrastructure in support of clinical and translational research. This infrastructure has proved invaluable in supporting responses to the current COVID-19 pandemic, through direct patient care, clinical decision support, training of researchers and practitioners, public health surveillance, and clinical research, at levels that could not have been reached without the years of groundwork laid by the CTSAs. In this paper, we provide a perspective on our COVID-19 work and present relevant results of a survey of CTSA sites to broaden our understanding of the key features of their informatics programs, the informatics-related challenges they have experienced under COVID-19, and some of the innovations and solutions they developed in response to the pandemic. Responses demonstrated increased reliance by healthcare providers and researchers on access to electronic health record (EHR) data, both for local needs and for sharing with other institutions and national consortia. The initial work of the CTSAs on data capture, standards, interchange, and sharing policies all contributed to solutions, best illustrated by the creation, in record time, of a national clinical data repository in the National COVID-19 Cohort Collaborative (N3C). The survey data support seven recommendations for areas of informatics and public health investment and further study to support clinical and translational research in the post-COVID-19 era.
Some psychiatric disorders have been associated with increased risk of miscarriage. However, there is a lack of studies considering a broader spectrum of psychiatric disorders to clarify the role of common as opposed to independent mechanisms.
To examine the risk of miscarriage among women diagnosed with psychiatric conditions.
We studied registered pregnancies in Norway between 2010 and 2016 (n = 593 009). The birth registry captures pregnancies ending in gestational week 12 or later, and the patient and general practitioner databases were used to identify miscarriages and induced abortions before 12 gestational weeks. Odds ratios of miscarriage according to 12 psychiatric diagnoses were calculated by logistic regression.
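To make the reported measure concrete, the sketch below shows how adjusted odds ratios and 95% confidence intervals of the kind reported in the next paragraph fall out of a logistic regression. It uses simulated data and a single hypothetical confounder (maternal age); the registry variables and the authors' actual adjustment set are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical illustration only: exposure = any anxiety-disorder diagnosis,
# outcome = miscarriage, confounder = maternal age. All data are simulated.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "anxiety_dx": rng.integers(0, 2, n),       # hypothetical exposure indicator
    "maternal_age": rng.normal(30.0, 5.0, n),  # hypothetical confounder
})
true_logit = -1.5 + 0.22 * df["anxiety_dx"] + 0.01 * (df["maternal_age"] - 30)
df["miscarriage"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

X = sm.add_constant(df[["anxiety_dx", "maternal_age"]])
fit = sm.Logit(df["miscarriage"], X).fit(disp=0)

# Exponentiating a coefficient gives the adjusted odds ratio; exponentiating
# its confidence limits gives the 95% CI quoted in abstracts like this one.
print(np.exp(fit.params))       # adjusted ORs
print(np.exp(fit.conf_int()))   # 95% CIs
```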
Miscarriage risk was increased among women with bipolar disorders (adjusted odds ratio 1.35, 95% CI 1.26–1.44), personality disorders (adjusted odds ratio 1.32, 95% CI 1.12–1.55), attention-deficit hyperactivity disorder (adjusted odds ratio 1.27, 95% CI 1.21–1.33), conduct disorders (adjusted odds ratio 1.21, 95% CI 1.01–1.46), anxiety disorders (adjusted odds ratio 1.25, 95% CI 1.23–1.28), depressive disorders (adjusted odds ratio 1.25, 95% CI 1.23–1.27), somatoform disorders (adjusted odds ratio 1.18, 95% CI 1.07–1.31) and eating disorders (adjusted odds ratio 1.14, 95% CI 1.08–1.22). The miscarriage risk was further increased among women with more than one psychiatric diagnosis. Our findings were robust to adjustment for other psychiatric diagnoses, chronic somatic disorders and substance use disorders. After mutual adjustment for co-occurring psychiatric disorders, we also observed a modest increased risk among women with schizophrenia spectrum disorders (adjusted odds ratio 1.22, 95% CI 1.03–1.44).
A wide range of psychiatric disorders were associated with increased risk of miscarriage. The heightened risk of miscarriage among women diagnosed with psychiatric disorders highlights the need for awareness and surveillance of this risk group in antenatal care.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors, at a fraction of the cost. Such sensitivity would change the expected detection rate of post-merger remnants from approximately one every few decades with two A+ detectors to a few per year, and could potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
The use of extracorporeal membrane oxygenation (ECMO) for patients in severe cardiac or respiratory failure has expanded rapidly. As a result, ECMO networks are being developed across the world using a “hub and spoke” model. Current guidelines call for all patients transported on ECMO to be accompanied by a physician during transport. However, as ECMO centers and networks grow, the increasing number of transports will be limited by this mandate.
The aim of this study was to compare rates of adverse events occurring during transport of ECMO patients with and without an additional clinician, defined as a physician, nurse practitioner (NP), or physician assistant (PA).
This is a retrospective cohort study of all adults transported while cannulated on ECMO from 2011 to 2018, via ground and air, between 21 hospitals in the northeastern United States, comparing transports with and without additional clinicians. The primary outcome was the rate of major adverse events, and the secondary outcome was the rate of minor adverse events.
Over the seven-year study period, 93 patients on ECMO were transported. Twenty-three transports (24.7%) were accompanied by a physician or other additional clinician. Major adverse events occurred in 21.5% of all transports. There was no difference in the total rate of major adverse events between accompanied and unaccompanied transports (P = .91). Multivariate analysis did not identify any parameter as predictive of major adverse events.
In a retrospective cohort study of transports of ECMO patients, there was no association between the overall rate of major adverse events in transport and the accompaniment of an additional clinician. No variables were associated with major adverse events in either cohort.
Three new tephras have been identified in Southeast Alaska. An 8-cm-thick black basaltic tephra with nine discrete normally graded beds is present in cores from a lake on Baker Island. The estimated age of the tephra is 13,492 ± 237 cal yr BP. Although similar in age to the MEd tephra from the adjacent Mt. Edgecumbe volcanic field, this tephra is geochemically distinct. Black basaltic tephras recovered from two additional sites in Southeast Alaska, Heceta Island and the Gulf of Esquibel, are also geochemically distinct from the MEd tephra. The age of the tephra from Heceta Island is 14,609 ± 343 cal yr BP. Whereas the tephras recovered from Baker Island/Heceta Island/Gulf of Esquibel are geochemically distinct from each other, similarities in the ages of these tephras and the MEd tephra suggest a shared eruptive trigger, possibly crustal unloading caused by retreat of the Cordilleran Ice Sheet. The submerged Addington volcanic field on the continental shelf, which may have been subaerially exposed during the late Pleistocene, is a possible source for the Southeast Alaska tephras.
The aim of this study was to assess the impact of a urinary tract infection (UTI) management bundle to reduce the treatment of asymptomatic bacteriuria (AB) and to improve the management of symptomatic UTIs.
Before-and-after intervention study.
Consecutive sample of inpatients with positive single or mixed-predominant urine cultures collected and reported while admitted to the hospital.
The UTI management bundle consisted of nursing and prescriber education, modification of the reporting of positive urine cultures, and pharmacists’ prospective audit and feedback. A retrospective chart review of consecutive inpatients with positive urinary cultures was performed before and after implementation of the management bundle.
Prior to the implementation of the management bundle, 276 patients met eligibility criteria for chart review. Of these 276 patients, 165 (59·8%) were found to have AB; of these 165 patients with AB, 111 (67·3%) were treated with antimicrobials. In the postintervention period, 268 patients met eligibility criteria for review. Of these 268 patients, 133 (49·6%) were found to have AB; of these 133 with AB, 22 (16·5%) were treated with antimicrobials. Thus, a 75·5% reduction in treatment of AB was achieved. Educational components of the bundle resulted in a substantial decrease in nonphysician-directed urine sample submission. Adherence to a UTI management algorithm improved substantially in the intervention period, with a notable decrease in fluoroquinolone prescription for empiric UTI treatment.
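For transparency, the 75·5% figure is the relative (not absolute) reduction in the proportion of patients with AB who were treated, computed from the two rates reported above:

$$\frac{67.3\% - 16.5\%}{67.3\%} \approx 75.5\%$$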
A UTI management bundle resulted in a dramatic improvement in the management of urinary tract infection, particularly a reduction in the treatment of AB and improved management of symptomatic UTI.
Posttraumatic stress disorder (PTSD) and stress/trauma exposure are cross-sectionally associated with advanced DNA methylation age relative to chronological age. However, longitudinal inquiry is lacking, as is examination of associations between advanced DNA methylation age and a broader range of psychiatric disorders. The aim of this study was to examine whether PTSD, depression, generalized anxiety, and alcohol-use disorders predicted acceleration of DNA methylation age over time (i.e. an increasing pace, or rate of advancement, of the epigenetic clock).
Genome-wide DNA methylation and a comprehensive set of psychiatric symptoms and diagnoses were assessed in 179 Iraq/Afghanistan war veterans who completed two assessments over the course of approximately 2 years. Two DNA methylation age indices (Horvath and Hannum), each a weighted index of an array of genome-wide DNA methylation probes, were quantified. The pace of the epigenetic clock was operationalized as change in DNA methylation age as a function of time between assessments.
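The pace measure described above is a difference quotient; a plausible formalisation (the notation is ours, not the authors') is

$$\text{pace} = \frac{\mathrm{DNAmAge}_{T_2} - \mathrm{DNAmAge}_{T_1}}{T_2 - T_1},$$

where $T_1$ and $T_2$ are the chronological times of the two assessments and $\mathrm{DNAmAge}$ is either the Horvath or the Hannum index.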
Analyses revealed that alcohol-use disorders (p = 0.001) and PTSD avoidance and numbing symptoms (p = 0.02) at Time 1 were associated with an increasing pace of the epigenetic clock over time, per the Horvath (but not the Hannum) index of cellular aging.
This is the first study to suggest that posttraumatic psychopathology is longitudinally associated with a quickened pace of the epigenetic clock. Results raise the possibility that accelerated cellular aging is a common biological consequence of stress-related psychopathology, which carries implications for identifying mechanisms of stress-related cellular aging and developing interventions to slow its pace.
This paper presents a novel approach to sensor-based feature evaluation and selection, combining a self-organizing map with spatial statistics, applied to tool condition monitoring of the turning process. The approach exploits the unique properties of unsupervised neural networks, combined with spatial statistics, to analyze the contributions of different sensor-based features, which carry large quantities of noise, to the classification of tool wear, yielding a quantitative measure of each feature's suitability. The method does not assume a prior direct correlation between features, avoiding the misconstructions inherent in common approaches that consider only obviously correlated features for condition monitoring. Instead, taking advantage of neural networks' ability to perform non-linear modeling, it first models the process and then analyzes each feature's contribution toward classification. Some of the commonly used features proved to contribute significantly to the classification of cutting tool wear, whereas others adversely affected classification performance. Further, it is demonstrated that the proposed combined technique can be used extensively to quantitatively evaluate the contribution of different features toward system monitoring in the presence of noisy data.
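As a rough illustration of the combined technique, the sketch below trains a small self-organizing map and then scores each sensor feature by the spatial autocorrelation (Moran's I) of its component plane. The grid size, neighbourhood function, and the choice of Moran's I as the spatial statistic are our assumptions for illustration; the paper's exact formulation may differ.

```python
import numpy as np

# Sketch of SOM-based feature evaluation (assumptions, not the paper's exact
# method): train a self-organizing map on noisy sensor features, then score
# each feature by Moran's I of its component plane. Spatially smooth planes
# (I near 1) suggest features that organise the map coherently; noise-like
# planes (I near 0) suggest features that may hurt classification.

def train_som(X, grid=(10, 10), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.standard_normal((h, w, X.shape[1])) * 0.1
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        # Best-matching unit: node whose weight vector is closest to x.
        d = np.linalg.norm(weights - x, axis=2)
        bmu = np.array(np.unravel_index(np.argmin(d), (h, w)))
        # Linearly decay learning rate and neighbourhood radius.
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        dist2 = ((coords - bmu) ** 2).sum(axis=2)
        nb = np.exp(-dist2 / (2.0 * sigma ** 2))[..., None]
        weights += lr * nb * (x - weights)
    return weights

def morans_i(plane):
    """Moran's I of a 2-D component plane under rook (4-neighbour) contiguity."""
    z = plane - plane.mean()
    h, w = plane.shape
    num = 0.0
    wsum = 0
    for i in range(h):
        for j in range(w):
            for di, dj in ((1, 0), (0, 1)):  # each unordered pair once...
                ni, nj = i + di, j + dj
                if ni < h and nj < w:
                    num += 2.0 * z[i, j] * z[ni, nj]  # ...weighted twice
                    wsum += 2
    return (h * w / wsum) * (num / (z ** 2).sum())

# Hypothetical data: rows are cutting passes, columns are six sensor features.
X = np.random.default_rng(1).standard_normal((500, 6))
som = train_som(X)
scores = [morans_i(som[:, :, k]) for k in range(X.shape[1])]
print("per-feature spatial organisation scores:", np.round(scores, 3))
```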
Inter-facility transport of critically ill patients is associated with a high risk of adverse events, and critical care transport (CCT) teams may spend considerable time at sending institutions preparing patients for transport. The effect of mode of transport and distance to be traveled on on-scene times (OSTs) has not been well-described.
Quantification of the time required to package patients and complete CCTs based on mode of transport and distance between facilities is important for hospitals and CCT teams to allocate resources effectively.
This is a retrospective review of OSTs and transport times for patients with hypoxemic respiratory failure transported from October 2009 through December 2012 from sending hospitals to three tertiary care hospitals. Differences among the OSTs and transport times based on the mode of transport (ground, rotor wing, or fixed wing), distance traveled, and intra-hospital pick-up location (emergency department [ED] vs intensive care unit [ICU]) were assessed. Correlations between OSTs and transport times were performed based on mode of transport and distance traveled.
Two hundred thirty-nine charts were identified for review. Mean OST was 42.2 (SD=18.8) minutes, and mean transport time was 35.7 (SD=19.5) minutes. On-scene time was greater than en route time for 147 patients and greater than total trip time for 91 patients. Mean transport distance was 42.2 (SD=35.1) miles. There were no differences in OST based on mode of transport; however, total transport time was significantly shorter for rotor wing versus ground (39.9 [SD=19.9] minutes vs 54.2 [SD=24.7] minutes; P <.001) and for rotor wing versus fixed wing (84.3 [SD=34.2] minutes; P = .02). On-scene time in the ED was significantly shorter than in the ICU (33.5 [SD=15.7] minutes vs 45.2 [SD=18.8] minutes; P <.001). For all patients, regardless of mode of transport, there was no correlation between OST and total miles traveled, although there was a significant correlation between en route time and distance, as well as between total trip time and distance.
In this cohort of critically ill patients with hypoxemic respiratory failure, OST was over 40 minutes and was often longer than the total trip time. On-scene time did not correlate with mode of transport or distance traveled. These data can assist in planning inter-facility transports for both the sending and receiving hospitals, as well as CCT services.
Wilcox SR, Saia MS, Waden H, McGahn SJ, Frakes M, Wedel SK, Richards JB. On-scene Times for Inter-facility Transport of Patients with Hypoxemic Respiratory Failure. Prehosp Disaster Med. 2016;31(3):267–271.
Clinicians need guidance to address the heterogeneity of treatment responses of patients with major depressive disorder (MDD). While prediction schemes based on symptom clustering and biomarkers have so far not yielded results of sufficient strength to inform clinical decision-making, prediction schemes based on big data predictive analytic models might be more practically useful.
We review evidence suggesting that prediction equations based on symptoms and other easily assessed clinical features, found in previous research to predict MDD treatment outcomes, might provide a foundation for developing predictive analytic clinical decision support models that could help clinicians select optimal (personalised) MDD treatments. These methods could also be useful in targeting patient subsamples for more expensive biomarker assessments.
Approximately two dozen baseline variables obtained from medical records or patient reports have been found repeatedly in MDD treatment trials to predict overall treatment outcomes (i.e., intervention v. control) or differential treatment outcomes (i.e., intervention A v. intervention B). Similar evidence has been found in observational studies of MDD persistence-severity. However, no treatment studies have yet attempted to develop treatment outcome equations using the full set of these predictors. Promising preliminary empirical results coupled with recent developments in statistical methodology suggest that models could be developed to provide useful clinical decision support in personalised treatment selection. These tools could also provide a strong foundation to increase statistical power in focused studies of biomarkers and MDD heterogeneity of treatment response in subsequent controlled trials.
Coordinated efforts are needed to develop a protocol for systematically collecting information about established predictors of heterogeneity of MDD treatment response in large observational treatment studies, applying and refining these models in subsequent pragmatic trials, carrying out pooled secondary analyses to extract the maximum amount of information from these coordinated studies, and using this information to focus future discovery efforts in the segment of the patient population in which continued uncertainty about treatment response exists.
In a 1-year survey at a university hospital we found that 20·6% (81/392) of patients with antibiotic-associated diarrhoea were positive for C. difficile. The most common PCR ribotypes were 012 (14·8%), 027 (12·3%), 046 (12·3%) and 014/020 (9·9%). The incidence rate was 2·6 cases of C. difficile infection per 1000 outpatients.
Critical care transport (CCT) teams must manage a wide array of medications before and during transport. Appreciating the medications required for transport impacts formulary development as well as staff education and training.
As there are few data describing the patterns of medication administration, this study quantifies medication administrations and patterns in a series of adult CCTs.
This was a retrospective review of medication administration during CCTs of patients with severe hypoxemic respiratory failure from October 2009 through December 2012 from referring hospitals to three tertiary care hospitals.
Two hundred thirty-nine charts were identified for review. Medications were administered by the CCT team to 98.7% of these patients, with only three patients not receiving any medications from the team. Fifty-nine medications were administered in total with 996 instances of administration. Fifteen drugs were each administered to only one patient. The mean number of medications per patient was 4.2 (SD=1.8) with a mean of 1.9 (SD=1.1) drug infusions per patient.
These results demonstrate that, even within a relatively homogeneous population of patients transferred with hypoxemic respiratory failure, a wide range of medications were administered. The CCT teams frequently initiated, titrated, and discontinued continuous infusions, in addition to providing numerous doses of bolused medications.
Wilcox SR, Saia MS, Waden H, McGahn SJ, Frakes M, Wedel SK, Richards JB. Medication Administration in Critical Care Transport of Adult Patients with Hypoxemic Respiratory Failure. Prehosp Disaster Med. 2015;30(4):1-5.
To develop latent classes of exposure to traumatic experiences before the age of 13 years in an urban community sample and to use these latent classes to predict the development of negative behavioral outcomes in adolescence and young adulthood.
A total of 1815 participants, enrolled as children in an epidemiologically based, randomized field trial, completed comprehensive psychiatric assessments as young adults. Reported experiences of nine traumatic events before age 13 years were used in a latent class analysis to create latent profiles of traumatic experiences. Latent classes were used to predict psychiatric outcomes at age ⩾13 years, criminal convictions, physical health problems and traumatic experiences reported in young adulthood.
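For readers unfamiliar with latent class analysis, the sketch below fits a binary-indicator LCA by expectation-maximisation on simulated data. The class count of three mirrors the result reported below; the item set, software, and estimation details of the actual study are not reproduced, and dedicated LCA software would normally be used.

```python
import numpy as np

# Toy binary-indicator latent class analysis fitted by EM. Hypothetical and
# simplified: K classes, items conditionally independent given class.

def fit_lca(Y, K=3, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    pi = np.full(K, 1.0 / K)                 # class prevalences
    theta = rng.uniform(0.25, 0.75, (K, m))  # P(item = 1 | class)
    for _ in range(iters):
        # E-step: posterior probability of each class for each respondent.
        loglik = (Y[:, None, :] * np.log(theta) +
                  (1 - Y[:, None, :]) * np.log(1 - theta)).sum(axis=2)
        logpost = np.log(pi) + loglik
        logpost -= logpost.max(axis=1, keepdims=True)  # numerical stability
        post = np.exp(logpost)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: re-estimate prevalences and item-response probabilities.
        pi = post.mean(axis=0)
        theta = ((post.T @ Y) / post.sum(axis=0)[:, None]).clip(1e-6, 1 - 1e-6)
    return pi, theta, post

# Hypothetical data: 1815 respondents x 9 binary trauma-exposure indicators.
Y = (np.random.default_rng(1).random((1815, 9)) < 0.15).astype(float)
pi, theta, post = fit_lca(Y)
print("estimated class prevalences:", np.round(pi, 3))
```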
Three latent classes of childhood traumatic experiences were supported by the data. One class (8% of sample), primarily female, was characterized by experiences of sexual assault and reported significantly higher rates of a range of psychiatric outcomes by young adulthood. Another class (8%), primarily male, was characterized by experiences of violence exposure and reported higher levels of antisocial personality disorder and post-traumatic stress. The final class (84%) reported low levels of childhood traumatic experiences. Parental psychopathology was related to membership in the sexual assault group.
Classes of childhood traumatic experiences predict specific psychiatric and behavioral outcomes in adolescence and young adulthood. The long-term adverse effects of childhood traumas are primarily concentrated in victims of sexual and non-sexual violence. Gender emerged as a key covariate in the classes of trauma exposure and outcomes.
The first aim was to use confirmatory factor analysis (CFA) to test a hypothesis that two factors (internalizing and externalizing) account for lifetime co-morbid DSM-IV diagnoses among adults with bipolar I (BPI) disorder. The second aim was to use confirmatory latent class analysis (CLCA) to test the hypothesis that four clinical subtypes are detectable: pure BPI; BPI plus internalizing disorders only; BPI plus externalizing disorders only; and BPI plus internalizing and externalizing disorders.
A cohort of 699 multiplex BPI families was studied, ascertained and assessed (1998–2003) by the National Institute of Mental Health Genetics Initiative Bipolar Consortium: 1156 with BPI disorder (504 adult probands; 594 first-degree relatives; and 58 more distant relatives) and 563 first-degree relatives without BPI. Best-estimate consensus DSM-IV diagnoses were based on structured interviews, family history and medical records. MPLUS software was used for CFA and CLCA.
The two-factor CFA model fit the data very well, and could not be improved by adding or removing paths. The four-class CLCA model fit better than exploratory LCA models or post-hoc-modified CLCA models. The two factors and four classes were associated with distinctive clinical course and severity variables, adjusted for proband gender. Co-morbidity, especially more than one internalizing and/or externalizing disorder, was associated with a more severe and complicated course of illness. The four classes demonstrated significant familial aggregation, adjusted for gender and age of relatives.
The BPI two-factor and four-cluster hypotheses demonstrated substantial confirmatory support. These models may be useful for subtyping BPI disorders, predicting course of illness and refining the phenotype in genetic studies.
To examine cross-national patterns and correlates of lifetime and 12-month comorbid DSM-IV anxiety disorders among people with lifetime and 12-month DSM-IV major depressive disorder (MDD).
Nationally or regionally representative epidemiological interviews were administered to 74 045 adults in 27 surveys across 24 countries in the WHO World Mental Health (WMH) Surveys. DSM-IV MDD, a wide range of comorbid DSM-IV anxiety disorders, and a number of correlates were assessed with the WHO Composite International Diagnostic Interview (CIDI).
45.7% of respondents with lifetime MDD (inter-quartile range (IQR) 32.0–46.5% across surveys) had one or more lifetime anxiety disorders. A slightly higher proportion of respondents with 12-month MDD had lifetime anxiety disorders (51.7%, IQR 37.8–54.0%) and only slightly lower proportions of respondents with 12-month MDD had 12-month anxiety disorders (41.6%, IQR 29.9–47.2%). Two-thirds (68%) of respondents with lifetime comorbid anxiety disorders and MDD reported an earlier age-of-onset (AOO) of their first anxiety disorder than of their MDD, while 13.5% reported an earlier AOO of MDD and the remaining 18.5% reported the same AOO for both disorders. Women and previously married people had consistently elevated rates of lifetime and 12-month MDD as well as comorbid anxiety disorders. Consistently higher proportions of respondents with 12-month anxious than non-anxious MDD reported severe role impairment (64.4 v. 46.0%; χ²₁ = 187.0, p < 0.001) and suicide ideation (19.5 v. 8.9%; χ²₁ = 71.6, p < 0.001). Significantly more respondents with 12-month anxious than non-anxious MDD received treatment for their depression in the 12 months before interview, but this difference was more pronounced in high-income countries (68.8 v. 45.4%; χ²₁ = 108.8, p < 0.001) than in low/middle-income countries (30.3 v. 20.6%; χ²₁ = 11.7, p < 0.001).
Patterns and correlates of comorbid DSM-IV anxiety disorders among people with DSM-IV MDD are similar across WMH countries. The narrow IQR of the proportion of respondents whose anxiety disorders had a temporally prior AOO relative to their comorbid MDD (69.6–74.7%) is especially noteworthy. However, the fact that these proportions are not higher among respondents with 12-month than lifetime comorbidity means that temporal priority of lifetime anxiety disorders over MDD is not related to MDD persistence among people with anxious MDD. This, in turn, raises complex questions about the relative importance of temporally primary anxiety disorders as risk markers v. causal risk factors for subsequent MDD onset and persistence, including the possibility that anxiety disorders might primarily be risk markers for MDD onset and causal risk factors for MDD persistence.