Traditionally, depression phenotypes have been defined based on interindividual differences that distinguish between subgroups of individuals expressing distinct depressive symptoms, often from cross-sectional data. Alternatively, depression phenotypes can be defined based on intraindividual differences, differentiating between transitory states of distinct symptom profiles that a person transitions into or out of over time. Such within-person phenotypic states are less examined, despite their potential significance for understanding and treating depression.
The current study used intensive longitudinal data from youths (N = 120) at risk for depression. Clinical interviews (at baseline and at 4, 10, 16, and 22 months) yielded 90 weekly assessments. We applied a multilevel hidden Markov model to identify intraindividual phenotypes of weekly depressive symptoms in at-risk youth.
Three intraindividual phenotypes emerged: a low-depression state, an elevated-depression state, and a cognitive-physical-symptom state. Youth had a high probability of remaining in the same state over time. Probabilities of transitioning from one state to another did not differ by age or ethnoracial minority status; however, girls were more likely than boys to transition from the low-depression state to either the elevated-depression state or the cognitive-physical-symptom state. Finally, these intraindividual phenotypes and their dynamics were associated with comorbid externalizing symptoms.
Identifying these states, as well as the transitions between them, characterizes how symptoms of depression change over time and provides potential directions for intervention efforts.
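The state dynamics described in this abstract (persistent states with occasional transitions) can be illustrated with a toy Markov transition matrix. This is a simplified single-level sketch, not the study's multilevel hidden Markov model, and the probabilities below are hypothetical rather than the study's estimates:

```python
import numpy as np

# Hypothetical 3-state weekly transition matrix (rows sum to 1).
# States: 0 = low depression, 1 = elevated depression, 2 = cognitive-physical.
# Large diagonal entries encode the reported tendency to remain in-state.
P = np.array([
    [0.90, 0.06, 0.04],
    [0.10, 0.85, 0.05],
    [0.08, 0.07, 0.85],
])

# Stationary distribution: the left eigenvector of P with eigenvalue 1,
# i.e. the pi solving pi = pi @ P, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Expected weeks spent in state k per visit (geometric holding time):
# 1 / (1 - P[k, k]).
sojourn = 1.0 / (1.0 - np.diag(P))
print("stationary:", pi.round(3), "expected sojourn (weeks):", sojourn.round(1))
```

With self-transition probabilities of 0.85–0.90, expected sojourn times are on the order of 7–10 weeks, which is one concrete way to read "a high probability of remaining in the same state over time."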
International reports suggest there have been prehospital delays for time-sensitive emergencies such as stroke and transient ischaemic attack (TIA) during the COVID-19 pandemic. Our aim was to investigate the impact of the COVID-19 pandemic on ambulance times and emergency call volume for adults with suspected stroke and TIA in Ireland.
We conducted a retrospective cohort study of patients ≥ 18 years with suspected stroke/TIA, based on data from the National Ambulance Service. We included all cases assigned code 28 (suspected stroke/TIA) by the emergency call-taker from 2018 to 2021. We compared ambulance times and emergency call volume by week, across the four COVID-19 waves (defined by the Health Protection Surveillance Centre), and annually. The COVID-19 period ran from March 1, 2020, to December 19, 2021, and the pre-COVID-19 period from January 1, 2018, to February 29, 2020. Continuous variables were compared with t-tests and categorical variables with Pearson’s χ2 tests.
In total, 40,012 cases were included: 20,281 in the pre-COVID-19 period and 19,731 in the COVID-19 period. Mean patient age decreased significantly between the two periods, from 71 years (±16.5) to 69.8 years (±17.1); p<0.001. Mean ambulance response time increased between the two periods, from 17 minutes 31 seconds to 18 minutes 59 seconds (p<0.001). The number of cases with symptom onset-to-emergency call time of >4 hours increased significantly, from 5,581 to 6,060, during the COVID-19 period (p<0.001). Mean call volume increased from 25.1 to 30.1 calls/day during the COVID-19 period.
Early findings from the study suggest an increase in call volume for stroke/TIA from the pre-COVID-19 to the COVID-19 period, along with an increase in response times. Longer symptom-to-call times suggest a change in healthcare-seeking behavior. Sustaining high levels of compliance with stroke code protocols is crucial during healthcare crises. Future research will include further analysis controlling for confounders.
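The between-period comparisons of continuous measures (patient age, response times) rest on two-sample t-tests. A minimal Welch's t-statistic, written out for illustration rather than drawn from the study's analysis code (the input samples below are invented), looks like:

```python
import math

def welch_t(x, y):
    """Welch's two-sample t statistic and degrees of freedom,
    the usual comparison for means with unequal variances."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    se2 = vx / nx + vy / ny                         # squared standard error
    t = (mx - my) / math.sqrt(se2)
    # Welch-Satterthwaite approximation to the degrees of freedom
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

# e.g. two small hypothetical samples of response times (minutes)
t_stat, dof = welch_t([17.4, 18.1, 16.9, 17.8], [19.2, 18.7, 19.5, 18.9])
```

The resulting t and df feed a t-distribution lookup for the p-value; in practice a library routine (e.g. one supporting unequal variances) would be used instead.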
This article argues that the European Central Bank (ECB), supported by the Court of Justice of the European Union (CJEU), can be perceived to have functionally softened the no sovereign lender of last resort (LOLR) rule originally implied by Articles 123 and 125 of the Treaty on the Functioning of the European Union (TFEU) towards a rule-with-exceptions and, increasingly, towards a presumption: The ECB will act as sovereign LOLR to a constituent Member unless and until that Member is insolvent or unwilling to cooperate with measures designed to restore market confidence. This functional moderation of a rule, from an ex ante specification of an outcome towards the exercise of greater choice at the point of application, carries with it contentious normative questions. To motivate discussion thereof beyond a largely ahistorical, non-indexical, rules versus discretion debate, the rules of the currency union are located within the genealogy of international exchange rate regimes. The “convertibility” rule of the gold standard and the “parity” rule of the Bretton Woods system are contrasted with their Eurozone equivalent. A consequentialist standpoint is sketched out from which the interventions of the ECB, in light of their available alternatives, appear broadly consistent with welfarist cost-benefit analysis and less normatively worrisome than by reference to evaluative criteria that emphasize a narrowly rule-bound conception of the rule of law.
Sports participation, physical activity, and friendship quality are theorized to have protective effects on the developmental emergence of substance use and self-harm behavior in adolescence, but existing research has been mixed. This ambiguity could reflect, in part, the potential for confounding of observed associations by genetic and environmental factors, which previous research has been unable to rigorously rule out. We used data from the prospective, population-based Child and Adolescent Twin Study in Sweden (n = 18,234 born 1994–2001) and applied a co-twin control design to account for potential genetic and environmental confounding of sports participation, physical activity, and friendship quality (assessed at age 15) as presumed protective factors for adolescent substance use and self-harm behavior (assessed at age 18). While confidence intervals widened to include the null in numerous co-twin control analyses adjusting for childhood psychopathology, parent-reported sports participation and twin-reported positive friendship quality were associated with increased odds of alcohol problems and nicotine use. However, parent-reported sports participation, twin-reported physical activity, and twin-reported friendship quality were associated with decreased odds of self-harm behavior. The findings provide a more nuanced understanding of the risks and benefits of putative protective factors for risky behaviors that emerge during adolescence.
Retrospective self-report is typically used for diagnosing previous pediatric traumatic brain injury (TBI). A new semi-structured interview instrument (New Mexico Assessment of Pediatric TBI; NewMAP TBI) was used to investigate test–retest reliability of TBI characteristics, both for the TBI that qualified for study inclusion and for lifetime history of TBI.
One hundred eighty-four children with mild TBI (mTBI; aged 8–18 years), 156 matched healthy controls (HC), and their parents completed the NewMAP TBI within 11 days (subacute; SA) and 4 months (early chronic; EC) of injury, with a subset returning at 1 year (late chronic; LC).
The test–retest reliability of common TBI characteristics [loss of consciousness (LOC), post-traumatic amnesia (PTA), retrograde amnesia, confusion/disorientation] and post-concussion symptoms (PCS) was examined across study visits. Aside from PTA, binary reporting (present/absent) for all TBI characteristics exhibited acceptable (≥0.60) test–retest reliability for both Qualifying and Remote TBIs across all three visits. In contrast, reliability for continuous data (exact duration) was generally unacceptable, with LOC and PCS meeting acceptable criteria at only half of the assessments. Transforming continuous self-report ratings into discrete categories based on injury severity resulted in acceptable reliability. Reliability was not strongly affected by the parent completing the NewMAP TBI.
Categorical reporting of TBI characteristics in children and adolescents can aid clinicians in retrospectively obtaining reliable estimates of TBI severity up to a year post-injury. However, test–retest reliability is strongly impacted by the initial data distribution, selected statistical methods, and potentially by patient difficulty in distinguishing among conceptually similar medical concepts (i.e., PTA vs. confusion).
To describe a pilot project infection prevention and control (IPC) assessment conducted in skilled nursing facilities (SNFs) in New York State (NYS) during a pivotal 2-week period when the region became the nation’s epicenter for coronavirus disease 2019 (COVID-19).
A telephone and video assessment of IPC measures in SNFs at high risk or experiencing COVID-19 activity.
SNFs in 14 New York counties, including New York City.
A 3-component remote IPC assessment: (1) screening tool; (2) telephone IPC checklist; and (3) COVID-19 video IPC assessment (ie, “COVIDeo”).
In total, 92 SNFs completed the IPC screening tool and checklist: 52 (57%) were conducted as part of COVID-19 investigations, and 40 (43%) were proactive prevention-based assessments. Among the 40 proactive assessments, 14 (35%) identified suspected or confirmed COVID-19 cases. COVIDeo was performed in 26 (28%) of 92 assessments and provided observations that other tools would have missed: personal protective equipment (PPE) that was not easily accessible, was redundant, or was improperly donned, doffed, or stored, as well as specific challenges implementing IPC in specialty populations. The IPC assessments took ∼1 hour each and reached an estimated 4 times as many SNFs as on-site visits in a similar time frame.
Remote IPC assessments by telephone and video were timely and feasible methods of assessing the extent to which IPC interventions had been implemented in a vulnerable setting and to disseminate real-time recommendations. Remote assessments are now being implemented across New York State and in various healthcare facility types. Similar methods have been adapted nationally by the Centers for Disease Control and Prevention.
Authority of the European Central Bank (the Bank) over its operational norms in the eyes of market actors – Exogenous and endogenous authority and legitimacy – The reconciliation by the Bank and the Court of Justice of the EU (the Court) of the pre-existing norm and political-economic reality with Article 123 TFEU – Sovereign lender of last resort – Eurozone Crisis – Outright Monetary Transactions (OMT) – Public Sector Purchase Programme (PSPP) – Pandemic Emergency Purchase Programme (PEPP)
Horseweed and giant ragweed are competitive annual weeds that can negatively impact crop yield. Biotypes of glyphosate-resistant (GR) giant ragweed and horseweed were first reported in Ontario in 2008 and 2010, respectively. GR horseweed has spread throughout the southern portion of the province. The presence of GR biotypes poses new challenges for soybean producers in Canada and the United States. Halauxifen-methyl is a recently registered selective herbicide against broadleaf weeds for preplant use in corn and soybean. There is limited literature on the efficacy of halauxifen-methyl on GR horseweed and giant ragweed when combined with currently registered products in Canada. The purpose of this study was to determine the effectiveness of halauxifen-methyl applied alone and tank-mixed to control GR giant ragweed and GR horseweed in glyphosate- and dicamba-resistant (GDR) soybean in southwestern Ontario. Six field experiments were conducted separately for each weed species over 2018 and 2019. Halauxifen-methyl applied alone offered 72% control of GR horseweed at 8 wk after application (WAA). Control improved to >91% when halauxifen-methyl was applied in combination with metribuzin, saflufenacil, chlorimuron-ethyl + metribuzin, and saflufenacil + metribuzin. At 8 WAA, halauxifen-methyl provided 11% control of GR giant ragweed, and 76% to 88% control when glyphosate/2,4-D choline, glyphosate/dicamba, glyphosate/2,4-D choline + halauxifen-methyl, and glyphosate/dicamba + halauxifen-methyl were used. We conclude that halauxifen-methyl applied preplant in a tank mixture can provide effective control of GR giant ragweed and horseweed in GDR soybean.
Preplant (PP) herbicide applications are an important tool within an integrated weed management system, specifically in no-till production. An understanding of crop tolerance regarding PP applications is important for effectively integrating a new herbicide into no-till cropping systems. Twelve field trials (six in corn and six in soybean) were conducted over a 2-yr period (2018 and 2019) near Exeter and Ridgetown, ON. The purpose of these studies was to evaluate the tolerance of soybean and corn to halauxifen-methyl applied PP, PRE, or POST at the registered rate (5 g a.i. ha−1) and twice the registered rate (10 g a.i. ha−1), hereafter referred to as the 1× and 2× rates, respectively. All trials were kept weed-free throughout the growing season to remove the confounding effect of weed interference. Halauxifen-methyl applied 14 d preplant (DPP), 7 DPP, 1 DPP, and 5 d after seeding (DAS) at the 1× and 2× rates caused ≤10% visible soybean injury. In contrast, halauxifen-methyl applied POST (cotyledon–unifoliate stage, VE-VC) caused 67% to 87% visible soybean injury, a 50% to 53% reduction in height, 65% to 81% decrease in population, 56% to 67% lower biomass, and 53% to 63% decline in yield. Halauxifen-methyl applied 10 DPP, 5 DPP, 1 DPP, 5 DAS, and POST (spike–one leaf stage, VE-V1) at the 1× and 2× rates caused ≤3% visible corn injury and had no effect on corn height or biomass. Halauxifen-methyl applied at VE-V1 at the 2× rate reduced corn yield by 10%. Based on these studies, the current application restriction of 7 DPP in soybean and 5 DPP in corn is conservative and could be expanded. Expanding the application window of halauxifen-methyl would increase the utility of this herbicide for producers.
Horseweed is a competitive summer or winter annual weed that produces up to 230,000 small seeds per plant that are capable of traveling more than 500 km via wind. Giant ragweed is a tall, highly competitive summer annual weed. Glyphosate-resistant (GR) horseweed and GR giant ragweed pose significant challenges for producers in the United States and Ontario, Canada. It is thought that an integrated weed management (IWM) system involving herbicide rotation is required to control GR biotypes. Halauxifen-methyl is a new selective broadleaf POST herbicide registered for use in cereal crops; there is limited information on its efficacy on horseweed and giant ragweed. The purpose of this research was to determine the efficacy of halauxifen-methyl applied POST, alone and in a tank mix, for the control of GR horseweed and GR giant ragweed in wheat across southwestern Ontario. For each weed species, an efficacy study consisting of six field experiments was conducted over a 2-yr period (2018, 2019). At 8 wk after application (WAA), halauxifen-methyl, fluroxypyr/halauxifen-methyl, fluroxypyr/halauxifen-methyl + MCPA EHE, fluroxypyr + MCPA ester, 2,4-D ester, clopyralid, and pyrasulfotole/bromoxynil + ammonium sulfate controlled GR horseweed >95%. Fluroxypyr and MCPA provided only 86% and 37% control of GR horseweed, respectively. At 8 WAA, fluroxypyr, fluroxypyr/halauxifen-methyl, fluroxypyr/halauxifen-methyl + MCPA EHE, fluroxypyr + MCPA ester, fluroxypyr/halauxifen-methyl + MCPA EHE + pyroxsulam, 2,4-D ester, clopyralid, and thifensulfuron/tribenuron + fluroxypyr + MCPA ester controlled GR giant ragweed 87%, 88%, 90%, 94%, 96%, 96%, 98%, and 93%, respectively. Halauxifen-methyl and pyroxsulam provided only 45% and 28% control of GR giant ragweed, respectively. Halauxifen-methyl applied alone POST in the spring controlled GR horseweed but not GR giant ragweed in winter wheat.
In New York City, a multi-disciplinary Mass Casualty Consultation team is proposed to support prioritization of patients for coordinated inter-facility transfer after a large-scale mass casualty event. This study examines factors that influence consultation team prioritization decisions.
As part of a multi-hospital functional exercise, 2 teams prioritized the same set of 69 patient profiles. Prioritization decisions were compared between teams. Agreement between teams was assessed based on patient profile demographics and injury severity. An investigator interviewed team leaders to determine reasons for discordant transfer decisions.
The 2 teams differed significantly in the total number of transfers recommended (49 vs 36; P = 0.003). However, there was substantial agreement when recommending transfer to burn centers, with 85.5% agreement and inter-rater reliability of 0.67 (confidence interval: 0.49–0.85). There was better agreement for patients with a higher acuity of injuries. Based on interviews, the most common reason for discordance was insider knowledge of the local community hospital and its capabilities.
A multi-disciplinary Mass Casualty Consultation team was able to rapidly prioritize patients for coordinated secondary transfer using limited clinical information. Training for consultation teams should emphasize guidelines for transfer based on existing services at sending and receiving hospitals, as knowledge of local community hospital capabilities influences physician decision-making.
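The agreement statistics reported above (85.5% agreement with an inter-rater reliability of 0.67, consistent with Cohen's kappa) can be computed from the two teams' decision lists. This is a generic sketch of the statistic, not the study's code:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters.
    a, b: equal-length lists of labels, e.g. 'transfer' / 'keep'
    decisions by the two consultation teams."""
    n = len(a)
    # Observed proportion of cases where the teams agreed
    po = sum(x == y for x, y in zip(a, b)) / n
    # Agreement expected by chance from each team's marginal rates
    ca, cb = Counter(a), Counter(b)
    pe = sum((ca[label] / n) * (cb[label] / n) for label in set(ca) | set(cb))
    return (po - pe) / (1 - pe)
```

Kappa near 0.67 is conventionally read as "substantial" agreement, matching the abstract's characterization of the burn-center transfer decisions.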
Objective: Post-stroke cognitive impairment is common, but mechanisms and risk factors are poorly understood. Frailty may be an important risk factor for cognitive impairment after stroke. We investigated the association between pre-stroke frailty and acute post-stroke cognition. Methods: We studied consecutively admitted acute stroke patients in a single urban teaching hospital during three recruitment waves between May 2016 and December 2017. Cognition was assessed using the mini-Montreal Cognitive Assessment (min=0; max=12). A Frailty Index was used to generate frailty scores for each patient (min=0; max=100). Clinical and demographic information was collected, including pre-stroke cognition, delirium, and stroke severity. We conducted univariate and multiple linear regression analyses with covariates forced in (age, sex, stroke severity, stroke type, pre-stroke cognitive impairment, delirium, and previous stroke/transient ischemic attack) to investigate the association between pre-stroke frailty and post-stroke cognition. Results: Complete data were available for 154 stroke patients. Mean age was 68 years (SD=11; range=32–97); 93 (60%) were male. Median mini-Montreal Cognitive Assessment score was 8 (IQR=4–12). Mean Frailty Index score was 18 (SD=11). Pre-stroke cognitive impairment was apparent in 13/154 (8%) patients. Pre-stroke frailty was significantly associated with lower post-stroke cognition (Standardized-Beta=−0.40; p<0.001), and this association was independent of covariates (Unstandardized-Beta=−0.05; p=0.005). Additional significant variables in the multiple regression model were age (Unstandardized-Beta=−0.05; p=0.002), delirium (Unstandardized-Beta=−2.81; p<0.001), pre-stroke cognitive impairment (Unstandardized-Beta=−2.28; p=0.001), and stroke severity (Unstandardized-Beta=−0.20; p<0.001).
Conclusions: Pre-stroke frailty may be a moderator of post-stroke cognition, independent of other well-established post-stroke cognitive impairment risk factors. (JINS, 2019, 25, 501–506)
We describe the motivation and design details of the ‘Phase II’ upgrade of the Murchison Widefield Array radio telescope. The expansion doubles to 256 the number of antenna tiles deployed in the array. The new antenna tiles enhance the capabilities of the Murchison Widefield Array in several key science areas. Seventy-two of the new tiles are deployed in a regular configuration near the existing array core. These new tiles enhance the surface brightness sensitivity of the array and will improve the ability of the Murchison Widefield Array to estimate the slope of the Epoch of Reionisation power spectrum by a factor of ∼3.5. The remaining 56 tiles are deployed on long baselines, doubling the maximum baseline of the array and improving the array u, v coverage. The improved imaging capabilities will provide an order of magnitude improvement in the noise floor of Murchison Widefield Array continuum images. The upgrade retains all of the features that have underpinned the Murchison Widefield Array’s success (large field of view, snapshot image quality, and pointing agility) and boosts the scientific potential with enhanced imaging capabilities and by enabling new calibration strategies.
Depression is a common post-stroke complication. Pre-stroke depression may be an important contributor; however, the epidemiology of pre-stroke depression is poorly understood. Using systematic review and meta-analysis, we described the prevalence of pre-stroke depression and its association with post-stroke depression.
We searched multiple cross-disciplinary databases from inception to July 2017 and extracted data on the prevalence of pre-stroke depression and its association with post-stroke depression. We assessed the risk of bias (RoB) using validated tools. We described summary estimates of prevalence and summary odds ratio (OR) for association with post-stroke depression, using random-effects models. We performed subgroup analysis describing the effect of depression assessment method. We used a funnel plot to describe potential publication bias. The strength of evidence presented in this review was summarised via ‘GRADE’.
Of 11 884 studies identified, 29 were included (total participants n = 164 993). Pooled prevalence of pre-stroke depression was 11.6% [95% confidence interval (CI) 9.2–14.7]; range: 0.4–24% (I² = 95.8%). Prevalence of pre-stroke depression varied by assessment method (p = 0.02), with clinical interview suggesting greater pre-stroke depression prevalence (~17%) than case-note review (9%) or self-report (11%). Pre-stroke depression was associated with increased odds of post-stroke depression; summary OR 3.0 (95% CI 2.3–4.0). All studies were judged to be at some RoB: 59% of included studies had an uncertain RoB in stroke assessment; 83% had high or uncertain RoB for pre-stroke depression assessment. The funnel plot indicated no evidence of publication bias. The strength of evidence based on GRADE was ‘very low’.
One in six stroke patients has experienced pre-stroke depression. Reported rates may be routinely underestimated due to limitations in assessment. Pre-stroke depression significantly increases the odds of post-stroke depression.
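The random-effects summaries above (pooled prevalence, summary OR) are conventionally produced with a DerSimonian-Laird estimator. A bare-bones version is shown below purely to make the pooling step concrete; the study effects passed in would be on an additive scale (e.g. log-odds or a transformed prevalence), and the numbers here are not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird).
    effects: per-study estimates on an additive scale (e.g. log-odds);
    variances: their within-study variances."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic around the fixed-effect mean
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    # Method-of-moments between-study variance, truncated at zero
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight with total (within + between) variance and pool
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2
```

With the high heterogeneity reported above (I² = 95.8%), tau² would be large and the random-effects interval correspondingly wider than a fixed-effect one, which is why the review's CIs are as broad as they are.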
We evaluated the extent to which providing training and technical assistance to early childcare centre (ECC) directors, faculty and staff in the implementation of evidence-based nutrition strategies improved the nutrition contexts, policies and practices of ECC serving racially and ethnically diverse, low-income children in Broward County, Florida, USA. The nutrition strategies targeted snack and beverage policies and practices, consistent with Caring for Our Children National Standards.
We used the nutrition observation and document review portions of the Environment and Policy Assessment and Observation (EPAO) instrument to observe ECC as part of a one-group pre-test/post-test evaluation design.
ECC located within areas of high rates of poverty, diabetes, minority representation and unhealthy food index in Broward County, Florida, USA.
Eighteen ECC enrolled, with a mean of 112·9 (sd 53·4) children aged 2–5 years, 12·3 (sd 7·2) staff members, and 10·2 (sd 4·6) children per staff member at each centre.
We found significant improvements in centres’ overall nutrition contexts, as measured by total EPAO nutrition scores (P=0·01). ECC made specific significant gains within written nutrition policies (P=0·03) and nutrition training and education (P=0·01).
Our findings support training ECC directors, faculty and staff in evidence-based nutrition strategies to improve the nutrition policies and practices of ECC serving racially and ethnically diverse children from low-income families. The intervention resulted in improvements in some nutrition policies and practices, but not others. There remains a need to further develop the evaluation base involving the effectiveness of policy and practice interventions within ECC serving children in high-need areas.
Many perennial bioenergy grasses have the potential to escape cultivation and invade natural areas. We quantify dispersal, a key component in invasion, for two bioenergy candidates: Miscanthus sinensis and M. × giganteus. For each species, approximately 1 × 10⁶ caryopses dispersed anemochorously from a point source into traps placed in annuli near the source (0.5 to 5 m; 1.6 to 16.4 ft) and in arcs (10 to 400 m) in the prevailing wind direction. For both species, most caryopses (95% for M. sinensis and 77% for M. × giganteus) were captured within 50 m of the source, but a small percentage (0.2 to 3%) were captured at 300 m and 400 m. Using a maximum-likelihood approach, we evaluated the degree of support in our empirical dispersal data for competing functions to describe seed-dispersal kernels. Fat-tailed functions (lognormal, Weibull, and gamma (Γ)) fit dispersal patterns best for both species overall, but because M. sinensis dispersal distances were significantly affected by wind speed, curves were also fit separately for dispersal distances in low, moderate, and high wind events. Wind speeds shifted the M. sinensis dispersal curve from a thin-tailed exponential function at low speeds to fat-tailed lognormal functions at moderate and high wind speeds. M. sinensis caryopses traveled farther in higher wind speeds (low, 30 m; moderate, 150 m; high, 400 m). Our results demonstrate the ability of Miscanthus caryopses to travel long distances and raise important implications for potential escape and invasion of fertile Miscanthus varieties from bioenergy cultivation.
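The model-selection step above (comparing a fat-tailed lognormal kernel against a thin-tailed exponential by maximum likelihood) can be sketched with closed-form MLEs for these two candidates. The functions below are illustrative only, the sample distances are invented, and the authors' analysis also considered Weibull and gamma kernels:

```python
import math

def aic_exponential(d):
    """Closed-form MLE and AIC for a thin-tailed exponential kernel
    fit to dispersal distances d (all > 0)."""
    lam = len(d) / sum(d)                          # MLE of the rate
    ll = sum(math.log(lam) - lam * x for x in d)   # log-likelihood
    return 2 * 1 - 2 * ll                          # AIC, k = 1 parameter

def aic_lognormal(d):
    """Closed-form MLE and AIC for a fat-tailed lognormal kernel."""
    logs = [math.log(x) for x in d]
    mu = sum(logs) / len(logs)                     # MLE of log-mean
    s2 = sum((l - mu) ** 2 for l in logs) / len(logs)  # MLE of log-variance
    ll = sum(-math.log(x * math.sqrt(2 * math.pi * s2))
             - (math.log(x) - mu) ** 2 / (2 * s2) for x in d)
    return 2 * 2 - 2 * ll                          # AIC, k = 2 parameters

# Hypothetical heavy-tailed capture distances (m): a few long-distance
# events alongside many near-source captures, as in the trap data.
sample = [0.5, 1.2, 2.0, 3.5, 6.0, 11.0, 60.0, 250.0]
best = "lognormal" if aic_lognormal(sample) < aic_exponential(sample) else "exponential"
```

Lower AIC indicates better support; with rare long-distance captures in the sample, the fat-tailed lognormal wins, mirroring the moderate- and high-wind results reported above.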
The Asian grass Miscanthus sinensis (Poaceae) is being considered for use as a bioenergy crop in the U.S. Corn Belt. Originally introduced to the United States for ornamental plantings, it escaped, forming invasive populations. The concern is that naturalized M. sinensis populations have evolved shade tolerance. We tested the hypothesis that seedlings from within the invasive U.S. range of M. sinensis would display traits associated with shade tolerance, namely increased area for light capture and phenotypic plasticity, compared with seedlings from the native Japanese populations. In a common garden experiment, seedlings of 80 half-sib maternal lines were grown from the native range (Japan) and 60 half-sib maternal lines from the invasive range (U.S.) under four light levels. Seedling leaf area, leaf size, growth, and biomass allocation were measured on the resulting seedlings after 12 wk. Seedlings from both regions responded strongly to the light gradient. High light conditions resulted in seedlings with greater leaf area, larger leaves, and a shift to greater belowground biomass investment, compared with shaded seedlings. Japanese seedlings produced more biomass and total leaf area than U.S. seedlings across all light levels. Generally, U.S. and Japanese seedlings allocated a similar amount of biomass to foliage and equal leaf area per leaf mass. Subtle differences in light response by region were observed for total leaf area, mass, growth, and leaf size. U.S. seedlings had slightly higher plasticity for total mass and leaf area but lower plasticity for measures of biomass allocation and leaf traits compared with Japanese seedlings. Our results do not provide general support for the hypothesis of increased M. sinensis shade tolerance within its introduced U.S. range compared with native Japanese populations.
The world’s largest outbreak of Ebola virus disease began in West Africa in 2014. Although few cases were identified in the United States, the possibility of imported cases led US public health systems and health care facilities to focus on preparing the health care system to quickly and safely identify and respond to emerging infectious diseases. In New York City, early, coordinated planning among city and state agencies and the health care delivery system led to a successful response to a single case diagnosed in a returned health care worker. In this article we describe public health and health care system preparedness efforts in New York City to respond to Ebola and conclude that coordinated public health emergency response relies on joint planning and sustained resources for public health emergency response, epidemiology and laboratory capacity, and health care emergency management. (Disaster Med Public Health Preparedness. 2017;11:370–374).