Risk of suicide-related behaviors is elevated among military personnel transitioning to civilian life. An earlier report showed that high-risk U.S. Army soldiers could be identified shortly before this transition with a machine learning model that included predictors from administrative systems, self-report surveys, and geospatial data. Based on this result, a Veterans Affairs and Army initiative was launched to evaluate a suicide-prevention intervention for high-risk transitioning soldiers. To make targeting practical, though, a streamlined model and risk calculator were needed that used only a short series of self-report survey questions.
Methods
We revised the original model using a sample of n = 8335 observations from participants in the Study to Assess Risk and Resilience in Servicemembers-Longitudinal Study (STARRS-LS) who completed one of three Army STARRS 2011–2014 baseline surveys while in service and one or more subsequent panel surveys (LS1: 2016–2018; LS2: 2018–2019) after leaving service. We trained ensemble machine learning models with constrained numbers of item-level survey predictors in a 70% training sample. The outcome was self-reported post-transition suicide attempts (SA). The models were validated in the 30% test sample.
Results
Twelve-month post-transition SA prevalence was 1.0% (s.e. = 0.1). The best constrained model, with only 17 predictors, had a test sample ROC-AUC of 0.85 (s.e. = 0.03). The 10–30% of respondents with the highest predicted risk included 44.9–92.5% of 12-month SAs.
Conclusions
An accurate SA risk calculator based on a short self-report survey can identify high-risk transitioning soldiers shortly before they leave service, enabling targeted intervention to prevent post-transition SA.
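Concentration-of-risk figures of the kind reported above (the share of observed SAs falling among the respondents with the highest predicted risk) can be computed from any vector of predicted probabilities and observed outcomes. A minimal illustrative sketch with toy data (not the study's code or results):

```python
import numpy as np

def cases_captured(y_true, y_score, top_frac):
    """Fraction of positive cases among the top `top_frac` of predicted risk."""
    n_top = int(round(top_frac * len(y_score)))
    top_idx = np.argsort(y_score)[::-1][:n_top]  # indices of highest predicted risk
    return y_true[top_idx].sum() / y_true.sum()

# toy example: 6 respondents, 2 attempts, both at high predicted risk
y = np.array([0, 0, 0, 0, 1, 1])
p = np.array([0.01, 0.02, 0.03, 0.10, 0.40, 0.60])
print(cases_captured(y, p, 0.3))  # → 1.0 (both attempts fall in the top ~30%)
```

In the abstract's terms, evaluating this function at `top_frac` values of 0.10–0.30 would yield the 44.9–92.5% capture range reported in the test sample.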
Identification of genetic risk factors may inform the prevention and treatment of posttraumatic stress disorder (PTSD). This study evaluates the associations of polygenic risk scores (PRS) with patterns of posttraumatic stress symptoms following combat deployment.
Method
US Army soldiers of European ancestry (n = 4900) provided genomic data and ratings of posttraumatic stress symptoms before and after deployment to Afghanistan in 2012. Latent growth mixture modeling was used to model posttraumatic stress symptom trajectories among participants who provided post-deployment data (n = 4353). Multinomial logistic regression models tested independent associations between trajectory membership and PRS for PTSD, major depressive disorder (MDD), schizophrenia, neuroticism, alcohol use disorder, and suicide attempt, controlling for age, sex, ancestry, and exposure to potentially traumatic events, and weighted to account for uncertainty in trajectory classification and missing data.
Results
Participants were classified into low-severity (77.2%), increasing-severity (10.5%), decreasing-severity (8.0%), and high-severity (4.3%) posttraumatic stress symptom trajectories. Standardized PTSD-PRS and MDD-PRS were associated with greater odds of membership in the high-severity v. low-severity trajectory [adjusted odds ratios and 95% confidence intervals, 1.23 (1.06–1.43) and 1.18 (1.02–1.37), respectively] and the increasing-severity v. low-severity trajectory [1.12 (1.01–1.25) and 1.16 (1.04–1.28), respectively]. Additionally, MDD-PRS was associated with greater odds of membership in the decreasing-severity v. low-severity trajectory [1.16 (1.03–1.31)]. No other associations were statistically significant.
Conclusions
Higher polygenic risk for PTSD or MDD is associated with more severe posttraumatic stress symptom trajectories following combat deployment. PRS may help stratify at-risk individuals, enabling more precise targeting of treatment and prevention programs.
Emotion reactivity and risk behaviors (ERRB) are transdiagnostic dimensions associated with suicide attempt (SA). ERRB patterns may identify individuals at increased risk of future SAs.
Methods
A representative sample of US Army soldiers entering basic combat training (n = 21 772) was surveyed and followed via administrative records for their first 48 months of service. Latent profile analysis of baseline survey items assessing ERRB dimensions, including emotion reactivity, impulsivity, and risk-taking behaviors, identified distinct response patterns (classes). SAs were identified using administrative medical records. A discrete-time survival framework was used to examine associations of ERRB classes with subsequent SA during the first 48 months of service, adjusting for time in service, socio-demographic and service-related variables, and mental health diagnosis (MH-Dx). We examined whether associations of ERRB classes with SA differed by year of service and for soldiers with and without a MH-Dx.
Results
Of 21 772 respondents (86.2% male, 61.8% White non-Hispanic), 253 made a SA. Four ERRB classes were identified: ‘Indirect Harming’ (8.9% of soldiers), ‘Impulsive’ (19.3%), ‘Risk-Taking’ (16.3%), and ‘Low ERRB’ (55.6%). Compared to Low ERRB, Impulsive [OR 1.8 (95% CI 1.3–2.4)] and Risk-Taking [OR 1.6 (95% CI 1.1–2.2)] had higher odds of SA after adjusting for covariates. The ERRB class and MH-Dx interaction was non-significant. Within each class, SA risk varied across service time.
Conclusions
SA risk within the four identified ERRB classes varied across service time. Impulsive and Risk-Taking soldiers had increased risk of future SA. MH-Dx did not modify these associations, which may therefore help identify risk in those not yet receiving mental healthcare.
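The discrete-time survival framework used above is conventionally fit as a logistic regression on a person-period dataset, in which each soldier contributes one record per interval at risk until the event or censoring. A minimal sketch of that data expansion (illustrative only; record layout and names are assumptions, not the study's):

```python
def person_period(records):
    """Expand (id, months_observed, had_event) records into person-period rows.

    Each subject contributes one row per month at risk; `event` is 1 only in
    the final month of a subject who experienced the event, and 0 in every
    other month, including all months of censored subjects.
    """
    rows = []
    for pid, months, had_event in records:
        for t in range(1, months + 1):
            rows.append({"id": pid, "month": t,
                         "event": int(had_event and t == months)})
    return rows

# soldier 1: event in month 3; soldier 2: censored after 2 months
rows = person_period([(1, 3, True), (2, 2, False)])
print([r["event"] for r in rows])  # → [0, 0, 1, 0, 0]
```

A logistic regression of `event` on time-in-service indicators plus covariates over these rows recovers the discrete-time hazard model.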
Personality traits (e.g. neuroticism) and the social environment predict risk for internalizing disorders and suicidal behavior. Studying these characteristics together and prospectively within a population confronted with high stressor exposure (e.g. U.S. Army soldiers) has not been done, yet could uncover unique and interactive predictive effects that may inform prevention and early intervention efforts.
Methods
Five broad personality traits and social network size were assessed via self-administered questionnaires among experienced soldiers preparing for deployment (N = 4645) and new soldiers reporting for basic training (N = 6216). Predictive models examined associations of baseline personality and social network variables with recent distress disorders or suicidal behaviors assessed 3- and 9-months post-deployment and approximately 5 years following enlistment.
Results
Among the personality traits, elevated neuroticism was consistently associated with increased mental health risk following deployment. Small social networks were also associated with increased mental health risk following deployment, beyond the variance accounted for by personality. Limited support was found for social network size moderating the association between personality and mental health outcomes. Small social networks also predicted distress disorders and suicidal behavior 5 years following enlistment, whereas unique effects of personality traits on these more distal outcomes were rare.
Conclusions
Heightened neuroticism and small social networks predict a greater risk for negative mental health sequelae, especially following deployment. Social ties may mitigate adverse impacts of personality traits on psychopathology in some contexts. Early identification and targeted intervention for these distinct, modifiable factors may decrease the risk of distress disorders and suicidal behavior.
In times of repeated disaster events, including natural disasters and pandemics, public health workers must recover rapidly to respond to subsequent events. Understanding predictors of time to recovery and developing predictive models of time to recovery can aid planning and management.
Methods:
We examined 681 public health workers (aged 21–72 years; mean [SD] age = 48.25 [10.15]; 79% female) 1 month before (T1) and 9 months after (T2) the 2005 hurricane season. Demographics, trauma history, social support, time to recover from the previous hurricane season, and predisaster work productivity were assessed at T1. T2 assessed previous disaster work, initial emotional response, and personal hurricane injury/damage. The primary outcome was time to recover from the most recent hurricane event.
Results:
Multivariate analyses found that less support (T1; odds ratio [OR] = 0.74 [95% confidence interval (CI) 0.60–0.92]), longer previous recovery time (T1; OR = 5.22 [95% CI 3.01–9.08]), lower predisaster work productivity (T1; OR = 1.98 [95% CI 1.08–3.61]), disaster-related personal injury/damage (T2; OR = 3.08 [95% CI 1.70–5.58]), and greater initial emotional response (T2; OR = 1.71 [95% CI 1.34–2.19]) were associated with longer recovery time (T2).
Conclusions:
Recovery time was longer among disaster responders with a history of longer recovery time, personal injury/damage, lower work productivity following prior hurricanes, and greater initial emotional response, whereas responders with more social support had shorter recovery times. Predictors of recovery time should be a focus for disaster preparedness planners.
The transition from military service to civilian life is a high-risk period for suicide attempts (SAs). Although stressful life events (SLEs) faced by transitioning soldiers are thought to be implicated, systematic prospective evidence is lacking.
Methods
Participants in the Army Study to Assess Risk and Resilience in Servicemembers (STARRS) completed baseline self-report surveys while on active duty in 2011–2014. Two self-report follow-up Longitudinal Surveys (LS1: 2016–2018; LS2: 2018–2019) were subsequently administered to probability subsamples of these baseline respondents. As detailed in a previous report, a SA risk index based on survey, administrative, and geospatial data collected before separation/deactivation identified 15% of the LS respondents who had separated/deactivated as being high-risk for self-reported post-separation/deactivation SAs. The current report presents an investigation of the extent to which self-reported SLEs occurring in the 12 months before each LS survey might have mediated/modified the association between this SA risk index and post-separation/deactivation SAs.
Results
The 15% of respondents identified as high-risk had a significantly elevated prevalence of some post-separation/deactivation SLEs. In addition, the associations of some SLEs with SAs were significantly stronger among predicted high-risk than lower-risk respondents. Demographic rate decomposition showed that 59.5% (s.e. = 10.2) of the overall association between the predicted high-risk index and subsequent SAs was linked to these SLEs.
Conclusions
It might be possible to prevent a substantial proportion of post-separation/deactivation SAs by providing high-risk soldiers with targeted preventive interventions for exposure/vulnerability to commonly occurring SLEs.
Problematic anger is frequently reported by soldiers who have deployed to combat zones. However, evidence is lacking with respect to how anger changes over a deployment cycle, and which factors prospectively influence change in anger among combat-deployed soldiers.
Methods
Reports of problematic anger were obtained from 7298 US Army soldiers who deployed to Afghanistan in 2012. A series of mixed-effects growth models estimated linear trajectories of anger over a period of 1–2 months before deployment to 9 months post-deployment, and evaluated the effects of pre-deployment factors (prior deployments and perceived resilience) on average levels and growth of problematic anger.
Results
A model with random intercepts and slopes provided the best fit, indicating heterogeneity in soldiers' levels and trajectories of anger. First-time deployers reported the lowest anger overall, but the most growth in anger over time. Soldiers with multiple prior deployments displayed the highest anger overall, which remained relatively stable over time. Higher pre-deployment resilience was associated with lower reports of anger, but its protective effect diminished over time. First- and second-time deployers reporting low resilience displayed different anger trajectories (stable v. decreasing, respectively).
Conclusions
Change in anger from pre- to post-deployment varies based on pre-deployment factors. The observed differences in anger trajectories suggest that efforts to detect and reduce problematic anger should be tailored for first-time v. repeat deployers. Ongoing screening is needed even for soldiers reporting high resilience before deployment, as the protective effect of pre-deployment resilience on anger erodes over time.
Definition of disorder subtypes may facilitate precision treatment for posttraumatic stress disorder (PTSD). We aimed to identify PTSD subtypes and evaluate their associations with genetic risk factors, types of stress exposures, comorbidity, and course of PTSD.
Methods
Data came from a prospective study of three U.S. Army Brigade Combat Teams that deployed to Afghanistan in 2012. Soldiers with probable PTSD (PTSD Checklist for Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition ≥31) at three months postdeployment comprised the sample (N = 423) for latent profile analysis using Gaussian mixture modeling and PTSD symptom ratings as indicators. PTSD profiles were compared on polygenic risk scores (derived from external genomewide association study summary statistics), experiences during deployment, comorbidity at three months postdeployment, and persistence of PTSD at nine months postdeployment.
Results
Latent profile analysis revealed profiles characterized by prominent intrusions, avoidance, and hyperarousal (threat-reactivity profile; n = 129), anhedonia and negative affect (dysphoric profile; n = 195), and high levels of all PTSD symptoms (high-symptom profile; n = 99). The threat-reactivity profile had the most combat exposure and the least comorbidity. The dysphoric profile had the highest polygenic risk for major depression, and more personal life stress and co-occurring major depression than the threat-reactivity profile. The high-symptom profile had the highest rates of concurrent mental disorders and persistence of PTSD.
Conclusions
Genetic and trauma-related factors likely contribute to PTSD heterogeneity, which can be parsed into subtypes that differ in symptom expression, comorbidity, and course. Future studies should evaluate whether PTSD typology modifies treatment response and should clarify distinctions between the dysphoric profile and depressive disorders.
Community characteristics, such as collective efficacy, a measure of community strength, can affect behavioral responses following disasters. We measured collective efficacy 1 month before multiple hurricanes in 2005, and assessed its association to preparedness 9 months following the hurricane season.
Methods:
Participants were 631 Florida Department of Health workers who responded to multiple hurricanes in 2004 and 2005. They completed questionnaires distributed electronically approximately 1 month before (June 2005; T1) and 9 months after (June 2006; T2) the several storms of the 2005 hurricane season. Collective efficacy, preparedness behaviors, and socio-demographics were assessed at T1, and preparedness behaviors and hurricane-related characteristics (injury, community-related damage) were assessed at T2. Participants' ages ranged from 21 to 72 years (mean [SD] = 48.50 [10.15]), and the majority were female (78%).
Results:
In linear regression models, univariate analyses indicated that being older (B = 0.01, SE = 0.003, P < 0.001), White (B = 0.22, SE = 0.08, P < 0.01), and married (B = 0.05, SE = 0.02, P < 0.001) was associated with preparedness following the 2005 hurricanes. Multivariate analyses, adjusting for socio-demographics, preparedness (T1), and hurricane-related characteristics (T2), found that higher collective efficacy (T1) was associated with preparedness after the hurricanes (B = 0.10, SE = 0.03, P < 0.01; and B = 0.47, SE = 0.04, P < 0.001, respectively).
Conclusion:
Programs enhancing collective efficacy may be a significant part of prevention practices and promote preparedness efforts before disasters.
This study examined the relationship of perceived safety and confidence in local law enforcement and government to changes in daily life activities during the Washington, DC, sniper attacks.
Methods:
Participants were 1238 residents from the Washington, DC metropolitan area who were assessed using an Internet survey that included items related to safety at work, at home, and in general, confidence in law enforcement/government, and changes in routine daily life activities.
Results:
A majority of participants (52%, n = 640) reported changing their daily life activities, with approximately one-third identifying changes related to being in large places and getting gas. Perceived safety was associated with confidence in local law enforcement/government. After adjusting for demographics, lower feelings of safety and less confidence in law enforcement/government were related to a higher likelihood of altered daily activities. Confidence in local law enforcement/government modified the association of safety with changes in daily activities. Among participants with high safety, less confidence in local law enforcement/government was associated with greater changes in daily life activities.
Conclusions:
Serial shooting events affect feelings of safety and disrupt routine life activities. Focus on enhancing experiences of safety and confidence in local law enforcement and government may decrease the life disruption associated with terrorist shootings.
Unit cohesion may protect service member mental health by mitigating effects of combat exposure; however, questions remain about the origins of potential stress-buffering effects. We examined buffering effects associated with two forms of unit cohesion (peer-oriented horizontal cohesion and subordinate-leader vertical cohesion) defined as either individual-level or aggregated unit-level variables.
Methods
Longitudinal survey data from US Army soldiers who deployed to Afghanistan in 2012 were analyzed using mixed-effects regression. Models evaluated individual- and unit-level interaction effects of combat exposure and cohesion during deployment on symptoms of post-traumatic stress disorder (PTSD), depression, and suicidal ideation reported at 3 months post-deployment (model n's = 6684 to 6826). Given the small effective sample size (k = 89), the significance of unit-level interactions was evaluated at a 90% confidence level.
Results
At the individual-level, buffering effects of horizontal cohesion were found for PTSD symptoms [B = −0.11, 95% CI (−0.18 to −0.04), p < 0.01] and depressive symptoms [B = −0.06, 95% CI (−0.10 to −0.01), p < 0.05]; while a buffering effect of vertical cohesion was observed for PTSD symptoms only [B = −0.03, 95% CI (−0.06 to −0.0001), p < 0.05]. At the unit-level, buffering effects of horizontal (but not vertical) cohesion were observed for PTSD symptoms [B = −0.91, 90% CI (−1.70 to −0.11), p = 0.06], depressive symptoms [B = −0.83, 90% CI (−1.24 to −0.41), p < 0.01], and suicidal ideation [B = −0.32, 90% CI (−0.62 to −0.01), p = 0.08].
Conclusions
Policies and interventions that enhance horizontal cohesion may protect combat-exposed units against post-deployment mental health problems. Efforts to support individual soldiers who report low levels of horizontal or vertical cohesion may also yield mental health benefits.
Whereas genetic susceptibility increases the risk for major depressive disorder (MDD), non-genetic protective factors may mitigate this risk. In a large-scale prospective study of US Army soldiers, we examined whether trait resilience and/or unit cohesion could protect against the onset of MDD following combat deployment, even in soldiers at high polygenic risk.
Methods
Data were analyzed from 3079 soldiers of European ancestry assessed before and after their deployment to Afghanistan. Incident MDD was defined as no MDD episode at pre-deployment, followed by a MDD episode following deployment. Polygenic risk scores were constructed from a large-scale genome-wide association study of major depression. We first examined the main effects of the MDD PRS and each protective factor on incident MDD. We then tested the effects of each protective factor on incident MDD across strata of polygenic risk.
Results
Polygenic risk showed a dose–response relationship to depression, such that soldiers at high polygenic risk had greatest odds for incident MDD. Both unit cohesion and trait resilience were prospectively associated with reduced risk for incident MDD. Notably, the protective effect of unit cohesion persisted even in soldiers at highest polygenic risk.
Conclusions
Polygenic risk was associated with new-onset MDD in deployed soldiers. However, unit cohesion – an index of perceived support and morale – was protective against incident MDD even among those at highest genetic risk, and may represent a potent target for promoting resilience in vulnerable soldiers. Findings illustrate the value of combining genomic and environmental data in a prospective design to identify robust protective factors for mental health.
Distinguishing a disorder of persistent and impairing grief from normative grief allows clinicians to identify this often undetected and disabling condition. As four diagnostic criteria sets for a grief disorder have been proposed, their similarities and differences need to be elucidated.
Methods
Participants were family members bereaved by US military service death (N = 1732). We conducted analyses to assess the accuracy of each criteria set in identifying threshold cases (participants who endorsed baseline Inventory of Complicated Grief ⩾30 and Work and Social Adjustment Scale ⩾20) and excluding those below this threshold. We also calculated agreement among criteria sets by varying numbers of required associated symptoms.
Results
All four criteria sets accurately excluded participants below our identified clinical threshold (i.e. correctly excluding 86–96% of those subthreshold), but they varied in identification of threshold cases (i.e. correctly identifying 47–82%). When the number of associated symptoms was held constant, criteria sets performed similarly. Accurate case identification was optimized when one or two associated symptoms were required. When employing optimized symptom numbers, pairwise agreements among criteria became correspondingly ‘very good’ (κ = 0.86–0.96).
Conclusions
The four proposed criteria sets describe a similar condition of persistent and impairing grief, but differ primarily in criteria restrictiveness. Diagnostic guidance for prolonged grief disorder in the International Classification of Diseases, 11th Revision (ICD-11) functions well, whereas the criteria put forth in Section III of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) are unnecessarily restrictive.
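The pairwise agreement statistics quoted above are Cohen's κ, computed between the binary case classifications produced by two criteria sets. A minimal illustrative implementation for two binary classifications (not the authors' code):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary classifications of the same participants."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                 # each rater's positive rate
    p_exp = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

# two criteria sets that disagree on one of four cases
print(cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0]))  # → 0.5
```

Values of 0.86–0.96, as reported when symptom requirements were optimized, fall in the conventional "very good" range for κ.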
Community characteristics, such as perceived collective efficacy, a measure of community strength, can affect mental health outcomes following disasters. We examined the association of perceived collective efficacy with posttraumatic stress disorder (PTSD) and frequent mental distress (14 or more mentally unhealthy days in the past month) following exposure to the 2004 and 2005 hurricane seasons.
Methods
Participants were 1486 Florida Department of Health workers who completed anonymous questionnaires that were distributed electronically 9 months after the 2005 hurricane season. Participant ages ranged from 20 to 79 years (mean, 48; SD, 10.7), and the majority were female (79%), white (75%), and currently married (64%). Fifty percent had a BA/BS degree or higher.
Results
In 2 separate logistic regression models, each adjusted for individual sociodemographics, community socioeconomic characteristics, individual injury/damage, and community storm damage, higher perceived collective efficacy was significantly associated with a lower likelihood of PTSD (OR, 0.93; 95% CI, 0.90–0.96) and with a lower likelihood of frequent mental distress (OR, 0.94; 95% CI, 0.92–0.96).
Conclusions
Programs enhancing community collective efficacy may be a significant part of prevention practices and possibly lead to a reduction in the rate of PTSD and persistent distress postdisaster. (Disaster Med Public Health Preparedness. 2019;13:44–52).
This study examined the relationship of sniper-related television viewing (TV) and perceived safety to posttraumatic stress (PTS) and depressive symptoms during the Washington, DC sniper attacks.
Methods
Participants were 1238 Washington, DC area residents assessed using an internet survey including the Impact of Event Scale-Revised, Patient Health Questionnaire-9, hours of TV, and perceived safety.
Results
Almost 40% (n = 459) of participants watched at least 2 hours of sniper-related TV daily. TV viewing was associated with lower total perceived safety. After adjusting for demographics, more TV viewing and decreased perceived safety were related to increased PTS and depressive symptoms. TV viewing modified the effect of safety on PTS and depressive symptoms. Among participants with low and high perceived safety, hours of TV were positively associated with PTS; however, the effect was stronger among those with low perceived safety. The relationship of TV to increased depressive symptoms was identified only in participants who reported low perceived safety.
Conclusions
The influence of media exposure and perceived safety have implications for intervention by community leaders and mental health care providers. Recommendations include limiting media exposure during a terrorist event, particularly among those who perceive that their safety is at risk, and targeting safety in communication strategies. (Disaster Med Public Health Preparedness. 2019;13:570-576)
Investigations of drinking behavior across military deployment cycles are scarce, and few prospective studies have examined risk factors for post-deployment alcohol misuse.
Methods
Prevalence of alcohol misuse was estimated among 4645 US Army soldiers who participated in a longitudinal survey. Assessments occurred 1–2 months before soldiers deployed to Afghanistan in 2012 (T0), upon their return to the USA (T1), 3 months later (T2), and 9 months later (T3). Weights-adjusted logistic regression was used to evaluate associations of hypothesized risk factors with post-deployment incidence and persistence of heavy drinking (HD; consuming 5+ alcoholic drinks at least 1–2 times/week) and alcohol or substance use disorder (AUD/SUD).
Results
Prevalence of past-month HD at T0, T2, and T3 was 23.3% (s.e. = 0.7%), 26.1% (s.e. = 0.8%), and 22.3% (s.e. = 0.7%); corresponding estimates for any binge drinking (BD) were 52.5% (s.e. = 1.0%), 52.5% (s.e. = 1.0%), and 41.3% (s.e. = 0.9%). Greater personal life stress during deployment (e.g., relationship, family, or financial problems) – but not combat stress – was associated with new onset of HD at T2 [per standard score increase: adjusted odds ratio (AOR) = 1.20, 95% CI 1.06–1.35, p = 0.003]; incidence of AUD/SUD at T2 (AOR = 1.54, 95% CI 1.25–1.89, p < 0.0005); and persistence of AUD/SUD at T2 and T3 (AOR = 1.30, 95% CI 1.08–1.56, p = 0.005). Any BD pre-deployment was associated with post-deployment onset of HD (AOR = 3.21, 95% CI 2.57–4.02, p < 0.0005) and AUD/SUD (AOR = 1.85, 95% CI 1.27–2.70, p = 0.001).
Conclusions
Alcohol misuse is common during the months preceding and following deployment. Timely intervention aimed at alleviating/managing personal stressors or curbing risky drinking might reduce risk of alcohol-related problems post-deployment.
A series of sniper attacks in the Washington, DC, area left 10 people dead and 3 wounded. We developed and tested a model that examined the unique and interdependent relationships of sniper-related television viewing, prior life-threatening events, and parental status to identification with attack victims.
Methods
Participants were 1238 residents of the DC area (aged 18-90 years, mean=41.7 years; 51% female; 68% white) who completed an online survey that assessed identification with sniper attack victims, amount of television viewing, and prior life-threatening events. Identification was measured by using a previously developed scale that assessed to what extent participants identified victims as similar to themselves, a friend, or a family member.
Results
The relationship of television viewing to identification was examined using multivariate linear regression analyses. In univariate analyses, female gender, having children, higher levels of television viewing, and past life-threatening events were each independently related to greater identification. After adjustment for demographics and life-threatening events, sniper-related television viewing remained associated with identification (B = 0.61, P ≤ 0.001, ΔR2 = 0.07). Interactions of television viewing with parental status and of television viewing with life-threatening events were also significant.
Conclusions
Attention to events preceding and during a terrorist event could help in the recognition of those at particular risk for increased identification with attack victims. These findings also have implications for recommendations for media exposure during an event. (Disaster Med Public Health Preparedness. 2018; 12: 337–344)