Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. To decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission: Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period as compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but actually increased for subjects who had activation (48% versus 58%; 95% CI, 4.6%-15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG, from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
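As a rough illustration of the arithmetic behind these figures, the sketch below (Python) shows how a PPV and a crude odds ratio are formed. The two PPVs are the abstract's reported values; all counts are invented, and the published OR of 1.4 presumably reflects model adjustment, so the crude ratio differs.

```python
# Illustrative arithmetic only: PPV of cath-lab activation and a crude odds
# ratio. The two PPVs are the abstract's reported values; the published OR of
# 1.4 is model-based, so the crude ratio computed here differs slightly.

def ppv(confirmed_pci_or_cabg: int, activations: int) -> float:
    """Positive predictive value: confirmed PCI/CABG cases per activation."""
    return confirmed_pci_or_cabg / activations

# Hypothetical counts showing how a PPV is formed:
print(f"example PPV: {ppv(190, 500):.1%}")                 # 38.0% from invented counts

ppv_pre, ppv_post = 0.379, 0.486                           # reported pre/post PPVs
odds = lambda p: p / (1 - p)
print(f"crude OR: {odds(ppv_post) / odds(ppv_pre):.2f}")   # ~1.55 vs. the adjusted 1.4
```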
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built Markov simulation models to compare the unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000, depending on study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
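A minimal sketch of the kind of probabilistic sensitivity analysis described above, not the authors' model: every parameter range, cost rate and distribution below is an assumption chosen for illustration.

```python
# Sketch of a probabilistic sensitivity analysis comparing data-related costs of
# a registry-based vs. standard trial. All parameters and distributions are
# assumed for illustration; they are not the authors' model inputs.
import numpy as np

rng = np.random.default_rng(0)
n_sims = 10_000

n_patients = rng.integers(200, 5_000, n_sims)          # trial size
fields_per_patient = rng.integers(50, 500, n_sims)     # data elements per patient
registry_coverage = rng.uniform(0.3, 0.9, n_sims)      # fraction already in registry
abstraction_sec = rng.uniform(3, 60, n_sims)           # seconds to hand-abstract a field
coordinator_rate = 35 / 3600                           # assumed coordinator cost, $/second
linkage_cost = rng.uniform(5_000, 50_000, n_sims)      # fixed linkage/cleaning cost

fields = n_patients * fields_per_patient
standard_cost = fields * abstraction_sec * coordinator_rate
registry_cost = (fields * (1 - registry_coverage) * abstraction_sec * coordinator_rate
                 + linkage_cost)

savings = standard_cost - registry_cost
print(f"registry cheaper in {(savings > 0).mean():.1%} of simulations")
print(f"median saving = ${np.median(savings):,.0f}")
```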
OBJECTIVES/SPECIFIC AIMS: More men than women develop urinary stones, and stone prevalence changes in women with menopause, suggesting a steroidal influence. In men, the incidence of stones is highest during July and August, suggesting that environmental factors such as vitamin D (VitD), a steroid, may affect stone formation. Previous studies have found differences in the development of stones between men and women; however, the reasons for sex differences in stone formation and type remain unclear. METHODS/STUDY POPULATION: We examined VitD levels in men and women (n = 18,753) who had no diseases, based on the absence of any ICD-9 or ICD-10 code in their electronic medical record. We found that normal, healthy women had significantly higher serum VitD levels than men (p = 6×10⁻⁶). We then examined whether sex differences existed for key endpoints/data from the Mayo Clinic Urinary Stone Disease (USD) Registry, which includes around 1,600 urinary stone patients who are well-phenotyped according to sex, age and stone type. RESULTS/ANTICIPATED RESULTS: Control women were found to have higher serum VitD levels than men, but this sex difference was no longer present in kidney stone disease patients. When we further separated by race, we found that differences in VitD levels reappeared; this suggests that race also plays a role in serum VitD variance. DISCUSSION/SIGNIFICANCE OF IMPACT: We are developing a disease severity score, which we will use to correlate with serum VitD levels in patients according to sex, age and race. Future analyses will take into account whether subjects had VitD and calcium supplementation. This project begins to explore the mechanism behind the sex differences known to exist in urinary stone disease, an understanding that is critically needed to provide improved diagnosis and therapy for this debilitating disease.
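The sketch below illustrates the kind of two-sample comparison that could sit behind a p-value of this magnitude; the values are simulated stand-ins, not the Mayo Clinic data.

```python
# Sketch: a two-sample comparison of serum VitD between sexes. Values are
# simulated stand-ins, not the Mayo Clinic records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
vitd_women = rng.normal(31.0, 12, 9_000)   # assumed ng/mL distributions
vitd_men = rng.normal(30.2, 12, 9_000)

t, p = stats.ttest_ind(vitd_women, vitd_men, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.1e}")
```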
The second year of life is a period of nutritional vulnerability. We aimed to investigate the dietary patterns and nutrient intakes from 1 to 2 years of age during the 12-month follow-up period of the Growing Up Milk – Lite (GUMLi) trial. The GUMLi trial was a multi-centre, double-blinded, randomised controlled trial of 160 healthy 1-year-old children in Auckland, New Zealand and Brisbane, Australia. Dietary intakes were collected at baseline and at 3, 6, 9 and 12 months post-randomisation, using a validated FFQ. Dietary patterns were identified using principal component analysis of the frequency of food item consumption per d. The effect of the intervention on dietary patterns and on the intake of eleven nutrients over the duration of the trial was investigated using random effects mixed models. A total of three dietary patterns were identified at baseline: ‘junk/snack foods’, ‘healthy/guideline foods’ and ‘breast milk/formula’. A significant group difference was observed in ‘breast milk/formula’ dietary pattern z scores at 12 months post-randomisation, where those in the GUMLi group loaded more positively on this pattern, suggesting more frequent consumption of breast milk. No difference was seen in the other two dietary patterns. Significant intervention effects were seen on nutrient intake between the GUMLi (intervention) and cows’ milk (control) groups, with lower protein and vitamin B12 intake, and higher Fe, vitamin D, vitamin C and Zn intake, in the GUMLi (intervention) group. The consumption of GUMLi did not affect dietary patterns; however, GUMLi participants had lower protein intake and higher Fe, vitamin D, vitamin C and Zn intake at 2 years of age.
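A brief sketch of deriving dietary-pattern z-scores from FFQ frequencies by principal component analysis, as described above; the food groups and frequencies are invented for illustration.

```python
# Sketch of deriving dietary patterns from FFQ data with PCA, as the trial
# describes. The food-group columns and frequencies are invented.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
foods = ["breast_milk", "formula", "fruit", "vegetables", "sweets", "crisps"]
ffq = pd.DataFrame(rng.poisson(2, (160, len(foods))), columns=foods)  # servings/day

z = StandardScaler().fit_transform(ffq)       # standardise each food-group frequency
pca = PCA(n_components=3)
scores = pca.fit_transform(z)                 # per-child pattern z-scores
loadings = pd.DataFrame(pca.components_.T, index=foods,
                        columns=["pattern_1", "pattern_2", "pattern_3"])
print(loadings.round(2))
```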
Soldier operational performance is determined by fitness, nutritional status, quality of rest/recovery, and remaining free of injury and illness. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data worldwide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on the vitamin D status (assessed from serum 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment than for late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05), and concentrations remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, as such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer term.
The objective of this study was to investigate the impact of the most commonly cited factors that may influence infants’ gut microbiota profiles at one year of age: mode of delivery, breastfeeding duration and antibiotic exposure. Barcoded V3/V4 amplicons of the bacterial 16S rRNA gene were prepared from the stool samples of 52 healthy 1-year-old Australian children and sequenced using the Illumina MiSeq platform. Following quality checks, the data were processed using the Quantitative Insights Into Microbial Ecology pipeline and analysed using the Calypso package for microbiome data analysis. The stool microbiota profiles of children still breastfed were significantly different from those of children weaned earlier (P<0.05), independent of the age of solid food introduction. Among children still breastfed, Veillonella spp. abundance was higher. Children no longer breastfed possessed a more ‘mature’ microbiota, with notable increases in Firmicutes. The microbiota profiles of the children could not be differentiated by delivery mode or antibiotic exposure. Further analysis based on children’s feeding patterns found that children who were breastfed alongside solid food had significantly different microbiota profiles from those of children who were receiving both breastmilk and formula milk alongside solid food. This study provided evidence that breastfeeding continues to influence the gut microbial community even in late infancy, when these children are also consuming table foods. At this age, any impacts from mode of delivery or antibiotic exposure did not appear to leave discernible imprints on the microbial community profiles of these healthy children.
We consider the effect of high rotation rates on two liquid layers that initially form concentric cylinders, centred on the axis of rotation. The configuration may be thought of as a fluid–fluid centrifuge. There are two types of perturbation to the interface that may be considered, an azimuthal perturbation around the circumference of the interface and a varicose perturbation in the axial direction along the length of the interface. It is the first of these types of perturbation that we consider here, and so the flow may be considered essentially two-dimensional, taking place in a circular domain. A linear stability analysis is carried out on a perturbation to the hydrostatic background state and a fourth-order Orr–Sommerfeld-like equation that governs the system is derived. We consider the dynamics of systems of stable and unstable configurations, inviscid and viscous fluids, immiscible fluid layers with surface tension and miscible fluid layers that may have some initial diffusion of density. In the most simple case of two layers of inviscid fluid separated by a sharp interface with no surface tension acting, we show that the effects of the curvature of the interface and the confinement of the system may be characterised by a modified Atwood number. The classical Atwood number is recovered in the limit of high azimuthal wavenumber, or the outer fluid layer being unconfined. Theoretical predictions are compared with numerical experiments and the agreement is shown to be good. We do not restrict our analysis to equal volume fluid layers and so our results also have applications in coating and lubrication problems in rapidly rotating systems and machinery.
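For reference, the classical Atwood number recovered in the high-wavenumber (or unconfined) limit is, writing $\rho_{\mathrm{in}}$ and $\rho_{\mathrm{out}}$ (our notation) for the densities of the inner and outer layers,

$$A = \frac{\rho_{\mathrm{out}} - \rho_{\mathrm{in}}}{\rho_{\mathrm{out}} + \rho_{\mathrm{in}}},$$

so that, with this sign convention, $A > 0$ (denser fluid outside) corresponds to the centrifugally stable arrangement. The precise form of the modified Atwood number incorporating curvature and confinement is derived in the paper itself and is not reproduced here.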
Research into the gut microbiota of human infants is necessary in order to better understand how inter-species interactions and ecological succession shape the diversity of the gut microbiota and, in turn, how the specific composition of the gut microbiota impacts host health both during infancy and in later years. Blastocystis is a ubiquitous intestinal protist that has been linked to a number of intestinal and extra-intestinal diseases. However, emerging data show that asymptomatic carriage is common and that Blastocystis is prevalent in the healthy adult gut microbiota. Nonetheless, little is known about the prevalence and diversity of this microorganism in the healthy infant gut, including when and how individuals become colonized by Blastocystis. Here, we surveyed the prevalence and diversity of Blastocystis in an infant population (n = 59) from an industrialized country (Ireland) using Blastocystis-specific primers at three or more time-points up to 24 months of age. Only three infants were positive for Blastocystis (prevalence = 5%), and only in samples collected at month 24. This prevalence is low relative to rates previously reported in the contemporaneous adult population. These data suggest that infants in Westernized countries who are successfully colonized by Blastocystis most likely acquire this microorganism via horizontal transfer.
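For concreteness, the snippet below reproduces the reported prevalence and adds a Wilson 95% confidence interval; the interval is our illustrative addition, not a figure from the study.

```python
# Sketch: prevalence of Blastocystis carriage (3/59, about 5%) with a Wilson 95% CI.
# The confidence interval is our addition for illustration; the abstract reports
# only the point estimate.
from statsmodels.stats.proportion import proportion_confint

positives, n = 3, 59
prevalence = positives / n
lo, hi = proportion_confint(positives, n, alpha=0.05, method="wilson")
print(f"prevalence = {prevalence:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
```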
The Tasmanian Cenozoic macrofossil record is relatively rich, and changes that have occurred in the vegetation of the region are becoming increasingly well understood. The record is essentially one of rainforest elements, especially in the Paleogene, but taxa that are now common in sclerophyllous heathlands and woodlands are increasingly prevalent in Quaternary sediments.
Extant Tasmanian rainforest is renowned for its beauty, and botanists have long recognised its marked taxonomic and structural similarity to other southern hemisphere ‘cool temperate’ forests of New Zealand and Chile. These are generally dominated by Nothofagus trees, their boughs laden with lichens and verdant shrouds of bryophytes. Other links are often made by phytogeographers to similar forests in high altitude regions of northern New South Wales and the much more species-rich vegetation of the generally montane regions of New Guinea and New Caledonia where Nothofagus also grows. A striking aspect of these forests is the presence of a variety of conifers, principally Podocarpaceae, but also Cupressaceae and Araucariaceae. In Tasmania the Araucariaceae are extinct, but the region is unique in the southern hemisphere in having a genus of Taxodiaceae, Athrotaxis. Athrotaxis spp. are often associated with Australia's only winter deciduous plant, Nothofagus gunnii, in montane regions of the island. The macrofossil record shows conclusively that the current diversity of Tasmania's woody rainforest flora is very much lower than at any other time during the Cenozoic. It confirms that there are strong floristic links to regions as widespread as eastern and southwestern mainland Australia, southern South America, New Zealand and New Guinea. In fact, Tasmanian Paleogene floras contain a wealth of taxa that are closely related to plants now confined to these regions.
Apart from the relatively large tracts of rainforest in Tasmania, closed forest lacking eucalypts is now confined to small patches along the east coast of Australia. In contrast to mainland Australia, Tasmania is relatively mountainous and has a well-developed woody alpine vegetation, dominated by shrubs of the Asteraceae, Epacridaceae, Myrtaceae and Proteaceae.
A number of socio-economic, biological and lifestyle characteristics change with advancing age and place very old adults at increased risk of micronutrient deficiencies. The aim of this study was to assess vitamin and mineral intakes and respective food sources in 793 85-year-olds (302 men and 491 women) in the North-East of England, participating in the Newcastle 85+ Study. Micronutrient intakes were estimated using a multiple-pass recall tool (2×24 h recalls). Determinants of micronutrient intake were assessed with multinomial logistic regression. Median vitamin D, Ca and Mg intakes were 2·0 (interquartile range (IQR) 1·2–6·5) µg/d, 731 (IQR 554–916) mg/d and 215 (IQR 166–266) mg/d, respectively. Fe intake was 8·7 (IQR 6·7–11·6) mg/d, and Se intake was 39·0 (IQR 27·3–55·5) µg/d. Cereals and cereal products were the top contributors to intakes of folate (31·5 %), Fe (49·2 %) and Se (46·7 %) and the second-highest contributors to intakes of vitamin D (23·8 %), Ca (27·5 %) and K (15·8 %). More than 95 % (n 756) of the participants had vitamin D intakes below the UK’s Reference Nutrient Intake (10 µg/d). In all, >20 % of the participants were below the Lower Reference Nutrient Intake for Mg (n 175), K (n 238) and Se (n 418) (comparisons with dietary reference values (DRV) do not include supplements). As most DRV are not age specific and have been extrapolated from younger populations, results should be interpreted with caution. Participants who had higher education, were from a higher social class and were more physically active had more nutrient-dense diets. More studies are needed to inform the development of age-specific DRV for micronutrients for the very old.
Very old people (defined as those aged 85 years and over) are the fastest-growing age segment of many Western societies, owing to the steady rise of life expectancy and the decrease in later-life mortality. In the UK, there are now more than 1·5 million very old people (2·5 % of the total population), and the number is projected to rise to 3·3 million, or 5 %, over the next 20 years. Reduced mobility and independence, financial constraints, higher rates of hospitalisation, chronic diseases and disabilities, and changes in body composition, taste perception, and the digestion and absorption of food all potentially influence either nutrient intake or needs at this stage of life. The nutritional needs of the very old have been identified as a research priority by the British Nutrition Foundation's Task Force report, Healthy Ageing: The Role of Nutrition and Lifestyle. However, very little is known about the dietary habits and nutritional status of the very old. The Newcastle 85+ Study, a cohort of more than 1000 85-year-olds from the North East of England, and the Life and Living in Advanced Age study (New Zealand), a bicultural cohort study of advanced ageing of more than 900 participants from the Bay of Plenty and Rotorua regions of New Zealand, are two unique cohort studies of ageing which aim to assess the spectrum of health in the very old as well as examine the associations of health trajectories and outcomes with biological, clinical and social factors as each cohort ages. The nutrition domain included in both studies will help to fill the evidence gap by identifying eating patterns and measures of nutritional status associated with better, or worse, health and wellbeing. This review will explore some of this ongoing work.
Food and nutrient intake data are scarce for very old adults (85 years and older) – one of the fastest-growing age segments of Western societies, including the UK. Our primary objective was to assess energy and macronutrient intakes and respective food sources in 793 85-year-olds (302 men and 491 women) living in North-East England and participating in the Newcastle 85+ Study. Dietary information was collected using a repeated multiple-pass recall (2×24 h recalls). Energy, macronutrient and NSP intakes were estimated, and the contribution (%) of food groups to nutrient intake was calculated. The median energy intake was 6·65 (interquartile range (IQR) 5·49–8·16) MJ/d – 46·8 % was from carbohydrates, 36·8 % from fats and 15·7 % from proteins. NSP intake was 10·2 g/d (IQR 7·3–13·7) and was higher in 85-year-olds who were non-institutionalised, more educated, from a higher social class and more physically active. Cereals and cereal products were the top contributors to intakes of energy and most macronutrients (carbohydrates, non-milk extrinsic sugars, NSP and fat), followed by meat and meat products. The median intakes of energy and NSP were much lower than the estimated average requirement for energy (9·6 MJ/d for men and 7·7 MJ/d for women) and the dietary reference value (DRV) for NSP (≥18 g/d). The median SFA intake was higher than the DRV (≤11 % of dietary energy). This study highlights the paucity of data on dietary intake and the uncertainties about DRV for this age group.
Integrated weed management (IWM) for agronomic and vegetable production systems utilizes all available options to effectively manage weeds. Late-season weed control measures are often needed to improve crop harvest and stop additions to the weed seed bank. Eliminating the production of viable weed seeds is one of the key IWM practices. The objective of this research was to determine how termination method and timing influence viable weed seed production of late-season weed infestations. Research was conducted in Delaware, Michigan, and New York over a 2-yr period. The weeds studied included common lambsquarters, common ragweed, giant foxtail, jimsonweed, and velvetleaf. Three termination methods were imposed: cutting at the plant base (simulating hand hoeing), chopping (simulating mowing), and applying glyphosate. The three termination timings were flowering, immature seeds present, and mature seeds present. Following termination, plants were stored in the field in mesh bags until mid-fall, when seeds were counted and tested for viability. Termination timing influenced viable seed development; however, termination method did not. Common ragweed and giant foxtail produced viable seeds when terminated at the time of flowering. All species produced some viable seed when immature seeds were present at the time of termination. The time of viable seed formation varied based on species and site-year, ranging from plants terminated the day of flowering to 1,337 growing degree d after flowering (base 10°C; 0 to 57 calendar d). Viable seed production was reduced by 64 to 100% when common lambsquarters, giant foxtail, jimsonweed, and velvetleaf were terminated with immature seeds present, compared to when plants were terminated with some mature seeds present. Our results suggest that terminating common lambsquarters, common ragweed, and giant foxtail prior to flowering, and velvetleaf and jimsonweed less than 2 and 3 wk after flowering, respectively, greatly reduces weed seed bank inputs.
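The growing-degree-day scale used above can be computed as follows; this is the standard daily-average method with a base of 10°C, and the authors may have used a capped or otherwise adjusted variant.

```python
# Sketch: accumulating growing degree days (GDD, base 10 °C) from flowering, the
# thermal-time scale used above. The simple averaging method is assumed.

def gdd(tmax_c: float, tmin_c: float, base_c: float = 10.0) -> float:
    """Daily growing degree days from max/min temperature, floored at zero."""
    return max(0.0, (tmax_c + tmin_c) / 2 - base_c)

# Hypothetical daily (tmax, tmin) readings after flowering:
temps = [(30, 18), (28, 16), (25, 12), (22, 10)]
cumulative = sum(gdd(tmax, tmin) for tmax, tmin in temps)
print(f"cumulative GDD = {cumulative:.1f}")  # compare with the 1,337 GDD threshold reported
```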
Calcium is considered important in buffering excess stomach acid in mammals, including horses. Control of stomach acid is important in preventing the development of ulcers within the stomach lining, which, in horses, are considered to be caused by acid splashing. Algae supplements contain various minerals in natural form, as seen in all plants and feedstuffs. The current trial was conducted to examine whether a high-calcium algae supplement had any impact on gastric ulceration in horses, possibly by buffering stomach acid and raising gastric pH in a gradual manner, without resorting to medication. Ten horses, of thoroughbred, standardbred or sport horse breed, were selected on the basis of the presence of ulcers in their stomach, as ascertained by endoscopy. The average ulceration score before algae supplementation was 2.2 ± 0.75 according to the EGUC scoring system. The horses were then maintained by the owner on their normal diet (unchanged from the initial ulcer scoring) with the addition of 40 g per day of the high-calcium, algae-based Maxia Complete® (Seahorse Supplements Ltd, Christchurch, NZ) for thirty days (T30). All horses were then re-endoscoped to assess any change in ulceration score. All horses showed a significant improvement in ulcer score, with seven having a score of zero (fully healed, no evidence of further ulceration) and two having a score of one (some residual inflammation or keratinosis in areas of healed ulcers). This resulted in a mean score of 0.3 ± 0.48 (P < 0.0001, T0 versus T30) at the end of the study. This trial demonstrated that feeding an organic form of high calcium from algae reduced ulceration in horses.
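A paired nonparametric test such as the Wilcoxon signed-rank test is one plausible way to compare pre- and post-supplementation ordinal scores of this kind; the scores in the sketch are invented, not the trial's data.

```python
# Sketch: a paired pre/post comparison of ordinal ulcer scores. The Wilcoxon
# signed-rank test is one reasonable choice; the scores below are invented
# examples, not the trial's data (the abstract reports its own P value).
from scipy import stats

pre = [3, 2, 2, 3, 1, 2, 3, 2, 2, 2]    # hypothetical T0 EGUC scores
post = [0, 0, 1, 0, 0, 0, 1, 0, 0, 1]   # hypothetical T30 scores

res = stats.wilcoxon(pre, post)
print(f"W = {res.statistic}, p = {res.pvalue:.4f}")
```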
There is now expert consensus that directly observing the work of trainee therapists, rather than relying upon self-report of sessions, is critical to providing the accurate feedback required to attain a range of competencies. In spite of this expert consensus, however, and the broadly positive attitudes towards video review among supervisees, video feedback methods remain under-utilized in clinical supervision. This paper outlines some of the weaknesses that affect feedback based solely on self-report methods, before introducing some of the specific benefits that video feedback methods can offer the training and supervision context. It is argued that video feedback methods fit seamlessly into CBT supervision, providing direct, accessible, effective, efficient and accurate observation of the learning situation, and optimizing the chances for accurate self-reflection and planning of further improvements in performance. To demonstrate the utility of video feedback techniques in CBT supervision, two specific techniques are introduced and described: the Give-me-5 technique and the I-spy technique. Case examples of CBT supervision using the two techniques are provided and explored, and guidance is given as to the supervision contexts in which each of the two techniques is suitable, individually and in tandem. Finally, best practice guidelines for the use of video feedback techniques in supervision are outlined.
The first observations by a worldwide network of advanced interferometric gravitational wave detectors offer a unique opportunity for the astronomical community. At design sensitivity, these facilities will be able to detect coalescing binary neutron stars to distances approaching 400 Mpc, and neutron star–black hole systems to 1 Gpc. Both of these sources are associated with gamma-ray bursts, which are known to emit across the entire electromagnetic spectrum. Gravitational wave detections provide the opportunity for ‘multi-messenger’ observations, combining gravitational wave observations with electromagnetic, cosmic ray, or neutrino observations. This review provides an overview of how Australian astronomical facilities and collaborations with the gravitational wave community can contribute to this new era of discovery, via contemporaneous follow-up observations from the radio through the optical to high energies. We discuss some of the frontier discoveries that will be made possible when this new window to the Universe is opened.
To determine if total lifetime physical activity (PA) is associated with better cognitive functioning with aging, and if cerebrovascular function mediates this association. A sample of 226 (52.2% female) community-dwelling middle-aged and older adults (66.5±6.4 years) in the Brain in Motion Study completed the Lifetime Total Physical Activity Questionnaire and underwent neuropsychological and cerebrovascular blood flow testing. Multiple robust linear regressions were used to model the associations between lifetime PA and global cognition after adjusting for age, sex, North American Adult Reading Test results (i.e., an estimate of premorbid intellectual ability), maximal aerobic capacity, body mass index and interactions between age, sex, and lifetime PA. Mediation analysis assessed the effect of cerebrovascular measures on the association between lifetime PA and global cognition. Post hoc analyses assessed the relations of past-year PA and current fitness to global cognition and to the cerebrovascular measures. Better global cognitive performance was associated with higher lifetime PA (p=.045), recreational PA (p=.021), vigorous-intensity PA (p=.004), PA between the ages of 0 and 20 years (p=.036), and PA between the ages of 21 and 35 years (p<.0001). Cerebrovascular measures did not mediate the association between lifetime PA and global cognition scores (p>.5), but partially mediated the relation between current fitness and global cognition. This study revealed significant associations between higher levels of PA (i.e., total lifetime, recreational, vigorous and past-year PA) and better cognitive function in later life. The relation of current fitness to cognitive function may be partially mediated by current cerebrovascular function. (JINS, 2015, 21, 816–830)
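The sketch below shows a simple regression-based mediation check of the kind described (PA influencing cognition partly through cerebrovascular function); the data are simulated, and the study's actual analysis involved fuller covariate adjustment and formal mediation methods.

```python
# Sketch of a simple regression-based mediation check (PA -> cerebrovascular
# function -> cognition). Data and effect sizes are simulated assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 226
pa = rng.normal(0, 1, n)                      # lifetime physical activity (z-score)
cvf = 0.3 * pa + rng.normal(0, 1, n)          # cerebrovascular function
cognition = 0.2 * pa + 0.4 * cvf + rng.normal(0, 1, n)

total = sm.OLS(cognition, sm.add_constant(pa)).fit()
direct = sm.OLS(cognition, sm.add_constant(np.column_stack([pa, cvf]))).fit()

print(f"total effect of PA:  {total.params[1]:.3f}")
print(f"direct effect of PA: {direct.params[1]:.3f}  (attenuation suggests mediation)")
```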
Considerable research has documented that exposure to traumatic events has negative effects on physical and mental health. Much less research has examined the predictors of traumatic event exposure. Increased understanding of risk factors for exposure to traumatic events could be of considerable value in targeting preventive interventions and anticipating service needs.
General population surveys in 24 countries with a combined sample of 68 894 adult respondents across six continents assessed exposure to 29 traumatic event types. Differences in prevalence were examined with cross-tabulations. Exploratory factor analysis was conducted to determine whether traumatic event types clustered into interpretable factors. Survival analysis was carried out to examine associations of sociodemographic characteristics and prior traumatic events with subsequent exposure.
Over 70% of respondents reported a traumatic event; 30.5% were exposed to four or more. Five types – witnessing death or serious injury, the unexpected death of a loved one, being mugged, being in a life-threatening automobile accident, and experiencing a life-threatening illness or injury – accounted for over half of all exposures. Exposure varied by country, sociodemographics and history of prior traumatic events. Being married was the most consistent protective factor. Exposure to interpersonal violence had the strongest associations with subsequent traumatic events.
Given the near ubiquity of exposure, limited resources may best be dedicated to those most likely to experience further exposure, such as victims of interpersonal violence. Identifying the mechanisms that account for the associations of prior interpersonal violence with subsequent trauma is critical to developing interventions to prevent revictimization.
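As an illustration of the survival-analysis approach mentioned in the methods, the sketch below fits a Cox model relating prior interpersonal violence to time until a subsequent traumatic event; the data frame and effect sizes are simulated assumptions, not the WMH survey data.

```python
# Sketch: a Cox model relating prior interpersonal violence to time until a
# subsequent traumatic event. All data are simulated; the surveys used
# person-year survival models with many more covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1_000
prior_violence = rng.integers(0, 2, n)
married = rng.integers(0, 2, n)
# Assumed effect: shorter times to next event for those with prior violence.
time = rng.exponential(10 / (1 + prior_violence), n)
event = rng.integers(0, 2, n)                  # 1 = subsequent exposure observed

df = pd.DataFrame({"time": time, "event": event,
                   "prior_violence": prior_violence, "married": married})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()
```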
Although interventions exist to reduce violent crime, optimal implementation requires accurate targeting. We report the results of an attempt to develop an actuarial model using machine learning methods to predict future violent crimes among US Army soldiers.
A consolidated administrative database for all 975 057 soldiers in the US Army in 2004–2009 was created in the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Of these soldiers, 5771 committed a first founded major physical violent crime (murder-manslaughter, kidnapping, aggravated arson, aggravated assault, robbery) over that time period. Temporally prior administrative records measuring socio-demographic, Army career, criminal justice, medical/pharmacy, and contextual variables were used to build an actuarial model for these crimes separately among men and women using machine learning methods (cross-validated stepwise regression, random forests, penalized regressions). The model was then validated in an independent 2011–2013 sample.
Key predictors were indicators of disadvantaged social/socioeconomic status, early career stage, prior crime, and mental disorder treatment. The area under the receiver-operating characteristic curve was 0.80–0.82 in 2004–2009 and 0.77 in the 2011–2013 validation sample. Of all administratively recorded crimes, 36.2% (among men) and 33.1% (among women) were committed by the 5% of soldiers with the highest predicted risk in 2004–2009, and an even higher proportion (50.5%) in the 2011–2013 validation sample.
Although these results suggest that the models could be used to target soldiers at high risk of violent crime perpetration for preventive interventions, final implementation decisions would require further validation and weighing of predicted effectiveness against intervention costs and competing risks.
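A compact sketch of the develop-then-validate workflow described above, using one of the named methods (penalized regression); the features, outcome prevalence and samples are simulated, not Army STARRS records.

```python
# Sketch: fit a penalized logistic regression on a development period, then
# validate AUC and top-5%-risk concentration on a later period. All data are
# simulated stand-ins for administrative records.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
beta = rng.normal(size=20) * 0.3                       # fixed "true" coefficients

def simulate(n):
    X = rng.normal(size=(n, 20))                       # stand-in admin features
    p = 1 / (1 + np.exp(-(X @ beta - 4.0)))            # rare outcome (~2%)
    return X, (rng.random(n) < p).astype(int)

X_train, y_train = simulate(50_000)                    # "development" sample
X_test, y_test = simulate(20_000)                      # "validation" sample

model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X_train, y_train)
risk = model.predict_proba(X_test)[:, 1]

print(f"validation AUC = {roc_auc_score(y_test, risk):.2f}")
top5 = risk >= np.quantile(risk, 0.95)                 # highest-risk 5%
print(f"share of events in top 5%: {y_test[top5].sum() / y_test.sum():.1%}")
```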