The history of maize (Zea mays L.) in the Eastern Woodlands remains an important topic of study. As currently understood, these histories appear to vary regionally and include scenarios positing an early introduction and an increase in use over hundreds of, if not a thousand, years. In this article, we address the history of maize in the American Bottom region of Illinois and its importance in the development of regional Mississippian societies, specifically in the Cahokian polity located in the central Mississippi River valley. We present new lines of evidence confirming that subsistence-level maize use at Cahokia was introduced rather abruptly at about AD 900 and increased rapidly over the following centuries. Directly dated archaeobotanical maize remains, human and dog skeletal carbon isotope values, and a revised interpretation of the archaeological record support this interpretation. Our results suggest that the population increase and nucleation associated with Cahokia were facilitated by the newly introduced practices of maize cultivation and consumption. Maize should be recognized as having had a key role in providing subsistence security that—combined with social, political, and religious changes—fueled the emergence of Cahokia in AD 1050.
Schizophrenia puts a significant burden on caregivers.
Objectives
To explore the effects of two long-acting treatments (LAT), the paliperidone palmitate 1-month and 3-month formulations, on caregiver burden (CGB) in European patients with schizophrenia, using the Involvement Evaluation Questionnaire (IEQ).
Aims
To conduct a subgroup analysis of two randomized, double-blind studies (NCT01515423 and NCT01529515).
Methods
Caregivers (≥ 1 h of contact/week with the patient) were invited to complete the IEQ (31 items, each scored 0–4; total score = sum of 27 items, range 0–108).
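For illustration, the IEQ total described above (the sum of 27 of the 31 items, each scored 0–4, giving a 0–108 range) can be computed as in the following minimal Python sketch; which 27 items enter the sum is an illustrative placeholder, not the instrument's actual scoring key.

```python
# Minimal sketch of the IEQ sum score described above. Assumption: which 27
# of the 31 items enter the total is illustrative, not the actual key.
def ieq_sum_score(item_responses, scored_items=range(27)):
    """item_responses: list of 31 integers, each 0-4.
    Returns the sum of the 27 scored items (range 0-108)."""
    assert len(item_responses) == 31
    assert all(0 <= r <= 4 for r in item_responses)
    return sum(item_responses[i] for i in scored_items)

# Example: a caregiver answering 2 on every item would score 2 * 27 = 54.
print(ieq_sum_score([2] * 31))  # 54
```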
Results
Among 756 European caregivers (53% parents, 18% spouse/partner or girlfriend/boyfriend, 10% sister/brother), 60% reported a CGB of ≥ 32 hours/week at open-label baseline (BL-OL). CGB was significantly reduced for patients with both a BL-OL and at least one double-blind IEQ sum score (n = 433): mean [SD] improvement of 9.9 [12.66] points (P < 0.001), from 26.0 [13.30] at BL-OL to 16.0 [10.47] at study end, with reductions in burden associated with worrying (2.9 points) and urging (4.3 points). CGB improved significantly in patients on prior oral antipsychotics after switching to LAT, with fewer leisure days affected and fewer hours spent in caregiving (P < 0.001). There was a significant relationship between improvements and relapse status, patient age (P < 0.001), age at diagnosis (P < 0.002), and number of prior psychiatric hospitalizations in the last 24 months (P < 0.05). Prior use of long-acting antipsychotics other than the paliperidone palmitate 1-month or 3-month formulations at BL-OL and the duration of prior psychiatric hospitalizations in the last 24 months did not show a significant effect on improvements.
Conclusion
Switching from an oral antipsychotic to an LAT can provide a meaningful and significant reduction in caregiver burden.
Disclosure of interest
All authors are employees of Janssen Research & Development, LLC and hold stocks in the company.
Reductions in insulin sensitivity in periparturient dairy cows develop as a means to support lactation; however, excessive mobilization of fatty acids (FA) increases the risk for peripartal metabolic disorders. Our objectives were to investigate the effect of prepartum body condition score (BCS) on systemic glucose and insulin tolerance and to compare direct and indirect measurements of insulin sensitivity in peripartal lean and overweight dairy cows. Fourteen multiparous Holstein cows were allocated into two groups according to their BCS at day −28 prepartum: lean (n = 7; BCS ≤ 3.0) or overweight (n = 7; BCS ≥ 4.0). Liver biopsies were performed on days −27, −14, and 4 relative to expected parturition. Intravenous insulin or glucose tolerance tests were performed following each liver biopsy. Relative to lean cows, overweight cows exhibited lower dry matter intake, lost more BCS, and displayed increased plasma FA and β-hydroxybutyrate concentrations and elevated liver lipid content during the peripartal period. Glucose clearance rate was lower for all cows postpartum. Prepartum BCS had minimal effects on insulin and glucose tolerance; however, the ability of the cow to restore blood glucose levels following an insulin challenge was suppressed by increased BCS. Glucose-dependent parameters of insulin and glucose tolerance were not correlated with surrogate indices of insulin sensitivity. We conclude that prepartum BCS had minimal effect on systemic insulin sensitivity following parturition. The observed inconsistency between surrogate indices of insulin sensitivity and direct measurements of insulin and glucose tolerance adds support to growing concerns regarding their usefulness as tools to estimate systemic insulin action in periparturient cows.
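One common way to express the glucose clearance rate mentioned above is the fractional disappearance rate between two post-challenge samples, assuming first-order glucose decay; the sketch below is a generic illustration, not necessarily the calculation used in this study, and the sampling times and concentrations are hypothetical.

```python
# Hedged sketch: fractional glucose clearance rate from two tolerance-test
# samples, assuming exponential glucose disappearance between them. The
# sampling times and concentrations below are hypothetical examples.
import math

def glucose_clearance_rate(c1, c2, t1, t2):
    """Return clearance rate in %/min from concentrations c1 at t1 and c2 at t2 (min)."""
    return (math.log(c1) - math.log(c2)) / (t2 - t1) * 100.0

print(round(glucose_clearance_rate(c1=180.0, c2=120.0, t1=5.0, t2=25.0), 2))  # ~2.03 %/min
```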
Family education programs (FEPs) target caregiving-related psychological distress for carers of relatives/friends diagnosed with serious mental health conditions. While FEPs are efficacious in reducing distress, the mechanisms are not fully known. Peer group support and greater mental health knowledge are proposed to reduce carers' psychological distress by reducing stigmatising attitudes and self-blame, and strengthening carers' relationship with their relative.
Methods
Adult carers (n = 1016) who participated in Wellways Australia's FEP from 2009 to 2016 completed self-report questionnaires at the core program's start and end, during the consolidation period, and at a 6-month follow-up. Those who enrolled early completed questionnaires prior to a wait-list period. We used linear mixed-effects modelling to assess the program's effectiveness using a naturalistic wait-list control longitudinal design, and multivariate latent growth modelling to test a theory-based process change model.
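The abstract does not name the software or variable coding used for the mixed-effects analysis; as a loose illustration of the idea (a random intercept per carer with fixed effects for assessment timepoint), here is a minimal Python sketch using statsmodels on simulated data. All column names and values are hypothetical.

```python
# Hedged sketch of a random-intercept mixed-effects model for repeated carer
# distress scores, in the spirit of the analysis described above; the data,
# column names, and effect sizes are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_carers, timepoints = 200, ["waitlist", "start", "end", "followup"]
df = pd.DataFrame({
    "carer_id": np.repeat(np.arange(n_carers), len(timepoints)),
    "timepoint": np.tile(timepoints, n_carers),
})
# Simulate declining distress after the core program plus a per-carer intercept.
effect = {"waitlist": 0.0, "start": 0.0, "end": -4.0, "followup": -4.5}
df["distress"] = (
    20
    + df["timepoint"].map(effect)
    + np.repeat(rng.normal(0, 3, n_carers), len(timepoints))
    + rng.normal(0, 2, len(df))
)

model = smf.mixedlm("distress ~ C(timepoint)", data=df, groups=df["carer_id"])
print(model.fit().summary())
```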
Results
While there was no significant change over the wait-list period, psychological distress, self-blame and stigmatising attitudes significantly decreased, and communication and relationship quality/feelings increased from the core program's start to its end. Changes were maintained throughout the consolidation period and follow-up. Peer group support significantly predicted the declining trajectory of distress. Peer group support and greater knowledge significantly predicted declining levels of self-blame and stigmatising attitudes, and increasing levels of communication.
Conclusions
This is the first study to quantitatively validate the mechanisms underlying the effect of FEPs on carers' psychological distress. Peer group support is key in modifying carers' appraisals of their friend or relative's condition. Continued implementation of FEPs within mental health service systems is warranted.
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of the clinical and environmental isolates implicated in those clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
Objective:
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Methods:
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
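As an illustration of how sequence-based cluster identification can work in principle, the sketch below groups isolates whose pairwise single-nucleotide variant (SNV) distances fall under a threshold; the 20-SNV threshold, the distance values, and the simple union-find grouping are hypothetical stand-ins for the study's cloud-based pipeline.

```python
# Hedged sketch: grouping isolates into potential transmission clusters by
# pairwise SNV distance. The distances and the 20-SNV threshold below are
# illustrative, not the study's actual pipeline or cut-off.
def snv_clusters(distances, threshold=20):
    """distances: dict {(isolate_a, isolate_b): snv_distance}.
    Returns clusters (sets) of isolates linked by <= threshold SNVs."""
    isolates = {i for pair in distances for i in pair}
    parent = {i: i for i in isolates}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for (a, b), d in distances.items():
        if d <= threshold:
            parent[find(a)] = find(b)

    groups = {}
    for i in isolates:
        groups.setdefault(find(i), set()).add(i)
    return [g for g in groups.values() if len(g) > 1]

# Hypothetical example: isolates A and B differ by 5 SNVs, C is unrelated.
print(snv_clusters({("A", "B"): 5, ("A", "C"): 4000, ("B", "C"): 3980}))
# [{'A', 'B'}] (element order may vary)
```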
Results:
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Conclusions:
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both by identifying cross-transmission events that would otherwise be missed and by excluding misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
Studies were conducted to determine sweetpotato tolerance to, and Palmer amaranth control by, a premix of flumioxazin and pyroxasulfone applied pretransplant (PREtr) followed by (fb) irrigation. Greenhouse studies were conducted in a factorial arrangement of four herbicide rates (flumioxazin/pyroxasulfone PREtr at 105/133 and 57/72 g ai ha⁻¹, S-metolachlor PREtr at 803 g ai ha⁻¹, and a nontreated check) by three irrigation timings [2, 5, and 14 d after transplanting (DAP)]. Field studies were conducted in a factorial arrangement of seven herbicide treatments (flumioxazin/pyroxasulfone PREtr at 40/51, 57/72, 63/80, and 105/133 g ha⁻¹; flumioxazin PREtr at 107 g ha⁻¹ fb S-metolachlor at 803 g ha⁻¹ 7 to 10 DAP; and season-long weedy and weed-free checks) by three 1.9-cm irrigation timings (0 to 2, 3 to 5, or 14 DAP). In greenhouse studies, flumioxazin/pyroxasulfone reduced sweetpotato vine length and shoot and storage root fresh biomass compared to the nontreated check and S-metolachlor. Irrigation timing had no influence on vine length or root fresh biomass. In field studies, Palmer amaranth control was ≥91% season-long regardless of flumioxazin/pyroxasulfone rate or irrigation timing. At 38 DAP, sweetpotato injury was ≤37% and ≤9% at locations 1 and 2, respectively. Visual estimates of sweetpotato injury from flumioxazin/pyroxasulfone were greater when irrigation was delayed 3 to 5 or 14 DAP (22% and 20%, respectively) compared to 0 to 2 DAP (7%) at location 1 but similar at location 2. Irrigation timing did not influence no. 1, jumbo, or marketable yields or root length-to-width ratio. With the exception of the 105/133 g ha⁻¹ rate, all rates of flumioxazin/pyroxasulfone resulted in marketable sweetpotato yield and root length-to-width ratio similar to flumioxazin fb S-metolachlor or the weed-free checks. In conclusion, flumioxazin/pyroxasulfone PREtr at 40/51, 57/72, and 63/80 g ha⁻¹ has potential for use in sweetpotato for Palmer amaranth control without causing significant crop injury or yield reduction.
A significant amount of the fuel consumed by marine vehicles is expended to overcome skin-friction drag resulting from turbulent boundary layer flows. Hence, a substantial reduction in this frictional drag would notably reduce cost and environmental impact. Superhydrophobic surfaces (SHSs), which entrap a layer of air underwater, have shown promise in reducing drag in small-scale applications and/or in laminar flow conditions. Recently, the efficacy of these surfaces in reducing drag resulting from turbulent flows has been shown. In this work we examine four different, mechanically durable, large-scale SHSs. When evaluated in fully developed turbulent flow, in the height-based Reynolds number range of 10 000 to 30 000, significant drag reduction was observed on some of the surfaces, dependent on their exact morphology. We then discuss how neither the roughness of the SHSs, nor the conventional contact angle goniometry method of evaluating the non-wettability of SHSs at ambient pressure, can predict their drag reduction under turbulent flow conditions. Instead, we propose a new characterization parameter, based on the contact angle hysteresis at higher pressure, which aids in the rational design of randomly rough, friction-reducing SHSs. Overall, we find that both the contact angle hysteresis at higher pressure, and the non-dimensionalized surface roughness, must be minimized to achieve meaningful turbulent drag reduction. Further, we show that even SHSs that are considered hydrodynamically smooth can cause significant drag increase if these two parameters are not sufficiently minimized.
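For readers unfamiliar with the quantities above, a minimal Python sketch of the height-based Reynolds number and the conventional definition of percent drag reduction (relative to a smooth baseline) follows; the fluid properties and shear-stress values are hypothetical examples, not data from this study.

```python
# Hedged sketch of two standard definitions referenced above; inputs are
# hypothetical illustrations, not measurements from the experiments.
def reynolds_number(velocity, height, kinematic_viscosity):
    """Re = U * H / nu for a height-based Reynolds number."""
    return velocity * height / kinematic_viscosity

def drag_reduction_percent(tau_baseline, tau_shs):
    """Percent reduction in wall shear stress versus the smooth baseline."""
    return 100.0 * (tau_baseline - tau_shs) / tau_baseline

# Water at ~20 C (nu ~ 1.0e-6 m^2/s), 3 m/s bulk velocity, 10 mm channel height:
print(round(reynolds_number(3.0, 0.010, 1.0e-6)))       # 30000
print(round(drag_reduction_percent(10.0, 8.5), 1))      # 15.0
```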
The unique phenotypic and genetic aspects of obsessive-compulsive disorder (OCD) and attention-deficit/hyperactivity disorder (ADHD) among individuals with Tourette syndrome (TS) are not well characterized. Here, we examine symptom patterns and the heritability of OCD and ADHD in TS families.
Method
OCD and ADHD symptom patterns were examined in TS patients and their family members (N = 3494) using exploratory factor analyses (EFA) for OCD and ADHD symptoms separately, followed by latent class analyses (LCA) of the resulting OCD and ADHD factor sum scores jointly; heritability and clinical relevance of the resulting factors and classes were assessed.
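The study used dedicated software for the factor and latent class analyses; as a loose illustration of the two-stage approach (factor analysis of symptom items, then classifying individuals on the resulting factor scores), here is a Python sketch in which a Gaussian mixture stands in for latent class analysis. The simulated items and component counts are purely illustrative.

```python
# Hedged sketch of the two-stage approach described above. A Gaussian mixture
# over factor scores is only a loose stand-in for LCA; the data are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n = 3494
ocd_items = rng.normal(size=(n, 40))    # placeholder for OCD symptom items
adhd_items = rng.normal(size=(n, 18))   # placeholder for ADHD symptom items

ocd_scores = FactorAnalysis(n_components=8, random_state=0).fit_transform(ocd_items)
adhd_scores = FactorAnalysis(n_components=2, random_state=0).fit_transform(adhd_items)

factor_scores = np.hstack([ocd_scores, adhd_scores])
classes = GaussianMixture(n_components=3, random_state=0).fit_predict(factor_scores)
print(np.bincount(classes))             # size of each latent class
```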
Results
EFA yielded a 2-factor model for ADHD and an 8-factor model for OCD. Both ADHD factors (inattentive and hyperactive/impulsive symptoms) were genetically related to TS, ADHD, and OCD. The doubts, contamination, need for sameness, and superstitions factors were genetically related to OCD, but not ADHD or TS; symmetry/exactness and fear-of-harm were associated with TS and OCD, while hoarding was associated with ADHD and OCD. In contrast, aggressive urges were genetically associated with TS, OCD, and ADHD. LCA revealed a three-class solution: few OCD/ADHD symptoms (LC1), OCD and ADHD symptoms (LC2), and symmetry/exactness, hoarding, and ADHD symptoms (LC3). LC2 had the highest psychiatric comorbidity rates (≥50% for all disorders).
Conclusions
Symmetry/exactness, aggressive urges, fear-of-harm, and hoarding show complex genetic relationships with TS, OCD, and ADHD, and, rather than being specific subtypes of OCD, transcend traditional diagnostic boundaries, perhaps representing an underlying vulnerability (e.g. failure of top-down cognitive control) common to all three disorders.
This study evaluated the psychometric properties of the Strengths and Difficulties Questionnaire Self-Report (SDQ-S) in South African adolescents, and compared findings with data from the UK, Australia and China.
Methods.
A sample of 3451 South African adolescents in grade 8, the first year of secondary school (mean age = 13.7 years), completed the SDQ-S in Afrikaans, English, or isiXhosa. Means, group differences, and internal consistency were analysed using SPSS V22, and confirmatory factor analyses were conducted using MPlus V7.
Results.
In the South African sample, significant gender differences were found for four of the five sub-scale means and for total difficulties, but gender differences in alpha scores were negligible. Internal consistency for the total difficulties, prosocial behaviour, and emotional symptoms sub-scales was fair. UK cut-off values for caseness (set to identify the top 10% of scores in a UK sample) led to a higher proportion of South African adolescents being classified in the ‘abnormal’ range on emotional and peer difficulties and a lower proportion classified in the ‘abnormal’ range for hyperactivity. South African cut-offs were therefore generated. The cross-country comparison with UK, Australian, and Chinese data showed that South African adolescent boys and girls had the highest mean scores on total difficulties as well as on the emotional symptoms and conduct problems subscales. In contrast, South African boys and girls had the lowest mean scores for hyperactivity/inattention. UK boys and girls had the highest mean scores for hyperactivity/inattention, while the Australian sample had the highest scores for prosocial behaviours. Chinese boys had the highest peer problem mean scores, and Chinese boys and girls had the lowest means on prosocial behaviours. Confirmatory factor analyses showed significant item loadings, with loadings higher than 0.40 for the emotional and prosocial behaviour sub-scales in the five-factor model, but not for all relevant items on the other three domains.
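Two quantities referenced above can be illustrated with a short sketch: Cronbach's alpha for a sub-scale, and a "caseness" cut-off set at the top 10% of scores (the basis of both the UK cut-offs and the in-country cut-offs generated here). The item data below are random placeholders, so the alpha value itself is meaningless; the study used SPSS and MPlus.

```python
# Hedged sketch: classical Cronbach's alpha and a top-10% cut-off, computed
# on random placeholder data (not the study's data; alpha will be near zero).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array (respondents x items). Classical alpha formula."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(0)
subscale_items = rng.integers(0, 3, size=(3451, 5))     # 5 items scored 0-2
print(round(cronbach_alpha(subscale_items), 2))

total_difficulties = rng.integers(0, 41, size=3451)     # toy 0-40 total scores
print(np.percentile(total_difficulties, 90))            # top-10% cut-off
```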
Conclusions.
Findings support the potential usefulness of the SDQ-S in a South African setting, but suggest that the SDQ-S should not be used with UK cut-off values, and indicate the need for further validation and standardisation work in South African adolescents. We recommend that in-country cut-offs for ‘caseness’ should be used for clinical purposes in South Africa, that cross-country comparisons should be made with caution, and that further examination of naturalistic clusters and factors of the SDQ should be performed in culturally and contextually diverse settings.
Genetic–epidemiological studies that estimate the contributions of genetic factors to variation in tic symptoms are scarce. We estimated the extent to which genetic and environmental influences contribute to tics, employing various phenotypic definitions ranging between mild and severe symptomatology, in a large population-based adult twin-family sample.
Method
In an extended twin-family design, we analysed lifetime tic data reported by adult mono- and dizygotic twins (n = 8323) and their family members (n = 7164; parents and siblings) from 7311 families in the Netherlands Twin Register. Tics were measured with the abbreviated version of the Schedule for Tourette and Other Behavioral Syndromes. Heritability was estimated by genetic structural equation modeling for four tic disorder definitions: three dichotomous phenotypes and one trichotomous phenotype, characterized by increasingly strictly defined criteria.
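The study estimated heritability with full genetic structural equation models fitted to extended twin-family data; as a rough, back-of-the-envelope analogue only, Falconer's formula approximates the variance components from monozygotic and dizygotic twin correlations. The correlations in the sketch below are hypothetical.

```python
# Hedged sketch: Falconer's approximation of twin-based variance components,
# a crude analogue of the full structural equation models used in the study.
# The correlations below are hypothetical.
def falconer(r_mz, r_dz):
    """Return (h2, c2, e2) from monozygotic and dizygotic twin correlations."""
    h2 = 2 * (r_mz - r_dz)          # additive genetic variance
    c2 = 2 * r_dz - r_mz            # shared environment
    e2 = 1 - r_mz                   # unique environment (+ measurement error)
    return h2, c2, e2

print(falconer(r_mz=0.32, r_dz=0.16))   # (0.32, 0.0, 0.68) -> h2 of about 0.32
```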
Results
Prevalence rates of the different tic disorders in our sample varied between 0.3% and 4.5%, depending on tic disorder definition. Tic frequencies decreased with increasing age. Heritability estimates varied between 0.25 and 0.37, depending on phenotypic definition. None of the phenotypes showed evidence of assortative mating, shared environmental effects, or non-additive genetic effects.
Conclusions
Heritabilities of mild and severe tic phenotypes were estimated to be moderate. Overlapping confidence intervals of the heritability estimates suggest overlapping genetic liabilities across the various tic phenotypes. The most lenient phenotype (defined only by tic characteristics, excluding criteria B, C, and D of DSM-IV) yielded sufficiently reliable heritability estimates. These findings have implications for phenotypic definitions in future genetic studies.
The aviation industry is dominated by the domain of heavier-than-air, fixed-wing, subsonic flight, and central to any design in this domain is the wing itself. One of the earliest debates in aviation still centres on the usefulness of the wing volume. On the one hand, it is held that the wing, as an inevitable necessity, should also provide the volume for the payload. On the other, it is argued that more efficient wings do not even have sufficient volume for the entire wing structure. This work proposes precise definitions of the Wing Density and the Inflation Factor, two parameters that can quantitatively reflect the economic and technological trends in aviation. The wing volume of a hypothetical Ideal Wing is derived from the Operational Parameters of any given Flight Objective and compared to the volume requirement of that flight objective. We conclude that the dominant aircraft configuration of the future is likely to remain within the same family as the current dominant configuration, in conflict with some older predictions.
OBJECTIVE
To determine the effect of variation in test methods on the performance of an ultraviolet-C (UV-C) room decontamination device.
DESIGN
Laboratory evaluation.
METHODS
We compared the efficacy of 2 UV-C room decontamination devices with low-pressure mercury gas bulbs. For 1 of the devices, we evaluated the effect of variation in spreading of the inoculum, carrier orientation relative to the device, type of organic load, type of carrier, height of carrier, and uninterrupted versus interrupted exposures on measured UV-C killing of methicillin-resistant Staphylococcus aureus and Clostridium difficile spores.
RESULTS
The 2 UV-C room decontamination devices achieved similar log10 colony-forming unit reductions in the pathogens with exposure times ranging from 5 to 40 minutes. On steel carriers, spreading of the inoculum over a larger surface area significantly enhanced killing of both pathogens, such that a 10-minute exposure on a 22-mm² disk resulted in a greater than 2-log reduction in C. difficile spores. Orientation of carriers parallel rather than perpendicular to the UV-C lamps significantly enhanced killing of both pathogens. Different types of organic load also significantly affected measured organism reductions, whereas type of carrier, variation in carrier height, and interrupted exposure cycles did not.
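For reference, the log10 colony-forming-unit (CFU) reduction reported above is simply the difference of log-transformed counts; the counts in this sketch are hypothetical.

```python
# Hedged sketch of the log10 CFU reduction metric; the counts are hypothetical.
import math

def log10_reduction(cfu_control, cfu_exposed):
    """Log10 reduction = log10(control CFU) - log10(exposed CFU)."""
    return math.log10(cfu_control) - math.log10(cfu_exposed)

# e.g. 5.0e6 CFU on untreated carriers vs 3.2e4 after a 10-minute exposure:
print(round(log10_reduction(5.0e6, 3.2e4), 2))   # ~2.19 -> a >2-log reduction
```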
CONCLUSIONS
Variation in test methods can significantly impact measured reductions in pathogens by UV-C devices during experimental testing. Our findings highlight the need for standardized laboratory methods for testing the efficacy of UV-C devices and for evaluations of the efficacy of short UV-C exposure times in real-world settings.
Attachment theory proposes that psychological functioning and affect regulation are influenced by the attachments we form with others. Early relationships with parents or caregivers lay the foundations for attachment styles. These styles are proposed to influence how we relate to others throughout our lives and can be modified by the relationships and events we experience across the lifespan. A secure attachment style is associated with a capacity to manage distress, comfort with autonomy, and the ability to form relationships with others, whereas insecure attachment can lead to dysfunctional relationships and emotional and behavioural avoidance. Attachment theory provides a useful framework to inform our understanding of relationship difficulties in people with psychosis. This paper aims to complement recent systematic reviews by providing an overview of attachment theory and its application to psychosis, including an understanding of measurement issues and the clinical implications offered.
Method.
A narrative review of measures of attachment and parental bonding in psychosis was completed, and its clinical implications are discussed. The paper also explores the link between insecure attachment styles and illness course, social functioning, and symptomatology. The following questions are addressed: What are the key attachment measures that have been used within the attachment and psychosis literature? What are the results of studies that have measured attachment or parental bonding in psychosis, and what clinical implications can we derive from them? What are some of the key questions these findings raise for future research on the onset of psychosis?
Results.
The most commonly used measures of attachment in psychosis research are reviewed. Self-report questionnaires and semi-structured interviews have mainly been used to examine attachment styles in adult samples and, in recent years, have included a measure specifically developed for a psychosis group. The review suggests that insecure attachment styles are common in psychosis samples. Key relationships were observed between insecure, avoidant, and anxious attachment styles and psychosis development, expression, and long-term outcome.
Conclusions.
Attachment theory can provide a useful framework to facilitate our understanding of interpersonal difficulties in psychosis that may predate its onset and contribute to the observed variability in outcomes, including treatment engagement. Greater attention should be given to the assessment of attachment needs and to the development of interventions that seek to compensate for these difficulties. However, further investigation is required to specify the exact mechanisms by which specific attachment styles affect the development and course of psychosis.
Cognitive reserve (CR) is a protective factor that supports cognition by increasing the resilience of an individual's cognitive function to the deleterious effects of cerebral lesions. A single environmental proxy indicator (e.g. education) is often used to estimate CR, possibly resulting in a loss of accuracy and predictive power. Furthermore, while estimates of an individual's prior CR can be made, no operational measure exists to estimate dynamic change in CR resulting from exposure to new life experiences.
Methods:
We aimed to develop two latent measures of CR, prior and current, through factor analysis in a sample of 467 healthy older adults.
Results:
The prior CR measure combined proxy measures traditionally associated with CR, while the current CR measure combined variables that had the potential to reflect dynamic change in CR due to new life experiences. Our main finding was that the analyses uncovered latent variables in hypothesized prior and current models of CR.
Conclusions:
The prior CR model supports multivariate estimation of pre-existing CR and may be applied to more accurately estimate CR in the absence of neuropathological data. The current CR model may be applied to evaluate and explore the potential benefits of CR-based interventions prior to dementia onset.
Cultural and linguistic minorities can be hard to survey either as the target population of interest or as a subpopulation of a general population survey. The challenges associated with studying these minorities are important to understand in order to assess and address the survey error that can be introduced when designing and implementing studies that include these groups. This chapter begins with a description of what constitutes cultural and linguistic minorities, based on a systematic review of the literature (see Chapter 5 in this volume, for a complete description of the process). We note that the literature in this area is largely limited to research among cultural and linguistic minorities in the context of Western and industrialized countries. Therefore, we supplement this literature by drawing upon our own experience and discussions with colleagues who conduct research among cultural and linguistic minorities in other parts of the world. This review is followed by a discussion of the potential challenges faced by researchers interested in surveying cultural and linguistic minorities and approaches taken to address these challenges in the areas of sampling, questionnaire development, adaptation and translation, pretesting, and data collection. We then discuss additional approaches to studying these hard-to-survey populations including qualitative, mixed-methods, and community-based research methods and how these can complement survey methods. The concluding section addresses needed improvements in the documentation and development of research methods to expand solutions and increase the quality of hard-to-survey cultural and linguistic minority research.
Defining cultural and linguistic minorities
This section sets out the key features of cultural and linguistic minorities. Three core concepts are defined and discussed. First, we define minority populations, followed by a discussion of linguistic and cultural minorities. The distinct concept of hard-to-survey populations is also relevant and is discussed in this context (also see Chapter 1 in this volume). In principle, it is a relatively straightforward task to define these concepts; however, as we discuss below, applying these definitions in a survey context is far more complicated. Formal definitions nevertheless serve as a good starting point for this discussion.
Substance use in young adults is a significant and growing problem. Emergency Medical Services (EMS) personnel often encounter this problem, yet the use of prehospital data to evaluate the prevalence and magnitude of substance abuse has been limited.
Hypothesis/Problem
This study evaluated drug and alcohol use through the use of prehospital and EMS data in one suburban county in Maryland (USA). The primary hypothesis was that the type of drug being abused is associated with age. The secondary hypothesis was that substance abuse incidence is associated with location. The tertiary hypothesis was that substance abuse is associated with a history of mental illness.
Methods
Deidentified patient care reports (PCRs) were obtained for a 24-month period, from October 2010 through September 2012, for patients 0 through 25 years of age. Inclusion criteria included a chief complaint of alcohol overdose or drug overdose, or the use of naloxone.
Results
The primary hypothesis was supported: age was associated with drug category (P < .001). Younger adolescents were more likely to use household items, prescription drugs, or over-the-counter drugs, whereas older adolescents were more likely to use illicit drugs. The secondary hypothesis was supported: both alcohol (P < .001) and drug (P < .001) calls were associated with location. Calls involving alcohol were more likely to be at a home or business, whereas calls involving drugs were more likely to be at a home or a public venue. The tertiary hypothesis was supported: both alcohol (P = .001) and drug use (P < .001) were associated with a history of mental illness. Older adolescents were more likely to report a history of mental illness. Chi-squared tests indicated significant associations between gender and drug category (P = .002) and between gender and current suicide attempt (P = .004). Females were more likely to use prescription drugs, whereas males were more likely to use illicit drugs. Calls involving adolescents under 18 were more likely to be at a school or mall, whereas calls involving older adolescents were more likely to be at a prison, public venue, or business.
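As an illustration of the chi-squared tests of association reported above (e.g. gender by drug category), the sketch below uses a hypothetical contingency table; the counts are not the study's data.

```python
# Hedged sketch of a chi-squared test of association on a hypothetical
# gender x drug-category contingency table (counts are illustrative only).
from scipy.stats import chi2_contingency

#                prescription  illicit  other
contingency = [[         40,      25,    15],   # female
               [         20,      55,    20]]   # male

chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```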
Conclusion
All three hypotheses were supported: the type of substance being abused was associated with both age and location, and substance abuse was associated with a history of mental illness. This research has important implications for understanding how EMS resources are utilized for substance use. This information is valuable not only for the education and training of prehospital care providers but also for targeting future public health interventions.
Seaman EL, Levy MJ, Jenkins JL, Godar CC, Seaman KG. Assessing Pediatric and Young Adult Substance Use Through Analysis of Prehospital Data. Prehosp Disaster Med. 2014;29(4):1-6.