In Canada, recreational use of cannabis was legalized in October 2018. This policy change, along with recent publications evaluating the efficacy of cannabis for the medical treatment of epilepsy and media coverage of its use, has increased public interest in this agent. The Canadian League Against Epilepsy Medical Therapeutics Committee, along with a multidisciplinary group of experts and representatives of the Canadian Epilepsy Alliance, has developed a position statement on the use of medical cannabis for epilepsy. This article addresses the current Canadian legal framework, recent publications on its efficacy and safety profile, and our understanding of the clinical issues that should be considered when contemplating cannabis use for medical purposes.
Objectives: To describe multivariate base rates (MBRs) of low scores and reliable change (decline) scores on Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) in college athletes at baseline, as well as to assess MBR differences among demographic and medical history subpopulations. Methods: Data were reported on 15,909 participants (46.5% female) from the NCAA/DoD CARE Consortium. MBRs of ImPACT composite scores were derived using published CARE normative data and reliability metrics. MBRs of sex-corrected low scores were reported at <25th percentile (Low Average), <10th percentile (Borderline), and ≤2nd percentile (Impaired). MBRs of reliable decline scores were reported at the 75%, 90%, 95%, and 99% confidence intervals. We analyzed subgroups by sex, race, attention-deficit/hyperactivity disorder and/or learning disability (ADHD/LD), anxiety/depression, and concussion history using chi-square analyses. Results: Base rates of low scores and reliable decline scores on individual composites approximated the normative distribution. Athletes obtained ≥1 low score with frequencies of 63.4% (Low Average), 32.0% (Borderline), and 9.1% (Impaired). Athletes obtained ≥1 reliable decline score with frequencies of 66.8%, 32.2%, 18.0%, and 3.8%, respectively. Comparatively few athletes had low scores or reliable decline on ≥2 composite scores. Black/African American athletes and athletes with ADHD/LD had higher rates of low scores, while greater concussion history was associated with lower MBRs (p < .01). MBRs of reliable decline were not associated with demographic or medical factors. Conclusions: Clinical interpretation of low scores and reliable decline on ImPACT depends on the strictness of the low score cutoff, the reliable change criterion, and the number of scores exceeding these cutoffs. Race and ADHD influence the frequency of low scores at all cutoffs cross-sectionally.
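As a concrete illustration of the multivariate base rate idea described above (not the CARE normative procedure itself), the sketch below counts the share of athletes with at least one composite score below a low-score cutoff. All composite names, scores, and cutoffs are invented for illustration.

```python
# Toy multivariate base rate (MBR): the fraction of athletes who fall
# below a normative cutoff on at least one of several composite scores.
# Composite names, scores, and cutoffs below are hypothetical.

def mbr_at_least_one_low(athletes, cutoffs):
    """Fraction of athletes with >=1 composite score below its cutoff."""
    n_low = sum(
        any(scores[c] < cutoffs[c] for c in cutoffs)
        for scores in athletes
    )
    return n_low / len(athletes)

# Three composites per athlete (lower = worse, hypothetical values).
athletes = [
    {"verbal": 85, "visual": 78, "speed": 40.1},
    {"verbal": 70, "visual": 80, "speed": 35.0},
    {"verbal": 95, "visual": 90, "speed": 45.0},
]
# Hypothetical <10th percentile ("Borderline") cutoffs per composite.
cutoffs = {"verbal": 72, "visual": 75, "speed": 33.0}
rate = mbr_at_least_one_low(athletes, cutoffs)
```

Because each added composite gives another chance to fall below the cutoff, the MBR of "at least one low score" is always at least as large as any single composite's base rate, which is why isolated low scores are common in healthy athletes.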
Immune system markers may predict affective disorder treatment response, but whether an overall immune system marker predicts bipolar disorder treatment effect is unclear.
Bipolar CHOICE (N = 482) and LiTMUS (N = 283) were similar comparative effectiveness trials treating patients with bipolar disorder for 24 weeks with four different treatment arms (standard-dose lithium, quetiapine, moderate-dose lithium plus optimised personalised treatment (OPT) and OPT without lithium). We performed secondary mixed effects linear regression analyses adjusted for age, gender, smoking and body mass index to investigate relationships between pre-treatment white blood cell (WBC) levels and clinical global impression scale (CGI) response.
Compared to participants with WBC counts of 4.5–10 × 10⁹/l, participants with WBC < 4.5 × 10⁹/l or WBC ≥ 10 × 10⁹/l showed similar improvement within each specific treatment arm and in gender-stratified analyses.
An overall immune system marker did not predict differential treatment response to four different treatment approaches for bipolar disorder all lasting 24 weeks.
Opioid overdose deaths in the United States are increasing. Time to restoration of ventilation is critical. Rapid bystander administration of opioid antidote (naloxone) is an effective interim response but is historically constrained by legal restrictions.
To review and contextualize development of legislation facilitating layperson administration of naloxone across the United States.
Publicly accessible databases (1,2) were searched for legislation relevant to naloxone administration between January 2001 and July 2017.
All 51 jurisdictions implemented naloxone access laws between 2001 and 2017; 45 of these did so between 2012 and 2017. Nationwide mortality from opioid overdose increased from 3.3 per 100,000 population in 2001 to 13.3 in 2016. Forty-two and 35 jurisdictions enacted laws giving prescribers immunity from criminal prosecution, civil liability, and professional sanctions, respectively. Thirty-six, 41, and 35 jurisdictions implemented laws allowing dispensers immunity in the same domains. Thirty-eight and 46 jurisdictions gave laypeople administering naloxone immunity from criminal and civil liability, respectively. Forty-seven jurisdictions implemented laws allowing prescription of naloxone to third parties. All jurisdictions except Nebraska allowed pharmacists to dispense naloxone without a patient-specific prescription. Fifteen jurisdictions removed criminal liability for possession of non-prescribed naloxone. The 10 states with the highest average rates of opioid overdose-related mortality had not legislated in more domains than the 10 states with the lowest rates or than the average of all jurisdictions (3.4 vs 2.9 vs 2.7, respectively).
Effective involvement of bystanders in early recognition and reversal of opioid overdose requires removal of legal deterrents to prescription, dispensing, distribution, and administration of naloxone. Jurisdictions have varied in degree and speed of creating this legal environment. Understanding the integration of legislation into epidemic response may inform the response to this and future public health crises.
Human Stampedes (HS) occur at religious mass gatherings. Religious events have a higher rate of morbidity and mortality than other events that experience HS. This study is a subset analysis of religious event HS data regarding the physics principles involved in HS, and the associated event morbidity and mortality.
To analyze reports of religious HS to determine the initiating physics principles and associated morbidity and mortality.
Thirty-four reports of religious HS were analyzed to identify shared variables. Thirty-three (97.1%) were written media reports with photographic, drawn, or video documentation; 29 (85.3%) cited footage or photographs, and one (2.9%) was not associated with visual evidence. Descriptive phrases associated with physics principles contributing to the onset of HS, along with morbidity data, were extracted and analyzed to evaluate their frequency before, during, and after events.
Thirty-four (39.1%) of the HS reports found in the literature review involved religious HS. Of these, 83% took place in an open space, and 82.3% were associated with population density changes. 82.3% of events were associated with architectural nozzles (small streets, alleys, etc.). All (100%) showed loss of XY-axis motion, and 89% reached an average velocity of zero. All had loss of proxemics, and 91% had associated Z-axis displacement (falls). The minimum reported attendance for a religious HS was 3,000. All religious HS had reported mortality at the event, and 56% had further associated morbidity.
HS are deadly events at religious mass gatherings. Religious events are often recurring, planned gatherings in specific geographic locations. They are frequently associated with an increase in population density and loss of proxemics and velocity, followed by Z-axis displacements, leading to injury and death. Architectural nozzles are a frequent cause, and organizers of religious mass gatherings can identify them in advance to mitigate future events.
Major depressive disorder (MDD) is a leading cause of disease burden worldwide, with lifetime prevalence in the United States of 17%. Here we present the results of the first prospective, large-scale, patient- and rater-blind, randomized controlled trial evaluating the clinical importance of achieving congruence between combinatorial pharmacogenomic (PGx) testing and medication selection for MDD.
In total, 1,167 outpatients diagnosed with MDD and an inadequate response to ≥1 psychotropic medication were enrolled and randomized 1:1 to a Treatment as Usual (TAU) arm or a PGx-guided care arm. Combinatorial PGx testing categorized medications into three groups based on the level of gene-drug interactions: use as directed, use with caution, or use with increased caution and more frequent monitoring. Patient assessments were performed at weeks 0 (baseline), 4, 8, 12 and 24. Patients, site raters, and central raters were blinded in both arms until after week 8. In the guided-care arm, physicians had access to the combinatorial PGx test result to guide medication selection. Primary outcomes utilized the Hamilton Depression Rating Scale (HAM-D17) and included symptom improvement (percent change in HAM-D17 from baseline), response (50% decrease in HAM-D17 from baseline), and remission (HAM-D17 < 7) at the fully blinded week 8 time point. The durability of patient outcomes was assessed at week 24. Medications were considered congruent with PGx test results if they were in the ‘use as directed’ or ‘use with caution’ report categories, while medications in the ‘use with increased caution and more frequent monitoring’ category were considered incongruent. Patients who started on incongruent medications were analyzed separately according to whether they changed to congruent medications by week 8.
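The three HAM-D17 outcome definitions described above translate directly into arithmetic on two scores. The minimal sketch below uses invented scores, not trial data:

```python
# Sketch of the trial's three HAM-D17 outcomes: symptom improvement
# (percent change from baseline), response (50% decrease from baseline),
# and remission (HAM-D17 < 7). Scores below are hypothetical.

def hamd_outcomes(baseline, week8):
    pct_improvement = 100.0 * (baseline - week8) / baseline
    response = week8 <= 0.5 * baseline   # at least a 50% decrease
    remission = week8 < 7                # HAM-D17 below 7
    return pct_improvement, response, remission

# Example: a patient dropping from 22 at baseline to 6 at week 8.
pct, resp, remit = hamd_outcomes(baseline=22, week8=6)
```

Note that remission is defined on the absolute week-8 score while response is relative to baseline, so a patient can respond without remitting (e.g. 30 → 12) or, from a low baseline, remit without a 50% decrease.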
At week 8, symptom improvement for individuals in the guided-care arm was not significantly different from TAU (27.2% versus 24.4%, p=0.11). However, individuals in the guided-care arm were more likely than those in TAU to achieve remission (15% versus 10%; p<0.01) and response (26% versus 20%; p=0.01). Remission rates, response rates, and symptom reductions continued to improve in the guided-care arm until the 24-week time point. Congruent prescribing increased to 91% in the guided-care arm by week 8. Among patients who were taking one or more incongruent medications at baseline, those who changed to congruent medications by week 8 demonstrated significantly greater symptom improvement (p<0.01), response (p=0.04), and remission rates (p<0.01) compared to those who persisted on incongruent medications.
Combinatorial PGx testing improves short- and long-term response and remission rates for MDD compared to standard of care. In addition, prescribing congruency with PGx-guided medication recommendations is important for achieving symptom improvement, response, and remission for MDD patients.
Funding Acknowledgements: This study was supported by Assurex Health, Inc.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated whether medical devices used for real-time diagnostic decisions could identify patients for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph (the e-ACI-TIPI), using the same data sets used for the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance to cohorts identified from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent (0.89 for ACI-TIPI and 0.84 for e-ACI-TIPI), as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than EHRs did.
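The ROC curve areas reported above have a simple probabilistic reading: the AUC equals the probability that a randomly chosen positive case receives a higher predicted probability than a randomly chosen negative case (ties counted as half). A stdlib-only sketch of that computation, on invented labels and scores rather than the study's data:

```python
# ROC AUC via its rank interpretation: P(score_positive > score_negative),
# with ties contributing 0.5. Labels and scores below are invented.

def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos
        for n in neg
    )
    return wins / (len(pos) * len(neg))

# Hypothetical ACS labels (1 = ACS) and model probabilities.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
auc = roc_auc(labels, scores)
```

This pairwise form is O(n²) and meant only to make the metric concrete; production code would use a sorted-rank implementation or a statistics library.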
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
Chronic kidney disease (CKD) is described as a progressive alteration of kidney function, resulting from multiple factors, including behaviours. We investigated the association of the Dietary Inflammatory Index (DII®) with prevalent CKD in adult Americans. National Health and Nutrition Examination Survey participants with measured data on kidney function markers from 2005 to 2012 were included in this study. Prevalent CKD was based on an estimated glomerular filtration rate (eGFR) <60 ml/min per 1·73 m² or a urinary albumin:creatinine ratio ≥30 mg/g. Energy-adjusted DII (E-DII™) scores were calculated from 24-h dietary recalls. Statistical analyses accounted for the survey design and sample weights. We included 21 649 participants, with 1634 (6·8 %) having prevalent CKD. Participants with high E-DII scores had greater BMI, fasting blood glucose and systolic blood pressure, and were more likely to be diabetic or hypertensive (all P<0·001), compared with those with lower E-DII scores. In regression models adjusted for age, sex, race, fasting blood glucose, blood pressure, BMI, hypertension and diabetes status, mean eGFR decreased significantly across increasing quartiles of E-DII, whereas serum uric acid level and log urinary albumin:creatinine ratio increased significantly (all P<0·001). Prevalent CKD increased from 5·3 % in the lowest to 9·3 % in the highest E-DII quartile (P=0·02). In multivariable-adjusted logistic regression models, the odds of prevalent CKD were 29 % higher in the highest compared with the lowest E-DII quartile. A pro-inflammatory diet is associated with declining kidney function and a high prevalence of CKD. Dietary changes that reduce inflammation have the potential to prevent CKD.
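The highest-versus-lowest quartile comparison above rests on an odds ratio from a 2×2 table. The toy sketch below shows the crude (unadjusted) version of that computation; the counts are invented and do not reproduce the survey-weighted, covariate-adjusted 29% figure reported in the study.

```python
# Crude odds ratio of prevalent CKD for the highest (Q4) versus lowest
# (Q1) E-DII quartile, from a 2x2 table of invented counts.

def odds_ratio(cases_hi, noncases_hi, cases_lo, noncases_lo):
    """Crude OR from a 2x2 table: (a/b) / (c/d)."""
    return (cases_hi / noncases_hi) / (cases_lo / noncases_lo)

# Hypothetical counts: CKD cases / non-cases in Q4 and Q1.
or_q4_vs_q1 = odds_ratio(cases_hi=93, noncases_hi=907,
                         cases_lo=53, noncases_lo=947)
```

Adjusted odds ratios like the study's come from multivariable logistic regression rather than a raw 2×2 table, but the crude OR is the quantity the model generalizes.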
OBJECTIVES/SPECIFIC AIMS: The purpose of the present secondary data analysis was to examine the effect of moderate-severe disturbed sleep before the start of radiation therapy (RT) on subsequent RT-induced pain. METHODS/STUDY POPULATION: Analyses were performed on 676 RT-naïve breast cancer patients (mean age 58, 100% female) scheduled to receive RT from a previously completed nationwide, multicenter, phase II randomized controlled trial examining the efficacy of oral curcumin on radiation dermatitis severity. The trial was conducted at 21 community oncology practices throughout the US affiliated with the University of Rochester Cancer Center NCI Community Oncology Research Program (URCC NCORP) Research Base. Sleep disturbance was assessed using a single-item question from the modified MD Anderson Symptom Inventory (SI) on a 0–10 scale, with higher scores indicating greater sleep disturbance. Total subjective pain, as well as the subdomains of pain (sensory, affective, and perceived), was assessed by the short-form McGill Pain Questionnaire. Pain at the treatment site (pain-Tx) was also assessed using a single-item question from the SI. These assessments were administered pre-RT (baseline) and post-RT. For the present analyses, patients were dichotomized into 2 groups: those who had moderate-severe disturbed sleep at baseline (score ≥4 on the SI; n=101) versus those who had mild or no disturbed sleep (control group; score 0–3 on the SI; n=575). RESULTS/ANTICIPATED RESULTS: Prior to the start of RT, breast cancer patients with moderate-severe disturbed sleep at baseline were younger; less likely to have had lumpectomy or partial mastectomy and more likely to have had total mastectomy and chemotherapy; more likely to be on sleep, anti-anxiety/depression, and prescription pain medications; and more likely to suffer from depression or anxiety disorder than the control group (all p’s ≤0.02).
Spearman rank correlations showed that changes in sleep disturbance from baseline to post-RT were significantly correlated with concurrent changes in total pain (r=0.38; p<0.001), sensory pain (r=0.35; p<0.001), affective pain (r=0.21; p<0.001), perceived pain intensity (r=0.37; p<0.001), and pain-Tx (r=0.35; p<0.001). In total, 92% of patients with moderate-severe disturbed sleep at baseline reported post-RT total pain compared with 79% of patients in the control group (p=0.006). Generalized linear estimating equations, after controlling for baseline pain and other covariates (baseline fatigue and distress, age, sleep medications, anti-anxiety/depression medications, prescription pain medications, and depression or anxiety disorder), showed that patients with moderate-severe disturbed sleep at baseline had significantly higher mean values of post-RT total pain (by 39%; p=0.033), post-RT sensory pain (by 41%; p=0.046), and post-RT affective pain (by 55%; p=0.035) than the control group. Perceived pain intensity (p=0.066) and pain-Tx (p=0.086) at post-RT were not significantly different between the 2 groups. DISCUSSION/SIGNIFICANCE OF IMPACT: These findings suggest that moderate-severe disturbed sleep prior to RT is an important predictor of worse post-RT pain in breast cancer patients. There are several plausible explanations. Sleep disturbance, such as sleep loss and disrupted sleep continuity, could impair sleep-related recovery and repair of the tissue damage associated with cancer and its treatment, thereby amplifying pain. Sleep disturbance may also reduce the pain tolerance threshold through increased sensitization of the central nervous system. In addition, pain and sleep disturbance may share common neuroimmunological pathways: sleep disturbance may modulate inflammation, which in turn may contribute to increased pain.
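The Spearman rank correlation reported above is simply the Pearson correlation computed on ranks (with tied values given their average rank). A stdlib-only sketch on invented data, not the trial's:

```python
# Spearman rank correlation: rank both variables (averaging tied ranks),
# then compute the Pearson correlation of the ranks. Data are invented.

def _avg_ranks(values):
    """1-based ranks, with ties assigned the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = _avg_ranks(x), _avg_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical change scores for sleep disturbance and total pain.
rho = spearman([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])
```

Because it operates on ranks, Spearman's rho captures any monotone association, which suits ordinal 0–10 symptom scales better than Pearson's r on raw scores.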
Further research is needed to confirm these findings and to determine whether early interventions targeting sleep disturbance could be an alternative approach to reducing pain after RT.
Early life exposures affect health and disease across the life course and potentially across multiple generations. The Clinical and Translational Research Institutes (CTSIs) offer an opportunity to utilize and link existing databases to conduct lifespan research.
A survey with Lifespan Domain Taskforce expert input was created and distributed to lead lifespan researchers at each of the 64 CTSIs. The survey requested information regarding institutional databases related to early life exposure, child-maternal health, or lifespan research.
Of the 64 CTSIs, 88% provided information on a total of 130 databases. Approximately 59% (n=76/130) had an associated biorepository. Longitudinal data were available for 72% (n=93/130) of reported databases. Many of the biorepositories (n=44/76; 68%) have standard operating procedures that can be shared with other researchers.
The majority of CTSI databases and biorepositories focusing on child-maternal health and lifespan research could be leveraged to support lifespan research, increase generalizability, and enhance multi-institutional research in the United States.
We surveyed resident physicians at 2 academic medical centers regarding urinary testing and treatment as they progressed through training. Demographics and self-reported confidence were compared to overall knowledge using clinical vignette-based questions. Overall knowledge was 40% in 2011 and increased to 48%, 55%, and 63% in subsequent years (P<.001).
The objective of this study was to examine the association between dietary inflammatory potential and memory and cognitive functioning among a representative sample of the US older adult population. Cross-sectional data from the 2011–2012 and 2013–2014 National Health and Nutrition Examination Survey were utilised to identify an aggregate sample of adults 60–85 years of age (n 1723). Dietary inflammatory index (DII®) scores were calculated using 24-h dietary recall interviews. Three memory-related assessments were employed: the Consortium to Establish a Registry for Alzheimer’s disease (CERAD) Word Learning subtest, the Animal Fluency test and the Digit Symbol Substitution Test (DSST). Inverse associations were observed between DII scores and the different memory parameters. Episodic memory (CERAD) (b_adjusted = −0·39; 95 % CI −0·79, 0·00), semantic-based memory (Animal Fluency Test) (b_adjusted = −1·18; 95 % CI −2·17, −0·20) and executive function and working-memory (DSST) (b_adjusted = −2·80; 95 % CI −5·58, −0·02) performances were lowest among those with the highest mean DII score. Though inverse relationships were observed between DII scores and memory and cognitive functioning, future work is needed to further explore the neurobiological mechanisms underlying the complex relationship between inflammation-related dietary behaviour and memory and cognition.
22q11.2 deletion syndrome (22q11.2DS) is associated with a high risk of childhood as well as adult psychiatric disorders, in particular schizophrenia. Childhood cognitive deterioration in 22q11.2DS has previously been reported, but only in studies lacking a control sample.
To compare cognitive trajectories in children with 22q11.2DS and unaffected control siblings.
A longitudinal study of neurocognitive functioning (IQ, executive function, processing speed and attention) was conducted in children with 22q11.2DS (n = 75, mean age at time 1 (T1) 9.9 years, time 2 (T2) 12.5 years) and control siblings (n = 33, mean age T1 10.6 years, T2 13.4 years).
Children with 22q11.2DS exhibited deficits in all cognitive domains. However, mean scores did not indicate deterioration. When individual trajectories were examined, some participants showed significant decline over time, but the prevalence was similar for 22q11.2DS and control siblings. Findings are more likely to reflect normal developmental fluctuation than a 22q11.2DS-specific abnormality.
Childhood cognitive deterioration is not associated with 22q11.2DS. Contrary to previous suggestions, we believe it is premature to recommend repeated monitoring of cognitive function to identify individual children with 22q11.2DS at high risk of developing schizophrenia.
The Dietary Inflammatory Index (DII™), which was developed to characterize the inflammatory potential of a person’s diet, has been shown to be associated with inflammatory conditions such as cancer. The present study aimed to investigate the association between DII scores and colorectal adenoma (CRA), a pre-cancerous condition.
Responses to baseline dietary questionnaires were used to calculate DII scores. In a cross-sectional study design, the association between DII scores and CRA prevalence was determined in men and women separately using logistic regression models.
Ten cancer screening centres across the USA.
Participants were those included in the screening arm of the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial.
Among the 44 278 individuals included in these analyses, men with diets in the most inflammatory quartile of DII scores had higher odds of all types of CRA (advanced, non-advanced and multiple (>1)) compared with those with diets in the least inflammatory quartile of DII scores. In fully adjusted models, compared with those with DII scores in quartile 1 (least inflammatory), males with DII scores in quartile 3 (adjusted odds ratio (aOR)=1·28; 95 % CI 1·12, 1·47) and quartile 4 (aOR=1·41; 95 % CI 1·23, 1·62) were more likely to have prevalent distal CRA. Higher DII scores, representing a more inflammatory diet, also were weakly associated with a higher prevalence of CRA in women.
Implementing an anti-inflammatory diet may be an effective means of primary prevention of CRA, especially in men.
Chronic low-grade inflammation has been recognised as a key underlying mechanism for several chronic diseases, including cancer and CVD. Nutrition represents a host of key modifiable factors that influence chronic inflammation. Dietary inflammatory scores were developed to assess the inflammatory potential of the diet and have been associated with inflammatory biomarkers in cross-sectional and short-term longitudinal studies. The objective of this study was to investigate the relationship between the dietary inflammatory index (DII), the alternate dietary inflammatory index (ADII) and long-term C-reactive protein (CRP). We also tested age as an effect modifier of this relationship. Participants were selected from the Supplémentation en Vitamines et Minéraux Antioxydants study, which in 1994 enrolled men aged 45–60 years and women aged 35–60 years. Participants with ≥3 24-h dietary records at baseline and a CRP measurement at the 12-year follow-up evaluation were included in the present study (n 1980). The relationships between the DII and ADII and elevated CRP (>3 mg/l) were investigated using multivariable logistic regression. All analyses were stratified by age (cut-off at median age=50 years old). The overall associations between the DII and ADII and long-term CRP were not statistically significant (Ptrend across tertiles=0·16 for DII and 0·10 for ADII). A quantitative interaction was found between score and age (P=0·16 for ADII, 0·36 for DII). In stratified analyses, the ADII was significantly prospectively associated with CRP only in younger participants: OR tertile 3 v. tertile 1: 1·79 (95 % CI 1·04, 3·07). Pro-inflammatory diets may have a long-term effect on CRP only in younger subjects.
Transactional cascades among child internalizing and externalizing symptoms, and fathers’ and mothers’ posttraumatic stress disorder (PTSD) symptoms, were examined in a sample of families with a male parent who had been deployed to recent military conflicts in the Middle East. The roles of parents’ positive engagement and coercive interaction with their child, and of family members’ emotion regulation, were tested as processes linking cascades of parent and child symptoms. A subsample of 183 families with deployed fathers and nondeployed mothers and their 4- to 13-year-old children who participated in a randomized controlled trial intervention (After Deployment: Adaptive Parenting Tools) were assessed at baseline prior to intervention, and at 12 and 24 months after baseline, using parent reports of their own and their child's symptoms. Parents’ observed behavior during interaction with their children was coded using a multimethod approach at each assessment point. Reciprocal cascades among fathers’ and mothers’ PTSD symptoms, and child internalizing and externalizing symptoms, were observed. Fathers’ and mothers’ positive engagement during parent–child interaction linked their PTSD symptoms and their child's internalizing symptoms. Fathers’ and mothers’ coercive behavior toward their child linked their PTSD symptoms and their child's externalizing symptoms. Each family member's capacity for emotion regulation was associated with his or her adjustment problems at baseline. Implications for intervention, and for research using longitudinal models and a family-systems perspective of co-occurrence and cascades of symptoms across family members, are described.
Although there is significant willingness to respond to disasters, a review of post-event reports shows troubling, repeated patterns: poorly integrated response activities and response managers inadequately trained for the requirements of disasters. This calls for a new overall approach to disaster management.
An in-depth review of the education and training opportunities available to responders and disaster managers has been undertaken, as well as an extensive review of the educational competencies and their parent domains identified by subject matter experts as necessary for competent performance.
Seven competency domains, and the competencies within them, that should be mastered by disaster managers were identified. This set of domains and individual competencies was used to define a new and evolving curriculum. To allow mastery of each competency to be evaluated and assessed, objectives were further defined as activities under specific topics, the measurable elements of the curriculum, for each managerial level.
This program enables interagency cooperation and collaboration. It could be used to increase and improve decision-makers’ understanding of disaster managers’ capabilities; at the strategic/tactical level, to promote the knowledge and capability of the disaster managers themselves; and, at the operational level, as continuing education or further career development for disaster managers. (Disaster Med Public Health Preparedness. 2016;10:854–873)
This report uses 6-year outcomes of the Oregon Divorce Study to examine the processes by which parenting practices affect deviant peer association during two developmental stages: early to middle childhood and late childhood to early adolescence. The participants were 238 newly divorced mothers and their 5- to 8-year-old sons who were randomly assigned to Parent Management Training—Oregon Model (PMTO®) or to a no-treatment control group. Parenting practices, child delinquent behavior, and deviant peer association were repeatedly assessed from baseline to 6 years after baseline using multiple methods and informants. PMTO had a beneficial effect on parenting practices relative to the control group. Two stage models linking changes in parenting generated by PMTO to children's growth in deviant peer association were supported. During the early to middle childhood stage, the relationship of improved parenting practices on deviant peer association was moderated by family socioeconomic status (SES); effective parenting was particularly important in mitigating deviant peer association for lower SES families whose children experience higher densities of deviant peers in schools and neighborhoods. During late childhood and early adolescence, the relationship of improved parenting to youths' growth in deviant peer association was mediated by reductions in the growth of delinquency during childhood; higher levels of early delinquency are likely to promote deviant peer association through processes of selective affiliation and reciprocal deviancy training. The results are discussed in terms of multilevel developmental progressions of diminished parenting, child involvement in deviancy producing processes in peer groups, and increased variety and severity of antisocial behavior, all exacerbated by ecological risks associated with low family SES.