We use three-dimensional (3-D) fully kinetic particle-in-cell simulations to study the occurrence of magnetic reconnection in decaying turbulence created by anisotropic counter-propagating low-frequency Alfvén waves consistent with critical-balance theory. We observe the formation of small-scale current-density structures such as current filaments and current sheets as well as the formation of magnetic flux ropes as part of the turbulent cascade. The large magnetic structures present in the simulation domain retain the initial anisotropy while the small-scale structures produced by the turbulent cascade are less anisotropic. To quantify the occurrence of reconnection in our simulation domain, we develop a new set of indicators based on intensity thresholds to identify reconnection events in which both ions and electrons are heated and accelerated in 3-D particle-in-cell simulations. According to these indicators, we identify reconnection events in the simulation domain and analyse one of these events in detail. The event is related to the reconnection of two flux ropes, and the associated ion and electron exhausts exhibit a complex 3-D structure. We study the profiles of plasma and magnetic-field fluctuations recorded along artificial-spacecraft trajectories passing near and through the reconnection region. Our results suggest the presence of particle heating and acceleration related to small-scale reconnection events within magnetic flux ropes produced by the anisotropic Alfvénic turbulent cascade in the solar wind. These events are related to current structures of the order of a few ion inertial lengths in size.
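As a rough illustration of a threshold-based indicator of this kind (not the authors' actual criteria), the sketch below flags grid cells where the current-density magnitude, computed from the curl of the magnetic field on a uniform grid, exceeds a multiple of its RMS value; the field data, grid spacing, and threshold factor are placeholders.

```python
# Minimal sketch: flag intense current-density structures in a 3-D box.
# Shapes, spacing, and the threshold factor are illustrative only.
import numpy as np

def current_density(bx, by, bz, dx=1.0):
    """J = curl(B) in normalized units (mu_0 = 1) on a uniform grid (x, y, z)."""
    jx = np.gradient(bz, dx, axis=1) - np.gradient(by, dx, axis=2)
    jy = np.gradient(bx, dx, axis=2) - np.gradient(bz, dx, axis=0)
    jz = np.gradient(by, dx, axis=0) - np.gradient(bx, dx, axis=1)
    return jx, jy, jz

def reconnection_candidates(bx, by, bz, factor=4.0):
    """Mark cells where |J| exceeds `factor` times its RMS value."""
    jx, jy, jz = current_density(bx, by, bz)
    jmag = np.sqrt(jx**2 + jy**2 + jz**2)
    return jmag > factor * np.sqrt(np.mean(jmag**2))

# Random fields as stand-ins for simulation output:
rng = np.random.default_rng(0)
bx, by, bz = (rng.standard_normal((64, 64, 64)) for _ in range(3))
mask = reconnection_candidates(bx, by, bz)
print(mask.sum(), "candidate cells")
```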
(i) To examine demographic and health characteristics of women of reproductive age on a vegan diet in Australia and compare these to the general population; (ii) to identify sources and intake of vitamin B12 and compare intake to current recommendations; and (iii) to examine associations between participant characteristics and adequacy of vitamin B12 intake.
In this cross-sectional study, data were collected via an online survey. Demographic and health characteristics of women on a vegan diet were compared to those of women in the general population (using Australian Bureau of Statistics data). Intake of vitamin B12 was estimated using a food frequency questionnaire and an estimation of supplemental intake.
Participants (n = 1530) were women aged 18–44 years who had been on a vegan diet for at least six months.
While body mass index, smoking habits, and intakes of fruit and vegetables compared favourably with those of the general population, 26% of respondents had estimated intakes of vitamin B12 below recommendations. Analyses of relationships between vitamin B12 intake and participant characteristics revealed that the strongest predictor of intake was supplementation (p < 0.001); however, 25% had not supplemented with vitamin B12 in the past three months.
The vitamin B12 intakes of a substantial proportion of Australian women of reproductive age consuming a vegan diet do not meet the recommended intake, which could adversely affect their health, and, if they are pregnant or lactating, that of their infants too. There is a need for further research in this area to identify effective strategies to address this situation.
To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption.
A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting.
A 399-bed, tertiary-care, Veterans’ Affairs medical center.
Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP).
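For reference, the AU metric quoted here is a simple rate; the sketch below shows the arithmetic with illustrative numbers.

```python
# NHSN AU metric: antimicrobial days of therapy (DOT) per 1,000 days present (DP).
def dot_per_1000_dp(days_of_therapy, days_present):
    return 1000.0 * days_of_therapy / days_present

# e.g. 85 meropenem DOT in a month with 4,200 days present (illustrative values):
print(round(dot_per_1000_dp(85, 4200), 1))  # ~20.2 DOT per 1,000 DP
```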
Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018.
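The reporting rule itself can be illustrated with a toy sketch; the drug groupings and names below are illustrative placeholders, not the site's actual algorithm.

```python
# Toy cascade-reporting rule: secondary (broader-spectrum) agents are released
# on the report only when the isolate tests resistant to the primary agents.
PRIMARY = ["cefazolin", "ceftriaxone"]            # illustrative narrow-spectrum agents
SECONDARY = ["cefepime", "piperacillin/tazobactam", "meropenem", "ciprofloxacin"]

def reported_agents(susceptibilities):
    """susceptibilities: dict mapping drug -> 'S', 'I', or 'R'."""
    report = {d: susceptibilities[d] for d in PRIMARY if d in susceptibilities}
    if any(susceptibilities.get(d) == "R" for d in PRIMARY):
        report.update({d: susceptibilities[d] for d in SECONDARY if d in susceptibilities})
    return report

print(reported_agents({"cefazolin": "S", "ceftriaxone": "S", "meropenem": "S"}))
print(reported_agents({"cefazolin": "R", "ceftriaxone": "R", "meropenem": "S"}))
```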
Following the implementation of cascade reporting, mean monthly meropenem (P = .005) and piperacillin/tazobactam (P = .002) consumption decreased and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not significantly change.
Ciprofloxacin consumption significantly decreased after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.
Serpentinization of ultramafic rocks in the sea and on land leads to the generation of alkaline fluids rich in molecular hydrogen (H2) and methane (CH4) that favour the formation of carbonate mineralization, such as veins in the sub-seafloor, seafloor carbonate chimneys and terrestrial hyperalkaline spring deposits. Examples of this type of seawater–rock interaction and the formation of serpentinization-derived carbonates in a shallow-marine environment are scarce, and almost entirely lacking in the geological record. Here we present evidence for serpentinization-induced fluid seepage in shallow-marine sedimentary rocks from the Upper Cretaceous (upper Campanian to lower Maastrichtian) Qahlah Formation at Jebel Huwayyah, United Arab Emirates. The focus of this study is a metre-scale structure (the Jebel Huwayyah Mound), composed of calcite-cemented sand grains, that formed a positive seafloor feature. The Jebel Huwayyah Mound contains numerous vertically orientated fluid conduits containing two main phases of calcite cement. We use C and O stable isotopes and elemental composition to reconstruct the fluids from which these cements precipitated and infer that the fluids consisted of variable mixtures of seawater and fluids derived from serpentinization of the underlying Semail Ophiolite. Based on their negative δ13C values, hardgrounds in the same section as the Jebel Huwayyah Mound may also have had a similar origin. The Jebel Huwayyah Mound shows that serpentinization of the Semail Ophiolite by seawater occurred very soon after obduction and marine transgression, a process that continued through to the Miocene, and, with interaction of meteoric water, up to the present day.
United States dentists prescribe 10% of all outpatient antibiotics. Assessing appropriateness of antibiotic prescribing has been challenging due to a lack of guidelines for oral infections. In 2019, the American Dental Association (ADA) published clinical practice guidelines (CPG) on the management of acute oral infections. Our objective was to describe baseline national antibiotic prescribing for acute oral infections prior to the release of the ADA CPG and to identify patient-level variables associated with an antibiotic prescription.
We performed an analysis of national VA data from January 1, 2017, to December 31, 2017. We identified cases of acute oral infections using International Classification of Disease, Tenth Revision, Clinical Modification (ICD-10-CM) codes. Antibiotics prescribed by a dentist within ±7 days of a visit were included. Multivariable logistic regression identified patient-level variables associated with an antibiotic prescription.
Of the 470,039 VA dental visits with oral infections coded, 12% of patient visits with irreversible pulpitis, 17% with apical periodontitis, and 28% with acute apical abscess received antibiotics. Although the median days’ supply was 7, prolonged use of antibiotics was frequent (≥8 days, 42%–49%). Patients with high-risk cardiac conditions, prosthetic joints, and endodontic, implant, and oral and maxillofacial surgery dental procedures were more likely to receive antibiotics.
Most treatments of irreversible pulpitis and apical periodontitis cases were concordant with new ADA guidelines. However, in cases where antibiotics were prescribed, prolonged antibiotic courses >7 days were frequent. These findings demonstrate opportunities for the new ADA guidelines to standardize and improve dental prescribing practices.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to COVID-19 with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplemental materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaborations across disciplines, areas of expertise, and geographic locations will be critical.
Clarifying the relationship between depression symptoms and cardiometabolic and related health problems could identify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
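As a rough illustration of the kind of model behind odds ratios like these (not the study's data, covariate set, or exact analysis), the sketch below fits a logistic regression on simulated data and exponentiates the coefficients to obtain ORs with 95% confidence intervals.

```python
# Illustrative sketch: logistic regression of an incident condition on a
# depression-symptom score plus covariates, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 787
df = pd.DataFrame({
    "depression": rng.standard_normal(n),   # standardized symptom score (simulated)
    "age": rng.normal(41.4, 2.3, n),
    "bmi": rng.normal(27, 4, n),
})
logit = 0.25 * df["depression"] - 2.0        # simulated true effect
df["diabetes"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["depression", "age", "bmi"]])
res = sm.Logit(df["diabetes"], X).fit(disp=0)
odds_ratios = np.exp(res.params)
ci = np.exp(res.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```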
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with 15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of the survey's declination limit, made over a 288-MHz band centred at 887.5 MHz.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC.
Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp. (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines.
Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. The proportion of isolates that did not confirm was 50.9% (235 of 462) for MicroScan, 60.6% (249 of 411) for BD Phoenix, and 48.5% (371 of 765) for VITEK 2. Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp. (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs.
Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
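As a rough illustration of the confirmation check described above, one could categorize paired ATI and reference BMD MICs for a single agent and cross-tabulate them; the MIC values and breakpoints below are placeholders, not an authoritative CLSI table.

```python
# Sketch: compare ATI and reference BMD interpretive categories for one drug.
import pandas as pd

def category(mic, susceptible_max, resistant_min):
    if mic <= susceptible_max:
        return "S"
    if mic >= resistant_min:
        return "R"
    return "I"

# Hypothetical paired MICs (ug/mL) from the clinical ATI and reference BMD.
isolates = pd.DataFrame({
    "ati_mic": [2, 4, 0.5, 2, 8],
    "bmd_mic": [0.5, 4, 0.25, 1, 8],
})
bp = dict(susceptible_max=0.5, resistant_min=2)   # illustrative breakpoints only
isolates["ati_cat"] = isolates["ati_mic"].apply(lambda m: category(m, **bp))
isolates["bmd_cat"] = isolates["bmd_mic"].apply(lambda m: category(m, **bp))
isolates["confirmed_R"] = (isolates["ati_cat"] == "R") & (isolates["bmd_cat"] == "R")
print(pd.crosstab(isolates["ati_cat"], isolates["bmd_cat"]))
print("Proportion of ATI-resistant calls confirmed:", isolates["confirmed_R"].mean())
```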
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
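A minimal sketch of such a segmented (interrupted time series) negative binomial model, fit to simulated monthly culture counts with a days-present offset, is shown below; the variable names, intervention month, and data are illustrative, not the study's dataset.

```python
# Segmented negative binomial regression for an interrupted time series.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
months = np.arange(54)                      # Aug 2013 - Jan 2018
post = (months >= 36).astype(int)           # illustrative intervention point
time_after = np.clip(months - 36, 0, None)  # months since intervention (slope change)
patient_days = rng.integers(8000, 12000, size=months.size)
mu = np.exp(3.0 - 0.002 * months - 0.2 * post - 0.01 * time_after) * patient_days / 1000
cultures = rng.poisson(mu)                  # simulated monthly urine-culture counts

X = sm.add_constant(pd.DataFrame({"time": months, "level": post, "trend": time_after}))
model = sm.GLM(cultures, X,
               family=sm.families.NegativeBinomial(),
               offset=np.log(patient_days))
res = model.fit()
print(np.exp(res.params))   # rate ratios: baseline trend, level change, slope change
```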
The study included 6 acute-care hospitals within the Veterans’ Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = 0.8) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
Executive functions (EF) are associated with the frontal lobes, and cognitive decline (CD) is associated with worse results on EF tests.
To analyse whether the Frontal Assessment Battery (FAB), which assesses EF, discriminates older adults with CD from those without CD (as classified by the Montreal Cognitive Assessment, MoCA), and whether the results obtained with the Rey-Osterrieth Complex Figure Test (ROCF; copy quality, immediate, and delayed memory) are associated with the presence or absence of CD. Moreover, we wanted to assess whether copy quality and the 3-minute memory test are associated with FAB results, since these two tests are thought to be associated with EF and with the frontal lobes assessed by the FAB, in contrast to the 20-minute memory test (thought to be related to the temporal area).
A total of 556 institutionalized older adults (age: M ± SD = 80.2 ± 5.23; range = 60–100) voluntarily completed a sociodemographic questionnaire, the ROCF, the MoCA, and the FAB.
The FAB and all ROCF tests were associated with the presence or absence of CD. When variables were stratified by age and education, the FAB was associated with immediate memory but not with copy quality or delayed memory. Without stratification of the ROCF and FAB, correlations confirmed the previous associations and additionally showed an association between the FAB and copy quality.
These results follow the literature regarding the association between immediate memory and EF (linked to the frontal lobes), in contrast to long-term memory, which is associated with the temporal area and was not associated with the FAB. Findings concerning copy quality (ROCF) remain inconsistent across studies.
When cognitive decline (CD) is present, attention is one of the impaired mental functions. CD is also associated with anxious/depressive symptoms and with some demographic variables, particularly age.
To investigate the associations between selective attention (Stroop Test: Stroop_Word, Stroop_Color, difference between Stroop_Word and Stroop_Color, Stroop Ratio_Word, Stroop Ratio_Color, and difference between Stroop Ratio_Word and Stroop Ratio_Color) and CD (Montreal Cognitive Assessment/MoCA) in institutionalized older adults, and to explore the predictive value of the Stroop variables for CD, controlling for anxious/depressive symptoms and sociodemographic variables.
A total of 140 institutionalized older adults (mean age M = 78.4, SD = 7.48, range = 60–97) voluntarily answered sociodemographic questions and completed the MoCA, the Geriatric Anxiety Inventory (GAI), the Geriatric Depression Scale (GDS), and the Stroop Test.
Seventy-three participants (52.1%) had CD. The dichotomized MoCA was associated with Stroop_Word, Stroop_Color, Stroop Ratio_Word, Stroop Ratio_Color, the GDS, and the sociodemographic variable schooling × profession. Age and education were not tested, since the MoCA was stratified according to those variables. The GDS, Stroop Ratio_Word, and Stroop Ratio_Color predicted CD.
There was an association between Stroop_Word, Stroop_Color, Stroop Ratio_Word, and Stroop Ratio_Color and CD, confirming that selective attention is reduced when older adults present CD. The GDS and CD were also associated. However, there was no association between the dichotomized MoCA and the differences between the correct answers (Stroop_Word and Stroop_Color) or the ratios (Stroop Ratio_Word and Stroop Ratio_Color). Selective attention and depressive symptoms predicted CD. It would be important to intervene through cognitive rehabilitation to improve attention in older adults.
Quality-adjusted life-years (QALYs) and disability-adjusted life-years (DALYs) are commonly used in cost-effectiveness analysis (CEA) to measure health benefits. We sought to quantify and explain differences between QALY- and DALY-based cost-effectiveness ratios, and explore whether using one versus the other would materially affect conclusions about an intervention's cost-effectiveness.
We identified CEAs using both QALYs and DALYs from the Tufts Medical Center CEA Registry and Global Health CEA Registry, with a supplemental search to ensure comprehensive literature coverage. We calculated absolute and relative differences between the QALY- and DALY-based ratios, and compared ratios to common benchmarks (e.g., 1× gross domestic product per capita). We converted reported costs into US dollars.
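As a worked example of this comparison (all numbers illustrative), the absolute and relative differences between a QALY-based and a DALY-based ratio, and their relation to a GDP-based threshold, can be computed as follows.

```python
# Illustrative comparison of QALY- and DALY-based cost-effectiveness ratios.
def icer(delta_cost, delta_benefit):
    """Incremental cost-effectiveness ratio: cost per unit of health benefit."""
    return delta_cost / delta_benefit

qaly_ratio = icer(12000, 1.5)     # $ per QALY gained (illustrative)
daly_ratio = icer(12000, 1.3)     # $ per DALY averted (illustrative)
abs_diff = abs(qaly_ratio - daly_ratio)
rel_diff = abs_diff / min(qaly_ratio, daly_ratio)
threshold = 1 * 6500              # e.g. 1x GDP per capita (illustrative)

print(qaly_ratio, round(daly_ratio), round(abs_diff), f"{rel_diff:.0%}")
print(qaly_ratio <= threshold, daly_ratio <= threshold)  # same conclusion either way?
```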
Among eleven published CEAs reporting both QALYs and DALYs, seven focused on pharmaceuticals and infectious disease, and five were conducted in high-income countries. Four studies concluded that the intervention was “dominant” (cost-saving). Among the QALY- and DALY-based ratios reported from the remaining seven studies, absolute differences ranged from approximately $2 to $15,000 per unit of benefit, and relative differences from 6–120 percent, but most differences were modest in comparison with the ratio value itself. The values assigned to utility and disability weights explained most observed differences. In comparison with cost-effectiveness thresholds, conclusions were consistent regardless of the ratio type in ten of eleven cases.
Our results suggest that although QALY- and DALY-based ratios for the same intervention can differ, differences tend to be modest and do not materially affect comparisons to common cost-effectiveness thresholds.
Current guidelines recommend highly specialized care for patients with severe personality disorders (PDs). However, there is little knowledge about how to detect older patients with severe PDs. The aim of the current study was to develop an age-specific tool to detect older adults with severe PDs for highly specialized mental health care.
In a Delphi study, a tool to detect adults with severe PDs for highly specialized mental health care was adjusted for older adults based on expert opinion. Subsequently, the psychometric properties of the age-specific tool were evaluated.
The psychometric part of the study was performed in two Dutch highly specialized centers for PDs in older adults.
Patients (N = 90) from two highly specialized centers on PDs in older adults were enrolled.
The age-specific tool was evaluated using clinical judgment as the gold standard.
The Delphi study resulted in an age-specific tool, consisting of seven items, to detect older adults with severe PDs for highly specialized mental health care. The psychometric properties of this tool were then evaluated. Receiver operating characteristic (ROC) curve analysis showed that the questionnaire had sufficient diagnostic accuracy. Internal consistency of the tool was sufficient, and inter-rater reliability was moderate.
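A minimal sketch of such a diagnostic-accuracy evaluation, assuming a total score on the seven-item tool and clinical judgment as the reference standard (simulated placeholder data), could use a standard ROC routine.

```python
# Sketch: ROC analysis of a screening-tool score against clinical judgment.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
gold_standard = rng.binomial(1, 0.5, size=90)            # clinician judgment (simulated)
tool_score = gold_standard * 1.5 + rng.normal(size=90)   # noisy 7-item total score (simulated)

auc = roc_auc_score(gold_standard, tool_score)
fpr, tpr, thresholds = roc_curve(gold_standard, tool_score)
cutoff = thresholds[np.argmax(tpr - fpr)]                 # Youden-index candidate cut-off
print(f"AUC = {auc:.2f}, suggested cut-off = {cutoff:.2f}")
```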
An age-specific tool to detect older adults with severe PDs was developed based on expert opinion. Psychometric properties were evaluated showing sufficient diagnostic accuracy. The tool may preliminarily be used in mental health care to detect older adults with severe PDs to refer them to highly specialized care in an early phase.
This study examined the long-term effects of a randomized controlled trial of the Family Check-Up (FCU) intervention initiated at age 2 on inhibitory control in middle childhood and adolescent internalizing and externalizing problems. We hypothesized that the FCU would promote higher inhibitory control in middle childhood relative to the control group, which in turn would be associated with lower internalizing and externalizing symptomology at age 14. Participants were 731 families, with half (n = 367) of the families assigned to the FCU intervention. Using an intent-to-treat design, results indicate that the FCU intervention was indirectly associated with both lower internalizing and externalizing symptoms at age 14 via its effect on increased inhibitory control in middle childhood (i.e., ages 8.5–10.5). Findings highlight the potential for interventions initiated in toddlerhood to have long-term impacts on self-regulation processes, which can further reduce the risk for behavioral and emotional difficulties in adolescence.
Self-reported activity restriction is an established correlate of depression in dementia caregivers (dCGs). It is plausible that the daily distribution of objectively measured activity is also altered in dCGs with depression symptoms; if so, such activity characteristics could provide a passively measurable marker of depression or specific times to target preventive interventions. We therefore investigated how levels of activity throughout the day differed in dCGs with and without depression symptoms, then tested whether any such differences predicted changes in symptoms 6 months later.
Design, setting, participants, and measurements:
We examined 56 dCGs (mean age = 71, standard deviation (SD) = 6.7; 68% female) and used clustering to identify subgroups which had distinct depression symptom levels, leveraging baseline Center for Epidemiologic Studies of Depression Scale–Revised Edition and Patient Health Questionnaire-9 (PHQ-9) measures, as well as a PHQ-9 score from 6 months later. Using wrist activity (mean recording length = 12.9 days, minimum = 6 days), we calculated average hourly activity levels and then assessed when activity levels relate to depression symptoms and changes in symptoms 6 months later.
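A minimal sketch of the hourly-activity computation, assuming one-minute wrist-actigraphy epochs with illustrative column names and simulated values, is shown below.

```python
# Sketch: average activity per hour of day from minute-level actigraphy epochs.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
idx = pd.date_range("2021-01-01", periods=7 * 24 * 60, freq="min")  # one week of 1-min epochs
epochs = pd.DataFrame({"activity": rng.gamma(2.0, 50.0, size=idx.size)}, index=idx)

hourly_mean = epochs["activity"].groupby(epochs.index.hour).mean()   # 24 values, one per hour
morning = hourly_mean.loc[8:9].mean()                                 # 8-10 AM window
print(hourly_mean.round(1))
print("Mean 8-10 AM activity:", round(morning, 1))
```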
Clustering identified subgroups characterized by: (1) no/minimal symptoms (36%) and (2) depression symptoms (64%). After multiple comparison correction, the group of dCGs with depression symptoms was less active from 8 to 10 AM (Cohen’s d ≤ −0.9). These morning activity levels predicted the degree of symptom change on the PHQ-9 6 months later (per SD unit β = −0.8, 95% confidence interval: −1.6, −0.1, p = 0.03) independent of self-reported activity restriction and other key factors.
These novel findings suggest that morning activity may protect dCGs from depression symptoms. Future studies should test whether helping dCGs get active in the morning influences the other features of depression in this population (i.e. insomnia, intrusive thoughts, and perceived activity restriction).
We examined Clostridioides difficile infection (CDI) prevention practices and their relationship with hospital-onset healthcare facility-associated CDI rates (CDI rates) in Veterans Affairs (VA) acute-care facilities.
From January 2017 to February 2017, we conducted an electronic survey of CDI prevention practices and hospital characteristics in the VA. We linked survey data with CDI rate data for the period January 2015 to December 2016. We stratified facilities according to whether their overall CDI rate per 10,000 bed days of care was above or below the national VA mean CDI rate. We examined whether specific CDI prevention practices were associated with an increased risk of a CDI rate above the national VA mean CDI rate.
All 126 facilities responded (100% response rate). Since implementing CDI prevention practices in July 2012, 60 of 123 facilities (49%) reported a decrease in CDI rates; 22 of 123 facilities (18%) reported an increase, and 41 of 123 (33%) reported no change. Facilities reporting an increase in the CDI rate (vs those reporting a decrease) after implementing prevention practices were 2.54 times more likely to have CDI rates that were above the national mean CDI rate. Whether a facility’s CDI rates were above or below the national mean CDI rate was not associated with self-reported cleaning practices, duration of contact precautions, availability of private rooms, or certification of infection preventionists in infection prevention.
We found considerable variation in CDI rates. We were unable to identify which particular CDI prevention practices (i.e., bundle components) were associated with lower CDI rates.
A tubular group G is a finite graph of groups with ℤ² vertex groups and ℤ edge groups. We characterize residually finite tubular groups: G is residually finite if and only if its edge groups are separable. Methods are provided to determine if G is residually finite. When G has a single vertex group, an algorithm is given to determine residual finiteness.
Building on prior work regarding the potential for peer contagion or deviance training in group delivered interventions (Dishion & Dodge, 2005, 2006; Dodge, Dishion, & Lansford, 2006), we leveraged data from a randomized trial, testing the integration of two preventive interventions (Promoting Alternative THinking Strategies and PAX Good Behavior Game), to explore the extent to which classroom contextual factors served as either a barrier to or a motivator for teachers to implement the evidence-based PAX Good Behavior Game with high frequency or dosage. We included students’ baseline levels of behavior, measured with regard to both positive (i.e., engagement and social emotional skills) and negative (i.e., hyperactive and aggressive-disruptive) behaviors. Data were collected from 204 teachers in 18 urban elementary schools. A series of multilevel structural equation models were fit to the data. The analyses indicated that classrooms with higher classroom levels of aggressive behavior, on average, at baseline had teachers with lower implementation dosage (i.e., played fewer games) across the school year. In addition, teachers who reported higher baseline levels of emotional exhaustion, regardless of student behavior, also reported lower implementation dosage. Taken together, the results indicated that negative, but not positive, contextual factors at baseline were related to lower implementation dosage; this, in turn, suggests that negative contextual factors may serve as a barrier, rather than a motivator, of teachers’ implementation dosage of classroom-based preventive interventions.