Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
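The before/after comparison in the methods above can be sketched with SciPy's Wilcoxon rank-sum test; the colony counts below are invented for illustration and are not the study's measurements.

```python
# Hypothetical pre-/post-flush bioaerosol counts (CFU per sample);
# these numbers are invented, not the study's data.
from scipy.stats import ranksums

preflush = [0, 2, 1, 0, 3, 1, 0, 2]
postflush = [4, 6, 3, 5, 8, 2, 7, 5]

# Two-sided Wilcoxon rank-sum test, as in the study's pre/post comparison
stat, p = ranksums(preflush, postflush)
print(f"rank-sum statistic = {stat:.2f}, two-sided P = {p:.4f}")
```

A rank-based test is a natural choice here because small bioaerosol counts are discrete and unlikely to be normally distributed.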
This study examined the long-term effects of a randomized controlled trial of the Family Check-Up (FCU) intervention initiated at age 2 on inhibitory control in middle childhood and adolescent internalizing and externalizing problems. We hypothesized that the FCU would promote higher inhibitory control in middle childhood relative to the control group, which in turn would be associated with lower internalizing and externalizing symptomology at age 14. Participants were 731 families, with half (n = 367) of the families assigned to the FCU intervention. Using an intent-to-treat design, results indicate that the FCU intervention was indirectly associated with both lower internalizing and externalizing symptoms at age 14 via its effect on increased inhibitory control in middle childhood (i.e., ages 8.5–10.5). Findings highlight the potential for interventions initiated in toddlerhood to have long-term impacts on self-regulation processes, which can further reduce the risk for behavioral and emotional difficulties in adolescence.
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate analyses were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated.
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the same regression, lines with a CLISA score of 2 or 3 were removed 3.19 days faster after the intervention (P < .001). Line dwell time also decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001), and device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
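The interrupted time-series step described in the methods can be illustrated with a minimal segmented-regression sketch in NumPy. The monthly percentages below are simulated with a level drop at the intervention month; all numbers are hypothetical, and the study's actual models additionally included mixed effects and patient-level covariates.

```python
import numpy as np

# Simulated monthly % of lines with CLISA score 2-3; illustrative only.
rng = np.random.default_rng(1)
months = np.arange(24)                 # months 0-23; intervention at month 12
post = (months >= 12).astype(float)
true_level_drop = -15.0
y = 22 + 0.1 * months + true_level_drop * post + rng.normal(0, 1.0, months.size)

# Segmented regression: intercept, baseline trend, level change, slope change
X = np.column_stack([np.ones(months.size),
                     months.astype(float),
                     post,
                     post * (months - 12)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated immediate level change = {beta[2]:.2f} percentage points")
```

The coefficient on the post-period indicator estimates the immediate level change at the intervention, which is the quantity the time-series analysis above reports as significant.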
Building on prior work using Tom Dishion's Family Check-Up, the current article examined intervention effects on dysregulated irritability in early childhood. Dysregulated irritability, defined as a reactive and intense response to frustration and prolonged angry mood, is an ideal marker of neurodevelopmental vulnerability to later psychopathology because it is a transdiagnostic indicator of decrements in self-regulation that are measurable in the first years of life and have lifelong implications for health and disease. This study is perhaps the first randomized trial to examine the direct effects of an evidence- and family-based intervention, the Family Check-Up (FCU), on irritability in early childhood and the effects of reductions in irritability on later risk of child internalizing and externalizing symptomatology. Data from the geographically and sociodemographically diverse multisite Early Steps randomized prevention trial were used. Path modeling revealed intervention effects on irritability at age 4, which predicted lower externalizing and internalizing symptoms at age 10.5. Results indicate that family-based programs initiated in early childhood can reduce early childhood irritability and later risk for psychopathology. This holds promise for earlier identification and prevention approaches that target transdiagnostic pathways. Implications for future basic and prevention research are discussed.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this previous literature by using a genetically-informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and early childhood discriminators of such patterns from the toddler period to adolescence. The sample represents a cohort of 731 toddlers and diverse families recruited based on socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child, family, and community level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for those children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent low versus persistent high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and specifically, prevention efforts addressing early child and family risk.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain postoperatively in CSM. Methods: This ambispective study included 402 patients identified through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the Neck Disability Index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6 ± 2.9 to 3.8 ± 2.7 at 12 months (P < 0.001). VAS-AP improved from 5.8 ± 2.9 to 3.5 ± 3.0 at 12 months (P < 0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, 45.8% of patients with moderate/severe pain improved to mild/no pain, whereas 27.2% of those with mild/no pain worsened to moderate/severe pain (P < 0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P < 0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
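As a small worked example of the MCID logic above: a patient counts as a responder when the improvement from baseline meets or exceeds the 2.6-point VAS-NP threshold reported in the abstract. The patient scores below are invented for illustration.

```python
# MCID responder classification for neck pain (VAS-NP), using the
# 2.6-point threshold from the abstract; patient scores are hypothetical.
MCID_VAS_NP = 2.6

baseline = [7.0, 5.5, 6.0, 4.0]   # VAS-NP before surgery
month_12 = [3.0, 4.5, 2.0, 3.9]   # VAS-NP at 12 months

improvements = [b - f for b, f in zip(baseline, month_12)]
responders = [imp >= MCID_VAS_NP for imp in improvements]
print(f"responder rate = {sum(responders)}/{len(responders)}")
```

Group-level mean changes (as reported above) can reach the MCID even when some individual patients do not, which is why responder rates are often reported alongside mean differences.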
Introduction: Many studies have examined the effectiveness of applying LEAN principles to improve wait times in emergency departments (EDs), but relatively few have applied these concepts to length of stay (LOS) in the ED. This research explores the initial feasibility of applying the LEAN model to length-of-stay metrics in an ED by identifying areas of non-value-added time for patients staying in the ED. Methods: We used a sample of 10,000 ED visits at the Health Science Centre in St. John’s over a 1-year period and compared patients’ LOS in the ED on four criteria: day of the week, hour of presentation, whether laboratory tests were ordered, and whether diagnostic imaging was ordered. Two sets of analyses were performed. First, a two-sided Wilcoxon rank-sum test was used to evaluate whether ordering either lab tests or diagnostic imaging affected LOS. Second, a generalized linear model (GLM) was fit using 10-fold cross-validation with a LASSO operator to estimate the effect size and significance of each of the four criteria on LOS. Additionally, a post-test analysis of the GLM was performed on a second sample of 10,000 ED visits from the same 1-year period to assess its predictive power and infer the degree to which a patient’s LOS is determined by the four criteria. Results: In the Wilcoxon rank-sum test, there was no significant difference in LOS for patients who were ordered diagnostic imaging compared with those who were not (p = 0.6998), but there was a statistically significant decrease in LOS for patients who were ordered lab tests compared with those who were not (p = 2.696 × 10⁻¹⁰). The GLM yielded two significant findings: ordering lab tests reduced LOS (95% CI, 42.953–68.173 min reduction), and arriving at the ED on a Thursday significantly increased LOS (95% CI, 6.846–52.002 min increase).
Conclusion: This preliminary analysis identified several factors that increased patients’ LOS in the ED and that would be suitable targets for potential LEAN interventions. The increases in LOS for patients who were not ordered lab tests and for those who visited the ED on Thursday warrant further investigation to identify causal factors. Finally, while this analysis revealed several actionable criteria for improving ED LOS, the relatively low predictive power of the final GLM in the post-test analysis (R² = 0.00363) indicates that additional criteria influencing LOS remain to be explored in future analyses.
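A minimal sketch of the modelling step described above: a LASSO fit by cyclic coordinate descent, with the penalty chosen by k-fold cross-validation. The data are simulated (LOS driven by a lab-test indicator plus a pure-noise column), and the effect sizes are illustrative, not the study's estimates.

```python
import numpy as np

def soft_threshold(z, g):
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def lasso_cd(X, y, alpha, n_iter=100):
    """Minimise (1/2n)||y - Xb||^2 + alpha*||b||_1 by coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ b + X[:, j] * b[j]   # partial residual
            rho = X[:, j] @ resid / n
            b[j] = soft_threshold(rho, alpha) / col_sq[j]
    return b

def cv_mse(X, y, alpha, k=10, seed=0):
    """Mean test MSE of the LASSO fit over k random folds."""
    idx = np.random.default_rng(seed).permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        b = lasso_cd(X[train], y[train], alpha)
        errs.append(np.mean((y[test] - X[test] @ b) ** 2))
    return float(np.mean(errs))

# Simulated, centred data: LOS (min) shortened by ordering lab tests.
rng = np.random.default_rng(42)
n = 400
labs = rng.integers(0, 2, n).astype(float)     # lab-test indicator
noise_col = rng.normal(0, 1, n)                # irrelevant predictor
y = -55.0 * labs + rng.normal(0, 30.0, n)
X = np.column_stack([labs - labs.mean(), noise_col])
y = y - y.mean()

best_alpha = min([0.5, 2.0, 8.0, 32.0], key=lambda a: cv_mse(X, y, a))
coefs = lasso_cd(X, y, best_alpha)
print(f"selected alpha = {best_alpha}, lab-test coefficient = {coefs[0]:.1f}")
```

With a strong true effect, cross-validation selects a small penalty, the lab-test coefficient stays large and negative, and the L1 penalty shrinks the irrelevant column toward zero.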
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee consisting of 21 members from across Canada, including academic, community, and remote/rural emergency physicians and nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists, and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails, and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as low (≤0.5%: no further testing), moderate (0.6%–5%: further testing required), and high (>5%: computed tomography, magnetic resonance imaging, or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns.
D-dimer can be used to reduce the probability of AAS in the intermediate-risk group but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.
Development involves synergistic interplay among genotypes and the physical and cultural environments, and integrating genetics into experimental designs that manipulate the environment can improve understanding of developmental psychopathology and intervention efficacy. Consistent with differential susceptibility theory, individuals can vary in their sensitivity to environmental conditions including intervention for reasons including their genotype. As a consequence, understanding genetic influences on intervention response is critical. Empirically, we tested an interaction between a genetic index representing sensitivity to the environment and the Family Check-Up intervention. Participants were drawn from the Early Steps Multisite randomized prevention trial that included a low-income and racially/ethnically diverse sample of children and their families followed longitudinally (n = 515). As hypothesized, polygenic sensitivity to the environment moderated the effects of the intervention on 10-year-old children's symptoms of internalizing psychopathology, such that children who were genetically sensitive and were randomly assigned to the intervention had fewer symptoms of child psychopathology than genetically sensitive children assigned to the control condition. A significant difference in internalizing symptoms assessed with a clinical interview emerged between the intervention and control groups for those 0.493 SD above the mean on polygenic sensitivity, or 25% of the sample. Similar to personalized medicine, it is time to understand individual and sociocultural differences in treatment response and individualize psychosocial interventions to reduce the burden of child psychopathology and maximize well-being for children growing up in a wide range of physical environments and cultures.
To achieve their conservation goals, individuals, communities and organizations need to acquire a diversity of skills, knowledge and information (i.e. capacity). Despite current efforts to build and maintain appropriate levels of conservation capacity, it is recognized that these activities will need to be scaled up significantly in sub-Saharan Africa, because of the rapid increase in the number and extent of environmental problems in the region. We present a range of socio-economic contexts relevant to four key areas of African conservation capacity building: protected area management, community engagement, effective leadership, and professional e-learning. Under these core themes, 39 specific recommendations are presented. These were derived from multi-stakeholder workshop discussions at an international conference held in Nairobi, Kenya, in 2015. At the meeting 185 delegates (practitioners, scientists, community groups and government agencies) represented 105 organizations from 24 African nations and eight non-African nations. The 39 recommendations constituted six broad types of suggested action: (1) the development of new methods, (2) the provision of capacity building resources (e.g. information or data), (3) the communication of ideas or examples of successful initiatives, (4) the implementation of new research or gap analyses, (5) the establishment of new structures within and between organizations, and (6) the development of new partnerships. A number of cross-cutting issues also emerged from the discussions: the need for a greater sense of urgency in developing capacity building activities; the need to develop novel capacity building methodologies; and the need to move away from one-size-fits-all approaches.
Infants and young children are frequently colonized with C. difficile but rarely have symptomatic disease. However, C. difficile testing remains prevalent in this age group.
To design a computerized provider order entry (CPOE) alert to decrease testing for C. difficile in young children and infants.
An interventional, age-targeted, before-and-after trial with a comparison group.
Monroe Carell Jr. Children’s Hospital at Vanderbilt University, Nashville, Tennessee.
All children seen in the inpatient or emergency room settings from July 2012 through July 2013 (pre-CPOE alert) and September 2013 through September 2014 (post-CPOE alert).
In August of 2013, we implemented a CPOE alert advising against testing in infants and young children based on the American Academy of Pediatrics recommendations with an optional override. We further offered healthcare providers educational seminars regarding recommended C. difficile testing.
The average monthly testing rate decreased significantly after the CPOE alert for children 0–11 months old (11.5 pre-alert vs 0 post-alert per 10,000 patient days; P<.001) and for children 12–35 months old (61.6 pre-alert vs 30.1 post-alert per 10,000 patient days; P<.001), but not for children ≥36 months old (50.9 pre-alert vs 46.4 post-alert per 10,000 patient days; P=.3), who were not targeted by the CPOE alert. There were no complications in children who tested positive for C. difficile.
The average monthly C. difficile testing rate for children <36 months old decreased after implementation of the CPOE alert, with no complications among children who tested positive for C. difficile.
Limited longitudinal research has examined the longer-term incidence of depressive symptoms in women with a hysterectomy compared with women without one. We aimed to investigate the association between hysterectomy status and the 12-year incidence of depressive symptoms in a mid-aged cohort of Australian women, and whether these relationships were modified by use of exogenous hormones.
We used generalised estimating equation models for binary outcome data to assess the association of hysterectomy status with the incidence of depressive symptoms (measured by the 10-item Centre for Epidemiologic Studies Depression Scale) across five surveys over a 12-year period, comparing women with a hysterectomy with ovarian conservation, and women with a hysterectomy with bilateral oophorectomy, with women without a hysterectomy. We further stratified women with a hysterectomy by their current use of menopausal hormone therapy (MHT). Women who reported prior treatment for depression were excluded from the analysis.
Compared with women without a hysterectomy (n = 4002), both women with a hysterectomy with ovarian conservation (n = 884) and women with a hysterectomy and bilateral oophorectomy (n = 450) had a higher risk of depressive symptoms (relative risk (RR) 1.20; 95% confidence interval (CI) 1.06–1.36 and RR 1.44; 95% CI 1.22–1.68, respectively). There were differences in the strength of the risk for women with a hysterectomy with ovarian conservation, compared with those without, when we stratified by current MHT use. Compared with women without a hysterectomy who did not use MHT, women with a hysterectomy with ovarian conservation who were also MHT users had a higher risk of depressive symptoms (RR 1.57; 95% CI 1.31–1.88) than women with a hysterectomy with ovarian conservation who did not use MHT (RR 1.17; 95% CI 1.02–1.35). For women with a hysterectomy and bilateral oophorectomy, MHT use did not attenuate the risk. We could not rule out, however, that the higher risk seen among MHT users may be due to confounding by indication, i.e. MHT may have been prescribed to treat depressive symptoms that then persisted.
Women with a hysterectomy (with and without bilateral oophorectomy) had a higher long-term risk of incident depressive symptoms, which was not explained by lifestyle or socio-economic factors.
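For readers unfamiliar with the effect measure, the sketch below shows how a relative risk and its 95% Wald confidence interval are computed from simple cohort counts. The counts are hypothetical, and the study itself used generalised estimating equations rather than this crude 2×2 calculation.

```python
import math

# Hypothetical cohort counts; NOT the study's data.
a, b = 30, 70     # exposed group: events, non-events
c, d = 20, 80     # unexposed group: events, non-events

risk_exposed = a / (a + b)
risk_unexposed = c / (c + d)
rr = risk_exposed / risk_unexposed

# Wald CI on the log scale for the relative risk
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0, as in the abstract's estimates of 1.20 and 1.44, indicates a statistically significant elevation in risk.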
The Antarctic Roadmap Challenges (ARC) project identified critical requirements to deliver high priority Antarctic research in the 21st century. The ARC project addressed the challenges of enabling technologies, facilitating access, providing logistics and infrastructure, and capitalizing on international co-operation. Technological requirements include: i) innovative automated in situ observing systems, sensors and interoperable platforms (including power demands), ii) realistic and holistic numerical models, iii) enhanced remote sensing and sensors, iv) expanded sample collection and retrieval technologies, and v) greater cyber-infrastructure to process ‘big data’ collection, transmission and analyses while promoting data accessibility. These technologies must be widely available, performance and reliability must be improved and technologies used elsewhere must be applied to the Antarctic. Considerable Antarctic research is field-based, making access to vital geographical targets essential. Future research will require continent- and ocean-wide environmentally responsible access to coastal and interior Antarctica and the Southern Ocean. Year-round access is indispensable. The cost of future Antarctic science is great but there are opportunities for all to participate commensurate with national resources, expertise and interests. The scope of future Antarctic research will necessitate enhanced and inventive interdisciplinary and international collaborations. The full promise of Antarctic science will only be realized if nations act together.
Fetal alcohol spectrum disorder (FASD) is increasingly recognized as a growing public health issue worldwide. Although more research is needed on both the diagnosis and treatment of FASD, and a broader and more culturally diverse range of services are needed to support those who suffer from FASD and their families, both research and practice for FASD raise significant ethical issues. In response, from the point of view of both research and clinical neuroethics, we provide a framework that emphasizes the need to maximize benefits and minimize harm, promote justice, and foster respect for persons within a global context.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. BETA is the first aperture-synthesis radio telescope to use phased-array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations made to traditional calibration and imaging procedures to allow BETA to function as a multi-beam aperture-synthesis telescope. We describe the commissioning of the instrument and present details of BETA’s performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final ASKAP telescope.
Skin and soft tissue infections (SSTIs) due to Staphylococcus aureus have become increasingly common in the outpatient setting; however, risk factors for differentiating methicillin-resistant S. aureus (MRSA) and methicillin-susceptible S. aureus (MSSA) SSTIs are needed to better inform antibiotic treatment decisions. We performed a case-case-control study within 14 primary-care clinics in South Texas from 2007 to 2015. Overall, 325 patients [S. aureus SSTI cases (case group 1, n = 175); MRSA SSTI cases (case group 2, n = 115); MSSA SSTI cases (case group 3, n = 60); uninfected control group (control, n = 150)] were evaluated. Each case group was compared to the control group, and then qualitatively contrasted to identify unique risk factors associated with S. aureus, MRSA, and MSSA SSTIs. Overall, prior SSTIs [adjusted odds ratio (aOR) 7·60, 95% confidence interval (CI) 3·31–17·45], male gender (aOR 1·74, 95% CI 1·06–2·85), and absence of healthcare occupation status (aOR 0·14, 95% CI 0·03–0·68) were independently associated with S. aureus SSTIs. The only unique risk factor for community-associated (CA)-MRSA SSTIs was a high body weight (≥110 kg) (aOR 2·03, 95% CI 1·01–4·09).
The current study sought to advance our understanding of transactional processes among maternal depression, neighborhood deprivation, and child conduct problems (CP) using two samples of low-income families assessed repeatedly from early childhood to early adolescence. After accounting for initial levels of negative parenting, independent and reciprocal effects between maternal depressive symptoms and child CP were evident across both samples, beginning in early childhood and continuing through middle childhood and adolescence. In addition, neighborhood effects were consistently found in both samples after children reached age 5, with earlier neighborhood effects on child CP and maternal depression found in the one exclusively urban sample of families with male children. The results confirm prior research on the independent contribution of maternal depression and child CP to the maintenance of both problem behaviors. The findings also have implications for designing preventative and clinical interventions to address child CP for families living in high-risk neighborhoods.
Many adults with autism spectrum disorder (ASD) remain undiagnosed. Specialist assessment clinics enable the detection of these cases, but such services are often overstretched. It has been proposed that unnecessary referrals to these services could be reduced by prioritizing individuals who score highly on the Autism-Spectrum Quotient (AQ), a self-report questionnaire measure of autistic traits. However, the ability of the AQ to predict who will go on to receive a diagnosis of ASD in adults is unclear.
We studied 476 adults, seen consecutively at a national ASD diagnostic referral service for suspected ASD. We tested AQ scores as predictors of ASD diagnosis made by expert clinicians according to International Classification of Diseases (ICD)-10 criteria, informed by the Autism Diagnostic Observation Schedule-Generic (ADOS-G) and Autism Diagnostic Interview-Revised (ADI-R) assessments.
Of the participants, 73% received a clinical diagnosis of ASD. Self-report AQ scores did not significantly predict receipt of a diagnosis. While AQ scores provided high sensitivity of 0.77 [95% confidence interval (CI) 0.72–0.82] and positive predictive value of 0.76 (95% CI 0.70–0.80), the specificity of 0.29 (95% CI 0.20–0.38) and negative predictive value of 0.36 (95% CI 0.22–0.40) were low. Thus, 64% of those who scored below the AQ cut-off were ‘false negatives’ who did in fact have ASD. Co-morbidity data revealed that generalized anxiety disorder may ‘mimic’ ASD and inflate AQ scores, leading to false positives.
The AQ's utility for screening referrals was limited in this sample. Recommendations supporting the AQ's role in the assessment of adult ASD, e.g. UK NICE guidelines, may need to be reconsidered.
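The four screening metrics reported above follow directly from a 2×2 confusion table. The counts below are chosen to roughly reproduce the reported sensitivity, specificity, and PPV; they are illustrative rather than the study's exact table.

```python
# Hypothetical 2x2 table for the AQ cut-off vs clinical ASD diagnosis;
# counts are illustrative, not the study's exact data.
tp, fn = 268, 80    # ASD cases scoring above / below the AQ cut-off
fp, tn = 85, 35     # non-ASD cases scoring above / below the cut-off

sensitivity = tp / (tp + fn)   # P(above cut-off | ASD)
specificity = tn / (tn + fp)   # P(below cut-off | no ASD)
ppv = tp / (tp + fp)           # P(ASD | above cut-off)
npv = tn / (tn + fn)           # P(no ASD | below cut-off)
print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"ppv={ppv:.2f} npv={npv:.2f}")
```

With prevalence as high as 73%, even a low-specificity screen yields a decent PPV, while the NPV collapses, which is exactly the 'false negative' problem the abstract describes.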
In New Zealand, efforts to control acute rheumatic fever (ARF) and its sequelae have focused on school-age children in the poorest socioeconomic areas; however, it is unclear whether this approach is optimal given the strong association with demographic risk factors other than deprivation, especially ethnicity. The aim of this study was to estimate the stratum-specific risk of ARF by key sociodemographic characteristics. We used hospitalization and disease notification data to identify new cases of ARF between 2010 and 2013, and used population count data from the 2013 New Zealand Census as our denominator. Poisson logistic regression methods were used to estimate stratum-specific risk of ARF development. The likelihood of ARF development varied considerably by age, ethnicity and deprivation strata: while risk was greatest in Māori and Pacific children aged 10–14 years residing in the most extreme deprivation, both of these ethnic groups experienced elevated risk across a wide age range and across deprivation levels. Interventions that target populations based on deprivation will include the highest-risk strata, but they will also (a) include groups with very low risk of ARF, such as non-Māori/non-Pacific children; and (b) exclude groups with moderate risk of ARF, such as Māori and Pacific individuals living outside high deprivation areas.
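The stratum-specific risk idea above can be sketched as crude incidence rates: new cases over census denominators and years at risk. The strata and counts below are invented for illustration, and the study's actual estimates came from Poisson regression rather than this direct calculation.

```python
# Crude stratum-specific ARF incidence rates; strata and counts are invented.
strata = {
    "high-risk ethnicity, most deprived": {"cases": 160, "pop": 40_000},
    "high-risk ethnicity, less deprived": {"cases": 90,  "pop": 60_000},
    "low-risk group":                     {"cases": 8,   "pop": 400_000},
}
years = 4  # cases accrued over 2010-2013

rates = {name: s["cases"] / (s["pop"] * years) * 100_000
         for name, s in strata.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.1f} per 100,000 per year")
```

Comparing such rates across strata shows why deprivation-only targeting both captures low-risk groups and misses moderate-risk groups, as the abstract concludes.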