The MITIGATE toolkit was developed to assist urgent care and emergency departments in developing antimicrobial stewardship programs. At the University of Washington, we adopted the MITIGATE toolkit in 10 urgent care centers, 9 primary care clinics, and 1 emergency department. We encountered and overcame several challenges: a complex data build, choosing feasible outcomes to measure, issues with accurate coding, and maintaining positive stewardship relationships. Herein, we discuss solutions to these challenges to provide guidance for those considering using this toolkit.
Although trauma-focused cognitive behavior therapy (TF-CBT) is the frontline treatment for post-traumatic stress disorder (PTSD), one-third of patients are treatment non-responders. This study aimed to identify neural markers of treatment response to TF-CBT measured while participants reappraised aversive material.
Methods
This study assessed PTSD patients (n = 37) prior to TF-CBT using functional magnetic resonance imaging (fMRI) while they reappraised or watched traumatic images. Patients then underwent nine sessions of TF-CBT and were subsequently assessed for symptom severity on the Clinician-Administered PTSD Scale. fMRI responses for cognitive reappraisal and emotional reactivity contrasts of traumatic images were correlated with the reduction in PTSD severity from pretreatment to post-treatment.
Results
Symptom improvement was associated with decreased activation of the left amygdala during reappraisal, but increased activation of the bilateral amygdala and hippocampus during emotional reactivity, prior to treatment. Lower connectivity of the left amygdala to the subgenual anterior cingulate cortex, pregenual anterior cingulate cortex, and right insula, and between the left hippocampus and right amygdala, was also associated with symptom improvement.
Conclusions
These findings provide evidence that optimal treatment response to TF-CBT involves the capacity to engage emotional networks during emotional processing, and also to reduce the engagement of these networks when down-regulating emotions.
Introduction: Venipuncture is a frequent cause of pain and distress in the pediatric emergency department (ED). Distraction, which can improve patient experience, remains the most studied psychological intervention. Virtual reality (VR) is a method of immersive distraction that can contribute to the multi-modal management of procedural pain and distress. Methods: The main objectives of this study were to determine the feasibility and acceptability of VR distraction for pain management associated with venipunctures and to examine its preliminary effects on pain and distress in the pediatric ED. Children aged 7-17 years requiring a venipuncture in the pediatric ED were recruited. Participants were randomized to either a control group (standard care) or an intervention group (standard care + VR). The principal clinical outcome was the mean level of procedural pain, measured by the verbal numerical rating scale (VNRS). Distress was also measured using the Child Fear Scale (CFS) and the Procedure Behavior Check List (PBCL), and memory of pain using the VNRS. Side effects were documented. Results: A total of 63 patients were recruited. Results showed the feasibility and acceptability of VR in the pediatric ED (79% recruitment rate of eligible families, 90% rate of VR game completion) and overall high mean satisfaction levels. Satisfaction was significantly higher among healthcare providers in the intervention group, and 93% of those were willing to use this technology again for the same procedure. Regarding clinical outcomes, no significant difference was observed between groups in procedural pain. Distress evaluated by proxy (10/40 vs 13.2/40, p = 0.007) and memory of pain at 24 hours (2.4 vs 4.2, p = 0.027) were significantly lower in the VR group. Venipuncture was successful on the first attempt in 23/31 patients (74%) in the VR group and 15/30 patients (50%) in the control group (p = 0.039). Five of the 31 patients (16%) in the VR group reported side effects. Conclusion: The addition of VR to standard care is feasible and acceptable for pain and distress management during venipunctures in the pediatric ED. There was no difference in self-reported procedural pain between groups. Levels of procedural distress and memory of pain at 24 hours were lower in the VR group.
Short-term peripheral venous catheter–related bloodstream infection (PVCR-BSI) rates have not been systematically studied in resource-limited countries, and data on their incidence by number of device days are not available.
Methods:
A prospective surveillance study of PVCR-BSI was conducted from September 1, 2013, to May 31, 2019, in 727 intensive care units (ICUs) by members of the International Nosocomial Infection Control Consortium (INICC), from 268 hospitals in 141 cities in 42 countries across the Africa, Americas, Eastern Mediterranean, Europe, South East Asia, and Western Pacific regions. We applied the definitions and criteria of the CDC National Healthcare Safety Network (NHSN), the INICC methodology, and the INICC Surveillance Online System software.
Results:
We followed 149,609 ICU patients for 731,135 bed days and 743,508 short-term peripheral venous catheter (PVC) days. We identified 1,789 PVCR-BSIs, for an overall rate of 2.41 per 1,000 PVC days. Mortality was 6.67% in patients with a PVC but without PVCR-BSI and 18% in patients with a PVC and PVCR-BSI. Length of stay was 4.83 days in patients with a PVC but without PVCR-BSI and 9.85 days in patients with a PVC and PVCR-BSI. The microorganism profile of these infections was 58% gram-negative bacteria: Escherichia coli (16%), Klebsiella spp (11%), Pseudomonas aeruginosa (6%), Enterobacter spp (4%), and others (20%), including Serratia marcescens. Staphylococcus aureus was the predominant gram-positive bacterium (12%).
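For readers unfamiliar with incidence-density rates, the headline figure above is simply events divided by device days, scaled to 1,000 days. A minimal sketch of that arithmetic, using the figures reported in the abstract (the helper name is ours):

```python
def rate_per_1000_device_days(events: int, device_days: int) -> float:
    """Incidence density: infections per 1,000 device days."""
    return events / device_days * 1000

pvc_days = 743_508   # short-term PVC days followed (from the abstract)
pvcr_bsis = 1_789    # PVCR-BSIs identified (from the abstract)
print(f"{rate_per_1000_device_days(pvcr_bsis, pvc_days):.2f} per 1,000 PVC days")  # 2.41
```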
Conclusions:
PVCR-BSI rates in INICC ICUs were much higher than rates published from industrialized countries. Infection prevention programs must be implemented to reduce the incidence of PVCR-BSIs in resource-limited countries.
Associations between different forms of malnutrition and environmental conditions, including water, sanitation and hygiene (WASH), may contribute towards persistently poor child health, growth and cognitive development. Experiencing poor nutrition in utero or during early childhood is furthermore associated with chronic diseases later in life. The primary responsibility for the provision of water and sanitation, as a basic service and human right, lies with the State; however, a number of stakeholders are involved. The situation is most critical in sub-Saharan Africa (SSA), where, in 2015, 311 million people lacked a safe water source and >70% of SSA populations were living without adequate sanitation. The aim of this paper was to conduct a systematic review of the literature on WASH, its governance, and its association with nutritional status in children from birth to 5 years of age in SSA. Articles published between 1990 and 2017 were sourced from the PubMed Central, Science Direct and ProQuest Social Science databases. The PRISMA Statement was followed, and this systematic review is registered with PROSPERO (CRD42017071700). The search terms returned 15,351 articles for screening, and 46 articles were included. This is indicative of a limited body of knowledge; however, the number of publications on this topic has been increasing, suggesting a burgeoning field of interest. Targeted research on the governance of WASH, identifying the various role players and stakeholders at various levels and understanding the policy environment in relation to particular health-related outcomes, is imperative to address the burden of child undernutrition.
Chalcopyrite quantum dots (QDs) have emerged as a safe alternative to cadmium-based QDs for bio-applications. However, the toxicity of AgInS2 chalcopyrite QDs has not been widely explored. Herein, we report a synthesis of biocompatible AgInS2/ZnS QDs via a greener approach. The emission intensity of the as-synthesized AgInS2 core QDs was enhanced 2-fold after ZnS shell growth. X-ray diffraction revealed the tetragonal crystal structure of the QDs, and high-resolution transmission electron microscope images showed that the QDs are spherical in shape and crystalline in nature. Cell viability assays conducted on different cell lines, such as HeLa, A549, and BHK-21 cells, indicated that the AgInS2/ZnS QDs showed minimal toxicity at QD concentrations of up to 100 µg/mL. Fluorescence microscope analysis of A549 cells incubated with AgInS2/ZnS QDs showed that the QDs accumulated in the cell membranes. The as-synthesized AgInS2/ZnS QDs are less toxic and eco-friendly, and can be used for biolabeling.
To better understand hepatitis C virus (HCV) epidemiology in Punjab state, India, we estimated the distribution of HCV antibody positivity (anti-HCV+) using a 2013–2014 HCV household seroprevalence survey. Household anti-HCV+ clustering was investigated (a) by individual-level multivariable logistic regression and (b) by comparing the observed frequency of households with multiple anti-HCV+ persons against the expected, simulated frequency assuming anti-HCV+ persons are randomly distributed. Village/ward-level clustering was investigated similarly. We estimated household-level associations between exposures and the number of anti-HCV+ members in a household (N = 1593 households) using multivariable ordered logistic regression. Anti-HCV+ prevalence was 3.6% (95% confidence interval 3.0–4.2%). Individual-level regression (N = 5543 participants) found an odds ratio of 3.19 (2.25–4.50) for someone being anti-HCV+ if another household member was anti-HCV+. Thirty households surveyed had ⩾2 anti-HCV+ members, whereas 0/1000 simulations (P < 0.001) had ⩾30 such households. Excess village-level clustering was also evident: 10 villages had ⩾6 anti-HCV+ members, which occurred in 31/1000 simulations (P = 0.031). The household-level model indicated that the number of household members, living in southern Punjab, a lower socio-economic score, and a higher proportion of members having ever used opium/bhuki were associated with a household's number of anti-HCV+ members. Anti-HCV positivity clusters within households and villages in Punjab, India. These data should be used to inform screening efforts.
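The household-clustering test described above compares the observed number of multi-positive households against simulations in which anti-HCV+ individuals are placed at random. A minimal sketch of that permutation logic follows; the household sizes and counts are toy values, not the study's data:

```python
import random
from collections import Counter

def multi_positive_households(household_ids, n_positive):
    """Randomly mark n_positive people as anti-HCV+ and count households
    that end up with >= 2 positive members."""
    sampled = random.sample(household_ids, n_positive)  # households of the positives
    return sum(1 for c in Counter(sampled).values() if c >= 2)

# Toy population: 1,500 households of 4 people each (illustrative only).
household_ids = [i // 4 for i in range(6000)]
n_positive = 200        # illustrative number of anti-HCV+ people
observed = 30           # observed multi-positive households (from the abstract)

sims = [multi_positive_households(household_ids, n_positive) for _ in range(1000)]
p_value = sum(s >= observed for s in sims) / len(sims)
print(f"P(simulated count >= {observed}) = {p_value:.3f}")
```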
We herein report the detection of folic acid (FA) via a fluorometric method using water-soluble AgInS2 quantum dots (QDs). Optical analysis showed that the addition of FA to AgInS2 QDs results in a significantly blue-shifted photoluminescence emission. A linear plot of the blueshift in the photoluminescence peak position against FA concentration was obtained in the range of 0.03–33 µM, with a detection limit of 52 nM. An interference study showed selective detection of FA in the presence of other biomolecules. The as-synthesized AgInS2 QDs can be employed as an optical sensor for the rapid detection of FA in aqueous solutions.
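A common way to turn such a linear calibration into a detection limit is the 3σ/slope convention; the abstract does not state which convention was used, so the sketch below, with entirely illustrative numbers, is an assumption rather than the authors' procedure:

```python
import numpy as np

# Hypothetical calibration data: PL blueshift (nm) vs FA concentration (uM).
conc = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 33.0])   # illustrative
shift = np.array([0.1, 0.3, 0.8, 2.7, 8.1, 26.5, 87.0])   # illustrative

slope, intercept = np.polyfit(conc, shift, 1)   # linear fit of shift vs conc
sigma_blank = 0.05                              # assumed SD of blank readings (nm)
lod_uM = 3 * sigma_blank / slope                # 3-sigma/slope detection limit
print(f"slope = {slope:.2f} nm/uM, LOD ~ {lod_uM * 1000:.0f} nM")
```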
Introduction: Needle-related procedures are considered the most important source of pain and distress in children in hospital settings. Time constraints, heavy workloads, and busy, noisy environments are barriers to the use of available interventions for pain management during needle-related procedures. Therefore, a rapid, easy-to-use intervention could improve procedural pain management practices. The objective was to determine whether a device combining cold and vibration (Buzzy) is non-inferior (no worse) to a topical anesthetic (Maxilene) for pain management in children undergoing needle-related procedures in the Emergency Department (ED). Methods: This study was a randomized, controlled, non-inferiority trial. We enrolled children aged 4-17 years presenting to the ED and requiring a needle-related procedure. Participants were randomly assigned to the Buzzy or Maxilene group. The primary outcome was the mean difference in pain intensity during the procedure, as measured with the CAS (0-10). Secondary outcomes were procedural distress, success of the procedure at first attempt, and parental satisfaction. Results: A total of 352 participants were enrolled and 346 were randomized (Buzzy = 172; Maxilene = 174). The mean difference in procedural pain scores between groups was 0.64 (95% CI, -0.1 to 1.3); because the upper limit exceeded the non-inferiority margin of 0.70, non-inferiority of the Buzzy device to Maxilene was not demonstrated. No significant differences were observed for procedural distress (p = .370) or success of the procedure at first attempt (p = .602). Parents in both groups were very satisfied (Buzzy = 7.8 ±2.66; Maxilene = 8.1 ±2.4), with no significant difference between groups (p = .236). Conclusion: Non-inferiority of the Buzzy device relative to a topical anesthetic was not demonstrated for pain management of children during needle-related procedures in the ED. However, considering that topical anesthetics are underused in the ED setting and require time, the Buzzy device seems a promising alternative: it is rapid, low-cost, easy to use and reusable.
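The non-inferiority reasoning above comes down to a single comparison: the intervention is declared non-inferior only if the upper limit of the confidence interval for the mean difference stays below the prespecified margin. A minimal sketch using the figures reported in the abstract:

```python
def non_inferior(ci_upper: float, margin: float) -> bool:
    """Non-inferiority holds only if the entire CI lies below the margin."""
    return ci_upper < margin

mean_diff = 0.64     # Buzzy minus Maxilene pain score difference (from the abstract)
ci = (-0.1, 1.3)     # 95% CI (from the abstract)
margin = 0.70        # prespecified non-inferiority margin
print(non_inferior(ci[1], margin))  # False: 1.3 > 0.70, so not demonstrated
```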
Increasingly, demands are placed on healthcare systems to meet antimicrobial stewardship standards and reporting requirements. This trend, combined with reduced financial and personnel resources, has created a need to adopt information technology (IT) to help ease these burdens and facilitate action. Incorporating IT into an antimicrobial stewardship program can improve the efficiency of stewardship interventions and facilitate the tracking and reporting of key metrics, including outcomes. This paper reviews the stewardship-related functionality of available IT systems, describes how these platforms can be used to improve antimicrobial use, and identifies how they can support current and potential future antimicrobial stewardship regulatory and accreditation standards. Finally, recommendations to help close the gaps in existing systems are provided, and suggestions for future areas of development within these programs are delineated.
In dairy cattle, resistance, tolerance and resilience refer to an animal's ability to adapt to a broad range of environmental conditions, implying stable performance (e.g. production level, fertility status) independent of disease or infection pressure. All three mechanisms (resistance, tolerance and resilience) contribute to overall robustness, motivating the evaluation of phenotyping and breeding strategies for improved robustness in dairy cattle populations. Classically, breeding approaches for improved robustness rely on simple production traits in combination with detailed environmental descriptors and enhanced statistical modelling to infer possible genotype-by-environment interactions. In this regard, innovative environmental descriptors include heat-stress indicators, and statistical modelling has focussed on random regression or reaction norm methodology. A robust animal has high breeding values over a broad spectrum of environmental levels. In recent years, direct health traits have been included in selection indices, implying advances in genetic evaluations for traits linked to resistance or tolerance against infectious and non-infectious diseases. Up to now, genetic evaluation for health traits has been based primarily on subjectively measured producer-recorded data, with disease trait heritabilities in a low-to-moderate range. Thus, it is imperative to identify objectively measurable phenotypes as suitable biomarkers. New technologies (e.g. mid-infrared spectrometry) offer possibilities to determine potential biomarkers via laboratory analyses. Novel biomarkers include measurable physiological traits (e.g. serum metabolites, hormone levels) as indicators of a current infection or of the host's reaction to environmental stressors. The rumen microbiome composition has been proposed as a biomarker to detect interactions between host genotype and environmental effects. The understanding of host genetic variation in disease resistance and individual expression of robustness encourages analyses of the underlying immune response (IR) system. Recent advances have been made in inferring the genetic background of IR traits and cows' immunological competence in relation to functional and production traits. Thus, a final aspect of this review addresses the genetic background and current state of genetic control for resistance to economically relevant infectious and non-infectious dairy cattle diseases, considering immune-related factors.
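To make the reaction-norm idea above concrete, a minimal linear reaction-norm model can be sketched as follows (the notation is ours, not the review's):

```latex
y_{ij} = \mu + a_{0i} + a_{1i}\,E_j + e_{ij}
```

Here y_{ij} is the performance of animal i in environment j, E_j is an environmental descriptor (e.g. a heat-stress index), a_{0i} is the animal's genetic intercept (its merit in the average environment), a_{1i} is its genetic slope (environmental sensitivity), and e_{ij} is a residual. Under this model, a robust animal combines a high a_{0i} with an a_{1i} close to zero.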
The house mouse (Mus musculus) and the black rat (Rattus rattus) are reservoir hosts for zoonotic pathogens, several of which cause neglected tropical diseases (NTDs). Studies of the prevalence of these NTD-causing zoonotic pathogens in house mice and black rats from tropical residential areas are scarce. Three hundred and two house mice and 161 black rats were trapped in 2013 from two urban neighbourhoods and a rural village in Yucatan, Mexico, and subsequently tested for Trypanosoma cruzi, Hymenolepis diminuta and Leptospira interrogans. Using the polymerase chain reaction, we detected T. cruzi DNA in the hearts of 4.9% (8/165) of house mice and 6.2% (7/113) of black rats. We applied the sedimentation technique to detect eggs of H. diminuta in 0.5% (1/182) of house mice and 14.2% (15/106) of black rats. Through the immunofluorescent imprint method, L. interrogans was identified in 0.9% (1/106) of rat kidney impressions. Our results suggest that the black rat could be an important reservoir for T. cruzi and H. diminuta at the studied sites. Further studies examining seasonal and geographical patterns could increase our knowledge of the epidemiology of these pathogens in Mexico and the risk rodents pose to public health.
One view of major Solar Energetic Particle (SEP) events is that these (proton-dominated) fluxes are accelerated in heliospheric shock sources created by Interplanetary Coronal Mass Ejections (ICMEs) and then travel mainly along the interplanetary magnetic field lines connecting the shock(s) to the observer(s). This places particular emphasis on the role of heliospheric conditions during the event, requiring a realistic description of those conditions to interpret and/or model SEP events. The well-known ENLIL heliospheric simulation, with ICME shocks generated by the cone model, is used together with the SEPMOD particle event modeling scheme to demonstrate the value of applying these concepts at multiple inner heliosphere sites.
Introduction: Appropriate pain management relies on the use of valid, reliable and age-appropriate tools that are validated in the setting in which they are intended to be used. The aim of the study was to assess the psychometric properties of pain scales commonly used in children presenting to the pediatric emergency department (PED) with an acute musculoskeletal injury. Methods: Convergent validity was assessed by determining Spearman's correlations and the agreement, using the Bland-Altman method, between the Visual Analogue Scale (VAS), the Faces Pain Scale-Revised (FPS-R) and the Color Analogue Scale (CAS). Responsiveness to change was determined by performing the Wilcoxon signed-rank test on the pre- and post-analgesia mean scores. Reliability of the scales was estimated using relative (Spearman's correlation, Intraclass Correlation Coefficient [ICC]) and absolute (Coefficient of Reliability [CR]) indices. Results: A total of 495 participants were included in the analyses. Mean age was 11.9 ±2.7 years, and participants were mainly boys (55.3%). Correlations between each pair of scales were 0.79 (VAS/FPS-R), 0.92 (VAS/CAS) and 0.81 (CAS/FPS-R). Limits of agreement (80% CI) were -2.71 to 1.27 (VAS/FPS-R), -1.13 to 1.15 (VAS/CAS) and -1.45 to 2.61 (CAS/FPS-R). Responsiveness to change was demonstrated by significant differences in mean pain scores on all three scales between pre- and post-medication administration (p < 0.0001). ICC and CR estimates suggested acceptable reliability for the three scales: 0.79 and ±1.49 for the VAS, 0.82 and ±1.35 for the CAS, and 0.76 and ±1.84 for the FPS-R. Conclusion: The scales demonstrated good psychometric properties in a large sample of children with acute pain in the PED. The VAS and CAS showed stronger convergent validity, while the FPS-R was not in agreement with the other scales. Clinically, the VAS and CAS can be used interchangeably to assess pain intensity in children with acute pain.
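For readers unfamiliar with the Bland-Altman method used above, it computes the mean of the paired differences (the bias) and limits of agreement around it; z = 1.28 gives roughly 80% limits, matching the 80% CI reported, while z = 1.96 gives the conventional 95% limits. A minimal sketch with toy scores (not the study's data):

```python
import numpy as np

def bland_altman(a, b, z=1.28):
    """Bias and ~80% limits of agreement between two paired scales (z=1.96 for 95%)."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - z * sd, bias + z * sd)

# Toy paired 0-10 scores; the study compared VAS, CAS and FPS-R in 495 children.
vas = [3.0, 5.5, 7.0, 4.0, 8.5, 6.0]
cas = [3.2, 5.0, 7.4, 4.1, 8.0, 6.3]
bias, (lower, upper) = bland_altman(vas, cas)
print(f"bias = {bias:.2f}, limits of agreement = ({lower:.2f}, {upper:.2f})")
```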
Introduction: In Ottawa, STEMI patients are transported directly for percutaneous coronary intervention (PCI) by advanced care paramedics (ACPs) or primary care paramedics (PCPs), or are transferred from a PCP to an ACP crew (ACP-intercept). PCPs have a limited skill set to address complications during transport. The objective of this study was to determine what clinically important events (CIEs) occurred in STEMI patients transported for primary PCI by a PCP crew, and what proportion of such events could only be treated by ACP protocols. Methods: We conducted a health record review of STEMI patients transported for primary PCI from Jan 1, 2011 to Dec 21, 2015. Ottawa has a single PCI center, and its EMS system employs both PCP and ACP paramedics. We identified consecutive STEMI bypass patients transported by PCP-only and ACP-intercept crews using the dispatch database. A data extraction form was piloted and used to extract patient demographics, transport times, the primary outcomes (CIEs and interventions performed during transport), and the secondary outcomes (hospital diagnosis and mortality). CIEs were reviewed by two investigators to determine whether they would be treated differently under ACP protocols. We present descriptive statistics. Results: We identified 967 STEMI bypass cases, among which 214 (118 PCP-only and 96 ACP-intercept) met all inclusion criteria. Characteristics were: mean age 61.4 years, 78% male, 31.8% anterior and 44.4% inferior infarcts, mean response time 6 min, total paramedic contact time 29 min, and, in cases of ACP-intercept, 7 min of PCP-only contact time. A CIE occurred in 127 (59%) of cases: SBP <90 mmHg in 26.2%, HR <60 in 30.4%, HR >100 in 20.6%, malignant arrhythmias in 7.5%, altered mental status in 6.5%, and airway intervention in 2.3%; 2 patients (0.9%) arrested, and both survived. Of the CIEs identified, 54 (42.5%) could be addressed differently under ACP vs PCP protocols (25.2% of total cases). The majority related to fluid boluses for hypotension (44 cases; 35% of CIEs). ACP intervention for CIEs within the ACP-intercept group was 51.6%. There were 6 in-hospital deaths (2.8%), with no difference by transport crew type. Conclusion: CIEs are common in STEMI bypass patients; however, a smaller proportion of such CIEs would be addressed differently under ACP protocols compared with PCP protocols. The vast majority of CIEs appeared to be transient and of limited clinical significance.
Antimicrobial stewardship programs (ASPs) effectively optimize antibiotic use for inpatients; however, the extent of emergency department (ED) involvement in ASPs has not been described.
OBJECTIVE
To determine current ED involvement in children’s hospital ASPs and to assess beliefs and preferred methods of implementation for ED-based ASPs.
METHODS
A cross-sectional survey of 37 children’s hospitals participating in the Sharing Antimicrobial Resistance Practices collaboration was conducted. Surveys were distributed to ASP leaders and ED medical directors at each institution. Items assessed included beliefs regarding ED antibiotic prescribing, ED prescribing resources, ASP methods used in the ED such as clinical decision support and clinical care guidelines, ED participation in ASP activities, and preferred methods for ED-based ASP implementation.
RESULTS
A total of 36 ASP leaders (97.3%) and 32 ED directors (86.5%) responded; the overall response rate was 91.9%. Most ASP leaders (97.8%) and ED directors (93.7%) agreed that creation of ED-based ASPs was necessary. ED resources for antibiotic prescribing were obtained via the Internet or electronic health records (EHRs) for 29 hospitals (81.3%). The main ASP activities for the ED included production of antibiograms (77.8%) and creation of clinical care guidelines for pneumonia (83.3%). The ED was represented on 3 hospital ASP committees (8.3%). No hospital ASPs actively monitored outpatient ED prescribing. Most ASP leaders (77.8%) and ED directors (81.3%) preferred implementation of ED-based ASPs using clinical decision support integrated into the EHR.
CONCLUSIONS
Although ED involvement in ASPs is limited, both ASP and ED leaders believe that ED-based ASPs are necessary. Many children’s hospitals have the capability to implement ED-based ASPs via the preferred method: EHR clinical decision support.
Background: Painful diabetic neuropathy (PDN) is a frequent complication of diabetes mellitus. Current treatment recommendations are based on short-term trials, generally of ≤3 months' duration. Limited data are available on the long-term outcomes of this chronic disease. The objective of this study was to determine the long-term clinical effectiveness of the management of chronic PDN at tertiary pain centres. Methods: From a prospective observational cohort study of patients with chronic neuropathic non-cancer pain recruited from seven Canadian tertiary pain centres, 60 patients diagnosed with PDN were identified for analysis. Data were collected according to Initiative on Methods, Measurement, and Pain Assessment in Clinical Trials guidelines, including the Brief Pain Inventory. Results: At 12-month follow-up, 37.2% (95% confidence interval [CI], 23.0-53.3) of 43 patients with complete data had achieved pain reduction of ≥30%, 51.2% (95% CI, 35.5-66.7) had achieved functional improvement with a reduction of ≥1 on the Pain Interference Scale (0-10, Brief Pain Inventory), and 30.2% (95% CI, 17.2-46.1) had achieved both. Symptom management included at least two medication classes in 55.3% of patients and three medication classes in 25.5% (opioids, antidepressants, anticonvulsants). Conclusions: Almost one-third of patients managed for PDN in a tertiary care setting achieve meaningful long-term improvements in pain and function. Polypharmacy, including analgesic antidepressants and anticonvulsants, was the mainstay of effective symptom management.
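As a check on the interval arithmetic above: 37.2% of 43 patients corresponds to 16 responders, and an exact (Clopper-Pearson) binomial interval reproduces the reported 23.0-53.3% closely. The abstract does not state which CI method was used, so the sketch below is a plausible reconstruction rather than the authors' computation:

```python
from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact (Clopper-Pearson) confidence interval for a binomial proportion."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

lower, upper = clopper_pearson(16, 43)                    # 16/43 responders
print(f"{16/43:.1%} (95% CI {lower:.1%}-{upper:.1%})")    # ~37.2% (23.0%-53.3%)
```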
Urban slum environments in the tropics are conducive to the proliferation and spread of rodent-borne zoonotic pathogens to humans. Calodium hepaticum (Bancroft, 1893) is a zoonotic nematode known to infect a variety of mammalian hosts, including humans. Norway rats (Rattus norvegicus) are considered the most important mammalian host of C. hepaticum and are therefore a potentially useful species for estimating the risk to humans living in urban slum environments. There is a lack of studies systematically evaluating the demographic and environmental factors that influence both carriage and intensity of infection of C. hepaticum in rodents from urban slum areas in tropical regions. Carriage and intensity of infection of C. hepaticum were studied in 402 Norway rats over a 2-year period in an urban slum in Salvador, Brazil. Overall prevalence in Norway rats was 83% (337/402). Independent risk factors for C. hepaticum carriage in R. norvegicus were age and valley of capture. Of those infected, the proportion with gross liver involvement (i.e. >75% of the liver affected, a proxy for a high intensity of infection) was low (8%, 26/337). Sixty soil samples were collected from ten locations to estimate levels of environmental contamination and provide information on the potential risk to humans of contracting C. hepaticum from the environment. Sixty percent (6/10) of the sites were contaminated with C. hepaticum. High carriage levels of C. hepaticum in Norway rats and sub-standard living conditions in slum areas may increase the risk of human exposure to the infective eggs of C. hepaticum. This study supports the need for further work to assess whether humans are becoming infected within this community and whether C. hepaticum poses a significant risk to human health.
There is increasing interest in the link between early linguistic skills and later language development. In a longitudinal study, we investigated how infants' (a) ability to use speech sound categories to guide word learning in the habituation-based minimal pair switch task, and (b) early productive vocabulary, related to their concurrent and later language task performance. The participants at Phase 1 were 64 infants aged 16–24 months (25 with familial risk of language/speech impairment), followed up at 27 months (Phase 2) and at 3 years (Phase 3). Phase 1 productive vocabulary was correlated with Phase 2 productive vocabulary, with concurrent and later (Phase 3) language production and comprehension scores (standardized tool), and with phonology. Phase 1 switch task performance was correlated with concurrent productive vocabulary and language production scores, but not by Phase 3. However, a combination of an early low vocabulary score and a preference for looking at an already-habituated word–object combination in the switch task may show some promise as an identifier for early speech–language intervention. We discuss how these relations can help us better understand the foundations of word learning.