Stop the Bleed (STB) is a national initiative that provides lifesaving hemorrhage control education. In 2019, pharmacists were added as health-care personnel eligible to become STB instructors. This study was conducted to evaluate the efficacy of pharmacist-led STB trainings for school employees in South Texas.
Pharmacist-led STB trainings were provided to teachers and staff in Laredo, Texas. The 60-min trainings included a presentation followed by hands-on practice of tourniquet application, wound-packing, and direct pressure application. Training efficacy was assessed through anonymous pre- and postevent surveys, which evaluated changes in knowledge, comfort level, and willingness to assist in hemorrhage control interventions. Student volunteers (predominantly pharmacy and medical students) assisted in leading the hands-on portion, providing a unique interprofessional learning opportunity.
Participants with previous training (N = 98) were excluded, resulting in a final cohort of 437 (response rate 87.4%). Compared with baseline, comfort level using tourniquets (mean, 3.17/5 vs 4.20/5; P < 0.0001), opinion regarding tourniquet safety (2.59/3 vs 2.94/3; P < 0.0001), and knowledge regarding tourniquets (70.86/100 vs 75.84/100; P < 0.0001) and proper tourniquet placement (2.40/4 vs 3.15/4; P < 0.0001) significantly improved.
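A minimal sketch of the kind of pre/post comparison reported above, using simulated ratings. The abstract does not state which test was used; an unpaired t-test is assumed here because anonymous surveys cannot be matched respondent-to-respondent, and all data below are invented to match the reported means.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 437  # final cohort size from the study

# Hypothetical 1-5 comfort-level ratings; means chosen to match the reported values
pre = np.clip(rng.normal(3.17, 0.9, n), 1, 5)
post = np.clip(rng.normal(4.20, 0.7, n), 1, 5)

# Unpaired t-test (assumed; anonymous pre/post responses are not linkable)
t_stat, p_value = stats.ttest_ind(post, pre)
print(f"pre mean={pre.mean():.2f}, post mean={post.mean():.2f}, p={p_value:.1e}")
```

With a mean difference of roughly one point on a five-point scale and over 400 responses per survey, the resulting p-value falls well below the 0.0001 threshold reported.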
Pharmacist-led STB trainings are efficacious in increasing school workers' knowledge of, and willingness to respond to, a hemorrhagic emergency.
This is a cross-sectional study aiming to understand the early characteristics and background of bone health impairment in clinically well children with Fontan circulation.
We enrolled 10 clinically well children with Fontan palliation (operated >5 years before study entrance, Tanner stage ≤3, age 12.1 ± 1.77 years, 7 males) and 11 healthy controls (age 12.0 ± 1.45 years, 9 males) at two children’s hospitals. All patients underwent peripheral quantitative CT. For the Fontan group, we obtained clinical characteristics, NYHA class, cardiac index by MRI, dual-energy x-ray absorptiometry, and biochemical studies. Linear regression was used to compare radius and tibia peripheral quantitative CT measures between Fontan patients and controls.
All Fontan patients were clinically well (NYHA class 1 or 2, cardiac index 4.85 ± 1.51 L/min/m²) and without significant comorbidities. Adjusted trabecular bone mineral density, cortical thickness, and bone strength index at the radius were significantly decreased in Fontan patients compared to controls, with mean differences of −30.13 mg/cm³ (p = 0.041), −0.31 mm (p = 0.043), and −6.65 mg²/mm⁴ (p = 0.036), respectively. No differences were found for tibial measures. In Fontan patients, the mean height-adjusted lumbar bone mineral density and total body less head z scores were −0.46 ± 1.1 and −0.63 ± 1.1, respectively, which are below average but within the normal range for age and sex.
In a clinically well Fontan cohort, we found significant bone deficits by peripheral quantitative CT in the radius but not the tibia, suggesting non-weight-bearing bones may be more vulnerable to the unique haemodynamics of the Fontan circulation.
Congenital heart defects (CHDs) occur in 8 of 1000 live-born children, making them common birth defects in the adolescent population. CHDs may have single gene, chromosomal, or multifactorial causes. Despite evidence that patients with CHD want information on heritability and genetics, no studies have investigated the interest or knowledge base in the adolescent population. This information is necessary as patients in adolescence take greater ownership of their health care and discuss reproductive risks with their physicians. The objectives of this survey-based study were to determine adolescents’ recall of their own heart condition, to assess patient and parent perception of the genetic contribution to the adolescent’s CHD, and to obtain information about the preferred method(s) for education. The results show that adolescent patients had good recall of their type of CHD. Less than half of adolescents and parents believed their CHD had a genetic basis or was heritable; however, adolescents with a positive family history of CHD were more likely to believe that their condition was genetic (p = 0.0005). The majority of patients were interested in receiving additional genetics education and preferred education in-person and in consultation with both parents and a physician. The adolescents who felt most competent to have discussions with their doctors regarding potential causes of their heart defect previously had a school science course which covered topics in genetics. These results provide insight into adolescents’ perceptions and understanding about their CHD and genetic risk and may inform the creation and provision of additional genetic education.
Alcohol and cannabis remain the substances most widely used by adolescents. Better understanding of the dynamic relationship between trajectories of substance use in relation to neuropsychological functioning is needed. The aim of this study was to examine the different impacts of within- and between-person changes in alcohol and cannabis use on neuropsychological functioning over multiple time points.
Hierarchical linear modeling examined the effects of alcohol and cannabis use on neuropsychological functioning over the course of 14 years in a sample of 175 adolescents (aged 12–15 years at baseline).
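The within- vs between-person decomposition used in such hierarchical models is typically implemented by person-mean centering the predictor. The sketch below is an illustrative simulation only (variable names, wave count, and effect sizes are invented; this is not the study's data or model):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_waves = 175, 6  # cohort size from the study; wave count is illustrative

df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_subj), n_waves),
    "use": rng.gamma(2.0, 2.0, n_subj * n_waves),  # hypothetical percent days of use
})
# Disaggregate: between-person effect = person mean; within-person = deviation
df["use_between"] = df.groupby("pid")["use"].transform("mean")
df["use_within"] = df["use"] - df["use_between"]
# Simulate a cognitive score that declines with both components
df["score"] = (50 - 0.8 * df["use_between"] - 0.5 * df["use_within"]
               + rng.normal(0, 5, len(df)))

# Random intercept per participant, as in hierarchical linear modeling
model = smf.mixedlm("score ~ use_within + use_between", df, groups=df["pid"]).fit()
print(model.params[["use_within", "use_between"]])
```

The two fixed-effect coefficients separate time-specific fluctuations around a person's own average (within-person) from stable differences between people's average use levels (between-person), which is the contrast the study draws.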
Time-specific fluctuations in alcohol use (within-person effect) predicted worse performance across time on the Wechsler Abbreviated Scale of Intelligence Block Design subtest (B = −.05, SE = .02, p = .01). Greater mean levels of percent days of cannabis use across time (between-person effect) were associated with an increased contrast score between Delis–Kaplan Executive Function System Color Word Inhibition and Color Naming conditions (B = .52, SE = .14, p < .0001) and poorer performance over time on Block Design (B = −.08, SE = .04, p = .03). Neither alcohol nor cannabis use over time was associated with performance in the verbal memory and processing speed domains.
Greater cumulative cannabis use over adolescence may be linked to poorer inhibitory control and visuospatial functioning performance, whereas more proximal increases in alcohol consumption during adolescence may drive alcohol-related performance decrements in visuospatial functioning. Results from this prospective study add to the growing body of literature on the impact of alcohol and cannabis use on cognition from adolescence to young adulthood.
Maternal input influences language development in children with Down syndrome (DS) and typical development (TD). Telegraphic input, or simplified input violating English grammatical rules, is controversial in speech–language pathology, yet no research to date has investigated whether mothers of children with DS use telegraphic input. This study investigated the quality of linguistic input to children with DS compared to age-matched children with TD, and the relationship between maternal input and child language abilities. Mothers of children with DS simplified their input in multiple ways, by using a lower lexical diversity, shorter utterances, and more telegraphic input compared to mothers of children with TD. Telegraphic input was not significantly correlated with other aspects of maternal input or child language abilities. Since children with DS demonstrate specific deficits in grammatical compared to lexical abilities, future work should investigate the long-term influence of maternal telegraphic input on language development in children with DS.
Coronary ostial atresia, seen with pulmonary atresia and coronary-cameral fistulae or, more rarely, in isolation as left main coronary artery atresia, is well described. We describe the clinical course and post-mortem findings in a neonate who suffered a fatal cardiac arrest and was found to have congenital absence of both coronary ostia in a single/common coronary system.
Children with congenital heart disease are at high risk for malnutrition. Standardisation of feeding protocols has shown promise in decreasing some of this risk. With little standardisation between institutions’ feeding protocols and no understanding of protocol adherence, it is important to analyse the efficacy of individual aspects of the protocols.
Adherence to and deviation from a feeding protocol in high-risk congenital heart disease patients between December 2015 and March 2017 were analysed. Associations between adherence to and deviation from the protocol and clinical outcomes were also assessed. The primary outcome was change in weight-for-age z score between time intervals.
Increased adherence to and decreased deviation from individual instructions of a feeding protocol improved patients’ change in weight-for-age z score between birth and hospital discharge (p = 0.031). Secondary outcomes such as markers of clinical severity and nutritional delivery were not statistically different between groups with high or low adherence or deviation rates.
High-risk feeding protocol adherence and fewer deviations are associated with weight gain independent of their influence on nutritional delivery and caloric intake. Future studies assessing the efficacy of feeding protocols should include the measures of adherence and deviations that are not merely limited to caloric delivery and illness severity.
Anxiety symptoms gradually emerge during childhood and adolescence. Individual differences in behavioral inhibition (BI), an early-childhood temperament, may shape developmental paths through which these symptoms arise. Cross-sectional research suggests that level of early-childhood BI moderates associations between later anxiety symptoms and threat-related amygdala–prefrontal cortex (PFC) circuitry function. However, no study has characterized these associations longitudinally. Here, we tested whether level of early-childhood BI predicts distinct evolving associations between amygdala–PFC function and anxiety symptoms across development.
Eighty-seven children previously assessed for BI level in early childhood provided data at ages 10 and/or 13 years, consisting of assessments of anxiety and an fMRI-based dot-probe task (including threat, happy, and neutral stimuli). Using linear-mixed-effects models, we investigated longitudinal changes in associations between anxiety symptoms and threat-related amygdala–PFC connectivity, as a function of early-childhood BI.
In children with a history of high early-childhood BI, anxiety symptoms became, with age, more negatively associated with right amygdala–left dorsolateral-PFC connectivity when attention was to be maintained on threat. In contrast, with age, low-BI children showed an increasingly positive anxiety–connectivity association during the same task condition. Behaviorally, at age 10, anxiety symptoms did not relate to fluctuations in attention bias (attention bias variability, ABV) in either group; by age 13, low-BI children showed a negative anxiety–ABV association, whereas high-BI children showed a positive anxiety–ABV association.
Early-childhood BI levels predict distinct neurodevelopmental pathways to pediatric anxiety symptoms. These pathways involve distinct relations among brain function, behavior, and anxiety symptoms, which may inform diagnosis and treatment.
Collaborative quality improvement and learning networks have improved healthcare quality and value across specialities. Motivated by these successes, the Pediatric Acute Care Cardiology Collaborative (PAC3) was founded in late 2014 with an emphasis on improving outcomes of paediatric cardiology patients within cardiac acute care units; acute care encompasses all hospital-based inpatient non-intensive care. PAC3 aims to deliver higher quality and greater value care by facilitating the sharing of ideas and building alignment among its member institutions. These aims are intentionally aligned with the work of other national clinical collaborations, registries, and parent advocacy organisations. The mission and early work of PAC3 is exemplified by the formal partnership with the Pediatric Cardiac Critical Care Consortium (PC4), as well as the creation of a clinical registry, which links with the PC4 registry to track practices and outcomes across the entire inpatient encounter from admission to discharge. Capturing the full inpatient experience allows detection of outcome differences related to variation in care delivered outside the cardiac ICU and development of benchmarks for cardiac acute care. We aspire to improve patient outcomes such as morbidity, hospital length of stay, and re-admission rates, while working to advance patient and family satisfaction. We will use quality improvement methodologies consistent with the Model for Improvement to achieve these aims. Membership currently includes 36 centres across North America, of which 26 are also members of PC4. In this report, we describe the development of PAC3, including the philosophical, organisational, and infrastructural elements that will enable a paediatric acute care cardiology learning network.
Transcatheter right ventricle decompression in neonates with pulmonary atresia and intact ventricular septum is technically challenging, with risk of cardiac perforation and death. Further, despite successful right ventricle decompression, re-intervention on the pulmonary valve is common. The association between technical factors during right ventricle decompression and the risks of complications and re-intervention are not well described.
This is a multicentre retrospective study among the participating centres of the Congenital Catheterization Research Collaborative. Between 2005 and 2015, all neonates with pulmonary atresia and intact ventricular septum and attempted transcatheter right ventricle decompression were included. Technical factors evaluated included the use and characteristics of radiofrequency energy, maximal balloon-to-pulmonary valve annulus ratio, infundibular diameter, and right ventricle systolic pressure pre- and post-balloon pulmonary valvuloplasty (BPV). The primary end point was cardiac perforation or death; the secondary end point was re-intervention.
A total of 99 neonates underwent transcatheter right ventricle decompression at a median of 3 days (IQR 2–5) of age, including 63 patients by radiofrequency and 32 by wire perforation of the pulmonary valve. There were 32 complications including 10 (10.5%) cardiac perforations, of which two resulted in death. Cardiac perforation was associated with the use of radiofrequency (p=0.047), longer radiofrequency duration (3.5 versus 2.0 seconds, p=0.02), and higher maximal radiofrequency energy (7.5 versus 5.0 J, p<0.01) but not with patient weight (p=0.09), pulmonary valve diameter (p=0.23), or infundibular diameter (p=0.57). Re-intervention was performed in 36 patients and was associated with higher post-intervention right ventricle pressure (median 60 versus 50 mmHg, p=0.041) and residual valve gradient (median 15 versus 10 mmHg, p=0.046), but not with balloon-to-pulmonary valve annulus ratio, atmospheric pressure used during BPV, or the presence of a residual balloon waist during BPV. Re-intervention was not associated with any right ventricle anatomic characteristics, including pulmonary valve diameter.
Technical factors surrounding transcatheter right ventricle decompression in pulmonary atresia and intact ventricular septum influence the risk of procedural complications but not the risk of future re-intervention. Cardiac perforation is associated with the use of radiofrequency energy, as well as radiofrequency application characteristics. Re-intervention after right ventricle decompression for pulmonary atresia and intact ventricular septum is common and relates to haemodynamic measures surrounding initial BPV.
Approximately 30% of patients with schizophrenia experience auditory hallucinations that are refractory to antipsychotic medications. Here, we evaluated the feasibility and efficacy of transcranial alternating current stimulation (tACS) that we hypothesized would improve auditory hallucination symptoms by enhancing synchronization between the frontal and temporo-parietal areas of the left hemisphere.
Twenty-two participants were randomized to one of three arms and received twice-daily, 20-min sessions of sham stimulation, 10 Hz 2 mA peak-to-peak tACS, or 2 mA transcranial direct current stimulation (tDCS) over the course of 5 consecutive days. Symptom improvement was assessed using the Auditory Hallucination Rating Scale (AHRS) as the primary outcome measure. The Positive and Negative Syndrome Scale (PANSS) and the Brief Assessment of Cognition in Schizophrenia (BACS) were secondary outcomes.
Primary and secondary behavioral outcomes were not significantly different between the three arms. However, effect size analyses showed that tACS had the greatest effect on the auditory hallucination scale for the week of stimulation (1.31 for tACS vs 1.06 for sham and 0.17 for tDCS). Effect size analysis for the secondary outcomes revealed heterogeneous results across measures and stimulation conditions.
To our knowledge, this is the first clinical trial of tACS for the treatment of symptoms of a psychiatric condition. Further studies with larger sample sizes are needed to better understand the effect of tACS on auditory hallucinations.
The present study aimed to examine the correlates of fruit and vegetable intake (FVI) separately among parents and their adolescents.
Parents and adolescents completed the Family Life, Activity, Sun, Health, and Eating (FLASHE) survey through the National Cancer Institute. The survey assessed daily intake frequencies of food/beverage groups, psychosocial, parenting and sociodemographic factors. Generalized linear models were run for both parents and adolescents, for a total of six models (three each): (i) sociodemographic characteristics; (ii) psychosocial factors; (iii) parent/caregiver factors.
Parent participants (n 1542) were predominantly 35–59 years old (86 %), female (73 %), non-Hispanic White (71 %) or non-Hispanic Black (17 %), with household income <$US 100 000 (79 %). Adolescents (n 805) were aged 12–14 years (50 %), non-Hispanic White (66 %) and non-Hispanic Black (15 %). Parents consumed 2·9 cups fruits and vegetables (F&V) daily, while adolescents consumed 2·2 cups daily. Educational attainment (those with higher education had greater FVI) and sex (men consumed more than women; all P<0·001) were significant FVI predictors. Parents with greater autonomous and controlled motivation, self-efficacy and preferences for fruit reported higher FVI (all P<0·001). Similarly, adolescents with greater autonomous and controlled motivation, self-efficacy and knowledge reported higher FVI (all P<0·001). Parenting factors of importance were co-deciding how many F&V teens should have, rules, having F&V in the home and cooking meals from scratch (all P<0·05).
Findings suggest factors that impact FVI among parents and their adolescent(s), which highlight the importance of the role of parent behaviour and can inform tailored approaches for increasing FVI in various settings.
Evidence shows that the health of the work environment impacts staff satisfaction, interdisciplinary communication, and patient outcomes. Utilising the American Association of Critical-Care Nurses’ Healthy Work Environment standards, we developed a daily assessment tool.
The Relative Environment Assessment Lens (REAL) Indicator was developed using a consensus-based method to evaluate the health of the work environment and to identify opportunities for improvement from the front-line staff. A visual scale using emoticon-like images was linked with a written description of staff members’ feelings about their work environment that day, with the highest number corresponding to the most positive experience. Face validity was established through staff feedback, and improvement goals were set.
Over 10 months, results from the REAL Indicator in the cardiac catheterisation laboratory indicated an overall good work environment. The goal of 80% of the respondents reporting their work environment to be “Great”, “Good”, or “Satisfactory” was met each month. During the same time frame, this goal was met four times in the cardiovascular operating room. On average, 72.7% of cardiovascular operating room respondents reported their work environment to be “Satisfactory” or better.
The REAL Indicator has become a valuable tool in assessing the specific issues of the clinical area and identifying opportunities for improvement. Given the feasibility of and positive response to this tool in the cardiac catheterisation laboratory, it has been adopted in other patient-care areas where staff and leaders believe that they need to understand the health of the environment in a more specific and frequent time frame.
Influenza A (H1N1) pdm09 became the predominant circulating strain in the United States during the 2013–2014 influenza season. Little is known about the epidemiology of severe influenza during this season.
A retrospective cohort study of severely ill patients with influenza infection in intensive care units in 33 US hospitals from September 1, 2013, through April 1, 2014, was conducted to determine risk factors for mortality present on intensive care unit admission and to describe patient characteristics, spectrum of disease, management, and outcomes.
A total of 444 adults and 63 children were admitted to an intensive care unit in a study hospital; 93 adults (20.9%) and 4 children (6.3%) died. By logistic regression analysis, the following factors were significantly associated with mortality among adult patients: older age (>65 years, odds ratio, 3.1 [95% CI, 1.4–6.9], P=.006 and 50–64 years, 2.5 [1.3–4.9], P=.007; reference age 18–49 years), male sex (1.9 [1.1–3.3], P=.031), history of malignant tumor with chemotherapy administered within the prior 6 months (12.1 [3.9–37.0], P<.001), and a higher Sequential Organ Failure Assessment score (for each increase by 1 in score, 1.3 [1.2–1.4], P<.001).
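The odds ratios quoted above are exponentiated logistic-regression coefficients, so they compound multiplicatively on the odds scale; a quick check of that arithmetic, with values taken from the abstract:

```python
import math

# Reported adjusted odds ratios (from the abstract)
or_sofa_per_point = 1.3   # per 1-point increase in SOFA score
or_chemo = 12.1           # malignancy with chemo in the prior 6 months

# On the log-odds scale the effects are additive: OR = exp(beta)
beta_sofa = math.log(or_sofa_per_point)

# A patient 5 SOFA points higher has exp(5 * beta_sofa) = 1.3**5 times the odds
five_point_or = math.exp(5 * beta_sofa)
print(round(five_point_or, 1))  # ≈ 3.7
```

So a per-point OR of 1.3, though modest-sounding, implies roughly 3.7-fold higher odds of death for a patient admitted with a SOFA score five points higher, comparable in magnitude to the age effect.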
Risk factors for death among US patients with severe influenza during the 2013–2014 season, the first postpandemic season in which influenza A (H1N1) pdm09 was the predominant circulating strain, shifted toward those of a more typical epidemic influenza season.
Infect. Control Hosp. Epidemiol. 2015;36(11):1251–1260
To explore the feasibility of a workplace farmstand programme through the utilization of an online ordering system to build awareness for local food systems, encourage community participation, and increase local fruit and vegetable availability.
A 4-week pilot to explore feasibility of workplace farmstand programmes through a variety of outcome measures, including survey, mode of sale, weekly sales totals and intercept interviews.
A large private company in Sarpy County, Omaha, Nebraska, USA.
Employees of the company hosting the farmstand programme.
Pre-programme, a majority of employees indicated that quality (95·4 %), variety (94·6 %) and cost of fruits and vegetables (86·4 %) were driving factors in their fruit and vegetable selection when shopping. The availability of locally or regionally produced fruits and vegetables was highly important (78·1 %). Participants varied in their definition of local food, with nearly half (49·2 %) reporting within 80·5 km (50 miles), followed by 160·9 km (100 miles; 29·5 %) and 321·9 km (200 miles; 12·1 %). Weekly farmstand purchases (both walk-ups and online orders) ranged from twenty-eight to thirty-nine employees, with weekly sales ranging from $US 257·95 to 436·90 for the producer. The mode of purchase changed throughout the pilot, with higher use of online ordering in the beginning and higher use of walk-up purchasing at the end.
The workplace farmstand pilot study revealed initial interest by both employees and a producer in this type of programme, helped to establish a sustained producer–employer relationship and led to additional opportunities for both the producer and employer.
This study described prescribing trends before and after implementing a provincial strategy aimed at improving osteoporosis and fracture prevention in Ontario long-term care (LTC) homes. Data were obtained from a pharmacy provider for 10 LTC homes in 2007 and 166 homes in 2012. We used weighted, multiple linear regression analyses to examine facility-level changes in vitamin D, calcium, and osteoporosis medication prescribing rates between 2007 and 2012. After five years, the estimated increase in vitamin D, calcium, and osteoporosis medication prescribing rates, respectively, was 38.2 per cent (95% confidence interval [CI]: 29.0, 47.3; p < .001), 4.0 per cent (95% CI: –3.9, 12.0; p = .318), and 0.2 per cent (95% CI: –3.3, 3.7; p = .91). Although the study could not assess causality, findings suggest that wide-scale knowledge translation activities successfully improved vitamin D prescribing rates, although ongoing efforts are needed to target homes with low uptake.
Glyphosate-resistant giant ragweed has been confirmed in several Midwestern states. In some cases, weed resistance to glyphosate has been shown to carry a fitness penalty. Previous research found that a glyphosate-resistant giant ragweed biotype from Indiana with a rapid necrosis response to glyphosate displayed early, rapid growth in the absence of glyphosate and flowered earlier, but produced 25% less seed than a sensitive biotype, suggesting that there may be a fitness penalty associated with the rapid necrosis resistance trait. In Wisconsin, we recently identified a giant ragweed accession with a 6.5-fold level of resistance to glyphosate that does not demonstrate the rapid necrosis response. Our objective was to determine the noncompetitive growth and fecundity of the resistant accession in the absence of glyphosate, relative to a sensitive accession from a nearby field border population. In greenhouse experiments, plant height, leaf area, and dry shoot biomass were similar between the resistant and sensitive accessions during vegetative growth to the onset of flowering. The instantaneous relative growth rate, instantaneous net assimilation rate, and instantaneous leaf area ratio also did not differ between accessions. However, fecundity of resistant plants (812 seeds plant⁻¹) was greater (P = 0.008) than that of sensitive plants (425 seeds plant⁻¹). The percentage of intact viable seeds, intact nonviable seeds, and empty involucres did not differ between resistant and sensitive accessions. These results indicate that resistance to glyphosate in this giant ragweed accession has not affected its growth and development relative to a sensitive accession. The greater fecundity and similar seed viability of resistant plants relative to sensitive plants suggest that, in the absence of selection by glyphosate, the frequency of the glyphosate-resistance trait may increase in the giant ragweed field population over time.