Twelve specimens of Eumorphocystis Branson and Peck, 1940 provide the basis for new findings and a more informed assessment of whether this blastozoan (a group including eocrinoids, blastoids, diploporites, rhombiferans) constitutes the sister taxon to crinoids, as has been recently proposed. Both Eumorphocystis and earliest-known crinoid feeding appendages express longitudinal canals, a demonstrable trait exclusive to these taxa. However, the specimen series studied here shows that Eumorphocystis canals constrict proximally and travel within ambulacrals above the thecal cavity. This relationship is congruent with a documented blastozoan pattern but very unlike earliest crinoid topology. Earliest crinoid arm cavities lie fully beneath floor plates; these expand and merge directly with the main thecal coelomic cavity at thecal shoulders. Other associated anatomical features reinforce this contrast. Feeding appendages of Eumorphocystis lack two-tiered cover plates, podial basins/pores, and lateral arm plating, all features of earliest crinoid ‘true arms.’ Eumorphocystis feeding appendages are buttressed by solid block-like plates added during ontogeny at a generative zone below floor plates, a pattern with no known parallel among crinoids. Eumorphocystis feeding appendages express brachioles, erect extensions of floor plates, also unknown among crinoids. These several distinctions point to nonhomology of most feeding-appendage anatomy, including longitudinal canals, removing Eumorphocystis and other blastozoans from an exclusive relationship with crinoids. Eumorphocystis further differs from crinoids in that its thecal plates express diplopores, respiratory structures not present among crinoids but ubiquitous among certain groups of blastozoans. Phylogenetic analysis places Eumorphocystis as a crownward blastozoan, far removed from crinoids.
Intermediate morphologies of a new fossil crinoid shed light on the pathway by which crinoids acquired their distinctive arms. Apomorphies originating deep in echinoderm history among early nonblastozoan pentaradiate echinoderms distinguish Tremadocian (earliest Ordovician) crinoid arms from later taxa. The brachial series is separated from the ambulacra, part of the axial skeleton, by lateral plate fields. Cover plates are arrayed in two tiers, and floor plates express podial basins and pores. Later during the Early Ordovician, floor plates contacted and nestled into brachials, then were unexpressed as stereom elements entirely, and cover plates were reduced to a single tier. Incorporation of these events into a parsimony analysis supports crinoid origin deep in echinoderm history, separate from blastozoans (eocrinoids, ‘cystoids’). Arm morphology is exceptionally well preserved in the late Tremadocian to early Floian Athenacrinus broweri new genus new species. Character analysis supports a hypothesis that this taxon originated early within the disparid clade. Athenacrinus n. gen. (in Athenacrinidae new family) is the earliest-known crinoid to express what is commonly referred to as ‘compound’ or ‘biradial’ morphology. This terminology is misleading in that no evidence for the implied fusion or fission of radials exists; rather, it is suggested that this condition arose through disproportionate growth.
Introduction: Access block is a pervasive problem, even during times of minimal boarding in the ED, suggesting that suboptimal use of ED stretchers can contribute. A tracking board utility was embedded into the electronic health record in Calgary, AB, allowing MDs and RNs to flag patients who could be relocated from a stretcher to a chair. Objectives of this study were to evaluate the feature's impact on total stretcher time (TST) and ED length of stay (LOS) for patients relocated to a chair. We also sought to identify facilitators and barriers to the tool's use amongst ED MDs and RNs. Methods: A retrospective cohort design was used to compare TST between patients relocated to a chair with and without use of the tool between September 1, 2017 and August 15, 2018. Each use of the location tool was time-stamped in an administrative database. Median TST and ED LOS were compared between patients where the tool was used and not used using a Mann-Whitney U test. A cross-sectional convenience sample survey was used to determine facilitators and barriers to the tool's use amongst ED staff. Response proportions were used to report Likert scale questions; thematic analysis was used to code themes. Results: 194882 patients met inclusion criteria. The tool was used 4301 times, with “Ok for Chairs” selected 3914 (2%) times and “Not Ok for Chairs” selected 384 (0.2%) times; 54462 (30%) patients were moved to a chair without the tool's use. Mean age, sex, mode of arrival and triage scores were similar between both groups. Median (IQR) TST amongst patients moved to a chair via the prompt was shorter than when the prompt was not used [142.7 (100.5) mins vs 152.3 (112.3) mins, p < 0.001], resulting in 37574 mins of saved stretcher time. LOS was similar between both groups (p = 0.22). 125 questionnaires were completed by 90 ED nurses and 35 ED MDs.
95% of staff were aware of the tool and 70% agreed/strongly agreed the tool could improve ED flow; however, 38% reported only “sometimes” using the tool. MDs reported the most common barrier was forgetting to use the tool and lack of perceived action in relocating patients. Commonly reported nursing barriers were lack of chair space and increased workload. Conclusion: Despite minimal use of the tracking board utility, triggering was associated with reduced TST amongst ED patients eventually relocated to a chair. To encourage increased use, future versions should prompt staff to select a location.
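The primary comparison above (median TST with vs without the tool) uses the Mann-Whitney U test. As an illustrative sketch only, with hypothetical stretcher times rather than the study's data, the U statistic can be computed from pooled ranks:

```python
def mann_whitney_u(a, b):
    """U statistic for two independent samples (stdlib-only sketch).

    Ranks the pooled observations (average ranks for ties), then
    U1 = R1 - n1(n1+1)/2; the reported U is min(U1, U2).
    """
    pooled = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1                    # extend over a run of tied values
        avg_rank = (i + j) / 2 + 1    # 1-based average rank for the tie run
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg_rank
        i = j + 1
    r1 = sum(ranks[:len(a)])          # rank sum of the first sample
    u1 = r1 - len(a) * (len(a) + 1) / 2
    return min(u1, len(a) * len(b) - u1)

# Hypothetical stretcher times in minutes (not the study's data).
tool_used = [120, 135, 142, 150, 160]
tool_not_used = [140, 152, 158, 170, 180]
print(mann_whitney_u(tool_used, tool_not_used))
```

In practice a library routine such as scipy.stats.mannwhitneyu would also supply the p-value; the hand-rolled version above only shows where the statistic comes from.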
Urinary tract infections (UTIs) are common among college-aged women and often recur. Some antibiotics recommended to treat UTIs trigger dysbiosis of the intestinal and vaginal microbiomes, where uropathogens originate, yet few studies have investigated associations between these therapies and recurrent infections. We retrospectively analysed the electronic medical records of 6651 college-aged women diagnosed with a UTI at a US university student health centre between 2006 and 2014. Women were followed for 6 months for incidence of a recurrent infection. In a secondary analysis, women who experienced UTI recurrence within 2 weeks were also considered as potential cases of infection relapse. Logistic regression was used to assess associations between infection recurrence or relapse and the antibiotics prescribed, in addition to baseline patient characteristics including age, race/ethnicity, region of origin, year of encounter, presence of symptomology, pyelonephritis, vaginal coinfection and birth control consultation. There were 1051 instances of infection recurrence among the 6620 patients, indicating a prevalence of 16%. In the analysis of patient characteristics, Asian women were statistically more likely to experience infection recurrence whereas African American women were less likely. No significant associations were identified between the antibiotic administered at the initial infection and the risk of infection recurrence after multivariable adjustment. Treatment with trimethoprim-sulphamethoxazole and being born outside of the USA were significantly associated with increased odds of infection relapse in the multivariate analysis. The results of the analyses suggest that treatment with trimethoprim-sulphamethoxazole may lead to an increased risk of UTI relapse, warranting further study.
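The relapse findings above come from logistic regression with multivariable adjustment. As a simpler, hypothetical illustration of how an unadjusted association of this kind is quantified, an odds ratio with a Woolf 95% confidence interval can be computed from a 2x2 table (the counts below are invented, not the study's):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-based) 95% CI.

    2x2 table: the exposed group has `a` relapses and `b` non-relapses;
    the unexposed group has `c` relapses and `d` non-relapses.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Invented counts (not the study's data): 12/188 relapses on drug A
# versus 20/580 relapses on other therapies.
or_, lo, hi = odds_ratio_ci(12, 188, 20, 580)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```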
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the first second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference = 0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference = 0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
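The pooling step described above, fixed-effects meta-analysis of per-cohort linear-model coefficients, is inverse-variance weighting. A minimal sketch with hypothetical cohort slopes (the numbers are illustrative, not the consortium's estimates):

```python
def fixed_effects_meta(betas, ses):
    """Inverse-variance weighted fixed-effects pooled estimate.

    Each cohort contributes weight w_i = 1 / se_i**2; the pooled beta is
    sum(w_i * b_i) / sum(w_i) and its standard error is sqrt(1 / sum(w_i)).
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical per-cohort FEV1 slopes (ml per nmol/l) and their SEs.
beta, se = fixed_effects_meta([1.0, 1.2, 1.4], [0.2, 0.3, 0.5])
print(f"pooled beta = {beta:.3f} ml, SE = {se:.3f}")
```

Note that precise cohorts dominate the pooled estimate: halving a cohort's standard error quadruples its weight.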
Introduction: The specialist Emergency Medicine (EM) postgraduate training program at Queen's University implemented a new Competency-Based Medical Education (CBME) model on July 1, 2017. This occurred one year ahead of the national EM cohort, in the model of Competence By Design (CBD) as outlined by the Royal College of Physicians and Surgeons of Canada (RCPSC). This presents an opportunity to identify critical steps, successes, and challenges in the implementation process to inform ongoing national CBME implementation efforts. Methods: A case-study methodology with Rapid Cycle Evaluation was used to explore the lived experience of implementing CBME in EM at Queen's and capture evidence of behavioural change. Data were collected at 3 and 6 months post-implementation via multiple sources and methods, including field observations, document analysis, and interviews with key stakeholders: residents, faculty, the program director, the CBME lead, academic advisors, and competence committee members. Qualitative findings were triangulated with available quantitative electronic assessment data. Results: The critical processes of implementation fall into 3 domain categories: administrative transition, resident transition, and faculty transition. Multiple themes emerged from stakeholder interviews, including: the need for holistic assessment beyond Entrustable Professional Activity (EPA) assessments, concerns about the utility of milestones in workplace-based assessment by front-line faculty, trepidation that CBME is adding to, rather than replacing, old processes, and a need for effective data visualisation and filtering to support assessment decisions by competence committees. We identified a need for administrative direction and faculty development related to new roles and responsibilities, and shared mental models of EPAs and entrustment scoring. Quantitative data indicate that the targeted number of assessments per EPA and stage of training may be too high.
Conclusion: Exploring the lived experience of implementing CBME from the perspectives of all stakeholders has provided early insights regarding the successes and challenges of operationalizing CBME on the ground. Our findings will inform ongoing local implementation and higher-level national planning by the Canadian EM Specialty Committee and other programs that will be implementing CBME in the near future.
Introduction: Hospital admission within 72 hours of emergency discharge is a widely accepted measure of emergency department quality of care. Patients returning for unplanned admission may reveal opportunities for improved emergency or follow-up care. Calgary emergency physicians, however, are rarely notified of these readmissions. Aggregate site measures provide a high-level view of readmissions for managers, but don't allow for timely, individual reflection on practice and learning opportunities. These aggregations may also not correctly account for variation in planned readmissions and other workflow nuances. There was a process in place at one facility to compile and communicate readmission details to each physician, but it was manual, provided limited visit detail, and was done weeks or months following discharge. Methods: A new, real-time 72-hour readmission notification recently implemented within the Calgary Zone provides direct and automated email alerts to all emergency physicians and residents involved in the care of a patient who has been readmitted. This alert is sent within hours of a readmission occurring and contains meaningful visit detail (discharge diagnosis, readmit diagnosis, patient name, etc.) to help support practice reflection. An average of 15 alerts per day has been generated and sent since implementation in April 2017. Although email is an old technology, it is a central component of the solution because it allows physicians to receive notifications at home and outside the hospital network, where they routinely perform administrative tasks. A secondary notification is sent to personal email accounts (Gmail, Hotmail, etc.) to indicate an unplanned admission has occurred, but without visit detail or identifiable information. This also allowed implementation with no new hardware or software cost. Results: A simple thumbs up/down rating system is used to adjust the sensitivity of the alert over time.
More than 66% of those providing feedback have indicated the alert is helpful for practice reflection (i.e., thumbs up). Of those who indicated it was not helpful, comments often expressed general satisfaction with the alert or offered suggestions for improvement. For example, consulted admitting physicians are often responsible for discharge decisions and should be added as recipients of the alert. Conclusion: Many physicians have indicated appreciation for knowing about return patients, and that they will reflect on their care, further review the chart, or contact the admitting physician for further discussion. Most are accepting of some ‘expected’ or ‘false positive’ alerts that aren’t helpful for practice reflection. Further tuning and expansion of the alert to specialist and consult services is needed to ensure all physicians involved in a discharge decision are adequately notified.
Introduction: There is growing interest in providing clinicians with performance reports via audit and feedback (A&F). Although significant evidence exists to support A&F as a tool for self-reflection and identifying unperceived learning needs, many questions remain, such as the optimal content of A&F reports, the method of dissemination for emergency physicians (EP) and the perceived benefit. The goal of the project was to 1. evaluate EP perceptions regarding satisfaction with A&F reports and their ability to stimulate physicians to identify opportunities for practice change and 2. identify areas for optimization of the A&F reports. Methods: EP practicing at any of the four adult hospital sites in Calgary were eligible. We conducted a web survey using a modified Dillman technique eliciting EP perspectives regarding satisfaction, usefulness and suggestions for improvement regarding the A&F reports. Quantitative data were analyzed descriptively and free-text responses were subjected to thematic analysis. Results: From 2015 onwards, EP could access their clinical performance data via an online dashboard. Despite the online reports being available, few physicians reviewed their reports, citing access and perceived lack of utility as barriers. In October 2016, we began disseminating static performance reports containing a subset of 10 clinical and operational performance metrics to all EP via encrypted e-mail. These static reports provided clinicians with their performance alongside anonymized peer comparator data, the rationale and evidence for A&F, information on how to use the report and how to obtain continuing medical education credits for reviewing the report. Of 177 EP in Calgary, we received 49 completed surveys (response rate 28%). 86% of respondents were very satisfied or satisfied with the report. 88% of EP stated they would take action based on the report, including self-reflection (91%) and modifying specific aspects of their practice (63%).
77% of respondents indicated that receiving static reports made them equally or more likely to visit the online version of the eA&F tool. The vast majority of EP preferred receiving the A&F reports on a semi-annual basis. Three improvements were made to the eA&F based on survey results: 1) addition of trend-over-time data, 2) new clinical metrics, and 3) optimization of report layout. We also initiated a separate, real-time 72-hour bounceback electronic notification system based on the feedback. EP value the dissemination of clinical performance indicators in both static report and dashboard format. Eliciting feedback from clinicians allows iterative optimization of eA&F. Based on these results, we plan to continue to provide physicians with A&F reports on a semi-annual basis.
Introduction: Non-variceal upper gastrointestinal bleeding (NVUGIB) is a common presentation to the emergency department (ED), accounting for significant morbidity, mortality and health care resource usage. In Alberta, a provincial care pathway was recently developed to provide an evidence-informed approach to managing patients with UGIB in the ED. Pantoprazole infusions are a commonly used treatment despite evidence suggesting they are generally not indicated prior to endoscopy in the ED. The goal of this project was to optimize management of patients with NVUGIB, in particular to reduce pre-endoscopy pantoprazole infusions. Methods: In July 2016, we implemented a multi-faceted intervention to optimize management of ED patients with NVUGIB, including 1. de-emphasizing IV pantoprazole infusions in the ED, 2. embedding clinical decision support (CDS) for endoscopy, disposition and transfusions within the order set and 3. educating clinicians about the care pathway. We used a pre/post-order set design, analyzing 391 days pre- and 189 days post-order set changes. Data were extracted from our fully integrated electronic health records system. The primary outcome was the % of patients receiving an IV pantoprazole infusion ordered by an emergency physician (EP) among all patients with NVUGIB. Secondary outcomes included the % transfused with hgb >70 g/L and whether using the GIB order set impacted management of NVUGIB patients. Results: In the 391 days pre-order set changes, 2165 patients were included, and in the 189 days post-order set changes, 901 patients. For baseline characteristics, patients in the post-order set change group were significantly older (64.4 yrs vs 60.9 yrs, p = 0.0016) and had a lower hgb (115 vs 118, p = 0.049), but otherwise for gender and measures of severity of illness (systolic blood pressure, heart rate, CTAS, % admitted) there were no significant differences.
For the primary outcome, in the pre-order set phase, 47.1% received a pantoprazole infusion ordered by an EP, compared to 31.5% in the post-order phase, for an absolute reduction of 15.6% (p < 0.001). For the secondary outcomes, transfusion rates were similar pre/post (22.08% vs 22.75%). Significant inter-site variability exists with respect to the reduction in pantoprazole infusion rates across the four sites (-23.3% to +6.12%). Conclusion: Our interventions resulted in a significant overall reduction in pantoprazole infusions in ED patients with NVUGIB. Reductions in pantoprazole infusions varied significantly across the different sites; future work in our department will explore and address this variability. Keys to the success of this project included engaging clinicians and leveraging both the SCM order sets and the provincial care pathway. Although there were no changes in transfusion rates, it is unclear whether this is a function of the CDS not being effective or whether these transfusions were clinically indicated.
Introduction: In light of escalating health care costs, initiatives such as Choosing Wisely have been advocating the need to “reduce unnecessary or wasteful medical tests, treatments and procedures”. We identified coagulation studies as one of those low-cost but frequently ordered items where we can decrease unnecessary testing and costs by leveraging our Computerized Practitioner Order Entry (CPOE). Considerable evidence suggests a low yield for coagulation studies (herein defined as PTT and INR) in suspected cardiac chest pain patients (SCCP). Methods: Using administrative data merged with CPOE, we extracted data 90 days pre- and 90 days post-intervention (pre-intervention: May 20, 2015 to August 19, 2015; post-intervention: August 20, 2015 to November 18, 2015). The setting for the study is a large urban center (4 adult EDs with an annual census of over 320,000 visits). Our CPOE system is fully integrated into ED patient care. The intervention involved modifying the nursing CPOE to remove the pre-selected coagulation studies in SCCP and providing education around appropriate usage of coagulation studies. Patients were included in the study if the bedside nurse or physician felt 1. the chest pain may be cardiac in nature and 2. labs were ordered. The primary outcome was to compare the number of coagulation studies ordered pre- and post-intervention. Results: Our analysis included 10,776 patients entered into an SCCP pathway as determined by the CPOE database. The total number of visits in these two phases was similar (73,551 pre and 72,769 post). In the pre-intervention phase, 5255 coagulation studies were done (4246 ordered by nursing staff and 1009 by ED physicians). In the post-intervention phase, 1464 coagulation studies were ordered (1211 by nursing staff and 253 by ED physicians).
With our intervention, we identified a net reduction of 3791 coagulation studies in our post-intervention phase, a 72.14% reduction (p < 0.0001). At a cost of $15.00 CDN per study at our center, this represents an estimated cost savings of $56,865 over the 90-day period. Conclusion: We have implemented a simple, sustainable, evidence-based intervention that significantly minimizes the use of unnecessary coagulation studies in patients presenting with SCCP.
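The reported figures are internally consistent, as a quick check of the arithmetic from the counts in the abstract shows:

```python
# Counts taken directly from the abstract.
pre, post = 5255, 1464            # coagulation studies, pre/post phases
reduction = pre - post            # net reduction in studies ordered
pct = 100 * reduction / pre       # percent reduction
savings = reduction * 15.00       # at $15.00 CDN per study
print(reduction, round(pct, 2), savings)
```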
Introduction: In certain circumstances, skin and soft tissue infections are managed with intravenous (IV) antibiotics. In our center, patients initiated on outpatient IV antibiotics are followed up by a home parenteral therapy program the following day. A significant number of these patients require a repeat visit to the ED because of clinic hours. Probenecid is a drug that can prolong the half-life of certain antibiotics (such as cefazolin) and can therefore avoid a repeat ED visit, reducing health care costs and improving ED capacity. Our goal was to increase probenecid usage in the ED in order to optimize management of skin and soft tissue infections (SSTI) in the ED. The primary outcome was to compare the usage of probenecid in the pre- and post-intervention phases. Secondary outcomes were to compare revisit rates between patients receiving cefazolin alone vs cefazolin + probenecid. Methods: Using administrative data merged with Computerized Physician Order Entry (CPOE), we extracted data 90 days pre- and 90 days post-intervention (February 11, 2015 to August 11, 2015). The setting for the study is an urban center (4 adult EDs with an annual census of over 320,000 visits). Our CPOE system is fully integrated into ED patient care. The multi-faceted intervention involved modifying all relevant SSTI order sets in the CPOE system to link any cefazolin order with an order for probenecid. Physicians and nurses were provided with a 1-page summary of probenecid (indications, contraindications, pharmacology), as well as decision support within the CPOE. Any patients who were receiving outpatient cefazolin therapy were included in the study. Results: Our analysis included 2512 patients (1148 and 1364 patients in the pre/post phases) who received cefazolin in the ED and were discharged during the 180-day period. Baseline variables (gender, age, % admitted) and ED visits were similar in both phases.
In the pre-intervention phase, 30.2% of patients received probenecid, compared with 43.0% in the post-intervention phase, a net increase of 12.8% (p < 0.0001). Patients who received probenecid had a 2.2% lower revisit rate in the following 72 hours (11.4% vs 13.6%, p = 0.014). Conclusion: We have implemented a CPOE-based clinical decision support intervention that demonstrated a significant increase in probenecid usage by emergency physicians and resulted in a decrease in ED revisits. This intervention should result in health care cost savings.
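A difference in proportions of this kind (30.2% vs 43.0% probenecid usage) is commonly assessed with a two-proportion z-test; the abstract does not state which test was used, so the following is only a generic sketch, with counts approximated from the reported percentages:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two proportions.

    Uses the pooled proportion for the standard error, as in the
    classic large-sample two-proportion test.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Counts approximated from the abstract's percentages (rounded):
# pre: ~30.2% of 1148 patients; post: ~43.0% of 1364 patients.
z = two_proportion_z(347, 1148, 587, 1364)
print(round(z, 2))
```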
Introduction: The addition of computerized physician order entry (CPOE) to Emergency Departments in recent years has led to speculation over potential benefits and pitfalls. Recent studies have shown benefits to CPOE, though sufficient evidence on how it could change physician behaviour is lacking. Physician practices are known to be difficult to change, with getting evidence into daily practice being the main challenge of knowledge translation. Our study aim was to determine whether well-designed electronic order sets for CPOE improved MD practices. Methods: The Calgary Zone Pain Management in the Emergency Department Working Group relied on a GRADE-based literature review to identify best practices for analgesia and antiemetics, resulting in soft changes to the dedicated analgesia and antiemetic electronic order set noting working group preference, emphasizing hydromorphone over morphine and 4 mg ondansetron over 8 mg. The new electronic order set was started in the only Calgary Region order entry system on December 11, 2014. Data were collected from July 2014 to May 2015. A Yates chi-squared analysis was completed on all orders in a category, as well as on the subgroups of ED staff and residents and orders placed using the new order set. Results: A total of 100,460 orders were analyzed. The use of hydromorphone increased significantly across all 4 EDs. IV hydromorphone use increased (from 5.82% of all opioid orders up to 26.93%, P < 0.0001) with a reciprocal decline in IV morphine (from 67.81% of all opioid orders down to 46.56%, P < 0.0001). Similar effects were observed with ondansetron: 4 mg IV orders increased (from 1.37% of all ondansetron orders to 18.64%, P < 0.0001) with a decrease in 8 mg dosing (from 15.75% of all ondansetron orders to 7.23%, P < 0.0001). These results were replicated to a lesser degree in the non-ED staff and non-order set subgroups.
Implementation of the new order set resulted in an increase in its use (from 37.64% of all opioid orders up to 49.29%, P < 0.0001). Finally, a cost-savings analysis was completed showing a projected annual savings of $185,676.52 on medications alone. Conclusion: These data support the manipulation of electronic order sets to help shape physician behaviour towards best practices. This provides another strong argument for the benefits of CPOE and can help maintain best practices in Emergency Medicine.
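The Yates chi-squared analysis named above has a closed form for a 2x2 table. A stdlib sketch with invented order counts (not the study's totals):

```python
def yates_chi2(a, b, c, d):
    """Chi-squared statistic with Yates continuity correction, 2x2 table.

    Rows are pre/post periods, columns are e.g. hydromorphone vs other
    opioid orders: chi2 = n * (|ad - bc| - n/2)^2
                          / [(a+b)(c+d)(a+c)(b+d)],
    with the corrected difference clamped at zero.
    """
    n = a + b + c + d
    num = n * max(abs(a * d - b * c) - n / 2, 0) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Invented counts: pre-period 120/1940, post-period 540/1465.
print(round(yates_chi2(120, 1940, 540, 1465), 2))
```

Comparing the statistic against the chi-squared distribution with 1 degree of freedom then yields the P values reported in studies like this one.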
Global concern is evident around over-the-counter availability of codeine-containing products and the risk of misuse, dependence and related harms. A phenomenological study of lived experiences of codeine misuse and dependence was undertaken in Ireland, following the Pharmaceutical Society of Ireland’s 2010 guidelines for restricted supply of non-prescription codeine-containing products.
In-depth interviews were conducted with a purposive sample of adult codeine misusers and dependents (n=21), including those actively using, in treatment and in recovery. The narratives were analysed using the Empirical Phenomenological Psychological five-step method (Karlsson, 1995). A total of 10 themes with 82 categories were identified. Two concepts at a higher level of abstraction above the theme level emerged during the final stage of analysis. The concepts identified were ‘emotional pain and user self-legitimization of use’ and ‘entrapment into habit-forming and invisible dependent use’. These concepts were reported in different ways by a majority of participants.
Findings are presented under the following themes: (1) profile and product preferences; (2) awareness of habit forming use and harm; (3) negotiating pharmacy sales; (4) alternative sourcing routes; (5) the codeine feeling; (6) the daily routine; (7) acute and chronic side effects; (8) social isolation; (9) withdrawal and dependence and (10) help-seeking and treatment experiences.
There is a public health and regulatory imperative to develop proactive responses: tackling public availability of codeine-containing medicines, minimising risk in consumer self-treatment of pain, enhancing patient awareness of the potential for habit-forming use and its consequences, and ensuring continued health professional pharmacovigilance.
We conducted infrared spectroscopic observations of bright stars in the direction of the molecular clouds W33 and GMC G23.3 − 0.3. We compared stellar spectro-photometric distances with parallactic distances to these regions, and we were able to assess the association of the detected massive stars with these molecular complexes. The spatial and temporal distributions of the detected stars enabled us to locate sources of ionizing radiation and to gather precise information on the star formation history of these clouds. The studied clouds present different distributions of massive stars.
To examine the types of food served at family dinner in the homes of adolescents and correlations with parent and family sociodemographic characteristics, psychosocial factors and meal-specific variables.
A cross-sectional population-based survey completed by mail or telephone by parents participating in Project F-EAT (Families and Eating and Activity in Teens) in 2009–2010.
Homes of families with adolescents in Minneapolis/St. Paul urban area, MN, USA.
Participants included 1923 parents/guardians (90·8 % female; 68·5 % from ethnic/racial minorities) of adolescents who participated in EAT 2010.
Less than a third (28 %) of parents reported serving a green salad at family dinner on a regular basis, but 70 % reported regularly serving vegetables (other than potatoes). About one-fifth (21 %) of families had fast food at family dinners two or more times per week. Variables from within the sociodemographic domain (low educational attainment), psychosocial domain (high work–life stress, depressive symptoms, low family functioning) and meal-specific domain (low value of family meals, low enjoyment of cooking, low meal planning, high food purchasing barriers and fewer hours spent in food preparation) were associated with lower healthfulness of foods served at family dinners, in analyses adjusted for sociodemographic characteristics.
There is a need for interventions to improve the healthfulness of food served at family meals. Interventions need to be suitable for parents with low levels of education; take parent and family psychosocial factors into account; promote more positive attitudes toward family meals; and provide skills to make it easier to plan and prepare healthful family meals.
Among US racial/ethnic minority women, we examined associations between maternal experiences of racial discrimination and child growth in the first 3 years of life. We analyzed data from Project Viva, a pre-birth cohort study. We restricted analyses to 539 mother–infant pairs; 294 were Black, 127 Hispanic, 110 Asian and 8 from additional racial/ethnic groups. During pregnancy, mothers completed the Experiences of Discrimination survey that measured lifetime experiences of racial discrimination in diverse domains. We categorized responses as 0, 1–2 or ⩾3 domains. Main outcomes were birth weight for gestational age z-score; weight for age (WFA) z-score at 6 months of age; and at 3 years of age, body mass index (BMI) z-score. In multivariable analyses, we adjusted for maternal race/ethnicity, nativity, education, age, pre-pregnancy BMI, household income and child sex and age. Among this cohort of mostly (58.2%) US-born and economically non-impoverished mothers, 33% reported 0 domains of discrimination, 33% reported discrimination in 1–2 domains and 35% reported discrimination in ⩾3 domains. Compared with children whose mothers reported no discrimination, those whose mothers reported ⩾3 domains had lower birth weight for gestational age z-score (β −0.25; 95% CI: −0.45, −0.04), lower 6 month WFA z-score (β −0.34; 95% CI: −0.65, −0.03) and lower 3-year BMI z-score (β −0.33; 95% CI: −0.66, 0.00). In conclusion, we found that among this cohort of US racial/ethnic minority women, mothers’ report of experiencing lifetime discrimination in ⩾ 3 domains was associated with lower fetal growth, weight at 6 months and 3-year BMI among their offspring.
Preterm birth affects over 12% of all infants born in the United States; yet the biology of early delivery remains unclear, including whether epigenetic mechanisms are involved. We examined associations of maternal and umbilical cord blood long interspersed nuclear element-1 (LINE-1) DNA methylation with length of gestation and odds of preterm birth in singleton pregnancies in Project Viva. In white blood cells from maternal blood during first trimester (n = 914) and second trimester (n = 922), and from venous cord blood at delivery (n = 557), we measured LINE-1 by pyrosequencing [expressed as % 5-methyl cytosines within the LINE-1 region analyzed (%5mC)]. We ran linear regression models to analyze differences in gestation length, and logistic models for odds of preterm birth (<37 v. ⩾37 weeks’ gestation), across quartiles of LINE-1. Mean (s.d.) LINE-1 levels were 84.3 (0.6), 84.5 (0.4) and 84.6 (0.7) %5mC for first trimester, second trimester and cord blood, respectively. Mean (s.d.) gestational age was 39.5 (1.8) weeks, and 6.5% of infants were born preterm. After adjustment for maternal age, race/ethnicity, body mass index, education, smoking status and fetal sex, women in the highest v. lowest quartile of first trimester LINE-1 had longer gestations [0.45 weeks (95% CI 0.12, 0.78)] and lower odds of preterm birth [OR 0.40 (95% CI 0.17, 0.94)], whereas associations with cord blood LINE-1 were in the opposite direction [−0.45 weeks (95% CI −0.83, −0.08); OR 4.55 (95% CI 1.18, 17.5)]. In conclusion, higher early pregnancy LINE-1 predicts lower risk of preterm birth. In contrast, preterm birth is associated with lower LINE-1 in cord blood.
We present an analysis of the properties of H i holes detected in 20 galaxies that are part of “The H i Nearby Galaxy Survey”. We detected more than 1000 holes in total in the sampled galaxies. The holes are found throughout the disks of the galaxies, out to the edge of the H i disk. We find that shear limits the age of holes in spirals. Shear is less important in dwarf galaxies, which explains why H i holes in dwarfs are rounder, on average, than in spirals. Shear is particularly strong in the inner parts of spiral galaxies, limiting the lifespan of holes there and explaining why we find that holes outside R25 are larger and older. We proceed to derive the surface and volume porosity and find that these correlate with the type of the host galaxy: later Hubble types tend to be more porous. The size distribution of the holes in our sample follows a power law with a slope of aν ~ −2.9. Assuming that the holes are the result of massive star formation, we derive values for the supernova rate (SNR) and star formation rate (SFR), which scale with the SFRs derived from other tracers. If we extrapolate the observed number of holes to include those that fall below our resolution limit, down to holes created by a single supernova, we find that our results are compatible with the hypothesis that H i holes result from star formation.