This study evaluated the quality of YouTube content focusing on common paediatric otolaryngology procedures, as this content can influence the opinions and medical decisions of patients.
A total of 120 YouTube videos were compiled for review using the search terms ‘adenoid removal’, ‘adenoidectomy’, ‘ear tubes’, ‘tympanostomy’, ‘tonsil removal’ and ‘tonsillectomy’. The Discern criteria were used to rate the quality of health information presented in each video.
The mean bias Discern score was 3.18 and the mean overall Discern score was 2.39. Videos including US board certified physicians were rated significantly higher (p < 0.001) than videos without (bias Discern score = 3.00 vs 2.38; overall Discern score = 3.79 vs 1.55). The videos had been viewed a total of 176 769 549 times.
Unbiased, high-quality videos on YouTube are lacking. As patients may rely on this information when making medical decisions, it is important that practitioners continually evaluate and improve this video content. Otolaryngologists should be prepared to discuss YouTube content with patients.
The disproportionate burden of prevalent, persistent pathogens among disadvantaged groups may contribute to socioeconomic and racial/ethnic disparities in long-term health. We assessed whether the social patterning of pathogen burden changed over 16 years in a U.S.-representative sample. Data came from 17 660 National Health and Nutrition Examination Survey participants. Pathogen burden was quantified by summing the number of positive serologies for cytomegalovirus, herpes simplex virus type 1 (HSV-1), HSV-2, human papillomavirus and Toxoplasma gondii and dividing by the number of pathogens tested, giving a percent-seropositive for each participant. We examined sex- and age-adjusted mean pathogen burdens from 1999–2014, stratified by race/ethnicity and socioeconomic status (SES; poverty-to-income ratio (PIR) and educational attainment). Those with a PIR < 1.3 had a mean pathogen burden 1.4–1.8 times those with a PIR > 3.5, with no change over time. Educational disparities were even greater and showed some evidence of increasing over time, with the mean pathogen burden among those with less than a high school education approximately twice that of those who completed more than high school. Non-Hispanic Black, Mexican American and other Hispanic participants had a mean pathogen burden 1.3–1.9 times non-Hispanic Whites. We demonstrate that socioeconomic and racial/ethnic disparities in pathogen burden have persisted across 16 years, with little evidence that the gap is closing.
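The percent-seropositive measure described above can be sketched in a few lines. The data and function names below are illustrative only, not the study's code, and assume hypothetical serology results (True = seropositive, None = not tested):

```python
# Minimal sketch (not the authors' code): compute a percent-seropositive
# pathogen burden by summing positive serologies and dividing by the
# number of pathogens actually tested.

def pathogen_burden(serologies):
    """Return percent-seropositive, or None if no pathogens were tested."""
    tested = [s for s in serologies.values() if s is not None]
    if not tested:
        return None
    return 100.0 * sum(tested) / len(tested)

participant = {
    "CMV": True,
    "HSV-1": True,
    "HSV-2": False,
    "HPV": None,  # not tested for this participant
    "T. gondii": False,
}
print(pathogen_burden(participant))  # 2 positives / 4 tested = 50.0
```

Dividing by the number of pathogens tested, rather than a fixed five, keeps participants with incomplete panels comparable on the same 0–100 scale.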
Good education requires student experiences that deliver lessons about practice as well as theory and that encourage students to work for the public good—especially in the operation of democratic institutions (Dewey 1923; Dewey 1938). We report on an evaluation of the pedagogical value of a research project involving 23 colleges and universities across the country. Faculty trained and supervised students who observed polling places in the 2016 General Election. Our findings indicate that this was a valuable learning experience in both the short and long terms. Students found their experiences to be valuable and reported learning generally and specifically related to course material. Postelection, they also felt more knowledgeable about election science topics, voting behavior, and research methods. Students reported interest in participating in similar research in the future, would recommend that other students do so, and expressed interest in more learning and research about the topics central to their experience. Our results suggest that participants appreciated the importance of elections and their study. Collectively, the participating students are engaged and efficacious—essential qualities of citizens in a democracy.
People living in poverty are particularly vulnerable to shocks, including those caused by natural disasters such as floods and droughts. This paper analyses household survey data and hydrological riverine flood and drought data for 52 countries to find out whether poor people are disproportionately exposed to floods and droughts, and how this exposure may change in a future climate. We find that poor people are often disproportionately exposed to droughts and floods, particularly in urban areas. This pattern does not change significantly under future climate scenarios, although the absolute number of people potentially exposed to floods or droughts can increase or decrease significantly, depending on the scenario and region. In particular, many countries in Africa show a disproportionately high exposure of poor people to floods and droughts. For these hotspots, implementing risk-sensitive land-use and development policies that protect poor people should be a priority.
Since W. E. B. Du Bois documented the physical and social environments of Philadelphia’s predominantly African American Seventh Ward over a century ago, there has been continued interest in understanding the distribution of social and physical environments by racial make-up of communities. Characterization of these environments allows for documentation of inequities, identifies communities which encounter heightened risk, and can inform action to promote health equity. In this paper, we apply and extend Du Bois’s approach to examine the contemporary distribution of physical environmental exposures, health risks, and social vulnerabilities in the Detroit metropolitan area, one of the most racially-segregated areas in the United States. We begin by mapping the proximity of sensitive populations to hazardous land uses, their exposure to air pollutants and associated health risks, and social vulnerabilities, as well as cumulative risk (combined proximity, exposure, and vulnerability), across Census tracts. Next, we assess, quantitatively, the extent to which communities of color experience excess burdens of environmental exposures and associated health risks, economic and age-related vulnerabilities, and cumulative risk. The results, depicted in maps presented in the paper, suggest that Census tracts with greater proportions of people of color disproportionately encounter physical environmental exposures, socioeconomic vulnerabilities, and combined risk. Quantitative tests of inequality confirm these distributions, with statistically greater exposures, vulnerabilities, and cumulative risk in Census tracts with larger proportions of people of color. Together, these findings identify communities that experience disproportionate cumulative risk in the Detroit metropolitan area and quantify the inequitable distribution of risk by Census tract relative to the proportion of people of color. 
These findings also identify clear opportunities for prioritizing communities for legislative, regulatory, policy, and local actions to promote environmental justice and health equity.
Improvements in colorectal cancer (CRC) detection and treatment have led to greater numbers of CRC survivors, for whom there is limited evidence on which to provide dietary guidelines to improve survival outcomes. Higher intake of red and processed meat and lower intake of fibre are associated with greater risk of developing CRC, but there is limited evidence regarding associations with survival after CRC diagnosis. Among 3789 CRC cases in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, pre-diagnostic consumption of red meat, processed meat, poultry and dietary fibre was examined in relation to CRC-specific mortality (n = 1008) and all-cause mortality (n = 1262) using multivariable Cox regression models, adjusted for CRC risk factors. Pre-diagnostic red meat, processed meat or fibre intakes (defined as quartiles and continuous grams per day) were not associated with CRC-specific or all-cause mortality among CRC survivors; however, a marginal trend across quartiles of processed meat in relation to CRC mortality was detected (P = 0·053). Pre-diagnostic poultry intake was inversely associated with all-cause mortality among women (hazard ratio (HR) per 20 g/d = 0·92; 95 % CI 0·84, 1·00), but not among men (HR 1·00; 95 % CI 0·91, 1·09) (P for heterogeneity = 0·10). Pre-diagnostic intake of red meat or fibre is not associated with CRC survival in the EPIC cohort. There is suggestive evidence of an association between poultry intake and all-cause mortality among female CRC survivors and between processed meat intake and CRC-specific mortality; however, further research using post-diagnostic dietary data is required to confirm this relationship.
Higher fruit intake is associated with lower risk of all-cause and disease-specific mortality. However, data on individual fruits are limited, and the generalisability of these findings to the elderly remains uncertain. The objective of this study was to examine the association of apple intake with all-cause and disease-specific mortality over 15 years in a cohort of women aged over 70 years. Secondary analyses explored relationships of other fruits with mortality outcomes. Usual fruit intake was assessed in 1456 women using a FFQ. Incidence of all-cause and disease-specific mortality over 15 years was determined through the Western Australian Hospital Morbidity Data system. Cox regression was used to determine the hazard ratios (HR) for mortality. During 15 years of follow-up, 607 (41·7 %) women died from any cause. In the multivariable-adjusted analysis, the HR for all-cause mortality was 0·89 (95 % CI 0·81, 0·97) per SD (53 g/d) increase in apple intake, HR 0·80 (95 % CI 0·65, 0·98) for consumption of 5–100 g/d and HR 0·65 (95 % CI 0·48, 0·89) for consumption of >100 g/d (an apple a day), compared with apple intake of <5 g/d (P for trend = 0·03). Our analysis also found that higher apple intake was associated with lower risk for cancer mortality, and that higher total fruit and banana intakes were associated with lower risk of CVD mortality (P < 0·05). Our results support the view that regular apple consumption may contribute to lower risk of mortality.
Despite overwhelming evidence demonstrating a persisting gap in life expectancy between those with psychotic illness and the general population, there has been no widespread implementation of interventions to improve the physical wellbeing of people with psychotic illness. This article explores opportunities to ‘Bridge the Gap’ in life expectancy. We describe an Australian evidence-based intervention that has substantially improved the physical health of young people recently commenced on antipsychotic medication. Further epidemiological research, accompanied by cultural change within mental health services, is an essential precursor to the implementation of effective and sustainable lifestyle interventions. There are other relatively neglected areas of physical wellbeing for people with psychotic illness, such as screening and diagnosis of malignancies, which need more research and clinical attention. While there has been progress with intervention development and evaluation, translation of evidence-based short-term intervention studies into feasible and sustainable system-wide changes within routine mental health service settings remains a challenge. Developing an implementation framework to support such change is an urgent priority, so as to bridge the persisting gap in premature mortality for people living with psychotic illness.
We describe a general framework for realistic analysis of sorting algorithms, and we apply it to the average-case analysis of three basic sorting algorithms (QuickSort, InsertionSort, BubbleSort). Usually the analysis deals with the mean number of key comparisons, but here we view keys as words produced by the same source, which are compared via their symbols in lexicographic order. The ‘realistic’ cost of the algorithm is now the total number of symbol comparisons performed by the algorithm, and, in this context, the average-case analysis aims to provide estimates for the mean number of symbol comparisons used by the algorithm. For sorting algorithms, and with respect to key comparisons, the average-case complexity of QuickSort is asymptotic to 2n log n, InsertionSort to n²/4 and BubbleSort to n²/2. With respect to symbol comparisons, we prove that their average-case complexity becomes Θ(n log² n), Θ(n²) and Θ(n² log n), respectively. In these three cases, we describe the dominant constants which exhibit the probabilistic behaviour of the source (namely entropy and coincidence) with respect to the algorithm.
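The distinction between key comparisons and symbol comparisons can be illustrated with a short sketch. This is our own illustrative code, not the paper's framework; the function names and word list are invented. Each key comparison walks the two strings symbol by symbol, so keys sharing long prefixes make a single key comparison cost many symbol comparisons:

```python
# Illustrative sketch: count key comparisons vs symbol comparisons
# when a simple QuickSort operates on strings compared lexicographically.

def symbol_compare(a, b, counts):
    """Lexicographic comparison that also tallies symbol comparisons."""
    counts["keys"] += 1
    for x, y in zip(a, b):
        counts["symbols"] += 1
        if x != y:
            return -1 if x < y else 1
    # One string is a prefix of the other: the shorter one sorts first.
    return (len(a) > len(b)) - (len(a) < len(b))

def quicksort(words, counts):
    """First-element-pivot QuickSort using the counting comparator."""
    if len(words) <= 1:
        return words
    pivot, rest = words[0], words[1:]
    left = [w for w in rest if symbol_compare(w, pivot, counts) < 0]
    right = [w for w in rest if symbol_compare(w, pivot, counts) >= 0]
    return quicksort(left, counts) + [pivot] + quicksort(right, counts)

counts = {"keys": 0, "symbols": 0}
data = ["banana", "band", "bandana", "ban", "bane"]
print(quicksort(data, counts))
print(counts)  # shared prefixes make symbol count exceed key count
```

Because every word here starts with ‘ban’, each key comparison costs at least three symbol comparisons, which is the effect the realistic (symbol-level) analysis captures.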
Two broad aims drive weed science research: improved management and improved
understanding of weed biology and ecology. In recent years, agricultural
weed research addressing these two aims has effectively split into separate
subdisciplines despite repeated calls for greater integration. Although some
excellent work is being done, agricultural weed research has developed a
very high level of repetitiveness and a preponderance of purely
descriptive studies, and has failed to clearly articulate novel
hypotheses linked to established bodies of ecological and evolutionary
theory. In contrast,
invasive plant research attracts a diverse cadre of nonweed scientists using
invasions to explore broader and more integrated biological questions
grounded in theory. We propose that although studies focused on weed
management remain vitally important, agricultural weed research would
benefit from deeper theoretical justification, a broader vision, and
increased collaboration across diverse disciplines. To initiate change in
this direction, we call for more emphasis on interdisciplinary training for
weed scientists, and for focused workshops and working groups to develop
specific areas of research and promote interactions among weed scientists
and with the wider scientific community.
High blood pressure (BP) variability, which may be an important determinant of hypertensive end-organ damage, is emerging as an important predictor of cardiovascular health. Dietary antioxidants can influence BP, but their effects on variability are yet to be investigated. The aim of the present study was to assess the effects of vitamin E, vitamin C and polyphenols on the rate of daytime and night-time ambulatory BP variation. To assess these effects, two randomised, double-blind, placebo-controlled trials were performed. In the first trial (vitamin E), fifty-eight individuals with type 2 diabetes were given 500 mg/d of RRR-α-tocopherol, 500 mg/d of mixed tocopherols or placebo for 6 weeks. In the second trial (vitamin C–polyphenols), sixty-nine treated hypertensive individuals were given 500 mg/d of vitamin C, 1000 mg/d of grape-seed polyphenols, both vitamin C and polyphenols, or neither (placebo) for 6 weeks. At baseline and at the end of the 6-week intervention, 24 h ambulatory BP and rate of measurement-to-measurement BP variation were assessed. Compared with placebo, treatment with α-tocopherol, mixed tocopherols, vitamin C or polyphenols alone did not significantly alter the rate of daytime or night-time systolic BP, diastolic BP or pulse pressure variation (P > 0·05). However, treatment with the combination of vitamin C and polyphenols resulted in higher BP variation: the rates of night-time systolic BP variation (P = 0·022) and pulse pressure variation (P = 0·0036) were significantly higher, and the rate of daytime systolic BP variation showed a similar trend (P = 0·056). Vitamin E, vitamin C or grape-seed polyphenols alone did not significantly alter the rate of BP variation. However, the increase in the rate of BP variation suggests that the combination of high doses of vitamin C and polyphenols could be detrimental to treated hypertensive individuals.
Altered levels of selenium and copper have been linked with altered cardiovascular disease risk factors including changes in blood triglyceride and cholesterol levels. However, it is unclear whether this can be observed prenatally. This cross-sectional study includes 274 singleton births from 2004 to 2005 in Baltimore, Maryland. We measured umbilical cord serum selenium and copper using inductively coupled plasma mass spectrometry. We evaluated exposure levels vis-à-vis umbilical cord serum triglyceride and total cholesterol concentrations in multivariable regression models adjusted for gestational age, birth weight, maternal age, race, parity, smoking, prepregnancy body mass index, n-3 fatty acids and methyl mercury. The percent difference in triglycerides comparing those in the highest v. lowest quartile of selenium was 22.3% (95% confidence interval (CI): 7.1, 39.7). For copper this was 43.8% (95% CI: 25.9, 64.3). In multivariable models including both copper and selenium as covariates, copper, but not selenium, maintained a statistically significant association with increased triglycerides (percent difference: 40.7%, 95% CI: 22.1, 62.1). There was limited evidence of a relationship of increasing selenium with increasing total cholesterol. Our findings provide evidence that higher serum copper levels are associated with higher serum triglycerides in newborns, but should be confirmed in larger studies.