This study aimed to determine the knowledge of first-year health sciences students at a South African university regarding hearing loss and symptoms attributable to personal listening devices, and to describe their practices concerning the use of these devices.
This was a cross-sectional study carried out using an anonymous self-administered questionnaire.
Of 336 students, 269 (80.1 per cent) completed the questionnaire. While most participants could identify symptoms that could be caused by extensive use of personal listening devices, almost 30 per cent did not know that it could cause permanent hearing loss. Personal listening devices were used by 90.7 per cent of participants, with 77.8 per cent having used them for more than five years. Use was at a high volume in 14.9 per cent of participants and for more than 2 hours per day in 52.7 per cent.
The findings indicate the need for an educational programme to inform students as to safe listening practices when using personal listening devices.
Here, we adopt an attachment theoretical perspective on relationship maintenance, based on the idea that a romantic relationship is an attachment bond. In doing so, we emphasize the role of normative attachment processes. We commence by introducing the attachment behavioral system and its three functions of proximity seeking/maintenance, safe haven, and secure base. We then describe the associations between normative attachment processes and relationship maintenance, including a discussion of evolutionary functions. The following part of the chapter explains how individual differences in attachment organization emerge based on early experiences with attachment figures, and why these differences are associated with relationship maintenance. Next, we review the literature on the associations of attachment style with three maintenance behaviors that have been widely studied in relation to attachment: support, communication, and commitment-enhancing behaviors. We conclude our chapter by discussing the association between attachment style and relationship satisfaction, which is regarded as an indicator of successful relationship maintenance. Overall, the normative processes of the attachment system align well with relationship maintenance behaviors, and attachment security tends to positively predict the enactment of maintenance behaviors.
We developed a tilt sensor for studying ice deformation and installed our tilt sensor systems in two boreholes drilled close to the shear margin of Jarvis Glacier, Alaska, to obtain kinematic measurements of streaming ice. We used the collected tilt data to calculate borehole deformation by tracking the orientation of the sensors over time. The sensors' tilts generally trended down-glacier, with an element of cross-glacier flow in the borehole closer to the shear margin. We also evaluated our results against flow dynamic parameters derived from Glen's flow law and explored the parameter space of the stress exponent n and enhancement factor E. Comparison with values from ice deformation experiments shows that the ice on Jarvis is characterized by higher n values than is expected in regions of low stress, particularly at the shear margin (~3.4). The higher n values could be attributed to the observed high total strains coupled with potential dynamic recrystallization, driving the development of anisotropy and consequently faster ice flow. Jarvis' n values place the creep regime of the ice between basal slip and dislocation creep. Tuning E towards a theoretical upper limit of 10 for anisotropic ice with a single-maximum fabric reduces the n values by 0.2.
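Glen's flow law relates the effective strain rate to the applied stress through the exponent n and enhancement factor E. A minimal sketch (the rate factor A and default n here are illustrative textbook values for temperate ice, not figures from the study) shows why even a small change in n matters:

```python
def glen_strain_rate(tau, n=3.0, E=1.0, A=2.4e-24):
    """Effective strain rate (s^-1) from Glen's flow law,
    strain_rate = E * A * tau**n, with tau in Pa.
    A is an illustrative rate factor for temperate ice."""
    return E * A * tau ** n

# At an effective stress of 100 kPa, raising n from 3.0 to 3.4
# multiplies the strain rate by tau**0.4 = (1e5)**0.4 = 100.
ratio = glen_strain_rate(1e5, n=3.4) / glen_strain_rate(1e5, n=3.0)
```

The strong amplification with n is why a shear-margin value near 3.4 implies markedly faster deformation than the commonly assumed n = 3.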
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Antimicrobial stewardship programs typically use days of therapy to assess antimicrobial use. However, this metric does not account for the antimicrobial spectrum of activity. We applied an antibiotic spectrum index to a population of very-low-birth-weight infants to assess its utility to evaluate the impact of antimicrobial stewardship interventions.
Norovirus, a major cause of gastroenteritis in people of all ages worldwide, was first reported in South Korea in 1999. The most common causal agents of pediatric acute gastroenteritis are norovirus and rotavirus. While vaccination has reduced the pediatric rotavirus infection rate, norovirus vaccines have not been developed. Therefore, prediction and prevention of norovirus are very important. Norovirus is divided into genogroups GI–GVII, with GII.4 being the most prevalent. However, in 2012–2013, GII.17 showed a higher incidence than GII.4 and a novel variant, GII.P17-GII.17, appeared. In this study, 204 stool samples collected in 2013–2014 were screened by reverse transcriptase-polymerase chain reaction; 11 GI (5.39%) and 45 GII (22.06%) noroviruses were identified. GI.4, GI.5, GII.4, GII.6 and GII.17 were detected. The whole genomes of three norovirus GII.17 strains were sequenced. The GII.17 genome consists of three open reading frames of 5109, 1623 and 780 bp. Compared with 20 GII.17 strains isolated in other countries, we observed numerous changes in the protruding P2 domain of VP1 in the Korean GII.17 viruses. Our study provides genome information that might aid in epidemic prevention, epidemiology studies and vaccine development.
We assessed self-reported drives for alcohol use and their impact on clinical features of alcohol use disorder (AUD) patients. Our prediction was that, in contrast to “affectively” (reward or fear) driven drinking, “habitual” drinking would be associated with worse clinical features in relation to alcohol use and higher occurrence of associated psychiatric symptoms.
Fifty-eight Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) alcohol abuse patients were assessed with a comprehensive battery of reward- and fear-based behavioral tendencies. An 18-item self-report instrument (the Habit, Reward and Fear Scale; HRFS) was employed to quantify affective (fear or reward) and non-affective (habitual) motivations for alcohol use. To characterize clinical and demographic measures associated with habit, reward, and fear, we conducted a partial least squares analysis.
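A partial least squares analysis extracts latent scores that maximize the covariance between a predictor block (here, the clinical and demographic measures) and an outcome block (the HRFS habit, reward and fear scores). The following one-component sketch in NumPy uses synthetic data with hypothetical dimensions and is not the study's implementation:

```python
import numpy as np

def pls_first_component(X, Y):
    # Center both blocks, then take the leading singular vectors of the
    # cross-covariance matrix X'Y: these weight vectors maximize the
    # covariance between the X-scores and Y-scores (the core PLS step).
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    U, S, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
    return Xc @ U[:, 0], Yc @ Vt[0]

# Hypothetical shapes: 58 patients, 10 clinical measures, 3 HRFS scales.
rng = np.random.default_rng(1)
X = rng.normal(size=(58, 10))
Y = rng.normal(size=(58, 3))
t, u = pls_first_component(X, Y)  # paired latent scores
```

By construction the inner product of the paired scores equals the leading singular value of the cross-covariance matrix, so it is non-negative; in practice one would iterate (deflate and repeat) to extract further components.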
Habitual alcohol use was significantly associated with the severity of alcohol dependence, reflected across a range of domains, and with a lower number of detoxifications across multiple settings. In contrast, reward-driven alcohol use was associated with a single domain of alcohol dependence, reward-related behavioral tendencies, and a lower number of detoxifications.
These results seem consistent with a shift from goal-directed to habit-driven alcohol use with the severity and progression of addiction, complementing preclinical work and informing biological models of addiction. Both reward-related and habit-driven alcohol use were associated with a lower number of detoxifications, perhaps stemming from a more benign course in the reward-related group and a lack of treatment engagement in the habit-related group. Future work should further explore the role of habit in this and other addictive disorders, and in obsessive-compulsive related disorders.
OBJECTIVES/SPECIFIC AIMS: As the issues facing our global society become more complex, university faculty are called upon to address these contemporary problems using interdisciplinary approaches. But do reappointment, promotion, and tenure (RPT) guidelines reflect and reward this fundamental change in the nature of higher education and scholarly inquiry? After collecting all of the RPT guidelines across the university, our research team at the University of Cincinnati (UC) conducted a content analysis of these documents to determine how collaborative work is defined, interpreted, and supported. In addition, we sought to identify differences in how collaborative work is valued across disciplines and how that value has changed over time. METHODS/STUDY POPULATION: An initial database was assembled that included two distinct data samples: historical and current. Both included RPT criteria for over 100 disciplinary units at the university. Working with the initial comprehensive database, the team narrowed content by selecting all language related to collaborative work using several relevant keywords or keyword fragments (team, collaborat[*], disciplin[*], and interprofessional). This process resulted in a subset of data reflecting the area of interest that could then be coded. Three investigators independently coded common portions of the data for categories. The investigators met regularly to compare the results of their coding, and discrepancies between the investigators’ coding schemes were resolved through discussion. The final, common coding scheme will be used by each investigator to independently code the remainder of the data. The team meets weekly to discuss significant passages, assign codes, and reach consensus on the important themes that are identified. Specifically, we will examine the frequency with which collaborative activities are included, the value and emphasis given to them, and the differences across units.
Having a historical sample and a current sample also allows us to analyze trends over time and further compare disciplinary differences. RESULTS/ANTICIPATED RESULTS: UC is a diverse institution that includes world-renowned creative schools (the College Conservatory of Music and the College of Design, Architecture, Art, and Planning), as well as traditional colleges of medicine, nursing, pharmacy, allied health, engineering, business, arts and sciences, etc. UC also includes two branch campuses that specialize in associate’s degree level education. Given the diversity in educational and research missions across these areas, we anticipate discovering several themes within the RPT guidelines, primarily centered around the traditional foundations of faculty work such as service, research, and teaching. We anticipate strong differences by college and disciplinary focus, with emphasis on collaborative work and engagement increasing as RPT guidelines become more current. DISCUSSION/SIGNIFICANCE OF IMPACT: Our experience is that faculty members want to engage in collaborative work when possible and appropriate, but their perception is that independent contributions to their field are more highly valued than interdisciplinary work. As universities rush to endorse and promote interdisciplinary, team-oriented research and teaching, this study will afford a better understanding of the types of activities valued at one large and diverse urban institution, grounded in the actual language of RPT criteria.
Increasingly, products are designed for global markets, yet studies of design practices primarily investigate designers from high-income countries. Specifically, the use of prototypes during design is likely affected by the background of the designer and the environment in which they are designing. To broaden our understanding of the extent to which prototyping best practices are used beyond Western designers, in this study, we conducted interviews with novice designers from Ghana, a middle-income country (MIC), to examine how Ghanaian novice designers (upper-level undergraduate students) used prototypes throughout their design courses. We compared the reported use of prototypes to best practice behaviors and analyzed the types of prototypes used. We found evidence that these Ghanaian novice designers used some critical prototyping best practice behaviors, while other behaviors were underutilized, specifically during the front-end phases of design and for the purpose of engaging with stakeholders. Additionally, virtual models dominated their prototyping choices. We discuss likely reasons for these trends based on participants’ design experiences and design contexts.
Findings as to whether individuals’ experiences of physical maltreatment from their parents in childhood predict their own perpetration of physical maltreatment toward their children in adulthood are mixed. Whether the maltreatment experienced is severe versus moderate or mild may relate to the strength of intergenerational associations. Furthermore, understanding of the roles of possible mediators (intervening mechanisms linking these behaviors) and moderators of the intervening mechanisms (factors associated with stronger or weaker mediated associations) is still relatively limited. These issues were examined in the present study. Mediating mechanisms based on a social learning model included antisocial behavior as assessed by criminal behaviors and substance use (alcohol and drug use), and the extent to which parental angry temperament moderated any indirect effects of antisocial behavior was also examined. To address these issues, data were used from Generations 2 and 3 of a prospective three-generational study, an extension of the Oregon Youth Study. Findings indicated modest intergenerational associations for severe physical maltreatment. There was a significant association of maltreatment history, particularly severe maltreatment, with mothers’ and fathers’ delinquency. However, neither delinquency nor substance use showed significant mediational effects, and parental anger as a moderator of mediation did not reach significance.
Chlamydia trachomatis (CT) infections remain highly prevalent. CT reinfection occurs frequently within months after treatment, likely contributing to the sustained high prevalence of CT infection. A small number of studies have suggested that CT reinfection is associated with a lower organism load, but it is unclear whether CT load at the time of treatment influences CT reinfection risk. In this study, women presenting for treatment of a positive CT screening test were enrolled, treated and asked to return for 3- and 6-month follow-up visits. CT organism loads were quantified at each visit. We evaluated for an association of CT bacterial load at initial infection with reinfection risk and investigated factors influencing the CT load at baseline and follow-up in those with CT reinfection. We found no association of initial CT load with reinfection risk. We found a significant decrease in the median log10 CT load from baseline to follow-up in those with reinfection (5.6 CT/ml vs. 4.5 CT/ml; P = 0.015). Upon stratification of reinfected subjects based on the presence or absence of a history of CT infections prior to their infection at the baseline visit, we found a significant decline in the CT load from baseline to follow-up (5.7 CT/ml vs. 4.3 CT/ml; P = 0.021) exclusively in patients with a history of CT infections prior to our study. Our findings suggest that repeated CT infections may lead to the development of partial immunity against CT.
Intermediate wheatgrass (Thinopyrum intermedium; IWG) is a perennial cereal crop undergoing development for grain production; however, grain yield declines of >75% are often observed after year 2 of the perennial stand and may be linked to soil nutrient depletion. Intercropping IWG with a perennial legume such as alfalfa (Medicago sativa) could benefit nutrient cycling while increasing agroecological diversity. Intermediate wheatgrass was established at five environmentally diverse sites in Minnesota, USA in (1) bi-culture with alfalfa, (2) non-fertilized monoculture and (3) monoculture fertilized annually in the spring with 80 kg N/ha. At northern sites where alfalfa growth was favoured, IWG grain yields were reduced in year 2 by growing IWG in bi-culture with alfalfa, relative to the monoculture systems. Across all sites IWG grain yield decreased by 90% in the non-fertilized monoculture, 80% in the fertilized monoculture and 65% in the bi-culture from year 2 to 4 and plant macronutrient concentrations decreased by 25–70%. In year 4, IWG grain yield was similar or greater in the bi-culture than the fertilized monoculture at three of the five sites and alfalfa biomass was correlated positively with grain yield, harvest index and nutrient uptake in the year 4 bi-culture. Chemical-nitrogen fertilization increased grain yields in year 2 but did not mitigate the decline in yields as stands aged. Intermediate wheatgrass in the bi-culture had similar yields and nutrient uptake and lower yield declines than the chemically fertilized stand at sites where alfalfa growth was maintained throughout the life of the stand.
Identifying factors that influence the functional outcome is an important goal in schizophrenia research. The 22q11.2 deletion syndrome (22q11DS) is a unique genetic model with high risk (20–25%) for schizophrenia. This study aimed to identify potentially targetable domains of neurocognitive functioning associated with functional outcome in adults with 22q11DS.
We used comprehensive neurocognitive test data available for 99 adults with 22q11DS (n = 43 with schizophrenia) and principal component analysis to derive four domains of neurocognition (Verbal Memory, Visual and Logical Memory, Motor Performance, and Executive Performance). We then investigated the association of these neurocognitive domains with adaptive functioning using Vineland Adaptive Behavior Scales data and a linear regression model that accounted for the effects of schizophrenia status and overall intellectual level.
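The derivation of domain scores and their use as regression predictors can be sketched as follows, a minimal NumPy illustration with synthetic data (the number of tests, the random outcomes and the intercept-only covariate set are placeholders, not the study's actual battery or model):

```python
import numpy as np

def pca_scores(X, k=4):
    # Standardize each test, then project onto the top-k principal
    # components to obtain k derived neurocognitive domain scores.
    Xs = (X - X.mean(0)) / X.std(0)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:k].T

rng = np.random.default_rng(0)
X = rng.normal(size=(99, 12))     # 99 adults x 12 test scores (synthetic)
domains = pca_scores(X, k=4)      # four derived domains

# Linear model of adaptive functioning on the domain scores plus an
# intercept; the study's model also adjusted for schizophrenia status
# and overall intellectual level.
y = rng.normal(size=99)           # stand-in for a Vineland composite
design = np.column_stack([np.ones(99), domains])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
```

One convenient property of PCA-derived scores is that the domain columns are mutually orthogonal, so their regression coefficients are not confounded with one another.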
The regression model explained 46.8% of the variance in functional outcome (p < 0.0001). Executive Performance was significantly associated with functional outcome (p = 0.048). Age and schizophrenia were also significant factors. The effects of Executive Performance on functioning did not significantly differ between those with and without psychotic illness.
The findings provide the impetus for further studies to examine the potential of directed (early) interventions targeting Executive Performance to improve long-term adaptive functional outcome in individuals with, or at high risk for, schizophrenia. Moreover, the neurocognitive test profiles may benefit caregivers and clinicians by providing insight into the relative strengths and weaknesses of individuals with 22q11DS, with and without psychotic illness.
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most patients with meningiomas, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes in the QOL literature and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies based on adult patients were considered. QOL tools used in the various studies were analyzed to identify prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools assessed social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared to observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within the social and physical functioning domains. All of these findings must be interpreted with great caution due to substantial clinical heterogeneity, limited generalizability, and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented, and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard, and simple meningioma-specific survey is prospectively developed and validated.
Childhood obesity rates are higher among Indigenous compared with non-Indigenous Australian children. It has been hypothesized that early-life influences beginning with the intrauterine environment predict the development of obesity in the offspring. The aim of this paper was to assess, in 227 mother–child dyads from the Gomeroi gaaynggal cohort, associations between prematurity, Gestation Related-Optimal Weight (GROW) centiles, maternal adiposity (percentage body fat, visceral fat area), maternal non-fasting plasma glucose levels (measured at a mean gestational age of 23.1 weeks) and offspring BMI and adiposity (abdominal circumference, subscapular skinfold thickness) in early childhood (mean age 23.4 months). Maternal non-fasting plasma glucose concentrations were positively associated with infant birth weight (P=0.005) and GROW customized birth weight centiles (P=0.008). There were significant associations of maternal percentage body fat (P=0.02) and visceral fat area (P=0.00) with infant body weight in early childhood. Body mass index (BMI) in early childhood was significantly higher in offspring born preterm compared with those born at term (P=0.03). GROW customized birth weight centiles were significantly associated with body weight (P=0.01), BMI (P=0.007) and abdominal circumference (P=0.039) in early childhood. Our findings suggest that being born preterm, being born large for gestational age, or exposure to an obesogenic intrauterine environment with higher maternal non-fasting plasma glucose concentrations are associated with increased obesity risk in early childhood. Future strategies should aim to reduce the prevalence of overweight/obesity in women of child-bearing age and emphasize the importance of optimal glycemia during pregnancy, particularly in Indigenous women.
Introduction: Pulse check by manual palpation (MP) is an unreliable skill, even in the hands of healthcare professionals. In the context of cardiac arrest, this may translate into inappropriate chest compressions when a pulse is present, or conversely omitting chest compressions when one is absent. To date, no study has assessed the utility of B-mode ultrasound (US) for the detection of a carotid pulse. The primary objective of this study was to assess the time required to detect a carotid pulse in live subjects using US compared to the standard MP method. Methods: This was a prospective randomized controlled cross-over non-inferiority trial. Health care professionals from various backgrounds were invited to participate. They attended a 15-minute focused US workshop on identification of the carotid pulse. Following a washout period, they were randomized to detect a pulse in live subjects either by MP first or by US first. Both pulse check methods were timed for each participant on 2 different subjects. The primary outcome measure was time to carotid pulse detection in seconds. Secondary outcome measures included comfort levels of carotid pulse detection measured on a 100 mm visual analog scale (VAS), and rates of prolonged pulse checks (greater than 5 or 10 seconds) for each technique. Mean pulse detection times were compared using Student's t-test. The study was powered to determine whether US was not slower than MP by greater than 2 seconds. Results: A total of 93 participants completed the study. Time to detect a pulse was 4.2 (SD=3.4) seconds by US compared with 4.7 (SD=6.5) seconds by MP (P=0.43). Seventeen (18%) participants took >5 seconds to identify the carotid pulse using US compared to 19 (20%) by MP (P=0.74). Eight (9%) candidates took >10 seconds to identify the pulse using US compared to 9 (10%) by MP (P=0.81). Prior to training, participants had a higher comfort level using MP than US pulse checks (67 vs 26 mm, P<0.001).
Following the study, participants reported higher comfort levels using US than MP (88 vs 78 mm, P<0.001). Conclusion: Carotid pulse detection in live subjects was not slower using US as compared to MP in this study. A brief teaching session was sufficient to improve confidence of carotid pulse identification even in those with little to no previous US training. The preliminary results from this study provide the groundwork for larger studies to evaluate this pulse check method for patients in actual cardiac arrest.
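The non-inferiority conclusion can be illustrated from the reported summary statistics: US is non-inferior to MP if the upper confidence bound of the mean time difference (US − MP) lies below the 2-second margin. This rough sketch uses a normal approximation and treats the groups as independent, ignoring the paired cross-over design, both simplifying assumptions:

```python
import math

def noninferior(m_us, sd_us, m_mp, sd_mp, n, margin=2.0, z=1.96):
    """True if the upper 95% bound of (mean_US - mean_MP) < margin.
    Independent-samples normal approximation (a simplification of the
    study's paired cross-over design)."""
    diff = m_us - m_mp
    se = math.sqrt(sd_us ** 2 / n + sd_mp ** 2 / n)
    return diff + z * se < margin

# Summary values reported in the abstract:
# 4.2 (SD 3.4) s by US vs 4.7 (SD 6.5) s by MP, n = 93.
result = noninferior(4.2, 3.4, 4.7, 6.5, 93)
```

With these numbers the upper bound of the difference is well under the 2-second margin, consistent with the authors' conclusion that US was not slower than MP.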
Simulation models are used widely in pharmacology, epidemiology and health economics (HE). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD)–dynamic transmission model simulating the impact of pandemic influenza under two scenarios: low virulence and low transmissibility, and high virulence and high transmissibility. The cost-utility analysis was conducted from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) to no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared to no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction in influenza episodes and influenza-related deaths, translating to substantial savings of QALYs. Overall drug costs were offset by the reduction of both direct and indirect costs, making both interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentations at the emergency visit and to patients’ quality of life. Integrating PK/PD–EPI/HE models is achievable. Whilst further refinement of this novel linkage model is needed to more closely mimic reality, the current study has generated useful insights to support influenza pandemic planning.
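The cost-utility comparison rests on the incremental cost-effectiveness ratio (ICER): the extra cost per QALY gained relative to the comparator. A negative ICER alongside a QALY gain is what makes an intervention "cost-saving" (dominant). A minimal sketch with hypothetical numbers, not figures from the study:

```python
def icer(cost_tx, qaly_tx, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: (delta cost) / (delta QALY).
    A negative result alongside a QALY gain means the intervention both
    saves money and improves outcomes (cost-saving / dominant)."""
    d_qaly = qaly_tx - qaly_ref
    if d_qaly <= 0:
        raise ValueError("no QALY gain; ICER is not informative here")
    return (cost_tx - cost_ref) / d_qaly

# Hypothetical: treatment costs 900 and yields 10.2 QALYs per person,
# versus 1000 and 10.0 for no treatment.
value = icer(900.0, 10.2, 1000.0, 10.0)  # negative => cost-saving
```

In practice the payer and societal perspectives differ only in which costs enter the numerator (drug and care costs alone versus those plus productivity losses).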
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12-month mental disorders in 138 801 participants aged 18–100, derived from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity levels. Satisfaction with conventional care was also compared with satisfaction with CAM contacts.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact, a rate two times higher in high-income countries (4.6%; standard error 0.3%) than in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable across disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with increasing mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% of those with severe mood disorders, 16.2% with severe anxiety disorders and 22.5% with severe behavioural disorders. Satisfaction with care was comparable between CAM contacts (78.3%) and conventional care (75.6%) in persons who received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary, and contrast with suggestions that CAM use concerns only persons with mild, transient complaints. There was no indication that persons were less satisfied with CAM visits than with conventional care. We encourage health care professionals in conventional settings to openly discuss with patients the care they are receiving, whether conventional or not, and their reasons for doing so.
Despite evidence linking regular nut consumption with reduced chronic disease risk, population-level intakes remain low. Research suggests nut-promoting advice from doctors facilitates regular nut consumption. However, there is no information on current nut recommendation practices of health professionals. The aim of the present study was to examine the advice provided by health professionals regarding nut consumption.
Design: In this cross-sectional study, participants were invited to complete a survey including questions about their nut recommendation practices.
Setting: New Zealand (NZ).
Subjects: The NZ Electoral Roll was used to identify dietitians, general practitioners and practice nurses.
In total 318 dietitians, 292 general practitioners and 149 practice nurses responded. Dietitians were more likely (82·7 %) to recommend patients increase consumption of nuts than general practitioners (55·5 %) and practice nurses (63·1 %; both P<0·001). The most popular nuts recommended were almonds, Brazil nuts and walnuts, with most health professionals recommending raw nuts. The most common recommendation for frequency of consumption by dietitians and practice nurses was to eat nuts every day, while general practitioners most frequently recommended 2–4 times weekly, although not statistically significantly different between professions. Dietitians recommended a significantly greater amount of nuts (median 30 g/d) than both general practitioners and practice nurses (20 g/d; both P<0·001).
Dietitians were most likely to recommend consumption of nuts in accordance with current guidelines, but there are opportunities to improve the adoption of nut consumption recommendations for all professions. This may be a viable strategy for increasing population-level nut intakes to reduce chronic disease.