The aim of the present study is to use the syndemic framework to investigate the risk of contracting HIV in the US population. Cross-sectional data are from the National Health and Nutrition Examination Survey (NHANES). We extracted and aggregated data on HIV antibody testing, socio-demographic characteristics, alcohol use, drug use, depression, sexual behaviours and sexually transmitted diseases from the 2009–2010 to 2015–2016 cycles. We carried out weighted regression among young adults (20–39 years) and adults (40–59 years) separately. In total, 5230 men and 5794 women aged 20–59 years were included in the present analyses; 0.8% of men and 0.2% of women tested HIV-positive. Each additional HIV risk behaviour was associated with elevated odds of testing HIV-positive among young adults (1.15, 95% CI 1.15–1.15) and adults (1.61, 95% CI 1.61–1.61). Multi-faceted, community-based interventions are urgently required to reduce the incidence of HIV in the USA.
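As a reminder of how odds ratios and Wald confidence intervals of the kind reported above are computed, here is a minimal sketch from a 2×2 exposure-by-outcome table; the counts are hypothetical and are not the NHANES data:

```python
import math

# Hypothetical 2x2 table: exposure (high risk-behaviour count) vs HIV status.
# These counts are illustrative only; they are NOT the NHANES data.
a, b = 40, 960    # exposed: HIV-positive, HIV-negative
c, d = 10, 990    # unexposed: HIV-positive, HIV-negative

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The actual study estimates come from weighted logistic regression with survey design adjustments, so the arithmetic above is a simplification of the unadjusted case only.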
Understanding the association between diet quality and cardiometabolic risk by education level is important for preventing increased cardiometabolic risk in the Mexican population, especially considering pre-existing disparities in diet quality. The present study examined the cross-sectional association of overall diet quality with cardiometabolic risk, overall and by education level, among Mexican men and women.
Cardiometabolic risk was defined by using biomarkers and diet quality by the Mexican Diet Quality Index. We computed sex-specific multivariable logistic regression models.
Mexican men (n 634) and women (n 875) participating in the Mexican National Health and Nutrition Survey 2012.
We did not find associations of diet quality with cardiometabolic risk factors in the total sample or in men by education level. However, we observed that for each 10-unit increase in the dietary quality score, the odds of diabetes risk in women with no reading/writing skills was 0·47 (95 % CI 0·26, 0·85) relative to the odds in women with ≥10 years of school (referent). Similarly, for each 10-unit increase of the dietary quality score, the odds of having three v. no lipid biomarker level beyond the risk threshold in lower-educated women was 0·27 (95 % CI 0·12, 0·63) relative to the odds in higher-educated women.
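The odds ratios above are expressed per 10-unit increase in the dietary quality score; a per-unit logistic regression coefficient β converts to a per-10-unit odds ratio as exp(10β). A minimal sketch, with a purely illustrative coefficient (not the paper's estimate):

```python
import math

# Hypothetical per-unit logistic regression coefficient for diet quality.
beta_per_unit = -0.0755  # illustrative only; not the paper's estimate

# Per-10-unit odds ratio, matching the abstract's reporting style.
or_per_10 = math.exp(10 * beta_per_unit)
print(round(or_per_10, 2))  # 0.47
```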
Diet quality has a stronger protective association with some cardiometabolic disease risk factors for lower- than higher-educated Mexican women, but no association with cardiometabolic disease risk factors among men. Future research will be needed to understand what diet factors could be influencing the cardiometabolic disease risk disparities in this population.
Objectives: An important question in longitudinal research is whether the individuals who discontinue participation differ in their level of, or their change in, cognitive functioning relative to individuals who return for subsequent occasions. Methods: Performance in five cognitive domains was examined in nearly 5000 participants between 18 and 85 years of age who completed between one and five longitudinal occasions. Results: Little or no differences in cognitive performance were apparent between young adults who did or did not return for subsequent longitudinal occasions. However, among adults above about 45 years of age, returning participants had higher levels of cognitive performance, but approximately similar magnitude of longitudinal change, as participants completing fewer occasions. Conclusions: These results suggest that generalizability of longitudinal comparisons may be restricted to individuals with relatively high levels of cognitive functioning, but that rates of cognitive change are nearly comparable for individuals completing different numbers of longitudinal occasions.
Improving Access to Psychological Therapies (IAPT) services treat most patients in England who present to primary care with major depression. Psychodynamic psychotherapy is one of the psychotherapies offered. Dynamic Interpersonal Therapy (DIT) is a psychodynamic and mentalization-based treatment for depression; sixteen sessions are delivered over approximately 5 months. Neither DIT's effectiveness relative to low-intensity treatment (LIT), nor the feasibility of randomizing patients to psychodynamic or cognitive-behavioural treatment (CBT) in an IAPT setting, has been demonstrated.
A total of 147 patients were randomized in a 3:2:1 ratio to DIT (n = 73), LIT (control intervention; n = 54) or CBT (n = 20) in four IAPT treatment services in a combined superiority and feasibility design. Patients meeting criteria for major depressive disorder were assessed at baseline, mid-treatment (3 months) and post-treatment (6 months) using the Hamilton Rating Scale for Depression (HRSD-17), Beck Depression Inventory-II (BDI-II) and other self-rated questionnaire measures. Patients receiving DIT were also followed up 6 months post-completion.
The DIT arm showed significantly lower HRSD-17 scores at the 6-month primary end-point compared with LIT (d = 0.70). Significantly more DIT patients (51%) showed clinically significant change on the HRSD-17 compared with LIT (9%). The DIT and CBT arms showed equivalence on most outcomes. Results were similar with the BDI-II. DIT showed benefit across a range of secondary outcomes.
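The between-group effect size d = 0.70 reported above is a standardized mean difference. A minimal sketch of Cohen's d computed from group means and a pooled SD, using hypothetical summary statistics (not the trial's actual scores):

```python
import math

# Hypothetical end-point HRSD-17 summary statistics; illustrative only,
# NOT the trial's actual data.
mean_dit, sd_dit, n_dit = 11.0, 6.0, 73
mean_lit, sd_lit, n_lit = 15.2, 6.0, 54

# Pooled SD across the two arms (lower HRSD-17 score = less depression).
pooled_sd = math.sqrt(((n_dit - 1) * sd_dit**2 + (n_lit - 1) * sd_lit**2)
                      / (n_dit + n_lit - 2))
d = (mean_lit - mean_dit) / pooled_sd
print(round(d, 2))  # 0.7
```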
DIT delivered in a primary care setting is superior to LIT and can be appropriately compared with CBT in future RCTs.
Nanoemulsion formulations of vitamin D3 have been shown to have better bioavailability than coarse emulsion preparations in in vitro studies and in vivo animal studies. In the absence of a randomised trial in humans comparing the efficacy of nanotechnology-based micellised vitamin D3 with conventional vitamin D3, we undertook this study. A total of 180 healthy adults were randomised to receive either micellised (DePura, group A) or conventional vitamin D3 (Calcirol, group B) at a monthly dose of 60 000 IU (1500 μg) for 6 months. The outcome parameters were serum 25-hydroxyvitamin D (25(OH)D), parathyroid hormone (PTH), Ca, phosphate, alkaline phosphatase and urinary Ca:creatinine ratio. A total of eighty-nine subjects in group A and seventy-seven in group B completed the trial. Subjects in both groups had a significant increase in their serum 25(OH)D levels following supplementation (group A: 21·5 (sd 10·9) to 76·7 (sd 18·8) nmol/l (P<0·001); group B: 22·8 (sd 10·4) to 57·8 (sd 16·0) nmol/l (P<0·001)). Participants in the micellised group had an additional increase of 20·2 (95 % CI 14·0, 26·4) nmol/l in serum 25(OH)D levels (P<0·001). The difference between the groups was 17·5 (95 % CI 11·8, 23·1) nmol/l, which remained statistically significant (P<0·001) even after adjustment for age and sex. A significant decline in mean serum PTH was observed in both groups. No hypercalcaemia or hypercalciuria was noted. Although supplementation with both preparations resulted in a significant rise in serum 25(OH)D levels, micellised vitamin D3 appeared to be more efficacious in achieving higher levels of serum 25(OH)D.
To verify the previously untested assumption that eating more salad enhances vegetable intake and determine if salad consumption is in fact associated with higher vegetable intake and greater adherence to the Dietary Guidelines for Americans (DGA) recommendations.
Individuals were classified as salad reporters or non-reporters based upon whether they consumed a salad composed primarily of raw vegetables on the intake day. Regression analyses were applied to calculate adjusted estimates of food group intakes and assess the likelihood of meeting Healthy US-Style Food Pattern recommendations by salad reporting status.
Cross-sectional analysis of data collected in 2011–2014 in What We Eat in America, the dietary intake component of the National Health and Nutrition Examination Survey.
US adults (n 9678) aged ≥20 years (excluding pregnant and lactating women).
On the intake day, 23 % of adults ate salad. The proportion of individuals reporting salad varied by sex, age, race, income, education and smoking status (P<0·001). Compared with non-reporters, salad reporters consumed significantly larger quantities of vegetables (total, dark green, red/orange and other), which translated into a two- to threefold greater likelihood of meeting recommendations for these food groups. More modest associations were observed between salad consumption and differences in intake and likelihood of meeting recommendations for protein foods (total and seafood), oils and refined grains.
Study results confirm the DGA message that incorporating more salads in the diet is one effective strategy (among others, such as eating more cooked vegetables) to augment vegetable consumption and adherence to dietary recommendations concerning vegetables.
To profile discretionary food and beverage (DF) consumption among Australian adults.
Cross-sectional analysis. Dietary and sociodemographic data were used to profile DF intake. Prevalence of DF consumption, DF servings (1 serving=600 kJ), nutrient contribution from DF and top DF food groups by self-reported eating occasions were determined. DF consumers (>0 g) were classified according to quartile of DF intake and general linear models adjusted for age and sex were used to determine associations.
2011–12 National Nutrition and Physical Activity Survey (NNPAS).
Adults aged ≥19 years (n 9341) who participated in the NNPAS 2011–12.
Most adults consumed DF (98 %) and over 60 % exceeded 3 DF servings/d, with a mean of 5·0 (se 0·0) DF servings/d. Cakes, muffins, scones and cake-type desserts contributed the most DF energy (8·4 %) of all food groups, followed by wines (8·1 %), pastries (8·0 %) and beers (6·1 %), with all these food groups consumed in large portions (2·3–3·0 DF servings). Lunch and dinner together contributed 45 % of total DF energy intake. High DF consumers averaged 10 DF servings/d; this group contained higher proportions of younger adults, males and those of low socio-economic status, and had lower usual fruit intake and higher mean waist circumference, but not higher BMI.
A focus on DF consumed in large portions at lunch and dinner may help improve interventions aimed at reducing DF intake and addressing negative adiposity-related measures found in high DF consumers.
The present article focuses on the validation of the Questionnaire of Social Representations about the Functions of Deliberate Self-Harm for adults. Understanding the social representations of deliberate self-harm can be relevant for clinical intervention and prevention; however, there is still a lack of instruments to assess these representations. The basis for this instrument was the translation of the Inventory of Statements About Self-Injury. To complement this instrument, we conducted semi-directive interviews with adults without deliberate self-harm and analysed the Portuguese written press. Results from these studies complemented the questionnaire with new items and functions. Study 1 consisted of an exploratory factor analysis with a sample of 462 adults. Results revealed a two-factor structure of interpersonal and intrapersonal dimensions. After item reduction, the factorial analysis of the independent functions was also acceptable. This structure was then corroborated in Study 2 by a confirmatory factor analysis with a new sample of 474 adults, revealing an acceptable model fit. The questionnaire presents a relatively solid structure and shows acceptable psychometric properties, which allow its use in future research.
To delineate trends in intake of protein food types among US adults from 1999 to 2010, we examined the mean intake of beef, pork, lamb or goat, chicken, turkey, fish, dairy, eggs, legumes, and nuts and seeds (grams per kilogram of body weight) among adults overall and by subgroups, including chronic disease status.
Six cycles of the repeated cross-sectional surveys.
National Health and Nutrition Examination Survey 1999 to 2010.
US adults aged ≥20 years (n 29 145, range: 4252–5762 per cycle).
Overall, mean chicken (0·47 to 0·52 g/kg), turkey (0·09 to 0·13 g/kg), fish (0·21 to 0·27 g/kg) and legume (0·21 to 0·26 g/kg) intake increased, whereas dairy decreased (3·56 to 3·22 g/kg) in US adults (P <0·03). Beef, lamb or goat intake did not change in adults or among those with a chronic disease. Over time, beef intake declined less, and lamb or goat intake increased more, for those of lower socio-economic status compared with those of higher socio-economic status.
Despite recommendations to reduce red meat consumption, beef, lamb or goat intake did not change among adults overall, among those with a chronic disease, or among those of lower socio-economic status.
Changes in added sugar intake have been associated with corresponding changes in body weight. Potential mechanisms, particularly the impact of added sugar intake on appetite, warrant exploration. A systematic literature review of randomised controlled trials investigated the association between added sugar consumption and appetite in overweight and obese adults. A systematic search of Medline, Cochrane CENTRAL, Web of Science and CINAHL included studies that examined the relationship between added sugar intake and appetite markers, in comparison with a group with lower added sugar intake. A total of twenty-one articles describing nineteen studies were included in the review. The effect of added sugar on appetite was explored separately by reported comparisons of added sugar type and their effects on three study outcomes: energy consumption (n 20 comparisons); satiety (n 18); and appetite hormones, leptin (n 4) or ghrelin (n 7). Increased added sugar consumption did not impact subsequent energy intake (n 9), nor did it influence satiety (n 12) or ghrelin levels (n 4). Differences in total daily energy intake were comparable with the differences in energy values of the tested products (n 3). Added sugar intake was reported to increase leptin levels (n 3). This review did not find a consistent relationship between added sugar intake and appetite measures, which may be partially explained by variations in study methodologies. There is a need for randomised controlled trials examining a range of added sugar sources and doses on appetite in overweight and obese adults to better understand implications for weight gain.
The present study aimed to identify dietary patterns, compare dietary patterns regarding nutrient profile and investigate the association between dietary patterns and body composition in a population in western Austria.
In a cross-sectional study, eating habits, anthropometric measurements and body composition were assessed. Food intake data were collected via two non-consecutive 24 h recalls. Factor analysis (principal component analysis) with complementary cluster analysis was applied to identify dietary patterns. Associations of dietary patterns with body composition and nutrient profile were examined by the t test, one-way ANOVA and ANCOVA with Bonferroni's correction; the χ2 test was used for categorical variables.
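Deriving dietary patterns by principal component analysis, as described above, amounts to extracting components from a matrix of food-group intakes and inspecting their loadings. A minimal sketch on synthetic data, with invented food groups (not the study's actual variables):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 200

# Synthetic standardized intakes for 4 hypothetical food groups
# (vegetables, whole grains, red meat, sweets); illustrative only.
healthy = rng.normal(size=n)   # latent "health-conscious" tendency
western = rng.normal(size=n)   # latent "western" tendency
X = np.column_stack([
    healthy + 0.3 * rng.normal(size=n),   # vegetables
    healthy + 0.3 * rng.normal(size=n),   # whole grains
    western + 0.3 * rng.normal(size=n),   # red meat
    western + 0.3 * rng.normal(size=n),   # sweets
])

pca = PCA(n_components=2)
scores = pca.fit_transform(X)   # each person's score on each pattern
loadings = pca.components_      # food-group loadings per pattern
print(loadings.shape)           # (2, 4)
```

In practice each retained component is labelled ('health-conscious', 'western', …) by examining which food groups load heavily on it.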
Tyrol, western Austria, 2014–2015.
Adults (n 463) aged 18–64 years.
Three dietary patterns were derived, labelled as the ‘health-conscious’, the ‘western’ and the ‘traditional’ dietary pattern. After adjustment for confounding variables, individuals following the traditional and western patterns were more likely to be overweight/obese (P <0·001) and to have a higher body fat percentage (P <0·05). Individuals following the traditional dietary pattern consumed significantly more SFA and less PUFA and dietary fibre (P <0·001) than those in the other groups.
Individuals who mostly eat in a traditional way should be encouraged to increase their consumption of vegetables, fruits, whole grains and healthy fats. It is important to know local eating habits not only for planning individual nutritional therapy, but also for well-directed public health actions.
Breakfast skipping is regarded as a public health issue among adults worldwide. Nutrition knowledge has been reported to be one of the predictors of dietary behaviour. Therefore, the aim of the present study was to examine the association between nutrition knowledge and breakfast skipping.
Data regarding nutrition knowledge were obtained by using a validated, self-administered general nutrition knowledge questionnaire for Japanese adults (JGNKQ). Participants were classified into three nutrition knowledge level groups according to total JGNKQ score: Low, Middle and High. In addition, participants reported the frequency of meal consumption per week and rated the difficulty in finding time to eat breakfast, lunch and dinner in the lifestyle questionnaire. The differences in frequency of breakfast, lunch and dinner consumption among Low, Middle and High nutrition knowledge groups were determined by using ANCOVA adjusted for potential confounding factors.
Kanto region, Japan.
Japanese adults aged 18–64 years (n 1165, 57·3% women).
Mean age of the participants was 43·8 (sd 8·9) years. No significant differences were found among the three groups in the proportion of respondents reporting difficulty in finding time to eat each meal. However, the frequency of breakfast consumption differed significantly among the Low, Middle and High groups, while lunch and dinner frequencies did not.
The present study suggests that nutrition knowledge level is related to breakfast skipping among Japanese adults.
Many studies of food intake have been performed and published in Sweden, but to our knowledge no studies have extensively explored the beverage consumption of the Swedish adult population. The present study aimed to describe the beverage consumption and the contribution of beverage energy (including alcohol energy) to total energy intake according to gender, region of living, meal type and day for a Swedish adult population.
National dietary survey Riksmaten (2010–2011), collected by the Swedish National Food Agency.
A total of 1682 participants (57 % women) reported dietary intake data during four consecutive days, specified by portion size, meal, time point, day of the week and venue. Meals were categorized as breakfast, lunch, dinner and ‘other’.
The beverage reported to be consumed the most was water (ml/d), followed by coffee. Men had a higher consumption of juice, soft drinks, beer, spirits and low-alcohol beer, while the consumption of tea and water was higher for women. For both genders, milk contributed the most to beverage energy intake. Energy percentage from beverages was higher at lunch and dinner during weekends for both genders. Participants from the largest cities in Sweden had a higher consumption of wine (both genders) and of tea (men only) than participants from other regions.
A considerable part of total energy intake was contributed by beverages, especially for men. Beverages can contribute to a more enjoyable diet, but at the same time provide energy, sugar and alcohol in amounts that do not promote optimal health.
The investigation of personality using the Rorschach method is historically well established; however, its proper use requires continuous study, especially with regard to reliability, validity and normative references. This study's objective was to verify stability indicators of the Rorschach (French approach) through a reassessment (after 15 years) of non-patient adults previously addressed in the normative study by Pasian (1998). A total of 88 adults, aged between 34 and 69 years, of both sexes and with different socio-economic and educational levels, were reassessed in 2013 in the state of São Paulo, Brazil. The responses were independently rated by different judges, with adequate precision. The average results obtained in 1998 and 2013 were analyzed to determine whether the two sets of data differed significantly (Student's t test, p ≤ .05), comparing the following variables: Productivity indices, Apprehension Modes/Location, Formal Quality, Determinants, Contents and Banality. The overall stability level of these variables is considerable (mean r = .28, SD = 0.21). We discuss the theoretical approach of the Rorschach method regarding structural aspects of personality and developmental issues in personality assessment.
Abdominal obesity (AO) is a risk factor for cardiovascular events. We aimed to determine the 6-year incidence of AO and its risk factors among Tehranian adults.
In this population-based cohort study, non-abdominally obese participants, aged ≥20 years, were followed for incidence of AO. Cumulative incidence and incidence rate of AO were calculated for each sex. Cox proportional hazard regression was used to determine the associations of potential risk factors, including age, BMI, dysmetabolic state, smoking, marital status, educational level and physical activity (PA), with incident AO.
A total of 5044 participants (1912 men) were followed for a median of 6 years. Mean age was 37·7 (sd 13·5) years at baseline, with mean BMI of 24·3 (sd 3·1) kg/m2 (men, 23·0 (sd 2·4) kg/m2; women, 25·0 (sd 3·2) kg/m2). During follow-up, 3093 participants (1373 men) developed AO, with total cumulative incidences of 76·02, 83·59 and 70·90 % for the whole population, men and women, respectively. Corresponding incidence rates were 96·0, 138·7 and 77·1 per 1000 person-years. The highest incidence rates were observed among men in their 30s and women in their 50s. Subjects of either sex with a dysmetabolic state, married women, and men with lower PA and higher educational levels at baseline were at higher risk of AO.
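Incidence rates of the kind reported above are events divided by accumulated person-time, scaled to 1000 person-years. A minimal sketch with hypothetical follow-up figures (not the cohort's actual person-years):

```python
# Incidence rate per 1000 person-years: events / person-time * 1000.
# Hypothetical figures; NOT the Tehran cohort's actual follow-up data.
events = 480
person_years = 5000.0

rate = events / person_years * 1000
print(rate)  # 96.0
```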
The incidence of AO is high among Tehranian adults, especially in young men. The risk factors for developing AO should be highlighted to halt this growing trend of AO.
Objectives: Obstructive sleep apnea (OSA) is associated with cognitive impairment, but the relationships between specific biomarkers and neurocognitive domains remain unclear. The present study examined the influence of common health comorbidities on these relationships. Methods: Adults with suspected OSA (N=60; 53% male; M age=52 years; SD=14) underwent neuropsychological evaluation before baseline polysomnography (PSG). Apneic syndrome severity, hypoxic strain, and sleep architecture disturbance were assessed through PSG. Depression (Center for Epidemiological Studies Depression Scale, CESD), pain, and medical comorbidity (Charlson Comorbidity Index) were measured via questionnaires. Processing speed, attention, vigilance, memory, executive functioning, and motor dexterity were evaluated with cognitive testing. A winnowing approach identified 9 potential moderation models, each composed of a correlated PSG variable, a comorbid health factor, and a cognitive performance measure. Results: Regression analyses identified one significant moderation model: average blood oxygen saturation (AVO2) and depression predicting recall memory, accounting for 31% of the performance variance, p<.001. Depression was a significant predictor of recall memory, p<.001, but AVO2 was not. The interaction between depression and AVO2 was significant, accounting for an additional 10% of the variance, p<.001. The relationship between low AVO2 and low recall memory performance emerged when depression severity ratings approached a previously established clinical cutoff score (CESD=16). Conclusions: This study examined the relationships between sleep biomarkers and specific neurocognitive functions among individuals with suspected OSA. Findings revealed that depression burden uniquely influences this pathophysiological relationship, which may aid clinical management. (JINS, 2018, 28, 864–875)
Vitamin D deficiency (VDD) and insufficiency (VDI) are increasing at a global level, and they are associated with increased risk of various diseases. However, little information is available on the prevalence and predictors of VDD and VDI in a representative population of US adults. Serum 25-hydroxyvitamin D (25(OH)D) measurements were collected from 26 010 adults aged ≥18 years from the National Health and Nutrition Examination Survey (NHANES) 2001–2010. Using thresholds recommended by the Endocrine Society, VDD was defined as 25(OH)D<50 nmol/l and VDI as 50≤25(OH)D<75 nmol/l. Weighted multinomial log-binomial regression was conducted to estimate prevalence ratios of VDD and VDI. The prevalences of VDD and VDI in 2001–2010 were 28·9 and 41·4 %, respectively. Adults who were black, less educated, poor, obese, current smokers, physically inactive and infrequent milk consumers had a higher prevalence of VDD. After adjustment for other potential predictors, obese adults showed 3·09 times higher prevalence of VDD and 1·80 times higher prevalence of VDI than non-obese adults. Physically inactive adults had 2·00 and 1·36 times higher prevalence of VDD and VDI than active peers. Compared with frequent consumers, rare consumers of milk had 2·44 and 1·25 times higher prevalence of VDD and VDI, respectively. Current alcohol drinkers had 38 % lower prevalence of VDD than non-drinkers. Awareness of the high prevalence of VDD and VDI among US adults and related predictors could inform behavioural and dietary strategies for preventing VDD and monitoring VDI, especially in old, black, obese and inactive individuals who report rare consumption of milk.
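Prevalence ratios such as those reported above compare the prevalence of an outcome between exposure groups. A minimal sketch from a 2×2 table with hypothetical counts (not the NHANES data):

```python
# Prevalence ratio from a 2x2 table. Hypothetical counts; NOT NHANES data.
a, b = 300, 700   # obese: vitamin-D deficient, not deficient
c, d = 100, 900   # non-obese: deficient, not deficient

# PR = prevalence among exposed / prevalence among unexposed
pr = (a / (a + b)) / (c / (c + d))
print(round(pr, 2))  # 3.0
```

The study's adjusted estimates come from weighted multinomial regression, so this unadjusted ratio is the simplest special case.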
We investigated the predictors of neuraminidase inhibitor (NAI) treatment in severe hospitalised influenza cases and the association between antiviral treatment and mortality. An observational epidemiological study was carried out in Catalonia (Spain) during 2010–2016 in patients aged ⩾18 years. Severe cases of laboratory-confirmed influenza requiring hospitalisation were included. We collected demographic, virological and clinical characteristics. Mixed-effects logistic regression was used to estimate crude and adjusted odds ratios (aOR). We included 1727 hospitalised patients, of whom 1577 (91.3%) received NAI. Receiving NAI ⩽48 h after onset of clinical symptoms (aOR 0.37, 95% confidence interval (CI) 0.22–0.63), ⩽3 days (aOR 0.49, 95% CI 0.30–0.79) or ⩽5 days (aOR 0.50, 95% CI 0.32–0.79) was associated with a reduction in deaths. In patients admitted to the intensive care unit (ICU) (595; 34.5%), treatment ⩽48 h (aOR 0.32, 95% CI 0.14–0.74), ⩽3 days (aOR 0.44, 95% CI 0.20–0.97) or ⩽5 days (aOR 0.45, 95% CI 0.22–0.96) was associated with a reduction in deaths. Receiving treatment >5 days after onset of clinical symptoms was not associated with a reduction in deaths in hospitalised patients or those admitted to the ICU. NAI treatment of hospitalised patients with severe confirmed influenza was effective in preventing death, mainly when administered ⩽48 h after symptom onset, but also when no more than 5 days had elapsed.
Universal salt iodisation (USI) has been successfully implemented in China for more than 15 years. Recent evidence suggests that the definition of ‘adequate iodine’ (100–199 µg/l) be revised to ‘sufficient iodine’ (100–299 µg/l) based on the median urinary iodine concentration (MUI) in school-age children. The objective of this study was to determine the prevalence of thyroid dysfunction in populations after long-term salt iodisation and to examine whether the definition of adequate iodine can be broadened to sufficient iodine based on thyroid function in four population groups. A cross-sectional survey was conducted in six provinces in the northern, central and southern regions of China. Four population groups consisting of 657 children, 755 adults, 347 pregnant women and 348 lactating women were recruited. Three spot urinary samples were collected over a 10-d period and blood samples were collected on the first day. Among the adults, pregnant women and lactating women, the prevalence rates of elevated thyroglobulin antibody and thyroid microsomal antibody levels were 12·4, 8·5 and 7·8 % and 12·1, 9·1 and 9·1 %, respectively. Abnormally high thyroid dysfunction prevalence was not observed after more than 15 years of USI in China; the thyroid dysfunction rates were all <5 %. The recommended range should be cautiously broadened from adequate iodine to sufficient iodine according to the MUI of school-age children, considering the high levels of hormones and antibodies in the other populations. Adults, particularly pregnant women positive for thyroid antibodies, should be closely monitored.
Earlier research states that if an unaccented pronoun refers to the subject of the preceding sentence, a focally accented pronoun will refer to the object. In the current study, we tested whether Norwegian adults select the intended pronoun referent in this context. Our study is also the first to use eye-tracking to investigate children's developing sensitivity to intonational cues in pronoun resolution, and consequently the first in which Norwegian is the object language. The participants were monolingual 3-, 5-, and 7-year-old children, and a group of adults. They listened to the Norwegian version of utterances like ‘Sarai hugged Mariaj. Then shei/SHEj hugged her own teddy bear’, while watching two corresponding figures on a screen. This was followed by the question, in Norwegian, ‘Who hugged her own teddy bear?’ When answering the question, the adults selected the subject referent (Sara) after unaccented pronouns, and the object referent (Maria) after focally accented pronouns. Eye-tracking data revealed that the 7-year-olds initially looked towards the object referent after hearing the pronoun, and then switched to look at the subject referent, regardless of the pronoun's intonation. The 5-year-olds answered the question by selecting the intended referent more often after a focally accented pronoun than after an unaccented one. Finally, the 3-year-olds showed no clear preferences. These results suggest that Norwegian children under the age of seven are still not adult-like when resolving accented and unaccented pronouns.