Hercules Dome, Antarctica, has long been identified as a prospective deep ice core site due to its undisturbed internal layering, its climatic setting and the potential to obtain proxy records from the Last Interglacial (LIG) period, when the West Antarctic ice sheet may have collapsed. We performed a geophysical survey using multiple ice-penetrating radar systems to identify potential locations for a deep ice core at Hercules Dome. The surface topography, as revealed by recent satellite observations, is more complex than previously recognized. The most prominent dome, which we term ‘West Dome’, is the most promising region for a deep ice core for the following reasons: (1) bed-conformal radar reflections indicate minimal layer disturbance and extend to within tens of meters of the ice bottom; (2) the bed is likely frozen, as evidenced by both the shape of the measured vertical ice velocity profiles beneath the divide and modeled ice temperature using three remotely sensed estimates of geothermal flux; and (3) models of layer thinning place 132 ka old ice at 45–90 m above the bed with an annual layer thickness of ~1 mm, satisfying the resolution and preservation needed for detailed analysis of the LIG period.
Considerable literature has examined the COVID-19 pandemic’s negative mental health sequelae. It is recognised that most people experiencing mental health problems present to primary care, and the development of interventions to support GPs in the care of patients with mental health problems is a priority. This review examines interventions to enhance GP care of mental health disorders, with a view to reviewing how mental health needs might be addressed in the post-COVID-19 era.
Five electronic databases (PubMed, PsycINFO, Cochrane Library, Google Scholar and WHO ‘Global Research on COVID-19’) were searched from May to July 2021 for papers published in English, following Arksey and O’Malley’s six-stage scoping review process.
The initial search identified 148 articles, of which 29 were included in the review. These studies adopted a range of methodologies, most commonly randomised controlled trials, qualitative interviews and surveys. Results from included studies were grouped into themes: interventions to improve identification of mental health disorders; interventions to support GPs; therapeutic interventions; telemedicine interventions; and barriers and facilitators to intervention implementation. Outcome measures reported included the seven-item Generalised Anxiety Disorder Scale (GAD-7), the nine-item Patient Health Questionnaire (PHQ-9) and the Patient Global Impression of Change Scale.
Despite increasing recognition of the mental health sequelae of COVID-19, there is a lack of large-scale trials researching the acceptability or effectiveness of general practice interventions. Furthermore, there is a lack of research regarding possible biological interventions (psychiatric medications) for mental health problems arising from the pandemic.
This secondary analysis examined the influence of changes in physical activity (PA), sedentary time and energy expenditure (EE) during dietary energy restriction on the rate of weight loss (WL) and 1-year follow-up weight change in women with overweight/obesity.
Measurements of body weight and composition (air-displacement plethysmography), resting metabolic rate (indirect calorimetry), total daily (TDEE) and activity EE (AEE), minutes of PA and sedentary time (PA monitor) were taken at baseline, after 2 weeks, after ≥5% WL or 12 weeks of continuous (25% daily energy deficit) or intermittent (75% daily energy deficit alternated with ad libitum day) energy restriction, and at 1-year post-WL. The rate of WL was calculated as total %WL/number of dieting weeks. Data from both groups were combined for analyses.
Thirty-seven participants (age=35±10y; BMI=29.1±2.3 kg/m²) completed the intervention (WL=−5.9±1.6%) and 18 returned at 1-year post-WL (weight change=+4.5±5.2%). Changes in sedentary time at 2 weeks were associated with the rate of WL during energy restriction (r=−0.38; p=0.03). Changes in total (r=0.54; p<0.01), light (r=0.43; p=0.01) and moderate-to-vigorous PA (r=0.55; p<0.01), sedentary time (r=−0.52; p<0.01), steps per day (r=0.39; p=0.02), TDEE (r=0.46; p<0.01) and AEE (r=0.51; p<0.01) during energy restriction were associated with the rate of WL. Changes in total (r=−0.50; p=0.04) and moderate-to-vigorous PA (r=−0.61; p=0.01) between post-WL and follow-up were associated with 1-year weight change (r=−0.51; p=0.04).
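The associations above are simple bivariate correlations between change scores and the rate of weight loss (total %WL divided by dieting weeks). A minimal sketch of that analysis, using simulated data (the variable names and values are illustrative, not the study's dataset):

```python
# Sketch of the correlation analysis described above, on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 37  # number of completers in the study

# Simulated change in sedentary time (min/day) and rate of weight loss,
# where rate of WL = total %WL / number of dieting weeks.
sedentary_change = rng.normal(0, 60, n)
rate_of_wl = -0.01 * sedentary_change + rng.normal(0.5, 0.2, n)

# Pearson correlation with two-sided p-value, as reported in the abstract.
r, p = stats.pearsonr(sedentary_change, rate_of_wl)
print(f"r = {r:.2f}, p = {p:.3f}")
```

With these simulated effect sizes the correlation comes out negative, mirroring the direction reported for sedentary time in the study.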
These findings highlight that PA and sedentary time could act as modifiable behavioural targets to promote better weight outcomes during dietary energy restriction and/or weight maintenance.
Agitated behaviors are frequently encountered in the prehospital setting and require emergent treatment to prevent harm to patients and prehospital personnel. Chemical sedation with ketamine works faster than traditional pharmacologic agents, though it has a higher incidence of adverse events, including intubation. Outcomes following varying initial doses of prehospital intramuscular (IM) ketamine use have been incompletely described.
To determine whether using a lower dose IM ketamine protocol for agitation is associated with more favorable outcomes.
This study was a pre-/post-intervention retrospective chart review of prehospital care reports (PCRs). Adult patients who received chemical sedation in the form of IM ketamine for agitated behaviors were included. Patients were divided into two cohorts based on the standard IM ketamine dose of 4mg/kg and the lower IM dose of 3mg/kg with the option for an additional 1mg/kg if required. Primary outcomes included intubation and hospital admission. Secondary outcomes included emergency department (ED) length of stay, additional chemical or physical restraints, assaults on prehospital or ED employees, and documented adverse events.
The standard dose cohort consisted of 211 patients. The lower dose cohort consisted of 81 patients, 17 of whom received supplemental ketamine administration. Demographics did not significantly differ between the cohorts (mean age 35.14 versus 35.65 years; P = .484; and 67.8% versus 65.4% male; P = .89). Lower dose subjects were administered a lower ketamine dose (mean 3.24mg/kg) compared to the standard dose cohort (mean 3.51mg/kg). There was no statistically significant difference between the cohorts in intubation rate (14.2% versus 18.5%; P = .455), ED length of stay (14.31 versus 14.88 hours; P = .118), need for additional restraint and sedation (P = .787), or admission rate (26.1% versus 25.9%; P = .677). In the lower dose cohort, 41.2% (7/17) of patients who received supplemental ketamine doses were intubated, a higher rate than the patients in this cohort who did not receive supplemental ketamine (8/64, 12.5%; P <.01).
Access to effective, fast-acting chemical sedation is paramount for prehospital providers. No significant outcomes differences existed when a lower dose IM ketamine protocol was implemented for prehospital chemical sedation. Patients who received a second dose of ketamine had a significant increase in intubation rate. A lower dose protocol may be considered for an agitation protocol to limit the amount of medication administered to a population of high-risk patients.
To examine current dietary fat intakes and compliance in Irish children and to examine changes in intakes from 2005 to 2019.
Analyses were based on data from the Irish National Children’s Food Survey (NCFS) and the NSFS II, two cross-sectional studies that collected detailed food and beverage intake data through 7-day and 4-day weighed food diaries, respectively.
NCFS and NCFS II, Republic of Ireland.
A nationally representative sample of 594 (NCFS) and 600 (NCFS II) children aged 5–12 years. Current intakes from the NCFS II were compared with those previously reported in the NCFS (www.iuna.net).
Current intakes of total fat, SFA, MUFA, PUFA and trans fat as a percentage of total energy are 33·3, 14·0, 13·6, 5·6 and 0·5 %, respectively. Total fat, SFA and trans fat intakes have remained largely stable since 2005, all displaying minor decreases of <1 %. Adherence to SFA recommendations remains inadequate, with only 7 % of the population complying. Insufficient compliance with PUFA (71 %) and EPA and DHA (16 %) recommendations was also noted.
Children in Ireland continue to meet the total fat and trans fat target goals. Adherence to MUFA and PUFA recommendations has also significantly improved. However, deviations for some fats remain, in particular SFA. These findings are useful for the development of dietary strategies to improve compliance with current recommendations.
Community forestry has long been regarded as a way to achieve the sustainable management of forest and tree resources while maximizing benefits for those responsible for the custodianship of natural resources. Throughout much of the developing world, forests and the lands they occupy have been increasingly ceded to the management and control of Indigenous peoples and local communities. In the post-conflict environment of Liberia, community forestry has been identified as a means of maximizing the engagement of local communities in forest management initiatives. Liberia’s recent comprehensive National Forestry Policy is an important step forward in this process. The new legislative framework makes it clear that a major reorientation of the forestry sector is required if it is to successfully address the economic challenges facing the country. These challenges concern the need to substantially improve forest governance and to ensure that the forest sector contributes more effectively to the alleviation of poverty and livelihood improvement. While, on paper, the legal framework for community forestry is robust, implementation is falling short due to conflicts over land and resources that have pervaded the Liberian forestry sector for decades. Increased investment in oil palm expansion, artisanal agriculture and broader government-supported logging activities all threaten the implementation of community forestry. Concomitantly, a fundamental lack of capacity at the community level and at the level of the Forestry Department has curtailed early attempts to operationalize community forestry in the country. In this chapter we explore the evolution and development of community forestry in Liberia, and assess prospects for its future implementation. We provide a clear framework of recommendations to address potential constraints to its success.
36% of the over-50s in Ireland are obese based on body mass index (BMI; reflective of fat stored peripherally), while 52% are ‘centrally obese’ based on waist circumference (indicative of fat located viscerally).(1) Visceral fat is thought to be a major site of inflammatory cytokine production and has been linked to other vascular risk factors such as hypertension and diabetes,(2) potentially providing a mechanism for brain atrophy.(3) The aim of the present work was to examine associations between obesity and grey matter (GM)/white matter (WM) perfusion as measured using pseudo-continuous arterial spin labelling (pCASL) MRI.
Materials and Methods
This study was embedded within the Irish Longitudinal Study on Ageing (TILDA), a nationally representative sample of > 8,000 older adults.(4) At wave three, 561 participants underwent brain MRI using a 3T scanner (Achieva, Philips, Netherlands); after exclusions, 484 participants’ data were included in this analysis. Cerebral blood flow (CBF [ml/100g/min]) values were calculated and their associations with BMI and waist-to-hip ratio (WHR) measures modelled using multiple linear regression. We also examined six groups: ‘normal’, ‘overweight’, and ‘obese’ as defined by BMI, each with and without central obesity, as defined by WHR.(5) Models were adjusted for age, sex, smoking, alcohol consumption, physical activity, education, heart disease, hypertension, anti-hypertensive use, and depression.
The mean age was 69 years (± 7.2 years); 52% were female. Higher BMI and WHR were both related to lower GM and WM CBF: BMI per 1 SD (GM: β: −1.451, 95% CI: −2.300 to −0.607, P < 0.001; WM: β: −0.575, 95% CI: −0.939 to −0.210, P = 0.002) and WHR (GM: β: −1.667, 95% CI: −2.856 to −0.477, P = 0.006; WM: β: −0.688, 95% CI: −1.178 to −0.197, P = 0.006). The combination of overall obesity (BMI ≥ 30 kg/m²) and central obesity (WHR > 0.85 [female], > 0.90 [male]) was associated with lower CBF (GM: β: −4.303, 95% CI: −7.015 to −1.591, P = 0.002; WM: β: −2.029, 95% CI: −3.185 to −0.873, P < 0.001) compared to subjects without central obesity (GM: β: −0.959, 95% CI: −6.490 to 4.572, P = 0.733; WM: β: −0.051, 95% CI: −2.060 to 1.958, P = 0.960).
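The β coefficients above come from covariate-adjusted multiple linear regression of CBF on the exposure. A minimal sketch of that model structure, fitted by ordinary least squares on simulated data (all values and variable names are illustrative, not TILDA data, and only two covariates are shown for brevity):

```python
# Sketch of an adjusted linear regression of CBF on a standardised exposure.
import numpy as np

rng = np.random.default_rng(1)
n = 484                                   # analysis sample size in the study

bmi_z = rng.normal(0, 1, n)               # exposure, per 1 SD
age = rng.normal(69, 7.2, n)              # covariate
sex = rng.integers(0, 2, n)               # covariate (0/1)

# Simulated grey-matter CBF with a negative BMI effect, as reported above.
gm_cbf = 50 - 1.45 * bmi_z - 0.1 * (age - 69) + rng.normal(0, 5, n)

# Design matrix: intercept + exposure + covariates; OLS via least squares.
X = np.column_stack([np.ones(n), bmi_z, age, sex])
beta, *_ = np.linalg.lstsq(X, gm_cbf, rcond=None)
print(f"beta for BMI (per 1 SD): {beta[1]:.2f}")
```

The fitted coefficient for the exposure recovers a negative association after adjustment, the same direction as the study's reported β for BMI.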
Our results show that central adiposity is a risk factor for impaired cerebral perfusion independent of BMI. Recent studies have shown that accumulation of fat in this area is a risk factor for cognitive impairment,(6) and this study may partly explain its vascular origins.
Milk is widely recognised as a nutrient dense food, supporting the growth and development of children. Nevertheless, some milk types such as whole milk can contain high levels of saturated fat, which is recognised for its association with chronic disease risk in adults when intakes are elevated. In Ireland, current dietary guidelines recommend that children from two years onwards should consume low fat milk. Previous research has shown low levels of compliance with this guideline. Therefore, the aim of this study was to review the current consumption of milk and non-dairy milk-based alternatives among Irish children and to compare these with previous intakes.
Dietary intakes of ‘whole milk’ decreased over time from 232 ± 186g/d to current intakes of 131 ± 154g/d. In contrast, increases were noted in ‘reduced fat milks’ (26 ± 86g/d to 52 ± 110g/d) and ‘non-dairy alternatives’ (0.2 ± 4g/d to 3 ± 19g/d). A total of 68% of children were classified as consumers of whole milk (193 ± 151g/d) compared to 90% (257 ± 178g/d) previously. ‘Reduced fat milk’ consumers increased from 17% to 31% and ‘non-dairy alternatives’ consumers also increased from < 1% to 3%.
Our preliminary results indicate that the number of Irish children consuming whole milk has decreased over recent years. In contrast, consumers of ‘reduced fat milks’ have significantly increased, indicating potentially improved adherence to healthy eating guidelines. Further analysis to examine current intakes and sources of saturated fat is warranted to establish additional changes in dietary patterns and compliance with recommendations within this age group.
The rupture of atherosclerotic plaques is the prerequisite for adverse cardiovascular events. Calcification morphology plays a critical role in plaque stability; therefore, accurate calcification classification is essential for favourable patient management. Blood biomarkers may be a worthwhile approach to stratify patients based on calcification phenotype. Vitamin K-dependent matrix γ-carboxyglutamate (Gla) protein (MGP) is a potent inhibitor of vascular calcification. Recent studies have demonstrated the potential utility of circulating non-functional MGP (dp-ucMGP) measurements to determine arterial stiffness and calcification levels. The objective of this study was to examine the relationship between circulating dp-ucMGP and calcification phenotype within symptomatic atherosclerotic lesions. Consenting patients undergoing standard endarterectomy procedures were recruited (n = 29). Fasting venous blood was collected preoperatively. Circulating plasma levels of dp-ucMGP were quantified using the inaKtif MGP (dp-ucMGP) iSYS kit. A bicinchoninic acid assay was used to standardise the total protein content present in each sample. High-resolution micro-CT imaging was conducted on the excised atherosclerotic specimens postoperatively. ImageJ post-processing was used to accurately quantify the calcification volume (≥ 130 Hounsfield Units) and determine the total number of calcified particles (3D objects counter plugin). Thirteen carotid (average age 71 years, 9 male) and fourteen peripheral lower limb (average age 65 years, 12 male) patients were examined. One patient had both a carotid and a peripheral lower limb plaque (age 79, male). Peripheral lower limb specimens had larger volumes of calcification and higher numbers of calcified particles than carotid samples (472 ± 310 vs 85 ± 113 mm³, p < 0.0005; 13919 ± 16034 vs 3476 ± 6208, p = 0.061).
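The quantification step described above — thresholding the micro-CT volume at ≥ 130 HU and counting discrete calcified particles — can be sketched in Python with connected-component labelling, which is analogous to (but not the same implementation as) ImageJ's 3D objects counter. The synthetic array and the 50 µm voxel size below are assumptions for illustration only:

```python
# Sketch of HU thresholding plus 3D particle counting on a synthetic volume.
import numpy as np
from scipy import ndimage

volume = np.zeros((20, 20, 20))          # synthetic CT volume in HU
volume[2:5, 2:5, 2:5] = 300              # one small calcified particle
volume[10:14, 10:14, 10:14] = 450        # a second, larger particle

calcified = volume >= 130                # HU threshold used in the study

# Label connected calcified voxels; n_particles is the 3D object count.
labels, n_particles = ndimage.label(calcified)

voxel_volume_mm3 = 0.05 ** 3             # assumed isotropic 50 µm voxels
calc_volume = calcified.sum() * voxel_volume_mm3
print(n_particles, round(calc_volume, 4))
```

A real analysis would load the reconstructed micro-CT stack and use the scanner's actual voxel spacing in place of the assumed value.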
While a higher dp-ucMGP value was noted in carotid than in peripheral lower limb patients (214 ± 52 vs 169 ± 36 pmol/L, p = 0.014), there was no correlation between circulating dp-ucMGP and calcification volume or number of calcified particles (rs = −0.329 and rs = 0.046). Previous research has also found that peripheral lower limb lesions contain higher volumes of calcification than carotid lesions; there are currently no published data comparing calcified particle counts. Patients with symptomatic carotid disease are assumed to have a degree of peripheral arterial disease, which could explain the higher levels of circulating dp-ucMGP in carotid patients. The current study did not examine individuals’ dietary vitamin K intake or analyse other areas of the vasculature for additional calcification; both factors may interfere with dp-ucMGP measurements. This study serves as a preliminary investigation into the potential of dp-ucMGP as a blood-based biomarker to distinguish between symptomatic atherosclerotic calcification phenotypes.
To describe the characteristics of people in Central and Eastern Sydney (CES), NSW, who had a General Practice Management Plan (GPMP) and claimed for at least one private allied health service item, and to examine whether allied health service use results in fewer hospitalisations over a five-year period.
The number of people living with chronic health conditions is increasing in Australia. The Chronic Disease Management programme was introduced to the Medicare Benefits Schedule (MBS) to provide a more structured approach to managing patients with chronic conditions and complex care needs. The programme supports general practitioners claiming up to one GPMP and one Team Care Arrangement every year, and the patient additionally claiming for up to five private allied health services visits.
A prospective longitudinal study was conducted. The sample consisted of 5771 participants in CES who had a GPMP within a two-year health service utilisation baseline period (2007–2009). The analysis used the 45 and Up Study questionnaire data linked to the MBS, hospitalisation, death and emergency department data for the period 2006–2014.
Of the eligible participants, 43% (2460) had at least one allied health service item claim in the subsequent 12 months. Allied health services were reported as physiotherapy, podiatry and other allied health services. The highest rates of allied health service use were among participants aged 85 years and over (49%). After controlling for confounding factors, claiming for five or more physiotherapy services was significantly associated with fewer emergency admissions (HR: 0.83; 95% CI: 0.72–0.95) and potentially preventable hospitalisations (HR: 0.79; 95% CI: 0.64–0.96) in the subsequent five years. Use of allied health service items was well targeted towards those with chronic and complex care needs, and use of physiotherapy services was associated with fewer avoidable hospitalisations.
Scales are widely used in psychiatric assessments following self-harm. Robust evidence for their diagnostic use is lacking.
To evaluate the performance of risk scales (Manchester Self-Harm Rule, ReACT Self-Harm Rule, SAD PERSONS scale, Modified SAD PERSONS scale, Barratt Impulsiveness Scale); and patient and clinician estimates of risk in identifying patients who repeat self-harm within 6 months.
A multisite prospective cohort study was conducted of adults aged 18 years and over referred to liaison psychiatry services following self-harm. Scale a priori cut-offs were evaluated using diagnostic accuracy statistics. The area under the curve (AUC) was used to determine optimal cut-offs and compare global accuracy.
In total, 483 episodes of self-harm were included in the study. The episode-based 6-month repetition rate was 30% (n = 145). Sensitivity ranged from 1% (95% CI 0–5) for the SAD PERSONS scale, to 97% (95% CI 93–99) for the Manchester Self-Harm Rule. Positive predictive values ranged from 13% (95% CI 2–47) for the Modified SAD PERSONS Scale to 47% (95% CI 41–53) for the clinician assessment of risk. The AUC ranged from 0.55 (95% CI 0.50–0.61) for the SAD PERSONS scale to 0.74 (95% CI 0.69–0.79) for the clinician global scale. The remaining scales performed significantly worse than clinician and patient estimates of risk (P < 0.001).
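The diagnostic accuracy statistics reported above follow directly from the 2×2 classification table for a given cut-off. A minimal sketch of sensitivity and positive predictive value, on toy labels rather than the study data:

```python
# Sketch of diagnostic accuracy statistics for a risk-scale cut-off.
import numpy as np

def sensitivity_ppv(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); PPV = TP / (TP + FP)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    return tp / (tp + fn), tp / (tp + fp)

# Toy example: repetition of self-harm within 6 months (1) vs not (0),
# and a scale's classification at some a priori cut-off.
y_true = [1, 1, 1, 0, 0, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1, 0, 0, 1, 0, 1]

sens, ppv = sensitivity_ppv(y_true, y_pred)
print(f"sensitivity = {sens:.2f}, PPV = {ppv:.2f}")  # both 0.80 here
```

The AUC comparisons in the abstract extend this idea by sweeping the cut-off across all scale values rather than fixing it a priori.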
Risk scales following self-harm have limited clinical utility and may waste valuable resources. Most scales performed no better than clinician or patient ratings of risk. Some performed considerably worse. Positive predictive values were modest. In line with national guidelines, risk scales should not be used to determine patient management or predict self-harm.
To describe the symptoms and functional changes in patients with high levels of somatization who were referred to an outpatient, multidisciplinary, shared mental healthcare (SMHC) service that primarily offered cognitive behavioural therapy. Second, we wished to compare the levels of somatization in this outpatient clinical sample with previously published community norms.
Somatization is common in primary care; it can lead to significant impairment and disproportionate resource use, and it poses a challenge for management.
All the patients (18+ years, n=508) who attended three or more treatment sessions in SMHC primary care over a seven-year period were eligible for inclusion to this pre–post study. Self-report measures included the Patient Health Questionnaire’s somatic symptom severity scale (PHQ-15) and the World Health Organization Disability Assessment Schedule (WHODAS II). Normative comparisons were used to assess the degree of symptoms and functional changes.
Clinically significant levels of somatization before treatment were common (n=138, 27.2%) and were associated with a significant reduction in somatic symptom severity (41.3% reduction; P<0.001) and disability (44% reduction; P<0.001) after treatment. Patients’ levels of somatic symptom severity and disability approached but did not quite reach the community sample norms following treatment. Multidisciplinary short-term SMHC was associated with significant improvement in patient symptoms and disability, and shows promise as an effective treatment for patients with high levels of somatization. Including a control group would allow more confidence regarding the conclusions about the effectiveness of SMHC for patients impaired by somatization.
Since 2000 we have been undertaking a detailed restudy of Norbert Elias’s previously lost ‘Adjustment of Young Workers to Work Situations and Adult Roles’ (1962–4) project. This project was important not only because of its links to Norbert Elias or because it was one of the largest studies of school-to-work transition at that time (see Goodwin and O’Connor, 2005a), but also because there are very few ‘classic’ studies from the post-war period that focused on the English East Midlands, a key centre of engineering, textiles, and clothing and footwear manufacture. As part of the restudy we have considered the intersections of work, lifecourse and locality (see Goodwin and O’Connor, 2005a; 2005b; 2006a; 2006b; 2007a; 2007b; O’Connor and Goodwin, 2004; 2010; 2012; 2013a). For example, analysis of the data reveals that the transition from school to work in the 1960s was far more complex than academics and policy makers previously thought. While the local labour market was initially buoyant, and fairly distinct from other local labour markets in terms of the levels of work available in specific sectors, examination of individual lives suggests that quality jobs were hard to obtain and retain. Moreover, the ‘gold standard’ of apprenticeship was not always experienced as the most rigorous or complete approach to training. The lives of these once young workers also reveal how vulnerable workers are to changes in the global economy. For example, individuals interviewed thought that they were entering ‘jobs for life’ and did not foresee the drastic labour market change and transformation that beset the local economy from the late 1970s onwards. Such change had significant impacts on subsequent careers, with very few able to work in the industries for which they had originally trained.
Our research was made possible by a chance rediscovery, in an attic office, of 851 original interview schedules as well as some background documents written by a research team from the 1960s. The 1960s research was funded by the Department of Scientific and Industrial Research, and carried out by the Department of Sociology at the University of Leicester, UK. The original research concentrated on how young people experienced work and adjusted their lives to new work roles in adulthood.
Background: A definitive diagnosis of multiple sclerosis (MS), as distinct from a clinically isolated syndrome, requires one of two conditions: a second clinical attack or particular magnetic resonance imaging (MRI) findings as defined by the McDonald criteria. MRI is also important after a diagnosis is made as a means of monitoring subclinical disease activity. While a standardized protocol for diagnostic and follow-up MRI has been developed by the Consortium of Multiple Sclerosis Centres, acceptance and implementation in Canada have been suboptimal. Methods: To improve diagnosis, monitoring, and management of a clinically isolated syndrome and MS, a Canadian expert panel created consensus recommendations about the appropriate application of the 2010 McDonald criteria in routine practice, strategies to improve adherence to the standardized Consortium of Multiple Sclerosis Centres MRI protocol, and methods for ensuring effective communication among health care practitioners, in particular referring physicians, neurologists, and radiologists. Results: This article presents eight consensus statements developed by the expert panel, along with the rationale underlying the recommendations and commentaries on how to prioritize resource use within the Canadian healthcare system. Conclusions: The expert panel calls on neurologists and radiologists in Canada to incorporate the McDonald criteria, the Consortium of Multiple Sclerosis Centres MRI protocol, and other guidance given in this consensus presentation into their practices. By improving communication and general awareness of best practices for MRI use in MS diagnosis and monitoring, we can improve patient care across Canada by providing timely diagnosis, informed management decisions, and better continuity of care.
Background: Increasingly more attention has been paid to non-pharmacological interventions as treatment of agitated behaviors that accompany dementia. The aim of the current study is to test if personalized one-to-one interaction activities based on Montessori principles will improve agitation, affect, and engagement more than a relevant control condition.
Methods: We conducted a randomized crossover trial in nine residential facilities in metropolitan Melbourne, Australia (n = 44). Personalized one-to-one activities that were delivered using Montessori principles were compared with a non-personalized activity to control for the non-specific benefits of one-to-one interaction. Participants were observed 30 minutes before, during, and after the sessions. The presence or absence of a selected physically non-aggressive behavior was noted in every minute, together with the predominant type of affect and engagement.
Results: Behavior counts fell considerably during both the Montessori and control sessions relative to beforehand. During Montessori activities, the amount of time spent actively engaged was double that during the control condition, and participants also displayed more positive affect and interest. Participants with no fluency in English (all from non-English-speaking backgrounds) showed a significantly larger reduction in agitation during the Montessori sessions than during the control sessions.
Conclusion: Our results show that even non-personalized social contact can assist in settling agitated residents. Tailoring activities to residents’ needs and capabilities elicits more positive interactions and is especially suitable for people who have lost fluency in the language spoken predominantly in their residential facility. Future studies could explore implementation by family members and volunteers to avoid demands on facilities’ resources.
Trial Registration: Australian New Zealand Clinical Trials Registry – ACTRN12609000564257.
During the early 1960s, Norbert Elias led a research project on the adjustment of young workers to work situations and adult roles. The data from this project, which consisted of 851 interviews with young people, were recently rediscovered, and the participants, now approaching retirement, were re-interviewed as part of a restudy. In this paper we argue that, in the context of the dramatic changes to the transition to retirement that have taken place in the United Kingdom, it is possible to use Elias’s unpublished work on the transition to work as a theoretical framework for understanding the transition from work to retirement. In particular, we focus on the themes of fantasy and reality in the perception of retirement; changing interdependencies in the transition to retirement; and the extent and impact of retirement preparation on the perception of the change in status from full-time worker to retiree. We conclude by suggesting that the implied advantages of being the ‘baby-boomer’ generation are far from the reality, with the experiences of this group being similar to those of the generations that went before, facing an adjustment to retirement marked by uncertainty and anxiety.