The National Institute for Health and Care Excellence (NICE) initiated an ambitious effort to develop the first shared decision making guidelines. The purpose of this commentary is to identify three main concerns with the newly published guidelines, pertaining to shared decision making research, practice, implementation and cultural differences in mental health.
Aeolianites and cemented foreshore deposits on South Africa's Cape south coast have the capacity to record and preserve events that transpired on them when they were composed of unconsolidated sand. Thirty-five Pleistocene elephant tracksites have been identified along this coastline. This abundance of sites along what was the margin of the vast Palaeo-Agulhas Plain allows for an appreciation of the forms that elephant tracks and traces can take in the context of the global proboscidean track record. They point to a significant regional elephant presence from Marine Isotope Stage (MIS) 11 (~400 ka) through MIS 5 (~130–80 ka) to MIS 3 (~35 ka) and also indicate repeated use of certain dune areas. They buttress Holocene and historical evidence that elephants made use of open areas in the region, and that the remaining “Knysna elephants” retreated into dense afrotemperate forest for protection in recent centuries. Analogies can be drawn between Pleistocene elephant tracks and Mesozoic dinosaur tracks, and some of the Cape south coast elephant tracks are among the largest Cenozoic (and hence, Quaternary) tracks ever to be described. A newly identified tracksite in this area may provide the first reported evidence of elephant trunk-drag impressions.
To test whether point-of-sale (POS) information about the nutrition content of sugar-sweetened beverages (SSB) promotes healthier drink choices among teenagers, and explore whether POS intervention effects vary based on prior exposure to a sugary drink public health campaign (13 Cancers).
Between-subjects online experiment with three POS signage conditions: no signage (control), sugar content (SC) or Health Star Rating (HSR). Participants viewed their assigned POS sign alone, then alongside a drinks product display, and chose which drink they would buy. Perceptions of various drink products and campaign recall were assessed.
Adolescents aged 13–17 years (n 925) recruited via an online panel.
POS signs did not promote a significant reduction in preference for SSB (cf. control condition). Cognitive and emotional responses to POS signs were strongest for the SC sign, which was rated higher than the HSR sign on various perceived effectiveness measures. Participants who saw the SC sign rated SSB as less healthy (cf. control condition) and were more likely to accurately estimate the number of teaspoons of sugar in soft drink (cf. HSR sign and control conditions). There was no significant interaction between prior exposure to the 13 Cancers campaign and POS signage condition regarding preferences for and perceptions of SSB.
SSB POS interventions may not have the desired effect on adolescents’ drink preferences. Testing SSB POS signs in real-world retail settings is needed to determine whether positive educational impacts extend to promoting healthier drink purchases and reduced SSB consumption among teenagers.
Using numerical simulations, we probe the fluid flow in an axisymmetric peristaltic vessel fitted with elastic bi-leaflet valves. In this biomimetic system that mimics the flow generated in lymphatic vessels, we investigate the effects of the valve and vessel properties on pumping performance of the valved peristaltic vessel. The results indicate that valves significantly increase pumping by reducing backflow. The presence of valves, however, increases the viscous resistance, therefore requiring greater work compared to valveless vessels. The benefit of the valves is the most significant when the fluid is pumped against an adverse pressure gradient and for low vessel contraction wave speeds. We identify the optimum vessel and valve parameters leading to the maximum pumping efficiency. We show that the optimum valve elasticity maximizes the pumping flow rate by allowing the valve to block the backflow more effectively while maintaining low resistance during the forward flow. We also examine the pumping in vessels where the vessel contraction amplitude is a function of the adverse pressure gradient, as found in lymphatic vessels. We find that, in this case, the flow is limited by the work generated by the contracting vessel, suggesting that the pumping in lymphatic vessels is constrained by the performance of the lymphatic muscle. Given the regional heterogeneity of valve morphology observed throughout the lymphatic vasculature, these results provide insight into how these variations might facilitate efficient lymphatic transport in the vessel's local physiologic context.
Many healthcare workers do not seek help, despite their enormous stress and greater risk for anxiety, depression and post-traumatic stress disorder (PTSD).
This study screened for psychopathology and evaluated the efficacy of a brief, social contact-based video intervention in increasing treatment-seeking intentions among healthcare workers (trial registration: NCT04497415). We anticipated finding high rates of psychopathology and greater treatment-seeking intentions post-intervention.
Healthcare workers (n = 350) were randomised to (a) a brief video-based intervention at day 1, coupled with a booster video at day 14; (b) the video at day 1 only; or (c) a non-intervention control. In the 3 min video, a female nurse described difficulty coping with stress, her anxieties and depression, barriers to care and how therapy helped her. Assessments were conducted pre- and post-intervention and at 14- and 30-day follow-ups.
Of the 350 healthcare workers, 281 (80%) reported probable anxiety, depression and/or PTSD. Participants were principally nurses (n = 237; 68%), physicians (n = 52; 15%) and emergency medical technicians (n = 30; 9%). The brief video-based intervention yielded greater increases in treatment-seeking intentions than the control condition, particularly among participants in the repeat-video group. Exploratory analysis revealed a greater effect among nurses than non-nurses in both video groups.
A brief video-based intervention increased treatment-seeking intention, possibly through identification and emotional engagement with the video protagonist. A booster video magnified that effect. This easily disseminated intervention could increase the likelihood of seeking care and offer employers a proactive approach to encourage employees to search for help if needed.
This chapter offers a concluding reflection on the idea of a towering judge, its value, complexity and potential dangers. Drawing on prior chapters and contributions, it suggests that the idea of a towering judge could be understood in more or less objective/subjective, national/international and relative/absolute terms, as well as across different time frames. It notes the value in asking these questions, as well as in studying the jurisprudence of leading judges cross-nationally. At the same time, it suggests several potential dangers associated with a focus on ‘towering judges’. The idea of a towering judge may tend to privilege chief justices over other leading judges, and male over female justices. And it may not always be a good thing for the courts on which a judge serves. The chapter therefore concludes the volume with a note of caution: even while acknowledging the value of studying leading judges, we might ultimately do better to celebrate more collegial, non-dominant forms of judicial leadership.
To evaluate 3 formulations of copper (Cu)-based self-sanitizing surfaces for antimicrobial efficacy and durability over 1 year in inpatient clinical areas and laboratories.
Randomized controlled trial.
We assessed 3 copper formulations: (1) solid alloy 80% Cu–20% Ni (integral copper), (2) spray-on 80% Cu–20% Ni (spray-on) and (3) 16% composite copper-impregnated surface (CIS). In total, 480 coupons (1 cm2) of the 3 products and control surgical grade (AISI 316) stainless steel were inserted into gaskets and affixed to clinical carts used in patient care areas (including emergency and maternity units) and on microbiology laboratory bench work spaces (n = 240). The microbial burden and assessment of resistance to wear, corrosion, and material compatibility were determined every 3 months. The study was conducted at 3 tertiary-care Canadian adult hospitals and 1 pediatric-maternity hospital.
Copper formulations used on inpatient units statistically significantly reduced bacterial bioburden compared to stainless steel at months 3 and 6. Only the integral copper product had significantly fewer bacteria than stainless steel at month 12. No statistically significant differences were detected in microbial burden between copper formulations and stainless-steel coupons on microbiology laboratory benches, where bacterial counts were low overall. All mass changes and corrosion rates of the formulations were acceptable by engineering standards.
Copper surfaces vary in their antimicrobial efficacy after 1 year of hospital use. Frequency of cleaning and disinfection influences the impact of copper; the greatest reduction in microbial bioburden occurred in clinical areas compared to the microbiology laboratory, where cleaning and disinfection were performed multiple times daily.
British coronations from 1761 to 1838 have conventionally been dismissed as tawdry pageants with little religious significance. The study of these ceremonies has also been impeded by the dominance of historiographical frameworks characterizing the later Georgian period as an era of political secularization. Drawing upon many neglected sources, this article challenges such presuppositions by situating the Anglican clergy in the foreground of coronations and exploring the ways in which these events were perceived to retain a religious and political significance. The discussion encompasses theoretical understandings of coronations and the practical tensions between church and state exposed by them.
Young people are among the most severely impacted by conflict, and as such many post-conflict initiatives are aimed at assisting them. Yet the impacts of these initiatives on young people's ability to successfully overcome the adversity they faced during conflict are not fully understood. This paper attempts to examine these impacts by conceptualising post-conflict initiatives as enmeshed within young people's social environments. It argues that post-conflict initiatives are intimately connected to broader processes of exclusion from social systems such as the family. While these systems had previously served to protect young people against adversity, conflict and post-conflict initiatives have disrupted their ability to continue doing so. In particular, the structure and function of the family system are examined to demonstrate the types of disruptions that have taken place and that have ultimately negatively impacted the landscape in which young people develop.
Dr Anna Dixon, Chief Executive at the Centre for Ageing Better, examines the issues around an ageing population, how we have reached this stage and offers potential solutions to the problems it presents. Her book, The Age of Ageing Better? turns the misleading and depressing narrative of burden and massive extra cost of people living longer on its head and shows how our society could thrive if we started thinking differently. She presents a refreshingly optimistic vision for the future that could change the way we value later life in every sense.
To examine associations between diet and risk of developing gastro-oesophageal reflux disease (GERD).
Prospective cohort with a median follow-up of 15·8 years. Baseline diet was measured using a FFQ. GERD was defined as self-reported current or history of daily heartburn or acid regurgitation beginning at least 2 years after baseline. Sex-specific logistic regressions were performed to estimate OR for GERD associated with diet quality scores and intakes of nutrients, food groups and individual foods and beverages. The effect of substituting saturated fat for monounsaturated or polyunsaturated fat on GERD risk was examined.
A cohort of 20 926 participants (62 % women) aged 40–59 years at recruitment between 1990 and 1994.
For men, total fat intake was associated with increased risk of GERD (OR 1·05 per 5 g/d; 95 % CI 1·01, 1·09; P = 0·016), whereas total carbohydrate (OR 0·89 per 30 g/d; 95 % CI 0·82, 0·98; P = 0·010) and starch intakes (OR 0·84 per 30 g/d; 95 % CI 0·75, 0·94; P = 0·005) were associated with reduced risk. Nutrients were not associated with risk for women. For both sexes, substituting saturated fat for polyunsaturated or monounsaturated fat did not change risk. For both sexes, fish, chicken, cruciferous vegetables and carbonated beverages were associated with increased risk, whereas total fruit and citrus were associated with reduced risk. No association was observed with diet quality scores.
Diet is a possible risk factor for GERD, but food considered as triggers of GERD symptoms might not necessarily contribute to disease development. Potential differential associations for men and women warrant further investigation.
This article outlines a cognitive behavioural therapy (CBT) approach to treating feelings of guilt and aims to be a practical ‘how to’ guide for therapists. The therapeutic techniques were developed in the context of working with clients with a diagnosis of post-traumatic stress disorder (PTSD); however, the ideas can also be used when working with clients who do not meet a diagnosis of PTSD but have experienced trauma or adversity and feel guilty. The techniques in this article are therefore widely applicable: to veterans, refugees, survivors of abuse, the bereaved, and healthcare professionals affected by COVID-19, amongst others. We consider how to assess and formulate feelings of guilt and suggest multiple cognitive and imagery strategies which can be used to reduce feelings of guilt. When working with clients with a diagnosis of PTSD, it is important to establish whether the guilt was first experienced during the traumatic event (peri-traumatically) or after the traumatic event (post-traumatically). If the guilt is peri-traumatic, following cognitive work, this new information may then need to be integrated into the traumatic memory during reliving.
Key learning aims
(1) To understand why feelings of guilt may arise following experiences of trauma or adversity.
(2) To be able to assess and formulate feelings of guilt.
(3) To be able to choose an appropriate cognitive technique, based on the reason for the feeling of guilt/responsibility, and work through this with a client.
(4) To be able to use imagery techniques to support cognitive interventions with feelings of guilt.
There is a large treatment gap for common mental disorders in rural areas of low-income countries. We tested the Friendship Bench as a brief psychological intervention delivered by village health workers (VHWs) in rural Zimbabwe.
Rural women identified with depression in a previous trial received weekly home-based problem-solving therapy from VHWs for 6 weeks, and joined a peer-support group. Depression was assessed using the Edinburgh Postnatal Depression Scale (EPDS) and Shona Symptom Questionnaire (SSQ). Acceptability was explored through in-depth interviews and focus group discussions. The proportion of women with depression pre- and post-intervention was compared using McNemar's test.
Ten VHWs delivered problem-solving therapy to 27 women of mean age 33 years; 25 completed six sessions. Women valued an established and trustful relationship with their VHW, which ensured confidentiality and prevented gossip, and reported finding individual problem-solving therapy beneficial. Peer-support meetings provided space to share problems, solutions and skills. The proportion of women with depression or suicidal ideation on the EPDS declined from 68% to 12% [difference 56% (95% confidence interval (CI) 27.0–85.0); p = 0.001], and the proportion scoring high (>7) on the SSQ declined from 52% to 4% [difference 48% (95% CI 24.4–71.6); p < 0.001] after the 6-week intervention.
VHW-delivered problem-solving therapy and peer-support was acceptable and showed promising results in this pilot evaluation, leading to quantitative and qualitative improvements in mental health among rural Zimbabwean women. Scale-up of the Friendship Bench in rural areas would help close the treatment gap for common mental disorders.
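The pre-/post-intervention comparison in this study uses McNemar's test, the standard test for paired binary outcomes (the same women assessed before and after). As a minimal sketch in Python, assuming hypothetical discordant-pair counts (the abstract reports only marginal proportions, so the counts below are illustrative, not the study's data), the exact version of the test is a two-sided binomial tail probability:

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Exact McNemar test p-value for paired binary data.

    b = cases positive pre- but not post-intervention,
    c = cases positive post- but not pre-intervention (the discordant pairs).
    Under H0 the discordant pairs split 50/50, so the p-value is a
    two-sided binomial tail probability at p = 0.5.
    """
    n = b + c
    k = min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) * 0.5 ** n
    return min(p, 1.0)  # the two-sided doubling can exceed 1

# Hypothetical counts loosely consistent with the reported decline
# (68% -> 12% depressed among ~25 women): 15 improved, 1 worsened.
p_value = mcnemar_exact(15, 1)
print(round(p_value, 5))  # ≈ 0.00052
```

Only the discordant pairs carry information here: women whose status did not change contribute nothing to the test, which is why a paired test is more appropriate than comparing the two proportions as if they were independent samples.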
Sub-Saharan Africa (SSA) has the largest care gap for common mental disorders (CMDs) globally, heralding cost-cutting approaches such as task-shifting and digital technologies as viable means of expanding the mental health workforce. This study aims to evaluate the effectiveness of a problem-solving therapy (PST) intervention delivered by community health volunteers (CHVs) through a mobile application called ‘Inuka coaching’ in Kenya.
A pilot prospective cohort study recruited participants from 18 health centres in Kenya. People who self-screened were eligible if they scored 8 or higher on the Self-Reporting Questionnaire-20 (SRQ-20), were aged 18 years or older, conversant in written and spoken English, and familiar with the use of smart mobile devices. The intervention consisted of four PST mobile application chat-sessions delivered by CHVs. CMD measures were administered at baseline, 4-weeks (post-treatment), and at 3-months follow-up assessment.
In all, 80 participants consented to the study, of whom 60 (female, n = 38; male, n = 22) completed their 4-week assessments and 52 completed their 3-month follow-up assessment. The results showed a significant improvement over time on the Self-Reporting Questionnaire-20 (SRQ-20). Higher-range income, not reporting suicidal ideation, being aged over 30 years, and being male were associated with greater CMD symptom reduction.
To our knowledge, this is the first report to pilot a PST intervention delivered by CHVs through a locally developed mobile application in Kenya, and clinically meaningful improvements were found. However, a randomised controlled trial is required to robustly evaluate this intervention.
Understanding risk factors for death from Covid-19 is key to providing good quality clinical care. We assessed the presenting characteristics of the ‘first wave’ of patients with Covid-19 at Royal Oldham Hospital, UK and undertook logistic regression modelling to investigate factors associated with death. Of 470 patients admitted, 169 (36%) died. The median age was 71 years (interquartile range 57–82), and 255 (54.3%) were men. The most common comorbidities were hypertension (n = 218, 46.4%), diabetes (n = 143, 30.4%) and chronic neurological disease (n = 123, 26.1%). The most frequent complications were acute kidney injury (AKI) (n = 157, 33.4%) and myocardial injury (n = 21, 4.5%). Forty-three (9.1%) patients required intubation and ventilation, and 39 (8.3%) received non-invasive ventilation. Independent risk factors for death were increasing age (odds ratio (OR) per 10 year increase above 40 years 1.87, 95% confidence interval (CI) 1.57–2.27), hypertension (OR 1.72, 95% CI 1.10–2.70), cancer (OR 2.20, 95% CI 1.27–3.81), platelets <150 × 103/μl (OR 1.93, 95% CI 1.13–3.30), C-reactive protein ≥100 μg/ml (OR 1.68, 95% CI 1.05–2.68), >50% chest radiograph infiltrates (OR 2.09, 95% CI 1.16–3.77) and AKI (OR 2.60, 95% CI 1.64–4.13). There was no independent association between death and gender, ethnicity, deprivation level, fever, SpO2/FiO2, lymphopoenia or other comorbidities. These findings will inform clinical and shared decision making, including use of respiratory support and therapeutic agents.
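A note on interpreting the odds ratios above: an OR reported "per 10-year increase" multiplies on the odds scale, because log-odds are linear in the covariate in a logistic regression. The 1.87 figure is taken from the abstract; the 30-year comparison below is an illustrative calculation, not a result reported by the study:

```python
from math import exp, log

or_per_10y = 1.87  # reported OR for death per 10-year increase above age 40

# Log-odds are linear in age, so ORs compound multiplicatively:
or_per_30y = or_per_10y ** 3
print(round(or_per_30y, 2))  # 6.54 — a 30-year age gap multiplies the odds ~6.5x

# Equivalently, recover the per-year logistic regression coefficient
# and exponentiate over the 30-year span:
beta_per_year = log(or_per_10y) / 10
assert abs(exp(beta_per_year * 30) - or_per_30y) < 1e-9
```

This is why per-increment ORs near 1 can still imply large differences across a wide covariate range, a point worth keeping in mind when comparing the age effect with the single-factor ORs (e.g. hypertension, cancer) in the same model.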
Late Pleistocene and Early Holocene aeolian deposits in Tasmania are extensive in the present subhumid climate zone but also occur in areas receiving >1000 mm of rain annually. Thermoluminescence, optically stimulated luminescence, and radiocarbon ages indicate that most of the deposits formed during periods of cold climate. Some dunes are remnants of longitudinal desert dunes sourced from now-inundated continental shelves which were previously semi-arid. Others formed near source, often in the form of lunettes east of seasonally-dry lagoons in the previously semi-arid Midlands and southeast of Tasmania, or as accumulations close to floodplains of major rivers, or as sandsheets in exposed areas. Burning of vegetation by the Aboriginal population after 40 ka is likely to have influenced sediment supply. A key site for determining climate variability in southern Tasmania is Maynes Junction which records three periods of aeolian deposition (at ca. 90, 32 and 20 ka), interspersed with periods of hillslope instability. Whether wind speeds were higher than at present during the last glacial period is uncertain, but shells in the Mary Ann Bay sandsheet near Hobart and particle size analysis of the Ainslie dunes in northeast Tasmania suggest stronger winds during the last glacial period than at present.
The current study aimed to assess the nutritional quality of Australian secondary school canteen menus.
Stratified national samples of schools provided canteen menus in 2012–2013 and 2018, which were systematically assessed against a ‘traffic light’ classification system according to the National Healthy School Canteen Guidelines. Items were classified as green (healthiest and recommended to dominate canteen menus), amber (select carefully) or red (low nutritional quality, should not appear on canteen menus), and pricing and promotional strategies were recorded.
Canteen menus from 244 secondary schools (2012–2013 n 148, 2018 n 96).
A total of 21 501 menu items were classified. Forty-nine percent of canteen menus contained at least 50 % green items; however, nearly all (98·5 %) offered at least one red item and therefore did not comply with national recommendations. Snacks and drinks had the least healthy profile of all product sectors, and a large proportion of schools supplied products typically of poor nutritional quality (meat pies and savoury pastries 91·8 %, sugary drinks 89·5 %, sweet baked goods 71·5 %, ice creams 64·1 % and potato chips 44·0 %). Red items were significantly cheaper than green items on average, and many schools promoted the purchase of red items on canteen menus (52·8 %). There were few differences between survey waves.
There is considerable room for improvement in the nutritional quality of canteen menus in Australian secondary schools, including in the availability, pricing and promotion of healthier options. Additional resources and services to support implementation of national guidelines would be beneficial.