To minimise infection risk during the COVID-19 pandemic, the clozapine haematological monitoring interval was temporarily extended from 4-weekly to 12-weekly in the South London and Maudsley NHS Foundation Trust.
To investigate the impact of this temporary policy change on clinical and safety outcomes.
All patients who received clozapine treatment with extended (12-weekly) monitoring in a large London National Health Service trust were included in a 1-year mirror-image study. A comparison group continuing standard (4-weekly) monitoring was selected. The proportion of participants with mild to severe neutropenia and the proportion attending the emergency department for treatment of clozapine-induced severe neutropenia during the follow-up period were compared. Psychiatric hospital admission rates, clozapine dose and concomitant psychotropic medication in the 1 year before and the 1 year after extended monitoring were compared. All-cause clozapine discontinuation at 1-year follow-up was examined.
Of 569 participants, 459 received clozapine with extended monitoring and 110 controls continued as normal. The total person-years were 458 in the intervention group and 109 in the control group, with a median follow-up time of 1 year in both groups. During follow-up, two participants (0.4%) recorded mild to moderate neutropenia in the intervention group and one (0.9%) in the control group. There was no difference in the incidence of haematological events between the two groups (incidence rate ratio, IRR = 0.48, 95% CI 0.02–28.15, P = 0.29). All neutropenia cases in the intervention group were mild, co-occurring with COVID-19 infection. The median number of admissions per patient (0, IQR = 0) was unchanged from the pre-mirror to the post-mirror period. There was one death in the control group, secondary to COVID-19 infection.
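As a quick arithmetic check, the reported IRR point estimate of 0.48 can be reproduced from the stated event counts and person-years. This is a minimal Python sketch; the confidence-interval method used by the authors is not stated in the abstract, so only the point estimate is computed here.

```python
def incidence_rate_ratio(events_a, pyears_a, events_b, pyears_b):
    """Point estimate of the IRR: (events_a / pyears_a) / (events_b / pyears_b)."""
    rate_a = events_a / pyears_a  # incidence rate, intervention group
    rate_b = events_b / pyears_b  # incidence rate, control group
    return rate_a / rate_b

# Intervention group: 2 neutropenia cases over 458 person-years.
# Control group: 1 case over 109 person-years.
irr = incidence_rate_ratio(2, 458, 1, 109)
print(round(irr, 2))  # 0.48, matching the reported IRR
```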
There was no evidence that the incidence of severe neutropenia was increased in those receiving extended monitoring.
Clozapine is licensed for treatment-resistant psychosis and remains underutilised. This may be related to the stringent haematological monitoring requirements that are mandatory in most countries. We aimed to compare guidelines internationally and develop a novel Stringency Index. We hypothesised that the most stringent countries would have increased healthcare costs and reduced prescription rates.
We identified guidelines internationally through a literature review and consultation with clinical academics. We focused on the haematological monitoring parameters, frequency, and thresholds for discontinuation and rechallenge after suspected clozapine-induced neutropenia. In addition, indicators reflecting monitoring guideline stringency were scored and visualised using a choropleth map. We developed a Stringency Index with an international panel of clozapine experts through a modified Delphi survey. The Stringency Index was compared with health expenditure per capita and clozapine prescription rates per 100 000 persons.
One hundred and two countries were included, from Europe (n = 35), Asia (n = 24), Africa (n = 20), South America (n = 11), North America (n = 7) and Oceania and Australia (n = 5). Guidelines differed in frequency of haematological monitoring and discontinuation thresholds. Overall, 5% of included countries had explicit guidelines for clozapine rechallenge and 40% explicitly prohibited clozapine rechallenge. Furthermore, 7% of included countries had modified discontinuation thresholds for benign ethnic neutropenia. None of the guidelines specified how long haematological monitoring should continue. The most stringent guidelines were in Europe, and the least stringent were in Africa and South America. There was a positive association (r = 0.43, p < 0.001) between a country's Stringency Index and its healthcare expenditure per capita.
Recommendations on how haematological function should be monitored in patients treated with clozapine vary considerably between countries. It would be useful to standardise guidelines on haematological monitoring worldwide.
The proximate cause of democratic breakdown in Argentina has invariably been a military coup. In overthrowing civilian governments, however, the armed forces have not acted in a vacuum. Before the 1966 and 1976 coups, military officers made sure that key landowner, business, and labor leaders would support or at least accept military intervention. The importance that the Argentine military places on civilian opinion raises the question of the conditions under which civilian leaders might become more likely to oppose a coup. One promising development would be for these elites to channel their political demands increasingly through political parties. By investing resources in party activity and becoming more habituated to pressing demands through party channels, Argentine socioeconomic elites would gain a larger stake in the survival of the electoral and legislative institutions that parties require to be effective. This article will analyze the relationship between one such elite, the Peronist union leadership, and one of Argentina's main political parties, the Peronist Partido Justicialista (PJ).
We examine the Holocene loess record in the Heye Catchment on the margins of the Tibetan Plateau (TP) and China Loess Plateau (CLP) to determine: the region to which the Heye Catchment climate is more similar; temporal change in wind strength; and modification of the loess record by mass wasting and human activity. Luminescence and radiocarbon dating demonstrate loess deposited in two periods: >11–8.6 ka and <5.1 ka. The 8.6–5.1 ka depositional hiatus, which coincides with the Mid-Holocene Climatic Optimum, is more similar to the loess deposition cessation in the TP than to the loess deposition deceleration in the CLP. Grain-size analysis suggests the Heye loess is a mixture of at least three different grain-size distributions and that it may derive from multiple sources. A greater proportion of coarse sediments in the older loess may indicate stronger winds compared with the more recent depositional period. Gravel incorporated into younger loess most likely comes from bedrock exposed in slump scarps. Human occupation of the catchment, for which the earliest evidence is 3.4 ka, postdates the onset of slumping; thus the slumps may have created a livable environment for humans.
Strikes have important effects on the workers and employers directly involved as well as significant indirect effects on consumers, economic growth, electoral outcomes, policy making, and political stability. Moreover, strikes form patterns that illuminate long-term social processes, the dynamics of economic and political conflict, and the functioning of industrial relations systems. Because strikes are important as both causes and symptoms of social change, an extensive literature has arisen on patterns of strike activity. To date, however, this literature has focused mainly on advanced industrial countries. To prepare the way for analyses of strike patterns in Latin America, more research is needed on the ways in which Latin American strike statistics have been collected and reported. This research note will report the results of such research for Argentina. The first section will assess the quality of strike statistics covering a century of Argentine history. The second section will compare four sources of data on Argentine strikes since the return to democracy in 1983, focusing on particularly useful data published by the Consejo Técnico de Inversiones (CTI) in Buenos Aires. The third section will make the CTI statistics available for research and will employ them to assess some widely held assumptions about labor militancy in Argentina during the period from 1984 to 1993.
Revolutionary Cuba since 1959 has outpaced most other Latin American countries at raising life expectancy and reducing infant mortality. Pre-revolutionary Cuba from 1900 to 1959 did even better, however, outperforming all other Latin American countries for which data are available. Pre-revolutionary Cuba became Latin America's unlikely champion of mortality decline despite experiencing slow economic growth and high income inequality, a record that is inconsistent with the “wealthier is healthier” interpretation of mortality reduction. It also achieved this distinction despite being ruled by governments that are sometimes portrayed as corrupt, personalistic, patronage-ridden, subordinate to U.S. business interests, and neglectful, at best, of the exploited and downtrodden. We attribute pre-revolutionary Cuba's rapid mortality decline to its health care system's accessibility to a large fraction of the poor and to features of the island's history, geography, labor union movement, and political system that contributed to this accessibility.
Ethical thinking is an indispensable component of sound professional practice across all areas of applied psychology. Within it, practitioners seek to take account of both the principles formulated in codes of conduct and the rights of all involved parties. In this chapter we first describe the background to and the fundamental concepts of normative ethics before examining the agreed practical ethical principles that determine standards of work and the processes of ethically aware decision making. We identify the major sources of philosophical thought that have influenced the development of professional codes of practice. Examining the implications of this in a variety of contexts in forensic psychology, we focus on the most frequently encountered moral dilemmas and challenges that arise. They are drawn from the areas of working with clients, professional supervision, and research; and involve issues such as the protection of confidentiality, avoidance of role conflicts, resolution of the sometimes incongruent priorities of individual and public domains, and management of boundaries in professional relationships.
This Element explores the association between political democracy and population health. It reviews the rise of scholarly interest in the association, evaluates alternative indicators of democracy and population health, assesses how particular dimensions of democracy have affected population health, and explores how population health has affected democracy. It finds that democracy - optimally defined as free, fair, inclusive, and decisive elections plus basic rights - is usually, but not invariably, beneficial for population health, even after good governance is taken into account. It argues that research on democracy and population health should take measurement challenges seriously; recognize that many aspects of democracy, not just competitive elections, can affect population health; acknowledge that democracy's impact on population health will be large or small, and beneficial or harmful, depending on circumstances; and identify the relevant circumstances by combining the quantitative analysis of many cases with the qualitative study of a few cases.
While recent research points to the potential benefits of clinical intervention before the first episode of psychosis, the logistical feasibility of this is unclear.
To assess the feasibility of providing a clinical service for people with prodromal symptoms in an inner city area where engagement with mental health services is generally poor.
Following a period of liaison with local agencies to promote the service, referrals were assessed and managed in a primary care setting. Activity of the service was audited over 30 months.
People with prodromal symptoms were referred by a range of community agencies and seen at their local primary care physician practice. Over 30 months, 180 clients were referred; 58 (32.2%) met criteria for an at-risk mental state, most of whom (67.2%) had attenuated psychotic symptoms. Almost 30% were excluded because of current or previous psychotic illness, of whom two-thirds were in their first episode of psychosis. The socio-demographic composition of the 'at risk' group reflected that of the local population, with an over-representation of clients from ethnic minorities.
It is feasible to provide a clinical service for people with prodromal symptoms in a deprived inner city area with a large ethnic minority population.
The comprehensive biopsychosocial approach to violence requires and enables us to ensure the role of social structures is incorporated in explanations of violence and approaches to addressing the problem. Tackling these social formations requires a public health approach which combines many of the interventions we have considered so far but in addition is designed to do so in an integrated programme which coordinates interventions at different levels. There has been significant growth in these integrated interventions over the past twenty years, especially with vulnerable groups in low- and middle-income countries, and we will consider some examples in this chapter. They are integrated in the sense that they often consist of ‘bundles’ of discrete intervention packages (Yount et al., 2017) operating at distinct levels within and between individuals and across whole societies to tackle violent behaviour both directly and indirectly. These programmes often combine primary prevention through broad structural changes with more direct and focussed interventions with high-risk groups. They may aim, for instance, to equip vulnerable women with the skills necessary to assertively set limits when they are exposed to violent behaviour by their partner. At the same time, they may aim to enhance the status of the same women through providing economic resources which empower them more generally in their relationships with men. The same programmes may also be designed to work with the perpetrators to change both their approach to negotiating with their partner in conflict situations and to change their attitudes toward women more generally. So these integrated programmes operate both on violence itself directly and, for example, on gender relations as an indirect factor associated with premature death from violence and from other avoidable causes such as HIV/AIDS.
Effective risk assessment, when it occurs, ideally leads to decisions about effective interventions. The multifactorial nature of violence, with its origins in biological, psychological, and social processes operating separately or together, means that interventions at all of these different levels are available for those who present an identifiable violence risk. The challenge is to identify which intervention, or combination of interventions, is most suitable in each case and to find the necessary resources for successful implementation. The next three chapters will consider interventions at each of the levels and examine some of the evidence currently available with regard to their effectiveness. In public health terms, the medical and psychosocial interventions to be considered, respectively, here and in the next chapter, mainly represent tertiary but also some secondary prevention efforts deployed with at-risk populations, or with relatively high-risk individuals who have already acted violently. In contrast, the integrated interventions considered in Chapter 10 often incorporate a primary prevention element in addition, as part of a truly comprehensive approach to the problem.
Pharmacology clearly has its limitations and dangers as a tool for reducing violence, especially as part of a public health approach which casts its net so widely across many groups who are viewed as more or less challenging for society. Since it is clear from much of the evidence discussed in Part I of this book that many other factors beyond biology contribute to the tendencies some people have toward violence, it is vital to consider what psychological and social options are available with the potential to counter that tendency. These can be much more expensive to implement because of the need for greater human resources, and, however great the leverage, the intervention can ultimately be refused or ignored. In addition, they are often more difficult to test in terms of efficacy for reasons similar to those discussed in the previous chapter. However, it is psychosocial interventions which the WHO approach endorses most strongly in its ‘best buy’ guidance and which therefore need to be examined here in terms of practicality and effectiveness.
Consider the following question. What would be likely to happen if a group of humans was returned to a ‘state of nature’, divested of the elaborate mechanisms of control that we assume restrain our impulses in everyday civilised life? Many people would probably expect that the likeliest result would be a breakdown of order and a resurgence of primeval self-interest, with increasing aggression, possibly spiralling into violence. This is the theme of William Golding’s dystopian novel Lord of the Flies, published in 1954. It is the imagined story of a group of British schoolboys shipwrecked onto a remote oceanic island. While they initially agree a set of rules for communal living, their behaviour soon deteriorates. Factions emerge, and hostilities erupt, followed by ghastly murderous violence. The book became a worldwide best-seller and was later turned into a stage play and several film versions. It was placed on reading lists in schools and colleges in many countries. The story appears to confirm what many people assume: that once the veneer of socialised conduct is stripped away, human beings naturally resort to ‘the law of the jungle’ based on evolutionary survival strategies.
There is general agreement among scientists that biological factors form an essential ingredient of a comprehensive explanation for human violence. As we have already noted, displays of aggression, fighting, and other forms of violent behaviour are easy to find across many species, and it is mostly the case that aggression pays off when survival is at stake, if there are no other means of ensuring it. It is deployed in acquiring and protecting resources, for self-defence and to promote chances of reproductive success. Yet the public health approach in this area often appears to shy away from the issue of biological factors in human violence, especially from the potential role genes might play in a tendency toward such behaviour. One of the WHO’s key statements, the World Report on Violence and Health (Krug et al., 2002), gives only brief consideration to the possible role of genetic biomarkers as risk factors for self-directed violence, but notes that what individuals may inherit is a risk of mental disorder linked to suicide rather than a propensity toward suicide in itself. The subsequent Global Status Report on Violence Prevention (World Health Organization, 2014, p. 27) in contrast acknowledges that ‘violence is a multifaceted problem with biological, psychological, social and environmental roots’. However, neither of these two landmark publications allots any space to detailed consideration of biological processes, and neither mentions genetic factors in relation to violence at all. Both reports depict violence as a problem which encompasses biological as well as the other identified influences. Yet could it be that simultaneously they render genetic influences on human aggression as something of an ‘elephant in the room’? At issue again is the question of whether we consider this strategy to be integral to our biologically evolved makeup, to an extent whereby as humans we are incapable of refraining from it. 
Can we leave that history to one side in explaining and addressing violence, and is the WHO on secure ground in paying minimal attention to biological influences?
So far we have been reviewing evidence for and against the idea that human beings are ‘hard-wired’ for violence. We have been suggesting that there is limited evidence for the strong proposal that acting violently is an integral part of ‘human nature’ and that there is plentiful support for the alternative perspective. At the very least, the charge of biological determinism for human violence remains unproven. To be sure, this is not to discount the plentiful evidence that violence has obviously been a recurrent feature of human life at both individual and societal levels for millennia. It is awareness of this long-standing characteristic that may sustain the seemingly widespread but as yet unjustified assumption that we are inherently motivated toward violence, or that we cannot stop ourselves from engaging in it. However, we must remind ourselves again that the simple occurrence of violent events even if frequent does not on its own confirm or deny the supposition that the cause of them lies in our basic nature, or that such a pattern is inescapable and we are perennially bound to repeat it. All that counting violent events can ever do is support the self-evident position that we have a capacity for violence that in some circumstances is expressed. The extent and manner of that expression shows large variations, between individuals, between social groups, and over time, as a function of developmental, cognitive, situational and historical factors. These variations provide the key to understanding the more likely reasons for human violence and the degree to which environment and culture counterbalance any hypothetical innate drive toward aggression and destruction.