
James L. A. Webb Jr., The Long Struggle Against Malaria in Tropical Africa (New York: Cambridge University Press, 2014), pp. xxi + 219, paperback, £19.99, ISBN: 1107685109.

Published online by Cambridge University Press:  21 December 2016

Patrick Zylberman
Affiliation: École des Hautes Études en santé publique and Centre Virchow-Villermé, France

Type: Book Review
Copyright: © The Author 2016. Published by Cambridge University Press.

A masterful endeavour, Webb’s comprehensive study of the fight against malaria in tropical Africa is here to stay at the top of the historical literature on infectious diseases in Africa. With anthropological and political dimensions blatantly lacking in the global health community’s strategies,1 historical enquiry into the facts and the rhetoric of these strategies is desperately needed. As Webb remarks, ‘malaria control initiatives will have to be tailored to African realities’, and African realities mean, in particular, political instability, which plays havoc with public health policies across a large part of the continent.

One of the four forms of the human malaria parasite, falciparum became the dominant influence on the nature of African malaria some 40 000–20 000 years ago. A human genetic variation called Duffy negativity, which prevents the parasite from infecting red blood cells, had been a dead end for the spread of the formerly dominant infection throughout Africa, the vivax parasite. Falciparum is the most dangerous form of malarial infection: in the course of an epidemic, it may kill up to 20 non-immune individuals, whereas vivax kills 1 or 2.

Europeans played with the idea that Africans were immune to malaria; only in the course of the early twentieth century would this idea dissipate. From the start, the main strategy was to protect the non-immune Europeans, who made wide use of quinine and mosquito netting. But Koch’s discovery in the 1890s that many people had acquired immunity and could be asymptomatic carriers made a new approach necessary. In 1899, Ronald Ross, travelling to Freetown (Sierra Leone), found that the chief breeding places for anophelines were puddles. Experts came to the conclusion, however, that destroying the mosquitoes’ breeding grounds (larviciding) would be too expensive, as would administering wholesale doses of quinine to the local people. Residential segregation was thus the best option to protect Europeans.

Webb underscores the ‘control-and-lapse instances’ of transmission control strategies, an important issue. Mosquito control, the spraying of the walls and ceilings of human habitations with insecticides, needs to be pursued unremittingly. But government monies rarely supported such a necessity, and intermittent malaria control could thus produce vulnerabilities, as was discovered at the end of the Second World War: success in lowering transmission was accompanied by an increase in the severity of disease among older age cohorts of the African populations. Despite these findings, the risk that local populations would lose their partial immunities was not taken into consideration by policy-makers.

In the first three decades of the twentieth century, the therapeutic approach to malaria met two principal constraints: antimalarial drugs (quinine) were expensive, and treating Africans might compromise their acquired immunity. The British, French and Belgian colonies reserved quinine for Europeans, although small-scale experiments with ‘quininisation’ were conducted among African schoolchildren in Madagascar, Sénégal, Gabon and La Réunion. Treatments and control efforts would become even scantier for the African populations during the global economic depression of the 1930s and the wartime years.

After the Second World War, winds of optimism blew about the powers of synthetic insecticides. The first large-scale use for indoor residual spraying (IRS) occurred in 1945 in Monrovia (Liberia). Success suggested focusing on localised species eradication: had not Brazil, British Guiana, Egypt and Mauritius fought a winning battle against anophelines some time before? IRS would be a much-discussed issue at the first African Malaria Conference, convened in 1950 in Kampala by the WHO. Childhood morbidity and mortality represented ‘costs’ paid to acquire adult immunity, and provided they were low, they might be an acceptable price for fully functional adult immunity. At the Conference, agreement came down in support of intervention in rural zones of endemic malaria, but no agreement was reached about intervention in rural, holoendemic (year-round, heavy transmission) areas. Yet, in 1955, a programme for the global eradication of malaria would be adopted at Geneva. It rested on two premises. First, if transmission could be interrupted, then eradication was possible. (No consideration was given to the problem of latent asymptomatic infections.) Second, the costs of the programme, even if substantial, would involve a single, time-limited disbursement. (No one forecast that the costs of post-eradication surveillance could dwarf initial expenditures.) If 25% of childhood mortality were attributable to malaria, reasons for caution in intervention dissipated. The concern about harm that might be done to adults whose acquired immunity could be compromised was hardly considered. Malariologists might have elected utilitarianism as their professional ethics, but their brand of utilitarianism was certainly not of a consequentialist variety.

Unfortunately, resistance developed rapidly. Every one of the synthetic insecticides, DDT and the others, had a mixed record, and the malaria eradication programmes were closed down in the early 1960s. One lesson learned was that even if malaria eradication in tropical Africa was not feasible, malaria control was, provided funding was sufficient. Yet malariologists never tried to penetrate the African cosmological frameworks that determined people’s understanding of the disease; they never succeeded in building a co-operative relationship with communities. As the author says, ‘the cultural gulf remained fundamentally unbridged’. No thought was given to this as one reason for the lack of success.

The 1960s saw the coming of an inexpensive and highly effective antimalarial drug, chloroquine, another wonder drug. Rural populations soon embraced it. In the villages, distended spleens declined precipitously, as did sickness and death owing to malaria. Chloroquine is credited with reducing infant and early childhood mortality from malaria by 25–35%. But resistance to chloroquine was becoming widespread in South-East Asia, and it was only a matter of a few decades before it would gain a foothold on African soil. With no back-up drug to chloroquine, serious trouble lay ahead.

Parasite resistance to chloroquine appeared in Africa in 1978. WHO then launched various strategies, of which Roll Back Malaria (1998) was the most ambitious. The global north also became more aggressive, with the creation of the Global Fund to Fight AIDS, TB and Malaria (2002) and the President’s Malaria Initiative (G.W. Bush, 2005). In 2007, the Bill and Melinda Gates Foundation announced a new campaign to eradicate malaria. The rationale – the economic rationale – was a repetition of that of the former campaigns of the 1950s and 1960s. A strange scenario emerged. In the 1990s, intermittent preventive therapy (antimalarials) for pregnant women and infants unknowingly rediscovered the empirical findings of Belgian colonial physicians in the 1930s. Pyrethroid resistance, developing in the 2000s and lowering the protection given by insecticide-treated bed nets, underscored the resurgence of susceptibility among those who had lost their partial immunity. These findings were treated as novel, although they were consonant with experiences dating back to the aftermath of the Second World War. The large-scale plantations of Artemisia annua (the source of the new artemisinin-based wonder drug) look like a reprise of the efforts to grow cinchona trees in the 1930s and 1940s. And the whole job of fighting malaria appears to be, as Webb has it, a ‘Sisyphean endeavour’, as if history were motionless and without an endgame.

References

1. This has been stressed recently in the Report of the Ebola Interim Assessment Panel (WHO, 7 July 2015), Executive Summary, 6.