
16 - Taking Failure Seriously

Health Research Regulation for Medical Devices, Technological Risk and Preventing Future Harm

from Section IB - Tools, Processes and Actors

Published online by Cambridge University Press:  09 June 2021

Edited by Graeme Laurie, Edward Dove, Agomoni Ganguli-Mitra, Catriona McMillan, Emily Postan, Nayha Sethi and Annie Sorbie, University of Edinburgh

Summary

Failure in health research regulation is nothing new. Indeed, the regulation of clinical trials was developed in response to the Thalidomide scandal, which occurred some fifty years ago. Yet health research regulation is at the centre of recent failures. In this chapter, I use health research regulation for medical devices to look at the regulatory framing of harm through the language of technological risk, i.e. risk relating to safety. My overall argument is that reliance on this narrow discourse of technological risk in the regulatory framing of harm may marginalise stakeholder knowledges of harm and produce a limited knowledge base. That limited knowledge base may itself underlie harm and, in turn, lead to the construction of failure.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2021
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC 4.0 (https://creativecommons.org/licenses/by-nc/4.0/).

16.1 Introduction

Failure in health research regulation is nothing new. Indeed, the regulation of clinical trials was developed in response to the Thalidomide scandal, which occurred some fifty years ago.Footnote 1 Yet, health research regulation is at the centre of recent failures.Footnote 2 Metal-on-metal hip replacements,Footnote 3 and, more recently, mesh implants for urinary incontinence and pelvic organ prolapse in women – often referred to as ‘vaginal mesh’ – have been the subject of intense controversy.Footnote 4 Some have even called the latter controversy ‘the new Thalidomide’.Footnote 5 In these cases, previously licensed medical devices were used to demonstrate the safety of supposedly analogous new medical devices, and obviate the need for health research involving humans.Footnote 6

In this chapter, I use health research regulation for medical devices to look at the regulatory framing of harm through the language of technological risk, i.e. risk relating to safety. My overall argument is that reliance on this narrow discourse of technological risk in the regulatory framing of harm may marginalise stakeholder knowledges of harm and produce a limited knowledge base. That limited knowledge base may itself underlie harm and, in turn, lead to the construction of failure.

I understand failure itself in terms of this framing of harm.Footnote 7 Failure is taken to be ontologically and normatively distinct from harm, and as implicating the design and functioning of the system or regime itself. Failure is understood as arising when harm is deemed, usually from stakeholder perspectives, to thwart expectations of safety built into technological framings of regulation. Stakeholders include research participants, patients and other interested parties. However, the new force of failure in public discourse and regulation,Footnote 8 apparent in the way it ‘now saturates public life’,Footnote 9 ensures that the language of failure provides a means to integrate stakeholder knowledges of harm with scientific–technical knowledges.

In the next section, I use health research relating to medical devices to reflect on the role of expectations and harm in constructing failure. This sets the scene for the third section, where I outline the roots of failure in the knowledge base for regulation. Subsequently, I explain how the normative power of failure may be used to impel the integration of expert and stakeholder knowledges, improving the knowledge base and, in turn, providing a better basis on which to anticipate and prevent future failures. The chapter thus shows how failure can amount to a ‘failure of foresight’, which may mean it is possible to ‘organise’ failure and the harm it describes out of existence.Footnote 10

16.2 Expectations and Failure in Health Research

Failure has long been understood, principally though not exclusively, in Kurunmäki and Miller’s words, ‘as arising from risk rather than sin’.Footnote 11 Put differently, failure can be understood in principally consequentialist, rather than deontological, terms.Footnote 12 This understanding does not exclude legal conceptualisations of failure in tort law and criminal law, in which the conventional idea of liability is one premised on ‘sin’ or causal contribution.Footnote 13 However, within contemporary society and regulation, such deontological understandings are often overlaid with a consequentialist view of failure.Footnote 14

This is apparent in recent work by Carroll and co-authors. Through their study of material objects and failure, they describe failure as ‘a situation or thing as [sic] not being in accord with expectation’.Footnote 15 According to van Lente and Rip, expectations amount to ‘prospective structures’ that inform ‘statements, brief stories and scenarios’.Footnote 16 It is expectation, rather than anticipation or hope, then, that is central to failure. Unlike expectation, anticipation and hope do not provide a sense of how things ought to be, so much as how they could be or how an individual or group would like them to be.Footnote 17 Indeed, as Bryant and Knight explain: ‘We expect because of what the past has taught us to expect … [Expectation] awakens a sense of how things ought to be, given particular conditions.’Footnote 18

This normative dimension distinguishes expectation from other future-oriented concepts and furnishes ‘a standard for evaluation’, for whether a situation is ‘good or bad, desirable or undesirable’,Footnote 19 and, relatedly, a failure. Indeed, for Appadurai ‘[t]he most important thing about failure is that it is not a fact but a judgment’.Footnote 20 Expectations rely on the past to inform a normative view of some future situation or thing, such as that it will be safe. When, through the application of calculative techniques that determine compliance with the standard for evaluation, this expectation comes to be seen as thwarted, there is a judgment of failure.Footnote 21 Expectations, and hence a key ground for establishing failure, are built into regulatory framingsFootnote 22 and the targets of regulation.Footnote 23

These insights can be applied and developed through the example of health research regulation for medical devices. In this instance, technological risk, i.e. safety, provides the framing for medical devices within the applicable legislation and engenders an expectation of safety.Footnote 24 However, in respect of metal-on-metal hips and vaginal mesh, harm occurred, and the expectation of safety was thwarted downstream once these medical devices were in use.

Harm was consequent, seemingly in large part, on the classification of metal-on-metal hips and vaginal mesh as Class IIb devices. Class IIb devices are medium- to high-risk devices, usually those installed within the body for thirty days or longer. This classification meant that it was possible for manufacturers to rely on substantial equivalence to existing products to demonstrate conformity with general safety and performance requirements. These requirements set expectations for manufacturers and regulators to demonstrate safety, both for the device and for the person in whom it is implanted. Substantial equivalence obviates the need for health research involving humans in the form of a clinical investigation.

It is noted in one BMJ editorial that this route ‘failed to protect patients from substantial harm’.Footnote 25 Heneghan et al. point out that in respect of approvals by the Food and Drug Administration in the USA, which are largely mirrored in the European Union (EU): ‘Transvaginal mesh products for pelvic organ prolapse have been approved on the basis of weak evidence over the last 20 years’.Footnote 26 This study traced the origins of sixty-one surgical mesh implants to just two original devices approved in the USA in 1985 and 1996. The reliance on substantial equivalence meant that safety and performance data came from implants that were already on the market, sometimes for decades, and that were no longer an accurate predicate. In other words, on the basis of past experience – specifically, of ‘substantially equivalent’ medical devices – there was an unrealistic expectation that safety would be ensured through this route, and that further research involving human participants was unnecessary.

Stakeholders reported adverse events including: ‘Pain, impaired mobility, recurrent infections, incontinence/urinary frequency, prolapse, fistula formation, sexual and relationship difficulties, depression, social withdrawal or exclusion/loneliness and lethargy’.Footnote 27 On this basis, stakeholders, including patient groups, demanded regulatory change. Within the EU, new legislation was introduced, largely in response to these events. The specific legislation applicable to the examples considered in this chapter, the Medical Devices Regulation (MDR),Footnote 28 came into force on 26 May 2020 (Article 123(2) MDR).

The legislation reclassifies metal-on-metal hips and vaginal mesh as Class III devices. Class III devices are high-risk, invasive, long-term devices. Future manufacturers of these devices will, in general, have to carry out clinical investigations to demonstrate conformity with regulatory requirements (Recital 63 MDR). The EU’s new legislation devotes a whole chapter to clinical investigations and thus to safety. The legislation is deemed to provide a ‘fundamental revision’ to ‘establish a robust, transparent, predictable and sustainable regulatory framework for medical devices which ensures a high level of safety and health whilst supporting innovation’ (Recital 1 MDR). One interpretation of the legislation is that it is a direct response to problems in health research for medical devices, and intended to provide ‘a better guarantee for the safety of medical devices, and to restore the loss of confidence that followed high profile scandals around widely used hip, breast, and vaginal mesh devices’.Footnote 29

As regards metal-on-metal hips and vaginal mesh, however, there has been little or no suggestion of failure from those formally responsible, who might be held accountable if there were such a suggestion – perhaps especially if it could be said that they had made any plausible causal contribution to the harm. Instead, the example of medical devices demonstrates how the construction of failure does not necessarily hinge on official accounts of harm as amounting to ‘failure’. This is apparent in the various quotations from non-regulators noted above. As Hutter and Lloyd-Bostock put it, these are ‘terms in which events are construed or described in the media or in political discourse or by those involved in the event’. As they continue, what matters is an ‘event’s construction, interpretation and categorisation’.Footnote 30

Failure is an interpretation and judgment of harm. Put differently, ‘failure’ arises through an assessment of harm undertaken through calculative techniques and judgments. Harm becomes refracted through these. At a certain point, the expectations of safety built into the regulatory framing are understood by stakeholders as thwarted, and the harm becomes understood as a failure.Footnote 31 Official discourses are significant, not least because they help to set expectations of safety. But these discourses do not necessarily control stakeholder interpretations and knowledge of harm, or how harm comes to thwart expectations of safety and lead to the construction of failure.Footnote 32

In what follows, I shift attention to the lacunae and blind spots in the knowledge base for the regulation of medical devices, which are made apparent by the harm and failure just described. I outline these missing elements before turning to discuss the significance of failure for improving health research regulation.

16.3 Using Failure to Address the Systemic Causes of Harm

Failure, at its root, emerges from the limited knowledge base for health research regulation: for medical devices, and other areas framed by technological risk, this knowledge base is derived from an archive of past experience and scientific–technical knowledge. The focus on performance (i.e. whether the device performs as designed and intended, in line with a predicate) marginalised attention to effectiveness (i.e. whether the device produces a therapeutic benefit) and patient knowledge on this issue. Moreover, in relation to vaginal mesh implants, female knowledges and lived experiences of the devices implanted within them have tended to be sidelined or even overlooked. The centrality of the male body within research and models of pain, and gender-based presumptions about pain,Footnote 33 help to explain the time taken to recognise a safety problem in respect of these medical devices, and the gaping hole in research and knowledge.

Another part of the explanation for the latter problem is that there was a lengthy delay before embodied knowledge and experiences of pain were reported and recognised – effectively sidelining and ignoring those experiences. New guidance on vaginal mesh in the United Kingdom (UK) has faced criticism on gender-based lines. The guidance cites safety concerns and recommends that vaginal mesh should not be used to treat vaginal prolapse. However, as the UK Parliament’s All Party Parliamentary Group on Surgical Mesh Implants said, the guidelines: ‘disregard mesh-injured women’s experiences by stating that there is no long-term evidence of adverse effects’.Footnote 34

The latter may amount to epistemic injustice, what Fricker describes as a ‘wrong done to someone specifically in their capacity as a knower’.Footnote 35 More than a harm in itself, epistemic injustice may limit stakeholders’ ability to contribute towards regulation, leading to other kinds of harm and failure. This is especially true in the case of health research regulation, where stakeholders may be directly or indirectly harmed by practices and decisions that are grounded on a limited knowledge base. Moreover, even in respect of the EU’s new legislation on medical devices, doubts remain as to whether it will prevent future harms, and thus failures, similar to those mentioned above. Indeed, the only medical devices that are required to evidence therapeutic benefit or efficacy in controlled conditions before marketing are those that incorporate medicinal products.Footnote 36

A deeper explanation for the marginalisation of stakeholder knowledges of harm, and a key underpinning for failure, lies in the organisation of knowledge production. Hurlbut describes how: ‘Framed as epistemic matters – that is, as problems of properly assessing the risks of novel technological constructions – problems of governance become questions for experts’.Footnote 37 This framing constructs a hierarchy of knowledge that privileges credentialised knowledge and expertise, while marginalising those deemed inexpert or ‘lay’. Bioethics plays a key role here. As a field, bioethics tends to focus on technological development within biomedicine and principles of individual ethical conduct or so-called ‘quandary ethics’, rather than systemic issues related to epistemic – or social – justice. Consequently, bioethics often privileges and bolsters scientific–technical knowledge, erases social context and renders ‘social’ elements as little more than ‘epiphenomena’.Footnote 38 In this setting, stakeholder knowledges and forms of expertise relating to harm are, as Foucault explained, ‘disqualified … [as] naïve knowledges, hierarchically inferior knowledges, knowledges that are below the required level of erudition or scientificity’.Footnote 39

The contemporary cultural resonance of the language of failure means that it can be used as a prompt to overcome this marginalisation and improve the knowledge base for regulation. Specifically, the language of failure can be used to generate a risk to organisational standing and reputation. Adverse public perceptions may cast failure as regulatory failure, effectively framing regulators as ‘part of the cause of disasters and crises’.Footnote 40 A perception of regulatory failure thus has key implications for the accountability and legitimacy of regulation and regulators – a perception they are therefore keen to avoid. Relatedly, regulators want to avoid the shaming and blaming that often accompany talk of failure. Blaming can even amplifyFootnote 41 or extend the duration of an institutional risk to standing and reputation. This may produce a crisis for regulation, including for its legitimacy, quite apart from any interpretation and judgment of failure or regulatory failure.

The risk posed by failure to standing and reputation may prompt the integration of stakeholder knowledges with the scientific–technical knowledges that currently underpin regulation. The potential to use failure in this way is already apparent in the examples above, and perhaps especially vaginal mesh. Stakeholders have been largely successful in presenting their knowledges of harm, placing a spotlight on health research regulation and demanding change to prevent future failure.

Despite the limitations within much bioethics scholarship, there is a growing body of approaches to injustice, most recently and notably vulnerability, within which embodied risk and experiential knowledge are central.Footnote 42 These approaches are buttressed by a developing scientific understanding of the significance of environmental factors to genetic predisposition to vulnerability and embodied risk.Footnote 43 Further, within such approaches, the centrality of the human body and experience is foregrounded precisely to recast the objects of bioethical concern. The goal: to prompt a response from the state to fulfil its responsibilities in respect of rights.Footnote 44 In the context of health research, this scholarship can be leveraged to counter the lack of alertness and communicative failures for which institutions and powerful people must take responsibility,Footnote 45 and to expand the knowledges that count in regulation.

There are mechanisms to facilitate the integration of stakeholder knowledges with scientific–technical knowledges and improve health research for medical devices. Further attention to effectiveness (i.e. whether the device produces a therapeutic benefit) could yield important additional data on top of performance (i.e. whether the device performs as designed and intended). Similar to clinical trials for medicines, which produce data to demonstrate safety, quality and efficacy, this would require far more involvement and data from device recipients. Recipient involvement and data could come pre- or post-marketing – or both. Involvement pre-marketing seems both desirable and possible:

The manufacturers’ argument that [randomised controlled trials] are often infeasible and do not represent the gold standard for [medical device] research is clearly refuted. As high-quality evidence is increasingly common for pre-market studies, it is obviously worthwhile to secure these standards through the [Medical Devices Regulation] in Europe and similar regulations in other countries.Footnote 46

One proposed model for long-term implantable devices, such as those discussed in this chapter, involves providing limited access to them through temporary licences that restrict use to within clinical evaluations, with long-term follow-up of at least five years. Wider access could be provided once safety, performance and efficacy have been adequately demonstrated. In addition, wider public access to medical device patient registries, including the EU’s Eudamed database, could be provided so as to ensure transparency, open up public discourse around safety and tackle epistemic injustice.Footnote 47

16.4 Conclusion

In this chapter, I have described how failure is constructed and becomes recognised through processes that determine whether harm has thwarted the expectation of safety built into technological framings of regulation. Laurie is one of the few scholars to illuminate not only how health research regulation transforms its participants into instruments, but also how this may underlie failure:

if we fail to see involvement in health research as an essentially transformative experience, then we blind ourselves to many of the human dimensions of health research. More worryingly, we run the risk of overlooking deeper explanations about why some projects fail and why the entire enterprise continues to operate sub-optimally.Footnote 48

Looking at the organisation of knowledge that supports regulatory framings of medical devices makes clear how the marginalisation of stakeholder knowledge may provide a deeper explanation for harm and failure. Failure can be used to prompt the take-up of stakeholder knowledges of harm in regulation, by recasting regulation or using its mechanisms differently in light of those knowledges, so as to better anticipate and prevent future harm and failure, and enable success. See further on users’ experiences, Harmon, Chapter 39, this volume.

Why, then, has more not been done to ensure epistemic integration as a way to enhance regulatory capacities to anticipate and prevent failure? Epistemic integration would involve bringing stakeholders within regulation via their knowledges. As such, epistemic integration would seem to undermine the dominant position of those deemed expert within extant processes. Knowledge of harm becomes re-problematised: what knowledges from across society are required by regulation in order to ensure its practices are ethical and legitimate? Integration of diverse knowledges might reveal to society at large the limits of current regulation to deal with risk and uncertainty. More deeply, epistemic integration would challenge modernist values on the import of empirically derived knowledge, and the efficacy of society’s technological ‘fixes’ in addressing its problems. However, scientific–technical knowledge and expertise would still be necessary in order to discipline ‘lay’ knowledges and ensure their integration within the epistemic foundations of decision-making. To resist epistemic integration is, therefore, essentially to bolster extant power relations. As the analysis in this chapter suggests, these relations are actually antithetical to addressing failure and maintaining the protections that are central to ethical and legitimate health research and regulation more generally.

Footnotes

* Many thanks to all those with whom I have discussed the ideas set out in this chapter, especially the editors and Ivanka Antova, Richard Ashcroft, Daithi Mac Sithigh, Katharina Paul and Barbara Prainsack. The discussion in this chapter is developed further in: Mark L Flear, ‘Epistemic Injustice as a Basis for Failure? Health Research Regulation, Technological Risk and the Epistemic Foundations of Harm and Its Prevention’, (2019) European Journal of Risk Regulation, 10(4), 693–721.

1 In the United Kingdom, the scandal resulted in the Medicines Act 1968 and its related licensing authority. See E. Jackson, Law and the Regulation of Medicines (London: Hart Publishing, 2012), pp. 4–5.

2 Relatedly, see S. Macleod and S. Chakraborty, Pharmaceutical and Medical Device Safety (London: Hart Publishing, 2019).

3 C. Heneghan et al., ‘Ongoing Problems with Metal-On-Metal Hip Implants’, (2012) BMJ, 344(7846), 23–24.

4 See the articles comprising ‘The Implant Files’, (The Guardian), www.theguardian.com/society/series/the-implant-files.

5 H. Marsden, ‘Vaginal Mesh to Treat Organ Prolapse Should Be Suspended, Says UK Health Watchdog’, (The Independent, 15 December 2017).

6 The famous Poly Implant Prothèse silicone breast implants scandal concerned fraud rather than the kinds of problems with health research regulation discussed in this chapter – see generally C. Greco, ‘The Poly Implant Prothèse Breast Prostheses Scandal: Embodied Risk and Social Suffering’, (2015) Social Science and Medicine, 147, 150–157; M. Latham, ‘“If It Ain’t Broke Don’t Fix It”: Scandals, Risk and Cosmetic Surgery’, (2014) Medical Law Review, 22(3), 384–408.

7 This may extend beyond physical harm to social harm, environmental harm ‘and so on’ – see R. Brownsword, Rights, Regulation and the Technological Revolution (Oxford University Press, 2008), p. 119. Also see pp. 102–105.

8 For definition of ‘regulation’ see the Introduction to this volume.

9 L. Kurunmäki and P. Miller, ‘Calculating Failure: The Making of a Calculative Infrastructure for Forgiving and Forecasting Failure’, (2013) Business History, 55(7), 1100–1118, 1100. Emphasis added. More broadly, for comment on the ‘stream of failures’ since the 1990s, see M. Power, Organised Uncertainty (Oxford University Press, 2007), p. 5.

10 B. Turner, Man-Made Disasters (Wykeham 1978). For application to organisations, see B. Hutter and M. Power (eds), Organisational Encounters with Risk (Cambridge University Press, 2005), p. 1. Some failures are ‘normal accidents’ and cannot be organised out of existence – see C. Perrow, Normal Accidents: Living with High-Risk Technologies (New York: Basic Books, 1984).

11 Kurunmäki and Miller, ‘Calculating Failure’, 1101. Emphasis added.

12 For discussion, see R. Brownsword and M. Goodwin, Law and the Technologies of the Twenty-First Century: Text and Materials (Cambridge University Press, 2012), p. 208.

13 Indeed, Poly Implant Prothèse silicone breast implants and vaginal mesh have been the subject of litigation – for discussion of each see Macleod and Chakraborty, Pharmaceutical and Medical Device Safety, pp. 232–234 and pp. 259–263, respectively. For a recent case on vaginal mesh involving a class action against members of the Johnson & Johnson group in which the court found in favour of the claimants, see Gill v. Ethicon Sarl (No. 5) [2019] FCA 1905.

14 A. Appadurai, ‘“Introduction” to Special Issue on “Failure”’, (2016) Social Research, 83(3), xx–xxvii.

15 T. Carroll et al., ‘Introduction: Towards a General Theory of Failure’ in T. Carroll et al. (eds), The Material Culture of Failure: When Things Go Wrong (Bloomsbury, 2018), pp. 1–20, p. 15. Emphasis added.

16 H. van Lente and A. Rip, ‘Expectations in Technological Developments: An Example of Prospective Structures to be Filled in by Agency’ in C. Disco and B. van der Meulen (eds), Getting New Technologies Together: Studies in Making Sociotechnical Order (Berlin: De Gruyter, 1998), p. 205.

17 R. Bryant and D. Knight, The Anthropology of the Future (Cambridge University Press, 2019), p. 28 for anticipation and p. 134 for hope.

18 Ibid., p. 58. Emphasis added.

19 Ibid., p. 63.

20 Appadurai, ‘Introduction’, p. xxi. Emphasis added. Also see A. Appadurai, Banking on Words: The Failure of Language in the Age of Derivative Finance (University of Chicago Press, 2016).

21 Beckert lists past experience among the social influences on expectations – see J. Beckert, Imagined Futures: Fictional Expectations and Capitalist Dynamics (Cambridge, MA: Harvard University Press, 2016), p. 91.

22 Brownsword, Rights, Regulation and the Technological Revolution; K. Yeung, ‘Towards an Understanding of Regulation by Design’ in R. Brownsword and K. Yeung (eds), Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes (London: Hart Publishing, 2008), pp. 79–107.

23 T. Dant, Materiality and Society (Open University Press, 2005); D. MacKenzie and J. Wajcman (eds), The Social Shaping of Technology, 2nd Edition (Buckingham: Open University Press, 1999); L. Winner, ‘Do Artefacts Have Politics?’, (1980) Daedalus, 109(1), 121–136.

24 Medical devices are defined by their intended function, as determined by the manufacturer, for medical purposes – see Article 2(1) of the Medical Devices Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) No. 178/2002 and Regulation (EC) No. 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC OJ 2017 L 117/1. On the classification of medical devices, see Point 1.3, Annex VIII.

25 C. Allan et al., ‘Europe’s New Device Regulations Fail to Protect the Public’, (2018) BMJ, 363, k4205, 1.

26 Carl J. Heneghan et al., ‘Trials of Transvaginal Mesh Devices for Pelvic Organ Prolapse: A Systematic Database Review of the US FDA Approval Process’, (2017) BMJ Open, 7(12), e017125, 1. Emphasis added.

27 Macleod and Chakraborty, Pharmaceutical and Medical Device Safety, p. 238.

28 Medical Devices Regulation (EU) 2017/745. Implementation of this legislation is left to national competent authorities.

29 Allan et al., ‘Europe’s New Device Regulations’, 1. Emphasis added.

30 B. Hutter and S. Lloyd-Bostock, Regulatory Crisis: Negotiating the Consequences of Risk, Disasters and Crises (Cambridge University Press, 2017), p. 3. On understandings of failure, see S. Firestein, Failure: Why Science Is So Successful (Oxford University Press, 2016), pp. 8–9.

31 Kurunmäki and Miller, ‘Calculating Failure’, 1101. Cf I. Hacking, Historical Ontology (Cambridge, MA: Harvard University Press, 2002) – applied in e.g. B. Allen, ‘Foucault’s Nominalism’ in S. Tremain (ed.), Foucault and the Government of Disability (University of Michigan Press, 2018); D. Haraway, The Haraway Reader (New York: Routledge, 2004); D. Roberts, ‘The Social Immorality of Health in the Gene Age: Race, Disability and Inequality’ in J. Metzl and A. Kirkland (eds), Against Health (New York University Press, 2010), pp. 61–71.

32 Kurunmäki and Miller, ‘Calculating Failure’, 1101. Cf Hutter and Lloyd-Bostock, Regulatory Crisis, pp. 9–18 and pp. 19–21 for framing and routines.

33 See, for example, R. Hurley and M. Adams, ‘Sex, Gender and Pain: An Overview of a Complex Field’, (2008) Anesthesia & Analgesia, 107(1), 309–317. Also see M. Fox and T. Murphy, ‘The Body, Bodies, Embodiment: Feminist Legal Engagement with Health’ in M. Davies and V. E. Munro (eds), The Ashgate Research Companion to Feminist Legal Theory (London: Ashgate, 2013), pp. 249–265.

34 National Institute for Health and Care Excellence (NICE), ‘Urinary Incontinence and Pelvic Organ Prolapse in Women: Management, NICE Guideline [NG123]’, (NICE, 2019). This guidance was issued in response to the NHS England Mesh Working Group – see ‘Mesh Oversight Group Report’, (NHS England, 2017). Also see ‘Mesh Working Group’, (NHS), www.england.nhs.uk/mesh/. For criticism, see H. Pike, ‘NICE Guidance Overlooks Serious Risks of Mesh Surgery’, (2019) BMJ, 365, l1537.

35 M. Fricker, Epistemic Injustice: Power and the Ethics of Knowing (Oxford University Press, 2007), p. 1. Emphasis added. Also see I. J. Kidd and H. Carel, ‘Epistemic Injustice and Illness’, (2017) Journal of Applied Philosophy, 34(2), 172–190.

36 For discussion, see C. J. Heneghan et al., ‘Transvaginal Mesh Failure: Lessons for Regulation of Implantable Devices’, (2017) BMJ, 359, j5515.

37 J. B. Hurlbut, ‘Remembering the Future: Science, Law, and the Legacy of Asilomar’ in S. Jasanoff and S. Kim, Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power (University of Chicago Press, 2015), p. 129. Original emphasis.

38 On ‘quandary ethics’, see P. Farmer, Pathologies of Power: Health, Human Rights, and the New War on the Poor (University of California, 2003), pp. 204–205. Also see D. Callahan, ‘The Social Sciences and the Task of Bioethics’, (1999) Daedalus, 128(4), 275–294, 276. On bioethics and social context, see J. Garrett, ‘Two Agendas for Bioethics: Critique and Integration’, (2015) Bioethics, 29(6), 440–447; A. Hedgecoe, ‘Critical Bioethics: Beyond the Social Science Critique of Applied Ethics’, (2004) Bioethics, 18(2), 120–143, 125. Also see B. Hoffmaster (ed.), Bioethics in Social Context (Philadelphia: Temple University Press, 2001).

39 M. Foucault, Society Must Be Defended (London: Penguin Books, 2004), p. 7.

40 Hutter and Lloyd-Bostock, Regulatory Crisis, p. 8. Emphasis added. For discussion, see M. Lodge, ‘The Wrong Type of Regulation? Regulatory Failure and the Railways in Britain and Germany’, (2002) Journal of Public Policy, 22(3), 271–297; R. Schwartz and A. McConnell, ‘Do Crises Help Remedy Regulatory Failure? A Comparative Study of the Walkerton Water and Jerusalem Banquet Hall Disasters’, (2009) Canadian Public Administration, 52(1), 91–112.

41 For discussion, see A. Boin et al. (eds), The Politics of Crisis Management: Public Leadership Under Pressure (Cambridge University Press, 2005); C. Hood, The Blame Game: Spin, Bureaucracy, and Self-Preservation in Government (Princeton University Press, 2011); N. Pidgeon et al., The Social Amplification of Risk (Cambridge University Press, 2003).

42 M. Fineman, ‘The Vulnerable Subject and the Responsive State’, (2010) Emory Law Journal, 60(2), 251–275. Also see work on: precarity (J. Butler, Precarious Life: The Power of Mourning and Violence (London: Verso, 2005)); the capabilities approach (M. Nussbaum, Creating Capabilities (Cambridge, MA: Harvard University Press, 2011); A. Sen, ‘Equality of What?’ in S. McMurrin (ed.), Tanner Lectures on Human Values, Volume 1 (Cambridge University Press, 1980), pp. 195–220); and a feminist approach to flesh (C. Beasley and C. Bacchi, ‘Envisaging a New Politics for an Ethical Future: Beyond Trust, Care and Generosity – Towards an Ethic of Social Flesh’, (2007) Feminist Theory, 8(3), 279–298).

43 This includes understanding in epigenetics and neuroscience – see N. Rose and J. Abi-Rached, Neuro: The New Brain Sciences and the Management of the Mind (Princeton University Press, 2013); D. Wastell and S. White, Blinded by Science: The Social Implications of Epigenetics and Neuroscience (Bristol: Policy Press, 2017).

44 Most notably, see Fineman, ‘The Vulnerable Subject’. For application to bioethics, see M. Thomson, ‘Bioethics & Vulnerability: Recasting the Objects of Ethical Concern’, (2018) Emory Law Journal, 67(6), 1207–1233.

45 For discussion, see A. Boin et al. (eds), The Politics of Crisis Management, especially p. 215 and p. 218. This responsibility is grounded in virtue theory. For discussion see Fricker, Epistemic Injustice.

46 S. Sauerland et al., ‘Premarket Evaluation of Medical Devices: A Cross-Sectional Analysis of Clinical Studies Submitted to a German Ethics Committee’, (2019) BMJ Open, 9(2), 6. Emphasis added. For a review of approaches to the collection of data, see D. B. Kramer et al., ‘Ensuring Medical Device Effectiveness and Safety: A Cross-National Comparison of Approaches to Regulation’, (2014) Food and Drug Law Journal, 69(1), 1–23. The EU’s new legislation on medical devices has sought to improve inter alia post-marketing data collection, such as through take-up of the Unique Device Identification. This is used to mark and identify medical devices within the supply chain. For discussion of this and other aspects of the EU’s new legislation, see A. G. Fraser et al., ‘The Need for Transparency of Clinical Evidence for Medical Devices in Europe’, (2018) Lancet, 392(10146), 521–530.

47 On licensing, see Heneghan et al., ‘Transvaginal Mesh Failure’. Also see B. Campbell et al., ‘How Can We Get High Quality Routine Data to Monitor the Safety of Devices and Procedures?’, (2013) BMJ, 346(7907), 21–22. On access to data, see M. Eikermann et al., ‘Signatories of Our Open Letter to the European Union. Europe Needs a Central, Transparent, and Evidence Based Regulation Process for Devices’, (2013) BMJ, 346, f2771; Fraser et al., ‘The Need for Transparency’.

48 G. Laurie, ‘Liminality and the Limits of Law in Health Research Regulation: What Are We Missing in the Spaces In-Between?’ (2016) Medical Law Review, 25(1), 47–72, 71. Emphasis added.
