
Epistemic Exhaustion and the Retention of Power

Published online by Cambridge University Press: 21 March 2024

Mark Satta*
Affiliation:
Department of Philosophy, Wayne State University, Detroit, MI, USA

Abstract

Epistemic exhaustion is cognitive fatigue generated by efforts to determine, retain, or communicate what one believes under conditions that make doing so especially taxing. I argue that the creation and maintenance of epistemic exhaustion is a tool that the socially and politically powerful can and do use in order to retain power. I consider a variety of conversational tactics and three circumstances—partisan polarization, epistemic chaos, and epistemic oppression—that can leave people prone to epistemic exhaustion. I survey several common responses to epistemic exhaustion and offer some suggestions for how we ought to respond to epistemically exhausting circumstances.

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © The Author(s), 2024. Published by Cambridge University Press on behalf of Hypatia, a Nonprofit Corporation

The nature of epistemic exhaustion

Epistemic exhaustion is cognitive fatigue generated by efforts to determine, retain, or communicate what one believes under conditions that make doing so especially taxing. In this article, I examine three types of environments where people are prone to develop epistemic exhaustion: (1) sociopolitically polarized environments, (2) epistemically chaotic environments, and (3) epistemically oppressive environments. The epistemic exhaustion created in these environments often helps powerful people and groups retain power by making certain types of progress difficult. This article examines epistemic exhaustion from multiple perspectives.

First, I discuss what it is like to experience epistemic exhaustion. Second, I identify some epistemically vicious conversational behaviors that can leave one’s interlocutors susceptible to epistemic exhaustion. These behaviors include Gish galloping, sealioning, bullshitting, applying double standards, and gaslighting, among others.

Third, I examine epistemic exhaustion generated by sociopolitical polarization and epistemic chaos. In doing so, I explain why polarization and epistemic chaos often accompany each other and how the creation of epistemic exhaustion under such conditions can be used as a tool by those seeking to gain or retain power.

Fourth, I examine epistemic exhaustion as an effect of epistemic oppression. I argue that just as the politically powerful can benefit from widespread epistemic exhaustion in the general population, so too the socially dominant can benefit from widespread epistemic exhaustion among members of marginalized groups.

Fifth, I discuss some common responses to epistemic exhaustion. Sixth, I offer some suggestions for dealing with epistemic exhaustion and epistemically exhausting environments. Finally, I close on a positive note by identifying circumstances in which epistemic exhaustion is part of a process of epistemic improvement. I define epistemic exhaustion as follows:

Epistemic exhaustion: cognitive fatigue generated by efforts to determine, retain, or communicate what one believes under conditions that make doing so especially taxing.

Epistemic exhaustion is epistemic in two senses. First, it is epistemic activities (broadly construed to include many doxastic activities) that generate the exhaustion. Second, the exhaustion often primarily (although not exclusively) impacts epistemic elements of one's life, including one's effectiveness as an epistemic agent.

The general definition of epistemic exhaustion identifies three kinds of tasks that may lead to it. The first task is determining for oneself what one believes. This task includes activities like weighing evidence, seeking out more information, and engaging in conversation with the aim of trying to determine for oneself what one thinks. I call epistemic exhaustion that arises from engaging in these sorts of activities belief-determination exhaustion.

The second task—retaining one's belief—concerns attempts to stick by what one believes in the face of pressure to change. One engages in this task when making second-order epistemic assessments such as whether to suspend belief on a previously settled matter or whether to reopen inquiry on a matter previously treated as closed. One also engages in this task by resisting the impulse to call into question beliefs that one thinks one has excellent reason to consider settled. I call epistemic exhaustion that arises from such activities belief-retention exhaustion.

The third task concerns communicating what one already believes to others. One engages in this task by attempting to communicate one's beliefs or one's reasons for belief with the aim of influencing others. The desired influence may be that others come to share the communicated beliefs. But it could also be something simpler such as desiring that others come to understand one's beliefs better or come to accept one's beliefs as reasonable. I call epistemic exhaustion that arises by engaging in these sorts of activities belief-communication exhaustion.

Here are two things worth noting about epistemic exhaustion. First, belief-determination, belief-retention, and belief-communication exhaustion are not mutually exclusive. For example, you may struggle to retain your belief in P in part because you are unable to convince others that P is reasonable, and you may find this combination fatiguing. That said, it need not be the case that belief-determination, belief-retention, and belief-communication exhaustion are all present in order for you to count as epistemically exhausted. For example, you may be very confident in your beliefs but exhausted by continual failures to get others to understand or accept your beliefs.

Second, epistemic exhaustion and the circumstances that give rise to it are not inherently good or bad. If one has despicable beliefs, having difficulty communicating or retaining those beliefs is a good thing. If one has laudable beliefs, having difficulty communicating or retaining those beliefs is generally a bad thing. Thus, normative appraisals of conditions that give rise to epistemic exhaustion should account for the nature of the beliefs in question. Such normative questions will be addressed later. In the rest of this section, I seek to describe how epistemic exhaustion feels.

Different people will experience epistemic exhaustion differently. The effects of epistemic exhaustion can be mental, emotional, or physical. But in all cases these effects will be in response to engaging in or feeling pressure to engage in certain epistemic activities. I use the term epistemic activities broadly to include, among other things, activities we engage in where at least one of our aims is belief formation, belief retention, or belief communication. The following are ways epistemic exhaustion can manifest:

  • Finding it increasingly difficult to concentrate on certain epistemic activities,

  • Finding that epistemic activities that used to feel attainable, enjoyable, or worthwhile now feel unattainable, unenjoyable, or not worthwhile,

  • Feeling overwhelmed by the prospect of undertaking epistemic activities that you once found manageable,

  • Feeling detached from or apathetic about epistemic matters you once cared deeply about,

  • Feeling significantly more stressed, pessimistic, or hopeless about your epistemic condition, the epistemic condition of others, or the prospect of successfully engaging in certain epistemic activities when compared to your previous attitudes on such matters,

  • Experiencing anxiety or depression in response to epistemic conditions,

  • Becoming less effective at certain epistemic activities because you rapidly fatigue, easily become overwhelmed, or are pessimistic about your prospects of success,

  • Becoming more easily frustrated with or disappointed in your epistemic interlocutors,

  • Becoming less tolerant of those who hold opposing views,

  • Finding it difficult to make decisions about which epistemic activities to engage in,

  • Feeling guilty about not engaging in more epistemic activities or for not being more successful in the epistemic activities you do engage in,

  • Experiencing burnout in connection with certain epistemic activities.

This list does not provide criteria for a diagnosis. Epistemic exhaustion itself is not some kind of medical condition. Rather, this list describes ways the phenomenon of epistemic exhaustion can feel. The unifying factor across variations is that epistemic exhaustion is marked by a decreased interest in or energy to manage one's epistemic life well.

Indications of epistemic exhaustion can show up in mundane ways. You dread receiving a phone call from a relative with opposing political views because you don't have the energy to have another fruitless and unsatisfying discussion about politics. You scroll past what is clearly fake news that someone has shared on Facebook. In a previous era, you may have done some independent research in order to help inform your Facebook friend of the facts. But now, after a history of failed attempts at successfully communicating, you conclude that the odds of success are too low to justify the investment needed to craft a response that might plausibly change the original poster's views. In the past, you've been willing to listen to others’ counterarguments about your beliefs in a particular domain, but you now find that you don't have the energy or patience to do so anymore. To feel fatigued by the prospect of engaging in such epistemic tasks that you once found doable or enjoyable is to be epistemically exhausted.

The fatigue we feel in epistemically exhausting situations is often not generated by epistemic exhaustion alone. Rather, our responses to situations like those described above can result from interlocking conditions that give rise to various types of fatigue. For example, we may experience social exhaustion because we live in a polarized society. Partisan polarization can create painful or frustrating divides among family members, colleagues, or friends. It can turn those we love and thought we understood into unrecognizable figures. It can leave us feeling disenchanted with our communities or nation. To experience such things, especially over a long period of time, is tiring. Thus, we may feel fatigued by the idea of taking a call from a partisan relative not only because of the epistemic effort the conversation will require but also because of a social fatigue caused by the burden of living in a polarized society. Alternatively, we may find ourselves less inclined to listen to an interlocutor's counterarguments because we believe that the ideas they are promoting are morally repugnant and should not be treated as open to debate. We may see the continued promotion of such ideas as an injustice. And we may feel worn down by such reminders of continued injustice. This article examines epistemic exhaustion as a distinct phenomenon, while acknowledging that it is often entangled with other forms of fatigue, such as those just described.

Conversational behavior and epistemic exhaustion

Our likelihood of becoming epistemically exhausted by an epistemic activity increases as the cognitive or emotional cost of undertaking that epistemic activity increases. Paradigmatic instances of epistemic exhaustion result from an epistemic activity being made much more difficult or unpleasant than it needs to be or ordinarily is. This helps us distinguish cognitive fatigue generated by epistemic activities that are difficult for almost everybody under most circumstances because of their challenging subject matter from fatigue generated by epistemic activities that are made more difficult by the circumstances under which one aims to accomplish them. This article deals with the latter kind of circumstance. Thus, cognitive fatigue generated by studying for a challenging exam is not a paradigmatic case of epistemic exhaustion, while cognitive fatigue generated by continually testifying to audiences exhibiting willful ignorance is. This is because consistently testifying to audiences exhibiting willful ignorance makes testifying especially taxing compared to testifying under many other circumstances.

In this section, I argue that certain conversational tactics tend to push people toward epistemic exhaustion. All these tactics raise the cognitive cost of epistemic activity for those whom the tactics are employed against. Such increases in cognitive costs, often coupled with decreases in expected epistemic returns, leave people prone to epistemic exhaustion.

The first set of tactics consists of burden-shifting measures, which, if employed successfully, put an asymmetrical cognitive burden on one's interlocutors. Repeated capitulation to these burden-shifting measures can lead to epistemic exhaustion because even the simplest epistemic tasks come to require an exorbitant amount of work. Examples of burden-shifting tactics include Gish galloping, sealioning, and bullshitting. Let's examine each in turn.

A Gish gallop is a rhetorical technique where the user seeks to overwhelm their interlocutor by putting forward more claims or arguments than it is feasible to respond to. The Gish galloper's burden-shifting tool is volume. If the Gish galloper rattles off a dozen claims in close succession, each of which would be time-consuming to respond to, they create a substantial burden for the interlocutor who attempts to respond point by point. Rattling off multiple claims is easy. Responding point by point, especially if offering useful responses requires research, is difficult.[1]

A related tactic is sealioning, which consists of repeated, unreasonable requests for one's interlocutor to provide evidence for their claims (cf. Jhaver et al. 2018). By default, conversations assume a fair amount of shared background information and a certain degree of faith in the honesty and competence of one's interlocutors. A sealioner can increase the cognitive burden of the conversation for others by removing these assumptions and pedantically requiring extensive documentation or evidence for even the most banal claims. This makes conversational progress tedious and difficult. This difficulty can be increased if the sealioner adopts a veneer of politeness, which can put social pressure on others to comply with the sealioner's expectations.

The costs of responding to a Gish galloper or sealioner are increased if the Gish galloper or sealioner simultaneously engages in bullshitting (that is, communicating with indifference to the truth). It takes little effort to make unverified claims. It often takes much more effort to find the evidence needed to respond. This is captured in the “bullshit asymmetry problem” (also known as “Brandolini's law”), which states that “the amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it” (Williamson 2016).

Gish galloping, sealioning, and bullshitting all pressure one's interlocutors to take up an asymmetrical conversational workload, which increases the cognitive burden of holding the conversation. Another set of tactics capitalizes on a different form of asymmetry, namely, asymmetrical trust. For example, someone who vacillates between gullibility and incredulity, depending on the source, can create an increased burden for their conversational partners by subjecting them to ungrounded and disproportionate incredulity. Quassim Cassam has identified this pattern of behavior among some conspiracy theorists, who will buy outlandish conspiracy narratives with little evidence but who exhibit extreme suspicion of the evidence adduced by those working to debunk the conspiratorial claims (Cassam 2015). Such patterns of selective skepticism and selective gullibility can make epistemic progress nearly impossible. As a result, these patterns are often very taxing for interlocutors who seek to meet the inconsistent standards of the selective skeptic.

Similarly, someone may employ double standards in assessing things like reliability, objectivity, and honesty. Someone may reject certain sources of information wholesale due to those sources' apologized-for peccadillos, while overlooking another source's pervasive failures to get the facts correct when the latter source's ideological priorities are more to their liking.[2] This may make it challenging for an interlocutor to provide disconfirming evidence that will be taken up by the holder of the double standard. The interlocutor's cognitive burden may be further increased if they try to make sense of their conversational partner's double standards or try to offer reasons why their conversational partner should give up those double standards.

There are also a wide variety of tactics one can use to increase an interlocutor's emotional conversational burden. These tactics include hurling insults, such as telling interlocutors that they are stupid or immoral for their beliefs; seeking to manipulate an interlocutor's emotions by suggesting that their attempts to fact-check are signs of disloyalty or unkindness; and offering insulting or degrading psychologizing explanations that seek to explain away an interlocutor's motives rather than engaging with the substance of what they've said (cf. Flowerree 2023; Flowerree and Satta forthcoming). Epistemic activities can become especially taxing when someone is faced with an interlocutor who employs techniques that raise both the cognitive and emotional cost of the conversation.

As we've seen, many conversational tactics push one's interlocutors toward epistemic exhaustion by increasing the cognitive workload or emotional burden required for epistemically profitable communication. But these are not the only ways in which conversational tactics can contribute to epistemic exhaustion. Some of these tactics can also push an interlocutor toward epistemic exhaustion by undermining their sense of epistemic agency. For example, calling someone's views stupid or crazy may cause them to think their beliefs are in fact stupid or crazy. Alternatively, persistently sealioning someone may leave them feeling like they are incapable of generating enough evidence to support their own beliefs.

Gaslighting is a paradigmatic example of a tactic that can push someone toward epistemic exhaustion by undermining their sense of epistemic agency. As Kate Abramson describes it, gaslighting is “a form of emotional manipulation in which the gaslighter tries (consciously or not) to induce in someone the sense that her reactions, perceptions, memories and/or beliefs are not just mistaken, but utterly without grounds—paradigmatically, so unfounded as to qualify as crazy” (Abramson 2014, 2). When we are gaslit, we are vulnerable to epistemic exhaustion via at least two routes. First, we may become epistemically exhausted by expending cognitive and emotional effort to resist the gaslighter's claims. If we are routinely undermined when we engage in basic epistemic tasks like making an argument or recounting a memory, it will take more from us to continue engaging in these tasks. Second, we may become epistemically exhausted by giving in to the gaslighter's picture of our epistemic situation and by doubting our own epistemic competency.[3]

Of significance are Abramson's points that gaslighting “almost always involves multiple incidents that take place over long stretches of time” and that “it frequently involves multiple parties playing the role of gaslighter” (Abramson 2014, 2). This is true of behaviors that lead to epistemic exhaustion generally. We all encounter frustrating situations where our epistemic goals are thwarted. We tirelessly search for evidence but never discover the truth. We lay out our arguments as best we can but fail to convince anyone of our position. A closely held belief is attacked, and we find ourselves unable to suppress nagging doubts about the belief thereafter. But in isolation such incidents are unlikely to lead to epistemic exhaustion. Rather, it is repeated exposure to such challenges over long stretches of time by multiple people that can slowly wear us down to a state of epistemic exhaustion.

Understood this way, we can see that epistemic exhaustion is not a sign of weakness. We are finite beings with limited time, energy, and resources. If we are put in a position where the costs of successfully engaging in certain epistemic activities are high, we must either expend a lot of energy to engage in those activities successfully or choose to forgo them. We face the same sort of dilemma with an overly demanding job. And just as unsustainable working conditions can lead to burnout, so too unsustainable epistemic conditions can lead to epistemic exhaustion.

Partisan polarization and epistemic chaos

Some social and political conditions are more likely than others to lead us into epistemic exhaustion. In this section, I identify two types of structural, sociopolitical conditions that give rise to epistemic exhaustion: partisan polarization and epistemic chaos. I explain what these two conditions are, how they can give rise to epistemic exhaustion, and how they help the powerful retain power.

A society is politically polarized when it is divided into two political “teams,” with many team members disliking and distrusting those on the other team. A society is sociopolitically polarized when divisions along political lines strongly correlate with other aspects of identity such as race, religious affiliation, national origin, socioeconomic status, or education level. I will refer to sociopolitical polarization as partisan polarization. Partisan polarization can lead to what Regina Rini calls partisan epistemology, which occurs when one grants more credibility to testifiers who share one's partisan identity than is warranted (Rini 2017, E43). It can also lead to what David Roberts calls tribal epistemology, which occurs when the primary way of assessing information is in terms of how well it conforms to a team's narratives and values (Roberts 2017).

C. Thi Nguyen has identified two distinct sorts of epistemic structures—epistemic bubbles and echo chambers—both of which often arise alongside partisan polarization (Nguyen 2018; see also Nguyen 2020). An epistemic bubble is “an informational network from which relevant voices have been excluded by omission” (Nguyen 2018). An echo chamber is “a social structure from which other relevant voices have been actively discredited” (Nguyen 2018). Thus, as Nguyen puts it, “[i]n epistemic bubbles, other voices are not heard; in echo chambers, other voices are actively undermined” (Nguyen 2020, 141). Partisan polarization tends to increase the cognitive and emotional burden of epistemic activity for those who regularly take the views of others seriously. It does so in at least two ways: first, by fueling distrust; second, by destroying common ground. Let's consider each way in turn.

The “ingroup” versus “outgroup” team mentality of a partisan society causes people to negatively stereotype those whom they view as part of their outgroup (Mason 2018, 3; Iyengar et al. 2012). Of particular concern for our purposes is when such negative stereotyping leads partisans to conclude that people or institutions associated with their outgroup are not credible sources of information. As epistemologists have noted, we gain much of our knowledge via testimony. Thus, when opposing partisan teams develop radically different conceptions of who is a credible testifier and who is not, the cognitive and emotional burden of forming, retaining, and sharing beliefs can go up because there are fewer sources of information that are widely viewed as credible. This disagreement over who is credible is exacerbated when one or both partisan teams form echo chambers that actively discredit outsiders. This can make the process of forming or retaining belief more difficult for those outside of or caught in between partisan camps. And this can make the process of sharing beliefs across partisan lines difficult because one is more likely to have one's sources challenged by someone in the opposite partisan faction.

The ability of partisanship to fuel distrust is bound up with the tendency of partisan teams to form echo chambers.[4] But the way partisanship destroys common ground may rely just as much on the tendency of partisan teams to form epistemic bubbles. Successful communication often tacitly relies on shared background assumptions. This includes assumptions about matters of both fact and value. Partisanship also erodes the shared narratives, schemas, ideologies, and norms that underlie how we process information.

As common ground lessens, the cognitive burden of successfully communicating what we believe to others or of understanding what others are trying to communicate to us goes up. The increased cost of successful communication can lead to fewer instances of successful communication, which can in turn create emotional costs generated by failures to successfully communicate.

Partisan epistemology can be accompanied by a second kind of epistemically harmful social phenomenon: epistemic chaos. Epistemic chaos occurs when a society experiences a glut of conflicting information while lacking widely agreed upon epistemic authorities to resolve the conflicts.[5] As such, epistemic chaos has two key features: (1) a large volume of conflicting claims, and (2) an absence of widely acknowledged epistemic authorities to help sort out which of the conflicting claims are true and which are false (or which are justified/unjustified, reasonable/unreasonable, and so on).

Epistemic chaos can be created accidentally. For example, a society slowly and organically forms two partisan echo chambers, which over time erode common ground and agreement about whom to trust. Epistemic chaos can also be created purposefully. For example, individuals or groups successfully orchestrate propaganda and misinformation campaigns that function to drive people into two partisan echo chambers. I suspect most epistemic chaos is the result of both intentional acts and unorchestrated social development. But if one is in an epistemically chaotic social environment, the cognitive and emotional burden of many epistemic actions will go up, regardless of the cause. This is because in epistemically chaotic environments, the amount of information one must sort through is high and the number of widely agreed upon epistemic authorities is low.

So far, I've defined partisan polarization and epistemic chaos and provided reasons why they increase our chances of becoming epistemically exhausted. In closing this section, I offer three reasons why partisan polarization and epistemic chaos—in part through the epistemic exhaustion they cause—help the politically powerful retain their power.

First, both partisan polarization and epistemic chaos make it harder to build the kind of broad coalitions required for meaningful social change. In a democracy, increasing social justice often requires that a majority of the democracy's members agree and insist upon change. Such changes often disperse power, resulting in the most powerful losing power. When a society is polarized between two groups of comparable size or is riddled with epistemic chaos, it is hard to create the kind of coalition required for changes that will weaken the power of the most powerful. It is even harder to create this kind of coalition when people are epistemically exhausted, leaving them without the energy needed to build the shared foundation of beliefs and knowledge that forming and maintaining such coalitions requires.

Second, when a society is operating well, social, political, and legal changes will sometimes be the result of new information gained and disseminated by experts. For example, social and legal changes in connection with smoking in recent decades have been driven in large part by the consensus among experts and the general public that smoking is dangerous. But when a social change is detrimental to the interests of powerful people or groups, the powerful may use partisan polarization or epistemic chaos to prevent the general public's uptake of relevant information.[6] This can be accomplished by driving members of the public into a state of epistemic exhaustion where they lack the energy and drive needed to sort through the relevant information, to form firm convictions on the relevant matter, or to communicate what they have come to believe or understand with others.

Third, partisan polarization and epistemic chaos increase the cognitive effort required to make meaningful changes. They also decrease the chance that such efforts will be rewarded no matter how much effort is put in. As a result, it becomes more likely that such actions will not be attempted or will result in burnout if attempted. This can occur either because, despite their best efforts, those working for a change do not succeed in communicating with others and eventually burn out or because those who want change make a rational cost-benefit assessment and determine that change is not worth trying for due to low odds of success. Circumstances that create epistemic exhaustion typically preserve the status quo. When the status quo is a power-differentiated society, widespread epistemic exhaustion makes it more difficult to disrupt such patterns of differentiated power.

Epistemic oppression

Just as political polarization and epistemic chaos make it difficult to disrupt the political status quo, so too epistemic oppression makes it difficult to disrupt the social status quo. This is, in part, because just as political polarization and epistemic chaos create circumstances that leave those with less political power susceptible to epistemic exhaustion, so too epistemic oppression creates circumstances that leave those with less social power susceptible to epistemic exhaustion. My argument for this conclusion, in brief, is that epistemic oppression increases the epistemic effort expected or required of persons occupying socially marginalized positions while simultaneously decreasing the epistemic payoff for such effort. Epistemic oppression increases epistemic costs while decreasing epistemic benefits for those seeking to gain or share knowledge from a socially marginalized perspective. Operating under such conditions easily creates epistemic exhaustion.

In keeping with the definition provided by Kristie Dotson, by epistemic oppression I mean “persistent epistemic exclusion that hinders one's contribution to knowledge production,” where epistemic exclusion is “an unwarranted infringement on the epistemic agency of knowers” (Dotson 2014, 115; cf. Toole 2019, 608). Following Briana Toole, I use epistemic oppression as an umbrella term that includes a variety of phenomena including various forms of epistemic injustice, willful ignorance, and epistemic exploitation (Toole 2019, 22).

In this section, I examine several types of epistemic oppression. For each type, I argue that it creates circumstances that leave people occupying socially marginalized positions vulnerable to epistemic exhaustion by, on balance, increasing epistemic burdens while decreasing epistemic payoffs. The forms of epistemic oppression for which I claim this pattern holds include testimonial injustice, hermeneutical injustice, willful hermeneutical ignorance, contributory injustice, and epistemic exploitation, among others.

Testimonial injustice (in Miranda Fricker's paradigmatic form as identity-prejudicial credibility deficit testimonial injustice) occurs when prejudice on a receiver's part causes them to give a testifier less credibility than they otherwise would have given (Fricker 2007, 4 and 17). For example, if a woman's testimony is given less credibility because she is a woman, she is experiencing testimonial injustice. Because of the credibility deficit, it requires more effort for a testifier whose testimony is downgraded to effectively convey what they know to others. In addition, such testifiers also face an increased risk that their testimony will not be accepted, no matter how hard they try. As Fricker and others have shown, such credibility deficits fall disproportionately on those occupying marginalized social positions. As a result, those occupying marginalized positions are more likely to experience the epistemically taxing circumstance of repeatedly having others downgrade their testimony due to a credibility deficit.

Unlike in cases of testimonial injustice, in cases of hermeneutical injustice people face increased cognitive costs in forming beliefs that accurately describe their own experiences.[7] Fricker explains that hermeneutical injustice happens “when a gap in collective interpretive resources puts someone at an unfair disadvantage when it comes to making sense of their social experiences” (Fricker 2007, 1). As Gaile Pohlhaus Jr. notes, “when one is marginally positioned, the epistemic resources used by most knowers in one's society for knowing the world will be less suited to those situations in which marginally situated knowers find themselves on account of being marginal” (Pohlhaus 2012, 717). For example, in the first decades after the discovery of HIV, people living with HIV lacked much of the language and many of the other interpretive resources now available to make sense of their experiences, because those resources did not yet exist or were not accessible to them. In turn, a significant part of why these resources did not exist, or were so hard to obtain for so long, was stigma around HIV and the marginalization of various populations disproportionately affected by it. The marginalized position of people living with HIV stymied the public health response and, as a result, hindered the swift development of many important interpretive resources (cf. Davidson and Satta 2021a). This is but one example. Generalizing, those occupying privileged positions exercise disproportionate influence over what collective interpretive resources a society has, and those occupying more marginalized positions are more likely to face repeated hermeneutical injustice.

Hermeneutical injustice increases one's epistemic burden by increasing the cognitive and emotional effort required to make sense of one's own experiences. However, unlike testimonial injustice, in cases of hermeneutical injustice the subject of the injustice exercises more control over the payoffs they receive for investing in the development of interpretive resources. Because rectifying hermeneutical injustice requires developing interpretive resources to aid in the formation of one's own beliefs, it is more likely to cause belief-determination exhaustion than is testimonial injustice. Testimonial injustice, on the other hand, is more apt to cause belief-communication exhaustion because it hampers one's ability to convey testimony to others.

Hermeneutical injustice is related to another form of epistemic oppression that can contribute to belief-communication exhaustion: willful hermeneutical ignorance. Pohlhaus states that willful hermeneutical ignorance “describes instances where marginally situated knowers actively resist epistemic domination through interaction with other resistant knowers, while dominantly situated knowers nonetheless continue to misunderstand and misinterpret the world” (Pohlhaus 2012, 716). With hermeneutical injustice, at least as Fricker first described it, the injustice lies with the lack of resources. With willful hermeneutical ignorance, the injustice lies with the failure of dominantly situated persons to use the interpretive resources developed by the marginalized. The result of this is that such dominantly situated persons “continue to misunderstand and misinterpret the world” at the expense of the more marginally situated.

Dotson employs the concept of willful hermeneutical ignorance in her description of contributory injustice (Dotson 2012, 31). Dotson, like Pohlhaus, rejects the idea that there is a single set of interpretive resources. Dotson explains that contributory injustice “is caused by an epistemic agent's situated ignorance, in the form of willful hermeneutical ignorance, in maintaining and utilizing structurally prejudiced hermeneutical resources that result in epistemic harm to the epistemic agency of a knower” (Dotson 2012, 31).

Repeatedly encountering willful hermeneutical ignorance and contributory injustice in response to one's attempts to share one's beliefs will naturally lead to epistemic exhaustion. This is because a precondition for successfully sharing beliefs with others is that others accept and understand the interpretive resources used to express those beliefs. In cases of willful hermeneutical ignorance and contributory injustice, marginally situated knowers are put in a situation where communicating their beliefs will be difficult and taxing (because their audience is willfully refusing to accept the resources needed to understand their testimony) and where their chances of success are low (for the same reason).[8]

Perhaps the clearest link between epistemic exhaustion and epistemic oppression is with what Nora Berenstain calls epistemic exploitation. Berenstain states that epistemic exploitation “occurs when privileged persons compel marginalized persons to produce an education or explanation about the nature of the oppression they face” (Berenstain 2016, 570). Berenstain situates epistemic exploitation as a type of epistemic oppression “marked by unrecognized, uncompensated, emotionally taxing, coerced epistemic labor” (Berenstain 2016, 570).

Berenstain appeals to the words of Manissa McCleave Maharawal in showing how epistemically exploitative labor is often expected of those in socially marginalized positions. Maharawal writes:

Let me tell you what it feels like to stand in front of a white man and explain privilege to him. It hurts. It makes you tired. Sometimes it makes you want to cry. Sometimes it is exhilarating. Every single time it is hard. Every single time I get angry that I have to do this, that this is my job, that this shouldn't be my job. (Berenstain 2016, 575; quoting from Maharawal 2011)

Berenstain goes on to point out that marginally situated people are often put in a situation where they cannot “disengage from an epistemically exploitative situation without being subjected to harm as a result of their perceived affront” (Berenstain 2016, 576). Berenstain identifies this as a double bind. However, even after being coerced to educate the privileged about their oppression, those forced into epistemically exploitative labor are often met with skepticism, dismissal, or anger. Thus, like other forms of epistemic oppression, in cases of epistemic exploitation marginally situated persons are pressured into doing more cognitively and emotionally taxing epistemic work with little by way of epistemic or social reward. Doing this kind of work, in Maharawal's words, makes you tired.

Epistemic oppression ensures that the cognitive and emotional burdens of generating and sharing knowledge are unfairly distributed. As a result, on average, the more marginalized one's social position is, the greater the risk of becoming epistemically exhausted. This is both because marginally situated persons are asked to do more cognitive work than others and because they are asked to do cognitive work that is less likely to be worth the effort. Thus, the ways in which epistemic oppression naturally leaves marginally situated knowers vulnerable to epistemic exhaustion serve to reinforce and retain the status quo and its differentiated power structures.

Responses to epistemic exhaustion

Having covered what epistemic exhaustion is and a variety of circumstances that naturally lead to its development, I now outline four common responses to epistemic exhaustion: (1) reactive partisanship, (2) skepticism, (3) disengagement, and (4) pressing on. I do not think any of these responses is inherently good or bad. All of them can at times usefully preserve an individual's well-being, but all of them can at times create harmful consequences either for the person adopting the response or for others. In this section, I seek merely to describe these four responses. In the next section, I offer prescriptive thoughts about how we ought to respond to epistemic exhaustion and the circumstances that leave us prone to it.

In describing these responses to epistemic exhaustion, I have used quotes that I view as reflecting epistemic exhaustion. As I pointed out at the start of this paper, epistemic exhaustion is often one component of more complex forms of fatigue. Thus, in using the words of others, I do not mean to suggest that their claims pertain only to epistemic exhaustion. Yet I think those quoted help illuminate what epistemic exhaustion is like and how people tend to respond to it.

As Jason Baehr has noted, one response to the cacophony of conflicting messaging we experience during epistemic chaos is to double down on one's tribal epistemology (Baehr 2020). I call this reactive partisanship. Reactive partisans seek to reduce their epistemic difficulties by further increasing trust in their own partisan ingroup or by decreasing their exposure to messaging outside their ingroup (or both). It takes a lot less cognitive effort simply to accept the dominant narratives of one's ingroup.

A second response to epistemic exhaustion is increased skepticism. Baehr also recognizes this possible response, appealing to a New York Times article on “voters worn out by a fog of political news” (Tavernise and Gardiner 2019). One respondent captures the skeptical position well, stating that “There's so much information that's biased, that no one believes anything. There is so much out there and you don't know what to believe, so it's like there is nothing” (Tavernise and Gardiner 2019, quoted in Baehr 2020). Skepticism of this type can alleviate epistemic exhaustion if the skeptic ceases to try to determine or share beliefs, based on an assumption that it is impossible to discover the truth or to have justified beliefs. This kind of skeptic can implement simplifying principles, such as that the truth is undiscoverable because of bias on all sides. That said, such simplifying principles often come at a high epistemic cost.

A third response is disengagement. Disengagement can be accompanied by skepticism, but it need not. One can think that the answers are out there but still choose to simply stop looking for or discussing those answers because the costs of doing so are viewed as too high. Masha Gessen offers a description of how this line of reasoning can look in a discussion about polarization, conflicting messaging, and dishonesty in the United States under the Trump presidency:

The tension is draining. The need to pay constant attention to the lies is exhausting, and it is compounded by the feeling of helplessness in the face of the ridiculous and repeated lies. Most Americans in the age of Trump are not, like the subject of a totalitarian regime, subject to state terror. But even before the coronavirus, they were subjected to constant, sometimes debilitating anxiety. One way out of that anxiety is to relieve the mind of stress by accepting Trumpian reality. Another—and this too is an option often exercised by people living under totalitarianism—is to stop paying attention, disengage, and retreat to one's private sphere. Both approaches are victories for Trump in an attack on politics. (Gessen 2020, 111)

This retreat into one's private sphere is one key kind of disengagement I have in mind. To disengage in this way is to stop being a participant in certain epistemic domains.

Significantly, Gessen situates this form of disengagement as a direct response to exhaustion (e.g., the tension is “draining,” the lies “exhausting,” and the anxiety “debilitating”). And, like Baehr, Gessen recognizes that there is more than one response someone can have to such exhausting conditions. The option of “accepting Trumpian reality” can perhaps be understood as a form of reactive partisanship.

Disengagement can take many other forms, including more selective forms. Reni Eddo-Lodge announced a more limited form of disengagement in her blog post (and later book of the same name) “Why I'm no longer talking to white people about race”:

I'm no longer engaging with white people on the topic of race. Not all white people, just the vast majority who refuse to accept the legitimacy of structural racism and its symptoms. I can no longer engage with the gulf of an emotional disconnect that white people display when a person of colour articulates their experience. You can see their eyes shut down and harden. It's like treacle is poured into their ears, blocking up their ear canals. It's like they can no longer hear us … I don't have a huge amount of power to change the way the world works, but I can set boundaries. I can halt the entitlement they feel towards me and I'll start that by stopping the conversation. The balance is too far swung in their favour. Their intent is often not to listen or learn, but to exert their power, to prove me wrong, to emotionally drain me, and to rebalance the status quo. (Eddo-Lodge 2017, ix–xii; quoting her 2014 blog post)

Eddo-Lodge expresses a circumscribed form of disengagement that is a direct response to the way in which some people—most white people—make the tasks of communicating about race especially taxing for Eddo-Lodge as a Black person.

Because it is white people specifically who often respond to Eddo-Lodge in conversations about race with the intent to exert power over her, prove her wrong, and emotionally drain her, among other things, Eddo-Lodge decided to engage in a strategic act of selective disengagement from white people on the topic of race. Importantly, this kind of selective disengagement can create space for engagement in other domains, with other people, or at other times. A Black writer who decides to stop talking about race with white people may have more energy to productively discuss race with people of color, for example.

In Eddo-Lodge's own case, her selective disengagement was temporary. Several years after publishing her initial blog post, she writes that “I now spend most of my time talking to white people about race. The publishing industry is very white, so there's no way I could have got this book published without talking to at least some white people about race” (Eddo-Lodge 2017, xv). Eddo-Lodge seems to have decided that reengagement was worth it, given some of her goals. But if circumstances or those goals changed, she could once again stop talking to white people about race because, as she notes, “I don't think giving up is a sign of weakness. Sometimes it's about self-preservation” (Eddo-Lodge 2017, xv).

A fourth response is pressing on. The person who presses on stays their course and continues to invest cognitive effort into their epistemic tasks, typically in a similar manner to how they have before. I view the following quote from Ijeoma Oluo as an expression of the pressing on attitude:

But I'm tired. I'm tired because this is the conversation I've been having since the 2016 election ended … And although I'm tired, because I have just had this conversation with multiple people for multiple hours the evening before, here I am having it again, hearing what I have always heard: the problem in American society is not race, it's class. (Oluo 2019, 8–9)

Despite being tired of having the same conversation again and again, Oluo chooses to have the same conversation again anyway. She continues to put labor into this epistemic activity even though she is tired.

Oluo's quote connects to other facets of epistemic exhaustion too. Here Oluo is put in the position of responding to an argument often given by the racially privileged (that “the problem” in American society is class, not race). Given that Oluo is a queer woman of color, it is likely that she often occupies a more socially marginalized position in such conversations. She may find herself subject to Berenstain's double bind and may be met with skepticism, dismissal, or anger if she testifies that race is a significant source of problems in the United States. Like disengagement, pressing on is a response that can be engaged in selectively. Someone can decide that pressing on is worth it in some cases but not in others.

While these four responses have been distinguished here in the abstract, in practice people may often engage in multiple responses to epistemic exhaustion. For example, author and columnist Michiko Kakutani offers the following multi-faceted description of how we often respond to a “firehose” of disinformation or lies.

[It] tends to overwhelm and numb people while simultaneously defining deviancy down and normalizing the unacceptable. Outrage gives way to outrage fatigue, which gives way to the sort of cynicism and weariness that empowers those disseminating lies. As the former world chess champion and Russian pro-democracy leader Garry Kasparov tweeted in December 2016, ‘The point of modern propaganda isn't only to misinform or push an agenda. It is to exhaust your critical thinking, to annihilate truth.’ (Kakutani 2018, 142–43)

Kakutani shows a keen awareness not only of the different ways epistemic exhaustion feels, but also the political role that epistemic exhaustion can play in creating social and political circumstances beneficial to the powerful.

How should we respond to epistemic exhaustion?

As I've tried to show, epistemic exhaustion is often generated by social and political features of our social systems and structures. Thus, eliminating undesirable epistemically exhausting circumstances will require making changes to our social systems and structures. But those are long-term collective changes. In the short term, it is useful to have personal strategies in place for preventing or dealing with epistemically exhausting circumstances in a politically polarized, epistemically chaotic, and epistemically oppressive world like ours. Some of those quoted in the previous section, like Eddo-Lodge, provide some examples of such strategies. Here I offer six additional suggestions for how one should respond to epistemically exhausting circumstances. This list is not meant to be complete, but it is meant to provide a starting place grounded in my own experiences, conversations with others, and engagement with relevant scholarship.

But first, a general comment: There is no single best response to epistemically exhausting circumstances. How one ought to respond to epistemically exhausting circumstances is context specific. For example, if one is dealing with a gaslighter, disengagement—if feasible—is advisable. But in other circumstances it may be worth pressing on and strategically engaging even an exhausting interlocutor. Thus, successful application of my suggestions requires discernment about which option is best in a given circumstance.

My suggestions, in brief, are: (1) conceptualize epistemic exhaustion as a tool of the powerful, (2) be selective in how you expend your energy, (3) be intentional about your epistemic goals, (4) identify and name relevant epistemic dynamics, (5) be considerate and cultivate awareness of the epistemic demands you put on others, and (6) account for your epistemic position. None of these suggestions is groundbreaking. They might all seem like common sense. Yet, as with many things that seem like common sense, I think they are worth stating and thinking through anyway. Let's consider each.

Conceptualize epistemic exhaustion

It is often easier to resist something once you realize that it ought to be resisted. I think this applies to epistemic exhaustion. Those whose social and political power comes from maintaining the status quo have an incentive to create widespread epistemic exhaustion among those who constitute a threat to their power. Recognizing this can help us resist. Thus, it can be beneficial to conceptualize the creation of epistemically exhausting circumstances as a tool that the powerful use to retain their power. This does not mean that everyone contributing to epistemically exhausting circumstances stands to gain or does so intentionally. Often the orchestrators are far up the chain. This gives us no less reason to resist, although it does give us reason to be aware that those directly contributing to our epistemic exhaustion are sometimes not themselves benefiting and may have been unwittingly enlisted by the powerful to do their dirty work.

Be selective

We only have a finite amount of time and energy. If we overburden ourselves with mentally and emotionally taxing epistemic labor, we will become epistemically exhausted. Such exhaustion limits the ability of our future selves to self-advocate, to learn, or to demand change. Thus, we have reason to be selective in how we choose to expend our epistemic energy. Pick your battles.

In choosing where to expend your epistemic energy, it may be useful to consider matters in terms of how much time and energy you expect the task will require of you and how likely and how great the expected benefit will be. Choose to engage in situations where the benefits are likely worth the energy. This may seem obvious, but I say it anyway because it's easy to fail to follow this advice in practice.

Here are three rules of thumb to help put this advice into practice. First, don't argue with strangers, those with track records of recalcitrance, or those committed to winning at all costs. These people are unlikely to be swayed by even the most intelligent and carefully worded arguments. Moreover, such people are more likely to use conversational tactics that make the conversation epistemically burdensome for you.

Second, opt to have conversations in person, on video chat, or on the phone when you can. These modes of conversation make it easier and more natural to humanize and empathize with those we're talking to.[9] This in turn increases the likelihood that the conversation will be productive, worthwhile, and satisfying. In addition, it helps us filter out which conversations are worth having and which aren't. In general, we're more likely to have worthwhile conversations with those whom we care about enough to meet in person or talk with via phone or video chat.

Third, ask yourself if the epistemic activity matters. More than once, I've found myself pulled into researching a question that simply didn't matter. Often this is because I've found a claim just so flabbergasting that I felt compelled to investigate. But such feelings of compulsion can be misleading. The typical result is wasted cognitive and emotional energy on learning about something that will never influence my behavior or the behavior of others going forward. Avoid such activity.

Be intentional

When seeking knowledge or engaging with an interlocutor, be intentional about how you're engaging and what you're hoping to accomplish. Without such intentionality, it is easy to fall into the social media trap of having conversations with the aim of winning (or at least of scoring points or having a good clapback). Set your own agenda. Don't let trolls set your agenda for you by pushing your buttons. Consider which forms of conversations are worth having and with whom. Sometimes debate is worth it. Other times, it is not worth it, but some other form of conversation would be.

Identify and name relevant epistemic dynamics

Epistemically exhausting conversational tactics often work by tacitly changing the frame or tone of a conversation. Sometimes the best way to disrupt the maneuvers of epistemically exhausting interlocutors is to offer second-order responses that name the relevant epistemic dynamics. If someone is sealioning you, point out that they've repeatedly asked you to defend your claims while not defending their own. Make clear that you don't have a one-sided responsibility to provide evidence. If you perceive that someone in the conversation is worn out, provide a way for people to exit the conversation (for example, “I know we've been discussing this heavy topic for quite a while now. We can pick up the conversation another time if you'd like.”). If someone asks for your opinion and is then dismissive of what you have to say, remind them that you shared your opinion because that is what they asked for, and point out that it is not in your interest to continue answering their questions if they're going to be dismissive.

There are limits to when this kind of response will work. It will not work when your interlocutors lack the testimonial competence to understand the epistemic dynamics or statements made about them. Willful hermeneutical ignorance and contributory injustice can be reified by testimony about epistemic dynamics. And there can be times when the risks of commenting on the epistemic dynamics outweigh the potential benefits, such as in cases of the kind of double bind Berenstain identifies. But if the risks are low, second-order comments about the epistemic dynamics your interlocutor has created may benefit their future conversational partners by increasing understanding of those dynamics.

Be considerate of others

This article has emphasized the perspective of the epistemically exhausted. But preventing a culture of epistemic exhaustion requires more than working to avoid our own epistemic exhaustion. It also requires that we avoid creating epistemically exhausting circumstances for others. Reflect on your epistemic practices and work to cultivate virtuous conversational behaviors. Avoid the unfair conversational tactics discussed earlier. Develop fair expectations about the epistemic labor you ask of others. Recognize that everyone has a limited amount of time and energy. Think about your own social position and how your privilege may make it difficult for you to identify ways in which you are creating epistemic exhaustion for others. Cultivate the habit of listening to what others tell you about how your behaviors might be creating epistemic exhaustion for them. Be prepared to change in response to what you are told. Unless you occupy a particularly powerful or privileged position, your agency is diminished in a world where you are not epistemically exhausted but everyone around you is.

Account for your epistemic positions

None of the common responses to epistemic exhaustion I considered (reactive partisanship, skepticism, disengagement, and pressing on) is always appropriate or always inappropriate; it depends on context. If you are in a partisan society where one group consistently has better epistemic practices and better access to truth than the other, reactive partisanship is likely harmful if you're in the epistemically inferior group, but it may not be if you're in the epistemically superior group. Disengagement from a bullshitter is helpful, but disengagement from experts is not. Pressing on will pay off with some interlocutors but not with others. In order to respond wisely to epistemically exhausting circumstances, you need to assess what kind of epistemic position you're in.

Here are some suggestions for assessing your epistemic position. First, notice where you are socially situated and take into account the epistemic advantages and disadvantages that come from your social position. Second, acknowledge where you have obtained expertise and its accompanying epistemic authority, while also acknowledging your epistemic limits and situations where you ought to suspend judgment or to defer to the judgment of others.

Third, if you find yourself part of a sociopolitical ingroup that exerts significant control over what information you receive and who you trust, reflect on the reasons why you are part of that particular group. Some reasons provide a better epistemic foundation than others. Many people end up in partisan groups accidentally. They are conservative or progressive because their friends and family are part of that group. These kinds of accidental grounds for partisan identity do not confer strong reasons for thinking that one's ingroup has a better epistemic foundation than one's outgroup. However, other people select partisan groups more intentionally as the result of epistemic reasoning. Someone who spent time neutrally assessing the credibility and trustworthiness of the members of more than one partisan team before claiming one as their ingroup likely has better grounds for trusting their ingroup over their outgroup than someone who has not.

It is unrealistic to expect anyone to constantly redo the work of assessing different partisan teams from scratch, but doing so seems like a reasonable investment to make at least once, especially for someone who remains on a partisan team simply because it is the team they have always been on. Nguyen calls this sort of square-one reassessment of the trustworthiness and credibility of different sources a social epistemic reboot (Nguyen 2020, 157). Part of assessing your epistemic position involves addressing how you've structured your epistemic life and determining whether an epistemic reboot is called for.

I have focused here on individual-centered responses to epistemically exhausting circumstances. I think these suggestions are useful, but they are limited. As I argued earlier, epistemically exhausting circumstances are often generated by particular political and social structures. Thus, a holistic response to epistemically exhausting circumstances will have both individual and structural components.

When is epistemic exhaustion worth it?

Given the unpleasantness of epistemic exhaustion and the way in which it can be leveraged by the powerful to retain the status quo, epistemic exhaustion is normally undesirable. However, I want to close on a positive note by identifying at least two circumstances where the sting of epistemic exhaustion can be removed, at least in part, because the exhaustion is a sign of epistemic progress.

The first such circumstance is undertaking a social epistemic reboot. An epistemic reboot is hard work. It can be tedious to reassess the trustworthiness and credibility of all of one's sources and disconcerting to operate without psychological certainty. Still, as Nguyen argues, this may be the best, or perhaps the only, way to break free of an echo chamber or an epistemically problematic partisan team. Just as an arduous hike can be worth it for the view from the top of the mountain, so too the arduous task of undergoing an epistemic reboot can be worth it for the clarity one gains in the process.

A second circumstance where the cost of epistemic exhaustion is worth the epistemic gains occurs when the exhaustion is generated by confronting one's own privilege and the ignorance that privilege has created and preserved. Coming to grips with how one has misunderstood and misinterpreted the world because of privilege can be unmooring. It is also a never-ending process: recognizing and responding to ignorance creates conditions that allow yet more ignorance to be revealed. Working through this, especially if one is doing the work oneself rather than epistemically exploiting others, can be exhausting. But this confrontation of ignorance is worth the exhaustion.

In this article, I've argued that epistemic exhaustion tends to help prop up the powerful, but these concluding examples show that epistemic exhaustion can be severed from that role. By conceptualizing the creation of epistemic exhaustion as a tool the powerful use to retain their power, perhaps we can accomplish that severing more often.

Acknowledgments

Thanks to Marilie Coetsee, Lacey J. Davidson, Amy Flowerree, Dan Kelly, Lauren Kuykendall, and Josh Wilburn for helpful conversations about ideas discussed in this article, to three anonymous referees from this journal for useful suggestions, and to those in attendance at a Social (Distance) Epistemology virtual event, hosted by the Social Epistemology Network, for beneficial engagement with many of this article's ideas.

Mark Satta is an assistant professor of philosophy at Wayne State University in Detroit, Michigan. His research interests include epistemology (especially social, feminist, legal, and applied epistemology), philosophy of language, social and political philosophy, and philosophy of law.

Footnotes

1 The term “Gish gallop” was coined by the physical anthropologist and science advocate Eugenie Scott, who named the technique after Duane Gish, who was known to use it in debates against proponents of evolution (Scott 2004). This tactic is also referred to as firehosing or the firehose of falsehoods (see, e.g., Kakutani 2018, 142).

2 For a discussion of an example of this in contemporary American politics, see Snyder 2018, 269–70.

3 My focus here is on the gaslighting of individuals, not cultural or structural gaslighting. That said, it seems that cultural and structural gaslighting can also lead to epistemic exhaustion. See Ruíz 2020 and Berenstain 2020.

4 While widespread distrust can increase the chances that the distrustful or distrusted (or both) will become epistemically exhausted, this does not mean that such distrust is never justified. Under many circumstances, distrust, including general distrust, can be both justified and beneficial. See, e.g., Krishnamurthy 2015 and Davidson and Satta 2021b.

5 My use of the term epistemic chaos has some resemblance to, but differs in significant ways from, the use in Brady 2015.

6 As documented by Naomi Oreskes and Erik Conway (2010), among others, powerful people and corporations who stood to lose from broad social acceptance of the dangers of smoking did indeed seek to prevent broad social uptake of these facts. The artificial manufacturing of doubt on the part of tobacco companies and their allies can be seen as an attempt to create a limited form of epistemic chaos around information about smoking. Oreskes and Conway outline how similar tactics have been employed by the powerful concerning issues such as acid rain, secondhand smoke, and climate change.

7 This represents but one way of conceptualizing hermeneutical injustice. See, e.g., Mason 2011, Medina 2017, and Goetze 2018 for critiques and alternative conceptions.

8 This issue generalizes to all cases of what Dotson calls testimonial incompetence (Dotson 2011). Dotson discusses how testimonial incompetence can lead to testimonial smothering, i.e., the truncating of one's own testimony in order to ensure that the testimony contains only content for which one's audience demonstrates testimonial competence (Dotson 2011, 244). On this reading, one can interpret some instances of testimonial smothering as attempts to avoid epistemic exhaustion. One can also interpret some instances of testimonial smothering as a response, in part, to epistemic exhaustion that one is already experiencing.

9 For an account of the value of empathetic understanding in deliberation, see Hannon 2020.

References

Abramson, Kate. 2014. Turning up the lights on gaslighting. Philosophical Perspectives 28: 1–30.
Baehr, Jason. 2020. Skepticism, tribalism, and humble persistence. Cardiff University Blogs: Open for Debate. https://blogs.cardiff.ac.uk/openfordebate/2020/04/06/skepticism-tribalism-and-humble-persistence/
Berenstain, Nora. 2016. Epistemic exploitation. Ergo 3 (22): 569–90.
Berenstain, Nora. 2020. White feminist gaslighting. Hypatia 35: 733–58.
Brady, Norman. 2015. “Epistemic chaos”: The recontextualisation of undergraduate curriculum design and pedagogic practice in a new university business school. British Journal of Sociology of Education 36 (8): 1236–57.
Davidson, Lacey J., and Satta, Mark. 2021a. Epistemology and HIV transmission: Privilege and marginalization in the dissemination of knowledge. In Making the case: Feminist and critical race philosophers engage case studies, ed. Grasswick, Heidi and McHugh, Nancy Arden. Albany: SUNY Press.
Davidson, Lacey J., and Satta, Mark. 2021b. Justified social distrust. In Social trust: Foundational and philosophical issues, ed. Vallier, Kevin and Weber, Michael. New York: Routledge.
Dotson, Kristie. 2011. Tracking epistemic violence, tracking practices of silencing. Hypatia 26 (2): 236–57.
Dotson, Kristie. 2012. A cautionary tale: On limiting epistemic oppression. Frontiers 31 (1): 24–47.
Dotson, Kristie. 2014. Conceptualizing epistemic oppression. Social Epistemology 28 (2): 115–38.
Eddo-Lodge, Reni. 2017. Why I'm no longer talking to white people about race. London: Bloomsbury.
Flowerree, A. K. 2023. When to psychologize? Australasian Journal of Philosophy. https://doi.org/10.1080/00048402.2022.2157032
Flowerree, A. K., and Satta, Mark. Forthcoming. Moral grandstanding and the norms of moral discourse. Journal of the American Philosophical Association.
Fricker, Miranda. 2007. Epistemic injustice: Power and the ethics of knowing. New York: Oxford University Press.
Gessen, Masha. 2020. Surviving autocracy. New York: Riverhead Books.
Goetze, Trystan. 2018. Hermeneutical dissent and the species of hermeneutical injustice. Hypatia 33 (1): 73–90.
Hannon, Michael. 2020. Empathetic understanding and deliberative democracy. Philosophy and Phenomenological Research 101 (3): 591–611.
Iyengar, Shanto, Sood, Gaurav, and Lelkes, Yphtach. 2012. Affect, not ideology: A social identity perspective on polarization. Public Opinion Quarterly 76 (3): 405–31.
Jhaver, Shagun, Ghoshal, Sucheta, Bruckman, Amy, and Gilbert, Eric. 2018. Online harassment and content moderation: The case of blocklists. ACM Transactions on Computer-Human Interaction 25 (2). doi:10.1145/3185593
Kakutani, Michiko. 2018. The death of truth. New York: Crown Publishing Group.
Krishnamurthy, Meena. 2015. (White) tyranny and the democratic value of distrust. The Monist 98 (4): 391–406.
Maharawal, Manissa McCleave. 2011. So real it hurts. The Occupied Wall Street Journal. http://leftturn.org/so-real-it-hurts-notes-occupy-wall-street/
Mason, Rebecca. 2011. Two kinds of unknowing. Hypatia 26 (2): 294–307.
Mason, Lilliana. 2018. Uncivil agreement: How politics became our identity. Chicago: University of Chicago Press.
Medina, José. 2017. Varieties of hermeneutical injustice. In The Routledge handbook of epistemic injustice, ed. Kidd, Ian James, Medina, José, and Pohlhaus, Gaile Jr. Abingdon: Routledge.
Nguyen, C. Thi. 2020. Echo chambers and epistemic bubbles. Episteme 17 (2): 141–61.
Oluo, Ijeoma. 2019. So you want to talk about race. New York: Seal Press.
Oreskes, Naomi, and Conway, Erik. 2010. Merchants of doubt. New York: Bloomsbury Press.
Pohlhaus, Gaile Jr. 2012. Relational knowing and epistemic injustice: Toward a theory of willful hermeneutical ignorance. Hypatia 27 (4): 715–35.
Rini, Regina. 2017. Fake news and partisan epistemology. Kennedy Institute of Ethics Journal 27 (2): E43–E64.
Roberts, David. 2017. Donald Trump and the rise of tribal epistemology. Vox. https://www.vox.com/policy-and-politics/2017/3/22/14762030/donald-trump-tribal-epistemology
Ruíz, Elena Flores. 2020. Cultural gaslighting. Hypatia 35 (4): 687–713.
Scott, Eugenie. 2004. Confronting creationism: When and how. Reports of National Center for Science Education 24 (6): 23.
Snyder, Timothy. 2018. The road to unfreedom. New York: Crown Publishing Group.
Tavernise, Sabrina, and Gardiner, Aidan. 2019. “No one believes anything”: Voters worn out by a fog of political news. New York Times, November 18. https://www.nytimes.com/2019/11/18/us/polls-media-fake-news.html
Toole, Briana. 2019. From standpoint epistemology to epistemic oppression. Hypatia 34 (4): 598–618.
Williamson, Phil. 2016. Take the time and effort to correct misinformation. Nature 540: 171. doi:10.1038/540171a