
Embracing the Crisis of Research Design: How the Collapse of Case Selection in the Field Can Uncover New Discoveries

Published online by Cambridge University Press: 09 April 2024


Abstract

Political science has seen a welcome increase in guidance on conducting field research, which recognizes the need for adaptability. But while disciplinary conversations on “iterating” in the field have advanced, strategies for adapting to the breakdown of one’s case selection—an all-too-frequent problem faced by field researchers—remain underspecified. I synthesize the sources of case selection collapse and put forward four strategies to help scholars iterate when things fall apart: 1) rethinking what constitutes a “case” when fieldwork upends one’s understanding of the population to which the original case(s) belong; 2) reorienting the object of analysis from outcomes to processes when new insights question the values of the outcome variable within one’s original case(s); 3) returning to dominant theoretical models as a source of comparison when unanticipated changes cut off data or field site access; and 4) dropping case(s) that become extraneous amid fieldwork-induced changes in the project’s comparative logic. By embracing these moments of seeming crisis, we can more productively train field researchers to make the most of the inductive discoveries and new theoretical insights that often emerge when one’s original plans fall apart.

Type
Reflection
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of American Political Science Association

In recent years, political science has seen a welcome increase in guidance on conducting field research. Several edited volumes and symposia serve as touchstones for those embarking on fieldwork (Kapiszewski, MacLean, and Read 2015; Krause and Szekely 2020; Ortbals and Rincker 2009; Hsueh, Jensenius, and Newsome 2014; Lieberman 2004), while other articles and research initiatives have illuminated more focused topics, like the ethical, psychological, and physical dimensions of field research in conflict-affected areas (Cronin-Furman and Lake 2018; Loyle and Simoni 2017) and the use of digital and remote fieldwork methods (Konken and Howlett 2022). Importantly, these conversations have extended into the realm of graduate training, where a dedicated module at the Institute for Qualitative and Multi-Method Research (IQMR) and qualitative methods and research design courses train a new generation of scholars (Emmons and Moravcsik 2020).

Within efforts to elaborate the ins and outs of fieldwork, seasoned veterans recognize the need for adaptability given the messiness and uncertainty of the field research enterprise. Contributions to the literature on fieldwork frequently discuss the importance of “flexibility” and the prospect of having to “retool” or “iterate” in the field (see, for example, Kapiszewski, MacLean, and Read 2018, 2022; Posner 2020; LaPorte 2014). Adjusting and innovating on the ground is a product of both necessity and the logic of social science inquiry itself. The need to adapt and iterate may arise in response to any number of complications that researchers face in the pre-, mid-, and post-fieldwork stages: the inability to access the field after initial planning (due to personal, practical, or political issues), the recognition that one’s project has already been done or is not worth doing (Schrank 2006, 222), the realization upon returning from fieldwork that one’s data are incomplete or biased, or concerns that political and ethical conditions have changed such that data cannot be used in the way originally envisioned (Knott 2019). Inductive discoveries that come through immersion can also invite new questions that may be more contextually relevant and meaningful, triggering substantial shifts in a field-based study.

This article, however, focuses on one acute yet common challenge: the breakdown of the case selection mechanism during fieldwork. The collapse of one’s predetermined case selection strategy amid field research is a core facet of what Jody LaPorte (2014, 163) calls a “crisis of research design”—“when fieldwork questions the appropriateness of the research question, dependent variable, or case selection mechanism.” Certain aspects of a project may make it more prone to an eventual case selection crisis. For example, with lesser-known topics or in “understudied” settings, we may have incomplete information about key features of our pre-selected cases. Upon arriving at a fuller picture once immersed in the field, we may learn that the original rationale by which we selected cases no longer holds. But even when the initial assumptions that guided the selection of cases remain intact, the complexity and capriciousness of contemporary politics also threatens to upend case selection midstream (Kapiszewski, MacLean, and Read 2022, 11). A sudden eruption of violence or conflict may make it unsafe to conduct research in pre-selected field sites, political turnover may foreclose access to critical interviewees and archival documentation, or a worldwide pandemic may restrict travel entirely.

The collapse of one’s original case selection rationale in the field looms large and can result from a variety of developments, whether the kinds of logistical and informational challenges that often make fieldwork so daunting or the novel insights and inductive discoveries that often make fieldwork so rewarding. But despite the growing recognition of the need for adaptability, scholarly interventions on how to iterate provide little specific guidance on how to recover and retool amid the fieldwork-induced breakdown of one’s case selection (see footnote 1). What strategies can field researchers deploy to recover when the assumptions grounding their original case selection fall apart? What should scholars do when certain research settings become inaccessible, upending their case selection rationale? How can field researchers cope with partially implemented data collection plans to still generate meaningful theoretical and empirical insights? And how might a more explicit recognition and anticipation of these challenges change fieldwork training and preparation?

I build on ongoing conversations on field-based iteration and propose a new framework for rethinking case selection when one’s original plans fall apart. In response to the diverse challenges and discoveries that may upend one’s original case selection mechanism, I posit four strategies for remaking a research project, which range from more maximalist to minimalist fixes: 1) “re-casing” when fieldwork upends understandings of the population to which one’s original case(s) belong; 2) reorienting the object of analysis from outcomes to processes when new insights question the values of the outcome variable within one’s original case(s); 3) returning to dominant theories as ideal types for comparison and explanation when unanticipated changes cut off data or field site access; and 4) dropping case(s) that become extraneous amid fieldwork-induced changes in the project’s comparative logic. Rather than constraining a study’s scholarly import, these strategies can help researchers make the most of new discoveries and generate innovative theoretical and empirical contributions to the discipline. I also examine how taking these strategies seriously might push us to transform the ways we train scholars to design and prepare for fieldwork.

This article makes three key contributions to the literature on field research within political science. First, it contributes practical guidance for how scholars might adapt their research designs when case selection is upended by fieldwork, offering concrete examples from my own dissertation project (Schwartz 2023) as well as the work of other scholars in the field. It thus advances fieldwork guidance by helping scholars puzzle through the sources of case selection breakdown and identify appropriate fixes, which range from minor tweaks to more significant redesigns. By demonstrating how case selection can be rethought amid diverse challenges, this article supplies current and future field researchers with actionable strategies for overcoming this aspect of the crisis of research design.

Second, the paper advances existing conversations about “flexibility” and “iteration” on the ground to meet the current realities of field-based research, especially for graduate students and early-career scholars. In so doing, it makes explicit something rarely stated aloud due to orthodox methodological norms in our discipline: that given the strictures on time, funding, and travel, the need to “retool” one’s research strategy is more often a need to make the most of what one has. This article not only provides alternative logics to ground and frame retooled research designs, but also illustrates how iteration can uncover and amplify rich new discoveries that lead to more contextually embedded and theoretically innovative projects.

Finally, the paper articulates ways to remake guidance for fieldwork design and preparation, especially for graduate students embarking on dissertation research. By normalizing the fieldwork-induced collapse of a project’s case selection rationale and providing strategies to pivot and adapt (often with limited time and resources), we can not only facilitate greater reflexivity, but also train scholars to anticipate the disruption of their research designs and preemptively formulate ways forward. By treating fieldwork preparation as a process of casting a wide net to survey multiple versions of the research question, unit of analysis, and sources of data, graduate students and early-career researchers will be better positioned to make the most of their empirical insights if and when their original plans fall apart.

Case Selection for Field Research and the Challenges of Iteration

Broadly, field research refers to “acquiring information, using any set of appropriate data collection techniques, for qualitative, quantitative, or experimental analysis through embedded research” (Irgil et al. 2021, 1500). Inherent in this definition is a recognition of methodological pluralism. However, I focus here specifically on fieldwork undertaken to collect qualitative data and research grounded in positivist traditions, while recognizing that the insights and solutions put forward may, in some respects, serve scholars operating outside of these methodological and epistemological confines. This scope was adopted for several reasons. First, a recent survey by Kapiszewski, MacLean, and Read (2022, 28) found that those conducting qualitative analysis were more likely than those conducting quantitative analysis to engage in iteration when it came to their research question, concepts, and case selection. Addressing case selection collapse is therefore more salient for qualitative research.

Second, studies grounded in interpretive as opposed to positivist epistemologies are more prone to considering fieldwork-based iteration to be part and parcel of the research process itself. As Yanow and Schwartz-Shea (2012, 18) note in their essential guide to interpretive research design, scholars operating in this tradition allow cases to “emerge from the field” as part of recognizing their socially constructed and embedded nature. While this article adopts insights from interpretive research design, it is not directed at such studies because iteration is already a more fundamental part of interpretive epistemologies. By contrast, while methodological guidance from a positivist perspective may embrace iteration due to inductive discovery, such approaches also tend to be more concerned that mid-course changes may introduce bias or compromise the rigor and integrity of a study (Peters 2013, 62-63).

Before discussing the challenges of retooling, it is important to make explicit the principles and assumptions underlying case selection for field-based research projects before they commence. The most detailed “how-to” manuals and advice from experienced field researchers emphasize the need to develop deep contextual knowledge of potential field sites to engage in sound case selection. As Kapiszewski, MacLean, and Read’s (2015, 85) influential guide to conducting fieldwork notes,

building broad and deep knowledge of the context in which fieldwork will be conducted—coming to understand the relevant history, culture, and political situation of one’s field sites—is a necessary prerequisite for effective research design. Knowledge of the field helps scholars to identify a relevant and appropriate research question, to learn how to think about key concepts and relationships among them, and to consider what cases might be used to investigate the question.

This approach advocates for poring over previous scholarship and accessible primary sources on the research setting. When possible, it also encourages undertaking preliminary fieldwork to gauge whether it will be possible to collect data in prospective field sites and to determine the appropriate cases considering the research question(s). This perspective recognizes what Koivu and Hinze (2017, 1026) call “the human element of research”—the myriad logistical considerations like “language skills, familiarity with the region, and in-country networks”—that shape one’s options for fieldwork.

While acknowledging these practical constraints, however, few guidebooks advocate that researchers select field sites purely based on convenience or desire. Instead, the selection of cases for field research often adheres, implicitly or explicitly, to Mill’s dictum to “maximize experimental variance, minimize error variance, and control extraneous variance” (in Peters 2013, 31). Though there are plenty of reasons why scholars may engage in the single-case study (Gerring 2004), most utilize the so-called “method of difference,” whereby two or more cases that appear similar on relevant independent variables yet vary on the outcome of interest are chosen; the goal, in turn, is to identify the key explanatory factor(s) that accounts for divergent outcomes (Koivu and Hinze 2017, 1024). Alternatively, scholars may select field sites and thus cases to maximize variation on the explanatory variable of interest, a strategy more common in mixed-methods research (ibid.). But in either case, the assumption is that even though field researchers face practical and logistical constraints on case selection, there must be a purposive, ex ante case selection strategy to counteract potential bias. As Kalyvas (2020, 55) notes, “the absence of a watertight separation between theory and research design, on one hand, and data collection, on the other, is increasingly considered inappropriate at best, potentially dishonest at worst.” Even if researchers cannot be certain that their initial case selection strategy will survive the uncertainties of the field, they are encouraged to devise a “Plan B” that abides by similar logics (Kapiszewski, MacLean, and Read 2015, 89).

Yet conventional advice, even if it does anticipate the need for adaptation, may be unable to overcome the breakdown of a study’s original case selection mechanism in the field, which can emerge from numerous sources. First, despite preliminary fieldwork, it is sometimes only much deeper into the research that a researcher realizes they have miscoded the values of the independent or dependent variables represented by a particular case, upending the divergences and convergences driving their original case selection rationale. In fact, “researchers may not know what is representative of a population when selecting cases,” as Collier and co-authors note (Collier, Mahoney, and Seawright 2004, 88; see also Saylor 2020, 992; Kapiszewski, MacLean, and Read 2022, 10-11). In low-information settings, this problem may be even more salient and may not reveal itself until the data collection process is well underway.

At a more fundamental level, researchers may have misjudged what constitutes a unit fit for analysis in the first place. For example, prior to commencing fieldwork, a researcher may seek to explain differences in public service provision between two economically and demographically similar cities, only to learn that the neighborhood is the more salient object of study. Deeper fieldwork may also reveal changes in neighborhood-level service provision over the span of several decades, indicating that the same neighborhood compared across different time periods is an even more relevant unit of analysis. While researchers designing studies through an interpretivist lens are much more amenable to letting their cases “emerge from the field” in this way, the prior selection of these units (and the possible realization that they are not the most appropriate) is the more common strategy and scenario for positivist scholars (Yanow and Schwartz-Shea 2012, 18).

Beyond these conceptual and methodological issues, the breakdown of one’s case selection mechanism may also emerge in response to the practical and political realities of fieldwork. Even when the cases initially selected for field research do reflect the attributes that the researcher anticipated, conflict, instability, and other sudden events can cut off access to people, places, and information at a moment’s notice. Even places other scholars previously studied in depth or sites the researcher visited during preliminary fieldwork may abruptly become off limits. These difficulties can also present new ethical dilemmas that change the calculus of risks and benefits for one’s interlocutors in the field (Knott 2019).

While the voluminous literature on conducting field research within political science acknowledges these challenges, practical guidance on how to iterate in response to the breakdown of one’s case selection mechanism remains vague. Kapiszewski, MacLean, and Read’s (2022, 16-17) valuable contribution on dynamic research design, for example, proposes “[rethinking the] logic of case selection” as a possible response to “a case not working out” or the “dependent variable (DV) or outcome of interest [seeming] inapt.” But what does it mean to rethink the logic of case selection? What does it look like, and how might a researcher implement this solution? When might a minor tweak, like dropping an extraneous case, suffice, and when might a more substantial fix, like re-casing the project entirely, become necessary? I turn to answering these questions in the following section.

Confronting Case Selection Collapse: Problems and Strategies

To begin puzzling through concrete strategies for adapting amid the breakdown of one’s case selection mechanism, let’s start with a hypothetical scenario that may sound all too familiar to new and seasoned field researchers alike: having gone to the field with a pre-defined selection of cases (whether based on preliminary fieldwork or existing literature), the researcher has made progress in gaining access to and collecting data. In other words, fieldwork appears to be unfolding according to plan. Yet after several weeks or months, the tide suddenly turns, upending the original case selection mechanism. Perhaps the researcher realizes that they had misjudged the value of the dependent variable for one of their cases, meaning the variation around which they had designed the study no longer exists. Perhaps unanticipated discoveries related to the pre-selected unit of analysis—whether country, region, neighborhood, time period, or bureaucracy—call into question the appropriateness of how the study’s cases were originally conceived. Perhaps worsening political conditions have made interlocutors reluctant to participate in interviews, cut off access to critical state archives, or, worse, forced a premature exit from a particular field site before data collection was complete.

Recognizing that the time and resources to start anew are often limited, how can researchers cope with these mid-course disasters, which often feel like they might put an end to the project altogether? Here, I elaborate four concrete strategies, each of which corresponds to a particular problem that disrupts case selection (see table 1). These mishaps are by no means mutually exclusive, and the solutions to them range from more significant alterations to minor fixes. In the remainder of this section, I draw on my own dissertation research as well as the experiences of other scholars to discuss what these strategies look like and how they might be implemented in practice.

Table 1 Practical strategies to adapt to case-selection breakdown

Problem: Fieldwork upends the understanding of the population to which the original case(s) belong. Strategy: Rethink what constitutes a “case” (re-casing).
Problem: New insights call into question the values of the outcome variable within the original case(s). Strategy: Shift the object of analysis from outcomes to processes.
Problem: Unanticipated changes cut off data or field site access. Strategy: Use dominant theoretical models as ideal types for comparison.
Problem: Fieldwork-induced changes in the project’s comparative logic render some case(s) extraneous. Strategy: Drop extraneous case(s).

Rethinking What Constitutes a Case

First, as noted earlier, a major challenge facing positivist qualitative research is that determining what something is a case of—the broader universe of units that it represents—requires prior knowledge that scholars may not have until they engage in fieldwork. As a result, time and data accumulated in the field may reveal that the pre-identified set of cases are not appropriate for examining the question(s) at hand. For example, perhaps a researcher studying economic development strategies treated “country” as their unit of analysis, only to find significant differences at the subnational level. Conversely, what if a researcher studying criminal violence at the subnational level treated local gang cliques as their units of analysis, only to find that centralized leadership rendered the national-level gang organizations more salient in understanding the outcome of interest?

One key strategy for confronting the inappropriateness of the pre-identified unit of analysis is reconsidering what constitutes a “case” in the first place by building on the data already collected. Guidance on “re-casing” is instructive. Political scientists are often counselled to purposefully select their cases before embarking on fieldwork; however, this need not be the order of operations required to engage in rigorous and insightful research. Against a “realist” view, which holds that cases are out there waiting to be found, we might see the process of casing—or “[adopting] a schema of understanding … that organizes and guides our analysis”—as an ongoing activity within the course of field research, as Soss contends (2021, 90). With this approach, the breakdown of one’s initial case selection mechanism is not a crisis only resolved through costly or impractical changes in fieldwork locales; it is an opportunity to step back and think creatively about the case(s) that have emerged through field-based learning and about how the data already collected can be repurposed and analyzed through new frames.

A common re-casing strategy put forward in fieldwork guidance is disaggregation to allow for the comparison of subnational units. Though cross-country comparison was once the modal approach to research in comparative politics, scholars have increasingly embraced the comparison of subnational units within a single country context as providing leverage on some of the most important questions in the discipline (see Giraudy, Moncada, and Snyder 2019). As Snyder (2001) notes, adopting a subnational approach may allow researchers to increase the number of observations within their study, more accurately code the attributes of cases to enhance causal inference, and understand the dynamics and connections between the different levels of a political system. But this strategy can also serve another end for field researchers confronting a crisis of research design: it can provide a productive way of pivoting when the researcher realizes that the original unit of analysis is not appropriate for the question at hand.

There are also other techniques for rethinking what a case is beyond leveraging subnational variation in this way. For example, Riofrancos (2021, 120-21) argues for an approach that centers field “sites” as constitutive of broader phenomena rather than “cases” of a discrete outcome. In this sense, even a single “case” (i.e., a geographically bounded entity) contains multiple sites in which political and social processes can be contested and compared. This approach also offers a strategy for navigating the breakdown of one’s case selection rationale: looking both beyond the case(s) as initially conceived to uncover the broader global phenomena in question and within the case(s) to locate the multiple sites that are “politically salient” (120). Importantly, such a strategy does not shirk methodological rigor. Instead, the tacking back and forth between sites “[strengthens] both empirical acumen and analytical leverage by honing our concepts and subjecting them to constant tests provided by events, interviews, and archives” (122).

Re-casing can be a fruitful strategy for navigating and adapting to the breakdown of one’s original selection rationale. Here, I offer three examples—one from my own dissertation research, one from Benjamin Read’s (2021) study of neighborhood organizations in China and Taiwan, and one from Sarah Parkinson’s work on militant groups in Lebanon.

My dissertation project was driven by a deep interest in the legacies of armed conflict in Central America, specifically how Cold War-era counterinsurgent campaigns reshaped state institutions in ways that distorted political and economic development. During initial fieldwork in Guatemala in 2015, I witnessed an unprecedented anti-corruption movement, which uncovered criminal structures embedded in the state—some of which, including a high-profile customs fraud network, were rooted in the militarization of government at the height of the civil war in the late 1970s. Beyond the detective story-like intrigue sparked by the case, I thought it also had something important to say about classic theories of conflict and state formation, which have stood at the center of research in comparative politics, international relations, and political sociology. In line with conventional approaches, civil war did contribute to the construction of state administrative institutions; however, such institutions do not always enhance the state’s capacity to carry out core functions. Instead, they may distort and undermine these aims. This is what Guatemala was a case of, in comparative terms.

But as with all sound controlled comparisons, crafting a viable research design required finding another case—ideally a comparable national context in which civil war built new institutions that strengthened state capacity—to craft a most-similar-systems design. My fieldwork would then seek to uncover the explanatory variable(s) that differed between the two cases and thus plausibly explained the divergent outcomes. I settled on Nicaragua, a country that also experienced a Cold War-era conflict and that, in the post-conflict period, was considered Central America’s outlier, experiencing less violence and evincing stronger state presence and capacity.

But the inappropriateness of thinking about these two country contexts as my study’s cases became very clear, particularly in Nicaragua, where wartime institutional changes had a state-undermining effect in some sectors, like land administration, and a state-reinforcing effect in others, like public security. In other words, approaching my study through a national-level lens obscured critical sectoral dynamics, which prompted me to shift my unit of analysis from country to institution. Not only was this institutional turn in line with growing scholarly calls to disaggregate the state (Brenner et al. 2008; Ferguson and Gupta 2002), but it also allowed me to unpack the dynamics of institutional change during wartime in a more fine-grained and empirically richer way.

Benjamin Read (2021) lays out a similar approach in reflecting on his own work comparing across regime types. Read’s study of local Residents’ Committees (RCs) in China and Taiwan originally cased these entities as “mass organizations, common in all communist systems” (223); however, insights gleaned through comparison across China’s and Taiwan’s distinctive regime types allowed him to recognize the broader conceptual leverage of the phenomenon under study and thus re-case the project to focus on “state-backed neighborhood organizations” that engaged in “administrative grassroots engagement” (223).

Finally, Sarah Parkinson’s rich fieldwork on militant groups in Palestinian refugee camps in Lebanon reflected a similar process. Parkinson initially took for granted the idea that “camps” were suitable units to compare, only to find that the different “factions’ organizational structures themselves varied geographically,” thus calling into question the utility of treating camps as cases (Parkinson 2021, 159). Instead, focusing on different network configurations as the object of analysis allowed her to more meaningfully examine how militant organizations evolve. In all three of these studies, the fieldwork-induced re-casing process not only allowed the researchers to shift to more salient, contextually grounded units of analysis but also contributed to conceptual developments potentially useful for scholars studying similar phenomena across the world.

Shifting from the Analysis of Outcomes to Processes

Among the most common causes of case selection collapse is the realization that one has miscoded the attributes of a case (or multiple cases), undermining the logic that drove the choice of fieldwork sites. For example, based on previous studies, a researcher may enter the field having chosen cases because of their values on the dependent variable—whether that dependent variable is healthcare infrastructure, violent crime, human rights treaty compliance, or bureaucratic capacity. But perhaps significant political, social, or economic developments since the publication of those earlier studies mask related changes in the outcome of interest, upending previous characterizations of the case. In low-information settings, characterizations of a case may be based on biased or incomplete data. In either scenario, the realization that one’s case(s) look different than anticipated undermines the initial case selection rationale by invalidating the convergent or divergent outcomes around which the study was designed.

But outcomes are far from the only entities fit for analysis. Processes—the conjunctions of actions and events that produce an outcome—are often crucial objects of analysis, as a growing literature on process tracing has recognized (see Bennett and Checkel 2015; Beach and Pedersen 2019; Fairfield and Charman 2017). Without uncovering the mechanisms linking a cause to an effect—mechanisms that often combine in a processual fashion—we are unable to fully illuminate the phenomenon of interest.

The realization that one has misjudged the value of a pre-selected case’s outcome, in fact, provides new opportunities to examine why cases previously seen as convergent or divergent are not actually so, and thus to uncover new insights about the causal processes underlying these unanticipated juxtapositions. For example, in my own dissertation research introduced earlier, my initial fieldwork stint in Guatemala yielded exciting archival and interview data that largely corroborated my characterization of the case. However, a crisis of research design upended the divergences that anchored my case selection once I arrived in Nicaragua, a country in which I had comparatively less experience. Upon immersion, the Nicaraguan context did not look like one in which civil war had generated new, more capable state institutions, as I had originally anticipated. Instead, I kept stumbling upon instances in which the counterinsurgent imperative had bred perverse institutional arrangements, which, for example, facilitated wartime drug trafficking and other illicit activities or subverted the state’s ability to regulate land tenure. In other words, within select institutional domains, Nicaragua looked surprisingly like Guatemala.

By turning to the institutional level, I refocused my research on trying to understand why these developments unfolded, thus leading me to center the process of wartime institutional change. In so doing, I discovered that, despite vast differences in the institutional domains under examination and in Guatemala’s and Nicaragua’s wartime contexts, the processes of institutional change looked remarkably similar. Specifically, perceptions that insurgent forces posed an increasingly serious, if not existential, threat led to the insulation of a narrow counterinsurgent elite coalition, which operated with broad discretion and faced few countervailing social or political forces to challenge its authority. To maintain their grip on power or accrue private benefits, this elite coalition crafted new rules and procedures that distorted state functioning. Shifting to an intensive analysis of institutional processes thus revealed unanticipated similarities and new theoretical insights.

My project is not unique in its orientation toward illuminating and comparing processes. Taylor’s (2023) study of how social rights become constitutionally embedded, for example, unpacks this process following the passage of Colombia’s 1991 Constitution while also comparing it to the process that unfolded in response to South Africa’s 1996 Constitution. Falleti’s (2010) book examines decentralization through a processual lens, focusing on how the nature of decentralizing policies depends on the sequencing of reforms. And in explaining the construction of participatory institutions in Brazil and Colombia, Mayka (2019) examines how sweeping sectoral reforms open windows of opportunity for policy entrepreneurs to promote new ideas, create civil society networks, and build pro-reform coalitions that contribute to these institutional innovations. Moreover, in fields like conflict studies, long dominated by correlational research, scholars have urged greater attention to unpacking causal processes (Lyall 2015). Given the growing calls for, and incidence of, process-oriented research in political science, this strategy may not only help field researchers rethink troubled research designs but also provide key scholarly contributions.

Utilizing Dominant Theoretical Models as Comparisons to Develop New Insights

In the previous two scenarios, the breakdown of one’s case selection mechanism is more a product of new field-based insights that force design adjustments. But there is another challenge that is arguably more dire and is also becoming more common in our increasingly volatile world: abrupt changes in on-the-ground conditions, which may deny researchers key information or force a premature exit from the field altogether. Of course, in the most extreme cases, the researcher may not yet have collected much data, thus rendering the guidance in this essay moot. But when such changes in access occur after a core of empirical materials has been accumulated, how can scholars cope with partially implemented data collection plans? If the loss of data access means that the researcher does not have sufficient information to include one or more of their pre-selected cases, how can case selection be rethought to allow them to make the most of the data already accumulated?

A strategy for addressing this challenge is returning to extant theoretical models, which may allow researchers to elaborate more robust comparisons that generate new theoretical insights. The abrupt cessation of data access or concerns that require one to depart their field site(s) are likely to leave the researcher without the variation around which they designed their project—whether treated as outcomes, explanatory variables, or processes. Absent variation, qualitative research that aims to inform theory falls victim to accusations of selectively cherry-picking data points to tell a broader story. Even those who encourage a more flexible approach to casing advise “[selecting] units … that offer interesting variation in whatever you wish to understand more about” (Htun and Jensenius 2021, 194, emphasis original).

One solution is to reframe variation as that between the empirical process being observed and the “general claims of an ideal type,” as reflected in Saylor’s (2020, 982) technique for crafting causal explanations. Rather than finding the variation necessary to draw causal inferences, Saylor argues that researchers elaborating political processes to explain the production of an effect can do so “by considering to what extent an analytical ideal type renders a case intelligible and how case-specific factors affected the outcome as well” (Saylor 2020, 1002). Ideal types, which can be drawn from broader theories, offer “specialized conceptual filters that focus our scholarly attention on particular aspects of actually existing things” (Jackson 2010, 145; in Saylor 2020, 1002-3). The “extent to which the ideal type can account for the permutation” within the chosen case, as well as contextual divergences that shape the presence of relevant causal mechanisms, can help scholars achieve causal explanation.

Many researchers already engage with analytical ideal types in this way. But it is also important to note that this strategy goes beyond theory testing. Rather than simply confirming or disconfirming hypotheses developed a priori, this approach encourages scholars to delve deeper into the potential similarities or dissonances observed between one’s empirical case(s) and extant theory and to elaborate how contextual factors might shape resulting causal explanations. Saylor (2020, 1006) illustrates what this looks like by drawing on Spruyt’s (1994) work on the rise of the state system. Spruyt theorizes that increased trade and, in turn, growing merchant power forged new political coalitions that produced distinct institutional formations. Saylor interprets Spruyt’s use of the French case as an ideal type: as trade grows, merchants, who “traded in low value-added goods … wanted to reduce transaction costs by establishing centralized rule,” thus leading to alliances with the state and subsequent centralization (ibid.). By deploying the seemingly similar German context as a comparison, Spruyt, according to Saylor, develops a “more robust” explanation for institutional variation by demonstrating how and why the German king allied with landed elites rather than merchants, impeding centralized rule (Saylor 2020, 1007). In other words, the model developed out of the French case is not merely disconfirmed in the German case. Analyzing and elaborating the salient divergences helps uncover new theoretical insights.

What might this approach mean for those experiencing the breakdown of their initial case selection due to a lack of data access? Importantly, it suggests that convergences and divergences can be built into research projects in new ways that “embolden unconventional comparisons” (Saylor 2020, 1008). Scholars engaged in theoretically grounded research have likely designed a study that references some ideal type drawn from previous literature. Whether or not their initial case selection rationale survives the uncertainties of field research, that ideal type can serve as a point of departure. Researchers immersed in the field need not stretch to find new cases, but instead can take stock of the broader theoretical picture and anchor their study in the variations (or lack thereof) from an ideal type to construct causal explanations.

My own dissertation research deployed this approach, returning to the classic bellicist model of state building to highlight critical divergences and craft a new explanation for wartime institutional change. In sifting through secondary literature and archival information on wartime institutional development in Nicaragua, I quickly realized that data access there was far more restricted than in Guatemala and that I would not be able to leverage an institution-level case of state-bolstering wartime changes. In stepping back, however, I realized that this new research design did, in fact, allow me to take advantage of a source of variation that I would not have recognized otherwise: Tilly’s “warmaking as statemaking” framework, the theoretical grounding of the project.

According to Tilly’s account, which was derived from the study of European polities from the tenth to fifteenth centuries (see footnote 2), as rulers sought to expand their territorial control, they came into conflict with the population from which they needed to extract resources—“men, materials, and money” (Finer 1975, 96)—to successfully wage war. As a result, they needed to build new institutions to subdue internal challengers, conscript soldiers, and levy taxes. In addition, mobilizing the population and its resources induced bargaining between rulers and their societies, which contributed to administrative institutions (Tilly 1990, 25). In short, war requires the accumulation of resources, which leads to the construction of state institutions and the bolstering of state capacity.

Curiously, Nicaragua’s wartime land administration, a domain on which I did have substantial archival documentation, was emblematic of the resource accumulation strategy—the mobilization of, bargaining with, and extraction from mass actors—that Tilly posits bolsters state institutions. How, then, did the Nicaraguan case vary from this classic bellicist theory, triggering a divergent institutional trajectory? Despite marshalling wartime resources from popular sectors, Nicaragua’s Sandinista government became increasingly insulated, particularly as the economic strains of conflict deepened. The narrow FSLN ruling coalition, rather than incorporating countervailing social and political forces, undertook policies to strengthen peasant dependence on the regime, bolstering its rural control. By placing this empirical institutional process into conversation with the Tillyan model, I was able to uncover distinct wartime state-society dynamics and thus refine the causal explanation for why armed conflict generated different kinds of institutional logics.

Though this approach has become quite common in studies of war and state formation, the Tillyan example is far from the only theoretical model that has provided leverage as a source of variation in contemporary political science research. In her work on China’s economic strategy amid increasing globalization, Roselyn Hsueh (2011, 14) utilizes the developmental state model, in which the state serves “as a coordinator of economic growth, [insulates] private industry from penetration by foreign capital by decoupling technology and investment, [acts] as a market gatekeeper, [filters] external entry into the market, and, at the same time, [uses] market-conforming mechanisms to spur industrial development” (see footnote 3). By drawing on this ideal type developed by Johnson (1982) in the context of Japanese state-led development (and subsequently applied to other East Asian newly industrialized countries), Hsueh (2011) elaborates divergent dynamics in China, where macro-level liberalization of foreign direct investment has been combined with strategic reregulation at the sectoral level. In so doing, Hsueh not only makes sense of the puzzling Chinese case but also articulates a new bifurcated model of state-led development that may provide theoretical insights beyond China.

Beyond these examples, scholars have leveraged divergences from canonical theories to provide ground-breaking theoretical insights into why individuals join rebellions or engage in social mobilization. For example, Elisabeth Wood’s (2003) study of peasant mobilization in El Salvador’s civil war—now a staple on political violence syllabi—does not necessarily center the difference between joiners and non-joiners, but rather the moral and emotional motives that spurred individual mobilization, in contrast to dominant theories focused on material incentives. Likewise, in contrast to classic contentious politics approaches, Erica Simmons (2016) underscores the critical importance of the ideational content, rather than just the material function, of grievances in spurring social mobilization against state policies, drawing on two convergent cases of protest: one against water privatization in Bolivia and another against the lifting of corn subsidies in Mexico. Whether the framing of these studies was driven by fieldwork-induced crises of research design is less significant than what they ultimately show: leveraging a dominant theoretical model can serve as a rich source of variation that sharpens a study’s contributions.

Dropping Extraneous Cases

The three previously mentioned strategies for adapting to the breakdown of a study’s case selection mechanism in the field range from more substantial to minor fixes. Rethinking what constitutes a case and reorienting the object of analysis from outcome to process are more extensive changes that may entail altered research questions and sources of data. By contrast, leveraging an extant theoretical model as an ideal type to refine causal explanations may require a tweak of the framing, rather than major changes to the nuts and bolts of the project itself. But with all three of these solutions, the selection of cases and the underlying rationale may change, rendering previously central cases no longer viable and, in some cases, even extraneous. In other scenarios, field researchers may misjudge their time and resources, as well as their own bandwidth and capacity, to undertake fieldwork (Newsome 2014, 157). When other strategies have been adopted or the study’s objectives—original or rethought—can be accomplished without investigating the full array of cases previously selected, a prudent approach may be dropping cases that are no longer salient.

Within my own dissertation fieldwork, reducing the number of cases stemmed from the revised comparative logic of the project itself. In the original research design, Guatemala and Nicaragua were my two country cases; in turning from countries to state institutional sectors, I might then have envisioned a project with six cases grouped into three pairs—Guatemala’s and Nicaragua’s tax institutions, policing institutions, and property administrations. Yet with the project’s focus shifted to elaborating processes of institutional change, collecting data and undertaking detailed process tracing for six cases would have been well beyond the time and resource limitations I faced. In addition, accessing comparable data on some of these cases, like Nicaragua’s tax apparatus, was exceedingly difficult.

But beyond these constraints, the revised comparative logic of the project made it no longer necessary to analyze all six cases. The study’s objective was to understand why undermining institutional arrangements developed within vastly different wartime contexts and why some endured into peace while others did not—goals I could accomplish by relying on the divergent conditions of the Guatemalan and Nicaraguan civil-war settings as well as variation in institutional persistence across three of the cases (Guatemala’s tax administration, Guatemala’s policing institutions, and Nicaragua’s land administration). In short, by rethinking the objectives of the study, I was able to drop cases that became extraneous while developing deeper analysis of those that reflected variation on important dimensions.

Dropping cases or field sites when the (revised) case selection rationale warrants it is not only a useful fix after misjudging one’s time and resources. It can also be a solution to another common problem: the dread of undertaking fieldwork, which may limit one’s mental, emotional, and even physical capacity to complete it. This challenge is articulated by self-described “fieldwork-hater” Amelia Hoover Green (2020), whose study of how political education can restrain violence by armed actors in wartime originally called for in-depth fieldwork in El Salvador and Sierra Leone. After acknowledging her aversion to fieldwork, however, Hoover Green (2020, 120) writes that “I finally decided not to travel to Sierra Leone, both because I judged that I could write a good dissertation on the basis of subnational variation in El Salvador alone and because I worried that I’d be too miserable to function in Sierra Leone.” On the one hand, dropping Sierra Leone as a case resulted from the researcher recognizing her own fieldwork limits; on the other hand, it also emerged following inductive discoveries and the reorientation of case selection, which rendered the additional country and fieldwork no longer essential. In short, eliminating cases can constitute another useful strategy for confronting the practical and personal constraints that arise amid fieldwork.

Reimagining Fieldwork Training and Preparation

My purpose is to urge greater acknowledgement that field researchers frequently face problems arising from case selection collapse but are seldom trained to confront them in a concrete and realistic way. Longstanding conversations on field research within political science recognize the need for adaptation in response to unexpected challenges, but few provide specific, actionable strategies for confronting these on-the-ground realities and using these seeming crises as opportunities for discovery. When it comes to addressing the breakdown of one’s original case selection strategy, such efforts often feel like a scramble to make something out of nothing. But as the examples discussed above indicate, these moments can often yield the most exciting new empirical insights and prompt fruitful theoretical innovations, which not only advance the state of knowledge in our field but also authorize other scholars to embrace and think creatively about their own field-induced disruptions.

Thinking about research design in these terms remains taboo within a discipline that prizes its “scientific” identity. Openly and reflexively elaborating one’s iterative research process and fieldwork experience within a study thus comes with its own costs and tradeoffs. Just as processes of inductive iteration entail context-driven adjustments that might limit the generalizability of findings, deploying the recovery strategies outlined here may lead to research designs that enhance local relevance and amplify empirical discoveries but that cannot account for whether findings travel to distinct contexts. Relatedly, so long as methodological standards in our discipline fixate on the a priori development of research questions, concepts, sampling techniques, and analytical strategies, convincing scholars to take the leap of faith and air their methodological “dirty laundry,” so to speak, will not be terribly palatable, particularly for graduate students and early-career researchers who have the most at stake.

Of course, we should not simply ignore the reasons why scholars feel the need to hide the missteps and messiness of what happens in the field. The professional incentives of preserving the image of an unproblematically executed, pre-planned case selection strategy are no doubt powerful, especially for junior scholars. These incentives surface in how we frame and structure our research in journal articles, books, and presentations and in how we teach our students to undertake social-scientific inquiry, and thus perpetuate the myth of how “good research” unfolds. But recent upheaval engendered by challenges like the COVID-19 pandemic presents an opportunity to reconsider whether the myth is worth perpetuating, as well as the kinds of techniques and logics that might replace it. We owe future field researchers the advantages and insights of this opportunity.

If we take seriously these strategies for pivoting amid the collapse of one’s case selection in the field, how might this shape the way we prepare for fieldwork and train graduate students entering the field? First, fieldwork preparation that anticipates a crisis of research design would much more intentionally encourage researchers to reflect on what they don’t know. Which settings, archives, and interviewees are you relatively confident you will be able to access, and which will entail much more uncertainty? How might a lack of access to certain locales or data sources affect your overall research design? Are there on-the-ground dynamics in your field sites that could affect data access and your ability to live and work there altogether? While these conversations about unknowns often happen informally between graduate students and their advisors, they should also be part of formal disciplinary exercises, like required research design courses and dissertation proposals. It is no secret that dissertation proposals are often ripped up once one enters the field and things fall apart. Instead of pretending that the neat research design contained within them unfolds perfectly, graduate programs can encourage or even require students embarking on dissertation fieldwork to include an “unknowns” section in which they purposively lay out the doubts, concerns, and uncertainties that could affect case selection. This could provoke more open and reflexive conversations that allow graduate students to better anticipate problems and to pivot when they arise.

Relatedly, graduate fieldwork preparation and training could encourage students entering the field to cast a wide net when it comes to cases and data. Rather than encouraging burgeoning researchers to put their projects in boxes and settle on their precise cases and, subsequently, on the kinds of data they need to collect, training ahead of fieldwork would position graduate students to anticipate multiple ways of “casing” their projects (Soss 2021), as well as multiple objects of analysis. In other words, fieldwork preparation would entail a process of broadening rather than narrowing—a process of anticipating and developing multiple units of analysis, divergences and convergences, and sources of data that can be explored in the field. While this may also happen within informal discussions with mentors and advisors, it can also be formalized within pre-fieldwork and dissertation proposal requirements. For example, rather than justify one’s intended case selection, programs or advisors can urge their students to put forward multiple possible casings—ways of understanding what something is a case of—to set them up for potential changes in the field.

Finally, training that helps students anticipate the crisis of research design and prepares them to shift course amid case-selection collapse should also encourage scholars to keep the big theoretical questions at the center. While often the most rewarding aspect of the research enterprise, fieldwork is frequently the most overwhelming. It is easy to get stuck in the empirical weeds while in the field and to lose sight of the overarching scholarly motivation that likely propelled the project initially—at least in part. The tacking back and forth between on-the-ground knowledge and broader theories, which scholars increasingly recognize as quite normal (Yom 2015), can also be critical to finding your way out of a rut when the crisis of research design strikes. Building in time while in the field to intentionally engage in this abductive exercise can help recenter projects that have gone off the rails by allowing field researchers to reflect on what they have learned on the ground and how it speaks to the most important questions within our discipline.

Footnotes

1 For an important exception, see the recent contribution by Kapiszewski, MacLean, and Read (2022).

2 It is important to recognize that the Tillyan framework sought to explain state building amid foreign rather than domestic conflict, which some may argue renders my application of this causal model an inappropriate shift in scale. However, in my dissertation, I was more interested in the process of institutional development, or the mechanisms linking war and institutional development. Given that state actors fighting internal armed conflicts also require resources to wage war, we can envision similar causal processes emerging in these contexts and, indeed, others have illustrated that they do; see, for example, Slater 2010; Flores-Macías 2014; and Slater and Smith 2016.

3 I am grateful to Reviewer 4 for pointing out this example.

References

Beach, Derek, and Pedersen, Rasmus Brun. 2019. Process-Tracing Methods: Foundations and Guidelines. 2nd edition. Ann Arbor: University of Michigan Press.
Bennett, Andrew, and Checkel, Jeffrey. 2015. Process Tracing: From Metaphor to Analytic Tool. New York: Cambridge University Press.
Brenner, Neil, Jessop, Bob, Jones, Martin, and Macleod, Gordon. 2008. State/Space: A Reader. Hoboken, NJ: John Wiley & Sons.
Collier, David, Mahoney, James, and Seawright, Jason. 2004. “Claiming Too Much: Warnings about Selection Bias.” In Rethinking Social Inquiry, ed. Brady, Henry E. and Collier, David, 85–102. Lanham: Rowman and Littlefield.
Cronin-Furman, Kate, and Lake, Milli. 2018. “Ethics Abroad: Fieldwork in Fragile and Violent Contexts.” PS: Political Science & Politics 51(3): 607–14.
Emmons, Cassandra V., and Moravcsik, Andrew M. 2020. “Graduate Qualitative Methods Training in Political Science: A Disciplinary Crisis.” PS: Political Science & Politics 53(2): 258–64.
Fairfield, Tasha, and Charman, Andrew. 2017. “Explicit Bayesian Analysis for Process Tracing: Guidelines, Opportunities, and Caveats.” Political Analysis 25(3): 363–80.
Falleti, Tulia G. 2010. Decentralization and Subnational Politics in Latin America. New York: Cambridge University Press.
Ferguson, James, and Gupta, Akhil. 2002. “Spatializing States: Toward an Ethnography of Neoliberal Governmentality.” American Ethnologist 29(4): 981–1002.
Finer, Samuel E. 1975. “State- and Nation-Building in Europe: The Role of the Military.” In The Formation of National States in Western Europe, ed. Tilly, Charles, 84–163. Princeton, NJ: Princeton University Press.
Flores-Macías, Gustavo A. 2014. “Financing Security through Elite Taxation: The Case of Colombia’s ‘Democratic Security Taxes.’” Studies in Comparative International Development 49: 477–500.
Gerring, John. 2004. “What Is a Case Study and What Is It Good For?” American Political Science Review 98(2): 341–54.
Giraudy, Agustina, Moncada, Eduardo, and Snyder, Richard, eds. 2019. Inside Countries: Subnational Research in Comparative Politics. Cambridge: Cambridge University Press.
Hoover Green, Amelia. 2020. “Successful Fieldwork for the Fieldwork-Hater.” In Stories from the Field: A Guide to Navigating Fieldwork in Political Science, ed. Krause, Peter and Szekely, Ora, 115–23. New York: Columbia University Press.
Hsueh, Roselyn. 2011. China’s Regulatory State: A New Strategy for Globalization. Ithaca, NY: Cornell University Press.
Hsueh, Roselyn, Jensenius, Francesca Refsum, and Newsome, Akasemi. 2014. “Fieldwork in Political Science: Encountering Challenges and Crafting Solutions: Introduction.” PS: Political Science & Politics 47(2): 391–93.
Htun, Mala, and Jensenius, Francesca R. 2021. “Comparative Analysis for Theory Development.” In Rethinking Comparison: Innovative Methods for Qualitative Political Inquiry, ed. Simmons, Erica S. and Smith, Nicholas Rush, 190–207. New York: Cambridge University Press.
Irgil, Ezgi, Kreft, Anne-Kathrin, Lee, Myunghee, Willis, Charmaine N., and Zvobgo, Kelebogile. 2021. “Field Research: A Graduate Student’s Guide.” International Studies Review 23(4): 1495–517.
Jackson, Patrick Thaddeus. 2010. The Conduct of Inquiry in International Relations: Philosophy of Science and Its Implications for the Study of World Politics. 1st edition. London; New York: Routledge.
Johnson, Chalmers. 1982. MITI and the Japanese Miracle: The Growth of Industrial Policy, 1925–1975. Palo Alto, CA: Stanford University Press.
Kalyvas, Stathis N. 2020. “Fieldwork by Decree, Not Design.” In Stories from the Field: A Guide to Navigating Fieldwork in Political Science, ed. Krause, Peter and Szekely, Ora, 49–57. New York: Columbia University Press.
Kapiszewski, Diana, MacLean, Lauren M., and Read, Benjamin L. 2015. Field Research in Political Science: Practices and Principles. Cambridge, UK: Cambridge University Press.
Kapiszewski, Diana, MacLean, Lauren M., and Read, Benjamin L. 2018. “Reconceptualizing Field Research in Political Science.” Oxford Encyclopedia of Politics. DOI: 10.1093/acrefore/9780190228637.013.722.
Kapiszewski, Diana, MacLean, Lauren M., and Read, Benjamin L. 2022. “Dynamic Research Design: Iteration in Field-Based Inquiry.” Comparative Politics 54(4): 645–70.
Knott, Eleanor. 2019. “Beyond the Field: Ethics after Fieldwork in Politically Dynamic Contexts.” Perspectives on Politics 17(1): 140–53.
Koivu, Kendra, and Hinze, Annika Marlen. 2017. “Cases of Convenience? The Divergence of Theory from Practice in Case Selection in Qualitative and Mixed-Methods Research.” PS: Political Science & Politics 50(4): 1023–27.
Konken, Lauren C., and Howlett, Marnie. 2022. “When ‘Home’ Becomes the ‘Field’: Ethical Considerations in Digital and Remote Fieldwork.” Perspectives on Politics 21(3): 849–62.
Krause, Peter, and Szekely, Ora, eds. 2020. Stories from the Field: A Guide to Navigating Fieldwork in Political Science. New York: Columbia University Press.
LaPorte, Jody. 2014. “Confronting a Crisis of Research Design.” PS: Political Science & Politics 47(2): 414–17.
Lieberman, Evan. 2004. “Symposium on Field Research.” Qualitative Methods 2(1). https://evanlieberman.org/wp-content/uploads/2014/01/lieberman-howard-lynch-field-research.pdf (accessed September 2, 2022).
Loyle, Cyanne E., and Simoni, Alicia. 2017. “Researching Under Fire: Political Science and Researcher Trauma.” PS: Political Science & Politics 50(1): 141–45.
Lyall, Jason. 2015. “Process Tracing, Causal Inference, and Civil War.” In Process Tracing: From Metaphor to Analytic Tool, ed. Bennett, Andrew and Checkel, Jeffrey T., 186–208. New York: Cambridge University Press.
Mayka, Lindsay. 2019. Building Participatory Institutions in Latin America: Reform Coalitions and Institutional Change. New York: Cambridge University Press.
Newsome, Akasemi. 2014. “Knowing When to Scale Back: Addressing Questions of Research Scope in the Field.” PS: Political Science & Politics 47(2): 410–13.
Ortbals, Candice D., and Rincker, Meg E. 2009. “Fieldwork, Identities, and Intersectionality: Negotiating Gender, Race, Class, Religion, Nationality, and Age in the Research Field Abroad: Editors’ Introduction.” PS: Political Science & Politics 42(2): 287–90.
Parkinson, Sarah E. 2021. “Composing Comparisons: Studying Configurations of Relations in Social Network Research.” In Rethinking Comparison: Innovative Methods for Qualitative Political Inquiry, ed. Simmons, Erica S. and Smith, Nicholas Rush, 152–71. New York: Cambridge University Press.
Peters, B. Guy. 2013. Strategies for Comparative Research in Political Science. New York: Bloomsbury.
Posner, Daniel N. 2020. “Be Prepared (To Go Off-Script).” In Stories from the Field: A Guide to Navigating Fieldwork in Political Science, ed. Krause, Peter and Szekely, Ora, 88–92. New York: Columbia University Press.
Read, Benjamin L. 2021. “Problems and Possibilities of Comparison across Regime Types: Examples Involving China.” In Rethinking Comparison: Innovative Methods for Qualitative Political Inquiry, ed. Simmons, Erica S. and Smith, Nicholas Rush, 208–30. New York: Cambridge University Press.
Riofrancos, Thea. 2021. “From Cases to Sites: Studying Global Processes in Comparative Politics.” In Rethinking Comparison: Innovative Methods for Qualitative Political Inquiry, ed. Simmons, Erica S. and Smith, Nicholas Rush, 107–26. New York: Cambridge University Press.
Saylor, Ryan. 2020. “Why Causal Mechanisms and Process Tracing Should Alter Case Selection Guidance.” Sociological Methods & Research 49(4): 982–1017.
Schrank, Andrew. 2006. “Bringing It All Back Home: Personal Reflections on Friends, Findings, and Fieldwork.” In A Handbook for Social Science Research: Essays and Bibliographic Sources on Research Design and Methods, ed. Perecman, Ellen and Curran, Sara R., 217–25. Thousand Oaks, CA: Sage Publications.
Schwartz, Rachel A. 2023. Undermining the State from Within: The Institutional Legacies of Civil War in Central America. New York: Cambridge University Press.
Simmons, Erica S. 2016. Meaningful Resistance: Market Reforms and the Roots of Social Protest in Latin America. New York: Cambridge University Press.
Simmons, Erica S., Smith, Nicholas Rush, and Schwartz, Rachel A. 2018. “Symposium: Rethinking Comparisons.” Qualitative and Multi-Methods Research 16(1): 1–7.
Slater, Dan. 2010. Ordering Power: Contentious Politics and Authoritarian Leviathans in Southeast Asia. New York: Cambridge University Press.
Slater, Dan, and Smith, Nicholas Rush. 2016. “The Power of Counterrevolution: Elitist Origins of Political Order in Postcolonial Asia and Africa.” American Journal of Sociology 121(5): 1472–516.
Snyder, Richard. 2001. “Scaling Down: The Subnational Comparative Method.” Studies in Comparative International Development 36(1): 93–110.
Soss, Joe. 2021. “On Casing a Study versus Studying a Case.” In Rethinking Comparison: Innovative Methods for Qualitative Political Inquiry, ed. Simmons, Erica S. and Smith, Nicholas Rush, 84–106. New York: Cambridge University Press.
Spruyt, Henrik. 1994. The Sovereign State and Its Competitors. Princeton, NJ: Princeton University Press.
Taylor, Whitney K. 2023. The Social Constitution: Embedding Social Rights through Legal Mobilization. New York: Cambridge University Press.
Tilly, Charles. 1990. Coercion, Capital, and European States, A.D. 990–1990. Rev. ed. Cambridge, MA: Wiley-Blackwell.
Wood, Elisabeth Jean. 2003. Insurgent Collective Action and Civil War in El Salvador. New York: Cambridge University Press.
Yanow, Dvora, and Schwartz-Shea, Peregrine. 2012. Interpretive Research Design: Concepts and Processes. New York: Routledge.
Yom, Sean. 2015. “From Methodology to Practice: Inductive Iteration in Comparative Research.” Comparative Political Studies 48(5): 616–44.
Table 1. Practical strategies to adapt to case-selection breakdown