
Selecting in or Selecting Out? Gender Gaps and Political Methodology in Europe

Published online by Cambridge University Press:  29 October 2019

Malu A. C. Gatto
Affiliation:
University College London
Anita R. Gohdes
Affiliation:
Hertie School of Governance, Berlin
Denise Traber
Affiliation:
University of Lucerne
Mariken A. C. G. van der Velden
Affiliation:
Free University of Amsterdam

Abstract

Studies investigating gender gaps in the doctoral training of political science students have focused so far overwhelmingly on the US context. Although important research within this context has made strides in identifying the persistent challenges to women’s incorporation in political methodology, much remains unknown about whether women and men have different experiences in methods training during their PhD programs. We contribute to this debate by analyzing data from an original survey on the methods-training experiences of political science PhD students at different European universities. We assess whether gender gaps exist with respect to PhD students’ methods training and confidence in employing methods skills. Our findings show that women cover significantly fewer methods courses in their doctoral training. When women do participate in methods training, they show levels of method employment similar to their male colleagues. We discuss the implications of these findings in the context of European doctoral training.

Copyright © American Political Science Association 2019 

After building extensive bodies of literature to explain persistent gender gaps in politics and society, academics have turned to their own institutions. Recent studies have shed light on how gender biases persist in academia, disadvantaging women from graduate school to tenured positions (Dion, Sumner, and Mitchell 2018; Knobloch-Westerwick, Glynn, and Huge 2013; Maliniak, Powers, and Walter 2013). These disparities accumulate, with implications for wages, promotions, and tenure; the multiple dimensions of gender gaps therefore also have consequences for women’s overall presence in academia. Although women represent roughly 40% of political science PhDs in the United States, they are still underrepresented in political science meetings and senior positions (Teele and Thelen 2017).

This underrepresentation is particularly pronounced in the subfield of political methodology. Women constitute less than 20% of participants at the Society of Political Methodology (POLMETH) annual meeting (Barnes 2018), and they are less likely to publish studies using quantitative and computational methods (Teele and Thelen 2017) and to be lecturers in quantitative-methods modules (Barnes 2018). In other words, if “political methodologists” can be understood as those who participate in the community (Esarey 2018) and have methods-centered teaching and research interests (Leeper 2018), then women are substantially underrepresented in this category. In fact, even when women use the same methods as men, they are still less likely to characterize themselves as methodologists (Esarey 2018; Shannon 2014).

In examining the roots of the gender gap in political methodology, some scholars have pointed to a problem of self-selection: women tend to select out of courses that cover quantitative and computational methods for reasons that emerge long before postgraduate school. Although girls and boys tend to perform equally well in math-related activities in elementary school, observable differences in test scores emerge in high school and college. Girls’ and women’s self-evaluations of their math-related qualifications also tend to be lower, even when they outperform men (Morrow-Jones and Box-Steffensmeier 2014; Shannon 2014). Women’s self-evaluations are reinforced by others, who tend to underestimate their work or deem it of lesser quality—even when it is identical to men’s work (Knobloch-Westerwick, Glynn, and Huge 2013; Maliniak, Powers, and Walter 2013).

Instead of remediating gender gaps in knowledge, however, doctoral programs seem to reinforce and perpetuate them. “Impostor syndrome” (Barnes 2018; Shannon 2014), discouragement (Morrow-Jones and Box-Steffensmeier 2014), and the perception of such environments as male dominated and competitive (Shannon 2014) are frequently cited as factors that push women away from quantitative and computational methods. Gender identification thereby intersects with other factors, such as race and sexuality (Gutierrez y Muhs et al. 2012; Puwar 2004).Footnote 1 However, as evidenced by established mentoring programs—such as the US-based Visions in Methodology Conference, the CeMENT program of the American Economic Association, and the Journeys in World Politics Workshop at the University of Iowa—women-only environments that provide training, networking, mentorship, and access to role models have proven to be promising initiatives for tackling the gender gap in the discipline (Barnes 2018; Barnes, Beaulieu, and Krupnikov 2014; Blau et al. 2010; Barnes and Beaulieu 2017; Dion 2014). Our experience hosting the Zurich Summer School for Women in Political Methodology since 2017 provides further support for this.Footnote 2

Addressing women’s underrepresentation in political methodology is crucial because being excluded from the field may have broader implications for their careers—and for the discipline, more generally. Whereas important strides have been made to identify the persistent challenges to women’s incorporation in political methodology, much remains unknown about whether women and men have different experiences in methods training during their PhD programs. Furthermore, studies on the topic so far have focused overwhelmingly on the US context; therefore, we do not know whether the patterns identified also are observed elsewhere. The question of generalizability is especially important because, unlike in the United States—where most departments require that students enroll and qualify in specific mandatory methods courses—many European programs do not require their PhD students to complete formal training (although courses may be available to them).

We contribute to this debate by analyzing data from an original survey on the methods-training experiences of political science PhD students at European universities. We use these data to assess whether gender gaps exist regarding PhD students’ methods training and knowledge confidence.

SURVEY DESIGN, POPULATION, AND SAMPLE

The study was designed to capture patterns and variations in the methods-training experience of political science PhD students in Europe (appendix A lists the questions asked). One challenge in studying the profession is that the population of political scientists fluctuates and is largely unknown. In the United States, scholars often use conference presentation (Barnes, Beaulieu, and Krupnikov 2014) or membership in the American Political Science Association (APSA) as a proxy for the population (Teele and Thelen 2017). Membership in APSA, however, incurs financial costs—something that might lead to the disproportionate underestimation of junior scholars and those with more limited conference allowances (i.e., our target group). Our focus on institutions across Europe, rather than in a single country, posed an additional obstacle to estimating the population of political science PhD students.

To be consistent with previous scholarship, as well as to overcome these challenges, we first identified a list of institutions that sent participants to the general conference of the European Political Science Association (EPSA) between 2011 and 2018. We focused on EPSA because it tends to attract scholars doing quantitatively and methodologically oriented political science research. We therefore assume that our sample represents universities in which political scientists display at least some interest in the training and application of quantitative and computational methods. Indeed, 55.6% of all respondents in our sample stated that their respective programs offered mandatory methods training—a proportion that is likely to be significantly higher than across all PhD programs in Europe.

The list included 232 institutions in 20 European countries.Footnote 3 Not all institutions identified by EPSA attendance were universities or PhD-granting departments. To estimate our population of interest, we therefore manually accessed the websites of all 232 institutions to search for the names and email addresses of PhD students. This information was publicly available for 113 departments (48.7%), yielding a total contact list of 2,973 names.

Although incomplete,Footnote 4 these data provide valuable insight into the cross-country variation in the gender balance of the political science PhD population.Footnote 5 Figure 1 shows the distribution of students from the 113 departments included in our list, located across 20 European countries. As shown on the left y-axis, Germany and the United Kingdom have the largest number of institutions attending EPSA. The right y-axis shows the gender distribution in the identified population (i.e., dark-gray dots) and our response sample (i.e., light-gray triangles). Overall, the identified population consists of 1,477 (51%) men and 1,344 (49%) women scholars. In most countries, the distribution of male and female PhD students is gender balanced; however, as figure 1 shows, there is some cross-country variation. For example, the Austrian and Hungarian institutions with EPSA attendance have more female than male PhD students, whereas the Swiss and Norwegian institutions have more male PhD students.

Figure 1 Country Distribution and Gender Balance of Identified Population (113 Departments)

Note: The black boxes represent the number of departments by country that had EPSA attendees and for which we could find public information.

A total of 557 people completed the survey (20% response rate); however, 75 respondents were eliminated for belonging to a discipline other than political science or for not being a PhD student. This rendered a sample of 482 respondents, of which 233 (48%) were women.Footnote 6 T-tests of differences in means confirm that women and men are not statistically different regarding the size of the department they attend, their age, or their career plans to stay in academia—all aspects that could potentially impact one’s methods-training experience. However, women in the sample, on average, were less likely than men to be in a department with a mandatory methods-training program: 60% of women in our sample were in a department with a mandatory methods-training program vis-à-vis 64% of men, a difference that is significant at the 5% level. Moreover, women in the sample, on average, were further along in their PhD studies than men: 60% of female respondents were in their third year or higher of their PhD study versus 46% of the male respondents, a difference that also is statistically significant.
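
To illustrate the kind of balance check reported above, the following is a minimal R sketch. It assumes a hypothetical respondent-level data frame; the variable names female and mandatory_methods are ours for illustration and do not come from the replication material.

```r
# Hypothetical data frame: one row per respondent, with an indicator for
# gender and for whether the department offers mandatory methods training.
# Variable names and simulated values are illustrative only.
set.seed(42)
survey <- data.frame(
  female            = factor(rbinom(482, 1, 0.48), labels = c("male", "female")),
  mandatory_methods = rbinom(482, 1, 0.62)
)

# Two-sample t-test of the difference in means between women and men;
# for a binary outcome this amounts to comparing the two group proportions.
t.test(mandatory_methods ~ female, data = survey)

# Equivalent check via a test of equal proportions.
prop.test(table(survey$female, survey$mandatory_methods))
```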

FINDINGS

Using our original survey data, we now turn to examining whether women PhD students are different from their male colleagues in their training and employment of methods.

Methods Training

Scholars argue that one reason for women’s underrepresentation in political methodology is that they select out of math-heavy courses. We expect this pattern to be particularly strong in Europe: unlike in the United States—where most departments require students to enroll and qualify in specific mandatory methods courses—many European programs do not require their PhD students to complete formal training (although courses may be available to them). In our sample, 38% of respondents were enrolled in a department that does not have a mandatory methods-training program.

To assess whether women’s methods training is different from that of men’s, we presented respondents with a list of 21 topics and asked them to identify which ones they had covered during their postgraduate studies (figure 2).Footnote 7 The topics were descriptive statistics, linear regressions, multivariate regressions, bivariate regressions, time series, panel data, experimental research, survey design, text/content analysis, formal theory, Bayesian statistics, data visualization, multilevel modeling, archival research, causal inference, elite interviews, agent-based modeling, machine learning, process tracing, maximum likelihood estimation, and survival analysis.

Figure 2 Topics Covered in Class

Notes: Left panel: topics covered by respondents’ gender. * indicates statistically significant differences between male and female PhD students (at p<0.05). Right panel: results from negative-binomial regression models using country fixed effects, reporting incidence-rate ratios.

In a preliminary assessment of gender-based differences, we found that women are more likely than men to have completed qualitative-methods courses. Yet, our survey included only three types of methods that could be considered qualitative; therefore, gender differences could be driven by the specific methods included. Overall, we found that women respondents covered a smaller share of topics. Whereas men, on average, covered 40% of the topics on the list, women covered 31%—a difference of 9 percentage points, which is statistically significant at the 1% level.

Given our main interest in quantitative and computational methods, we restricted further analyses to only the quantitative and computational methods (i.e., 18 of the 21 topics).Footnote 8 For visualization purposes, we also grouped linear, multivariate, and bivariate regressions into a single category—“traditional regression”—and time series, panel data, multilevel modeling, and maximum likelihood estimation into the category “MLE.” A total of 11 topics remained.

The left-hand panel in figure 2 displays gender differences for each topic analyzed. As indicated by asterisks (i.e., statistically significant differences), seven of 11 methods were covered by a lower proportion of women. To assess whether gender indeed accounts for differences in methods training when controlling for other potential contributing factors, we estimated the proportion of the 18 quantitative topics covered using a negative-binomial regression model with country fixed effects. In addition to gender, we accounted for respondents’ age, whether they self-identified as white, their career plans,Footnote 9 whether their department offers mandatory methods training during the PhD program, whether they received departmental funding (i.e., material or in-kind), and their subdisciplinary focus (see appendix B for the operationalization of the variables used and the regression tables).

The coefficients presented in the right-hand panel of figure 2 are incidence-rate ratios, which can be interpreted as the multiplicative change in the expected number of topics covered associated with a given predictor. As the figure shows, even when controlling for individual- and department-specific factors, women’s expected methods coverage is roughly 15% lower (i.e., an incidence-rate ratio of 0.85) than men’s. Our results also show that self-identifying as white and receiving departmental funding are positively and statistically significantly associated with the proportion of methods covered. As demonstrated, subfield also matters: compared to those focusing on comparative politics, respondents from the international relations and political theory subfields cover a lower proportion of quantitative methods.
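
For readers interested in reproducing this type of analysis, the following is a minimal sketch in R of a negative-binomial count model with country fixed effects, reported as incidence-rate ratios. All variable names are hypothetical stand-ins rather than those of the actual replication files.

```r
library(MASS)  # provides glm.nb() for negative-binomial regression

# Hypothetical respondent-level data: n_topics_covered is the number of the
# 18 quantitative topics a respondent reports having covered (0-18); the
# remaining columns stand in for the controls described in the text.
fit <- glm.nb(
  n_topics_covered ~ female + age + white + career_plans +
    mandatory_training + funded + subfield + factor(country),
  data = survey
)

# Exponentiated coefficients are incidence-rate ratios: values below 1 imply
# a lower expected number of topics covered (e.g., 0.85 = roughly 15% lower).
irr <- exp(cbind(IRR = coef(fit), confint(fit)))
round(irr, 2)
```

Reporting exponentiated coefficients rather than raw log-counts makes the gender gap directly interpretable as a rate difference in topics covered.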


Employment of Methods

“Impostor syndrome” often is cited as leading women to feel less confident in their skills, preventing them from employing the knowledge they may have. We investigated confidence by asking respondents to identify which topics they previously used in their research.

The left-hand panel in figure 3 demonstrates that women are less likely than men to have used traditional regression, descriptive statistics, data visualization, survey design, and MLE—despite having learned them in class. These differences, however, are not statistically significant. The lower part of the left-hand panel shows that women are more likely than men to use six of the other quantitative techniques if the methods were covered in class—namely, experiments, machine learning, formal and agent-based modeling, text/content analysis, causal inference, and Bayesian statistics. For causal inference and Bayesian statistics, these differences were statistically significant at the 10% and 5% levels, respectively. These results suggest that, overall, there are no systematic gender differences in the employment of methods once they have been learned in class. If anything, the data suggest that for some methods, women may be more likely than men to use them once they have learned them in class. In total, men stated that they had previously used 63% of the methods they learned in class in their own research, whereas women used roughly 62% of the methods they learned—a difference that is not statistically significant.Footnote 10
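
As an illustration of this conditional comparison, the following R sketch computes, for each respondent, the share of methods used in research among those covered in class and then compares the averages by gender. It assumes a hypothetical long-format data frame (survey_long, with columns respondent_id, female, covered, and used); these names are ours and do not come from the replication material.

```r
library(dplyr)

# Hypothetical long-format data: one row per respondent-topic, with 0/1
# indicators for whether the topic was covered in class and used in research.
conditional_use <- survey_long %>%
  filter(covered == 1) %>%                 # keep only topics learned in class
  group_by(respondent_id, female) %>%
  summarise(share_used = mean(used), .groups = "drop")

# Average conditional employment rate by gender...
conditional_use %>%
  group_by(female) %>%
  summarise(mean_share_used = mean(share_used), .groups = "drop")

# ...and a t-test of the gender gap in that share.
t.test(share_used ~ female, data = conditional_use)
```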


Figure 3 Topics Employed in Research

Notes: Left-hand panel: differences between methods covered and employed in percentages by respondents’ gender. * indicates statistically significant differences at p<0.05; ** indicates statistically significant differences at p<0.10. Right-hand panel: results from negative-binomial regression models using country fixed effects, reporting incidence-rate ratios.

We also used a negative-binomial regression model with country fixed effects to examine whether gender is associated with respondents’ levels of methods employment, when controlling for other individual- and department-level characteristics. The right-hand panel in figure 3 summarizes these results. As shown, no discernible gender differences emerge. Across the different institutional- and individual-level determinants, we also found no discernible factors that shape variation in levels of method employment. For instance, doctoral students in non-funded PhD programs, in general, seem to use fewer quantitative methods—but not statistically significantly so.

Overall, our findings suggest that gendered differences in method employment may not be as substantial as previously thought: both descriptive and multivariate results indicate that respondents’ gender cannot explain variation in the employment of quantitative methods in research. In other words, our findings suggest that although women may not seek out methods training to the same extent as men, once they have learned the methods, they seem to use them at similar rates.

CONCLUSION

This article is a brief analysis of recently collected data from a survey on the methods-training experience of political science PhD students from 20 European countries. Using these data, our aim was to understand whether and how the methods-training experiences of men and women differ. According to the existing literature, gender gaps in methods training can have broader consequences for academic careers. Our respondents concurred: both men and women perceived having knowledge of quantitative methods as more important for their career progression and opportunities than expertise in qualitative methods.

However, when analyzing PhD students’ training and employment of quantitative and computational methods, we found that—on average—women covered fewer topics than men. This reinforces previous research that finds gender imbalances in political methodology. Perhaps more optimistically than existing work, however, we also found that women and men use quantitative methods at similar rates—once they are covered in class. This suggests that an important driver of the previously identified gender gap in methods may be women’s and men’s differing levels of exposure to a wider variety of topics. Moreover, it is possible—as other studies suggest—that the types of methods covered by men and women are still different. In other words, although aggregate levels of reported confidence may not differ, a more detailed analysis may show a gender gap in respondents’ levels of confidence in using specific methods.

Overall, our results are encouraging in that they indicate that when women PhD students are exposed to methods in class, they are as likely as their male colleagues to employ them. However, given the structure (or lack thereof) of European doctoral programs, women currently cover significantly fewer methods throughout their training. Our limited data on qualitative methods further suggest that women also are more likely to select into training in qualitative methods, which tend to feature less prominently in students’ curricula.


We contend that there are concrete implications to be drawn from these results. First, departments that want to prepare their students for both academic and nonacademic job markets should create course offerings that reflect their doctoral students’ methods needs and interests in a gender-balanced way. This includes both quantitative and qualitative course offerings—or providing support for students to participate in these courses through external training.

Second, if we aim to achieve gender balance in political methodology, methods training should be offered in a way that encourages women doctoral students to participate early and often. This is especially important given that quantitative skills currently are highly valued in the discipline (Teele and Thelen 2017)—a perception that our survey respondents also share. Making sure that women do not select out of quantitative and computational methods early in their careers could ensure that they continue to use these methods comfortably later—helping to eradicate (or, at least, begin to close)Footnote 11 gender gaps in the profession in the future.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit https://doi.org/10.1017/S1049096519001288

ACKNOWLEDGMENTS

An earlier version of this article was presented at the 2018 APSA Annual Meeting. The authors are grateful to APSA panel attendees, Fabrizio Gilardi, Lucas Leemann, and an anonymous reviewer for helpful comments. The usual disclaimers apply.

Footnotes

1. The report on “Diversity and Inclusion in the Society for Political Methodology” showed that 80% of all POLMETH members are non-Hispanic white; 76% of all American Political Science Association members fall in that category as well.

2. For more information about the program, see www.zurichsummerschool.com.

3. To examine whether our list encompassed all major political science departments in Europe, we compared it to membership in the European Consortium for Political Research (ECPR). The ECPR has exactly the same number of European member institutions as EPSA (i.e., 232), 136 of which (approximately 60%) are also EPSA members. In other words, 60% of our identified population are members of the two major European political science associations. Additionally, we accounted for 96 institutions that sent representatives to EPSA but are not members of ECPR—and excluded 96 institutions that are members of ECPR but have not taken part in EPSA conferences. Information on ECPR membership was retrieved from https://ecpr.eu/Membership/CurrentMembers.aspx on February 21, 2019.

4. We were able to retrieve information about PhD students from 48% of the institutions in our initial list. Although many of the institutions not included do not grant PhDs, we also are likely not accounting for students in departments that do not list students on their websites. The number of PhD students also fluctuates; therefore, it is possible that we failed to contact students who had not yet been added to the websites (or contacted individuals who had already left the program). We also know that our list of contacts misclassified some individuals who were PhD students in social science departments but not political scientists.

5. We used the R package gender to encode the predicted gender of each individual in the population based on their first name (Mullen 2018).
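
A minimal sketch of this step, assuming a vector of first names parsed from the departmental websites; the names shown are placeholders, and the method and year range are our illustrative choices rather than the authors' reported settings.

```r
library(gender)

# Placeholder first names; in practice these would be parsed from the
# contact list of PhD students.
first_names <- c("Anna", "Lukas", "Marta", "Jonas")

# Predict gender from historical name-frequency data. The method and the
# birth-year window are illustrative assumptions; the "ssa" method also
# requires the companion genderdata package to be installed.
predicted <- gender(first_names, years = c(1980, 1995), method = "ssa")
predicted[, c("name", "gender", "proportion_female")]
```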

6. Using the online survey platform Qualtrics, we recruited participants via email between June 30 and August 25, 2018. The questionnaire had 26 questions and took approximately 10 minutes to be completed. As an inducement for participation, we raffled a 50-euro Amazon voucher.

7. Although the list included some qualitative methods, it purposefully displayed a majority of topics that would be covered in quantitative- and computational-methods courses. Including an exhaustive list of topics would not have been feasible; therefore, we also allowed respondents to provide information about additional topics they studied. We did not analyze this information in detail in this article.

8. We excluded archival research, elite interviews, and process tracing.

9. That is, operationalized as “staying in academia” versus “leaving academia” or “not knowing yet.”

10. Similarly, when asked about their level of confidence in their knowledge of the topics, women and men in our sample stated that they would be “somewhat comfortable” or “very comfortable” in employing 57% and 59%, respectively, of the methods they learned.

11. Explicit encouragement does not directly address larger challenges related to implicit biases that affect in which fields women ultimately work and which departments will hire them. However, it may offer a first step toward building a network of women working on questions related to political methodology in Europe, which may ultimately address more systemic issues in the discipline.


REFERENCES

Barnes, Tiffany D. 2018. “Strategies for Improving Gender Diversity in the Methods Community: Insights from Political Methodologists and Social Science Research.” PS: Political Science & Politics 51 (3): 580–87. Available at https://doi.org/10.1017/S1049096518000513.
Barnes, Tiffany D., and Beaulieu, Emily. 2017. “Engaging Women: Addressing the Gender Gap in Women’s Networking and Productivity.” PS: Political Science & Politics 50 (2): 461–66. Available at https://doi.org/10.1017/S1049096516003000.
Barnes, Tiffany D., Beaulieu, Emily, and Krupnikov, Yanna. 2014. “An Assessment of the Visions in Methodology Initiative: Directions for Increasing Women’s Participation.” Political Methodologist 21 (2): 10–16.
Blau, Francine D., Currie, Janet M., Croson, Rachel T., and Ginther, Donna K. 2010. “Can Mentoring Help Female Assistant Professors? Interim Results from a Randomized Trial.” American Economic Review 100 (2): 348–52.
Dion, Michelle L. 2014. “An Effort to Increase Women’s Participation: The Visions in Methodology Initiative.” Political Methodologist 21 (2): 6–8.
Dion, Michelle L., Sumner, Jane Lawrence, and Mitchell, Sara McLaughlin. 2018. “Gendered Citation Patterns across Political Science and Social Science Methodology Fields.” Political Analysis 26 (3): 312–27. Available at https://doi.org/10.7910/DVN/R7AQT1.
Esarey, Justin. 2018. “What Makes Someone a Political Methodologist?” PS: Political Science & Politics 51 (3): 588–96. Available at https://doi.org/10.1017/S1049096518000525.
Gutierrez y Muhs, Gabriella, Niemann, Yolanda Flores, Gonzalez, Carmen G., and Harris, Angela P. 2012. Presumed Incompetent: The Intersections of Race and Class for Women in Academia. Logan: Utah State University Press. Available at https://doi.org/10.1353/rhe.2014.0006.
Knobloch-Westerwick, Silvia, Glynn, Carroll J., and Huge, Michael. 2013. “The Matilda Effect in Science Communication: An Experiment on Gender Bias in Publication Quality Perceptions and Collaboration Interest.” Science Communication 35 (5): 603–25. Available at https://doi.org/10.1177/1075547012472684.
Leeper, Thomas J. 2018. “Am I a Methodologist? (Asking for a Friend).” PS: Political Science & Politics 51 (3): 602–6. Available at https://doi.org/10.1017/S1049096518000549.
Maliniak, Daniel, Powers, Ryan, and Walter, Barbara F. 2013. “The Gender Citation Gap in International Relations.” International Organization 67 (4): 889–922. Available at https://doi.org/10.1017/S0020818313000209.
Morrow-Jones, Hazel, and Box-Steffensmeier, Janet M. 2014. “Implicit Bias and Why It Matters to the Field of Political Methodology.” Political Methodologist 21 (2): 16–20.
Mullen, Lincoln. 2018. “Predict Gender from Names Using Historical Data.” Available at https://cran.r-project.org/web/packages/gender/vignettes/predicting-gender.html.
Puwar, Nirmal. 2004. Space Invaders: Race, Gender and Bodies out of Place. New York: Berg Publishers. Available at http://research.gold.ac.uk/2017.
Shannon, Megan. 2014. “Barriers to Women’s Participation in Political Methodology: Graduate School and Beyond.” Political Methodologist 21 (2): 2–6.
Teele, Dawn Langan, and Thelen, Kathleen. 2017. “Gender in the Journals: Publication Patterns in Political Science.” PS: Political Science & Politics 50 (2): 433–47. Available at https://doi.org/10.1017/S1049096516002985.