
Logging in to Learn: The Effects of Online Civic Education Pedagogy on a Latinx and AAPI Civic Engagement Youth Conference

Published online by Cambridge University Press:  09 November 2023

Matt Lamb*
Affiliation:
Texas Tech University, USA

Abstract

Civic education is essential to the health of any democracy. When COVID-19 emerged in the spring of 2020, almost all civic education efforts went online. This increased interest in the effectiveness of online civic education. Does it lead to similar outcomes as in-person education? I used student evaluations from a youth civic engagement conference co-run by Latinx and Asian American and Pacific Islander (AAPI) non-profit organizations to compare learning outcomes on multiple dimensions of civic education, from an in-person conference in 2019 and an online conference in 2020. I find that although students improved over the course of both conferences, the 2019 in-person conference yielded slightly greater improvement in civic knowledge confidence than the online conference. Other dimensions—verifiable knowledge, self-efficacy, and community consciousness—increased after participation in the conference in both years; however, the increases were similar between the online and in-person formats.

Type
Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of American Political Science Association

Political engagement among youth, especially minority youth, is of paramount importance for the continuation of a multicultural democracy in the United States. With youth engagement still relatively low compared to the power of its potential, social scientists must continually examine the best strategies for civic education, not only in terms of increasing citizen civic knowledge but also with regard to community consciousness and a sense of self-efficacy (Center for Information & Research on Civic Learning and Engagement 2022). The COVID-19 pandemic brought new significance to these questions—and raised new ones—as traditional in-person formats for classrooms, workshops, and civic engagement conferences were shifted to online formats. Educators of all subjects became concerned about whether online education would be as effective as an in-person classroom environment. Subsequent evaluations of reading comprehension and math scores suggest significant learning loss (Engzell, Frey, and Verhagen 2021; Toness and Lurye 2022).


But what of civic engagement education? Does online civic education yield the same outcomes as in-person civic education? These questions are of utmost importance for those interested in civic education, as well as the continuation of a thriving civic spirit in the United States. Civic education programs targeted to youth can result in a higher likelihood of direct participation in the future by program participants (Holbein and Hillygus 2017). The increased popularity of online learning as a convenient option for students has prompted a discussion among political scientists about whether traditionally desired learning outcomes can be achieved in an online mode of instruction (Bolsen, Evans, and Fleming 2016).

This article uses a difference-in-difference research design to analyze learning outcomes from two youth civic education conferences conducted jointly by Mi Familia Vota (a nonpartisan Latinx civic organization) and OCA–Greater Houston (a nonpartisan AAPI civic organization). The first conference was held in person in the summer of 2019. The following year, during the summer of 2020, the conference was held entirely online due to COVID-19 precautions. This provided a useful natural experiment to analyze the effect of the online format on desired learning outcomes. I found that students increased their levels of verifiable civic knowledge, knowledge confidence, self-efficacy, and community consciousness after both conferences. Compared to the other measures, knowledge confidence apparently increased more in the in-person format, although the difference did not reach statistical significance. The increases in self-efficacy and community consciousness scores were similar between the in-person and online formats, and those differences likewise were not statistically significant. Although none of the differences was statistically significant, these results suggest that online education could have disparate effects across types of learning outcomes, and they demonstrate that further research is warranted.


DIFFERENT APPROACHES TO CIVIC EDUCATION AND THEIR EFFECTS ON STUDENTS

In an era when political scientists bemoan the lack of enthusiasm for civic engagement, educators have become more interested in analyzing the most effective pedagogical tools. Most research focuses on determining which classroom tools are most effective for achieving the desired outcomes, including assignments that involve consuming political media, engaging in political self-expression, and attending civic meetings (Claassen and Monson 2015; Forestiere 2015; Hepburn, Niemi, and Chapman 2000; Huerta and Jozwiak 2008; Van Assendelft 2008). Other research has examined student outcomes associated with institutionalized civic engagement centers at colleges and universities (Hoffman 2015; Lamb, Perry, and Steinberg 2022).

Other studies examine whether civic education has heterogeneous effects on students themselves. For example, Nelson (2019) used a combination of public survey data and original survey data to determine whether civic education has heterogeneous effects on students based on race and ethnicity. Examining several dimensions of civic awareness, Nelson (2019) determined that civic education increases rates of external efficacy among white students but not Black and Latinx students. The curriculum increased acts of public voice among Black and Latinx students but not for their white peers. Additionally, a study of Chicago-area Black and Latinx youth showed that civic education curricula that incorporated critical pedagogy led to Latinx and Black youth having a greater propensity to engage in multiple forms of political participation compared to white students (Nelson 2021).

ONLINE EDUCATION IN TEACHING GOVERNMENT AND CIVICS

During the past five years, online education has become far more popular and available at a wide range of colleges and universities. The COVID-19 pandemic further increased curiosity about whether online education could be a viable long-term alternative to in-person education. Empirical evidence demonstrates that online education in political science coursework can be effective. Bolsen, Evans, and Fleming (2016) compared students who took online American government courses to those who took an in-person course and found that those who participated in online classes were more likely to discuss or participate in politics by the end of the term. Online students also demonstrated higher objective knowledge. Moreover, online collaboration between students and peers as well as between students and faculty members through online discussion boards and responses to written prompts is shown to be as likely to provide depth to topics of discussion as in-person dialog (Chadha 2017a, 2017b, 2018, 2019).

However, there has not been extensive research on the effects of online compared to in-person civic education. The extant research is limited in insight that can be gleaned to inform future curricular development. For example, when examining the effects of online education, it is important to pre- and post-test all learning outcomes of interest to establish baseline knowledge and determine whether participation in online or in-person education had a significant effect. However, in their examination of student learning outcomes in an American government course, Bolsen, Evans, and Fleming (2016) conducted pre- and post-tests only when measuring civic knowledge. They did not pre- and post-test the likelihood of discussing or participating in politics for students who took the course online. Additionally, there are selection problems: the authors conceded that students in the online class were far more likely to drop out than those in in-person classes.

Scholars and instructors of online pedagogy have developed best practices in online curricula to maximize engagement by incorporating assignments that require community interaction and engagement with civic issues. For example, in a new democracy—Tunisia—online civic education interventions were found to have a major effect on democratic engagement (Finkel, Neundorf, and Ramírez 2023). However, civic engagement differs from other disciplines in that it requires a certain baseline level of community mindedness within individual participants. Whereas skills such as reading and mathematics may not necessitate social interaction, civic engagement requires it. Online instruction can erode a sense of community by limiting physical and in-person interaction, forcing participants to interact solely through online means. In a learning environment in which people can communicate solely through text-only messaging or in which they simply can turn off their camera and communicate solely through audio, the exchange of ideas and the importance of community can be lost. Although he was writing before the proliferation of online pedagogy, Putnam (2000) noted, “If we think of politics as an industry, we might delight in its new ‘labor-saving efficiency,’ but if we think of politics as democratic deliberation, to leave people out is to miss the whole point of the exercise.” Consequently, determining the effect of online learning on civic education—a subject that requires a unique sense of community mindedness—is crucial and necessitates thorough examination.

THEORY AND HYPOTHESES

When evaluating the success of civic education pedagogy, the most robust assessments measure multiple dimensions of student outcomes. Whereas much of the pedagogical research on American government courses examines retention of civic knowledge, Zukin et al. (2006) also used measures of public voice, cognitive engagement, political engagement, and civic engagement. These measures were meant to capture multiple dimensions of civic education, looking not only at behavior but also at affect toward civic engagement and self-orientation toward public life. They were used in Nelson’s (2019, 2021) studies. For the purposes of this study, I draw on these measures by using the terms “self-efficacy” to refer to the feeling that one can make change in a community; “community consciousness” to refer to the awareness of public concerns and the ability to place oneself in a larger body politic; and “civic knowledge” to refer to knowledge of facts regarding government abilities and functions.

Using these measures, I present my hypotheses regarding how students will fare when they participate in an online and an in-person conference about civic education. The literature has shown that participation in civic-engagement exercises that require students to complete self-expressive assignments, consume political media, and attend events with speakers who are public officials or community activists is likely to cultivate civic-mindedness in students (Claassen and Monson 2015; Forestiere 2015; Huerta and Jozwiak 2008; Van Assendelft 2008). Therefore, I expected that students who participated in a series of conference workshops incorporating these exercises would show positive learning outcomes, which informs my first hypothesis:

Hypothesis 1: Students will show increased levels of civic knowledge, self-efficacy, and community consciousness after participation in a civic-engagement conference.

Research analyzing the effects of online pedagogy in political science suggests that it may be just as effective as, or even more effective than, in-person pedagogy in achieving desired learning outcomes. However, I suggest that these findings may be the result of several types of methodological bias, including endogeneity, self-selection, and variance in how online courses approach curriculum. For example, students who complete online coursework—an undertaking that involves more self-pacing and self-motivation—may be inherently more likely to succeed with regard to learning outcomes. In addition, the civic-engagement literature suggests that incorporating pedagogical practices that take students out of the classroom and engage them in more interactive activities yields better outcomes (Huerta and Jozwiak 2008; Van Assendelft 2008). These findings inform my second hypothesis:

Hypothesis 2: Levels of civic knowledge, self-efficacy, and community consciousness will increase more in the in-person format than in the online format.

To test these hypotheses, it was necessary to implement a pre- and post-test design. As a scholarly community responsible for ensuring that students are adequately prepared to participate as members of a democratic society, political science should employ the same rigorous empirical approach to research on questions of political science and civic-engagement instruction as we do to questions regarding political institutions, behavior, and other major questions of the discipline. Unfortunately, much of the published research regarding political science and civic education primarily examines post hoc student evaluations and instructor-created assessments of learning outcomes. Rarely have researchers of pedagogical tools in civic education used methods that incorporated control groups and pre- and post-testing. Additionally, many studies have used only one dimension of learning outcomes (e.g., civic knowledge, self-efficacy, or community consciousness) rather than examining multiple dimensions. What would we find if we created a survey tool that measured multiple dimensions of civic education and used a pre- and post-test to determine whether mode of instruction has a role in learning outcomes?

RESEARCH DESIGN

I took advantage of data from pre- and post-surveys administered to students who attended a civic-engagement education conference held in person in the summer of 2019 and entirely online in the summer of 2020. The conferences were administered by Mi Familia Vota (MFV) and OCA–Greater Houston (OCA), nonprofit civic-engagement organizations that seek to increase voter mobilization by underrepresented groups, particularly those in the Latinx and AAPI communities. Full descriptions of these organizations are in the online appendix. Every summer, MFV and OCA sponsor a one-week, joint youth-leadership workshop called the Youth Advocacy Summit (YAS) on the campus of Rice University. The summit is a series of workshops, speakers, and reflective exercises designed to increase civic knowledge, self-efficacy, and community consciousness in high-school and early-college-aged students. The workshop is advertised via government courses, after-school programming, and extracurricular clubs. Students volunteer to register and attend. YAS also runs approximately the same programming each year. Although the event traditionally is held in person, the COVID-19 pandemic forced YAS organizers in 2020 to hold it completely online.

The circumstances surrounding this workshop going online in the summer of 2020, and the data collected from these successive workshops, made it possible to address the research questions posed in this article because they provided a unique dataset that compares students of similar demographic characteristics at two different points in time who participated in the same civic-education workshop curriculum. The only difference was the modality of instruction. In both instances, students were administered a pre- and post-test to examine their baseline levels of knowledge, community consciousness, and self-efficacy, as well as any impact that may have been made by the workshop. A difference-in-difference test was used to compare the effects of the in-person and online modalities.
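The logic of the difference-in-difference comparison can be sketched in a few lines. The group means below are invented for illustration only; the actual survey scores are reported in the Data and Results section.

```python
# Hypothetical mean scores (not the conference data) illustrating the
# difference-in-difference logic used to compare the two modalities.
pre_2019, post_2019 = 10.0, 12.1   # mean score before/after the in-person summit
pre_2020, post_2020 = 10.3, 11.8   # mean score before/after the online summit

gain_in_person = post_2019 - pre_2019  # improvement under the in-person format
gain_online = post_2020 - pre_2020     # improvement under the online format

# The DiD estimate: how the online gain compares with the in-person gain.
# A negative value indicates a smaller improvement in the online format.
did_estimate = gain_online - gain_in_person
print(round(did_estimate, 2))
```

With these toy numbers, the estimate is negative, mirroring the direction (but not the magnitude) of the results reported later in the article.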

In addition to its appropriateness, the design resolves some of the methodological issues from previous research. One issue in previous studies of online education is the concern that some students may self-select into online or in-person education, consequently confounding the effects that modality has on outcomes. Selection bias was a concern for this study because some students in 2020 who otherwise would have attended the conference if it had been in person may have opted out due to the online modality. However, a benefit of the circumstances surrounding the 2020 conference was that they reduced some of the potential selection-bias issues. Because there was no in-person option, students who wanted to attend the conference could not self-select into or out of online or in-person options. Because of this feature of the 2020 workshop, the design is a methodological improvement over previous studies that examined online education in a social environment in which students could self-select into and out of online education based solely on personal preferences.

LEARNING OUTCOME MEASURES

Before the 2019 conference began, participating students were sent an online survey to measure their baseline levels of civic knowledge, community consciousness, self-efficacy, and civic-participation habits. The same survey was sent to participating students after the summit to determine any empirically measurable differences. The surveys were used to determine whether the summit was effective in increasing levels of civic knowledge, as well as whether it increased sentiments of self-efficacy and community-mindedness. Although the COVID-19 restrictions forced the conference online in 2020, the same pre- and post-surveys were used, except for a few wording changes to reflect updated current events (see footnote 1).

The survey asked questions that were written to measure three desired learning outcomes: civic knowledge, community consciousness, and self-efficacy. Twelve questions measured civic knowledge. Participants were asked whether they could answer various questions on current events. A “yes” response was coded as 1 and a “no” response was coded as 0. Students also were asked if they knew the implications of several pieces of legislation pending at the time of the summit. These questions required respondents to choose from multiple answers that were coded as 1 if they answered correctly and 0 if incorrectly. Because these knowledge questions measured different dimensions of knowledge, they were used to develop three different scores. The questions that asked whether participants were confident in their ability to correctly answer civic knowledge questions were used to create an additive score of “knowledge confidence.” Questions that required a specific, verifiably correct answer were used to create an additive “verifiable knowledge” score for each year. An additive score that combined the verifiable knowledge and knowledge confidence scores also was created (see footnote 2). Nine additional questions measured self-efficacy and seven questions measured community consciousness. Respondents were asked to place themselves on a five-point Likert scale ranging from “strongly disagree” (1) to “strongly agree” (5). Additive scores were created for these variables as well. The questions used to measure these learning outcomes are listed in table 1. The difference in means between pre-YAS and post-YAS survey responses was measured using a standard t-test to determine statistical significance (Lamb 2023).

Table 1 Pre- and Post-Survey Response Questions and Statements
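The scoring and testing steps described above can be sketched as follows. The Likert responses are invented toy data, not the survey responses, and the items per scale are reduced for brevity.

```python
import math
import statistics

# Toy Likert responses (1-5) for four hypothetical matched participants,
# three items per scale; all numbers are invented for illustration.
pre = [[3, 2, 4], [2, 2, 3], [4, 3, 3], [3, 3, 2]]
post = [[4, 3, 4], [3, 3, 4], [4, 4, 3], [4, 3, 3]]

# Additive score: sum each participant's items into a single scale score.
pre_scores = [sum(r) for r in pre]    # [9, 7, 10, 8]
post_scores = [sum(r) for r in post]  # [11, 10, 11, 10]

# Paired t-test: test whether the mean within-person change differs from 0.
diffs = [b - a for a, b in zip(pre_scores, post_scores)]  # [2, 3, 1, 2]
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
print(round(t_stat, 2))  # 4.9
```

The paired (rather than independent-samples) form of the t-test is what the matched pre/post design enables: each participant serves as their own control.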

PRE- AND POST-SURVEY DESIGN

This study used the format change as an opportunity for a natural experiment to examine the effectiveness of online pedagogy using a difference-in-difference design. In 2019, 63 students responded to the pre-survey and 73 responded to the post-survey. In 2020, 115 students responded to the pre-survey and 55 responded to the post-survey. After excluding students who responded to only one of the surveys, I analyzed 29 matched respondent pairs for the 2019 data and 37 matched pairs for the 2020 data. Although the relatively small sample size might be of some concern, it is important to note that this design matches pre- and post-tests to individual participants. Consequently, any observable effects can be viewed as real change, and methodological reviews of matching have shown that relatively small samples can yield measurements that are as accurate as those of samples as large as 1,000 (Pirracchio, Resch-Rigon, and Chevret 2012). Moreover, comparing samples of this size is within the standard practices of studies on political science and civic education (Huerta and Jozwiak 2008; Lamb, Perry, and Steinberg 2022; Van Assendelft 2008).
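The matching step that produces these pairs can be sketched as follows. The respondent IDs and scores are hypothetical; only respondents who appear in both survey waves are retained.

```python
# Hypothetical pre- and post-survey results keyed by respondent ID.
# Respondents who answered only one wave (a01, a04, a05) are dropped,
# mirroring the exclusion rule described above.
pre_survey = {"a01": 18, "a02": 15, "a03": 21, "a04": 17}   # id -> score
post_survey = {"a02": 19, "a03": 24, "a05": 20}             # id -> score

matched_ids = sorted(pre_survey.keys() & post_survey.keys())
pairs = [(pre_survey[i], post_survey[i]) for i in matched_ids]
print(len(pairs))  # only respondents who completed both surveys remain
```

Matching in this way is what eliminates the attrition-driven selection bias discussed in the next paragraph: a respondent who dropped out before the post-test simply never enters the analyzed sample.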

This design is an improvement over previous research on the effectiveness of online education. First, this study conducted pre- and post-tests on multiple dimensions. Bolsen, Evans, and Fleming (2016) conducted pre- and post-tests only on civic knowledge and based their conclusions about other dimensions solely on post-course evaluations. Second, although the 2019 and 2020 samples differed slightly in geographic composition, the analyses of both years examined the responses of only those students who completed the program in each year and matched pre- and post-surveys to individual participants. This eliminated any selection biases that may result from participants dropping out before taking the post-test.

DATA AND RESULTS

The demographic characteristics of those students whose responses were analyzed in this study are included in online appendix table A1. The sample is predominantly female, and most participants identified as Latina. Although such an ethnically skewed sample may concern some scholars with regard to generalizability, the experimental research design ensured that ethnicity did not act as a confounding variable on the results. Consequently, I argue that the ethnic composition of this sample increases generalizability. Additionally, most participants were 17 years of age; a few were 14, 15, 16, or 18 years old.

The greatest increases in both conferences were in the overall civic knowledge and knowledge confidence scores. In 2019, overall knowledge scores increased by 21% after the YAS and in 2020 by 14.6%. The knowledge confidence score increased by 28.2% in 2019 and by 17% in 2020. The difference in means pre- and post-survey was statistically significant in both years; however, the increases in the verifiable knowledge scores were much smaller. In 2019, the verifiable knowledge score increased by 6.9%; in 2020, by 2.7%. However, it is important to note that due to the small number of respondents, this minor increase represents a change in only one response. The mean scores are graphically displayed in figure 1, on which means are plotted in red and confidence intervals are shown in blue.

Figure 1 Improvement in Civic Knowledge

In 2019, community consciousness scores increased by 7% after the YAS; in 2020, by 6.2%. The difference in means pre- and post-survey was statistically significant in both years. The increases are presented graphically in figure 2. Although there were marked improvements in community consciousness scores, the increases were not much different between 2019 and 2020—certainly not as different as for civic knowledge. There may be two reasons for this. First, it may be that the type of student who participates in a program like this is already more likely to possess high levels of community consciousness. Although conference programming increased scores on this dimension, the mode of instruction proved less important to those who participated. Second, it may be that those students who actually completed the voluntary pre- and post-surveys were more likely than those who completed only the pre-survey or the post-survey to possess a characteristic that made the mode of education irrelevant to their improvement.

Figure 2 Improvement in Community Consciousness

In 2019, self-efficacy scores increased by 6.6% after the YAS; in 2020, by 6.4%. The difference in means pre- and post-survey was statistically significant. However, the increases in 2019 and 2020 again were not distinguishable from one another. These means are presented graphically in figure 3.

Figure 3 Improvement in Self-Efficacy

For reasons similar to those for community consciousness, it may be that these results stem from an unseen characteristic among the participants in the sample that renders the mode of instruction irrelevant to increases in self-efficacy.

DIFFERENCE-IN-DIFFERENCE RESULTS

To properly test Hypothesis 2, I compared the mean pre- and post-test differences in the two years to each other to determine whether they differed significantly. Table 2 lists the results of a standard linear-regression model that measures the effects of being post-tested, of being online, and of being post-tested in the online format. The table shows the difference-in-difference estimator for the effects of taking the conference online.

Table 2 Coefficients Table

Notes: Standard errors are in parentheses. ***p<0.01, **p<0.05, *p<0.1.

The coefficients suggest that improvements in learning outcomes were smaller in the online condition, as noted by the negative difference-in-difference estimators, but none is statistically significant. What is suggestive, however, is that the substantive effect of the difference-in-difference estimator for civic knowledge is more than 10 times larger than for self-efficacy and approximately 1.5 times larger than for community consciousness.
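The regression form behind this kind of estimator can be sketched with toy data. The scores below are invented and chosen so the cell means are exact; the interaction coefficient (post × online) is the difference-in-difference estimate.

```python
import numpy as np

# Toy saturated regression illustrating a difference-in-difference
# estimator of the kind reported in Table 2; all scores are invented.
#   score = b0 + b1*post + b2*online + b3*(post*online)
data = [
    (0, 0, 9.0), (0, 0, 11.0),   # 2019 pre   (cell mean 10.0)
    (1, 0, 11.0), (1, 0, 13.0),  # 2019 post  (cell mean 12.0)
    (0, 1, 10.0), (0, 1, 11.0),  # 2020 pre   (cell mean 10.5)
    (1, 1, 11.0), (1, 1, 12.0),  # 2020 post  (cell mean 11.5)
]
X = np.array([[1, p, o, p * o] for p, o, _ in data], dtype=float)
y = np.array([s for _, _, s in data])

# Least-squares fit; b3 (the post*online interaction) is the DiD estimate.
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(b3, 2))  # (11.5 - 10.5) - (12.0 - 10.0) = -1.0
```

Because the model is saturated, the interaction coefficient exactly equals the difference of the two pre/post gains, which is why the regression and the simple four-means comparison yield the same estimator.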

DISCUSSION

The results of this study yield mixed conclusions. Although participation in a civic engagement education conference yielded increases in civic knowledge, self-efficacy, and community consciousness in both the in-person and online formats, only the increases in the civic knowledge scores were notably different between the two formats. In particular, knowledge confidence increased to a greater degree than the other measures, including verifiable knowledge. The larger increase in civic knowledge confidence in the in-person conference suggests that students may believe that they learned and retained facts about civic engagement better in the in-person format. It may be that the in-person format and face-to-face discussion of civic engagement information is a better method of retention, or that the in-person format makes students believe that they have retained more knowledge than in the online format. The fact that the differences in the learning outcome scores between 2019 and 2020 were not statistically significant contradicts my second hypothesis. It is possible that online formats, in fact, are as adequate as in-person education on multiple dimensions of civic education. Another possibility is that student predispositions to these characteristics made them unsusceptible to the effects that a difference in mode of instruction may have had on a more general pool of students.


Although this article presents an empirical improvement on previous studies of civic engagement instruction, more research must be undertaken. As political scientists, we must examine our classroom tools and pedagogical style with the same empirical rigor as the other important questions of the field. Further analysis of pedagogical techniques and modes of instruction must incorporate more pre- and post-testing, control and treatment groups, time-series designs, and larger-scale designs to determine the most effective learning tools at our disposal. As online modes of education become more popular, more must be done to examine how social science and civics instructors can use online tools to maximum effect—particularly if the results of this study hold true that online instruction is the inferior method for knowledge retention.

ACKNOWLEDGMENT

The author thanks the leadership of the organizations Mi Familia Vota and OCA–Greater Houston for their openness in sharing their data for this article.

DATA AVAILABILITY STATEMENT

Research documentation and data that support the findings of this study are openly available at the PS: Political Science & Politics Harvard Dataverse at https://doi.org/10.7910/DVN/JS6AVN.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit http://doi.org/10.1017/S1049096523000938.

CONFLICTS OF INTEREST

The author declares that there are no ethical issues or conflicts of interest in this research.

Footnotes

1. The organizers of the conference surveyed the students for internal evaluation purposes and anonymized the data for the purpose of academic analysis. Consequently, Institutional Review Board approval was not obtained because the collection of these data was not originally intended for academic publication. The data are being analyzed, with permission of the organizations involved, as secondary data.

2. Note that measurements were slightly different in 2019 and 2020 to account for the fact that in 2019, only Houston students were surveyed, whereas in 2020, students from multiple states were surveyed. Additionally, questions that were more salient to civic knowledge in each year were asked. However, questions in each sample were designed to capture similar elements of civic knowledge.

REFERENCES

Bolsen, Toby, Evans, Michael, and Fleming, Anna McCaghren. 2016. “A Comparison of Online and Face-to-Face Approaches to Teaching Introduction to American Government.” Journal of Political Science Education 12 (3): 302–17.
Center for Information & Research on Civic Learning and Engagement. 2022. “Youth Voter Turnout and Impact in the 2022 Midterm Elections.” https://circle.tufts.edu/sites/default/files/2022-12/early_data_youth_vote_report.pdf.
Chadha, Anita. 2017a. “Comparing Student Reflectiveness in Online Discussion Forums Across Modes of Instruction and Levels of Courses.” Journal of Educators Online 14 (2): 1–19.
Chadha, Anita. 2017b. “Learning to Learn: Lessons from a Collaboration.” Journal of the Scholarship of Teaching and Learning 17 (3): 34–47.
Chadha, Anita. 2018. “Virtual Classrooms: Analyzing Student and Instructor Collaborative Experiences.” Journal of the Scholarship of Teaching and Learning 18 (3): 55–71.
Chadha, Anita. 2019. “Personalizing and Extending Deliberation in the Online Classroom: Future Horizons.” Journal of Educators Online 16 (2): 1–20.
Claassen, Ryan L., and Monson, J. Quin. 2015. “Does Civic Education Matter? The Power of Long-Term Observation and the Experimental Method.” Journal of Political Science Education 11 (4): 404–21.
Engzell, Per, Frey, Arun, and Verhagen, Mark D. 2021. “Learning Loss Due to School Closures During the COVID-19 Pandemic.” Proceedings of the National Academy of Sciences 118 (17): e2022376118.
Finkel, Steven E., Neundorf, Anja, and Ramírez, Ericka Rascón. 2023. “Can Online Civic Education Induce Democratic Citizenship? Experimental Evidence from a New Democracy.” American Journal of Political Science. https://doi.org/10.1111/ajps.12765.
Forestiere, Carolyn. 2015. “Promoting Civic Agency through Civic Engagement Activities: A Guide for Instructors New to Civic Engagement Pedagogy.” Journal of Political Science Education 11 (4): 455–71.
Hepburn, Mary A., Niemi, Richard G., and Chapman, Chris. 2000. “Service Learning in College Political Science: Queries and Commentary.” PS: Political Science & Politics 33 (3): 617–22.
Hoffman, Adam H. 2015. “Institutionalizing Political and Civic Engagement on Campus.” Journal of Political Science Education 11 (3): 264–78.
Holbein, John B., and Hillygus, D. Sunshine. 2017. “Making Young Voters: The Impact of Preregistration on Youth Turnout.” American Journal of Political Science 60 (2): 364–82.
Huerta, Juan Carlos, and Jozwiak, Joseph. 2008. “Developing Civic Engagement in General Education Political Science.” Journal of Political Science Education 4 (1): 42–60.
Lamb, Matt. 2020. “Mi Familia Vota.” In Voting and Political Representation in America: Issues and Trends, ed. Jones, Mark, 376–78. New York: Bloomsbury Publishing, Inc.
Lamb, Matt. 2023. “Replication Data for ‘Logging in to Learn: The Effects of Online Civic Education Pedagogy on a Latinx and AAPI Civic Engagement Youth Conference.’” PS: Political Science & Politics. https://doi.org/10.7910/DVN/JS6AVN.
Lamb, Matt, Perry, Steven, and Steinberg, Alan. 2022. “From Classroom to Community: An Assessment and Potential Implications of an Undergraduate Civic Engagement Research and Learning Program.” Journal of Political Science Education 19 (2): 250–69. https://doi.org/10.1080/15512169.2022.2128811.
Nelson, Matthew D. 2019. “Teaching Citizenship: Race and the Behavioral Effects of American Civic Education.” Journal of Race, Ethnicity, and Politics 6 (1): 130.
Nelson, Matthew D. 2021. “Cultivating Youth Engagement: Race and the Behavioral Effects of Critical Pedagogy.” Political Behavior 43: 751–84.
Pirracchio, Romain, Resche-Rigon, Matthieu, and Chevret, Sylvie. 2012. “Evaluation of the Propensity Score Methods for Estimating Marginal Odds Ratios in Case of Small Sample Size.” BMC Medical Research Methodology 12 (70): 1–10.
Putnam, Robert D. 2000. Bowling Alone: The Collapse and Revival of American Community. New York: Simon & Schuster.
Toness, Bianca, and Lurye, Sharon. 2022. “Massive Learning Setbacks Show COVID’s Sweeping Toll on Kids.” U.S. News & World Report, October 28. www.usnews.com/news/us/articles/2022-10-28/massive-learning-setbacks-show-covids-sweeping-toll-on-kids.
Van Assendelft, Laura. 2008. “‘City Council Meetings Are Cool’: Increasing Student Engagement through Service Learning.” Journal of Political Science Education 4 (1): 86–97.
Zukin, Cliff, Keeter, Scott, Andolina, Molly, and Jenkins, Krista. 2006. A New Engagement? Political Participation, Civic Life, and the Changing American Citizen. New York: Oxford University Press.
Table 1 Pre- and Post-Survey Response Questions and Statements

Figure 1 Improvement in Civic Knowledge

Figure 2 Improvement in Community Consciousness

Figure 3 Improvement in Self-Efficacy

Table 2 Coefficients Table
