
4 - Privacy in Practice

A Socio-technical Integration Research (STIR) Study of Rules-in-Use within Institutional Research

from Part I - Personal Information as a Knowledge Commons Resource

Published online by Cambridge University Press: 29 March 2021

Madelyn Rose Sanfilippo
Affiliation:
University of Illinois, Urbana-Champaign
Brett M. Frischmann
Affiliation:
Villanova University School of Law
Katherine J. Strandburg
Affiliation:
New York University School of Law

Summary

The rules and norms that shape the practices of institutional researchers and other data practitioners with regard to student data privacy in higher education could be studied using descriptive methods, which illustrate what is actually being done in this space. But we argue that it is also important for practitioners to become reflexive about their practice while they are in the midst of using sensitive data, in order to make responsive practical and ethical modulations. To achieve this, we conducted a socio-technical integration research (STIR) study. In the data from this STIR of a single institutional researcher, we see some evidence of changes in information flows, reactions to those changes, and new ways of thinking and doing that reestablish privacy-protecting rules-in-use.

Type: Chapter

Publisher: Cambridge University Press

Print publication year: 2021

This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 https://creativecommons.org/cclicenses/

4.1 Introduction

The ubiquity of information systems supporting work on university campuses has led to an undeniable increase in the quantity of institutional data. Higher education institutions have taken note of the trove of data to which they now have access, arguing that they have a responsibility to use data in service to their administrative and educational missions and to act on accountability pressures from external constituents by using data to identify actionable insights directed toward institutional improvement (Prinsloo and Slade, 2014). In response to this influx of institutional data and to address mounting external pressures, learning analytics (Johnson, Smith, Willis, Levine, and Haywood, 2011) and other data-based research and practitioner communities have emerged, while existing communities, such as institutional research, are transforming their practices to account for the evolving data environment (Zilvinskis, Willis, and Borden, 2017).

Within this landscape, significant new privacy issues are emerging as a result of changing data use practices and the sociopolitical pressures on higher education institutions to surface, analyze, and act on data. One of the questions associated with these issues concerns how higher education actors are handling private data, especially student data, in praxis given the increasing sensitivity of the data (Slade and Prinsloo, 2013). However, the existing rules and norms that govern the privacy practices of institutional researchers and other data practitioners are often unable to account for the nuances of data privacy in praxis (Fuller, 2017b; Zeide, 2016), which has led to informal and implicit institutional policies about student data privacy (Fuller, 2017b).

The rules and norms that shape the practices of institutional researchers and other data practitioners with regard to student data privacy in higher education could be studied using descriptive methods, which illustrate what is actually being done in this space. But we argue that it is also important for practitioners to become reflexive about their practice while they are in the midst of using sensitive data, in order to make responsive practical and ethical modulations.

To achieve this, we conducted a socio-technical integration research (STIR) study (Fisher, 2012). STIR provides structured opportunities for research participants to integrate perspectives and methods from the social sciences and humanities. The STIR method targets small teams or groups of participants, often scientific laboratory researchers. We adapted this method to STIR a single institutional researcher over an extended period of time. The participant's responsibilities entailed, among other things, conducting statistical analyses on important administrative metrics, such as retention, recruitment, and enrollment, for their university's administration. Drawing on Crawford and Ostrom's (1995) institutional grammar, we assessed the rules, norms, and strategies that governed the participant's practices as they related to data privacy. This theoretical joining of STIR and institutional grammar helped us answer the general research question: What rules-in-use govern the participant's privacy practices, and how might STIR lead to modulations in those practices? In summary, the findings reveal that the participant was encouraged to reflect on the conditions of her context, her agency to modulate her own work, and whether existing rules, norms, and strategies are justifiable. These reflections, in turn, led to active modulations in which her practices were modified to more explicitly consider privacy or, at the least, gave rise to ideas for future privacy-focused initiatives (e.g., data management strategies and documentation processes).

4.2 Data Analytics in Higher Education and Challenges to Contextual Integrity

4.2.1 The Value of Analytics

The advent of new technologies and analytical techniques is enabling the proliferation of data and information within higher education institutions. Goldstein and Katz (2005, 11) explain that "the challenge [to colleges and universities] is no longer the lack of access to timely information"; it is the ability to make actionable decisions based on available information. In the early aughts, universities began to develop capacity for what was then called "academic analytics." Like business intelligence, academic analytics is the use of various technological systems and applications to analyze accessible institutional data in support of decision-making.

Much of the capacity-building done in support of academic analytics has led to additional analytic practices serving various ends, in part due to function creep. Most prominent among these practices is the learning analytics movement. Since 2010, institutions have methodically worked to make data about students that were once "unseen, unnoticed, and therefore unactionable" visible and analyzable (Bienkowski, Feng, and Means, 2012, ix). Learning analytics is defined as "the interpretation of a wide range of data produced by and gathered on behalf of students in order to assess academic progress, predict future performance, and spot potential issues" (Johnson et al., 2011, 28). A driving goal of learning analytics is to "tailor educational opportunities to each student's level of need and ability" (Johnson et al., 2011, 28), but learning analytics is not just about learners: it is also about the learning context and can be used to "assess curricula, programs, and institutions" (Johnson et al., 2011, 28). To a lesser extent than learning analytics, institutions have also begun using their information infrastructures to mine and analyze data about faculty performance and productivity (see Flaherty, 2016; Patel, 2016).

Why are higher education institutions pursuing analytics (academic, learning, faculty, or otherwise)? Campbell, DeBlois, and Oblinger (2007, 42) present analytics as a sort of salve for higher education, writing that "academic analytics is emerging as a new tool that can address what seem like intractable challenges." As in other contexts, institutional actors and higher education pundits have applied powerful metaphors to express – and influence – the role of data mining at the university level (Stark and Hoffmann, 2019). Some argue that the data and information institutions can aggregate and analyze are akin to valuable natural resources, like oil and gold, in that they have social, political, and economic value (see Mayer-Schönberger and Cukier, 2014; Watters, 2013). Proponents of analytics argue that the information analytics create can help institutions defend themselves against mounting accountability pressures and offer useful insights about resource usage in languishing economic times (Prinsloo and Slade, 2014).

4.2.2 Competing Interests

The turn toward data analytics in higher education raises particular questions about the effects of an increasingly data-driven, technocratic institution (Slade and Prinsloo, 2013). Maturing institutional data infrastructures enable administrative surveillance of researcher productivity, instructional methods, and the day-to-day life of students, which in turn allows for granular reforms of programs, practices, and people – all in the name of institutional effectiveness (Selwyn, 2014; Williamson, 2018). Reflecting on this point, Johnson (2016, 27) argues:

Data systems … are too often assumed to be value-neutral representations of fact that produce justice and social welfare as an inevitable by-product of efficiency and openness. Rarely are questions raised about how they affect the position of individuals and groups in society. But data systems both arbitrate among competing claims to material and moral goods and shape how much control one has over one’s life.

It could be that data analytics privilege bureaucratic and politically expedient outcomes in ways that suppress what is otherwise "educationally desirable" (Slade and Prinsloo, 2013), including the development of just educational systems that support student autonomy and well-being (Rubel and Jones, 2016). Important questions emerge: Who has the power to wield institutional data, to what ends are analytics directed, and whose interests are served (or ignored)? Kitchin (2014, 165) reminds us that "there is a fine balance between using data in emancipatory and empowering ways, and using data for one's own ends and to the detriment of others, or in ways contrary to the wishes of those the data represent."

Government actors, institutional administrators, parents and guardians, and companies that develop and participate in educational analytics, among others, all have varying interests in maximizing value from analyzable data (Ferguson, 2012; Rubel and Jones, 2016). The stated benefits include increases in students' academic success, but analytics also enable other actors to gain financial, social, and reputational advantages.

Consider the following examples of plausible conflicts of interest. Administrators want to decrease time-to-degree measures and increase graduation rates. One method may be to use analytics to direct students to enroll in academic programs or courses for which they meet a threshold of predicted success, say 75 percent. Students share the same goals, but forcing them down an academic path not of their choosing will not benefit them if they find their future careers dull and uninteresting. Where faculty are concerned, analytics may enable tenure and promotion committees to conduct peer-institution comparisons of research output and impact, which help them make quicker recommendations and strategically build a core faculty according to standardized metrics. However, these analytics are decontextualized and limited; tenure and promotion candidates may not be given the opportunity to tell a complete story about their body of work. These competing interests highlight the fact that data and information are not neutral artifacts; instead, they are "cooked" with the motivations of those who wield data and analytic tools (Bowker, 2013).

4.2.3 The Appropriate Flow of Information

When information flows change in ways that run counter to normative expectations, privacy is put at risk within a given context (Nissenbaum, 2010). With higher education analytics, the creation of new information flows – many of which contain identifiable data – and the alteration of existing flows to support analytic practices have raised privacy concerns, primarily but not exclusively regarding students (Pardo and Siemens, 2014; Rubel and Jones, 2016). The problem with higher education analytics is that the elements of contextual information flows are all affected in some way by emerging analytic infrastructures, related practices, and changes in who is able to access and use data – which are indubitably affected by shifting politics and administrators' neoliberal interests (Heath, 2014). Some existing informational norms are, therefore, incapable of providing clear direction in this era of analytics. As a result, institutional actors may find themselves making sensitive, and often critical, data privacy decisions based on their own personal values and ethical judgment. For the purposes of this chapter, we focus on institutional researchers, whose very role dictates that they access, manage, and analyze an array of data to inform institutional practices.

4.3 A Socio-Technical Integration Research Study of an Institutional Researcher

4.3.1 Downstream Effects, Impacting Midstream Practice

There is a need to better understand how higher education's information workers, like institutional researchers, make sense of their moral practices as they incorporate data analytics into important decision-making strategies. Instead of looking at downstream effects and shining the proverbial light after the fact, there is a need to look at – and influence – the design of ethically sensitive data technologies and practices further upstream. These efforts are crucial for identifying problems before they are baked into socio-technical data analytics systems and before individuals are made into, and treated as, data (Jones and McCoy, 2018). We argue that the socio-technical integration research (STIR) method can lead to positive upstream engagement and useful modulations at the midstream level.

STIR enables research practitioners – laboratory scientists, engineers, technologists, and information professionals – to consider perspectives from the humanities. STIR projects pair practitioners with embedded social scientists who together work to “unpack the social and ethical dimensions of research and innovation in real time and to document and analyse the results” (Fisher, 2010, 76). These partnerships enable researchers to study the practices of their research practitioner partners, while engaging them in conversations that explore the societal and ethical dimensions of their work. Surfacing these issues provides the conditions necessary for research practitioners to reconsider their efforts and make midstream modulations that reduce downstream harms.

4.3.2 Socio-Technical Awareness

During their time together, the STIR researcher works to move the practitioner toward "reflexive awareness," or an attentiveness to "the nested processes, structures, interactions, and interdependencies, both immediate and more removed, within which they operate" (Fisher, Mahajan, and Mitcham, 2006, 492). Such awareness provides the conditions necessary for practitioners to consider their socio-political position, usage of resources, and ethical reasoning, among other things, which can give rise to "goal-directed" (Fisher, Mahajan, and Mitcham, 2006, 492) modulations that directly impact current practices. To build toward this opportunity for change, the researcher structures discussion protocols around four basic questions, which are asked in relation to a specific practice:

  1. What are you doing?

  2. Why are you doing it?

  3. How could you do it differently?

  4. Who might care?

The first question establishes the particulars of a practice (e.g., cleaning a laboratory table with disinfectant or developing an algorithm), while the second prompts the practitioner to take up the underlying justification(s) for the action. Question three begins to nudge the practitioner toward reflexivity by providing the intellectual time and space to consider alternative ways of doing and other justificatory reasons. The fourth and final question stimulates the practitioner to reflect on the present and proposed altered practice by considering relevant stakeholders and downstream effects thereon.

4.3.3 Modulations in Practice

There are three stages of identifiable modulations: de facto, reflexive, and deliberate. With de facto modulations, research data indicates that socio-technical integration occurs, but the research participant does not actively reflect on the integration because there is no incentive to do so. Reflexive modulations by participants arise because of heightened awareness of socio-technical considerations brought about by working with the researcher. In these cases, participants explicitly notice how social influences (e.g., actors, politics, values, resources, etc.) interact with a given practice. At the deliberate modulation stage, participants begin to act on their reflexive modulations. They take stock of their heightened awareness of the socio-technical milieu to plan strategies, curate resources, and make changes in their practices. Such changes may simply make their current practice more efficient and effective, and this would be a first-order deliberate modulation. But if the participant makes changes to alter the goals, objectives, and assumptions of the project due to enhanced social sensitivity, then these changes would be second-order deliberate modulations. In the remainder of this chapter, we discuss our work using STIR to study an institutional research practitioner and the participant’s privacy practices in praxis.

4.3.4 Joining STIR with Institutional Grammar

We integrated the STIR method (Fisher, Mahajan, and Mitcham, 2006; Fisher and Schuurbiers, 2013) with Crawford and Ostrom's (1995) institutional grammar for identifying rules-in-use, as expanded to address information privacy concerns (Sanfilippo, Frischmann, and Strandburg, 2018). The STIR approach was used to probe the research participant to consider and acknowledge the implicit socio-technical characteristics that guided her practice and that of her office, with the intention that, once made explicit, these characteristics would lead to identifiable modulations in the participant's privacy practices. Institutional grammar was used to assess the rules-in-use: rules, such as policies and laws; norms that govern privacy practices in institutional research; and strategies that shape the privacy practices of institutional researchers.

4.3.5 Study Design

The study’s participant was a single institutional researcher at a mid-sized public university. The participant’s institutional research responsibilities entail, among other things, conducting statistical analyses on important administrative metrics, such as retention, recruitment, and enrollment, and providing this information to their institution’s administration. Over four months, we conducted twelve semi-structured in-person and virtual interviews with the participant. Furthermore, during the interviews, the participant often shared data artifacts, such as an ongoing project on enrollment projections and trends, while discussing the practices associated with their everyday work. While studying one participant is a unique sample size, the STIR method has traditionally been used with small teams of scientific laboratory workers. Studying just one institutional researcher is adequate given the often solo nature of this type of professional’s work. Moreover, working with one individual allowed us to develop an intimate rapport and gain access to sensitive information shared by the participant, which may have been held back if we had also been working with her peers.

We developed an interview protocol to guide the participant in reflecting on her privacy practices and those of her staff within the office of institutional research. The interviews sought to elicit from the participant reflections on four decision components: the institutional research activities she engages in (opportunities); the reasons for and against her practices (considerations); possible alternative approaches to her activities and reasons that might lead to acting on those alternatives (alternatives); and the possible outcomes if such alternatives were acted upon (outcomes) (Flipse, van der Sanden, and Osseweijer, 2013). These reflections helped us identify the rules-in-use, values, goals, and other socio-technical variables that shaped the practitioner's privacy practices.

4.3.6 Data Analysis Procedures

We digitally recorded all interviews, using the audio to create transcriptions for coding purposes. We imported transcripts into MAXQDA, a qualitative data analysis application, and then coded interviews using a two-stage approach. First, Crawford and Ostrom's (1995) institutional grammar was used to identify the rules-in-use that governed the practitioner's institutional research privacy practices. These codes captured rules, norms, and strategies, along with each rule-in-use's associated attributes, aims, conditions, deontics, and consequences, following the Governing Knowledge Commons (GKC) framework devised by Sanfilippo, Frischmann, and Strandburg (2018). As these items were coded, we also coded for the level at which a particular rule-in-use existed: individual, office, institution, or external to the institution. Second, the interviews were coded based on the STIR approach to identify the four socio-technical decision components, followed by codes identifying the various socio-technical modulations that emerged throughout the interview process. What follows is relevant background information on institutional researchers and the findings from our GKC-informed STIR.
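To make the two-stage coding scheme concrete, the sketch below renders it as a simple data structure. This is purely illustrative: the study coded transcripts in MAXQDA, not in custom code, and the class names, fields, and example values here are our hypothetical rendering of how one coded excerpt combines institutional grammar components with levels and STIR codes.

```python
# Illustrative only: a schematic rendering of the study's codebook structure.
# All names and example values are hypothetical, not the actual MAXQDA codes.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class RuleInUse(Enum):
    """Crawford and Ostrom's (1995) institutional statement types."""
    STRATEGY = "strategy"  # attribute + aim + condition
    NORM = "norm"          # adds a deontic (e.g., may, must, must not)
    RULE = "rule"          # adds an "or else" consequence


class Level(Enum):
    """Levels at which rules-in-use were coded in this study."""
    INDIVIDUAL = "individual"
    OFFICE = "office"
    INSTITUTION = "institution"
    EXTERNAL = "external-to-the-institution"


@dataclass
class CodedStatement:
    """One coded excerpt: grammar components (stage 1) plus STIR codes (stage 2)."""
    attribute: str                        # to whom the statement applies
    aim: str                              # the action being governed
    condition: str                        # when/where the statement applies
    statement_type: RuleInUse
    level: Level
    deontic: Optional[str] = None         # present for norms and rules
    consequence: Optional[str] = None     # present for rules only
    stir_component: Optional[str] = None  # opportunity/consideration/alternative/outcome
    modulation: Optional[str] = None      # de facto, reflexive, or deliberate


# Example: the office-level norm of de-identifying student data in reports.
deidentification = CodedStatement(
    attribute="institutional research staff",
    aim="de-identify student data in reports, dashboards, and data sets",
    condition="when sharing institutional research products outside the office",
    statement_type=RuleInUse.NORM,
    level=Level.OFFICE,
    deontic="must",
)
```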

4.4 The Role of Institutional Researchers in Higher Education’s Analytic Practices

4.4.1 Higher Education Intelligences

Institutional research is a branch of educational research that concentrates on improving the "understanding, planning, and operating of institutions of postsecondary education" (Peterson, 1999, 84). The role of institutional researchers, then, is to provide information to institutional administrators to aid in the improvement of planning, policy generation, and effective decision-making. Volkwein, Liu, and Woodell (2012, 23) suggest that the institutional researcher is engaged in three areas of study, which they call the "golden triangle of institutional research":

  1. institutional reporting and administrative policy analysis;

  2. strategic planning, enrollment, and financial management;

  3. outcomes assessment, program review, accountability, accreditation, and institutional effectiveness.

Furthermore, institutional researchers are called upon not only to provide information to facilitate improvement in these areas, but to actively engage in information sharing practices that contribute to organizational learning and, in turn, improve institutional effectiveness (Borden and Kezar, 2012).

Effective institutional research practices require that the institutional researcher engage three types of intelligences as they relate to their institution and higher education in general: technical and analytical; issues; and contextual and cultural (Terenzini, 1999). Given the diversity of these intelligences, institutional researchers have to balance various, and often competing, demands from administrators internal to their institutions and from external constituents, including state and federal policy makers and their local communities (Volkwein, 1999). One such balancing act concerns what Volkwein (1999, 13) calls "enrollment pressures": institutions are "asked to simultaneously admit more students (for financial health and access) and become more selective (to bolster academic standards and performance measures)." For this reason, he likens the institutional researcher to Janus, the two-faced Roman god of "doors and gateways," in that they must look inward toward internal improvement while contemporaneously facing outward to ensure that they are attuned to external accountability demands.

4.4.2 Information and Knowledge Managers

In addition to appraising the demands of variegated internal and external actors, Serban (2002) emphasizes the institutional researcher's role in managing the flow of institutional data and information throughout their institutions. It is for this reason that the institutional researcher should also be understood as their institution's knowledge manager, responsible for the "processes that underlie the knowledge management framework – creation, capturing, and sharing of knowledge – that serve both internal and external purposes and audiences" (Serban, 2002, 105). Understanding and addressing the complexity of institutional information and data flows throughout institutions of higher education is necessary given growing interest in advanced analytic practices.

Where institutional researchers once served as their institution's "one source of truth," this new environment is leading to situations where "decision makers at all levels are establishing their own data collection processes and analytics" (Swing and Ross, 2016, 5). Zilvinskis, Willis, and Borden (2017, 12) argue that broad interest in analytics across campus units and offices puts institutional researchers in a different role, writing: "[w]orking on learning analytics projects requires IR staff to engage with colleagues who tend to use information in operational and individualized contexts rather than the more strategic and aggregate uses to which they are accustomed." This is so because institutional research offices are no longer the primary source of data, information, and analytic insights; each academic unit and office increasingly uses highly contextual data to serve its information needs. For instance, advisors are adopting analytic systems to analyze student movement through curricula, and information technology offices are developing their own metrics and data dashboards to evaluate system usage and services. Swing and Ross (2016) contend that because data flows are becoming more complex and analytics more widespread, institutional researchers should become more actively engaged in managing and shaping policies regarding the flow and use of institutional information and data.

4.4.3 Governing Sensitive Institutional Data and Information

Among professional institutional research associations and in the research literature, there have been ongoing conversations about the principles, rules, and national and institutional policies that do or should govern uses of institutional information (Shiltz, 1992). Much of this work concerns privacy as it relates to security, confidentiality, and appropriate use. And since student data and information are of chief importance in institutional research, policy conversations tend to revolve around students more than faculty and staff.

On a national level, institutional research practices are bound by the Family Educational Rights and Privacy Act (FERPA). The law dictates that educational institutions that receive federal funding must protect and hold in confidence student data and information considered part of a student's identifiable educational record. Institutional actors have the right to access such records when they have a legitimate educational interest in doing so. However, Fuller (2017a) argues that institutional researchers are often unaware of or undertrained regarding FERPA: in a survey of 232 institutional researchers, 53 percent were self-taught about FERPA, while 22.5 percent had received no training at all. This lack of FERPA knowledge should be a matter of concern, given that data breaches and other FERPA violations have led to litigation against numerous institutions in recent years (Fuller, 2017b). Knowledge of and training with regard to FERPA is especially important given that the law's definitions and requirements are imprecise and open to bending by institutional interpretation (Zeide, 2016).

Information sharing and information flow practices are also guided by professional ethics, institutional policies, and personal values (Fuller, 2017b). Regarding professional ethics, the Association for Institutional Research (2013) outlines how institutional researchers should handle privacy issues in its Code of Ethics. However, the code is scant on this issue and merely states that institutional researchers should balance privacy risks and confidentiality against the potential benefits that the information can provide to the institution. Additionally, institutional researchers' practices are supposed to be informed by internal institutional policies on data privacy; however, Fuller (2017b) claims that many institutions do not have formal written policies.

4.4.4 Ethical Murkiness

The ambiguity of federal law, the "squishiness" of codes of ethics, and, possibly, the lack of guiding institutional information policy lead institutional researchers into a murky, ethical gray area. Researchers acknowledge that educational data analytics – a social and technological practice – raise significant ethical concerns (see Slade and Prinsloo, 2013). If the data analytics produced by institutional researchers and others are to be considered trustworthy and legitimate, then those who produce them must attend to the ethical issues, the so-called "critical barriers" that will determine the success and failure of data-based analytic initiatives (Gašević, Dawson, and Jovanović, 2016, 2). Higher education analytics are "moral practice[s]" (Slade and Prinsloo, 2013, 1519) that must account for actual and potential harms brought about by data and information access, analysis, and use (Pardo and Siemens, 2014).

4.5 Governance in Practice: STIR Findings

4.5.1 Attributes

The findings below highlight how information resources, policies, institutional actors, and various – and sometimes divergent – goals and objectives influence and frame the work done by the institutional researcher who participated in our socio-technical integration research (STIR). First, it is important to briefly highlight how these attributes make up the participant's contextual background.

The participant's resources are data-based and informational. She relied on datasets in various forms to complete her responsibilities, primarily using a centralized data warehouse to access and export data to her local computer for statistical analysis and data visualization. Some, but not all, datasets were shared on a local network in the office of institutional research, with specific user permissions set to limit access and protect sensitive data. Notably, the participant described the office's information infrastructure as differing in its data security protections from those of other offices on campus. The datasets comprised identifiable and de-identified student data, in addition to "raw" and aggregate data provided by other institutional offices, including, among others, human resources and admissions.

The participant's office was, as expected, composed of a staff of roughly twenty individuals, including administrators, data analysts, institutional researchers, and part-time graduate assistants. The office did not work in a silo: it often collaborated with institutional administration to provide actionable information and worked with other offices on campus when specific projects required access to and analysis of institutional data. The office's work was shaped and at times limited by the political interests of those to whom it reported data findings, as well as by policy set by the institution's office of information technology.

The goals and values of the institutional researcher and her office were not made explicit during interviews. However, they became clear upon examining the office's documentation. The office strives to provide actionable information to support decision-making throughout the campus, as well as to support the institution's wider goals around student success and the effective operation of the campus. Notably, the office explicitly aims to provide access to a data infrastructure and related tools, signaling that its staff wish to be enablers of data – not gatekeepers – and to help institutional colleagues leverage data in innovative ways.

4.5.2 Existing Rules-in-Use

The analysis of the conversations with the participant uncovered the rules-in-use (norms, rules, and strategies) that govern her work practices with institutional data. Furthermore, the analysis found that attending to the levels at which rules-in-use occur is important for understanding how they emerge and differ in practice. For this analysis, we found rules-in-use at the following levels: individual, office, institution, and external-to-the-institution. The level of a rule-in-use affected its scope and determined what it governed and how. Understanding rules-in-use and the levels at which they occur will be important for the STIR findings discussed below, where we explore the participant's reflections upon, and her modulations of, her privacy practices and, in turn, the rules-in-use that govern her practice.

The participant's privacy practices are governed by a variety of rules, most of which occur at the external-to-the-institution, institution, and office levels. Regarding the rules external to the institution, she and her colleagues are required to follow the appropriate FERPA guidelines. At the university level, rules require that the participant and her colleagues comply with requirements related to data sharing and use, such as ensuring that data consumers have the proper data use training and have signed the institution's data use agreement in order to receive and share data. At the office level, the participant described rules primarily related to working with the university's institutional review board (IRB) prior to conducting research and ensuring that she and her colleagues are up to date with institutional and federal privacy policies.

The norms predominantly occur at the institution and office levels. At the institution level, the norms revolve around data use for institutional improvement. According to the participant, the institutional norm is that data should be shared and made available to those seeking to develop useful insights that improve the educational mission of the institution. As the participant stated, "I think it's been a policy [at the university] that we share information and we don't try to silo things." Given this, her office is expected to collaborate with and support other offices across the campus. The office-level norms that guide the participant and her office's practices relate primarily to protocols for sharing data with those external to the office and for de-identifying student data in their institutional research products, such as reports, dashboards, and data sets.

The strategies that govern the participant's privacy practices primarily occur at the office and individual levels. These relate to spatial privacy practices and the appropriate use of student data. Spatial privacy practices refer to how the participant and her colleagues consider and modify their work and office spaces to ensure that student data are kept secure. Regarding office-level strategies for appropriate data use, the participant stated that they assist data consumers in developing their business use cases when requesting access to institutional data. In addition, her office collaborates with other campus offices when those offices have questions about data access and use rules and norms.

An interesting trend emerged from the analysis of the rules-in-use occurring at the office and individual levels. The office- and individual-level rules-in-use that the participant and her colleagues developed arose in response to instances when the institution failed to adequately govern the participant's privacy practices. In some instances, office- and individual-level rules-in-use explicitly contradicted those of the institution; in other situations, they were developed in response to gaps at the institutional and federal levels. These contradictions and gaps are further explained in the following section, where we discuss the findings of the STIR analysis.

4.5.3 Rethinking Rules-in-Use with Socio-Technical Integration Research

Conversations with the participant probed the socio-technical conditions of her work regarding data practices and privacy. These probing questions enabled the participant to reflect on her workaday routines and nudged her to examine the criteria (e.g., values, principles, procedures) that inform the decisions that are part of those routines. As we describe below, the participant was acutely knowledgeable about privacy issues and related processes and procedures, and had even developed unique privacy-protecting strategies.

The participant was keenly aware that the data to which she had access, especially student data, were sensitive and needed to be kept secure. She expressed a personal ethos of responsibility, suggesting several times that data handling actions needed care and attention to potential downstream privacy effects. When asked why she felt this normative responsibility, she replied, "Why is it so important to protect student data in the way that we are? Because we're here for the students because we want to make sure that we're not creating any kind of violation, that we're not violating this trust that they have." Notably, she suggested that her institutional research peers subscribe to this ethos as well. Her motivation for protecting the privacy of those she analyzes in data stems from a sense of obligation to uphold the trust data subjects have in her, and in the institution, to use data appropriately. If trust was something that could be violated, we asked, then what would be the consequences? To this probing question, she suggested that (1) students would not be willing participants in research projects and (2) her office would "lose access to some of the data that we need to be able to do our jobs" due to non-compliance with existing policies.

Before beginning her analytical work, the participant claimed that she strategically worked with her institution's IRB, using it as a means to discuss and protect data privacy. "We tend to err on the side of caution," she said, "and at least talk to IRB about every single project that we do … . We want to at the very least make IRB aware of [the project] and get some sort of approval." Pursuing an "ethical consultation" with the IRB, to the participant, would help her understand whether her work was "consistent with good ethical research practices," in compliance with federal rules, and in alignment with what the institution expects regarding access to student data. The IRB could help her limit downstream harms, such as the one she described in conversation:

It is easy to see how that sort of access to data could be abused, um, should it get into the wrong hands. People could theoretically be linking, you know, survey responses with income data from the [Free Application for Federal Student Aid (FAFSA)] or things like that if it’s not used properly.

After working with the IRB, the participant described the process for gaining access to data. Institutional policy limited who within the institution could gain access to different sources of data. Depending on the source of the data, the participant would have to consult with a dedicated data steward (e.g., the registrar for enrollment data, the bursar for financial data, or a library administrator for library data). Conversations with the stewards required her to “build a business use case to justify” data access and use. The participant emphasized that the creation of the “business use case” was a collaborative process, and she stated that finding the necessary justification would not be as easy at other institutions where data sharing was more restrictive and there was less value placed on analytics based on combinable data from across various offices.

Before conversations gained significant traction, the participant would have to prove that she had successfully passed the institution's FERPA training and consented to its data use agreement; both processes informed the participant of her legal and institutional responsibilities. Since her office also serves as a source of institutional data and information, the participant asked for proof of the same compliance credentials from those with whom she worked within the institution. Notably, she questioned whether others thoughtfully considered the compliance measures as she did, saying, "I would hope that everybody reads that information and takes it seriously; I don't have any kind of assurance." When she provides a data set, she makes the data requester "promise" that data will not be shared unless carefully outlined and approved ahead of time, detail how the data will be used, and share their data deletion strategies, all to make sure that compliance with institutional policy is assured. When it proved difficult to determine a data request's access and use privileges, she consulted other data stewards to seek their interpretations. The participant did not detail requirements that guided these types of conversations and blindly trusted that all data stewards would be just as rigorous in their analysis of data use requests.

About her office’s data privacy practices, she revealed a significant detail concerning the physical layout of her own office and that of her colleagues. The conversation unfolded in this way:

Researcher: You were saying that each analyst I think has a door, right? It’s not in a cubicle. And you were saying your monitors are faced away from the wall.

Participant: We’re all kind of positioned in a way that nobody just walking by can just take a look and peek at your machine.

Researcher: So how did that come about? It’s an interesting decision to make.

Participant: We were just very sensitive about the fact that we had student-level data in our records on our computer at any given time, and we just wanted to be cognizant of the fact that somebody could just come in just happen to accidentally peep over and take a look at something that they weren’t supposed to be looking at. We are responsible for data security for this information … . It is something that I think we do need to be conscious about.

When her office hires student workers, who do not have the benefit of a secure office, the students are made aware that their work may involve sensitive data and that they should situate their computer screens to keep others from looking over their shoulders. The participant noted that her office's privacy-protecting strategies were not as stringent as those in other offices, such as financial aid, whose employees "keep the windows drawn" and do not allow unaccompanied visitors.

Regarding digital data privacy practices, the participant expressed two strategies. First, another employee in the office was in charge of maintaining scoped data pulls from administrative systems (e.g., the student information system) and subsequently checking the veracity of the data. Having a point person for this practice reduced inaccurate data and limited access to data unnecessary for informing analytic projects (i.e., the office followed data minimization principles). Second, any analytics created by the office abided by the office's own rule that any reported count based on a sample size of five or fewer is replaced with an asterisk. For instance, if a data dashboard included aggregate data showing that three Hawaiian/Pacific Islander students reflected a certain behavior or outcome, the number would be masked to reduce reidentification risks. The participant emphasized that this rule went beyond the less stringent requirements set by FERPA and guidance from the registrar's office.
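To illustrate this small-cell suppression rule, the sketch below masks reported counts of five or fewer before release. It is a minimal rendering of the practice as the participant described it; the column names, threshold constant, and function name are our hypothetical choices, not the office's actual tooling.

```python
# A minimal sketch of the office's small-cell suppression rule as described
# by the participant: reported counts of five or fewer become an asterisk.
# Column names, the threshold constant, and the example data are illustrative.
import pandas as pd

SUPPRESSION_THRESHOLD = 5  # counts at or below this value are masked


def mask_small_cells(report: pd.DataFrame, value_col: str = "count") -> pd.DataFrame:
    """Replace small cell counts with '*' to reduce reidentification risk."""
    masked = report.copy()
    masked[value_col] = masked[value_col].apply(
        lambda n: "*" if n <= SUPPRESSION_THRESHOLD else n
    )
    return masked


# Example: a count of three Hawaiian/Pacific Islander students is masked.
dashboard = pd.DataFrame(
    {"group": ["All students", "Hawaiian/Pacific Islander"], "count": [412, 3]}
)
print(mask_small_cells(dashboard))
#                        group count
# 0               All students   412
# 1  Hawaiian/Pacific Islander     *
```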

When the participant described her privacy practices – or, in some cases, a lack thereof – we prompted her to discuss alternative ways of thinking, doing, and valuing. The purpose of this strategy was to provide intellectual space and time to consider how the outcomes of her privacy practices could be different and to reconsider the stimuli motivating standard practices. Responses to this strategy ranged from affirmative alternative designs (e.g., "I could do this … ") to negative responses citing existing conditions (e.g., "There is no other possibility … "). The following highlights two instances where the participant outlined possible alternatives and outcomes.

An ongoing frustration for the participant concerned her relationship with the institution's office of information technology (OIT). Part of her position's responsibility covered creating internal-use and publicly available data dashboards, which required permission from an internal review panel and OIT. She expressed that, even in cases where the internal review panel gives permission, OIT removes the dashboard – and often fails to tell the participant that it has done so. Consequently, she "has to play ball" with OIT even when OIT's decision would align with the internal review panel's in the first place; if she does not, she loses her completed work.

The purpose of OIT's rules is to protect data and the privacy of data subjects. But the participant argued that these rules were too restrictive and unconstructive:

[OIT] kind of feels like everybody outside of [OIT] is the crazy grandmother who’s going to be signing up for Nigerian banking schemes, and they’re going to click on every link and wantonly do all kinds of stuff to make the databases vulnerable … . I do have a degree of empathy. I just kind of wish they would go about it in a way that they don’t treat anybody outside of [OIT] like they’re an idiot.

We confirmed with the participant that the issue boils down to a lack of trust between OIT and the office of institutional research, among others, and asked what she could do to get a different result. Even though she expressed skepticism that OIT would change its behavior and views, she noted that communications between her office and OIT could be more strategic. For instance, any issues with OIT decisions should be communicated by administrators from the office of institutional research, not staffers. Additionally, attempts to "make nice" with OIT are preferable, and probably more efficacious in the long run, than battling OIT's decisions. These alternatives were not optimal, but the participant perceived that they could prove better than existing practices.

Another instance of alternative practices and outcomes concerned the development of new policy. The participant’s status as an administrator, not just an analyst, meant that she had policy-making privileges. If she desired and felt it would be useful, she could develop standardized data use practices with related compliance measures to guide her work and that of her peers within the office of institutional research. When the conversation shifted to this possibility, the response was negative. Her argument against forcing new policy was as follows (note: names changed to protect those referenced):

Well, because we have, like I said, Jane has got a very vested interest in, you know, FERPA and a lot of experience with that. Um, Danica was one of the data stewards for institutional research-level data. You know, we have some expertise in this office, you know, Jared and Kristin manage all of the survey information. Jonathan deals a lot with [human resource] data. We have a lot of expertise in a lot of different data sources and we want to consult everybody and also make sure that we’re on the same page. That’s just kinda the culture of the office. I think also that Susan [a peer administrator] has established that we’re a collaborative group and we want to make sure that we have buy-in from everyone before moving forward with that kind of thing.

Considering the alternative enabled the participant to take stock of a potential outcome, which, even though it was rejected, still proved useful. Thinking through the possibilities enabled her to consider the expertise of her peers (e.g., Jane and FERPA), knowledge of institutional policy and procedure (e.g., Danica as a data steward), the location of various data and who knew of such data (e.g., Jonathan and HR data), and the norms and expectations around collaboration (e.g., as developed under Susan's leadership).

4.5.4 Modulating Practice

With the participant made more aware of the socio-technical dimensions of her work through the STIR conversations, she began to think through different strategies for navigating the rules-in-use governing her data practices and data privacy. Analysis of the findings suggests that the participant engaged in a greater number of reflexive deliberations than deliberate modulations. In what follows, we report on the participant's most clearly articulated reflexive deliberation, regarding foregrounding privacy, and a series of deliberate, second-order modulations.

Towards the end of the interview sessions, the participant began to reflect on the topics covered, issues discussed, and useful takeaways. Not all of these contemplations led to weighty considerations, but one significant reflexive deliberation concerned a new approach to her thinking on privacy. When asked to consider what, if anything, had been influential about her time with the researchers, she answered in detail:

Just to be constantly cognizant about who we’re sharing data with, what relevant policies exists, what are the complications with being able to share information and things like that? Just being cognizant. A lot of times I do try to be cognizant about what are the FERPA implications, what are the [institutional] data sharing policy implications. Um, admittedly I always need refreshers. I feel like I, I’ve done several trainings on them, but they’re, they’re so detailed that I’ve always constantly needing refreshers and I usually err on the side of caution. Um, but to just be constantly cognizant about that when I’m sharing information, I think, is a good step … . These conversations have again, kind of pushed it more to the forefront.

The simple statement of "being cognizant" reveals a heightened awareness in the participant's mind about the importance of privacy in her daily practices. It also demonstrates her recognition that privacy entails a variety of different rules-in-use, not just institutional policy (though she noted that, too). This reflexive deliberation suggests that "being cognizant" takes focus and awareness, yet the common tasks and pressures of the job – not to mention institutional politics – intervene. Being cognizant involves pursuing strategies such as vetting individuals who request sensitive data and more carefully navigating information sharing expectations. Moreover, she believes that keeping privacy at the forefront of her mind will enable her to think more carefully about the downstream consequences of sharing sensitive data. Finally, this deliberation also gave rise to a recognition that committing to ongoing professional education about her privacy responsibilities, and the policies that govern her practices, would better assist her in her work.

Based on the conversations with the participant, three types of deliberate modulations emerged: documentation of office data sharing and security practices; collaboration with data consumers on campus to help them use institutional data properly; and creation of a campus group to determine appropriate data sharing practices. Notably, these modulations reflect an "action arena" in the GKC framework (Sanfilippo, Frischmann, and Strandburg, 2018). These modulations will be addressed shortly, but it should first be noted that all of the participant's modulations are considered second-order, as opposed to first-order, deliberate modulations.

First-order modulations focus on actual changes in the STIR participant's practices, whereas second-order modulations are those in which participants alter their project's goals, objectives, and assumptions to such a degree that they "come to challenge their own established routines of thought and practice, and also crucially, the various external forces which shape these" (Wynne, 2011, 794). In the case of this study's deliberate modulations, by the end of the study the participant had yet to make actual changes in her practices; rather, she reflected upon the need for substantial changes in her future practices and, in some cases, set the stage for first-order modulations to occur.

The first second-order deliberate modulation that the participant reflected upon focuses on creating opportunities for her and her staff to document the implicit strategies and norms that guide their practices, such as spatial privacy practices and the norms that guide how they share data with campus data consumers. As the participant stated, documentation had not been an integral part of her office’s culture: “I think being a little bit more intentional about documenting policies for how we share data and things like that would probably be a good idea for our office and something that we haven’t really thought too much about.”

The participant reflected on how our conversations led her to start a discussion with her supervisor about creating documentation opportunities, which could include setting aside dedicated office time during the week (such as "documentation Fridays"), focusing on documentation strategies as a team during a staff retreat or office meetings, or encouraging staff to document their practices as they work. The participant reflected that documenting office practices would help reify and make explicit her office's rules and norms, and help justify their practices to others within the institution:

We should be able to justify what we’re doing. We’ve always done it that way is not a good excuse for doing anything. So, we should be able to justify what we’re doing and we should be able to document it for our own purposes as well as to better explain to people how we’re doing something.

For the second of the second-order deliberate modulations, the participant reflected upon the need for her and her staff to work actively with data consumers on campus, informing them about proper institutional data use in ways that align with campus and federal-level rules and norms. As the findings suggest, the campus requires data consumers to complete appropriate FERPA training and sign data use agreements. However, the participant noted that ensuring the campus’s existing rules and norms are followed means working with data consumers so that they understand proper data storage practices and protect the institutional data they receive:

[Our conversations have] gotten me to think a little bit more intentionally about making sure that people have use agreements about making sure that people understand data appropriately. Making sure that people are understanding data security and how they keep their information a little bit better and just being a little bit more thoughtful in those kinds of conversations.

The final second-order deliberate modulation concerns the need for a campus-level initiative to develop standardized data sharing practices. As addressed in the previous section, throughout the conversations the participant reflected upon the lack of campus-level rules and norms governing data sharing. In their absence, the various offices on campus with data sharing responsibilities, including her own, have created their own rules and norms governing how data should be shared.

Historically, campus offices had limited interactions with one another regarding data sharing practices; data analysis had not been central to their respective work. But as analysis has become central and pressures to make data-informed decisions have mounted, the participant recognized the need to develop consistent practices and policies across campus. “We’ve never really gotten together a group of people,” she said, “and just kind of discussed it out here, discussed things with the exception of like a handful of large-scale projects, um, that are about to be released.” There was a need, the participant expressed, to create a campus group to discuss standardized data sharing practices on the campus; doing so was, she said, “not something that I had really thought to do before.” She and her supervisor were planning to meet to discuss how they might go about creating this campus group.

4.6 A Concluding Discussion

According to the literature, the socio-technical milieu within higher education is drastically reshaping data and information practices. With various analytics initiatives emerging and institutional actors trying to determine the right sources and types of data as inputs, it should be expected that the rules-in-use, especially policies, governing cutting-edge practices are not clear-cut and standardized. Moreover, as these actors take on new data-driven roles and responsibilities, especially within institutional research, it will take time for useful strategies to form and norms to settle.

We see in the data – the STIR of a single institutional researcher – some evidence of changes in information flow, reactions to those changes, and ways of thinking and doing that reestablish privacy-protecting rules-in-use. A single participant does not make for generalizable results about changes in higher education writ large. However, using STIR to address rules-in-use about privacy has led to notable insights and a potentially valuable research agenda.

The norms, strategies, and rules that govern interactions with sensitive data and information are often taken for granted. They may drive workaday practices, but they rarely give rise to reflexive or deliberative moments about alternative ways of doing. Additionally, rules as they exist as policies are, to many individuals, simply things one follows – not things one seeks to create or change. But with STIR, as the findings make evident, there is an opportunity to make rules-in-use worthy of deliberation: given the space and time to consider them, practitioners can rethink and revise them. As the findings suggest, the act of naming and describing what structures privacy practices creates the circumstances necessary to evaluate rules-in-use, solidify and support those that are successful, fill gaps where they exist, and plan for improvements.

Within higher education and in other contexts where data analytics are gaining interest and momentum, it is an opportune time – if not a necessary responsibility – to investigate data practices. The consequences of predictive analytics, algorithms, black-boxed technological systems, and the data on which they all rely are receiving serious scholarly consideration. But looking downstream is only one way of approaching these issues. Looking upstream at seemingly boring and benign practices, and prodding the actors behind them to reflect on what they do, can produce significant insights for those actors-cum-research participants – insights that lead to altered or new practices more attuned to the socio-technical mélange and its implications. Applying the STIR method to informational privacy rules-in-use, ethics, or related concerns can advance research in this important area.

Footnotes

We would like to thank Erik Fisher, Ira Bennett, Jamey Wetmore, Rider Foley, and all the participants at the 5th Annual Winter School on Responsible Innovation and Social Studies of Emerging Technologies at Arizona State University. Many of the ideas that motivated this project were seeded from constructive conversations at the Winter School. We would also like to thank the participants at the Fairness and Equity in Learning Analytics Systems (FairLAK) workshop hosted at the 2019 Learning Analytics and Knowledge conference, as attendee feedback informed the evolution of this chapter. Please note that parts of this chapter were originally presented in the following paper: Jones, K. M. and McCoy, C. (2019, March). “Ethics in Praxis: Socio-Technical Integration Research in Learning Analytics.” In Companion Proceedings of the 9th International Learning Analytics & Knowledge Conference.

1 Assistant Professor, School of Informatics and Computing, Department of Library and Information Science at Indiana University-Indianapolis (IUPUI). Ph.D., University of Wisconsin-Madison iSchool; M.L.I.S., Dominican University; B.A., Elmhurst College.

2 Ph.D. Candidate, Luddy School of Informatics, Computing, and Engineering, Department of Library and Information Science at Indiana University-Bloomington. M.L.S., Indiana University-Bloomington; B.A., University of Illinois Urbana-Champaign.

3 To be clear, “institutional grammar” has no explicit conceptual or theoretical ties to colleges and universities as institutions. Unless our discussion is situated within the grammar-of-institutions literature, our references to “institutions” concern higher education.

References

Association for Institutional Research. “Code of Ethics and Professional Practice.” Last modified May 2, 2013. www.airweb.org/ir-data-professional-overview/code-of-ethics-and-professional-practice.
Bienkowski, Marie, Feng, Mingyu, and Means, Barbara. Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief. Washington: U.S. Department of Education, 2012. http://tech.ed.gov/wp-content/uploads/2014/03/edm-la-brief.pdf.
Borden, Victor M. H. and Kezar, Adrianna. “Institutional Research and Collaborative Organizational Learning.” In The Handbook of Institutional Research, Howard, Richard D., McLaughlin, Gerald W., and Knight, William E., eds. San Francisco: John Wiley & Sons, 2012, 86–106.
Bowker, Geoffrey C. “Data Flakes: An Afterword to ‘Raw Data’ is an Oxymoron.” In “Raw Data” is an Oxymoron, Gitelman, Lisa, ed. Cambridge: MIT Press, 2013, 167–171.
Campbell, John, DeBlois, Peter, and Oblinger, Diana. “Academic Analytics: A New Tool for a New Era.” EDUCAUSE Review 42, no. 4 (2007): 40–57. www.educause.edu/ero/article/academic-analytics-new-tool-new-era.
Crawford, Sue E. S. and Ostrom, Elinor. “A Grammar of Institutions.” American Political Science Review 89, no. 3 (1995): 582–600. https://doi.org/10.2307/2082975.
Ferguson, Rebecca. “Learning Analytics: Drivers, Developments and Challenges.” International Journal of Technology Enhanced Learning 4, no. 5/6 (2012): 304–317. https://doi.org/10.1504/IJTEL.2012.051816.
Fisher, Erik. “Causing a STIR.” International Innovation (2012): 76–79. https://sciencepolicy.colorado.edu/news/fisher.pdf.
Fisher, Erik, Mahajan, Roop L., and Mitcham, Carl. “Midstream Modulation of Technology: Governance from Within.” Bulletin of Science, Technology & Society 26, no. 6 (2006): 485–496. https://doi.org/10.1177/0270467606295402.
Fisher, Erik and Schuurbiers, Daan. “Socio-Technical Integration Research: Collaborative Inquiry at the Midstream of Research and Development.” In Early Engagement and New Technologies: Opening up the Laboratory, Doorn, Neelke, Schuurbiers, Daan, van de Poel, Ibo, and Gorman, Michael E., eds. Dordrecht: Springer, 2013, 97–110.
Flaherty, Colleen. “Academic ‘Moneyball.’” Inside Higher Ed, December 20, 2016. www.insidehighered.com/news/2016/12/20/mit-professors-push-data-based-model-they-say-more-predictive-academics-future.
Flipse, Steven M., van der Sanden, Maarten C. A., and Osseweijer, Patricia. “Midstream Modulation in Biotechnology Industry: Redefining What Is ‘Part of the Job’ of Researchers in Industry.” Science and Engineering Ethics 19, no. 3 (2013): 1141–1164. https://doi.org/10.1007/s11948-012-9411-6.
Fuller, Matthew. “An Update on the Family Educational Rights and Privacy Act.” New Directions for Institutional Research 2016, no. 172 (2017a): 25–36. https://doi.org/10.1002/ir.20201.
Fuller, Matthew. “The Practices, Policies, and Legal Boundaries Framework in Assessment and Institutional Research.” New Directions for Institutional Research 2016, no. 172 (2017b): 9–23. https://doi.org/10.1002/ir.20200.
Gašević, Dragan, Dawson, Shane, and Jovanović, Jelena. “Ethics and Privacy as Enablers of Learning Analytics.” Journal of Learning Analytics 3, no. 1 (2016): 1–4. https://doi.org/10.18608/jla.2016.31.1.
Goldstein, Phil and Katz, Richard. Academic Analytics: The Uses of Management Information and Technology in Higher Education. Louisville: EDUCAUSE, 2005. https://net.educause.edu/ir/library/pdf/ers0508/rs/ers0508w.pdf.
Heath, Jennifer. “Contemporary Privacy Theory Contributions to Learning Analytics.” Journal of Learning Analytics 1, no. 1 (2014): 140–149. https://doi.org/10.18608/jla.2014.11.8.
Johnson, Jeffrey. “The Question of Information Justice.” Communications of the ACM 59, no. 3 (2016): 27–29. https://doi.org/10.1145/2879878.
Johnson, Larry, Smith, Rachel S., Willis, H., Levine, Alan, and Haywood, Keene. The 2011 Horizon Report. Austin: The New Media Consortium, 2011. https://library.educause.edu/-/media/files/library/2011/2/hr2011-pdf.pdf.
Jones, Kyle M. L. and McCoy, Chase. “Reconsidering Data in Learning Analytics: Opportunities for Critical Research.” Learning, Media and Technology 44, no. 1 (2018): 52–63. https://doi.org/10.1080/17439884.2018.1556216.
Kitchin, Rob. The Data Revolution: Big Data, Open Data, Data Infrastructures, and Their Consequences. Los Angeles: SAGE Publications, 2014.
Mayer-Schönberger, Viktor and Cukier, Kenneth. Learning with Big Data: The Future of Education. New York: Houghton Mifflin Harcourt, 2014.
Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford University Press, 2010.
Oudshoorn, Nelly and Pinch, Trevor. How Users Matter: The Co-Construction of Users and Technology. Cambridge: MIT Press, 2003.
Pardo, Abelardo and Siemens, George. “Ethical and Privacy Principles for Learning Analytics.” British Journal of Educational Technology 45, no. 3 (2014): 438–450. https://doi.org/10.1111/bjet.12152.
Patel, Vimal. “Productivity Metrics: What Is the Best Way to Assess Faculty Activity?” The Chronicle of Higher Education, February 29, 2016. www.chronicle.com/article/Productivity-Metrics/235436.
Peterson, Marvin W. “The Role of Institutional Research: From Improvement to Redesign.” New Directions for Institutional Research 1999, no. 104 (1999): 83–103. https://doi.org/10.1002/ir.10408.
Prinsloo, Paul and Slade, Sharon. “Student Data Privacy and Institutional Accountability in an Age of Surveillance.” In Using Data to Improve Higher Education: Research, Policy and Practice, Menon, Maria Eliophotou, Terkla, Dawn Geronimo, and Gibbs, Paul, eds. Rotterdam: Sense Publishers, 2014, 197–214.
Rubel, Alan and Jones, Kyle M. L. “Student Privacy in Learning Analytics: An Information Ethics Perspective.” The Information Society 32, no. 2 (2016): 143–159. https://doi.org/10.1080/01972243.2016.1130502.
Sanfilippo, Madelyn, Frischmann, Brett, and Strandburg, Katherine. “Privacy as Commons: Case Evaluation Through the Governing Knowledge Commons Framework.” Journal of Information Policy 8 (2018): 116–166. https://doi.org/10.5325/jinfopoli.8.2018.0116.
Selwyn, Neil. “Data Entry: Towards the Critical Study of Digital Data and Education.” Learning, Media and Technology 40, no. 1 (2014): 64–82. https://doi.org/10.1080/17439884.2014.921628.
Serban, Andreea M. “Knowledge Management: The ‘Fifth Face’ of Institutional Research.” New Directions for Institutional Research 2002, no. 113 (2002): 105–112. https://doi.org/10.1002/ir.40.
Shiltz, M. “Ethics and Standards in Institutional Research.” New Directions for Institutional Research, no. 73 (1992): 3–9.
Slade, Sharon and Prinsloo, Paul. “Learning Analytics: Ethical Issues and Dilemmas.” American Behavioral Scientist 57, no. 10 (2013): 1510–1529. https://doi.org/10.1177/0002764213479366.
Stark, Luke and Hoffmann, Anna Lauren. “Data Is the New What? Popular Metaphors & Professional Ethics in Emerging Data Culture.” Journal of Cultural Analytics (May 2, 2019). https://doi.org/10.22148/16.036.
Swing, Randy L. and Ross, Leah Ewing. Statement of Aspirational Practice for Institutional Research. Tallahassee: Association for Institutional Research, 2016. www.airweb.org/aspirationalstatement.
Terenzini, Patrick T. “On the Nature of Institutional Research and the Knowledge and Skills It Requires.” New Directions for Institutional Research 1999, no. 104 (1999): 21–29. https://doi.org/10.1002/ir.10402.
van Dijck, José and Poell, Thomas. “Understanding Social Media Logic.” Media and Communication 1, no. 1 (2013): 2–14. https://doi.org/10.17645/mac.v1i1.70.
Volkwein, J. Fredericks. “The Four Faces of Institutional Research.” New Directions for Institutional Research 1999, no. 104 (1999): 9–19. https://doi.org/10.1002/ir.10401.
Volkwein, J. Fredericks, Liu, Ying, and Woodell, James. “The Structure and Function of Institutional Research Offices.” In The Handbook of Institutional Research, Howard, Richard D., McLaughlin, Gerald W., and Knight, William E., eds. San Francisco: John Wiley & Sons, 2012, 22–39.
Watters, Audrey. “Student Data Is the New Oil: MOOCs, Metaphor, and Money.” Last modified October 17, 2013. http://hackeducation.com/2013/10/17/student-data-is-the-new-oil.
Williamson, Ben. “The Hidden Architecture of Higher Education: Building a Big Data Infrastructure for the ‘Smarter University.’” International Journal of Educational Technology in Higher Education 15, no. 12 (2018): 1–26. https://doi.org/10.1186/s41239-018-0094-1.
Wynne, Brian. “Lab Work Goes Social, and Vice Versa: Strategising Public Engagement Processes.” Science and Engineering Ethics 17, no. 4 (2011): 791–800. https://doi.org/10.1007/s11948-011-9316-9.
Zeide, Elana. “Student Privacy Principles for the Age of Big Data: Moving Beyond FERPA and FIPPS.” Drexel Law Review 8, no. 2 (2016): 339–394. http://drexel.edu/law/lawreview/issues/Archives/v8-2/zeide/.
Zilvinskis, John, Willis, James III, and Borden, Victor M. H. “An Overview of Learning Analytics.” New Directions for Higher Education 2017, no. 179 (2017): 9–17. https://doi.org/10.1002/he.20239.
