
From the 2021 update report

Emerging Challenges: Rising Privacy and Surveillance Concerns

Commercial publishers are increasingly inserting tracking software into services sold to academic libraries, allowing them to collect and sell data to third parties. This practice, along with the risks and inequities of online exam proctoring tools, requires the attention of the academic community.


3. Rising Privacy and Surveillance Concerns in Technology Used by Academic Institutions

Since the start of the pandemic, academic institutions have been confronted with new issues related to the deployment of technology. In particular, three issues require attention from the academic community, each raising different concerns:

  • Insertion of tracking software in services sold to academic libraries

  • Collection and sale of data by some commercial vendors with ties to the academic community to governments and law enforcement

  • Risks and inequities of online exam proctoring tools

Tracking and Monitoring Software

In an October 2020 SNSI (Scholarly Networks Security Initiative) webinar, a group of scholarly journal publishers unveiled a plan to insert monitoring software into their platforms to protect copyrighted content from cyber-attacks.1 According to some reports, the SNSI initiative was not aimed at blocking or monitoring Sci-Hub,2 but the agenda of the SNSI webinar explicitly included a presentation about Sci-Hub.

Critics of the initiative have pointed out the absence of evidence for some of the claims made by SNSI (from connections between Sci-Hub and Russian intelligence agencies to the use of Sci-Hub to steal passwords used to access personal records or other databases).3 Claims and counterclaims are difficult to adjudicate. What is clear, however, is that the idea of academic libraries acquiescing to the deployment of software that monitors the behavior of their patrons and collects data with no conditions on what is collected, why, and for what use flies in the face of both the long-held privacy expectations of library users and academic freedom.

The SNSI presented an update at the STM (International Association of Science, Technical, and Medical Publishers) Conference in April 2021. The speakers reported seeing skepticism from the librarian community and indicated the need to reframe their message from “protecting the publishers” to “protecting library patrons.” Setting aside the communication strategy, which appeared inadequate even to the speakers, the real issue seems to be the healthy skepticism of librarians. Publishers appeared surprised and out of touch, particularly considering the data resale activities of some of the companies in their ranks.

Collection and Sale of Data

RELX and Thomson Reuters operate businesses that collect vast amounts of data on more than 1 billion people, with a particular focus on the United States. While individual researchers have been highlighting this issue for years (Sarah Lamdan at CUNY, in particular4), concern about these surveillance businesses began to register with the academic community in 2020 and early 2021. The Daily Bruin (the UCLA student newspaper) called for a boycott of RELX and Thomson Reuters for selling data to the US Immigration and Customs Enforcement Agency (ICE).5

The intersection of data sales and academic institutions poses both ethical and practical challenges. Many academic institutions believe their fundamental values clash with some government policies. This issue is not confined to the US: in many countries, academic institutions have in recent years clashed with governments or have been subdued and forced to acquiesce to government policies. Academic institutions should determine whether they want to be customers of, and do business with, companies that engage in activities that may be perfectly legal but clash with their values.

There is precedent for this: in 2007, Reed Elsevier (as RELX was called then) divested its arms shows business after repeated calls to do so from activists as well as from The Lancet, one of the Elsevier journals.6 At the time, Reed Elsevier pointed out that all these shows were legal activities, underscoring that it is legitimate to ask companies to renounce, on ethical grounds, business with legally sanctioned or governmental programs. It is also important to underscore that a sale is an unsatisfactory remedy in most cases. First, the sale price of a business is a function of expected future earnings, so selling a business is a way to collect a significant part of future profits from the business. In addition, simply empowering someone else to continue undertaking an objectionable activity does not solve the issue. If activists want Thomson Reuters and RELX to stop selling data to ICE, they should ask for those businesses to be closed altogether, particularly because it may be difficult for other companies to replace the data made available by the two companies.

In addition to the ethical issues posed by the ICE business of Thomson Reuters and RELX, there is a practical one. SNSI openly lobbies libraries to install tracking software; according to sources, RELX already does so. ScienceDirect links directly to the processing notice of ThreatMetrix (a RELX company) to describe how collected data are used: “As explained in our privacy policy, [we] collect information through the use of cookies or similar technologies. Security cookies and related technologies, such as those provided by ThreatMetrix, are used to maintain online security and protect our website against fraud and abuse.” There is no indication whether any of the information is made available to third parties.

The use of ThreatMetrix in ScienceDirect raises the question of whether the online activities of legal clinics in law schools conducted through LexisNexis tools are monitored, and whether data are sold to government agencies, jeopardizing the rights of individuals and associations being assisted by legal clinics. In the past, RELX has denied doing so—but the only way to ensure that RELX does not combine information from LexisNexis with other databases is to abandon contracts with controversial agencies and governments.

Risks and Inequities of Online Exam Proctoring Tools

Exam proctoring raises significant issues of privacy and ethics. The sudden mass transition to online learning during the Spring 2020 term raised the issue of how to conduct exams and other assessments for a vast number of students who had limited or no experience with distance learning. There was a rush to adopt online proctoring solutions that would allow an orderly conclusion of the semester for as many students as possible. These solutions were made available by both courseware publishers and independent companies, with little time to adopt sound rules to guard against violations of privacy and the risk of introducing biases that disfavor underrepresented and underprivileged communities.

Unsurprisingly, reports of issues quickly started to emerge,7 particularly regarding software singling out some categories of people, like minorities, students with certain medical conditions, and parents with young children who cannot be left alone. In addition, proctoring software is deeply intrusive into students’ personal computers and personal lives—intrusions that an increasing number of students are rightfully pushing back against.8

The issues posed by online proctoring fall into two broad categories: privacy and equity. The privacy issues are significant and easy to grasp: private companies are collecting sensitive information about students, including their names, locations, and even physical appearance, exposing them to a number of risks. Since there have been several instances of surveillance technology being used for illegal purposes, adding a large group of students to the possible pool of victims appears ill-considered at best.

The issues related to inequity are even more complex. A major feature of online proctoring software is the use of algorithms to detect “suspicious” behavior that may indicate cheating or other academic integrity violations. This type of technology is problematic because it is bound to make mistakes, and those mistakes disproportionately harm students subject to the algorithm’s biases. While some online proctoring tools incorporate human review of flagged behavior, little is published about the protocols used, so there is no way for students or their advocates to know what behavior will trigger a human review or how the review question is formulated. A vast literature in psychology demonstrates that how a question is formulated affects how people respond, and that confirmation bias can play a role when determining guilt or innocence in forensic activities. It is possible that human reviewers will approach the evidence with a predetermined bias and find themselves more likely to decide against the student. Ultimately, the use of proctoring software poses serious ethical issues and, ideally, it should be phased out altogether. However, as long as it is deployed, and in the limited circumstances where institutions may believe its use is justified, the protocols used should be transparent and agreed upon with advocates for students and their families.

About the authors


Claudio Aspesi

A respected market analyst with over a decade of experience covering the academic publishing market, and leadership roles at Sanford C. Bernstein and McKinsey.

Scholarly Publishing and Academic Resources Coalition

SPARC is a coalition of academic and research libraries that work to enable the open sharing of research outputs and educational materials in order to democratize access to knowledge, accelerate discovery, and increase the return on our investment in research and education.