Algorithmic pollution: Making the invisible visible

Olivera Marjanovic, Dubravka Cecez-Kecmanovic, Richard Vidgen

Research output: Contribution to journal › Article › peer-review

43 Citations (Scopus)

Abstract

In this article, we focus on the growing evidence of unintended harmful societal effects of automated algorithmic decision-making in transformative services (e.g. social welfare, healthcare, education, policing and criminal justice), for individuals, communities and society at large. Drawing from the long-established research on social pollution, in particular its contemporary ‘pollution-as-harm’ notion, we put forward a claim – and provide evidence – that these harmful effects constitute a new type of digital social pollution, which we name ‘algorithmic pollution’. Words do matter, and by using the term ‘pollution’, not as a metaphor or an analogy, but as a transformative redefinition of the digital harm performed by automated algorithmic decision-making, we seek to make it visible and recognized. By adopting a critical performative perspective, we explain how the execution of automated algorithmic decision-making produces harm and thus performs algorithmic pollution. Recognition of the potential for unintended harmful effects of algorithmic pollution, and their examination as such, leads us to articulate the need for transformative actions to prevent, detect, redress, mitigate and educate about algorithmic harm. These actions, in turn, open up new research challenges for the information systems community.
Original language: English
Pages (from-to): 391-408
Number of pages: 18
Journal: Journal of Information Technology
Volume: 36
Issue number: 4
DOIs
Publication status: Published - Dec 2021
Externally published: Yes

Keywords

  • Societal effects of AI and algorithms
  • transformative services
  • harmful effects
  • algorithmic pollution
  • critical performative perspective
