Big Data & Society (Jul 2021)

“The revolution will not be supervised”: Consent and open secrets in data science

  • Coleen Carrigan
  • Madison W Green
  • Abibat Rahman-Davies

DOI: https://doi.org/10.1177/20539517211035673
Journal volume & issue: Vol. 8

Abstract

The social impacts of computer technology are often glorified in public discourse, but there is growing concern about its actual effects on society. In this article, we ask: how does “consent” as an analytical framework make visible the social dynamics and power relations in the capture, extraction, and labor of data science knowledge production? We hypothesize that a form of boundary violation in data science workplaces—gender harassment—may correlate with the ways humans’ lived experiences are extracted to produce Big Data. The concept of consent offers a useful way to draw comparisons between gender relations in data science and the means by which machines are trained to learn and reason. Inspired by how Big Tech leaders describe unsupervised machine learning, and by the “revolutionary” rhetoric they co-opt to do so, we introduce a concept we call “techniques of invisibility.” Techniques of invisibility are the ways in which an extreme imbalance between exposure and opacity, demarcated along fault lines of power, is fabricated and maintained, closing down the possibility of bidirectional transparency in the production and applications of algorithms. Further, techniques of invisibility, which we group into two categories—epistemic injustice and the Brotherhood—include acts of subjection by powerful actors in data science designed to quell resistance to exploitative relations. These techniques may be useful in making further connections between epistemic violence, sexism, and surveillance, and in sussing out persistent boundary violations in data science, rendering the social in data science visible and open to scrutiny and debate.