a review of Natasha Lushetich, ed. Big Data—A New Medium? (Routledge, 2021)
by Zachary Loeb
When discussing the digital, conversations can quickly shift towards talk of quantity. Just how many images are being uploaded every hour, how many meticulously monitored purchases are being made on a particular e-commerce platform every day, how many vehicles are being booked through a ride-sharing app at 3 p.m. on a Tuesday, how many people are streaming how many shows/movies/albums at any given time? The specific answers to “how much?” and “how many?” will obviously vary depending upon the rest of the question, yet if one wanted to give a general response across these questions it would likely be fair to answer with some version of “a heck of a lot.” Yet from this flows another, perhaps more complicated and significant, question: given the massive amount of information being generated by seemingly every online activity, where does all of that information actually go, and how is it rendered usable and useful? To this the simple answer may be “big data,” but that answer in turn just raises the question of what we mean by “big data.”
“Big data” denotes the point at which data begins to be talked about in terms of scale, not merely gigabytes but zettabytes. And, to be clear, a zettabyte represents a trillion gigabytes—and big data is dealing with zettabytes, plural. Beyond the sheer scale of the quantity in question, considering big data “as process and product” involves a consideration of “the seven Vs: volume” (the amount of data previously generated and newly generated), “variety” (the various sorts of data being generated), “velocity” (the highly accelerated rate at which data is being generated), “variability” (the range of types of information that make up big data), “visualization” (how this data can be visually represented to a user), “value” (how much all of that data is worth, especially once it can be processed in a useful way), and “veracity” (3) (the reliability, trustworthiness, and authenticity of the data being generated). In addition to these “seven Vs” there are also the “three Hs: high dimension, high complexity, and high uncertainty” (3). Granted, “many of these terms remain debatable” (3). Big data is both “process and product” (3), and its applications range from undergirding the sorts of real-time analysis that make it possible to detect viral outbreaks as they are happening, to the directions app that suggests an alternative route before you hit traffic, to the recommendation software (be it banal or nefarious) that forecasts future behavior based on past actions.
To the extent that discussions around the digital generally focus on the end results of big data, the means remain fairly occluded both from public view and from many of the discussants. And while big data has largely been accepted by some as an essential aspect of our digital lives, for many others it remains highly fraught.
As Natasha Lushetich notes, “in the arts and (digital) humanities…the use of big data remains a contentious issue not only because data architectures are increasingly determining classificatory systems in the educational, social, and medical realms, but because they reduce political and ethical questions to technical management” (4). And it is this contentiousness that is at the heart of Lushetich’s edited volume Big Data—A New Medium? (Routledge, 2021). Drawing together scholars from a variety of disciplines ranging across “the arts and (digital) humanities,” this book moves beyond an analysis of what big data is to a complex consideration of what big data could be (and may currently be in the process of becoming). In engaging with the perils and potentialities of big data, the book (as its title suggests) wrestles with the question of whether big data can be seen as constituting “a new medium.” Through engaging with big data as a medium, the contributors to the volume grapple not only with how big data “conjugates human existence” but also how it “(re)articulates time, space, the material and immaterial world, the knowable and the unknowable; how it navigates or alters, hierarchies of importance” and how it “enhances, obsolesces, retrieves and pushes to the limits of potentiality” (8). Across four sections, the contributors grapple with big data in terms of knowledge and time, use and extraction, cultural heritage and memory, as well as people.
“Patterning Knowledge and Time” begins with a chapter by Ingrid M. Hoofd that places big data in the broader trajectory of the university’s attempt to make the whole of the world knowable. Considering how “big data renders its object of analysis simultaneously more unknowable (or superficial) and more knowable (or deep)” (18), Hoofd’s chapter examines how big data replicates and reinforces a circularity whereby what becomes legitimated as knowable is precisely that which can be known through the university’s (and big data’s) techniques. Following Hoofd, Franco “Bifo” Berardi provocatively engages with the power embedded in big data, treating it as an attempt to assert computerized control over a chaotic future by forcing it into a predictable model. Here big data is treated as a potential constraint wherein “the future is no longer a possibility, but the implementation of a logical necessity inscribed in the present” (43), as participation in society becomes bound up with making one’s self and one’s actions legible and analyzable to the very systems that enclose one’s future horizons. Shifting towards the visual and the environmental, Abelardo Gil-Fournier and Jussi Parikka consider the interweaving of images and environments and how data impacts this. As Gil-Fournier and Parikka explore, as a result of developments in machine learning and computer vision “meteorological changes” are increasingly “not only observable but also predictable as images” (56).
The second part of the book, “Patterning Use and Existence,” starts with Btihaj Ajana reflecting on the ways in which “surveillance technologies are now embedded in our everyday products and services” (64). By juxtaposing the biometric control of refugees with the quantified-self movement, Ajana explores the datafication of society and the differences (as well as similarities) between willing participation and forced participation in regimes of surveillance of the self. Highlighting a range of well-known gig-economy platforms (such as Uber, Deliveroo, and Amazon Mechanical Turk), Tim Christiaens examines the ways that “the speed of the platform’s algorithms exceeds the capacities of human bodies” (81). While offering a thorough critique of the inhuman speed imposed by gig-economy platforms and their algorithms, Christiaens also offers a hopeful argument for the possibility that, by making their software open source, some of these gig platforms could “become a vehicle for social emancipation instead of machinic subjugation” (90). While aesthetic and artistic considerations appear in earlier chapters, Lonce Wyse’s chapter pushes fully into this area by looking at the ways that deep learning systems create the sorts of works of art “that, when recognized in humans, are thought of as creative” (95). Wyse provides a rich yet succinct examination of how these systems function while highlighting the sorts of patterns that emerge (sometimes accidentally) in the process of training them.
At the outset of the book’s third section, “Patterning Cultural Heritage and Memory,” Craig J. Saper approaches the magazine The Smart Set as an object of analysis and proceeds to zoom in and zoom out to reveal what is revealed and what is obfuscated at different scales. Highlighting that “one cannot arbitrarily discount or dismiss particular types of data, big or intimate, or approaches to reading, distant or close,” Saper’s chapter demonstrates how “all scales carry intellectual weight” (124). Moving away from the academic and the artist, Nicola Horsley’s chapter reckons with the work of archivists and the ways in which their intellectual labor and the tasks of their profession have been challenged by digital shifts. While archival training teaches archivists that “the historical record, on which collective memory is based, is a process not a product” (140), a lesson archivists seek to convey in their interactions with researchers, Horsley considers the ways in which the shift away from the physical archive and towards the digital archive (wherein a researcher may never directly interact with an archivist or librarian) means this “process” risks going unseen. From the archive to the work of art, Natasha Lushetich and Masaki Fujihata’s chapter explores Fujihata’s project BeHere: The Past in the Present and how augmented reality opens up the space for new artistic experience and challenges how individual memory is constructed. Through its engagement with “images obtained through data processing and digital frottage,” the BeHere project reveals “new configurations of machinically (rather than humanly) perceived existents” and thus can “shed light on that which eludes the (naked) human eye” (151).
The fourth and final section of the volume begins with Dominic Smith’s exploration of the aesthetics of big data. While referring back to the “seven Vs” of big data, Smith argues that to imagine big data as a “new medium” requires considering “how we make sense of data” in regards to both “how we produce it” and “how we perceive it” (164), a matter Smith explores through an analysis of the “surfaces and depths” of oceanic images. Though big data is closely connected with sheer scale (hence the “big”), Mitra Azar observes that “it is never enough as it is always possible to generate new data and make more comprehensive data sets” (180). Tangling with this in a visual register, Azar contrasts the cinematic point of view with that of the big-data-enabled “data double” of the individual (which is meant to stand in for that user). Considering several of his own artistic installations—Babel, Dark Matter, and Heteropticon—Simon Biggs examines the ways in which big data reveals “the everyday and trivial and how it offers insights into the dense ambient noise that is our daily lives” (192). In contrast to treating big data as a revelator of the sublime, Biggs discusses big data’s capacity to show “the infra-ordinary” and to demonstrate the value of seemingly banal daily details. The book concludes with Warren Neidich’s speculative gaze towards what the future of big data might portend, couched in a belief that “we are at the beginning of a transition from knowledge-based economics to a neural or brain-based economy” (207). Surveying current big data technologies and the trajectories they may suggest, Neidich forecasts “a gradual accumulation of telepathic technoceuticals” such that “at some moment a critical point might be reached when telepathy could become a necessary skill for successful adaptation…similar to being able to read in today’s society” (218).
In the introduction to the book, Natasha Lushetich grounds the discussion in a recognition that “it is also important to ask how big data (re)articulates time, space, the material and immaterial world, the knowable and the unknowable; how it navigates or alters, hierarchies of importance” (8), and over the course of this fascinating and challenging volume, the many contributors do just that.
The term big data captures the way in which massive troves of digitally sourced information are made legible and understandable. Yet one of the challenges of discussing big data is trying to figure out a way to make big data itself legible and understandable. In discussions around the digital, big data is often gestured at rather obliquely as the way to explain a lot of mysterious technological activity in the background. We may not find ourselves capable, for a variety of reasons, of prying open the various black boxes of a host of different digital systems, but stamped in large letters on the outside of those boxes are the words “big data.” When shopping online or using a particular app, a user may be aware that the information being gathered from their activities is feeding into big data and that the recommendations being promoted to them come courtesy of the same. Or they may be obliquely aware that there is some sort of connection between the mystery-shrouded algorithms and big data. Or the very evocation of “big,” when twinned with a recognition of surveillance technologies, may serve as a discomforting reminder of “big brother.” Or “big data” might simply sound like a non-existent episode of Star Trek: The Next Generation in which Lieutenant Commander Data is somehow turned into a giant. All of which is to say that, though big data is not a new matter, the question of how to think about it (which is not the same as how to use and be used by it) remains a challenging issue.
With Big Data—A New Medium?, Natasha Lushetich has assembled an impressive group of thinkers to engage with big data in a novel way. By raising the question of big data as “a new medium,” the contributors shift the discussion away from considerations focused on surveillance and algorithms to wrestle with the ways that big data might be similar to, and distinct from, other mediums. While this shift does not represent a rejection of, or a move to ignore, important matters such as surveillance, the focus on big data as a medium raises a different set of questions. What are the aesthetics of big data? As a medium, what are the affordances of big data? And what does it mean for other mediums that, in the digital era, so many of them are themselves being subsumed by big data? After all, many of the older mediums that theorists have grown so accustomed to discussing have undergone some not insignificant changes as a result of big data. And yet to engage with big data as a medium also opens up a potential space for engaging with big data that does not treat it as being wholly captured and controlled by large tech firms.
The contributors to the volume do not seem to be fully in agreement with one another about whether big data represents poison or panacea, but the chapters are clearly speaking to one another instead of shouting over each other. There are certainly some contributions to the book, notably Berardi’s, with its evocation of a “new century suspended between two opposite polarities: chaos and automaton” (44), that seem a bit more pessimistic. Other contributors, such as Christiaens, engage with the unsavory realities of contemporary data-gathering regimes while envisioning the ways that these can be repurposed to serve users instead of large companies. And such optimistic and pessimistic assessments come up against multiple contributions that eschew such positive/negative framings in favor of an artistically minded aesthetic engagement with what it means to treat big data as a medium for the creation of works of art. Taken together, the chapters in the book provide a wide-ranging assessment of big data, one which is grounded in larger discussions around matters such as surveillance and algorithmic bias, but which pushes readers to think of big data beyond those established frameworks.
As an edited volume, one of the major strengths of Big Data—A New Medium? is the way it brings together perspectives from such a variety of fields and specialties. As part of Routledge’s “studies in science, technology, and society” series, the volume demonstrates the sort of interdisciplinary mixing that makes STS such a vital space for discussions of the digital. Granted, this very interdisciplinary richness can be as much burden as benefit, as some readers will wish there had been slightly more representation of their particular subfield, or that the scholarly techniques of a particular discipline had seen greater use. Case in point: Horsley’s contribution will be of great interest to those approaching this book from the world of libraries and archives (and information schools more generally), and some of those same readers will wish that other chapters in the book had been equally attentive to the work done by archive professionals. Similarly, those who approach the book from fields more grounded in historical techniques may wish that more of the authors had spent more time engaging with “how we got here” instead of focusing so heavily on the exploration of the present and the possible future. Of course, these are perennial challenges with edited interdisciplinary volumes, and it is a major credit to Lushetich as an editor that this volume provides readers from so many different backgrounds with so much to mull over. Beyond presenting numerous perspectives on the titular question, the book is also an invitation to artists and academics to join in discussion about that question.
Those who are broadly interested in discussions around big data will find much in this volume of significance, and will likely find their own thinking pushed in novel directions. That being said, this book will likely be most productively read by those who are already somewhat conversant in debates around big data, the digital humanities, the arts, and STS more generally. While the contributors are consistently careful in clearly defining their terms and referencing the theorists from whom they are drawing, from Benjamin to Foucault to Baudrillard to Marx to Deleuze and Guattari (to name but a few), they couch much of their commentary in theory, and a reader of this volume will be best able to engage with these chapters if they have at least some passing familiarity with those theorists. Many of the contributors are also clearly engaging with arguments made by Shoshana Zuboff in The Age of Surveillance Capitalism, and this book can be very productively read as both a critique of and a complement to Zuboff’s tome. Academics in and around STS, and artists who incorporate the digital into their practice, will find that this book makes a worthwhile intervention into current discourse around big data. And though the book seems to assume a fairly academically engaged readership, it will certainly work well in graduate seminars (or advanced undergraduate classrooms)—many of the chapters will stand quite well on their own, though much of the book’s strength is in the way the chapters work in tandem.
One of the claims that is frequently made about big data is that—for better or worse—it will allow us to see the world from a fresh perspective. And what Big Data—A New Medium? does is allow us to see big data itself from a fresh perspective.
Zachary Loeb earned his MSIS from the University of Texas at Austin, an MA from the Media, Culture, and Communication department at NYU, and is currently a PhD candidate in the History and Sociology of Science department at the University of Pennsylvania. Loeb works at the intersection of the history of technology and disaster studies, and his research focuses on the ways that complex technological systems amplify risk, as well as the history of technological doom-saying. He is working on a dissertation on Y2K. Loeb writes at the blog Librarianshipwreck, and is a frequent contributor to The b2o Review Digital Studies section.