Zachary Loeb — Who Moderates the Moderators? On the Facebook Files

Illustration by Zachary Loeb

by Zachary Loeb

~

Speculative fiction is littered with fantastical tales warning of the dangers that arise when things get, to put it amusingly, too big. A researcher loses control of their experiment! A giant lizard menaces a city! Massive computer networks decide to wipe out humanity! A horrifying blob metastasizes as it incorporates all that it touches into its gelatinous self!

Such stories generally contain at least a faint hint of the absurd. Nevertheless, silly stories can still carry important lessons, and among the morals one can draw from such tales are these: big things keep getting bigger, big things can be very dangerous, and things that get very big very fast often wind up doing a fair amount of damage once what appeared to be controlled growth is revealed to be far from managed. It is not always a matter of sheer size, either; speculative fiction features no shortage of tragic characters who accidentally unleash some form of horror upon an unsuspecting populace because things grew too big for that sorry individual to control. The mad scientist has a sad corollary in the figure of the earnest scientist who wails “I meant well” while watching their creation slip free from their grasp.

Granted, if you want to see such a tale of the dangers of things getting too big, and of the desperate attempts to maintain some sort of control, you don’t need to go looking for speculative fiction.

You can just look at Facebook.

With its publication of the Facebook Files, The Guardian has pried back the smiling façade of Zuckerberg’s monster to reveal a creature that an overwhelmed staff is desperately trying to contain, with less than clear insight into how best to keep it under control. Drawing on a host of presentations and guidelines given to Facebook’s mysterious legion of content moderators, the Facebook Files provide insight into how the company determines what is and is not permitted on the site. It is a tale littered with details of the desperate attempt to screen material that is uploaded at a furious rate, with moderators often having only a matter of seconds in which to decide whether or not something is permitted. These leaks are well worth considering, as they expose the guidelines moderators use when judging whether content truly qualifies as revenge porn, child abuse, animal abuse, self-harm, unacceptable violence, and more. At the very least, the Facebook Files are yet another reminder of the continuing validity of Erich Fromm’s wise observation:

What we use is not ours simply because we use it. (Fromm 2001, 225)

In considering the Facebook Files it is worthwhile to recognize that the moderators occupy a peculiar place in this story – they are not really the villains. The people working as actual Facebook moderators are likely not the people who developed these guidelines; in truth, they probably were not even consulted. Nor are the moderators the high-profile Facebook executives espousing techno-utopian ideologies in front of packed auditoriums. To put it plainly, Mark Zuckerberg is not checking to see whether the thousands of photos being uploaded every second fit within the guidelines. In other words, having a measure of sympathy for the Facebook moderators who spend their days judging a mountain of (often disturbing) content is not the same thing as having any sympathy for Facebook (the company) or for its figureheads. Furthermore, Facebook has already automated a fair amount of the moderating process, and it is more than likely that Facebook would love to ditch all of its human moderators in favor of an algorithm. Given the rate at which it expects them to work, it seems that Facebook already thinks of its moderators as little more than cogs in its vast apparatus.

That last part points to one of the reasons why the Facebook Files are so interesting: they provide a very revealing glimpse of the type of morality that a machine might be programmed to follow. The Facebook Files – indeed, the very idea of Facebook moderators – are a massive hammer that smashes to bits the idea that technological systems are somehow neutral, for they put into clear relief the ways in which people are involved in shaping the moral guidelines to which a technological system adheres. The case of what is and is not allowed on Facebook is a story, playing out in real time, of a company (staffed by real live humans) trying to structure the morality of a technological space. Even once all of this moderating is turned over to an algorithm, these files will serve as a reminder that the system is acting in accordance with a set of values and views that were programmed into it by people. And this whole tale of Facebook’s attempts to moderate sensitive and disturbing content points to the fact that, as many a trained ethicist will attest, moral matters are often genuinely complex – which is a challenge for Facebook, as algorithms tend to do better with “yes” and “no” than with questions that devolve into lengthy philosophical argumentation.

Thus, while a blanket “zero nudity” policy might be crude, prudish, and simplistic, it still represents a fairly easy way to separate allowed content from forbidden content. Similarly, a “zero violence” policy runs the risk of hiding the consequences of violence, masking the gruesome realities of war, and covering up a lot of important history – but it makes it easy to say “no videos of killings or self-harm are allowed at all.” Likewise, a strict “absolutely no threats of any sort” policy would mean that “someone shoot [specific political figure]” and “let’s beat up people with fedoras” would both be banned. By trying to parse these things, Facebook has placed its moderators in tricky territory – and the guidelines it provides them are not necessarily the clearest. Had Facebook maintained a strict “black and white” version of what is and is not permitted, it could have avoided the swamp through which it is now trudging with mixed results. Again, it is fair to have some measure of sympathy for the moderators here – they did not set the rules, but they will certainly be blamed, shamed, and likely fired for any failure to adhere to the letter of Facebook’s confusing law.

Part of the problem Facebook has to contend with is clearly the matter of free speech. Some will certainly cry foul at any attempt by Facebook to moderate content, denouncing such moderation as censorship, while others will scoff at the very idea of free speech as applied to Facebook, since it is a corporate platform and all speech that takes place on the site therefore already exists in a controlled space. A person may live in a country where they have a government-protected right to free speech – but Facebook has no such obligation to its users. There is nothing preventing Facebook from radically changing its policies about what is permissible. If Facebook decided tomorrow that no content related to, for example, cookies was to be permitted, it could make and enforce that decision. And the company could make that decision regarding things much less absurd than cookies – if Facebook wanted to ban any content related to a given protest movement, it would be within its rights to do so (which is not to say that this would be good, but that it would be possible). In short, if you use Facebook you use it in accordance with its rules; the company does not particularly care what you think. And if you run afoul of one of its moderators you may well find your account suspended – you can cry “free speech,” but Facebook will retort with “you agreed to our terms of use; Facebook is a private online space.” Here, though, a person may fire back that in the 21st century social media platforms like Facebook have, to a large extent, become a sort of new public square.

And, yet again, that is part of the reason why this is all so tricky.

Facebook clearly wants to be the new “public square” – it wants to be the space where people debate politics, where candidates hold forums, and where activists organize. Yet it wants all of these “public” affairs to take place within its own enclosed “private” space. There is no real democratic control of Facebook: the company may try to train its moderators to respect various local norms, but the people from those localities get no voice in determining what is and isn’t acceptable. Facebook is trying desperately to have it both ways – it wants to be the key space of the public sphere while simultaneously pushing back against any attempt to regulate it or subject it to increased public oversight. As lackluster and problematic as the guidelines revealed by the Facebook Files are, they still demonstrate that Facebook is trying (with mixed results) to regulate itself so that it can avoid being subject to further regulation. Thus, free speech is both a sword and a shield for Facebook. The company hides behind the shield of “free speech” when accused of hosting misogyny and xenophobia, even as it pulls out its massive (and frequently updated) terms of service agreement to slash users with the blade that on the social network there is no free speech, only Facebook speech. The speech Facebook is most concerned with is its own, and it will say and do whatever it needs to in order to protect itself from constraints.

Yet, to bring it back to the points with which this piece began, many of the issues that the Facebook Files reveal have a great deal to do with scale. Sorting out the nuance of an image or a video can take longer than the paltry few seconds most moderators are able to allot to each one. Furthermore, some of the judgments Facebook asks its moderators to make have less to do with morality or policy than with the sheer question of how a moderator can possibly know whether something is in accordance with the policies at all. How does a moderator who is not based in a community really know whether something meets that community’s standards? Facebook is hardly some niche site with a small user base and a devoted cadre of moderators committed to keeping the peace – its moderators are overworked members of the cybertariat (a term borrowed from Ursula Huws), and the community they serve is Facebook, not the communities from which its users hail. Moreover, some of the more permissive policies – such as allowing images of animal abuse on the premise that they may help alert the authorities – seem more like an excuse than an acceptance of responsibility. Facebook has grown quite large, and it continues to grow. What it is experiencing is not so much a case of “growing pains” as a case of the pains inflicted on a society when something is allowed to grow out of control. Every week, it seems, Facebook becomes more and more of a monopoly – yet there appears to be little chance that it will be broken up (and it is unclear what that would even mean or look like).

Facebook is the researcher’s science project that is always about to get too big and slip out of control, and the Facebook Files reveal the company’s frantic attempt to keep the beast from throwing off its shackles altogether. The danger, from Facebook’s standpoint, is that – as in all works where something gets too big and breaks loose – the point at which it loses control is the point at which governments step in to try to restore order. What that would look like in this case is quite unclear, and while the point is not to romanticize regulation, the Facebook Files raise the question of who is currently doing the regulating, and how. That Facebook is having such a hard time moderating content on the site is itself a fairly convincing argument that when a site gets too big, the task of carefully moderating it becomes nearly impossible.

To deny that Facebook has significant power and influence is to deny reality. While it is true that Facebook can only set policy for the fiefdoms it controls, it is worth recognizing that many people spend a heck of a lot of time ensconced within those fiefdoms. The Facebook Files are not exactly a shocking revelation that Facebook desperately needs serious societal oversight – what is shocking is that they reveal Facebook has been allowed to become so big and so powerful without any such oversight. The Guardian’s article leading into the Facebook Files quotes Monika Bickert, Facebook’s head of global policy management, as saying that Facebook is:

“not a traditional technology company. It’s not a traditional media company. We build technology, and we feel responsible for how it’s used.”

But a question lingers as to whether these policies really reflect responsibility in any meaningful sense. Facebook may not be a “traditional” company in many respects, but one area in which it remains quite hitched to tradition is in holding to a value system where what matters most is the preservation of the corporate brand. To put it slightly differently, there are few things more “traditional” than the monopolistic vision of total technological control reified in Facebook’s every move. In his classic work on the politics of technology, The Whale and the Reactor, Langdon Winner emphasized the need to seriously consider the type of world that technological systems help to construct. As he put it:

We should try to imagine and seek to build technical regimes compatible with freedom, social justice, and other key political ends…If it is clear that the social contract implicitly created by implementing a particular generic variety of technology is incompatible with the kind of society we deliberately choose—that is, if we are confronted with an inherently political technology of an unfriendly sort—then that kind of device or system ought to be excluded from society altogether. (Winner 1989, 55)

The Facebook Files reveal the type of world that Facebook is working tirelessly to build: a world in which Facebook is even larger and even more powerful, a world in which Facebook sets the rules and regulations, in which Facebook says “trust us” and people are expected to obediently go along.

Yes, Facebook needs content moderators, but it also seems long past due for there to be people who moderate Facebook. And those people should not be cogs in the Facebook machine.

_____

Zachary Loeb is a writer, activist, librarian, and terrible accordion player. He earned his MSIS from the University of Texas at Austin, an MA from the Media, Culture, and Communications department at NYU, and is currently working towards a PhD in the History and Sociology of Science department at the University of Pennsylvania. His research areas include media refusal and resistance to technology, ideologies that develop in response to technological change, and the ways in which technology factors into ethical philosophy – particularly with regard to the ways in which Jewish philosophers have written about ethics and technology. Using the moniker “The Luddbrarian,” Loeb writes at the blog Librarian Shipwreck, where an earlier version of this post first appeared, and is a frequent contributor to The b2 Review Digital Studies section.

_____

Works Cited

  • Fromm, Erich. 2001. The Fear of Freedom. London: Routledge Classics.
  • Winner, Langdon. 1989. The Whale and the Reactor. Chicago: The University of Chicago Press.
