Seeing Evil Differently
From the Series: Evil Infrastructures
Throughout 2015, the German public witnessed hatred in its diverse forms and expressions taking over comment sections of German online newspapers, as well as public and personal social media websites. Online news platforms closed their comment sections because they could not cope with the spread of what German media soon began labeling as online hate. Observers linked the massive surge of openly xenophobic and racist content mainly to an influx of refugees into the country in the summer of 2015. In particular, the social media platform Facebook became—especially with its German users—the online infrastructure for spreading hatred, fake news, and lies about refugees and migrants, as well as far-right nationalist propaganda.
Many Facebook users felt frustrated by the platform’s lack of response to the problem. They criticized the company for not fulfilling its legal obligation to delete unlawful posts that users had reported, arguing that national laws proved largely ineffectual in supporting users through the complaint process. In this way, exploring the evolution of online hate, and how it played out in Germany in 2015, also entails following stories about ineffective community guidelines, questionable moderation policies, overwhelmed teams of moderators, helpless users, and powerless German authorities.
Many countries have laws for protecting people against discrimination, racism, and exclusion. But what happens when these national laws fail in the online world? What happens when they generate different expectations and aspirations than those supported by the policies and regulatory systems Internet corporations have created to counter such hatred? Scholarship on online harassment and defamation has examined the wide range of regulatory regimes, policies, and tools (provided by governments and corporations) that may come into play when attempting to govern these problems both online and offline. Here, however, I am concerned with the relationship between the different legal or regulatory regimes involved in the fight against online hate. I want to draw ethnographic attention to the places where tensions between offline and online hate speech policies arise, where gaps between the promise and the performance of their regulatory regimes become visible—or even evil, as I will demonstrate with examples from my field site in Germany.
The Facebook situation reached its peak in September 2015, when the German government stepped in and called on the company’s European head office in Dublin (as well as on its German subsidiary) to take action against the spread of hatred and racism on its site. By that time, the hateful commentaries perpetrated online had arguably gone so far as to create a fertile environment for the effusion of hate speech and violence beyond the online world, resulting in physical attacks on refugee camps and shelters.
Against this backdrop, the German justice minister Heiko Maas demanded that Facebook urgently review its policies on hate messages. He wanted the company to enforce its community standards more effectively against abusive users. The episode illustrates what the media scholars Malte Ziewitz and Christian Pentzold (2014, 314) have called a “clash of governing regimes,” pointing to infrastructural and cultural differences between, in their case example, modes of self-governance at Twitter and English law. In my own fieldwork, the controversies between Facebook and the German government over the company’s community standards and moderation policies displayed similar “clashes” or incongruities. They concerned in particular the gaps between the promise and the performance of policies, but also cultural and national differences in understanding and defining what hate speech means.
In my research, I have found these gaps between the different legal infrastructures, and how they became a matter of critical political and public concern during the course of 2015, a productive object of ethnographic inquiry. A focus on the different legal projections and aspirations enables not just an analysis of the legal and political challenges in countering supposed evil online. It can also provide culturally and historically specific accounts of how legal and political responsibilities and governance regimes are reconfigured or shifted in the process. These accounts might further tell us a great deal about changing aspirations, anticipations, and imaginations of rights and order in the online world. Interestingly, to come back once more to the German case, Facebook users had been sending complaints about the failure to ban abusive content on the platform to the Justice Ministry: they used one legal infrastructure (of government) to complain about the dysfunction of another (a corporate one).
Since the earliest days of the Internet, the legal infrastructures of the online world (measures now increasingly provided by Internet companies) have been a highly contested (techno)political terrain among governments, corporations, civil society organizations, advocacy groups, and Internet activists. They represent a powerful resource: they can, in Antina von Schnitzler’s words, serve as “ethical objects, conduits of power, or [can] be wielded as political tools.”
Then, too, policing so-called online evil has its own (evil) twists: the European Union’s new code of conduct against hate speech, signed by four major IT companies (Microsoft, Twitter, Facebook, and Google), has raised serious concerns among free-speech advocates about political censorship. Even just discussing the problem of online harassment can provoke major controversies and attacks against participants, as demonstrated by the prehistory of the Online Harassment Summit organized by the South by Southwest festival in Austin, Texas, in 2016. Indeed, many scholars and activists who engage with these issues have themselves become targets of online hate.
Ziewitz, Malte, and Christian Pentzold. 2014. “In Search of Internet Governance: Performing Order in Digitally Networked Environments.” New Media and Society 16, no. 2: 306–22.