By Katrin Bennhold
A country taps its past as it leads the way on one of the most pressing issues facing modern democracies: how to regulate the world’s biggest social network.
BERLIN — Security is tight at this brick building on the western edge of Berlin. Inside, a sign warns: “Everybody without a badge is a potential spy!”
Spread over five floors, hundreds of men and women sit in rows of six scanning their computer screens. All have signed nondisclosure agreements. Four trauma specialists are at their disposal seven days a week.
They are the agents of Facebook. And they have the power to decide what is free speech and what is hate speech.
This is a deletion center, one of Facebook’s largest, with more than 1,200 content moderators. They are cleaning up content — from terrorist propaganda to Nazi symbols to child abuse — that violates the law or the company’s community standards.
Germany, home to a tough new online hate speech law, has become a laboratory for one of the most pressing issues for governments today: how and whether to regulate the world’s biggest social network.
Around the world, Facebook and other social networking platforms are facing a backlash over their failures to safeguard privacy, disinformation campaigns and the digital reach of hate groups.
In India, seven people were beaten to death after a false viral message on the Facebook subsidiary WhatsApp. In Myanmar, violence against the Rohingya minority was fueled, in part, by misinformation spread on Facebook. In the United States, Congress called Mark Zuckerberg, Facebook’s chief executive, to testify about the company’s inability to protect its users’ privacy.
As the world confronts these rising forces, Europe, and Germany in particular, have emerged as the de facto regulators of the industry, exerting influence beyond their own borders. Berlin’s digital crackdown on hate speech, which took effect on Jan. 1, is being closely watched by other countries. And German officials are playing a major role behind one of Europe’s most aggressive moves to rein in technology companies, strict data privacy rules that take effect across the European Union on May 25 and are prompting global changes.
“For them, data is the raw material that makes them money,” said Gerd Billen, secretary of state in Germany’s Ministry of Justice and Consumer Protection. “For us, data protection is a fundamental right that underpins our democratic institutions.”
Germany’s troubled history has placed it on the front line of a modern tug-of-war between democracies and digital platforms.
In the country of the Holocaust, the commitment against hate speech is as fierce as the commitment to free speech. Hitler’s “Mein Kampf” is only available in an annotated version. Swastikas are illegal. Inciting hatred is punishable by up to five years in jail.
But banned posts, pictures and videos have routinely lingered on Facebook and other social media platforms. Now companies that systematically fail to remove “obviously illegal” content within 24 hours face fines of up to 50 million euros.
The deletion center predates the legislation, but its efforts have taken on new urgency. Every day content moderators in Berlin, hired by a third-party firm and working exclusively on Facebook, pore over thousands of posts flagged by users as upsetting or potentially illegal and make a judgment: Ignore, delete or, in particularly tricky cases, “escalate” to a global team of Facebook lawyers with expertise in German regulation.
Some decisions to delete are easy. Posts about Holocaust denial and genocidal rants against particular groups like refugees are obvious ones for taking down.
Others are less so. On Dec. 31, the day before the new law took effect, a far-right lawmaker reacted to an Arabic New Year’s tweet from the Cologne police, accusing them of appeasing “barbaric, Muslim, gang-raping groups of men.”
The request to block a screenshot of the lawmaker’s post wound up in the queue of Nils, a 35-year-old agent in the Berlin deletion center. His judgment was to let it stand. A colleague thought it should come down. Ultimately, the post was sent to lawyers in Dublin, London, Silicon Valley and Hamburg. By the afternoon it had been deleted, prompting a storm of criticism about the new legislation, known here as the “Facebook Law.”
“A lot of stuff is clear-cut,” Nils said. Facebook, citing his safety, did not allow him to give his surname. “But then there is the borderline stuff.”
Complicated cases have raised concerns that the threat of the new rules’ steep fines and 24-hour window for making decisions encourage “over-blocking” by companies, a sort of defensive censorship of content that is not actually illegal.
The far-right Alternative for Germany party, a noisy and prolific user of social media, has been quick to proclaim “the end of free speech.” Human rights organizations have warned that the legislation is inspiring authoritarian governments to copy it.
Others argue that the law simply gives a private company too much authority to decide what constitutes illegal hate speech in a democracy — an argument that Facebook, which favored voluntary guidelines, itself made against the law.
“It is perfectly appropriate for the German government to set standards,” said Elliot Schrage, Facebook’s vice president of communications and public policy. “But we think it’s a bad idea for the German government to outsource the decision of what is lawful and what is not.”
Richard Allan, Facebook’s vice president for public policy in Europe, leaving a meeting at Germany’s justice ministry in March. Credit: Sean Gallup/Getty Images
Richard Allan, Facebook’s vice president for public policy in Europe and the leader of the company’s lobbying effort against the German legislation, put it more simply: “We don’t want to be the arbiters of free speech.”
German officials counter that social media platforms are the arbiters anyway.
It all boils down to one question, said Mr. Billen, who helped draw up the new legislation: “Who is sovereign? Parliament or Facebook?”
Learning From (German) History
When Nils applied for a job at the deletion center, the first question the recruiter asked him was: “Do you know what you will see here?”
Nils has seen it all. Child torture. Mutilations. Suicides. Even murder: He once saw a video of a man cutting a heart out of a living human being.
And then there is hate.
“You see all the ugliness of the world here,” Nils said. “Everyone is against everyone else. Everyone is complaining about that other group. And everyone is saying the same horrible things.”
The issue is deeply personal for Nils. He has a 4-year-old daughter. “I’m also doing this for her,” he said.
The center here is run by Arvato, a German service provider owned by the conglomerate Bertelsmann. The agents have a broad purview, reviewing content from a half-dozen countries. Those with a focus on Germany must know Facebook’s community standards and, as of January, the basics of German hate speech and defamation law.
“Two agents looking at the same post should come up with the same decision,” said Karsten König, who manages Arvato’s partnership with Facebook.
The Berlin center opened with 200 employees in 2015, as Germany was opening its doors to hundreds of thousands of migrants.
Anas Modamani, a Syrian refugee, posed with Chancellor Angela Merkel of Germany in Berlin in September 2015 and posted the image on Facebook. It instantly became a symbol of her decision to allow in hundreds of thousands of migrants.
Soon it also became a symbol of the backlash.
The image showed up in false reports linking Mr. Modamani to terrorist attacks in Brussels and on a Christmas market in Berlin. He sought an injunction against Facebook to stop such posts from being shared but eventually lost.
The arrival of nearly 1.4 million migrants in Germany has tested the country’s resolve to keep a tight lid on hate speech. The laws against illegal speech were long established, but before the new legislation their enforcement in the digital realm was scattershot.
Posts calling refugees rapists, Neanderthals and scum survived for weeks, according to jugendschutz.net, a publicly funded internet safety organization. Many were never taken down. Researchers at jugendschutz.net reported a tripling in observed hate speech in the second half of 2015.
Mr. Billen, the secretary of state in charge of the new law, was alarmed. In September 2015, he convened executives from Facebook and other social media sites at the justice ministry, a building that was once the epicenter of state propaganda for the Communist East. A task force for fighting hate speech was created. A couple of months later, Facebook and other companies signed a joint declaration, promising to “examine flagged content and block or delete the majority of illegal posts within 24 hours.”
But the problem did not go away. Over the 15 months that followed, independent researchers hired by the government twice posed as ordinary users and flagged illegal hate speech. In those two tests, they found that Facebook had deleted only 46 percent and 39 percent of the flagged posts, respectively.
“They knew that they were a platform for criminal behavior and for calls to commit criminal acts, but they presented themselves to us as a wolf in sheep’s clothing,” said Mr. Billen, a poker-faced civil servant with stern black-framed glasses.
By March 2017, the German government had lost patience and started drafting legislation. The Network Enforcement Law was born, setting out 21 types of content that are “manifestly illegal” and requiring social media platforms to act quickly.
Officials say early indications suggest the rules have served their purpose. Facebook’s rate of removing illegal hate speech in Germany rose to 100 percent over the past year, according to the latest spot check by the European Union.
Platforms must publish biannual reports on their efforts. The first is expected in July.
At Facebook’s Berlin offices, Mr. Allan acknowledged that under the earlier voluntary agreement, the company had not acted decisively enough at first.
“It was too little and it was too slow,” he said. But, he added, “that has changed.”
He cited another independent report for the European Commission from last summer that showed Facebook was by then removing 80 percent of hate speech posts in Germany.
The reason for the improvement was not German legislation, he said, but a voluntary code of conduct with the European Union. Facebook’s results have improved in all European countries, not just in Germany, Mr. Allan said.
“There was no need for legislation,” he said.
Mr. Billen disagrees.
“They could have prevented the law,” he said, noting that YouTube scored 90 percent in last year’s monitoring exercise. If the other platforms had done the same, he said, there would be no law today.
A Regulatory Dilemma
Germany’s hard-line approach to hate speech and data privacy once made it an outlier in Europe. The country’s stance is now more mainstream — an evolution embodied by the bloc’s justice commissioner in Brussels.
Vera Jourova, the European Union’s justice commissioner, deleted her Facebook account in 2015 because she could no longer stand the hate.
“It felt good,” she said about pressing the button. She added: “It felt like taking back control.”
But Ms. Jourova, who grew up behind the Iron Curtain in what is now the Czech Republic, had long been skeptical about governments legislating any aspect of free speech, including hate speech. Her father lost his job after making a disparaging comment about the Soviet invasion in 1968, barring her from going to university until she married and took her husband’s name.
“I lived half my life in the atmosphere driven by Soviet propaganda,” she said. “The golden principle was: If you repeat a lie a hundred times it becomes the truth.”
When Germany started considering a law, she instead preferred a voluntary code of conduct. In 2016, platforms like Facebook promised European users easy reporting tools and committed to removing most illegal posts brought to their attention within 24 hours.
The approach worked well enough, Ms. Jourova said. It was also the quickest way to act because the 28 member states in the European Union differed so much about whether and how to legislate.
But the stance of many governments toward Facebook has hardened since it emerged that the consulting firm Cambridge Analytica had harvested the personal data of up to 87 million users. Representatives of the European Parliament have asked Mr. Zuckerberg to come to Brussels to “clarify issues related to the use of personal data” and he has agreed to come as soon as next week.
Ms. Jourova, whose job is to protect the data of over 500 million Europeans, has hardened her stance as well.