
Germans, Seeking News, Find YouTube’s Far-Right Tirades


By Max Fisher and Katrin Bennhold, New York Times

CHEMNITZ, Germany — The day after far-right demonstrators took over the streets here, Sören Uhle, a city official who oversees municipal marketing and development, began to get strange phone calls from reporters.

The man whose killing had set off the riots, they said, had died while trying to stop asylum-seekers from molesting a local woman. And it wasn’t just one local man who had been killed, but two. Could he comment?

These sorts of accusations suddenly seemed to be everywhere. But none were true. They had come, Uhle and others suspected, from social media — particularly YouTube.

Ray Serrato, a Berlin-based digital researcher, noticed the tide of misinformation when his wife’s uncle showed him a YouTube video that claimed the rioters had been Muslim refugees.

The video, posted by an obscure fringe group, was rambling, and it appeared to be cheaply produced. Yet it had nearly half a million views — far more than any news video on the riots. How was that possible?

Serrato scraped YouTube databases for information on every Chemnitz-related video published this year. He found that the platform’s recommendation system consistently directed people toward extremist videos on the riots — then on to far-right videos on other subjects.

Users searching for news on Chemnitz would be sent down a rabbit hole of misinformation and hate. And as interest in Chemnitz grew, it appears, YouTube funneled many Germans to extremist pages, whose view counts skyrocketed.

Activists say this may have contributed to a flood of misinformation, helping extremists shape public perceptions even after they had been run off Chemnitz’s streets.

“This was new,” Uhle said. “It’s never happened to me before that mainstream media, big German newspapers and television channels, ask me about false news and propaganda that had clearly become so pervasive that people just bought it.”

Researchers who study YouTube say the episode, far from being isolated, reflects the platform’s tendency to push everyday users toward politically extreme content — and, often, to keep them there.

A YouTube spokeswoman declined to comment on the accusations, saying the recommendation system was intended to “give people video suggestions that leave them satisfied.” She said the company planned to work with news publishers to help “build a better news experience on YouTube.”

Though YouTube has typically drawn less scrutiny than other social networks, that may be changing. Its parent company, Google, faced criticism from U.S. lawmakers this week for declining to send its chief executive to congressional hearings attended by chief executives from Twitter and Facebook.

— A closed system

YouTube’s recommendation system is the core of its business strategy: Getting people to click on one more video means serving them more ads. The algorithm is sophisticated, constantly learning what keeps users engaged. And it is powerful. A high ranking from the algorithm can mean huge audiences for a video.

Serrato wondered if that explained how his family member had discovered the conspiracy video. He had read studies about users who blindly followed the recommendation system; inevitably, they seemed to end up watching long series of far-left or far-right videos.

Zeynep Tufekci, a prominent social media researcher at the University of North Carolina at Chapel Hill, has written that these findings suggest that YouTube could become “one of the most powerful radicalizing instruments of the 21st century.”

But, as Tufekci and other researchers stress, such experiments are anecdotal.

Serrato wanted to get a fuller picture of how YouTube shapes perceptions of events. So he conducted something known as a network analysis, applying techniques he had used in his day job as an analyst with Democracy Reporting International, a respected global governance monitor, to track hate speech in Myanmar.

Using YouTube’s public developer interface, Serrato plugged in a dozen recent videos related to Chemnitz. For each, he scraped YouTube’s recommendations for what to watch next. Then he did the same for those videos, and so on. Eventually, he identified a network of about 650 videos, nearly all from this year.
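In outline, that kind of crawl can be reproduced with a short script. The sketch below is purely illustrative: the API key and seed video IDs are placeholders, and it assumes the YouTube Data API v3 search endpoint and its relatedToVideoId parameter, which was available at the time. It is not Serrato's actual code.

```python
# Illustrative sketch of a recommendation crawl, not Serrato's actual code.
# Assumptions: a valid YouTube Data API v3 key, placeholder seed video IDs,
# and the search endpoint's relatedToVideoId parameter.
import requests

API_KEY = "YOUR_API_KEY"                      # placeholder
SEED_VIDEOS = ["VIDEO_ID_1", "VIDEO_ID_2"]    # a dozen Chemnitz videos in the real study
MAX_DEPTH = 3                                 # how many recommendation "hops" to follow


def related_videos(video_id, limit=10):
    """Return the video IDs YouTube suggests alongside a given video."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "relatedToVideoId": video_id,
            "type": "video",
            "maxResults": limit,
            "key": API_KEY,
        },
    )
    resp.raise_for_status()
    return [item["id"]["videoId"] for item in resp.json().get("items", [])]


def crawl(seeds, depth):
    """Breadth-first crawl of the recommendation graph; returns (video, recommendation) edges."""
    edges, seen, frontier = [], set(seeds), list(seeds)
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            for rec in related_videos(vid):
                edges.append((vid, rec))
                if rec not in seen:
                    seen.add(rec)
                    next_frontier.append(rec)
        frontier = next_frontier
    return edges


edges = crawl(SEED_VIDEOS, MAX_DEPTH)
print(len(edges), "recommendation edges,",
      len({v for edge in edges for v in edge}), "videos")
```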

The results, he said, were disturbing. The network showed a tight cluster of videos that Serrato identified as predominantly conspiracy theorist or far right.

This was the first sign that YouTube’s algorithm systemically directs users toward extremist content. A more neutral algorithm would most likely produce a few distinct clusters of videos — one of mainstream news coverage, another of conspiracy theories, another of extremist groups. Those who began in one cluster would tend to stay there.

Instead, the YouTube recommendations bunched them all together, sending users through a vast, closed system composed heavily of misinformation and hate.
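One way to check for that kind of bunching is to run a standard community-detection algorithm over the crawled recommendation graph. The sketch below uses the networkx library and a toy edge list purely as an illustration; the article does not say which tools Serrato used.

```python
# Illustrative cluster check on a recommendation graph; toy data only.
# Assumption: networkx for the analysis; the article does not name a tool.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy stand-in for the (video, recommended_video) edges a crawl would produce.
edges = [
    ("news_1", "news_2"), ("news_2", "conspiracy_a"),
    ("conspiracy_a", "conspiracy_b"), ("conspiracy_b", "far_right_1"),
    ("far_right_1", "far_right_2"), ("far_right_2", "conspiracy_a"),
]

G = nx.DiGraph()
G.add_edges_from(edges)

# A neutral recommender would tend to yield several well-separated communities
# (news, conspiracy theories, extremist channels). One dominant community that
# contains them all is the bunching described above.
for i, community in enumerate(greedy_modularity_communities(G.to_undirected())):
    print("community", i, ":", sorted(community))

# A short recommendation path from mainstream news to the fringe can be
# checked directly:
print(nx.shortest_path(G, source="news_1", target="far_right_1"))
```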

Viewers who come to YouTube for down-the-middle news may quickly find themselves in a world of extremists, Serrato said.

“That’s what I found bizarre,” he said. “Why are they so close together, unless the aim is to trigger a reaction?” Content that engages viewers’ emotions or curiosity, he suspected, would hook them in. And it wasn’t just that the platform directed people to unreliable videos about the subject they had sought out — in this case, Chemnitz.

Many of the videos in Serrato’s analysis were unrelated to Chemnitz. Some offered positive portrayals of white nationalism in general or of Alternative for Germany, a far-right political party. Others went further astray, detailing fringe conspiracies; one argued that President Donald Trump is a pawn of the Rothschild banking family.

Why would YouTube surface videos like these in a search for news stories?

How many steps are there in YouTube’s algorithm from news story to fever swamp? “Only two,” Serrato said. “By the second, you’re quite knee-deep in the alt-right.”

Perhaps most striking is what was absent. The algorithm rarely led back to mainstream news coverage, or to liberal or centrist videos on Chemnitz or any other topic. Once on the fringes, the algorithm tended to stay there, as if that had been the destination all along.

— From fringe to mainstream

Activists and residents in Chemnitz say far-right conspiracy theories seemed unusually common in the days before and after the demonstration.

Oliver Flesch, a far-right figure on YouTube, posted a series of videos misrepresenting the killing that set off the riots, with titles like “German Stabbed to Death Just Because He Wanted to Help Our Women.” Another claimed the asylum-seekers had killed two Germans.

Flesch, who has 20,000 subscribers, operates in a political bubble. Yet his claims had filtered into the mainstream enough that journalists asked Uhle, the Chemnitz official, about them. How?

The algorithm may have helped. Serrato’s network analysis led to 16 of Flesch’s videos, and to five by obscure right-wing rapper Chris Ares, with whom Flesch sometimes does guest spots.

And misinformation can travel in other ways. Some German officials said this week that a widely circulated video, appearing to show a far-right activist chasing a dark-skinned person during Chemnitz’s riots, may have been faked.

Thomas Hoffmann, who helps run a local refugee organization, was on a train from Hamburg when the riots broke out. So he searched YouTube for “Chemnitz” and the date, hoping to follow the events.

Instead, the platform returned obvious forgeries. One video of dark-skinned residents being attacked was edited to make them look like the aggressors. Others were interspersed with footage from previous rallies, to make this one look more peaceful than it was.

“It was incredible how much blatantly doctored material there was,” Hoffmann said. “When you click on one video, whether you like it or not, another one is proposed that features content from far-right conspiracy theories.”

— Is YouTube worse than others?

YouTube has been more cooperative with German authorities about removing hate speech than other social media companies, said Flemming Ipsen, who tracks political extremism at Jugendschutz.net, a government-linked internet monitor.

But some researchers consider YouTube to be unusually permissive about content that it does not consider overt hate speech, and its algorithm unusually aggressive in pushing users toward political fringes. YouTube also designates some content as borderline, neither blocking nor promoting it.

YouTube says it does not code videos by political content, but rather by viewer interest. Critics say that leads the platform to surface fringe material that reliably wins more clicks.

Serrato said that even while researching videos he found abhorrent, he was unable to resist.

“As soon as I was on one of the videos, I thought, OK, now I’m going to watch the next one,” he said. “That’s YouTube’s goal. I stay engaged, ads play. And it works.”

Guillaume Chaslot, a former engineer at YouTube’s owner, Google, said that during and after high-profile political events like the demonstration in Chemnitz, extremism and misinformation often spike on the platform. But this reflects deeper tendencies in YouTube’s algorithm, he said.

“The example I like to cite is the flat-earth theory, because it’s apolitical,” Chaslot said. Videos claiming the earth is flat, he said, are “still going viral, still getting highly recommended by the YouTube algorithm, because it gets watch-time.”

Chaslot worked on YouTube’s algorithm until 2013, when he was fired. Google has said he was fired for poor performance; Chaslot has cited disagreements over the company’s direction.

Now, he studies the algorithm from outside, most recently analyzing its recommendations during the 2016 presidential campaign. As in Chemnitz, he found that YouTube’s suggestions consistently nudged users into extremist content.

“YouTube doesn’t give you a straight representation of the world,” Chaslot said. “It looks like reality, but it deforms reality, because it is biased toward watch-time.”

Even in Germany, which has some of the toughest social media restrictions of any democracy, officials say they have little power to regulate the vast majority of social media content.

“Lies, propaganda and manipulation are harmful for society, but on their own are not illegal — and so our hands are often tied,” said Ipsen, of the government-linked internet monitor.

German officials have urged social media companies to make their algorithms more transparent. U.S. lawmakers have done the same, citing research linking social media to polarization, foreign meddling and hate speech.

But the companies are refusing.

“The algorithm is central to their business model,” Ipsen said. “All we can do is remind them of their social responsibility.”

Copyright 2024 New York Times News Service. All rights reserved.