Facebook and YouTube Give Alex Jones a Wrist Slap

By Kevin Roose, New York Times

The digital walls are closing in on Alex Jones, the social media shock jock whose penchant for right-wing conspiracy theories and viral misinformation set off a heated debate about the limits of free speech on internet platforms.

Facebook said Friday that it had suspended Jones from posting on the site for 30 days because he had repeatedly violated its policies. The social network also took down four videos posted by Jones and Infowars, the website he oversees.

“We received reports related to four different videos on the pages that Infowars and Alex Jones maintain on Facebook,” a Facebook spokeswoman said in an emailed statement. “We reviewed the content against our Community Standards and determined that it violates. All four videos have been removed from Facebook.”

The 30-day ban applies only to Jones personally, not to Infowars or to any of the other administrators of his Facebook page, which has nearly 1.7 million followers. Those people will still be able to post to Jones’ page as long as their posts don’t violate the site’s policies — meaning that Jones could still appear in videos and stories posted to the page as long as he does not post them personally.

In fact, Jones appeared on a livestreamed Facebook video on his page Friday, shortly after the suspension went into effect, in which he claimed that he was the victim of a media conspiracy to “de-platform” conservative voices.

“This is war,” Jones said in the video.

Facebook may remove Jones’ page altogether if he continues to violate its policies, the spokeswoman said. This week, Facebook determined that one of Jones’ recent videos — an inflammatory rant in which he accused Robert Mueller, the special counsel, of supporting pedophilia and pantomimed shooting him — did not violate its policies.

On Tuesday, YouTube took down four videos uploaded to Jones’ channel, which has 2.4 million subscribers, for violating its policies on hate speech and child endangerment. The violation placed a first strike against Jones’ account on YouTube, preventing the channel from streaming live video for 90 days. If Jones receives two more strikes during that period, YouTube will terminate his account.

“We apply our policies consistently according to the content in the videos, regardless of the speaker or the channel,” YouTube said in a statement.

A request to Jones for comment was not immediately returned.

This is not the first time that Jones’ videos have received a strike from YouTube. In February, YouTube levied a strike for a video claiming that David Hogg, one of the outspoken student survivors of the school shooting in Parkland, Florida, was a “crisis actor.” YouTube said the video had violated its policies around harassment and bullying. But since there were no additional violations during the next 90 days, the strike was removed from the account.

Facebook and YouTube acted after weeks of controversy over Jones, who first gained notoriety by insisting that the terrorist attacks of Sept. 11, 2001, were an “inside job” by the U.S. government. Since then, he has questioned whether the 2012 massacre at Sandy Hook Elementary School was a hoax, promoted the so-called Pizzagate conspiracy theory and said fluoridated water was part of a government mind-control plot.

Despite his promotion of these unsupported claims, social media platforms have allowed Jones to build a wide audience. Conservatives have previously accused Facebook, YouTube and other platforms of censoring right-wing views, and have rallied behind him.

This month, at a press event in New York about Facebook’s efforts to combat misinformation and false news, a reporter from CNN questioned company executives about why Infowars was still allowed to have a Facebook account. At the time, the company appeared unwilling to say Jones’ content violated its policies.

“Look, as abhorrent as some of this content can be, I do think that it gets down to this principle of giving people a voice,” Mark Zuckerberg, Facebook’s chief executive, said in a Recode podcast interview.

As an example, Zuckerberg cited Holocaust denial as a message that he found personally offensive but was wary of removing from Facebook, in order to protect users’ free-speech rights.

Within Facebook, the free-speech issues raised by Infowars and Jones have become an especially contentious topic. Employees have used internal chat forums to question executives about the site’s policies, according to one Facebook employee, who asked to remain anonymous because of fear of retribution. One group of Facebook workers, which included people of Jewish and Eastern European descent, raised Zuckerberg’s position on Holocaust denial with their superiors, saying they found it incomprehensible, according to the employee.

At a House Judiciary Committee hearing this month, Democratic lawmakers pressed Monika Bickert, Facebook’s global head of policy management, about why Infowars was allowed to remain on Facebook. Several days later, two parents of a child killed in the Sandy Hook shooting wrote an open letter to Zuckerberg, criticizing him for allowing Jones and his followers to use Facebook to harass and intimidate the families of victims.

“Our families are in danger as a direct result of the hundreds of thousands of people who see and believe the lies and hate speech, which you have decided should be protected,” the parents wrote.

Facebook’s policies about misinformation have been vague and inconsistently applied, and the company has appeared flat-footed when dealing with popular purveyors of conspiracy theories and hyperpartisan content such as Jones and Infowars.

In briefings with reporters this month, Facebook executives struggled to define the company’s policies regarding accounts that repeatedly post false or misleading news. The executives said that if third-party fact-checkers found roughly one-third of an account’s posts false, the account would be demoted, or “down-ranked,” in order to limit its visibility. The company has refused to reveal a list of accounts that have been down-ranked. Later, the company said it would remove, rather than down-rank, misinformation that could lead to physical violence.

Copyright 2024 New York Times News Service. All rights reserved.