Facebook just made a very dangerous decision for 2020

Analysis by Chris Cillizza, CNN Editor-at-large
CNN — On Thursday, Facebook made a very bad decision that will, without question, negatively impact the 2020 election. It decided neither to ban political advertising nor to institute any sort of fail-safe measure to keep lies presented in ads from politicians from propagating across the Internet.

"Ultimately, we don't think decisions about political ads should be made by private companies, which is why we are arguing for regulation that would apply across the industry," Facebook's Rob Leathern said in a blog post. "In the absence of regulation, Facebook and other companies are left to design their own policies."

Uh, what?

Let's translate that statement: Because the government won't set clear regulations as to how speech should be governed on social media sites, Facebook is essentially throwing up its hands. Unlike, say, Twitter, which announced a ban on political advertising, Facebook's way of "design[ing] their own policies" is to decide not to do anything. Or not much.

"While Twitter has chosen to block political ads and Google has chosen to limit the targeting of political ads, we are choosing to expand transparency and give more controls to people when it comes to political ads," wrote Leathern.

Here's the problem: That's not good enough. Not even close.

For a lot of reasons, the biggest of which is that we KNOW the Russians used Facebook as a major part of their efforts to influence the 2016 election to help Donald Trump and hurt Hillary Clinton.

Facebook itself has acknowledged that accounts associated with the Russian internet troll farm at the center of the 2016 misinformation campaign reached as many as 126 million(!!) people. And special counsel Robert Mueller's report on Russian interference was even more detailed. As Buzzfeed's Ryan Broderick wrote:

"According to Mueller's report, the Facebook groups were particularly popular. By the time Facebook deactivated them in 2017, the Russia-controlled group 'United Muslims of America' had over 300,000 followers, the 'Don't Shoot Us' group had over 250,000 followers, the 'Being Patriotic' Facebook group had over 200,000 followers, and the 'Secured Borders' Facebook group had over 130,000 followers."

Yes, interacting within Facebook groups is not the same as running ads on Facebook that aren't entirely accurate (or accurate at all). But when you, as Facebook, know that your technology was used -- at massive scale -- to spread misinformation with the hopes of a foreign power influencing the American presidential election, wouldn't you want to bend over backward to try to avoid purposeful misuse of the platform again?

The answer to that question is, obviously, yes. But not for Facebook. And for anyone who has read Andrew Marantz's absolutely terrific and terrifying new book "Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation," that shouldn't be all that surprising. As Marantz documents, the founders of these social media behemoths have long viewed their platforms as the ultimate in free speech -- resisting curtailing any sort of speech, no matter how hateful (for example, white supremacists have flourished on social media) or false.

Facebook's decision -- or, really, lack of a decision -- about its willingness to police lying in its ads ensures that falsehoods will spread on its platform. And not just falsehoods, but misinformation and lies produced by people who know they can take advantage of unsuspecting users because of Facebook's decision. The end result of Facebook's view on lying in ads is that the voting public will be less well informed about its choices this November. Period.

Let's return once more to the Facebook blog post by Leathern. He writes:

"We recognize this is an issue that has provoked much public discussion -- including much criticism of Facebook's position. We are not deaf to that and will continue to work with regulators and policy makers in our ongoing efforts to help protect elections."

Which means, well, nothing. Not. Good. Enough.

Copyright 2024 by Cable News Network, Inc., a Time Warner Company. All rights reserved.