Pressure Mounts on Facebook to Vet Political Content


Should Facebook and other social media giants be treated like publishers, legally liable for the content on their platforms? That’s just one question Facebook CEO Mark Zuckerberg had to face at a congressional hearing this week.

Zuckerberg on Wednesday appeared before the U.S. House Committee on Financial Services to discuss the company’s plans for a new global currency. But many members of Congress had other items to discuss with the tech entrepreneur in light of his recent comments that Facebook would not take down political content even if it contained falsehoods.


U.S. Rep. Alexandria Ocasio-Cortez (D-N.Y.) asked Zuckerberg about that policy.

“You announced that the official policy of Facebook now allows politicians to pay to spread disinformation in 2020 elections and in the future,” she said. “So I just want to know how far I can push this in the next year?”

She then asked Zuckerberg if it would be OK for politicians to target people living in predominantly Black ZIP codes with election advertisements that listed the wrong date for the election.

Zuckerberg said the company would take down such misleading content, but when Ocasio-Cortez asked whether she could send out false information claiming her Republican opponents had voted for the Green New Deal, Zuckerberg said: “Probably.”

If social media companies will not voluntarily take down misleading content, should they be forced to do so through regulation?

Ed Lasky, news editor for American Thinker, a conservative-leaning online daily magazine, does not believe that government regulation is the answer.

“There’s a reason the First Amendment is the first amendment: it’s meant to protect all the other rights that are given to citizens in the Bill of Rights and the Constitution,” Lasky said. “I come down on the side of the ACLU basically on this issue. We really need more speech to combat the false speech that is out there.”

Lasky notes that politicians “often engage in falsehoods,” but the public is nevertheless able to discern truth from fiction.

“I really think that Donald Trump’s public approval has shown – and his approval has been going down – that people don’t believe a lot of things that he says,” Lasky said. “I really think that in the marketplace of ideas that’s the best defense for the truthfulness of speech.”

Nick Feamster is a professor in the Department of Computer Science at the University of Chicago, where he is the director of the Center for Data and Computing. Feamster said that before you even get into issues of free speech there are enormous challenges in attempting to vet social media content.

“These social media platforms are processing hundreds of millions of posts per day,” Feamster said. “If you have regulations put in place that basically require the platforms to moderate the content, once you start making these asks you are basically saying that the approach must be automated. No human can look at all of this content – not even the tens of thousands of content moderators that Facebook has hired could possibly look at all of it – particularly within a short time frame.”

And, Feamster says, while some people may be unhappy with Facebook for not doing enough, the company is making more of an effort than other major platforms. Twitter and YouTube, he says, are doing “basically nothing” to vet content.

Feamster believes at least part of the reason for the lack of action is simply a matter of cost.

“You need a lot of fact-checkers and fact-checkers are expensive. I suspect these are business decisions. It’s expensive to look at all this stuff,” said Feamster.

At the moment, social media companies are shielded from most legal liability for content posted on their platforms by Section 230 of the 1996 Communications Decency Act.

“There could still be civil lawsuits,” said Feamster. “But it’s a huge safe harbor.”

While some advocates would like to see Section 230 amended to make social media platforms more accountable, Lasky believes there are more effective ways to change company behavior.

“I really think a lot of this comes down to public pressure and shareholder pressure and activist group pressures that will compel companies to give these sorts of issues very serious attention,” said Lasky. “Having government censors is a third rail in my mind. I just don’t think you want government involved in that.”


Related stories:

Congress Grills Zuckerberg on Facebook’s Digital Currency Plans

Facebook Tightens Political Ad Rules, But Leaves Loopholes

Social Media Sans Metrics: One Artist’s Quest to Hide ‘Likes’


