
Alex Stamos, Facebook’s chief security officer, took to Twitter to defend the social networking company against recent critical coverage of the platform.

Stamos was responding to an article by Quinta Jurecic of Lawfare, who criticized Facebook’s decision to have human editors oversee more ads, calling it an excuse and arguing that the company should fix its algorithms instead.

Stamos responded on Twitter by saying the company is being attacked for not doing enough to combat misinformation spreading on the social network. He added that journalists underestimate the difficulty of filtering content for the site’s billions of users. Stamos later added that the company should not become a “Ministry of Truth,” a reference to the propaganda agency in George Orwell’s novel 1984.

“If your piece ties together bad guys abusing platforms, algorithms and the Manifestbro into one grand theory of SV, then you might be biased,” Stamos wrote. “If your piece assumes that a problem hasn’t been addressed because everybody at these companies is a nerd, you are incorrect.”

“If you call for less speech by the people you dislike but also complain when the people you like are censored, be careful,” he added. “… If you call for some type of speech to be controlled, then think long and hard of how those rules/systems can be abused both here and abroad.”

This news comes after reports that organizations linked to the Russian government purchased targeted Facebook advertisements before the 2016 elections.

Ultimately, the results of the presidential election and recent investigations into how individuals, government agencies, and hackers can manipulate some of the most popular websites in America have drawn scrutiny. But where does responsibility lie? Should users be at fault for choosing to get their news from a social networking application, forum, or news feed? Or should the companies hosting and providing the platform for the content bear the blame? Either way, tech companies are facing a new challenge, and it is one that an algorithm may not be able to fix.