Alex Stamos, Facebook’s chief security officer, took to Twitter to defend the social networking company against recent coverage of the platform.
Stamos was responding to an article by Quinta Jurecic of Lawfare, who criticized Facebook’s decision to have human editors review more ads, arguing that the company was using manual review as an excuse when it should fix its algorithms instead.
Nobody of substance at the big companies thinks of algorithms as neutral. Nobody is not aware of the risks.
— Alex Stamos (@alexstamos) October 7, 2017
For example, lots of journalists have celebrated academics who have made wild claims of how easy it is to spot fake news and propaganda.
— Alex Stamos (@alexstamos) October 7, 2017
In his thread, Stamos said the company is being attacked for not doing enough to combat misinformation spreading on the social network, and added that journalists underestimate the difficulty of filtering content for the site’s billions of users. He later added that the company should not become a “Ministry of Truth,” a reference to the propaganda agency in George Orwell’s novel 1984.
A bunch of the public research really comes down to the feedback loop of “we believe this viewpoint is being pushed by bots” -> ML
— Alex Stamos (@alexstamos) October 7, 2017
Likewise all the stories about “The Algorithm”. In any situation where millions/billions/tens of Bs of items need to be sorted, need algos
— Alex Stamos (@alexstamos) October 7, 2017
“If your piece ties together bad guys abusing platforms, algorithms and the Manifestbro into one grand theory of SV, then you might be biased,” Stamos wrote. “If your piece assumes that a problem hasn’t been addressed because everybody at these companies is a nerd, you are incorrect.”
And to be careful of their own biases when making leaps of judgment between facts.
— Alex Stamos (@alexstamos) October 7, 2017
If you call for some type of speech to be controlled, then think long and hard of how those rules/systems can be abused both here and abroad
— Alex Stamos (@alexstamos) October 7, 2017
A lot of people aren’t thinking hard about the world they are asking SV to build. When the gods wish to punish us they answer our prayers.
— Alex Stamos (@alexstamos) October 7, 2017
“If you call for less speech by the people you dislike but also complain when the people you like are censored, be careful,” he added. “… If you call for some type of speech to be controlled, then think long and hard of how those rules/systems can be abused both here and abroad.”
This news comes after reports of targeted Facebook advertisement purchases by organizations linked to the Russian government before the 2016 elections.
Ultimately, the results of the presidential election and recent investigations into how individuals, government agencies, and even hackers can manipulate some of the most popular websites in America have drawn scrutiny. But where does responsibility lie? Should users be at fault for choosing to get their news from a social networking application, forum, or news feed? Or should the companies hosting and providing the platform for that content shoulder the blame? Either way, tech companies are facing a new challenge, and it is one that an algorithm may not be able to fix.