
Facebook CSO: Spotting Fake News is Harder Than You Think

Facebook CSO Alex Stamos has hit back at critics in the media who, he claims, have misunderstood just how difficult it is to design algorithms to effectively control fake news on the site.

The social network is reeling after allegations that Russia tried to manipulate the US presidential election last year, in part by buying thousands of ads to target voters in swing states.

The firm has also been criticized for allowing obviously fallacious stories to appear high up on its pages, most recently in the aftermath of the Las Vegas shootings as well as during the race for the White House.

Former Yahoo CISO Stamos is in charge of the tech giant’s investigation into this far-reaching issue.

He let rip on Twitter over the weekend against a journalist who suggested that Facebook’s recent decision to increase the number of manual ad reviewers fails to address the underlying problem: that it simply designed the algorithms badly in the first place.

He argued that Facebook is well aware that algorithms can be biased, that no one at the company is taking the issue lightly, and that fake news can be difficult to distinguish from simply opinionated content.

“Lots of journalists have celebrated academics who have made wild claims of how easy it is to spot fake news and propaganda without considering the downside of training ML systems to classify something as fake based upon ideologically biased training data,” Stamos tweeted.

“So if you don't worry about becoming the Ministry of Truth with ML systems trained on your personal biases, then it's easy!”

Algorithms inevitably have to be used to sort the huge quantity of content published on the social network, he said, although Facebook recently told advertisers that any ads targeted to users based on "politics, religion, ethnicity or social issues" will be manually reviewed.

Interestingly, Stamos didn’t really address one of the key criticisms of Facebook during this debate: that it simply didn’t appreciate the serious impact fake news could have on political outcomes last year. Although it seems to be throwing more resources at the problem now, many will argue the damage has been done.
