
Facebook Fined £500K for Cambridge Analytica Failings

The Information Commissioner’s Office (ICO) has issued a rare maximum fine of £500,000 to Facebook for data protection mistakes that led to the Cambridge Analytica scandal.

After hearing representations from the social network, the UK privacy watchdog said it decided to keep the fine amount unchanged, as per its Notice of Intent in July.

Its investigation into political campaigning revealed that Facebook had processed user information “unfairly” under the old Data Protection Act 1998. It did so by allowing developers to access this information without sufficiently “clear and informed consent” — and by allowing access to the data of users who had not downloaded an app but were friends of those who had.

The social network also failed to carry out adequate checks on how this data was being secured or used by developers, a lapse which allowed Aleksandr Kogan to harvest information on 87 million users without their knowledge and subsequently share some of it with Cambridge Analytica parent SCL Group. This infamously enabled the company to target wavering voters ahead of the 2016 US presidential election.

The ICO claimed Facebook also failed to take prompt action to ensure this data was deleted when, in December 2015, it discovered what had happened. SCL Group wasn’t suspended until 2018.

Information commissioner Elizabeth Denham warned that the fine would have been “significantly higher” had the GDPR been in force at the time.

“Facebook failed to sufficiently protect the privacy of its users before, during and after the unlawful processing of this data. A company of its size and expertise should have known better and it should have done better,” she added.

“Our work is continuing. There are still bigger questions to be asked and broader conversations to be had about how technology and democracy interact and whether the legal, ethical and regulatory frameworks we have in place are adequate to protect the principles on which our society is based.”

UKFast CEO Lawrence Jones also raised concerns over the impact of intelligent profiling on democracy.

“To regain trust, and recover their share price, Facebook have to now carry through on the promise of Mark Zuckerberg to investigate every Facebook app that’s mining data and ask questions about where that data is, who has access to it and what it’s being used for. They then need to be extremely clear and transparent about the findings of their investigation,” he added.

“This is now about damage limitation for them, and the only way they can limit damage is by being honest about their mistakes and regaining our trust.”

A new Oxford University paper has claimed that data collection and sharing by apps linked to Google, Facebook and others is “out of control” — presenting major privacy risks to consumers, who are bamboozled by complex policies and guidelines.

It claimed that over 88% of free apps on Google Play share information with firms owned by parent company Alphabet.
