Data protection regulators, particularly in Europe, appear to be taking a tough line with large tech firms over data privacy violations.
Law firm DLA Piper recorded a total of €1.2bn ($1.26bn) of General Data Protection Regulation (GDPR) fines issued across Europe in 2024. This followed €2.9bn ($3.1bn) in fines levied by European regulators in 2023.
Tech giants and social media firms have borne the brunt of these financial penalties. This includes Meta receiving a €1.2bn ($1.3bn) fine in May 2023 for transferring personal data to the US on the basis of standard contractual clauses (SCCs).
However, these headline figures do not tell the full story.
Just $19.9m of the $3.26bn in fines levied by the Irish Data Protection Commission (DPC) between 2020 and the end of October 2024 had been paid as of December 2024, according to an investigation by the Irish Independent. This represents just 0.6% of the total fines issued.
This reality has cast significant doubt on the effectiveness of fines in deterring bad data privacy practices by tech companies.
Valerie Lyons, COO & Senior Consultant at BH Consulting, told Infosecurity: “The issue is when people say things like ‘WhatsApp was fined for that,’ as if that would be a deterrent. The truth is, WhatsApp have yet to pay the fine so there is no deterrent until the fine is actually paid.”
During this year’s Data Privacy Week campaign, Infosecurity explores the challenges regulators face in ensuring fines act as a deterrent and what more can be done to improve privacy practices in big tech.

Large Privacy Fines Not Getting Paid
Many of the fines issued by data protection regulators under GDPR over the past five years are still the subject of court appeals or other legal processes.
The Irish DPC, the largest enforcer of GDPR fines in Europe, often undergoes an extensive process to make a fine payable after a penalty decision has been made.
Lyons explained that once a fine has been issued by the regulator, it does not become payable until the DPC applies to the Circuit Court to have the decision confirmed. Once the decision is confirmed, the DPC can issue a formal notice to the penalized entity requiring payment of the fine.
However, these applications can only be made if the offending company has not invoked its statutory right of appeal.
“Each significant fine awarded against big tech has been appealed and then it enters into the lengthy legal framework, which must be engaged in by the DPC. These cases take years,” she noted.
Large tech organizations have demonstrated they have the necessary financial resources to embark on these legal battles. This has resulted in huge delays in fines being paid, and there is the potential that courts will eventually reduce or overturn them.
Fines have been successfully reduced in the past. In 2020, a £183m fine issued to British Airways for data security failings was reduced to just £20m ($24.6m), due to the financial difficulties the airline faced as a result of COVID-19 restrictions and BA making considerable improvements to its IT security.
In 2020, a €9.55m ($9.92m) GDPR fine was issued by Germany’s data protection authority to telecom provider 1&1 Telecom GmbH for insufficient authentication procedures. The fine was eventually reduced by 90% by a German court, which judged the original amount to be disproportionate to the nature of the violation.
Lyons added: “It’s a simple costs/benefit calculus – fighting fines in court costs the organization more, but the gains are greater with the potential to be overturned/reduced in the appeals court.”
Concerns over the deterrent impact of fines on tech firms’ data privacy practices have even been expressed at a regulatory level.
In November 2024, UK Information Commissioner John Edwards told British newspaper The Times that he didn’t believe the levying of fines was an effective way of keeping big tech firms in line, serving only to tie up the Information Commissioner’s Office (ICO) in litigation.
The Need to Improve Fine Collection
While the impact of fines issued by data protection regulators has come under question, there is no doubt that the ability to issue financial penalties remains a crucial weapon in regulators’ armoury.
Lyons pointed out that as well as acting as a punishment and a deterrent, fines are also an important way of communicating to the public about wrongdoing. This is especially true when the violators are household names.
"Each significant fine awarded against big tech has been appealed and then it enters into the lengthy legal framework"
A spokesperson for the UK’s ICO told Infosecurity that the regulator has achieved successful outcomes from several enforcement actions against big tech firms that involved the use of fines. This includes fining social media giant TikTok £12.7m in 2023 for misusing children’s data.
“In cases such as these, our action has resulted in tangible improvements to firms’ practices,” the ICO said.
A specialist team, the Financial Recovery Unit (FRU), has been established by the ICO to ensure the full recovery of monetary penalties issued under data protection laws.
This team is tasked with undertaking follow-up actions such as the instruction of litigation and recovery specialists to identify assets and undertake formal recovery proceedings.
Using the Full Range of Regulatory Levers
Enforcement methods other than financial penalties can be used to try to drive better data privacy practices. These alternatives are likely to become more prominent while enforcing fines remains a challenge.
Using Forceful Techniques
Lyons noted that there are stringent enforcement measures regulators can take to force compliance outside of issuing fines.
This includes ramping up auditing and surveillance mechanisms to pressurize tech companies, including dawn raids in extreme cases.
Another emergency option is to issue an order to cease processing, forcing a company to immediately stop processing personal data in a particular jurisdiction. This power was used by the Italian data protection authority against OpenAI in 2023 in relation to ChatGPT.
Lyons emphasized that regulators need to undertake a careful cost-benefit analysis before taking such measures.
“A regulator is always going to have to try to balance economic, political, government, societal and regulatory interests. Getting its teeth out to ruthlessly enforce may not be in the best interest of this balancing act and this is a really difficult part of being a regulator,” she noted.
Shift to Individual Liability
There has been a significant shift in focus by regulators towards personal liability for data protection failings, targeting individual senior executives as well as the organization itself.
In one example, in 2024, the Dutch Data Protection Authority announced it was investigating whether it could hold the directors of Clearview AI personally liable for numerous breaches of the GDPR, following a €30.5m ($32.03m) fine against the firm.
This approach is being recognized as an alternative way of encouraging a stronger focus on data privacy protections from the executive level at organizations.
Jonathan Armstrong, Partner at Punter Southall Law, noted that most modern data protection legislation, including the GDPR, contains some form of personal liability provision.
Regulators can use this avenue when there is evidence of illegal activity by executives, such as obstructing investigations into non-compliance, making false statements to regulators or unlawfully obtaining personal data.
“There’s a trend towards holding individuals responsible in extreme cases,” Armstrong noted.
This is an area that the UK ICO has a keen focus on. “In cases involving non-compliant and rogue directors, we also work with other external agencies to disrupt and obstruct ongoing harms. To date this work has resulted in 45 directors being disqualified for 270 years following further action taken with the Insolvency Service,” an ICO spokesperson told Infosecurity.
Collaboration with Tech Firms
In addition to enforcement actions, regulators can take a collaborative approach to improving data privacy practices in tech companies.
This involves directly engaging with firms before taking the enforcement route, encouraging them to make voluntary improvements to their practices through dialogue.
The ICO said it is currently undertaking this type of “regulatory supervision” activity with big tech firms in the strategic priority areas of children’s privacy, online tracking and AI.
“In recent months, we have made a range of interventions across big tech platforms, resulting in changes that have significantly enhanced the online privacy of UK residents,” the ICO spokesperson said.
“For example, we secured commitments from Meta to change its plans to use Facebook and Instagram user data to train generative AI by making it simpler for users to object to the processing, and from X to remove personalized advertising for under 18s.”
Conclusion
The volume of substantial data privacy violation fines issued to big tech has created the perception that these companies are being heavily punished for their wrongdoings.
Beneath those headline figures, however, is a different story. Legal proceedings mean there are long delays in the payment of fines, and there is the potential that fines will be reduced or overturned entirely. This reality likely limits the impact these penalties have on big tech privacy practices.
It is therefore important that regulators are innovative, deploying the full range of tools at their disposal to ensure compliance with data privacy regulations. These range from cease-processing orders and personal liability actions to encouraging tech firms to voluntarily improve their behaviors.
As data protection laws become embedded in our digital lives, regulators need to make sure their approaches have an effective impact on compliance.