New Ethical Concerns in Online Privacy and Data Security


The single most important piece of privacy legislation in recent years is the GDPR. The control, storage and use of data have been major privacy issues for a long time. The philosophy behind the GDPR is that users have the right to determine how their data is used and stored.

However, it would seem that the privacy efforts of most technology giants are driven by compliance and the threat of punishment rather than a genuine interest in keeping users’ data secure, and this has often led to run-ins with the law. By 2025, the total volume of digital data is expected to reach 175 zettabytes; if governments and corporations don’t nip the privacy problem in the bud now, there won’t be a better time to do so.

Recently, WhatsApp released a controversial new privacy policy that allows it to share data with Facebook companies and forces users to accept it. What are the risks of such policy changes? Amid rising privacy concerns, how can companies ensure transparency and accountability?

User Control

The outcry against WhatsApp signals one thing: consumers are becoming more aware of their rights regarding internet privacy. Everyone understands that businesses collect data from the consumers who use their websites and apps. The major concerns lie with how that data is used and stored. How do companies handle customer data? Who should be in charge of it?

Consumers should be able to decide whether they want their data used in a certain way, and now, as awareness spreads, the masses are increasingly assertive of this right.

There is also the issue of companies matching their words with actions by providing data security measures proportional to the sensitivity and volume of the customer data they collect. Companies that collect highly personal data (mostly financial and health information) or simply large volumes of data (social media companies and search engines) are attractive targets for cyber-attackers. We’ve already seen this play out in massive data breaches at Facebook, Google and Microsoft.

Ads and Personalization

Such breaches contribute to the collective opinion that the risks of personal data collection outweigh the benefits, a sentiment expressed by 81% of Americans polled about 13 months ago. Companies often collect data to profile users for targeted ads, and people have grown increasingly disenchanted with such personalized ads out of fear of undue tracking.

While disabling personalization entirely is a lofty goal that will probably never be realized, as one author puts it: “What companies couldn’t do anymore is share their dossiers about you with adtech companies and advertisers. As a result, a lot less of your personal information would end up in the hands of data brokers.”

These ethical concerns are not new; experts have been raising them for years. What has changed is that ordinary people are now at the forefront of the online privacy battle against data-collecting companies. These companies now face a greater task in winning the trust of their customers and, perhaps more frighteningly, in complying with regulations to avoid tough sanctions.

Corporate Accountability in Consumer Data Privacy

Here are some steps a company should take to protect customer data:

  • Don’t collect more data than your stated purposes require. The more data you collect (and the more sensitive it is), the more you set yourself up as a target for hacking, and the more resources you have to expend to protect that data
  • Let customers know how their data will be used, shared and stored, and have them consent to specific uses (beyond a generic privacy policy acceptance); a brief sketch of what per-purpose consent tracking could look like follows this list. According to Accenture Interactive, 73% of consumers are willing to share more personal information with companies that are transparent about data usage
  • Vet the security architecture of business partners and all third parties, especially those with whom you share customer data. Moreover, if sharing is required, maintain visibility over how the third party uses the data
  • Upgrade your own security architecture. In this era, data is gold, and there are nefarious actors who desperately want what you have. In particular, stay alert to AI-based attacks, zero-day vulnerabilities and advanced persistent threats.
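
To make the consent point above concrete, here is a minimal sketch of how an application might record consent on a per-purpose basis rather than as a single blanket acceptance. The class, purpose names and customer identifier below are illustrative assumptions, not any particular vendor’s API or a complete consent-management system.

```python
# Minimal sketch of per-purpose consent tracking (illustrative only).
# Purpose names such as "ad_personalization" are assumptions for this example.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    """Tracks which specific data uses a customer has agreed to."""
    customer_id: str
    # Maps a purpose name to the time consent was granted (None = withdrawn).
    purposes: dict = field(default_factory=dict)

    def grant(self, purpose: str) -> None:
        """Record consent for one specific purpose, timestamped for auditing."""
        self.purposes[purpose] = datetime.now(timezone.utc)

    def withdraw(self, purpose: str) -> None:
        """Withdraw consent; keep the key so the withdrawal itself is auditable."""
        self.purposes[purpose] = None

    def allows(self, purpose: str) -> bool:
        """Check consent before using data for a given purpose."""
        return self.purposes.get(purpose) is not None


# Example usage: only process data for purposes the customer explicitly accepted.
record = ConsentRecord(customer_id="cust-123")
record.grant("order_fulfilment")
record.grant("ad_personalization")
record.withdraw("ad_personalization")

for purpose in ("order_fulfilment", "ad_personalization", "analytics"):
    print(purpose, "allowed" if record.allows(purpose) else "not allowed")
```

Recording a timestamp for each grant, and keeping withdrawn purposes in the record rather than deleting them, also leaves an audit trail that can help demonstrate compliance.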

In essence, companies should be more responsible in how they handle customer data, and customers should be more careful about the information they share with brands. Meanwhile, one would expect to see more data protection regulations springing up around the world. Europe has taken a major step with the GDPR, and that needs to be replicated in other countries and regions.
