
The Journey to Data Integrity

In 2017, ‘Fake News’ was crowned word of the year thanks in part to a deteriorating relationship between politicians and the media. Claims and counterclaims could be challenged without the need to provide evidence to back up assertions. Photographs and videos used out of context told stories that, once released through social media and on to the internet, became ‘fact’ within seconds. 

At the same time, accelerating AI research showed a frightening advancement in video manipulation, with celebrities shown giving speeches they had never made. This raises the question of how we can verify the authenticity of the things we see and hear. How can we ensure that the information we receive has maintained its integrity from its source throughout its lifecycle? How can this integrity be conveyed in a timely manner so that judgements are made on facts? Data trust takes time to build but only seconds to destroy.
 
This is not just a problem for the media. Corporations face the same challenges, and they are having an impact on security teams. How do I verify the identity of a user looking to access the network? How do I ensure that the payment details of this invoice or transaction are correct? Do I have enough confidence in this data to make informed business decisions? While these are not necessarily new concerns, in organizations where resources are scarce, risk decisions are taken to prioritize other risks. In recent times the focus has been on protecting the confidentiality and availability of data rather than its integrity.

It’s not a surprise. Data breach reports highlight how many organizations do a poor job of keeping their customer data secure and of course keeping the business functioning is non-negotiable. However, the increasing regulatory pressures and emerging stories of victims being targeted in data integrity hacks, which typically go unnoticed until it’s too late, mean there is an urgent need for teams to review their threat models and place more emphasis on data integrity. But where to start?
 
Firstly, let’s make a distinction between ‘Data Integrity’ and ‘Data Security.’ Data integrity refers to the validity and accuracy of the data throughout its lifecycle, while data security refers to the protection of data against unauthorized access or corruption. Clearly there is a strong link between the two as good data security should reduce the risk of a data integrity issue.
 
Threats to data integrity commonly include:

  • Human error
  • Data transfer
  • Software bugs
  • Hardware malfunctions, such as read, write or storage errors, or in more extreme cases fire or natural disasters
  • Cyber threats, such as malware, malicious insiders or unauthorized access by external actors

The first challenge facing organizations is a lack of visibility into what data is being collected or transferred, where that data resides and how much of it requires protecting. A legacy of over-provisioning access to data, and a lack of control over how it was shared, means this is no easy task. Trying to tackle all of it at once leads to the old cliché of trying to boil the ocean. Consider where your biggest risks lie and tackle them systematically, in order of priority: is the driver regulatory, a set of audit findings or business-critical systems?

Like many problems in infosec, we are not starting from a clean slate. Applying equal levels of data security across the whole organization is not a viable strategy.

Effectively managing the risk of the above threats can broadly be broken down into the following themes.
 
Cleaning & Backup
Sensitive data has a habit of escaping from its 'secure' location and ending up on file shares, personal laptops and email inboxes. In most cases there is no malicious intent behind this activity; it can often be traced to attempts to increase productivity.

However, the loss of control means that data could be exposed to employees who shouldn't have access or reporting and analysis could be inconsistent with the true source leading to bad or ill-informed decision making. Finding and de-duplicating this data can be tricky. Small variations in files can mean that exact matching methods fail - especially in large organizations. Consider using a data amnesty so that employees can submit data they have or know about without fear of reprisal.
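As a minimal sketch of why exact-matching methods break down, the following groups files by a SHA-256 content hash (helper names are illustrative, not a specific tool's API). Byte-identical copies are caught, but a file that differs by even a single character hashes to an entirely different value and escapes the net, which is why large organizations often need fuzzier techniques on top.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def find_exact_duplicates(root: Path) -> dict[str, list[Path]]:
    """Group files under root by content hash; groups of more than one are duplicates."""
    groups: dict[str, list[Path]] = {}
    for p in root.rglob("*"):
        if p.is_file():
            groups.setdefault(sha256_of(p), []).append(p)
    return {digest: paths for digest, paths in groups.items() if len(paths) > 1}
```

Run against a file share, this flags only perfect copies; a spreadsheet with one edited cell would appear as a distinct file and still needs a human, or a similarity measure, to reconcile it with the true source.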
 
A critical part of maintaining data integrity is having effective data backups. If data corruption or loss occurs (by any of the above means), having an accessible and up-to-date backup is essential to recovering business operations, restoring a known good state and re-establishing trust in the data. The frequency and accessibility of backups should be determined by the organization's risk appetite.
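One way to confirm a backup still represents a known good state before restoring from it is to record a checksum manifest at backup time and re-verify it later. A hedged sketch, assuming a simple directory-per-backup layout (function names and the JSON manifest format are illustrative):

```python
import hashlib
import json
from pathlib import Path


def record_manifest(backup_dir: Path, manifest: Path) -> None:
    """Store a SHA-256 digest for every file in the backup, keyed by relative path."""
    digests = {
        str(p.relative_to(backup_dir)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in backup_dir.rglob("*")
        if p.is_file()
    }
    manifest.write_text(json.dumps(digests, indent=2))


def verify_backup(backup_dir: Path, manifest: Path) -> list[str]:
    """Return the relative paths whose current contents no longer match the manifest.

    Note: a file deleted from the backup would raise here; a production tool
    would also report missing files rather than fail.
    """
    digests = json.loads(manifest.read_text())
    return [
        rel for rel, digest in digests.items()
        if hashlib.sha256((backup_dir / rel).read_bytes()).hexdigest() != digest
    ]
```

The manifest itself should live somewhere the backup process cannot overwrite, otherwise an attacker (or a bug) that corrupts the backup can simply regenerate the manifest to match.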
 
Access Control
While encryption at rest offers some protection from those without approved access, the levels of privileged access to critical systems and data in many organizations pose a greater risk. This can be especially problematic if an outsider manages to obtain the credentials of a legitimate user and use them to directly affect the integrity of systems and data.
 
Implementing a least privilege model can be challenging for organizations. For a start, the asset inventory is often incomplete, and everything or nothing could be labelled as critical, so knowing what should be under management is difficult. Secondly, removing privilege is like snipping the wires on a bomb: one wrong move can have detrimental effects on the business at critical moments. Removing the wrong privilege could mean millions of pounds in lost business because no one has the right access to bring a system back up.
 
At any given point, a security team should be able to say WHO has access to WHAT, WHY they need that access, and WHEN and HOW they are using it. This is essential when securing business-critical systems.
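The WHO/WHAT/WHY/WHEN/HOW questions map naturally onto a simple access-grant record that can be reviewed periodically. As an illustrative sketch (the field names and the 90-day idle window are assumptions, not a standard):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class AccessGrant:
    """One answer to WHO has access to WHAT, WHY, and since WHEN."""
    who: str                   # user or service account
    what: str                  # system or dataset
    why: str                   # business justification on record
    granted: date              # when access was approved
    last_used: Optional[date]  # HOW recently it was exercised (None = never)


def stale_grants(grants: list[AccessGrant], today: date,
                 max_idle_days: int = 90) -> list[AccessGrant]:
    """Flag grants that were never used, or not used within the idle window.

    These are candidates for review and removal under a least privilege model.
    """
    return [
        g for g in grants
        if g.last_used is None or (today - g.last_used).days > max_idle_days
    ]
```

Surfacing unused grants for review, rather than revoking them automatically, sidesteps the "snipping the wires" problem above: a human confirms the access really is dead before it is cut.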
 
Validation Methods
Much of what's been discussed so far relates to reducing the risk to data at rest. Data in motion poses additional challenges. When receiving data from a source (known or unknown) via email, an application or database, a physical transfer or any other method, how can you verify that the transfer was complete and that the data you received is the same as that which left the sending system? And how can you validate that the resulting data is authentic (i.e. that the bank account details for the transfer are indeed those of the intended recipient)? A combination of technical checks and business processes is needed to ensure that the integrity of data is maintained throughout its lifecycle.
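As one concrete technical check: a plain checksum lets the receiver confirm a transfer arrived complete and unaltered in transit, but it says nothing about authenticity, since anyone who can tamper with the data can recompute the hash. A keyed MAC over the payload, with the key shared out of band, covers both. A minimal sketch using Python's standard hmac module (function names are illustrative):

```python
import hashlib
import hmac


def sign_payload(payload: bytes, key: bytes) -> str:
    """Sender side: compute an HMAC-SHA256 tag over the payload."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()


def verify_payload(payload: bytes, tag: str, key: bytes) -> bool:
    """Receiver side: recompute the tag and compare in constant time.

    A mismatch means the payload was corrupted, tampered with, or
    produced by someone who does not hold the shared key.
    """
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

Even then, the tag only proves the sender knew the key. Confirming that the bank account details themselves belong to the intended recipient still requires a business-process control, such as verifying the details through a separate, previously established channel.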

Data, and the insight it can unlock, empowers today's businesses to move faster and make better, more informed decisions, but small interruptions to this process can have significant effects. In the same way that we are starting to question whether what we see and hear on the internet is authentic and can be trusted, organizations need to embrace the right checks and balances to reduce the risk that a data integrity attack has a lasting impact on their business.
