Why Businesses Need to Re-Think Network Monitoring in 2020

According to recent research, more than half of British businesses found themselves on the receiving end of a cyber-attack in 2019. Further, Cybersecurity Ventures predicts that there will be an attack on a business every 14 seconds by the end of this year.

The figures are remarkable, but not necessarily unexpected. In fact, they’re part of a wider pattern that has seen cyber-crime rising steadily over the past few years, and if this trend continues, businesses will face an even higher attack cadence and intensity in 2020.

The techniques employed by cyber-criminals to mount attacks are also growing in scale and sophistication. This is an intimidating prospect for businesses: breaches can pose an existential threat, and the reputational damage associated with a cyber-attack can stick with a company for some time, as has been the case for the likes of Capital One, Marriott and Cathay Pacific.
 
Anticipating the next cyber-attack is no easy task and is made even more challenging by the exponential growth of the Internet of Things (IoT), which has vastly increased the number of network entry points a cyber-criminal can exploit. Each device also contributes to the overall network traffic, creating a huge amount of data that security teams need to sift through. 

Greater investment in cybersecurity tools is positive, but security analysts are also more overworked than ever, and attacks are still slipping through the net. So, how should businesses approach network monitoring practices to ensure their critical data remains secure in 2020?

Where are current systems falling down?
Where are current systems falling down?
In the face of increasingly complex cyber-threats, traditional network security is falling short. Conventional approaches to network security rely on legacy, supervised machine learning methods to detect possible attacks. This involves feeding as many samples of known malicious data as possible into a learning algorithm, so that it can profile those threats and alert businesses to future instances. This approach is significantly flawed on two counts.

First, because these systems rely on pre-existing samples to detect malicious activity, they are inherently biased. Cyber-threats are constantly changing shape, but traditional systems are ill-equipped to deal with this because they cannot identify threats beyond those they’re already familiar with.
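
To make this concrete, here is a minimal sketch of the sample-driven approach in Python. Everything in it is an assumption made for illustration: the flow features, the numbers and the model choice are invented, not taken from any real product.

# Minimal sketch of a detector trained only on known samples.
# All features and data below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy flow features: bytes/sec, packets/sec, distinct ports contacted.
benign = rng.normal(loc=[500, 50, 3], scale=[100, 10, 1], size=(1000, 3))
known_bad = rng.normal(loc=[5000, 400, 40], scale=[500, 50, 5], size=(1000, 3))

X = np.vstack([benign, known_bad])
y = np.array([0] * 1000 + [1] * 1000)  # 0 = benign, 1 = malicious

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A stealthy, never-before-seen attack that mimics normal traffic falls
# squarely inside the "benign" region of the training data, so the
# detector waves it through with high confidence.
novel_attack = np.array([[450, 45, 2]])
print(clf.predict_proba(novel_attack))  # high benign probability

The model can only recognize what resembles its training samples; anything genuinely novel is, by construction, invisible to it.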

In today’s ever-changing cybersecurity landscape, there’s always a new threat around the corner, which makes dealing with the unknown an essential part of protecting business-critical data. Relying on legacy systems means enterprises are forced to take a reactive, rather than proactive, approach to cybersecurity.

Second, because security professionals understand the limitations of legacy systems, the systems themselves are calibrated with excessive caution. In the process of flagging behaviors it believes to be malicious, such a system also triggers a high proportion of false positives, as many of the behaviors exhibited by threats can also occur as part of normal activity.

This means security analysts must spend time investigating each suspicious incident to determine whether it’s a genuine threat or not. The automated nature of the system is made redundant by the need for human micro-management.
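
Some back-of-the-envelope arithmetic shows how quickly this burden grows. Every number below is an assumption chosen for illustration, not a figure from the research cited above.

# Illustrative base-rate arithmetic; all numbers here are assumed.
daily_events = 1_000_000  # network events inspected per day
attack_rate = 1e-5        # fraction of events that are truly malicious
tpr, fpr = 0.99, 0.02     # a cautious detector: misses little, flags a lot

true_attacks = daily_events * attack_rate
alerts_true = true_attacks * tpr
alerts_false = (daily_events - true_attacks) * fpr

precision = alerts_true / (alerts_true + alerts_false)
print(f"alerts per day: {alerts_true + alerts_false:,.0f}")
print(f"share that are genuine threats: {precision:.2%}")
# Roughly 20,000 alerts a day, of which about 0.05% are genuine:
# a queue no analyst team can realistically clear by hand.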

Re-thinking network monitoring in 2020
If businesses hope to combat the threat posed by cybercrime in 2020, they will need to refresh their approach to network monitoring. First and foremost, this means moving away from biased, legacy systems that rely on historical samples of malicious data to detect threats.

Businesses should leave behind their preoccupation with the known bad, and look to hunt out the abnormal. In order to do so, they’ll need to create a panoramic picture of what’s normal.

Unsupervised, deep learning-powered algorithms can be trained to continuously analyze an organization’s regular behavior. Using that baseline understanding, the algorithms can accurately detect abnormalities, as well as concealed threats that could otherwise pass as normal traffic.
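
The article does not name a specific technique, but one common way to realize this idea is an autoencoder trained only on a baseline of routine traffic, with new events scored by reconstruction error. The PyTorch sketch below uses invented features and an assumed 99th-percentile threshold purely for illustration.

# Learn the normal, flag the abnormal: an autoencoder trained only on
# routine traffic, scoring events by reconstruction error. Sketch only;
# all features and thresholds are invented for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Baseline period: routine traffic only (e.g. scaled flow features).
normal_traffic = torch.randn(5000, 8) * 0.5 + 1.0

model = nn.Sequential(
    nn.Linear(8, 4), nn.ReLU(),
    nn.Linear(4, 2), nn.ReLU(),  # bottleneck forces a compact profile of "normal"
    nn.Linear(2, 4), nn.ReLU(),
    nn.Linear(4, 8),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(200):  # train the model to reconstruct normal behavior
    opt.zero_grad()
    loss = loss_fn(model(normal_traffic), normal_traffic)
    loss.backward()
    opt.step()

with torch.no_grad():
    # Threshold: the 99th percentile of reconstruction error on the baseline.
    base_err = ((model(normal_traffic) - normal_traffic) ** 2).mean(dim=1)
    threshold = base_err.quantile(0.99)

    # An event far from the learned baseline exceeds the threshold.
    odd_event = torch.full((1, 8), 5.0)
    err = ((model(odd_event) - odd_event) ** 2).mean(dim=1)
    print(bool(err > threshold))  # True: flagged as abnormal

The choice of an autoencoder here is itself an assumption; isolation forests or clustering methods could fill the same role of modeling the normal and scoring deviation from it.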

This approach also takes into account the natural variance amongst organizations and in the ways they operate. The learning process creates a bespoke detection algorithm for each individual company, which means the bias associated with legacy systems is eliminated and false positives are reduced.

Another benefit is that deep learning algorithms can sift through millions of pieces of data simultaneously, in real time, performing a level of analysis impossible for traditional machine learning or humans alone. As a result, overall efficiency and security are boosted, and analysts are empowered to focus on the most rewarding part of their job: the detection and investigation of complex malicious activities.

In light of the increasing ferocity of cyber-attacks, it’s clear that businesses need to re-think their approach to network monitoring. This means embracing deep learning techniques capable of keeping pace with the rapid evolution of tactics deployed by the most advanced cyber-criminals.

Focusing on searching for the abnormal, as opposed to the known bad, ensures businesses are protected from unknown threats. Cyber-criminals attempting to blend in amongst network traffic are exposed, and security analysts are freed up to focus on genuine threats.
