Majority of Global 2000 Companies Have ‘Blind Spots’

The majority of Global 2000 companies have ‘blind spots’ in their networks that are not properly analyzed, according to a new survey by ForeScout.

The survey of 400 IT and security professionals found that almost three quarters (72%) of those polled admitted to experiencing five or more network-based incidents in the last year, with firewall, vulnerability assessment and advanced threat defence products suffering the most from blind spots.

“In today’s distributed enterprise, creating a truly secure network, whether managed or unmanaged, requires instant visibility into the devices that are connecting to it, paired with an ability to automate threat responses,” said Rob Greer, CMO and SVP of Products at ForeScout.

“Vulnerable entry points are widespread, and the rise of Internet of Things devices and mobile computing is only increasing the security attack surface. Automation can help security teams orchestrate their technologies to help eliminate network blind spots – giving them true visibility and actionability into their connected devices.”

Interestingly, the survey also found that IT professionals are unanimously calling for more pre-determined security controls within network security technology. With most companies claiming to be understaffed, respondents felt automated security would help save critical resources. IPS (65%), firewalls (63%) and anti-virus (63%) were the three leading technologies for which respondents wanted security controls to be automatically invoked.

Steve Durbin, Managing Director of the Information Security Forum, believes that whilst technology does and will continue to serve an important role in maximizing efficiency and effectiveness within companies, overdependence on automated security risks allowing organizations to “skimp on the people skills and processes required to ensure that technology is performing in the manner we expect.”

“Organizations increasingly use algorithms to operate and make decisions in critical systems, removing the need for individuals who previously filled these roles. As a result, organizations have less visibility into how their systems function and interact, and this lack of transparency can pose significant information security risks. These risks will be revealed as interactions between algorithms create incidents that result in significant disruption,” he added.

“So whilst on the face of it, technology could well solve some of our current resource challenges, it is also not without its own new challenges which may be equally if not more damaging in the long run.”

Listen to a live webinar on this report on Thursday 10th March at 3pm GMT.