Is the Human Factor Overlooked in Cybersecurity?

Cybersecurity has long been a fast-moving field, but the accelerating pace of innovation is a double-edged sword: as cybersecurity best practice evolves, so too do the technology and sophistication of attacks.

Ransomware attacks dominated the news in 2023 and continued to do so in 2024, driven by high-profile cyber-attacks on large organizations such as MGM Resorts, whose September 2023 breach impacted the company's third-quarter financial results by $100m.

As eye-catching as incidents like these are, there is only so much they can tell us about cybersecurity best practice. Every organization has different priorities, structures and critical assets, which affects not only what cybersecurity measures should be put in place but, more importantly, how those measures may come under attack.

The most predictable thing about hackers is their unpredictability: they are highly dynamic and will target a business's weak points whenever possible.

No two organizations are the same, and the same applies to their potential vulnerabilities. Considering that the average cost of a data breach is $4.45m and the average time to identify and contain one is 277 days, putting bespoke security measures in place should be an essential part of any business's operations.

The Human Factor

Most companies are now aware that they need appropriate cybersecurity measures in place, but there may be a perception gap around what constitutes effective security. For example, while it is best practice to invest in a managed Security Information and Event Management (SIEM) service for better detection, analysis and response to security events and threats, the service needs to be configured appropriately, with well-tuned analytics engines, to generate effective results and insights.

The same applies to managed threat detection and response offerings, otherwise known as Security Operations Center (SOC) as a Service. Though these services are often built around automation-supported tooling, they require careful calibration to ensure the right notifications and potential issues are being flagged to appropriate stakeholders.
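
To illustrate what that calibration can look like in practice, the sketch below shows a simple routing rule that maps alert severity and asset criticality to the stakeholder who should be notified. The severity levels, asset tiers and notification targets are hypothetical assumptions for illustration, not any particular vendor's configuration.

```python
# Illustrative alert-routing sketch: map alert severity and asset criticality
# to the stakeholder who should be notified. Names and thresholds are
# hypothetical examples of the calibration decisions an organization must make.

SEVERITY_ORDER = {"low": 0, "medium": 1, "high": 2, "critical": 3}

# Hypothetical routing table: (minimum severity, asset tier) -> notification target
ROUTING = {
    ("critical", "crown-jewel"): "incident-response-lead",
    ("high", "crown-jewel"): "soc-analyst-on-call",
    ("high", "standard"): "soc-queue",
    ("medium", "standard"): "daily-digest",
}

def route_alert(severity: str, asset_tier: str) -> str:
    """Return the notification target for an alert, defaulting to a low-priority queue."""
    for (min_severity, tier), target in ROUTING.items():
        if tier == asset_tier and SEVERITY_ORDER[severity] >= SEVERITY_ORDER[min_severity]:
            return target
    return "low-priority-review"  # still recorded, but nobody is paged

print(route_alert("high", "crown-jewel"))   # -> soc-analyst-on-call
print(route_alert("medium", "standard"))    # -> daily-digest
print(route_alert("low", "standard"))       # -> low-priority-review
```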

Regardless of whether organizations choose to insource or outsource their security monitoring capability, continuous and effective human input is essential. Left unchecked, automated security monitoring can swamp stakeholders with meaningless event notifications, devoid of context or suggested next steps.

Without security teams applying critical thinking across the organization's environment and IT systems, these indexed events cannot be effectively tracked, recorded and assessed for threat level. If that assessment cannot be made, the continuous assurance model that underpins security monitoring no longer holds.

But emphasizing the human factor does not mean that tools such as automation, machine learning and AI have no place in security monitoring and incident response; on the contrary, they are vital for managing analyst workloads.

However, the level of automation an organization adopts varies and depends largely on the maturity of its cybersecurity practices. For businesses taking their first steps with the technology, only high-confidence processes and extremely common events should be automated or receive an automatic response.
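
As a rough sketch of how that principle might be encoded, the snippet below automates a response only when a detection's confidence is high and the event type appears on an explicitly approved list of very common events; everything else is deferred to a human analyst. The threshold and event names are assumptions chosen purely for illustration.

```python
# Conservative automation gate for a low-maturity organization: only
# high-confidence detections of explicitly approved, very common event types
# trigger an automatic response; everything else goes to an analyst.
# The threshold and event names are illustrative assumptions.

CONFIDENCE_THRESHOLD = 0.95          # hypothetical cut-off for "high confidence"
AUTO_APPROVED_EVENTS = {             # hypothetical list of extremely common events
    "known_malware_hash_on_endpoint",
    "blocklisted_ip_connection",
}

def decide_action(event_type: str, confidence: float) -> str:
    if event_type in AUTO_APPROVED_EVENTS and confidence >= CONFIDENCE_THRESHOLD:
        return "auto_contain"        # e.g. isolate the host or block the IP
    return "escalate_to_analyst"     # human judgment required

print(decide_action("known_malware_hash_on_endpoint", 0.99))  # auto_contain
print(decide_action("unusual_admin_logon", 0.99))             # escalate_to_analyst
```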

Yet how these processes and events are defined may also differ from company to company. This variation further emphasizes the importance of the human factor when establishing effective cybersecurity protocols, including what constitutes a more complex event that may require human intervention to resolve.

These distinctions should be established as early as possible when setting up a company's security monitoring capabilities, preferably on day one. From then on, any incident response plans should be updated on an ongoing basis.

Because every organization has its own priorities and nuances, a one-size-fits-all solution does not exist. Cybersecurity plans should reflect this, with cybersecurity expertise applied where necessary to adapt existing practices to changing circumstances.

How to Integrate Expertise

Demand for cybersecurity skills and talent continues to grow in response to the rising volume of cyber threats, and many businesses simply lack the appropriate expertise or the availability of key personnel, leaving their systems vulnerable.

Leveraging third-party expertise to work closely within existing organizational structures can provide critical thinking, interpretation of historical business data and hands-on remediation. Bringing these data-driven insights to life is crucial to continuous assurance and to effective incident prevention and response.

Looking for consultancies with accreditations from independent cybersecurity bodies such as CREST can help distinguish the best possible service, and whether the company in question offers red teaming services provides another benchmark of quality. Red teaming is a covert assessment that allows organizations to simulate and, crucially, react to real adversaries, and this sort of in-the-field experience may prove more valuable in incident response than a focus on tools and software alone.

With expert support, events flagged from networks, infrastructure, applications, end-user interaction, cloud environments or identity and access management software can be more easily categorized depending on their urgency.

Using this method, what could potentially be millions of events can be whittled down to thousands of alarms, hundreds of cases that may need investigation and, finally, tens of actual, identified incidents requiring intervention. A funnel-based approach, informed by expertise, provides a clear way to cut through the noise and identify where possible threats lie.
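
A minimal sketch of such a funnel is shown below. The stage criteria (severity scores, correlation counts and analyst confirmation) are hypothetical stand-ins for the expert-defined rules a real monitoring service would apply.

```python
# Illustrative triage funnel: raw events are progressively narrowed down to
# alarms, cases and, finally, confirmed incidents. The filtering criteria are
# hypothetical placeholders for expert-defined rules.

from dataclasses import dataclass

@dataclass
class Event:
    severity: int          # 0-100 score from detection tooling
    correlated_hits: int   # number of related events seen together
    analyst_confirmed: bool = False

def triage(events: list[Event]) -> dict[str, int]:
    alarms = [e for e in events if e.severity >= 40]        # drop routine noise
    cases = [e for e in alarms if e.correlated_hits >= 3]   # worth investigating
    incidents = [e for e in cases if e.analyst_confirmed]   # human-verified
    return {"events": len(events), "alarms": len(alarms),
            "cases": len(cases), "incidents": len(incidents)}

sample = ([Event(severity=25, correlated_hits=0)] * 1000
          + [Event(severity=60, correlated_hits=1)] * 50
          + [Event(severity=80, correlated_hits=5)] * 9
          + [Event(severity=95, correlated_hits=8, analyst_confirmed=True)])
print(triage(sample))  # {'events': 1060, 'alarms': 60, 'cases': 10, 'incidents': 1}
```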

An Informed Response

Detection is only part of the incident remediation process; response is vital, and it is often lacking when cybersecurity relies on technology and automation alone. A common issue is that while response time is a vitally important metric when incidents occur, it is not always clear what a 'response' involves.

Sometimes it can simply mean the issue has been detected, which may be insufficient for businesses lacking the in-house expertise to react swiftly to potential data breaches. Instead, they may be left scrambling for quick fixes for an urgent issue, which they may not be able to fully resolve. For cybersecurity provision to be truly effective, a holistic, joined-up approach is required that includes incident preparation, eradication and remediation.

Incident response plans are a pressure point for many organizations: they are often written once and never updated afterwards to account for new technologies, or for important contacts leaving, joining or moving within the company.

Without this key information, resolving potential incidents becomes like trying to navigate without a map, which underlines why the human factor remains so crucial to cybersecurity. It is therefore pressing that the plan is updated yearly as a matter of course to avoid these pitfalls.

This sort of human input is vital to all areas of cybersecurity practice and should not be overlooked as technology marches on. Where cybersecurity is concerned, a once-a-year snapshot of risk is no longer enough. Organizations need continuous assurance that moves at the same pace as the risk landscape, provided by experts well-versed in the global trends, including ESG, cybersecurity and supply chain integrity, that are defining the new era of risk management: Assurance 4.0.

So while software and automation tools can streamline processes and should be used where appropriate, they are not a silver bullet. Instead, cybersecurity consultancy should work in tandem with these digital services to achieve the desired result: a secure IT environment.
