#RSAC: Open-Source Software is a Public Health Hazard

Open-source software is cost-effective (in theory), easily accessible and represents a known development quantity that allows the pace of application innovation to accelerate. In fact, open-source components are found in 90% of software applications on the market today. There’s one problem: open source is also a vast, largely unpatched quagmire of cyber-risk, and it’s putting public safety in jeopardy.

That’s the assessment of Joshua Corman, CTO at Sonatype, who took to the stage at RSA Conference 2015 to characterize insecure software as a kind of “cyber-asbestos”: widely deployed, inherently dangerous, and eventually carrying an astronomical cleanup cost and toll in human suffering, because we simply didn’t know how dangerous it was when we embraced it.

“Insecure software impacts public safety, human life and national security, but we haven’t matured enough to address it as such,” Corman said.

An effective, defensible infrastructure is capable of supporting operational excellence, situational awareness and the ability to launch countermeasures against attackers. But it’s impossible to have a secure foundation if the building blocks are flawed. And flawed they are, Corman said.

Put simply, 90% of the software and hardware we build, buy and deploy is assembled from third-party elements and open-source projects. And that 90% is riddled with known vulnerabilities and flaws that, more often than not, have not been patched. In fact, only 41% of security defects in open-source projects are ever fixed, he said, and the average time to remediation for a known vulnerability is 390 days (the median is 265).

But unfortunately, even post-Heartbleed and post-Shellshock, there’s a startling lack of awareness of the dangers.

“In 2014 there were 17 billion unique downloads of open-source—our dependence on it is going up,” Corman said. And that’s a problem. “Black hats used to research a flaw in a custom website for a bank, exploit it, and that’s it. It stopped with that bank. But if they can attack OpenSSL or the Bash shell, of course they’re going to get a higher return. We have pervasive dependence. It’s one easy target.”

He pointed to a Bouncy Castle vulnerability to illustrate the problem. The Java crypto API is widely deployed, including in critical infrastructure and by government defense contractors. In 2009, a critical vulnerability in it was found to have been exploited, and it was promptly patched. But, Corman noted, because the broken version is still available for download, 4,000 organizations made exactly that mistake in 2013, downloading it and then deploying it 20,000 times across an even greater number of applications, years after the vulnerability had been fixed.

And Bouncy Castle is not alone. A full 97% of last year’s exploits tracked back to just 10 CVEs, eight of which have had patched versions available for years.

“Current approaches are not working,” Corman said. “There are countless instances of companies constructing supposedly secure things with insecure building blocks, even though there are fixed versions available.”

Part of that is a lack of policy (70% of organizations have no security policy governing downloads, he noted), and part is a lack of traceability: it’s almost impossible to reconstruct the software supply chain without great expense. An organization may buy a software package from a top vendor, but, much like in the automotive supply chain, that vendor doesn’t build all of its own components. In fact, it is very likely to have sourced the majority of its building blocks from the insecure open-source community.

“It’s risk on top of risk on top of risk,” Corman said. “We can no longer size or manage our IT risk because we have no idea what’s going on inside the software.”

To address the visibility problem, U.S. Representatives Ed Royce (R-CA) and Lynn Jenkins (R-KS) have introduced H.R. 5793, the “Cyber Supply Chain Management and Transparency Act of 2014.” The legislation would require anyone supplying software, firmware or products to the federal government to provide the procuring agency with a bill of materials of all third-party and open-source components used, and to demonstrate that those component versions have no known vulnerabilities. And, given that future vulnerabilities are inevitable, it would also mandate that software applications be patchable.
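The bill-of-materials requirement is straightforward to illustrate: a supplier enumerates every third-party and open-source component and its version, and that list is checked against a feed of known vulnerabilities. The sketch below is a toy illustration of that check, not anything prescribed by the bill; the component names, versions and advisory identifiers are hypothetical stand-ins for a real CVE/NVD feed.

```python
# Illustrative sketch only: a toy "bill of materials" audit against a
# stand-in list of known-vulnerable component versions. The components,
# versions and advisory IDs below are hypothetical; a real check would
# consume a CVE/NVD feed or the output of a dependency-audit tool.

# Known-vulnerable (component, version) pairs mapped to advisories.
KNOWN_VULNERABLE = {
    ("example-crypto-lib", "1.45"): ["CVE-XXXX-0001 (hypothetical)"],
    ("example-web-framework", "2.3.0"): ["CVE-XXXX-0002 (hypothetical)"],
}

# A minimal bill of materials: (component, version) pairs declared by a supplier.
bill_of_materials = [
    ("example-crypto-lib", "1.45"),
    ("example-web-framework", "2.5.1"),
    ("example-logging-lib", "1.2.17"),
]

def audit(bom):
    """Return BOM entries whose exact version appears in the known-vulnerable list."""
    findings = []
    for component, version in bom:
        advisories = KNOWN_VULNERABLE.get((component, version))
        if advisories:
            findings.append((component, version, advisories))
    return findings

if __name__ == "__main__":
    for component, version, advisories in audit(bill_of_materials):
        print(f"{component} {version}: known vulnerabilities {advisories}")
```

In practice the vulnerability data would come from a source such as the National Vulnerability Database or a dependency-audit tool, and the bill of materials would be generated by the build system rather than written by hand.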

"As a house is only as strong as its foundation, it's no wonder cyber-attacks are on the rise with reports showing 71% of software contains components with critical vulnerabilities," said Royce in announcing the bill. “This bill protects our nation's cyber infrastructure by ensuring the building blocks that make it up are secure and uncompromised.”

“The software that runs our critical infrastructure should not contain known security defects,” Corman said. “I don’t think it could get more simple. But you would not believe the controversy—the response was unprecedented. The software industry hates the idea. They say it’s too expensive and too onerous to implement, and will cause a dramatic slowdown in innovation and raise the cost of goods.”

He added, “If we can’t even look to eliminate known vulnerabilities, forget zero-days. The software companies are passing that risk and cost downstream.”

Corman likened it to building a car with known safety defects and no intention of ever fixing them: it’s hard to imagine a world in which flaws that could kill you may be present, yet the public must simply accept both the risk and the fact that there is no way of determining whether or not the flaws are there in the first place.

“The adoption of technology is progressing further and faster than our ability to secure it,” Corman said. “To ensure connected technologies with the potential to impact public safety and human life, like cars and home automation and medical devices, are worthy of trust, infosecurity cannot live in a vacuum. It has to be the concern of auto engineers, trial lawyers, insurance companies, regulators and the public.”

There are real ramifications for not addressing the issue, which could play out much like medical class-action suits or lawsuits over dangerous automotive flaws. He added, “Trial lawyers will explore negligence. Those that have been breached for known vulnerabilities will be liable.”
