Version Upgrades and Security/Privacy Downgrades: A Cautionary Tale

It’s universally understood that sensitive personal information generally has to be protected from public disclosure. But how can installing a system upgrade suddenly make previously protected data public without anyone in the organization noticing the change?

According to CNN, it happened to the US Environmental Protection Agency (EPA) on July 9, 2018, when the EPA updated the software running the Freedom of Information Act Clearinghouse, a site that allows people to submit Freedom of Information Act (FOIA) requests to multiple agencies. The EPA provides the technology resources to operate and maintain the website (foiaonline.gov).

While the general rule is that FOIA requests are public records, prior practice was to refer each FOIA description to the relevant agency before making it public, so that lawfully protected information could be masked. But after the upgrade, the descriptions became fully viewable by anyone, with no consideration of legal privacy protections.
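To make the masking practice concrete, here is a minimal sketch of what such a redaction step might look like. The patterns and function name are illustrative assumptions, not the FOIA site's actual implementation:

```python
import re

# Hypothetical masking step: before a request description is published,
# substrings that look like protected personal data are redacted.
# Patterns below are illustrative, not the EPA's actual rules.
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),            # US Social Security numbers
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),      # email addresses
    re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),  # US phone numbers
]

def mask_description(text: str) -> str:
    """Replace sensitive-looking substrings with a redaction marker."""
    for pattern in SENSITIVE_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text
```

The point is that masking is an explicit, testable step in the publishing pipeline, which is exactly why its silent disappearance should have been detectable.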

CNN found that after the upgrade the previous data masking was no longer in place, and the EPA said it did not know this had happened until CNN told them.

Once notified, the EPA took immediate steps to remediate the problem, but the important message is this: organizations fielding new systems are generally cautious, conducting careful analysis and testing before a system goes into live production.

The objective, of course, is to catch errors as early as possible (ideally in the design stage) and fix them before the system goes live. In the EPA case, an important system function suddenly wasn’t there. To quote CNN, “When the website was switched from the 2.0 version to the 3.0 version on July 9, the masking feature for descriptions somehow ceased to exist.”

Somehow, upgrades, whether large or small, don’t receive the same attention as new systems, either from IT or from user management. Upgrades and changes feel easier and less likely to cause problems. That may be true, but they can still cause problems, many of which, perhaps most, are preventable, yet they don’t get prevented. What can be done to avoid problems in security and control after an upgrade?

First, there has to be a recognition on the part of both functional and technology managers that changes and upgrades can cause issues that affect privacy, security, internal controls and auditability. Functions can be added, deleted or changed. Data can be added, dropped or handled differently. Log files can be modified or sometimes terminated.

Put more simply, there has to be a recognition that upgrades can introduce mistakes, and that letting them slip through isn’t acceptable. Saying “if there’s a problem, someone will notice it, tell us, and we’ll fix it” wouldn’t be acceptable in other fields of work, and it shouldn’t be accepted here.

Next, the process for making changes should be as rigorous as the one used for initial development of a system. There will, of course, be an existing functional specification. In an upgrade, the focus is on the changes to be made, both so that they can be reflected properly in the updated specification and, equally important, so that any effect the changes may have on other parts of the system can be evaluated.

If a function is to be added, removed or modified, there should be a documented sign-off verifying that the responsible subject matter manager acknowledges the change. There should be testing to determine both that the changes were made, and that the unchanged parts of the system are actually operating without changes. For any major version upgrades, the testing should include subject matter experts.
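The testing principle above, verifying both that the changes were made and that unchanged behavior survived, is what regression tests are for. Here is a minimal, self-contained sketch; the function names and masking logic are hypothetical stand-ins, not the FOIA site's actual code:

```python
import re

def publish_description(text: str, mask: bool = True) -> str:
    """Simplified stand-in for a publishing pipeline step (hypothetical).

    Imagine an upgrade accidentally shipping with masking disabled:
    this is the kind of silent behavioral change a regression test pins down.
    """
    ssn = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
    return ssn.sub("[REDACTED]", text) if mask else text

def test_masking_survives_upgrade():
    # Regression test: asserts on *existing* behavior, so an upgrade
    # that drops the masking feature fails here instead of going live.
    published = publish_description("SSN 123-45-6789 in request text")
    assert "123-45-6789" not in published, "raw SSN leaked into public output"
    assert "[REDACTED]" in published
```

A test like this, run automatically against every new version (for example under pytest), would have flagged the missing masking feature before the 3.0 release went live.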

Everyone understands that resources are always limited and that regardless of the processes followed, mistakes can – and will – happen. I believe that those organizations running systems that capture, process and use sensitive personal information, subject to both security and privacy requirements, have a special obligation to those whose data they process.

Both development and change processes must evaluate data sensitivity and make sure that new or changed systems will provide the level of control and privacy protection required by laws (like HIPAA or GDPR) and regulations, and by the organization’s own privacy practices. If a new or upgraded system does not meet this standard, how can it justifiably be put into production?

Getting upgrades right from a security and privacy viewpoint will never be a perfect process. We all deal with deadline pressures, resource challenges and new issues coming in faster than older ones can be resolved.

It seems to me that if we focus on making sure that we’re not losing track of privacy, security and compliance, we have a better chance of getting it right, or at least of powerfully mitigating the kinds of issues that the EPA case illustrates. 
