What does DevOps do in 2018?

In 2018, we expect DevOps to become the norm for larger enterprise teams. Developers working on older, higher-value systems are likely to adopt a more DevOps-centric approach, having seen it work on projects that were highly visible but low value.

In big enterprises, DevOps practices have often been trialed on projects with low business impact. In banking, for example, teams have used DevOps practices for web or mobile apps, but not for their high-frequency trading platforms. Having proven the approach on new projects, these organizations will increasingly apply it to their more valuable systems.

We will also see increased adoption of DevOps in regulated industries, such as financial services, healthcare and life sciences, which have so far lagged behind in DevOps adoption. DevOps and cloud computing have evolved hand in hand for some time now, and until recently the regulated industries have been reluctant to deploy onto public cloud infrastructure because of concerns about the security of their data.

This has begun to turn around. Cloud vendors have worked hard on security, certifying their platforms for use under various compliance and regulatory regimes, and we’ll see increased adoption in 2018.

Traditionally, the regulated industries have also been wary of anything that increases the throughput of system change, as many of their change control processes are paper-based relics from the past. We’ve seen adoption of leaner change control strategies, underpinned by automation and continuous delivery techniques, and expect to see more of this over the next year.
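To make the idea concrete, a leaner, automated change control process replaces the paper form with machine-checkable evidence gathered by the pipeline itself. The sketch below is illustrative only (the class and function names are our own, not from any specific tool): a deployment is allowed only when every control has been satisfied.

```python
from dataclasses import dataclass


@dataclass
class ChangeEvidence:
    """Evidence a continuous delivery pipeline collects for one change."""
    tests_passed: bool         # CI test suite succeeded
    ticket_approved: bool      # change ticket signed off in the tracker
    security_scan_clean: bool  # no known vulnerabilities reported


def may_deploy(evidence: ChangeEvidence) -> bool:
    """Automated change gate: deploy only if every control is satisfied."""
    return all((
        evidence.tests_passed,
        evidence.ticket_approved,
        evidence.security_scan_clean,
    ))


print(may_deploy(ChangeEvidence(True, True, True)))   # → True
print(may_deploy(ChangeEvidence(True, False, True)))  # → False
```

Because every check is recorded in code, the pipeline produces an audit trail automatically, which is exactly what regulators want from a change control process.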

We will also witness DevSecOps gain more ground, as cloud and tool vendors focus more heavily on the building blocks they provide for good security standards.

DevSecOps encourages a “shift left” approach to security, in which applications and their dependencies are continuously scanned for vulnerabilities as part of the continuous delivery pipeline. This isn’t yet standard practice for many teams, however, and many developers still don’t fully understand role-based access control, the principle of least privilege, or how to handle secrets securely.
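Secure secrets handling is the most common of these gaps. A minimal sketch (the function and variable names here are illustrative, not from any particular framework): credentials are injected into the environment by the pipeline or a secrets manager, and the application refuses to start with a missing or hard-coded default.

```python
import os


def get_secret(name: str) -> str:
    """Fetch a credential from the environment rather than source code.

    Reading secrets from the environment keeps them out of version
    control; failing loudly avoids silently running with no credential.
    """
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"secret {name!r} is not set; refusing to continue")
    return value


# Simulate the pipeline injecting a credential (for demonstration only).
os.environ["DB_PASSWORD"] = "example-only"
print(get_secret("DB_PASSWORD"))  # → example-only
```

The same discipline extends to least privilege: the pipeline, not the developer, decides which credentials each service receives.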

It used to be the case that new software products optimized for ease of use in order to encourage adoption. Increasingly, that will no longer come at the expense of good security. Following some embarrassing data ransom incidents in the last couple of years involving open-by-default tools such as MongoDB and Elasticsearch, software now needs to ship with security features enabled before being made publicly accessible.

Finally, open source will continue to drive healthy competition. The days when companies were afraid of using open source software are long gone. Almost every recent successful online business has been built on top of freely available software.

Many of the software components that underpin good cloud engineering practice today, such as Kubernetes and Prometheus, are stewarded by the Cloud Native Computing Foundation (part of the Linux Foundation), which has a mandate to maintain standardization and platform agnosticism. The CNCF will be key to ensuring interoperability of services across different vendors’ platforms, which will hopefully drive healthy competition through 2018 and beyond.
