Open Source Versioning: The Race to Stay Up-to-Date

Darius Cooper

Open source libraries, once shunned as risky and not ready for prime time, are now used extensively across major corporations, including insurers. The reason is simple: in time- and resource-constrained companies trying to stay technologically competitive, it no longer makes sense to reinvent a wheel that has already been battle-tested. However, having made the commitment to open source code and solution sets, it's imperative to keep up to date with open source library maintenance and updates.

The Risk of Open Source

Some active open source libraries are updated multiple times a year. Getting out of date means missing out on newer features. And if the gap between upgrades spans several releases, a future upgrade may not be backward-compatible, causing functional problems at best and security exposure at worst. On occasion, an out-of-date open source library version can expose a system, and a company, to attack. In 2017, a major U.S. financial company was hit by a hack that exposed the information of millions of customers. The hackers had exploited a software vulnerability that had been public for a month, but the company had not patched its system in time. There are several other well-known examples in a similar vein.

Keeping current is not easy. Upgrading may be costly, but falling behind can make catching up costlier still. The question is: how frequently should one upgrade to newer versions?

The Effort and Cost

One of the reasons companies choose not to upgrade every time there's a new release is the cost involved. While the open source code itself is free, the effort required in time and resources can be substantial and can pull valuable resources away from higher-priority business-technology initiatives. An additional concern is that, even when an upgrade is supposed to be backward-compatible, there is no way to verify that without actually performing it. If compatibility problems surface, a larger effort ensues: upgrading, restoring or rebuilding the system. Any large system has to be tested thoroughly after an upgrade, and without the help of automated tools, that testing can be costly and time-consuming.

One approach to mitigating these costs and risks is continuous integration (CI) for open source libraries. A good suite of automated tests provides a high degree of confidence that nothing has been broken by the most recent upgrade before it is promoted to any production environment. Without automated tests that flush out potential issues from upgrading, however, the risks and costs are high enough that companies have a hard time keeping up, and managers become more motivated to delay upgrades than to perform them. That's why it's important that companies employ CI and continuous delivery (CD) processes that include robust automated testing.
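
To make that concrete, here is a minimal sketch of such a CI gate, assuming a pip-managed Python project with a pytest suite (the layout and tool choices are assumptions, not prescriptions). It installs a candidate version of a library and fails the build if the test suite no longer passes:

    import subprocess
    import sys

    def upgrade_and_test(package: str, version: str) -> bool:
        """Install a specific version of a dependency, then run the test suite.

        Returns True only if the tests pass against the upgraded library,
        so a CI job can fail fast on an incompatible release.
        """
        # Install the candidate version into the build environment.
        install = subprocess.run(
            [sys.executable, "-m", "pip", "install", f"{package}=={version}"],
            capture_output=True, text=True,
        )
        if install.returncode != 0:
            print(f"Install failed for {package}=={version}:\n{install.stderr}")
            return False

        # Run the automated test suite; a non-zero exit code means a regression.
        tests = subprocess.run([sys.executable, "-m", "pytest", "-q"])
        return tests.returncode == 0

    if __name__ == "__main__":
        # Example: gate the build on a candidate upgrade of one library.
        package, version = sys.argv[1], sys.argv[2]
        sys.exit(0 if upgrade_and_test(package, version) else 1)

A job like this can run whenever a watched library publishes a release, so an incompatibility surfaces long before anyone attempts the upgrade in earnest.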

Additionally, the use of microservices, an increasingly effective and popular way to construct systems, adds another twist to the effort and cost of keeping open source libraries up to date. Individually, each microservice might be quick to test. But if an application has been broken into 50, 100 or even more microservice components, even a small amount of manual testing per component adds up quickly. And in an environment that has deployed microservices broadly, there may be no business need to change some services for months; that, after all, is one of the benefits of using microservices.
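
The per-service verification itself can be scripted so that 50 or 100 test runs cost no more attention than one. A minimal sketch, assuming a hypothetical layout in which each microservice lives in its own subdirectory with its own pytest suite:

    import pathlib
    import subprocess
    import sys

    # Hypothetical layout: every microservice is a subdirectory of ./services
    SERVICES_ROOT = pathlib.Path("services")

    def test_all_services() -> dict[str, bool]:
        """Run each service's test suite and collect pass/fail results."""
        results = {}
        for service_dir in sorted(SERVICES_ROOT.iterdir()):
            if not service_dir.is_dir():
                continue
            completed = subprocess.run(
                [sys.executable, "-m", "pytest", "-q"],
                cwd=service_dir,
            )
            results[service_dir.name] = completed.returncode == 0
        return results

    if __name__ == "__main__":
        results = test_all_services()
        for name, passed in results.items():
            print(f"{name}: {'PASS' if passed else 'FAIL'}")
        # Fail the pipeline if any service broke after a library upgrade.
        sys.exit(0 if all(results.values()) else 1)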

The Right Approach to Open Source

There is a right way to deal with open source library updates, one that captures the benefits of new functionality and code stability in a low-risk, low-effort and low-resource manner. It rests on the following practices:
Planning:

Enterprise and application architects should be vigilant about the road maps of any large frameworks that bring open source libraries into a project or initiative as key components. Proactive identification of open source libraries that may need to be upgraded is key. Likewise, application project managers should make open source library upgrades part of the regular, ongoing project plan. For example, when microservices are involved, services should be classified proactively by the libraries they use, as a way to identify the few services that would normally be upgraded first. This has the benefit of providing a few sprints' worth of feedback before rolling out the upgrades to the remaining services.
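
One way to make that classification concrete is to index services by their declared dependencies. The sketch below assumes a hypothetical layout in which each service declares its dependencies in a requirements.txt file; it builds a library-to-services map from which a small pilot group can be chosen for each upgrade:

    import pathlib
    from collections import defaultdict

    SERVICES_ROOT = pathlib.Path("services")  # hypothetical repo layout

    def classify_by_library() -> dict[str, list[str]]:
        """Map each open source library to the services that depend on it."""
        usage = defaultdict(list)
        for req_file in SERVICES_ROOT.glob("*/requirements.txt"):
            service = req_file.parent.name
            for line in req_file.read_text().splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                # Keep only the package name; drop version pins like "==1.2.3".
                name = line.split("==")[0].split(">=")[0].strip()
                usage[name].append(service)
        return usage

    if __name__ == "__main__":
        for library, services in sorted(classify_by_library().items()):
            # The first service or two listed here can pilot the upgrade.
            print(f"{library}: used by {len(services)} service(s), e.g. {services[:2]}")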

Monitoring:

Open source version monitoring is essential. There are products that identify and report library versions with known vulnerabilities. Such monitoring can, and should, be used as part of the CI/CD pipeline and also to watch artifacts already deployed in production. Monitoring processes can also report when newer open source library versions are available, even if the incumbent library has no reported vulnerabilities.
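
As an illustration of how such monitoring can be wired into a pipeline, the following sketch queries the public OSV vulnerability database (https://api.osv.dev) for a pinned Python dependency; the package name and version shown are placeholders:

    import json
    import urllib.request

    OSV_QUERY_URL = "https://api.osv.dev/v1/query"

    def known_vulnerabilities(package: str, version: str,
                              ecosystem: str = "PyPI") -> list[str]:
        """Ask the OSV database whether a pinned library version has reported vulnerabilities."""
        payload = json.dumps({
            "version": version,
            "package": {"name": package, "ecosystem": ecosystem},
        }).encode("utf-8")
        query = urllib.request.Request(
            OSV_QUERY_URL, data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(query) as response:
            body = json.load(response)
        # OSV returns a "vulns" list only when something is known to be wrong.
        return [v["id"] for v in body.get("vulns", [])]

    if __name__ == "__main__":
        # Placeholder package and version; substitute the pins from your build.
        ids = known_vulnerabilities("requests", "2.19.0")
        print("Vulnerabilities found:" if ids else "No known vulnerabilities.", ids)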

Automated compatibility triage:

It is often too risky to set up production builds to always pull the latest version of an open source library. However, a separate automated process can identify new versions, run them through a build-and-test cycle and report back the results. If a test fails, the development team gets an early warning that its code is not compatible with the upgrade version, which helps in planning and scheduling fixes. And if a vulnerability is later discovered in the current version, this kind of proactive triage enables a quicker resolution and turnaround.
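
Here is a minimal sketch of the identification step, using PyPI's public metadata endpoint to detect when a pinned library has fallen behind (the pinned-versions dictionary is hypothetical); a follow-on job would then build and test against the newer version, as in the CI gate sketched earlier:

    import json
    import urllib.request

    def latest_version(package: str) -> str:
        """Fetch the newest published version of a package from PyPI."""
        url = f"https://pypi.org/pypi/{package}/json"
        with urllib.request.urlopen(url) as response:
            return json.load(response)["info"]["version"]

    if __name__ == "__main__":
        # Hypothetical pins; in practice these come from the project's lock file.
        pinned = {"requests": "2.19.0", "flask": "2.3.0"}
        for package, current in pinned.items():
            newest = latest_version(package)
            if newest != current:
                # Report the gap; a follow-on job builds and tests against `newest`.
                print(f"{package}: pinned {current}, latest {newest} -> schedule triage build")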

Processes:

Strong processes, preferably automated ones, are the foundation of open source versioning. For critical security upgrades, a process must be in place to triage and prioritize the identified changes. For other version updates, good processes can make upgrades easier. For example, a standard, repeatable process may fold minor version upgrades into the regular flow of sprint work, saving time while mitigating risk.
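
As a sketch of such a repeatable process, the rule set can be encoded directly: classify each available upgrade by the size of its semantic-version jump and route it accordingly. The routing policy in the comments is an assumed example, not a standard:

    def classify_upgrade(current: str, candidate: str) -> str:
        """Classify a version jump under semantic versioning (MAJOR.MINOR.PATCH)."""
        def parts(version: str) -> list[int]:
            # Pad short versions like "1.4" so indexing below is safe.
            numbers = [int(p) for p in version.split(".")[:3]]
            return numbers + [0] * (3 - len(numbers))

        cur, new = parts(current), parts(candidate)
        if new[0] != cur[0]:
            return "major"  # potentially breaking: plan as dedicated work
        if new[1] != cur[1]:
            return "minor"  # fold into the regular flow of sprint work
        return "patch"      # low risk: apply in the next routine build

    if __name__ == "__main__":
        for current, candidate in [("1.4.2", "1.4.3"), ("1.4.2", "1.5.0"), ("1.4.2", "2.0.0")]:
            print(f"{current} -> {candidate}: {classify_upgrade(current, candidate)} upgrade")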

Open source versioning can be difficult, but it doesn’t have to be. With automated testing, good monitoring tools and some proactive planning, the costs and risks of staying up to date can be kept to a minimum, allowing companies to reap the benefits of new features and functions.

Originally published in Devops.com.

Author: Darius Cooper
Darius Cooper is a software architect at X by 2. Cooper specializes in IT transformation projects for the insurance industry. With over 25 years of industry experience, his expertise includes integrations architecture, application architecture, service-oriented architecture and agile methodologies. Cooper is currently the integrations architect for a multi-year core system modernization initiative at a large insurance firm.
