Banking has changed.
In the past, financial institutions outsourced their technology. They had large consulting firms creating, managing, and maintaining their back-end systems. Although banks would have knowledge of the systems in place, they wouldn’t be running them on a day-to-day basis. That was the consultants’ responsibility.
Recent years have seen a significant shift in the financial sector. Engineering has been brought in house, and financial institutions have transformed into technology companies with large product portfolios. Instead of outsourcing everything, banks now employ thousands of engineers who are developing a wide range of applications that are morphing into full-fledged product lines.
As with many new technologies, the financial sector was slow to adopt open source software. Financial firms were concerned that they would lose their competitive edge if they developed with open source or that open source components wouldn’t meet stringent regulatory compliance requirements. Even as recently as six or seven years ago, when the Fintech Open Source Foundation (FINOS) was founded, open source wasn’t as prevalent in banking as it is today.
Despite the industry’s initial reluctance to adopt new technology, once open source software began to increase in popularity and its benefits became apparent, the industry fully embraced it. Financial institutions realized that, by having open source in their arsenal, they weren’t locked into vendors and that their engineering teams had more control over development.
Financial institutions are under ever-increasing pressure to create innovative products that speak to their customers’ needs. Using open source software allows banks to develop the easy-to-use, personalized, and secure experiences that their customers are demanding. Open source has allowed the financial industry to innovate more quickly and cost-effectively to lock in a competitive advantage.
The financial industry now understands the tremendous value of open source and has adopted it almost universally. Developing with open source components is essential to creating higher-quality software at the speed of Agile. With this widespread use of open source comes an increased focus on the unique challenges of open source security.
Financial institutions know the importance of managing the risk associated with open source software. They understand that if they ship a product, it’s their responsibility, even if the problem lies in an open source component they didn’t write. At the end of the day, customers don’t care whose fault a data breach is – whether it was in code that was written in house or from somewhere else. If something goes wrong, it’s the bank’s responsibility. At the same time, hackers don’t care if the code they’re attacking is something a financial institution has written or is borrowed from an open source library.
The bottom line is that customers give banks their trust, and banks must do everything in their power to protect it. That responsibility is why it’s important to adopt a more holistic awareness of open source and closed source security: companies need a broad view of the risk of any third-party component.
The good news is that the financial sector has a much higher rate of adopting open source vulnerability management than most industries. Financial institutions have a greater awareness of risk and are highly regulated. They understand what’s at stake.
Banks are risk averse. They understand that implementing an open source vulnerability management program is one way to reduce risk. The challenge for the financial sector is managing open source components in both modern applications and back-end legacy systems.
For most institutions in the financial sector, the legacy mainframe systems that are still in service are built on Linux. Organizations can have systems that are decades old, but if the open source code these systems are built on is more than a year old, the chance that those systems contain publicly disclosed vulnerabilities is very high.
Modern banking applications are usually scanned at every step of the software development lifecycle (SDLC), including before every build and deployment. However, scanning just during these phases isn’t enough. Open source vulnerabilities are publicly disclosed, giving malicious actors a roadmap to an organization’s weaknesses. Applications need to be continuously monitored for vulnerabilities even when deployed.
To manage open source in both legacy systems and modern applications, the financial sector needs a mind shift that decouples scanning during the build cycle from vulnerability detection.
Because of how open source vulnerabilities are disclosed, scanning only when new code is committed is a mistake. It’s simply not frequent enough. For example, an application might be secure at 7pm, but then, without doing a commit, a critical open source vulnerability is discovered and publicly disclosed at 8pm. Organizations can’t risk waiting for hours before identifying and remediating a vulnerability that could potentially expose personal information. They must address it immediately.
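The 7pm/8pm scenario can be made concrete with a minimal sketch (package names, versions, and advisory records below are hypothetical). The application’s dependency inventory never changes, yet a rescan against a grown advisory feed surfaces a new finding, which is exactly why detection must be decoupled from the commit cycle:

```python
# Minimal sketch of decoupled vulnerability detection: the application's
# dependency inventory is fixed (no new commits), but the advisory feed
# keeps growing, so periodic rescans can still surface new findings.
# All package names and advisory data here are hypothetical.

def scan(inventory, advisories):
    """Return the pinned dependencies that match a known advisory."""
    findings = []
    for name, version in inventory.items():
        for adv in advisories:
            if adv["package"] == name and version in adv["affected_versions"]:
                findings.append((name, version, adv["id"]))
    return findings

# Inventory captured at the last build -- it does not change below.
inventory = {"examplelib": "2.4.1", "otherlib": "1.0.0"}

# 7pm: the feed has no advisory matching our pinned versions.
feed_7pm = [{"id": "ADV-0001", "package": "examplelib",
             "affected_versions": ["1.0.0", "1.1.0"]}]
print(scan(inventory, feed_7pm))   # no findings

# 8pm: a new advisory is disclosed; no commit happened in between.
feed_8pm = feed_7pm + [{"id": "ADV-0002", "package": "examplelib",
                        "affected_versions": ["2.4.0", "2.4.1"]}]
print(scan(inventory, feed_8pm))   # examplelib 2.4.1 is now flagged
```

A commit-triggered scanner would never run between 7pm and 8pm; a scanner driven by the advisory feed, or simply run on a schedule, catches the new disclosure immediately.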
This decoupling is particularly important for older and legacy systems. If detection and scanning are coupled together, projects that rarely change won’t get scanned very often. Scanning and alerting at the same cadence as committing is a mistake because too many vulnerabilities fall through the cracks. Scanning must occur much more frequently to effectively manage risk.
Organizations need visibility into the open source components used throughout their projects, not just at the time of a scan but continuously, across every system in use, whether current or legacy.
The bottom line is that financial institutions can’t protect what they can’t see. They need to know what packages, dependencies, and versions are being used across the organization. Only then can they layer on protective capabilities by tracking new CVEs, updating to newer versions of open source libraries, and remediating any newly disclosed vulnerabilities.
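That package-and-version visibility can be sketched as a simple inventory aggregation (the application names and manifest contents below are made up; a real tool would walk repositories and resolve transitive dependencies rather than read inlined strings):

```python
# Hypothetical sketch: building a minimal organization-wide inventory of
# pinned open source packages from requirements.txt-style manifests.

def parse_manifest(text):
    """Parse 'name==version' lines, skipping comments and blank lines."""
    inventory = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop inline comments
        if not line:
            continue
        name, _, version = line.partition("==")
        inventory[name.strip()] = version.strip()
    return inventory

manifests = {
    "payments-app":  "requests==2.31.0\nflask==2.3.2  # web framework\n",
    "reporting-app": "requests==2.25.0\npandas==2.0.3\n",
}

# Aggregate: which versions of each package are in use, and where.
usage = {}
for app, text in manifests.items():
    for name, version in parse_manifest(text).items():
        usage.setdefault(name, {}).setdefault(version, []).append(app)

print(usage["requests"])  # two versions of the same package in use
```

Once an inventory like `usage` exists, layering on protection is a lookup problem: a newly disclosed CVE against a package version maps directly to the applications that must be rebuilt or patched.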
Due diligence isn’t just for newly written code. It is an ongoing process that also needs to be applied to existing and legacy code. Using constructs such as policies and mandated forced rebuilds of software – even if nothing has changed – are healthy activities that should be done regularly. Setting expectations with application teams to have them sweep through their entire codebase and rebuild even if there are no feature changes ensures that code is always up to date. And up-to-date code is far more likely to be secure.
Continuous scanning requires automation. This is especially true considering how many applications modern financial institutions touch. Just one application can have many open source components, and each component may have many dependencies and even more transitive dependencies. It simply is not a job that can be accomplished manually. Security champions need the right tool to get the job done accurately and quickly.
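The fan-out from direct to transitive dependencies is easy to illustrate with a graph walk (the dependency graph below is invented for the example; real package graphs are far larger):

```python
# Illustrative sketch of why manual tracking does not scale: even a small
# direct dependency list fans out into many transitive dependencies.
# The dependency graph below is made up for the example.
from collections import deque

graph = {
    "app":        ["web", "dbdriver", "logging"],
    "web":        ["http", "templating"],
    "http":       ["sslwrap", "urlparse"],
    "templating": ["escape"],
    "dbdriver":   ["pooling", "sslwrap"],
    "logging":    [],
    "sslwrap": [], "urlparse": [], "escape": [], "pooling": [],
}

def all_dependencies(root):
    """Breadth-first walk collecting every direct and transitive dependency."""
    seen, queue = set(), deque(graph[root])
    while queue:
        dep = queue.popleft()
        if dep in seen:
            continue
        seen.add(dep)
        queue.extend(graph.get(dep, []))
    return seen

deps = all_dependencies("app")
print(len(graph["app"]), "direct ->", len(deps), "total")  # 3 direct -> 9 total
```

Three declared dependencies already pull in nine packages to track; multiply that across hundreds of applications and continuous manual review becomes impossible, which is the case for automated tooling.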
Software Composition Analysis (SCA), an automated solution that manages open source vulnerabilities and license compliance issues, is the only way to ensure that code is being continuously scanned and updated in a timely manner. Without SCA, financial institutions have no visibility into what open source libraries are being used across their organizations nor do they understand what vulnerabilities those libraries might contain.
Whether building a new application from the ground up, protecting new versions of older software, or maintaining legacy systems, financial institutions must be continuously vigilant and constantly scanning. In the highly regulated world of financial services, an SCA solution is key to managing and reducing overall risk.