Our Roundtable Sessions are invite-only events, hosted by peers for peers, that bring together a select group of senior IT leaders from across industries for intimate, topic-driven dialog on current trends. We hosted this Session featuring a group of CXOs and other IT executives. The group met remotely to discuss the rise of next-gen software supply chain attacks, led by the VP of Global Infrastructure for an energy, commodities, and services company. This Session was sponsored by Sonatype.
Incorporating third-party libraries, packages, and tools into your software has become incredibly easy. Run a simple pip or npm install, and you are halfway to a working web server. However, when you download third-party packages or integrate with enterprise-grade third-party tools, you are letting their publishers run code on your system, usually without ever looking at their source code. This "blind trust" can be exploited. Bad actors are no longer waiting for public vulnerabilities to be disclosed; they are actively injecting malicious code into the open-source projects that feed the global software supply chain.
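One basic defense against this blind trust is to pin and verify cryptographic hashes of third-party artifacts instead of accepting whatever a registry serves (pip, for instance, supports this via its hash-checking mode). The sketch below shows the underlying idea in Python; `verify_artifact` and the sample payload are illustrative, not part of any real tool.

```python
import hashlib

def verify_artifact(data: bytes, expected_sha256: str) -> bool:
    """Compare the SHA-256 digest of a downloaded artifact against a pinned hash."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

# A hypothetical package payload and the digest you pinned when you vetted it.
payload = b"fake-package-contents"
pinned = hashlib.sha256(payload).hexdigest()

assert verify_artifact(payload, pinned)             # untampered artifact passes
assert not verify_artifact(payload + b"x", pinned)  # any modification fails
```

The same principle is what lockfiles with hashes (e.g. pip's `--require-hashes`, npm's `package-lock.json` integrity fields) enforce automatically at install time.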
A director of software development mentioned that their biggest concern is securing the open-source software they use within their applications. A VP of Product shared similar challenges around securing the software supply chain.
An executive argued that traditional security measures like firewalls and ACLs have become irrelevant in today's world, which mostly runs in the cloud. He sees three main security control points worth focusing on: identity, data stores, and applications, along with how they all interconnect. He added that it was virtually impossible for his team to keep track of their 15,000+ identities, 2,000 data stores, and 700 applications when they initially moved to the cloud. It was also hard to implement security guardrails on a cloud platform designed to make scaling easy. However, investing in appropriate tooling and automation eventually made things easier. For example, if someone opens an S3 bucket to the public, the security tool raises an alert and reverses the configuration change. If a significant breach is detected, the whole "critical incident process" kicks in, notifying all the relevant parties and performing the necessary recovery actions.
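The detect-and-revert guardrail described above can be sketched as pure logic: flag an S3 bucket ACL that grants access to everyone, then strip the offending grant. The ACL shape below mirrors what AWS returns from `GetBucketAcl`, but the helper names are illustrative, not the API of any real security product.

```python
# Group URI AWS uses to mean "everyone on the internet".
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def is_publicly_readable(acl: dict) -> bool:
    """Return True if any grant in the ACL exposes the bucket to all users."""
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") == ALL_USERS:
            return True
    return False

def remediate(acl: dict) -> dict:
    """Drop public grants while keeping all other grants intact."""
    safe = [g for g in acl.get("Grants", [])
            if g.get("Grantee", {}).get("URI") != ALL_USERS]
    return {**acl, "Grants": safe}

public_acl = {"Grants": [
    {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "READ"},
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner"}, "Permission": "FULL_CONTROL"},
]}

assert is_publicly_readable(public_acl)
assert not is_publicly_readable(remediate(public_acl))
```

In a real deployment this check would run on every configuration-change event, raise the alert, and push the corrected ACL back via the cloud provider's API.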
A speaker talked about how securing cloud infrastructure has become increasingly complex. Consider a simple microservice application. A developer wrote its code with all the security guidelines in mind, but is that enough? The application runs inside a container; how secure is that containerized environment? Furthermore, the container becomes part of a pod that runs inside a Kubernetes cluster, which also needs to be secure and compliant. And it's not just one service, one container, or one cluster. Tracking thousands of network entities and their interconnections while trying to secure the infrastructure is not easy.
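Each layer in that stack can be checked in code. As one illustration, a minimal audit of the container layer might take a Kubernetes pod spec (as a dict, e.g. parsed from YAML) and flag a few hardening gaps. The three rules below are examples of common checks, not a complete policy.

```python
def audit_pod(spec: dict) -> list:
    """Return a list of findings for containers missing basic hardening settings."""
    findings = []
    for container in spec.get("containers", []):
        name = container.get("name", "<unnamed>")
        sc = container.get("securityContext", {})
        if not sc.get("runAsNonRoot"):
            findings.append(f"{name}: should set runAsNonRoot: true")
        if sc.get("privileged"):
            findings.append(f"{name}: privileged containers are disallowed")
        if not sc.get("readOnlyRootFilesystem"):
            findings.append(f"{name}: root filesystem should be read-only")
    return findings

pod = {"containers": [{"name": "web", "securityContext": {"privileged": True}}]}
for finding in audit_pod(pod):
    print(finding)
```

Policy engines used in Kubernetes clusters apply the same pattern at admission time, rejecting non-compliant specs before they ever run.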
A participant shared that some of their most experienced developers were used to building applications that ran behind secure firewalls and segmented networks. These applications didn't need to be inherently secure; most of the security was implemented at the network level. However, with the shift to the cloud, gaps in built-in security, such as unencrypted sensitive data, became a major issue. A lot of code refactoring had to be done to make the applications cloud-ready. He added that they also perform extensive penetration and vulnerability testing on their applications, and members of different development teams are then involved in the remediation process to build a collective appreciation of security best practices and guidelines.
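One concrete example of the kind of refactor described above: an application that stored credentials in plaintext behind a firewall is changed to store only a salted, slow hash, so the data is safe even if the store is exposed. This is a minimal sketch using Python's standard library; the iteration count and function names are illustrative choices.

```python
import hashlib
import hmac
import os

def hash_secret(secret, salt=None):
    """Derive a salted PBKDF2 hash; store (salt, digest) instead of the plaintext."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 200_000)
    return salt, digest

def verify_secret(secret, salt, digest):
    """Re-derive with the stored salt and compare in constant time."""
    _, candidate = hash_secret(secret, salt)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_secret("s3cret")
assert verify_secret("s3cret", salt, digest)
assert not verify_secret("wrong", salt, digest)
```

For data that must be recoverable (rather than only verifiable), the analogous refactor is field-level encryption with managed keys, which cloud providers offer as a service.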
It's also vital to strike a balance between speed and security: a security tool or guardrail should not drastically affect time-to-production. An exec said it's crucial to have open collaboration between security and development teams. If a tool isn't working well for developers, they should be able to ask the security team to either optimize or replace it.
By treating open-source dependencies with the same scrutiny as in-house code, investing in automated guardrails and tooling, and fostering collaboration between security and development teams, organizations can defend against next-gen software supply chain attacks without sacrificing speed.