If you follow the software supply chain space, you’ve heard the noise. The industry often gets stuck in a format-war loop, debating schema rather than focusing on the utility of the stored data. It’s like arguing about font kerning on a nutrition label while ignoring the ingredients list.

We recently hosted Steve Springett, Chair of the CycloneDX Core Working Group, to cut through this noise. The conversation moved past the basic definition of an SBOM and into the mechanics of true software transparency.

Here are four takeaways on where the industry is heading—and why the specific format doesn’t matter.

1. Content is king

For years, the debate has centered on “which standard will win.” But this is the wrong question to ask. The goal isn’t to produce a perfectly formatted SBOM; the goal is to reduce systemic risk and increase software transparency.

As Springett noted during the session:

“The format doesn’t really matter as long as that format represents the use cases. It’s really about the content.”

When you focus on form over function, you end up generating an SBOM that satisfies a regulator while giving your security team no actionable intelligence. The shift we are witnessing is from generation to consumption.

Does your data describe the components? Does it capture the licensing? More importantly, does it support your specific use case, whether that’s procurement, vulnerability management, or forensics? If the content is empty, schema validation is irrelevant.
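To make this concrete, here is a rough sketch in Python of the kind of check a consumer actually runs: not “does this SBOM validate?” but “does each component carry the fields my use case needs?” It assumes a CycloneDX-style JSON document; the required-field list and the file path are illustrative, not part of any spec.

```python
import json

# Fields a license-compliance or vulnerability-management use case might
# require from each component. The field names follow CycloneDX's component
# schema; the policy itself is illustrative, not part of any standard.
REQUIRED_FIELDS = ("name", "version", "purl", "licenses")

def content_gaps(sbom_path: str) -> dict[str, list[str]]:
    """Map each under-described component to the fields it is missing."""
    with open(sbom_path) as f:
        sbom = json.load(f)
    gaps = {}
    for component in sbom.get("components", []):
        missing = [field for field in REQUIRED_FIELDS if not component.get(field)]
        if missing:
            gaps[component.get("name", "<unnamed>")] = missing
    return gaps

if __name__ == "__main__":
    # "bom.json" is a placeholder path for an SBOM you already generate.
    for name, missing in content_gaps("bom.json").items():
        print(f"{name}: missing {', '.join(missing)}")
```

A schema validator would pass an SBOM full of empty components without complaint; a content check like this is what tells you whether the document is actually usable.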

2. When theory and reality diverge

In physical manufacturing, there is often a gap between the engineering diagrams and the finished product. Software is no different. We have the source code (the intent) and the compiled binary (the reality).

Springett described a manufacturer that needed a way to model the dependencies of the process that created a product:

“We created a manufacturing bill of materials (MBOM) to describe how something should be built versus how it was actually built.”

This distinction is critical for integrity. A Design MBOM tells you what libraries you intended to pull in; a Build MBOM records what was actually used. In this case, comparing the two revealed which parts of the process were diverging from the ideal path. Capturing this delta allows you to verify the integrity of the pipeline itself, not just the source that entered it.
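As a sketch of what capturing that delta can look like, imagine both MBOMs as CycloneDX-style JSON and diff the package URLs they reference. The file names here are hypothetical:

```python
import json

def purls(bom_path: str) -> set[str]:
    """Collect package URLs from a CycloneDX-style BOM's components."""
    with open(bom_path) as f:
        bom = json.load(f)
    return {c["purl"] for c in bom.get("components", []) if "purl" in c}

# Hypothetical file names; the point is the delta itself.
design = purls("design-mbom.json")  # what the process was supposed to use
build = purls("build-mbom.json")    # what the pipeline actually used

print("Used but never designed for:", sorted(build - design))
print("Designed for but never used:", sorted(design - build))
```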

3. Solving the compliance cascade

Security teams are drowning in standards. From SSDF to FedRAMP to the EU CRA, the overlap in requirements is massive, yet evidence collection remains manual and disjointed. It is the classic “many-to-many” problem.

Machine-readable attestations are the mechanism that collapses this many-to-many mapping. As Springett put it:

“A single attestation can attest to multiple standards simultaneously. This saves a lot of hours!”

Instead of manually filling out a spreadsheet for every new regulation, you map a single piece of evidence—like a secure build log—to multiple requirements. If you prove you use MFA for code changes, that single data point satisfies requirements in FedRAMP, PCI DSS 4.0, and SSDF simultaneously.

This shifts compliance from a manual, document-based operation to an automated process. You attest once, and the policy engine applies it everywhere.
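A minimal sketch of that many-to-many expansion, with placeholder evidence keys and control IDs rather than official identifiers:

```python
# A single piece of evidence expands to every control it satisfies.
# The evidence key and control IDs are illustrative placeholders,
# not official identifiers from the standards themselves.
EVIDENCE_MAP: dict[str, list[tuple[str, str]]] = {
    "mfa-enforced-on-code-changes": [
        ("SSDF", "PO.3.2"),
        ("FedRAMP", "IA-2(1)"),
        ("PCI DSS 4.0", "8.4.2"),
    ],
}

def satisfied_controls(evidence_ids: list[str]) -> list[tuple[str, str]]:
    """Expand each collected attestation into the controls it covers."""
    return [ctrl for ev in evidence_ids for ctrl in EVIDENCE_MAP.get(ev, [])]

# Attest once...
print(satisfied_controls(["mfa-enforced-on-code-changes"]))
# ...and the mapping applies it to three frameworks at once.
```

In a real policy engine the mapping would live in data rather than code, but the shape of the problem is the same: one attestation in, many satisfied requirements out.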

4. Blueprints and behavioral analysis

Reproducible builds are a strong defense, but they aren’t a silver bullet. A compromised build system will faithfully reproduce malware that was pulled in through a transitive dependency. To catch this, you need to understand the intended behavior of the system, not just its static composition.

This is where the concept of “blueprints” comes into play.

“Blueprints are the high-level architecture AND what the application does. This is critically important because reproducible builds are fine, but can also be compromised.”

A blueprint describes the expected architecture: the data flows, the external connections, and the boundary crossings. If your SBOM says “Calculator App” but the application opens a socket to an unknown IP at runtime, a static scan won’t catch it.

By comparing the architectural blueprint against the runtime reality, you can detect anomalies that standard composition analysis misses. It moves the defense line from “what is in this?” to “what is this doing?”
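In practice, that comparison can start as something as simple as diffing the declared boundary crossings against what the process actually opens. A toy sketch, with made-up endpoints on both sides:

```python
# A toy comparison of declared architecture vs. observed behavior.
# Endpoints on both sides are made-up examples.
BLUEPRINT_ALLOWED = {
    ("api.payments.example.com", 443),  # documented boundary crossing
    ("telemetry.example.com", 443),
}

observed_connections = {
    ("api.payments.example.com", 443),
    ("203.0.113.42", 8443),  # nowhere in the blueprint
}

for host, port in sorted(observed_connections - BLUEPRINT_ALLOWED):
    print(f"ALERT: unexpected outbound connection to {host}:{port}")
```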

The Path Forward

We’ve moved past the era of format wars. The takeaways are clear: prioritize content over schema, capture the “as-built” reality, automate your compliance evidence, and start validating system behavior, not just static ingredients.

But this is just the baseline. In the full hour, Steve Springett dives much deeper into the mechanics of transparency. He discusses how to handle AI model cards to track training data and bias, how to manage information overload so you don’t drown in “red lights,” and what’s coming next in CycloneDX 1.7 regarding threat modeling and patent tracking.

To get the complete picture—and to see how these pieces fit into a “system of systems” approach—watch the full webinar. It’s the fastest way to move your strategy from passive documentation to active verification.


Learn where SBOMs, and CycloneDX specifically, are heading. Spoiler alert: compliance, attestations, and software transparency are all on deck.