This is the second post in a long-running series explaining the details of the NIST Secure Software Development Framework (SSDF), also known as NIST SP 800-218. You can find more details about the SSDF on the NIST website.
Today we’re going to cover control PS.3.2, which is defined as:
PS.3.2: Collect, safeguard, maintain, and share provenance data for all components of each software release (e.g., in a software bill of materials [SBOM]).
This one sounds really simple: we just need an SBOM, right? But nothing is ever that easy, especially in the world of cybersecurity compliance.
Let’s break this down into multiple parts. Nearly every word in this framework is important for a different reason. The short explanation is we need data that describes our software release. Then we need to safely store that data. It sounds simple, but like many things in our modern world of technology, the devil is in the details.
Start with the SBOM
Let’s start with an SBOM. Yes, you need an SBOM. That’s the provenance data. There are many ways to store release data, but the current expectation across the industry is that SBOMs will be the primary document. The intent is that we are able to both receive and hand out SBOMs. For the rest of this post we will focus on how to meet this control using an SBOM and SBOM management.
It doesn’t matter how fast or slow the release process is, every time you ship or deploy software, you need an SBOM. For most of us the days of putting out a release every few years are long gone; almost everyone is releasing software at a breakneck pace. Humans cannot be a part of this process, because humans are slow and make mistakes. To solve the challenge of generating all of these SBOMs, we need, well, automation. SBOMs should be generated automatically during stages of the development process. There are many different ways to accomplish this; here at Anchore we’re pretty partial to the Syft SBOM generator. We will be using Syft in our examples, but there are many ways to create this data.
Breaking it Down
Creating an SBOM is the easiest step of meeting this control. Say we have a container we need an SBOM for; let’s use the Grype container image as our example. Generating the SBOM can be as easy as running
syft -o spdx-json anchore/grype:latest
and we have an SBOM of the Grype container image in the SPDX format. In this example we generated an SBOM from a container in the Docker registry, but there’s no reason to wait for a container to be pushed to a registry before generating an SBOM. You can add Syft into the build process. For example, there is a Syft GitHub action that does this step automatically on every build. There are even ways to include the SBOM in the registry metadata now.
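As a rough sketch of what that build integration might look like, the step below builds an image and immediately generates an SBOM for it. The image name, tag variable, and output file name are placeholders, and the sketch assumes Syft is installed on the build machine; adapt it to whatever your pipeline actually produces.

# Hypothetical build step: build the image, then generate its SBOM right away.
IMAGE="registry.example.com/myapp:${GIT_SHA:-latest}"
docker build -t "$IMAGE" .

# Write an SPDX JSON SBOM for the freshly built image to a file.
syft "$IMAGE" -o spdx-json=myapp.spdx.json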
Once we have our SBOMs generated (keep in mind the ‘s’ is important), you are going to have a lot of SBOMs. Some applications will have one; some will have multiple. For example, if you ship three container images for an application, you will end up with at least three SBOMs. This is why the word “collect” exists in the control. Collecting all the SBOMs for a release is important, and collecting really just means making sure you can find the SBOMs that were automatically generated. In our case, we would collect and store the SBOMs in Anchore Enterprise. It’s a tool that does a great job of keeping track of a lot of SBOMs. More details can be found on the Anchore Enterprise website.
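To make “collect” a little more concrete, here is a hedged sketch of gathering the SBOMs for a three-image release into one directory. The image names, version, and directory layout are invented for illustration; a dedicated SBOM management tool handles this bookkeeping for you at scale.

# Hypothetical release made up of three container images.
RELEASE="v1.2.3"
mkdir -p "sboms/$RELEASE"

for image in example/api example/worker example/frontend; do
  name=$(basename "$image")
  # One SBOM per image, all collected under a single release directory.
  syft "$image:$RELEASE" -o spdx-json="sboms/$RELEASE/$name.spdx.json"
done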
Protect the Data Integrity
After the SBOMs are collected, we have to safeguard their contents. The word safeguard isn’t very clear on its own, but one of the examples in the control states “Example 3: Protect the integrity of provenance data, and provide a way for recipients to verify provenance data integrity.” That seems pretty straightforward, yet it would be dishonest to claim “just sign the SBOM and you’re done,” because digital signatures are still hard.
It’s probably best to use whatever mechanisms you already use to safeguard your application artifacts to also safeguard the SBOM. This could be digital signatures. It could be read-only bucket storage served over HTTPS. It could be checksum data made available out of band. Maybe it’s just a system that provides audit logs of when data changes. There’s no single way to do this, and unfortunately there’s no one-size-fits-all advice for this step. Be wary of anyone claiming this is a solved problem today. The smart folks working on Syft have some ideas on how to deal with this.
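To show what a couple of those options can look like in practice, here is a sketch using plain checksums and Sigstore’s cosign. The file and key names are hypothetical, and neither approach is being presented as the “right” answer.

# Option 1: publish a checksum out of band so recipients can verify the SBOM.
sha256sum myapp.spdx.json > myapp.spdx.json.sha256
sha256sum -c myapp.spdx.json.sha256

# Option 2: sign the SBOM with a key you control using cosign.
cosign sign-blob --key cosign.key myapp.spdx.json --output-signature myapp.spdx.json.sig
# Recipients verify the signature against your published public key.
cosign verify-blob --key cosign.pub --signature myapp.spdx.json.sig myapp.spdx.json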
We are also expected to maintain the SBOMs we are now collecting and safeguarding. This one seems easy, as in theory an SBOM is a static document, but I think one could interpret it in several ways. NIST has a glossary; it doesn’t define “maintain,” but it does define maintenance as “Any act that either prevents the failure or malfunction of equipment or restores its operating capability.” It’s safe to say the intent is to make sure the SBOMs are available now and in perpetuity. In a fast-moving industry it’s easy to forget that two or more years from now the data in an SBOM could be needed by customers, auditors, or even forensic investigators. On the other side of that coin, it’s just as possible that in a few years what passes as an SBOM today won’t be considered an SBOM anymore. Maintaining SBOMs should not be dismissed as unimportant or simple. You should find an SBOM management system that can store and convert SBOM formats as a way to future-proof the documents.
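As one example of that kind of future-proofing, Syft can convert an SBOM it already generated from one format to another, so a document stored today as SPDX could later be handed out as CycloneDX. The file names below are placeholders.

# Convert a stored SPDX SBOM into CycloneDX JSON without re-scanning the original artifact.
syft convert myapp.spdx.json -o cyclonedx-json=myapp.cdx.json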
There are new products coming to market that can help with this maintain stage. They are being touted as SBOM management platforms. Anchore Enterprise is a product that does this, and there are also open source alternatives such as Dependency-Track. There will no doubt be even more of these tools in the future as SBOM use increases and the market matures.
Lastly, and possibly most importantly, we have to share the SBOMs.
One aspect of SBOMs that keeps coming up is the idea that every SBOM needs to be available to the public. This is specifically covered by CISA in their SBOM FAQ. It comes up on a pretty regular basis and is a point of confusion. You get to decide who can access an SBOM. You can distribute an SBOM only to your customers, you can distribute it to the public, or you can keep it internal. Today there isn’t a well-defined way to distribute SBOM data, and many ecosystems have their own ways of including it. For example, in the world of containers, registries are putting SBOMs in image metadata. Even GoReleaser lets you create SBOMs. Depending on how your product or service is accessed, there may not be a simple answer to this question.
One solution could be having customers email support to ask for a specific SBOM. Maybe you make the SBOM available in the same place customers download your application or log in to your service. You can even just package the SBOM up into the application, like a file in a zip archive. Once again, the guidance does not specifically tell us how to accomplish this.
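As a simple illustration of the “file in a zip archive” idea, you could bundle the SBOM and its checksum next to the application binary inside the release artifact. The file names here are made up; the point is that whoever downloads the release gets the provenance data along with it.

# Hypothetical release archive that ships the SBOM alongside the application.
zip myapp-v1.2.3.zip myapp myapp.spdx.json myapp.spdx.json.sha256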
Pro Tip: Make sure you include instructions for anyone downloading the SBOM on how to verify the integrity of your application and your SBOM. PS.3.1 talks about how to secure the integrity of your application, and we’ll cover that in a future blog post.
Final Thoughts
This is one control out of 42. It’s important to remember this is a journey, not a one-and-done sort of event. We have many more blog posts to share on this topic, and a lot of SBOMs to generate. Like any epic journey, there’s not one right way to get to the destination.
Everyone has to figure out how they want to meet each NIST SSDF control, ideally in a way that is valuable to the organization as well as its customers. Processes that create unnecessary burden will always end up being worked around, while processes integrated into existing workflows are far less cumbersome. Let’s aim high and produce verifiable components that not only meet NIST compliance but also ease the process for downstream consumers.
To sum it all up, you need to create SBOMs for every release, safeguard them the same way you safeguard your application, store them in a future-proof manner, and be able to share them. There’s no one way to do any of this. If you have any questions, subscribe to our newsletter for monthly updates on software supply chain security insights and trends.
Josh Bressers
Josh Bressers is vice president of security at Anchore where he guides security feature development for the company’s commercial and open source solutions. He serves on the Open Source Security Foundation technical advisory council and is a co-founder of the Global Security Database project, which is a Cloud Security Alliance working group that is defining the future of security vulnerability identifiers.