One of the most popular features of the Anchore Cloud service is the ability to dive deep into any container image and inspect its contents to see what files, packages and software libraries make it up. Before I import any public image into my development environment, I check the list of security vulnerabilities in the image (if any), the policy status (does it fail basic compliance checks?), and then I dig into the contents tab to see what operating system packages and libraries are in the image. I am still surprised at just how large many images are.

This content view lets you dig into every artifact in the image: which operating system packages and Node.js NPM modules are present, along with details such as their licenses and versions, and how they were pulled in - for example, multiple copies of the same module brought in as dependencies of other modules.
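If you run the open source Anchore Engine (covered later in this post), a similar content breakdown is available from the command line through the anchore-cli client. A minimal sketch, assuming the image has already been analyzed by your Engine (the image name here is just an example):

```
# List the operating system packages in an analyzed image
anchore-cli image content docker.io/library/node:latest os

# List the NPM modules, including versions and licenses
anchore-cli image content docker.io/library/node:latest npm

# List every file in the image
anchore-cli image content docker.io/library/node:latest files
```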

While this level of inspection is useful before you pull in a new public Docker image, it is even more valuable when applied to your own internal images.

When most people talk about container security and compliance, the focus is on security vulnerabilities: "Do I have any critical or high vulnerabilities in my image?" As we have covered previously, CVEs are just the tip of the iceberg, and organizations should be looking at policies that cover licensing, secrets, configuration and more. Many organizations we talk to see the value in policy-based compliance and plan to implement container scanning as part of their CI/CD workflows, but are not yet ready to make the investment required to add checkpoints and gates within their build or deployment infrastructure.
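When a team does reach that point, the gate itself can be lightweight: a CI step that submits the freshly built image and fails the job if the image does not pass the active policy bundle. A minimal sketch using the anchore-cli client against a running Engine (the image name and registry below are placeholders):

```
# Submit the freshly built image for analysis
anchore-cli image add registry.example.com/myapp:1.2.3

# Block until analysis completes
anchore-cli image wait registry.example.com/myapp:1.2.3

# Evaluate against the active policy bundle; a failing evaluation returns a
# non-zero exit code, which fails the CI job and gates the deployment
anchore-cli evaluate check registry.example.com/myapp:1.2.3
```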

When the Equifax news broke about their massive breach caused by an unpatched Apache Struts vulnerability, I think every CIO in every organization was on the phone with their operations team and developers asking whether they were running a vulnerable version of Apache Struts. While it's simple to find out what version of a library you are running on your servers today, do you know what was running on your production cluster last week, last month, or last year?

Even if you do not have the time or resources to invest in securing your CI/CD pipeline with policies, reports and compliance checks today, it takes less than 10 minutes to download Anchore's open source Engine, point it at your container registry and start it scanning. The Anchore Engine will discover new tags and images pushed to your repositories, download and analyze them, and maintain a history of tags and images over time. When you are ready to put policies in place, report on vulnerabilities, or gate deployments based on compliance checks, you will already have data at hand to help you track trends, compare images and run reports on changes over time. We find many organizations using this data just to produce detailed build summaries or changelogs.
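A rough sketch of what that initial setup looks like with the anchore-cli client, once the Engine itself is running (it is typically brought up with the docker-compose file the project provides); the registry hostname, repository and credentials below are placeholders:

```
# Tell the Engine how to authenticate to your registry
anchore-cli registry add registry.example.com readonly-user 's3cr3t'

# Subscribe to a repository so new tags are discovered and analyzed automatically
anchore-cli repo add registry.example.com/myapp

# Later: review what has been analyzed and check a specific image's vulnerabilities
anchore-cli image list
anchore-cli image vuln registry.example.com/myapp:1.2.3 all
```

From there the Engine keeps watching the repository, so the history of tags and images builds up on its own.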

Get started today, for free, either with Anchore's cloud service or by downloading and running the open source Anchore Engine on-premises.