Our communities take security seriously and have been instrumental in creating the tools and standards that every organization needs to comply with the recent US Executive Order.
Overview
The US White House recently released its Executive Order (EO) on Improving the Nation’s Cybersecurity (along with a press call) to counter “persistent and increasingly sophisticated malicious cyber campaigns that threaten the public sector, the private sector, and ultimately the American people’s security and privacy.”
In this post, we’ll show what the Linux Foundation’s communities have already built that support this EO and note some other ways to assist in the future. But first, let’s put things in context.
The Linux Foundation’s Open Source Security Initiatives In Context
We deeply care about security, including supply chain (SC) security. The Linux Foundation is home to some of the most important and widely-used OSS, including the Linux kernel and Kubernetes. The LF’s previous Core Infrastructure Initiative (CII) and its current Open Source Security Foundation (OpenSSF) have been working to secure OSS, both in general and in widely-used components. The OpenSSF, in particular, is a broad industry coalition “collaborating to secure the open source ecosystem.”
The Software Package Data Exchange (SPDX) project has been working for the last ten years to enable software transparency and the exchange of software bill of materials (SBOM) data necessary for security analysis. SPDX is in the final stages of review to be an ISO standard, is supported by global companies with massive supply chains, and has a large open and closed source tooling support ecosystem. SPDX already meets the requirements of the executive order for SBOMs.
Finally, several LF foundations have focused on the security of various verticals. For example, LF Public Health and LF Energy have worked on security in their respective sectors. The cloud computing industry, collaborating within the CNCF, has also produced a guide for supporting software supply chain best practices for cloud systems and applications.
Given that context, let’s look at some of the EO statements (in the order they are written) and how our communities have invested years in open collaboration to address these challenges.
Best Practices
EO sections 4(b) and 4(c) say that
The “Secretary of Commerce [acting through NIST] shall solicit input from the Federal Government, private sector, academia, and other appropriate actors to identify existing or develop new standards, tools, and best practices for complying with the standards, procedures, or criteria [including] criteria that can be used to evaluate software security, include criteria to evaluate the security practices of the developers and suppliers themselves, and identify innovative tools or methods to demonstrate conformance with secure practices [and guidelines] for enhancing software supply chain security.” Later in EO 4(e)(ix) it discusses “attesting to conformity with secure software development practices.”
The OpenSSF’s CII Best Practices badge project specifically identifies best practices for OSS, focusing on security and including criteria to evaluate the security practices of developers and suppliers (it has over 3,800 participating projects). The LF is also working on SLSA (currently in development) as potential additional guidance focused specifically on supply chain issues.
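The badge project publishes its data in machine-readable form as well. As a rough illustration, a script could fetch the public project list and inspect what is available; the exact endpoint and response shape below are assumptions based on the BadgeApp's public JSON interface, so check the project documentation before relying on them.

```python
# Illustrative sketch only: the endpoint and response shape are assumptions
# based on the CII Best Practices BadgeApp's public JSON interface.
import json
import urllib.request

URL = "https://bestpractices.coreinfrastructure.org/projects.json"  # assumed endpoint

with urllib.request.urlopen(URL, timeout=30) as response:
    projects = json.load(response)  # expected to be a list of project records

print(f"Fetched {len(projects)} project records")
if projects:
    # Print the available fields rather than assuming specific field names.
    print("Fields per record:", sorted(projects[0].keys()))
```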
Best practices are only useful if developers understand them, yet most software developers have never received education or training in developing secure software. The LF has developed and released its Secure Software Development Fundamentals set of courses available on edX to anyone at no cost. The OpenSSF Best Practices Working Group (WG) actively works to identify and promulgate best practices. We also provide a number of specific standards, tools, and best practices, as discussed below.
Encryption and Data Confidentiality
The EO 3(d) requires agencies to adopt “encryption for data at rest and in transit.” Encryption in transit is implemented on the web using the TLS (“https://”) protocol, and Let’s Encrypt is the world’s largest certificate authority for TLS certificates.
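For a concrete sense of what “encryption in transit” means in practice, here is a minimal sketch, using only the Python standard library, that opens a TLS connection and prints the negotiated protocol version and the certificate issuer; the hostname is just a placeholder.

```python
import socket
import ssl

hostname = "www.linuxfoundation.org"  # placeholder: any TLS-enabled host works

# create_default_context() enables certificate-chain and hostname verification.
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())   # e.g., "TLSv1.3"
        cert = tls_sock.getpeercert()
        issuer = dict(pair[0] for pair in cert["issuer"])    # flatten the issuer RDNs
        print("Certificate issued by:", issuer.get("organizationName"))
```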
In addition, the LF Confidential Computing Consortium is dedicated to defining and accelerating the adoption of confidential computing. Confidential computing protects data in use (not just at rest and in transit) by performing computation in a hardware-based Trusted Execution Environment. These secure and isolated environments prevent unauthorized access or modification of applications and data while in use.
Supply Chain Integrity
The EO 4(e)(iii) states a requirement for
“employing automated tools, or comparable processes, to maintain trusted source code supply chains, thereby ensuring the integrity of the code.”
The LF has many projects that support SC integrity, in particular:
in-toto is a framework specifically designed to secure the integrity of software supply chains.
The Update Framework (TUF) helps developers maintain the security of software update systems, and is used in production by various tech companies and open source organizations.
Uptane is a variant of TUF; it’s an open and secure software update system design which protects software delivered over-the-air to the computerized units of automobiles.
sigstore is a project to provide a public good / non-profit service to improve the open source software supply chain by easing the adoption of cryptographic software signing (of artifacts such as release files and container images) backed by transparency log technologies (which provide a tamper-resistant public log).
We are also funding focused work on tools that ease signing and verifying origins; e.g., we’re working to extend git to enable pluggable support for signatures, and the patatt tool provides an easy way to add end-to-end cryptographic attestation to patches sent via email.
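These projects each define their own formats and trust policies, but the underlying step is plain cryptographic signing and verification of artifacts. The following is a generic sketch of that idea, not sigstore's, in-toto's, or TUF's actual workflow, using the widely available `cryptography` Python package.

```python
# Generic illustration of signing and verifying a release artifact.
# This is NOT the actual sigstore/in-toto/TUF workflow, just the core idea.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

artifact = b"contents of a release tarball or container image manifest"

# Producer side: sign the artifact with a private key.
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(artifact)

# Consumer side: verify the artifact with the corresponding public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, artifact)
    print("Signature OK: artifact is unmodified and from the key holder")
except InvalidSignature:
    print("Signature check FAILED: do not trust this artifact")
```

Systems such as sigstore layer key management, identity, and a tamper-resistant transparency log on top of this basic signing step so that consumers do not have to manage raw keys themselves.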
OpenChain (ISO 5230) is the International Standard for open source license compliance. Applying OpenChain requires identifying the OSS components in use. While OpenChain itself focuses on licenses, that identification is easily reused to analyze other aspects of those components (for example, to look for known vulnerabilities).
Software Bill of Materials (SBOMs) support supply chain integrity; our SBOM work is so extensive that we’ll discuss that separately.
Software Bill of Materials (SBOMs)
Many cyber risks come from using components with known vulnerabilities. Known vulnerabilities are especially concerning in key infrastructure industries, such as the national fuel pipelines, telecommunications networks, utilities, and energy grids. The exploitation of those vulnerabilities could lead to interruption of supply lines and service, and in some cases, loss of life due to a cyberattack.
One-time reviews don’t help since these vulnerabilities are typically found after the component has been developed and incorporated. Instead, what is needed is visibility into the components of the software environments that run these key infrastructure systems, similar to how food ingredients are made visible.
A Software Bill of Materials (SBOM) is a nested inventory or a list of ingredients that make up the software components used in creating a device or system. This is especially critical as it relates to a national digital infrastructure used within government agencies and in key industries that present national security risks if penetrated. Use of SBOMs would improve understanding of the operational and cyber risks of those software components from their originating supply chain.
The EO has extensive text about requiring a software bill of materials (SBOM) and tasks that depend on SBOMs:
EO 4(e) requires providing a purchaser an SBOM “for each product directly or by publishing it on a public website” and “ensuring and attesting… the integrity and provenance of open source software used within any portion of a product.” It also requires tasks that typically require SBOMs, e.g., “employing automated tools, or comparable processes, that check for known and potential vulnerabilities and remediate them, which shall operate regularly….” and “maintaining accurate and up-to-date data, provenance (i.e., origin) of software code or components, and controls on internal and third-party software components, tools, and services present in software development processes, and performing audits and enforcement of these controls on a recurring basis.” EO 4(f) requires publishing “minimum elements for an SBOM,” and EO 10(j) formally defines an SBOM as a “formal record containing the details and supply chain relationships of various components used in building software… The SBOM enumerates [assembled] components in a product… analogous to a list of ingredients on food packaging.”
The LF has been developing and refining SPDX for over ten years; SPDX is used worldwide and is in the process of being approved as ISO/IEC Draft International Standard (DIS) 5962. SPDX is a file format that identifies the software components within a larger piece of computer software and metadata such as the licenses of those components. SPDX 2.2 already supports the current guidance from the National Telecommunications and Information Administration (NTIA) for minimum SBOM elements. Some ecosystems have ecosystem-specific conventions for SBOM information, but SPDX can provide information across arbitrary ecosystems.
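To make this concrete, here is a minimal, hand-written sketch of an SPDX 2.2 tag-value document describing a single package; the package name, version, supplier, and namespace are invented purely for illustration.

```
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-app-sbom
DocumentNamespace: https://example.com/spdxdocs/example-app-1.0.0
Creator: Tool: example-sbom-generator
Created: 2021-05-20T00:00:00Z

PackageName: example-app
SPDXID: SPDXRef-Package-example-app
PackageVersion: 1.0.0
PackageSupplier: Organization: Example Corp
PackageDownloadLocation: https://example.com/example-app-1.0.0.tar.gz
FilesAnalyzed: false
PackageLicenseConcluded: Apache-2.0
PackageLicenseDeclared: Apache-2.0
PackageCopyrightText: Copyright 2021 Example Corp

Relationship: SPDXRef-DOCUMENT DESCRIBES SPDXRef-Package-example-app
```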
SPDX is real and in use today, with increased adoption expected in the future. For example:
An NTIA “plugfest” demonstrated ten different producers generating SPDX.
SPDX supports acquiring data from different sources (e.g., source code analysis, executables from producers, and analysis from third parties).
A corpus of some LF projects with SPDX source SBOMs is available.
Various LF projects are working to generate binary SBOMs as part of their builds, including yocto and Zephyr.
To assist with further SPDX adoption, the LF is paying to write SPDX plugins for major package managers.
Vulnerability Disclosure
No matter what, some vulnerabilities will be found later and need to be fixed. EO 4(e)(viii) requires “participating in a vulnerability disclosure program that includes a reporting and disclosure process.” That way, vulnerabilities that are found can be reported to the organizations that can fix them.
The CII Best Practices badge passing criteria requires that OSS projects specifically identify how to report vulnerabilities to them. More broadly, the OpenSSF Vulnerability Disclosures Working Group is working to help “mature and advocate well-managed vulnerability reporting and communication” for OSS. Most widely-used Linux distributions have a robust security response team, but the Alpine Linux distribution (widely used in container-based systems) did not. The Linux Foundation and Google funded various improvements to Alpine Linux, including a security response team.
We hope that the US will update its Vulnerabilities Equities Process (VEP) to work more cooperatively with commercial organizations, including OSS projects, to share more vulnerability information. Every vulnerability that the US fails to disclose is a vulnerability that can be found and exploited by attackers. We would welcome such discussions.
Critical Software
It’s especially important to focus on critical software — but what is critical software? EO 4(g) requires the executive branch to define “critical software,” and 4(h) requires the executive branch to “identify and make available to agencies a list of categories of software and software products… meeting the definition of critical software.”
The Linux Foundation and the Laboratory for Innovation Science at Harvard (LISH) developed the report ‘Vulnerabilities in the Core,’ a Preliminary Report and Census II of Open Source Software, which analyzed the use of OSS to help identify critical software. The LF and LISH are in the process of updating that report. The CII identified many important projects and assisted them, including OpenSSL (after Heartbleed), OpenSSH, GnuPG, Frama-C, and the OWASP Zed Attack Proxy (ZAP). The OpenSSF Securing Critical Projects Working Group has been working to better identify critical OSS projects and to focus resources on critical OSS projects that need help. There is already a first-cut list of such projects, along with efforts to fund such aid.
Internet of Things (IoT)
Unfortunately, internet-of-things (IoT) devices often have notoriously bad security. It’s often been said that “the S in IoT stands for security.”
EO 4(s) initiates a pilot program to “educate the public on the security capabilities of Internet-of-Things (IoT) devices and software development practices [based on existing consumer product labeling programs], and shall consider ways to incentivize manufacturers and developers to participate in these programs.” EO 4(t) states that such “IoT cybersecurity criteria” shall “reflect increasingly comprehensive levels of testing and assessment.”
The Linux Foundation develops and is home to many of the key components of IoT systems. These include:
The Linux kernel, used by many IoT devices.
The yocto project, which creates custom Linux-based systems for IoT and embedded systems; yocto supports fully reproducible builds (a digest-comparison sketch follows this list).
EdgeX Foundry, a flexible OSS framework that facilitates interoperability between devices and applications at the IoT edge; it has been downloaded millions of times.
The Zephyr project, which provides a real-time operating system (RTOS) used by many resource-constrained IoT devices and can generate SBOMs automatically during its build; Zephyr is one of the few open source projects that is a CVE Numbering Authority.
The seL4 microkernel, the most assured operating system kernel in the world, notable for its comprehensive formal verification.
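Reproducible builds make it possible for anyone to check that a shipped binary really corresponds to the source it claims to come from: rebuild it independently and compare digests. Here is a minimal sketch of that comparison step; the file names are placeholders, and real verification (e.g., in yocto) involves far more build-environment pinning than this shows.

```python
# Verify a reproducible build: an independently rebuilt image should be
# bit-for-bit identical to the vendor-shipped image.
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder file names for illustration.
shipped = sha256_of("firmware-vendor.img")
rebuilt = sha256_of("firmware-rebuilt.img")

if shipped == rebuilt:
    print("Match: the shipped image reproduces from source")
else:
    print(f"Mismatch!\n shipped: {shipped}\n rebuilt: {rebuilt}")
```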
Security Labeling
EO 4(u) focuses on identifying:
“secure software development practices or criteria for a consumer software labeling program [that reflects] a baseline level of secure practices, and if practicable, shall reflect increasingly comprehensive levels of testing and assessment that a product may have undergone [and] identify, modify, or develop a recommended label or, if practicable, a tiered software security rating system.”
The OpenSSF’s CII Best Practices badge project (noted earlier) specifically identifies best practices for OSS development, and is already tiered (passing, silver, and gold). Over 3,800 projects currently participate.
There are also a number of projects that relate to measuring security and/or broader quality:
Community Health Analytics Open Source Software (CHAOSS) focuses on creating analytics and metrics to help define community health and identify risk.
The OpenSSF Security Metrics Project, currently in development, was created to collect, aggregate, analyze, and communicate relevant security data about open source projects.
The OpenSSF Security Reviews initiative provides a collection of security reviews of open source software.
The OpenSSF Security Scorecards provide a set of automated pass/fail checks for a quick review of arbitrary OSS (a simplified sketch of such a check follows this list).
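To give a flavor of what an automated pass/fail check looks like, here is a simplified sketch in the spirit of such checks; it is not Scorecards' actual implementation, just an example check that reports whether a local checkout of a project publishes a recognizable security policy file.

```python
# Simplified illustration of an automated pass/fail check, in the spirit of
# OpenSSF Security Scorecards (this is NOT its actual implementation).
from pathlib import Path

def has_security_policy(repo_path: str) -> bool:
    """Pass if the repository publishes a recognizable security policy file."""
    candidates = ("SECURITY.md", "SECURITY.rst",
                  "docs/SECURITY.md", ".github/SECURITY.md")
    repo = Path(repo_path)
    return any((repo / name).is_file() for name in candidates)

if __name__ == "__main__":
    import sys
    repo = sys.argv[1] if len(sys.argv) > 1 else "."
    result = "PASS" if has_security_policy(repo) else "FAIL"
    print(f"security-policy check: {result}")
```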
Conclusion
The Linux Foundation (LF) has long been working to help improve the security of open source software (OSS), which powers systems worldwide. We couldn’t do this without the many contributions of time, money, and other resources from numerous companies and individuals; we gratefully thank them all. We are always delighted to work with anyone to improve the development and deployment of open source software, which is important to us all.
David A. Wheeler, Director of Open Source Supply Chain Security at the Linux Foundation