Modern supply chains leave greater potential for vulnerabilities, and supply chain security should be a high priority for organizations. Vulnerabilities can be catastrophic, leading to unnecessary costs, inefficient delivery schedules, and a loss of intellectual property.

In addition, over the last few years, supply chains have increasingly been exposed as a major weak point in organizational security. While security may be top of mind within company walls, you are only as strong as your most vulnerable supplier.

We are excited to bring the community a new event where attendees can learn directly from experts who have spent nearly a decade working on how to solve these vulnerabilities, and find out how best to protect their supply chains and mitigate potential disasters.

It is aimed at anyone involved in ensuring their company’s supply chain is secure, including security professionals, executive leadership, and technology leaders.

The event is free to attend and will take place virtually on August 18. It comprises nine sessions covering all aspects of protecting the supply chain, including talks on:

  • Generating SBOMs for IoT at Build Time
  • Securing GCC & GLIBC
  • Building, Signing, Distributing SPDX SBOMs as Artifact Reference Type
  • Software Supply Chain Integrity with Sigstore

View all sessions and speakers, and register to attend, here.

One of the greatest strengths of open source development is how it enables collaboration across the entire world. However, because open source development is a global activity, it necessarily involves making software available across national boundaries. Some countries’ export control regulations, such as those of the United States, may require taking additional steps to ensure that an open source project is satisfying obligations under local laws.

In July 2020, The Linux Foundation published a whitepaper on how to address these issues in detail, which can be downloaded here. The primary update in the 2021 revision reflects a change in the US Export Administration Regulations (EAR):

  • Previously, in order for publicly available encryption software under ECCN 5D002 not to be subject to the EAR, email notifications were required regardless of whether or not the cryptography it implemented was standardized.
  • Following the change, email notifications are only required for software that implements “non-standard cryptography”.

Please see the updated paper and the EAR for more specific details about this change.

The National Telecommunications and Information Administration (NTIA) recently asked for wide-ranging feedback to define a minimum Software Bill of Materials (SBOM). It was framed with a single, simple question (“What is an SBOM?”), and constituted an incredibly important step towards software security and a significant moment for open standards.

From NTIA’s SBOM FAQ: “A Software Bill of Materials (SBOM) is a complete, formally structured list of components, libraries, and modules that are required to build (i.e., compile and link) a given piece of software and the supply chain relationships between them. These components can be open source or proprietary, free or paid, and widely available or restricted access.” SBOMs that can be shared without friction between teams and companies will be a core part of software management for critical industries and digital infrastructure in the coming decades.

The ISO International Standard for open source license compliance (ISO/IEC 5230:2020 – Information technology — OpenChain Specification) requires a process for managing a bill of materials for supplied software. This aligns with the NTIA goals for increased software transparency and illustrates how the global industry is addressing challenges in this space. For example, it has become a best practice to include an SBOM for all components in supplied software, rather than isolating these materials to open source.

The open source community identified the need for, and began to address, the challenge of an SBOM “list of ingredients” over a decade ago. The de facto industry standard, and most widely used approach today, is called Software Package Data Exchange (SPDX). All of the elements in the NTIA-proposed minimum SBOM definition can be addressed by SPDX today, as well as broader use cases.
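
As an illustration only (the package, supplier, and document namespace below are hypothetical, not drawn from any real SBOM), a minimal SPDX 2.2 document in the human-readable tag-value format might look like this:

    SPDXVersion: SPDX-2.2
    DataLicense: CC0-1.0
    SPDXID: SPDXRef-DOCUMENT
    DocumentName: example-product-sbom
    DocumentNamespace: https://example.com/spdxdocs/example-product-sbom-1.0
    Creator: Organization: Example Corp
    Creator: Tool: example-sbom-generator-1.0
    Created: 2021-06-01T00:00:00Z

    PackageName: libexample
    SPDXID: SPDXRef-Package-libexample
    PackageVersion: 1.2.3
    PackageSupplier: Organization: Example Upstream Project
    PackageDownloadLocation: https://example.com/releases/libexample-1.2.3.tar.gz
    FilesAnalyzed: false
    PackageLicenseConcluded: MIT
    PackageLicenseDeclared: MIT
    PackageCopyrightText: NOASSERTION

    Relationship: SPDXRef-DOCUMENT DESCRIBES SPDXRef-Package-libexample

Even this small sketch carries the kinds of elements the NTIA minimum definition is concerned with: the supplier and name of each component, its version, a unique identifier, a relationship record, and the author and timestamp of the SBOM data itself.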

SPDX evolved organically over the last decade to suit the software industry, covering issues like license compliance, security, and more. The community consists of hundreds of people from hundreds of companies, and the standard itself is the most robust, mature, and widely adopted SBOM format in the market today.

The full SPDX specification is only one part of the picture. Optional components such as SPDX Lite, developed by Pioneer, Sony, Hitachi, Renesas, and Fujitsu, among others, provide a focused SBOM subset for smaller-scale supplier use. The community approach behind SPDX allows practical use cases to be addressed as they arise.

In 2020, SPDX was submitted to ISO via the PAS Transposition process of Joint Technical Committee 1 (JTC1) in collaboration with the Joint Development Foundation. It is currently in the approval phase of the transposition process and can be reviewed on the ISO website as ISO/IEC PRF 5962.

The Linux Foundation has prepared a submission for NTIA highlighting knowledge and experience gained from practical deployment and usage of SBOMs in the SPDX and OpenChain communities. These include lessons on the utility of specific practices, such as tracking timestamps and including data licenses in metadata. With the backing of many parties across the worldwide technology industry, the SPDX and OpenChain specifications are constantly evolving to support all stakeholders.

Industry Comments

The Sony team uses various approaches to managing open source compliance and governance… An example is using an OSS management template sheet based on SPDX Lite, a compact subset of the SPDX standard. Teams need to be able to review the type, version, and requirements of software quickly, and using a clear standard is a key part of this process.

Hisashi Tamai, SVP, Sony Group Corporation, Representative of the Software Strategy Committee

“Intel has been an early participant in the development of the SPDX specification and utilizes SPDX, as well as other approaches, both internally and externally for a number of open source software use-cases.”

Melissa Evers, Vice President – Intel Architecture, Graphics, Software / General Manager – Software Business Strategy

Scania corporate standard 4589 (STD 4589) was just made available to our suppliers and defines the expectations we have when Open Source is part of a delivery to Scania. So what is it we ask for in a relationship with our suppliers when it comes to Open Source? 

1) That suppliers conform to ISO/IEC 5230:2020 (OpenChain). If a supplier conforms to this specification, we feel confident that they have a professional management program for Open Source.  

2) If in the process of developing a solution for Scania, a supplier makes modifications to Open Source components, we would like to see those modifications contributed to the Open Source project. 

3) Supply a Bill of materials in ISO/IEC DIS 5962 (SPDX) format, plus the source code where there’s an obligation to offer the source code directly, so we don’t need to ask for it.

Jonas Öberg, Open Source Officer – Scania (Volkswagen Group)

The SPDX format greatly facilitates the sharing of software component data across the supply chain. Wind River has provided a Software Bill of Materials (SBOM) to its customers using the SPDX format for the past eight years. Often customers will request SBOM data in a custom format. Standardizing on SPDX has enabled us to deliver a higher quality SBOM at a lower cost.

Mark Gisi, Wind River Open Source Program Office Director and OpenChain Specification Chair

The Black Duck team from Synopsys has been involved with SPDX since its inception, and I had the pleasure of coordinating the activities of the project’s leadership for more than a decade. In addition, representatives from scores of companies have contributed to the important work of developing a standard way of describing and communicating the content of a software package.

Phil Odence, General Manager, Black Duck Audits, Synopsys

With the rapidly increasing interest in the types of supply chain risk that a Software Bill of Materials helps address, SPDX is gaining broader attention and urgency. FossID (now part of Snyk) has been using SPDX from the start as part of both software component analysis and for open source license audits. Snyk is stepping up its involvement too, already contributing to efforts to expand the use cases for SPDX by building tools to test out the draft work on vulnerability profiles in SPDX v3.0.

Gareth Rushgrove, Vice President of Products, Snyk

For more information on OpenChain: https://www.openchainproject.org/

For more information on SPDX: https://spdx.dev/

After careful consideration, we have decided that the safest course of action for returning to in-person events this fall is to take a “COVID-19 vaccine required” approach to participating in-person. Events that will be taking this approach include:

We are still evaluating whether to keep this requirement in place for events in December and beyond. We will share more information once we have an update.

Proof of full COVID-19 vaccination will be required to attend any of the events listed above. A person is considered fully vaccinated two weeks after the second dose of a two-dose series, or two weeks after a single dose of a one-dose vaccine.

Vaccination proof will be collected via a digitally secure vaccine verification application that will protect attendee data in accordance with EU GDPR, California CCPA, and US HIPAA regulations. Further details on the app we will be using, health and safety protocols that will be in place onsite at the events, and a full list of accepted vaccines will be added to individual event websites in the coming months. 

While this has been a difficult decision to make, the health and safety of our community and our attendees are of the utmost importance to us. Mandating vaccines will help infuse confidence and alleviate concerns that some may still have about attending an event in person. Additionally, it helps us keep our community members safe who have not yet been able to get vaccinated or who are unable to get vaccinated. 

This decision also allows us to be more flexible in pivoting with potential changes in guidelines that venues and municipalities may make as organizations and attendees return to in person events. Finally, it will allow for a more comprehensive event experience onsite by offering more flexibility in the structure of the event.

For those that are unable to attend in-person, all of our Fall 2021 events will have a digital component that anyone can participate in virtually. Please visit individual event websites for more information on the virtual aspect of each event.

We hope everyone continues to stay safe, and we look forward to seeing you, either in person or virtually, this fall. 

The Linux Foundation

FAQ

Q: If I’ve already tested positive for COVID-19, do I still need to show proof of COVID-19 vaccination to attend in person?

A: Yes, you will still need to show proof of COVID-19 vaccination to attend in-person.

Q: Are there any special circumstances in which you will accept a negative COVID-19 test instead of proof of a COVID-19 vaccination? 

A: Unfortunately, no. For your own safety, as well as the safety of all our onsite attendees, everyone who is not vaccinated against COVID-19 will need to participate in these events virtually this year, and will not be able to attend in-person.

Q: I cannot get vaccinated for medical, religious, or other reasons. Does this mean I cannot attend?

A: For your own safety, as well as the safety of all our onsite attendees, everyone who is not vaccinated against COVID-19 – even due to medical, religious or other reasons – will need to participate in these events virtually this year, and will not be able to attend in-person.

Q: Will I need to wear a mask and socially distance at these events if everyone is vaccinated? 

A: Mask and social distancing requirements for each event will be determined closer to event dates, taking into consideration venue and municipality guidelines.

Q: Can I bring family members to any portion of an event (such as an evening reception) if they have not provided COVID-19 vaccination verification in the app? 

A: No. Anyone that attends any portion of an event in-person will need to register for the event, and upload COVID vaccine verification into our application.

Q: Will you provide childcare onsite at events again this year?

A: Due to COVID-19 restrictions, we unfortunately cannot offer child care services onsite at events at this time. We can, however, provide a list of local childcare providers. We apologize for this disruption to our normal event plans. We will be making this service available as soon as we can for future events.

Q: Will international attendees (from outside the US) be able to attend? Will you accept international vaccinations?

A: Absolutely. As mentioned above, a full list of accepted vaccines will be added to individual event websites in the coming months. 

In April, The Linux Foundation asked the open source community: How has Linux impacted your life? Needless to say, responses poured in from across the globe sharing memories, sentiments and important moments that changed your lives forever. We are grateful you took the time to tell us your stories.

We’re thrilled to share 30 of the responses we received, randomly selected from all submissions. As a thank you to these 30 folks for sharing their stories, and in celebration of the 30th Anniversary of Linux, 30 penguins were adopted* from the Southern African Foundation for the Conservation of Coastal Birds in their honor, and each of our submitters got to name their adopted penguin. 

Check out the slides below to read these stories, get a glimpse of their newly adopted penguins and their new names!

Thank you to all who contributed for inspiring us and the community for the next 30 years of innovation and beyond. 

*Each of the adopted wild African penguins has been rescued and is being rehabilitated with the goal of being released back into the wild by the wonderful and dedicated staff at SANCCOB.

Jason Perlow, Director of Project Insights and Editorial Content, spoke with Stephen Hendrick about Linux Foundation Research and how it will promote a greater understanding of the work being done by open source projects, their communities, and the Linux Foundation.

JP: It’s great to have you here today, and also, welcome to the Linux Foundation. First, can you tell me a bit about yourself, where you are from, and your interests outside work?

SH: I’m from the northeastern US.  I started as a kid in upstate NY and then came to the greater Boston area when I was 8.  I grew up in the Boston area, went to college back in upstate NY, and got a graduate degree in Boston.  I’ve worked in the greater Boston area since I was out of school and have really had two careers.  My first career was as a programmer, which evolved into project and product management doing global cash management for JPMC.  When I was in banking, IT was approached very conservatively, with a tagline like yesterday’s technology, tomorrow.  The best thing about JPMC was that it was where I met my wife.  Yes, I know, you’re never supposed to date anybody from work.  But it was the best decision I ever made.  After JPMC, my second career began as an industry analyst working for IDC, specializing in application development and deployment tools and technologies.  This was a long-lived 25+ year career followed by time with a couple of boutique analyst firms and cut short by my transition to the Linux Foundation.

Until recently, interests outside of work mainly included vertical pursuits — rock climbing during the warm months and ice climbing in the winter.  The day I got engaged, my wife (to be) and I had been climbing in the morning, and she jokes that if she didn’t make it up that last 5.10, I wouldn’t have offered her the ring.  However, having just moved to a house overlooking Mt. Hope bay in Rhode Island, our outdoor pursuits will become more nautically focused.

JP: And from what organization are you joining us?

SH: I was lead analyst at Enterprise Management Associates, a boutique industry analyst firm.  I initially focused my practice area on DevOps, but in reality, since I was the only person with application development and deployment experience, I also covered adjacent markets that included primary research into NoSQL, Software Quality, PaaS, and decisioning.  

JP: Tell me a bit more about your academic and quantitative analysis background; I see you went to Boston University, which was my mom’s alma mater as well. 

SH:  I went to BU for an MBA.  In the process, I concentrated in quantitative methods, including decisioning, Bayesian methods, and mathematical optimization.  This built on my undergraduate math and economics focus and was a kind of predecessor to today’s data science focus.  The regression work that I did served me well as an analyst and was the foundation for much of the forecasting work I did and industry models that I built.  My qualitative and quantitative empirical experience was primarily gained through experience in the more than 100 surveys and in-depth interviews I have fielded.  

JP: What disciplines do you feel most influence your analytic methodology? 

SH: We now live in a data-driven world, and math enables us to gain insight into the data.  So math and statistics are the foundation that analysis is built on.  So, math is most important, but so is the ability to ask the right questions.  Asking the right questions provides you with the data (raw materials) shaped into insights using math.  So analysis ends up being a combination of both art and science.

JP: What are some of the most enlightening research projects you’ve worked on in your career? 

SH:  One of the most exciting projects I cooked up was to figure out how many professional developers there were in the world, by country, with five years of history and a 5-year forecast.  I developed a parameterized logistic curve tuned to each country using the CIA, WHO, UN, and selected country-level data.  It was a landmark project at the time and used by the world’s leading software and hardware manufacturers. I was flattered to find out six years later that another analyst firm had copied it (since I provided the generalized equation in the report).

I was also interested in finding that an up-and-coming SaaS company had used some of my published matrix data on language use, which showed huge growth in Ruby.  This company used my findings and other evidence to help drive its acquisition of a successful Ruby cloud application platform.

JP: I see that you have a lot of experience working at enterprise research firms, such as IDC, covering enterprise software development. What lessons do you think we can learn from the enterprise and how to approach FOSS in organizations adopting open source technologies?

SH: The analyst community has struggled at times to understand the impact of OSS. Part of this stems from the economic foundation of the supply side research that gets done.  However, this has changed radically over the past eight years due to the success of Linux and the availability of a wide variety of curated open source products that have helped transform and accelerate the IT industry.  Enterprises today are less concerned about whether a product/service is open or closed source.  Primarily they want tools that are best able to address their needs. I think of this as a huge win for OSS because it validates the open innovation model that is characteristic of OSS. 

JP: So you are joining the Linux Foundation at a time when we have just gotten our research division off the ground. What are the kind of methodologies and practices that you would like to take from your years at firms like IDC and EMA and see applied to our new LF Research?

SH: LF is in the enviable position of having close relationships with IT luminaries, academics, hundreds of OSS projects, and a significant portion of the IT community.  The LF has an excellent opportunity to develop world-class research that helps the IT community, industry, and governments better understand OSS’s pivotal role in shaping IT going forward.

I anticipate that we will use a combination of quantitative and qualitative research to tell this story.  Quantitative research can deliver statistically significant findings, but qualitative interview-based research can provide examples, sound bites, and perspectives that help communicate a far more nuanced understanding of OSS’s relationship with IT.

JP: How might these approaches contrast with other forms of primary research, specifically human interviews? What are the strengths and weaknesses of the interview process?

SH: Interviews help fill in the gaps around discrete survey questions in ways that can be insightful, personal, entertaining, and unexpected.  Interviews can also provide context for understanding the detailed findings from surveys and provide confirmation or adjustments to models based on underlying data.

JP: What are you most looking forward to learning through the research process into open source ecosystems?

SH: The transformative impact that OSS is having on the digital economy and helping enterprises better understand when to collaborate and when to compete.

JP: What insights do you feel we can uncover with the quantitative analysis we will perform in our upcoming surveys? Are there things that we can learn about the use of FOSS in organizations?

SH: A key capability of empirical research is that it can be structured to highlight how enterprises are leveraging people, policy, processes, and products to address market needs.  Since enterprises are widely distributed in their approach and best/worst practices to a particular market, data can help us build maturity models that provide advice on how enterprises can shape strategy and decisions based on the experience and best practices of others.

JP: Trust in technology (and other facets of society) is arguably at an all-time low right now. Do you see a role for LF Research to help improve levels of trust in not only software but in open source as an approach to building secure technologies? What are the opportunities for this department?

SH: I’m reminded of the old saying that there are “lies, damned lies, and then there are statistics.” If trust in technology is at an all-time low, it’s because there are people in this world with a certain moral flexibility, and the IT industry has not yet found effective ways to prevent the few from exploiting the many.  LF Research is in the unique position to help educate and persuade through factual data and analysis on accelerating improvements in IT security.

JP: Thanks, Steve. It’s been great talking to you today!

Jason Perlow, Director of Project Insights and Editorial Content at the Linux Foundation, spoke with Daniel Scales about the importance of protecting trademarks in open source projects.

JP: It’s great to have you here today, and also, welcome to the Linux Foundation. First, can you tell me a bit about yourself, where you are from, and your interests outside work?

DS: Thanks, Jason! It is great to be here. I grew up in Upstate New York, lived in Washington and London for a few years after college, and have been in Boston for the last 20+ years. Outside of work, I coach my daughter’s soccer team, I like to cook and play my bass guitar, and I am really looking forward to getting back to some live music and sporting events. 

JP: And from what organization are you joining us?

DS: I have been with the Boston law firm Choate, Hall & Stewart since 2011. In addition to advising The Linux Foundation and other clients on trademark matters, I helped clients with open source license questions, technology licenses, and IP-focused transactions.  Before Choate, I worked as IP Counsel at Avid Technology, where I managed their trademark portfolio through a global rebranding and supported the engineering team on technology licenses. 

JP: So, how did you get into Intellectual Property law?

DS: Great question.  I studied economics in college and took a fantastic senior seminar on the economics of intellectual property.  After graduation, I worked in the economics consulting group at Ernst & Young.  A big part of my job there was determining the value of a company’s intangible property, which in many cases were its brands. I went to law school intending to study trademarks and the new field of “internet law” (this term probably dates me) and started my legal career at Testa, Hurwitz & Thibeault, which had a cutting-edge trademark and open source group.

JP: We typically think of IP and Trademark law as it applies to consumer products and commercial entities. What is the difference between those and when open source projects and organizations use brands?

DS:  On one level, there really isn’t a difference.  A trademark signifies the unique source of a good or service. Trademarks help consumers, developers, and users distinguish various offerings, and they communicate the specific source and quality of those offerings.  Software developers and users need to understand what code they have and where it came from. Trademarks help communicate that information.  Of course, the specific issues that every brand and brand owner faces and how they address them are different, but many of the core principles are the same.

JP: What are some of the trademark issues you’ve seen come up in open source communities?

DS: While it happens in every industry, I see many “helpful” people apply to register projects’ trademarks when they are not the rightful owner.  Sometimes they have good intentions, sometimes not, but it can be a lot of work to sort it out either way.  I’ve also had the opportunity to work with many different people and companies on project branding. It is amazing how many different philosophies there are regarding branding, even within the software industry.  Much of what we do is to bring these folks together to determine the best approach for the specific project.  I also spend a lot of time debating the scope of trademark rights with opposing counsel, but that isn’t really unique to open source:  one lawyer tried to convince me that his client had the exclusive right to use a picture of a hop flower on a beer label. 

Another common issue arises when a company registers a mark for its company or product and then uses the same mark for an open source project. Those situations lack neutrality, and the Linux Foundation has worked with organizations making this transition. Sometimes it involves rebranding the open source project, and we assist in finding and clearing a new name for the community to use independent of the company that started it.

JP: Why is the Linux Foundation a good place for open source projects to protect their brands?

DS: We have worked with many open source projects on their trademarks, and we learn something with every new experience.  We can help them name the project at the beginning, take steps to protect their trademarks across the globe, and show them how trademarks can be a tool to build their communities and increase participation and adoption.  We also recognize the importance of our neutral position in the industries we serve and how that is fundamental to open governance.

Also Read: Open Source Communities and Trademarks: A Reprise

JP: Trademark conformance can also protect a project from technical drift. How can a trademark conformance program be used to encourage conformance with a project’s code base or interfaces? 

DS: Great point. As in most areas of trademarks, clarity and consistency are key. Trademarks used in a conformance program can be a great tool to communicate quickly and accurately to the target community.  Projects can develop specific and transparent criteria so that users understand exactly what the conformance trademark symbolizes.  This can be much more effective and efficient for projects and users alike than everyone deciding for themselves what a term like “compatible” might mean.  

Also Read: Driving Compatibility with Code and Specifications through Conformance Trademark Programs

JP: Do projects at the Linux Foundation give up all control of their trademark? How do you decide what enforcement to pursue or not pursue?

DS: On the contrary — we work very closely with project leadership throughout the lifecycle of their trademarks.  This includes trademark enforcement.  Typically, the first step is to figure out whether the situation requires enforcement (in the traditional legal sense) or if it is simply a matter of educating another party.  More often than not, we can reach out to the other party, discuss our project and our trademarks, discuss our concerns, and work out a solution that works for everyone and strengthens our brands.  But like any brand owner, we do sometimes have to take other action to protect our projects’ trademarks, and we work closely with our projects in those situations, too.

JP: Thanks, Daniel. It’s been great talking to you today!

There is an exciting convergence in the networking industry around open source, and the energy is palpable. At LF Networking, we have a unique perspective as the largest open source initiative in the networking space with the broadest set of projects that make up the diverse and evolving open source networking stack. LF Networking provides platforms and building blocks across the networking industry that enable rapid interoperability, deployment, and adoption and is the nexus for 5G innovation and integration. 

LF Networking has now tapped the confluence of industry efforts to structure a new initiative to develop 5G Super Blueprints for the ecosystem. Major integrations between the building blocks are now underway, including ONAP and O-RAN, Akraino and Magma, Anuket and Kubernetes, and more.

“Super” means that we’re integrating multiple projects and umbrellas (such as LF Edge, Magma, CNCF, O-RAN Alliance, LF Energy, and more) with an end-to-end framework for the underlying infrastructure and application layers across edge, access, and core. This end-to-end integration enables top industry use cases, such as fixed wireless, mobile broadband, private 5G, multi-access, IoT, voice services, network slicing, and more. In short, 5G Super Blueprints are a vehicle to collaborate and create end-to-end 5G solutions.

Major industry verticals banking on this convergence and roadmap include the global telcos that you’d expect, but 5G knows no boundaries, and we’re seeing deep engagement from cloud service providers, enterprise IT, governments, and even energy.

5G is poised to modernize today’s energy grid with awareness monitoring across Distribution Systems and more.

This will roll out in three phases: the first encompasses 5G Core plus Multi-access Edge Computing (MEC) using emulators, the second introduces commercial RANs for end-to-end 5G, and the third will integrate Open Radio Access Network (O-RAN).

The 5G Super Blueprint is an open initiative, and participation is open to anyone. To learn more, please see the 5G Super Blueprint FAQ and watch the video, What is the 5G Super Blueprint?, from Next Gen Infra.

Participation in this group has tripled over the last few weeks! If you’re ready to join us, please indicate your interest in participation on the 5G Super Blueprint webpage, and follow the onboarding steps on the 5G Super Blueprint Wiki. Send any questions to superblueprint@lfnetworking.org.

Author: Kate Stewart, VP of Dependable Systems, The Linux Foundation

In a previous Linux Foundation blog, David A. Wheeler, director of LF Supply Chain Security, discussed how capabilities built by Linux Foundation communities can be used to address the software supply chain security requirements set by the US Executive Order on Cybersecurity. 

One of those capabilities, SPDX, completely addresses the Executive Order’s 4(e), 4(f), and 10(j) requirements for a Software Bill of Materials (SBOM). The SPDX specification is implemented as a file format that identifies the software components within a larger piece of computer software and metadata such as the licenses of those components.

SPDX is an open standard for communicating software bill of material (SBOM) information, including components, licenses, copyrights, and security references. It has a rich ecosystem of existing tools that provides a common format for companies and communities to share important data to streamline and improve the identification and monitoring of software.

SBOMs have numerous use cases. They have frequently been used in areas such as license compliance but are equally useful in security, export control, and broader processes such as mergers and acquisitions (M&A) or venture capital investments. SPDX maintains an active community to support these uses, modeling its governance and activity on the same approach that has successfully supported open source software projects over the past three decades.

The LF has been developing and refining SPDX for over ten years and has seen extensive uptake by companies and projects in the software industry.  Notable recent examples are the contributions by companies such as Hitachi, Fujitsu, and Toshiba in furthering the standard via optional profiles like “SPDX Lite” in the SPDX 2.2 specification release and in support of the SPDX SBOMs in proprietary and open source automation solutions. 

This de facto standard has been submitted to ISO via the Joint Development Foundation using the PAS Transposition process of Joint Technical Committee 1 (JTC1). It is currently in the enquiry phase of the process and can be reviewed on the ISO website as ISO/IEC DIS 5962.

There is a wide range of open source tooling available today, and commercial tool options are emerging as well. Companies such as FossID and Synopsys have been working with the SPDX format for several years. Open source tools like FOSSology (source code analysis), OSS Review Toolkit (generation from CI and build infrastructure), Tern (container content analysis), Quartermaster (build extensions), and ScanCode (source code analysis), in addition to the SPDX-tools project, have standardized on SPDX for interchange and also participate in the Automated Compliance Tooling (ACT) project umbrella. ACT has been discussed as a community-driven solution for software supply chain security remediation as part of our synopsis of the findings in the Vulnerabilities in the Core study, published by the Linux Foundation and Harvard University LISH in February 2020.

One thing is clear: A software bill of materials that can be shared without friction between different teams and companies will be a core part of software development and deployment in this coming decade. The sharing of software metadata will take different forms, including manual and automated reviews, but the core structures will remain the same. 

Standardization in this field, as in others, is the key to success. This domain has the advantage of benefiting from an entire decade of prior work on SPDX. The task therefore becomes applying this standard to the various domains, rather than creating, expanding, or refining new or nascent approaches.

Start using the SPDX specification here: https://spdx.github.io/spdx-spec/. Development of the next revision is underway, so if there is a use case you cannot represent with the current specification, open an issue; this is the right window for input.
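
As a small illustration of how approachable the tag-value format is to work with, here is a minimal, unofficial Python sketch that summarizes the packages in an SPDX 2.2 tag-value file. The input filename is hypothetical, the sketch handles only simple single-line tag/value pairs, and real pipelines would normally rely on the existing SPDX tooling ecosystem rather than hand-rolled parsing:

    # Minimal sketch: summarize the packages listed in an SPDX 2.2 tag-value SBOM.
    # Illustration only, not an official SPDX tool; multi-line <text> values and
    # other advanced fields are deliberately ignored.
    def summarize_spdx(path):
        packages, current = [], None
        with open(path, encoding="utf-8") as f:
            for raw in f:
                line = raw.strip()
                if not line or line.startswith("#") or ": " not in line:
                    continue
                tag, value = line.split(": ", 1)
                if tag == "PackageName":
                    current = {"name": value, "version": "?", "license": "?"}
                    packages.append(current)
                elif current is not None and tag == "PackageVersion":
                    current["version"] = value
                elif current is not None and tag == "PackageLicenseConcluded":
                    current["license"] = value
        return packages

    # Hypothetical input file produced by a supplier or build system.
    for pkg in summarize_spdx("libexample.spdx"):
        print(f'{pkg["name"]} {pkg["version"]} (license: {pkg["license"]})')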

To learn more about the many facets of the SPDX project see: https://spdx.dev/

Our communities take security seriously and have been instrumental in creating the tools and standards that every organization needs to comply with the recent US Executive Order

Overview

The US White House recently released its Executive Order (EO) on Improving the Nation’s Cybersecurity (along with a press call) to counter “persistent and increasingly sophisticated malicious cyber campaigns that threaten the public sector, the private sector, and ultimately the American people’s security and privacy.”

In this post, we’ll show what the Linux Foundation’s communities have already built that support this EO and note some other ways to assist in the future. But first, let’s put things in context.

The Linux Foundation’s Open Source Security Initiatives In Context

We deeply care about security, including supply chain (SC) security. The Linux Foundation is home to some of the most important and widely-used OSS, including the Linux kernel and Kubernetes. The LF’s previous Core Infrastructure Initiative (CII) and its current Open Source Security Foundation (OpenSSF) have been working to secure OSS, both in general and in widely-used components. The OpenSSF, in particular, is a broad industry coalition “collaborating to secure the open source ecosystem.”

The Software Package Data Exchange (SPDX) project has been working for the last ten years to enable software transparency and the exchange of software bill of materials (SBOM) data necessary for security analysis. SPDX is in the final stages of review to be an ISO standard, is supported by global companies with massive supply chains, and has a large open and closed source tooling support ecosystem. SPDX already meets the requirements of the executive order for SBOMs.

Finally, several LF foundations have focused on the security of various verticals. For example, LF Public Health and LF Energy have worked on security in their respective sectors. The cloud computing industry, collaborating within CNCF, has also produced a guide for supporting software supply chain best practices for cloud systems and applications.

Given that context, let’s look at some of the EO statements (in the order they are written) and how our communities have invested years in open collaboration to address these challenges.

Best Practices

EO 4(b) and 4(c) say that

The “Secretary of Commerce [acting through NIST] shall solicit input from the Federal Government, private sector, academia, and other appropriate actors to identify existing or develop new standards, tools, and best practices for complying with the standards, procedures, or criteria [including] criteria that can be used to evaluate software security, include criteria to evaluate the security practices of the developers and suppliers themselves, and identify innovative tools or methods to demonstrate conformance with secure practices [and guidelines] for enhancing software supply chain security.” Later in EO 4(e)(ix) it discusses “attesting to conformity with secure software development practices.”

The OpenSSF’s CII Best Practices badge project specifically identifies best practices for OSS, focusing on security and including criteria to evaluate the security practices of developers and suppliers (it has over 3,800 participating projects). LF is also working with SLSA (currently in development) as potential additional guidance focused on addressing supply chain issues further.

Best practices are only useful if developers understand them, yet most software developers have never received education or training in developing secure software. The LF has developed and released its Secure Software Development Fundamentals set of courses available on edX to anyone at no cost. The OpenSSF Best Practices Working Group (WG) actively works to identify and promulgate best practices. We also provide a number of specific standards, tools, and best practices, as discussed below.

Encryption and Data Confidentiality

The EO 3(d) requires agencies to adopt “encryption for data at rest and in transit.” Encryption in transit is implemented on the web using the TLS (“https://”) protocol, and Let’s Encrypt is the world’s largest certificate authority for TLS certificates.
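
For a concrete sense of what “encryption for data in transit” looks like at the code level, here is a minimal Python sketch that opens a TLS connection with certificate and hostname verification enabled. The host name is a placeholder, and production services would typically use higher-level HTTPS libraries rather than raw sockets:

    # Minimal sketch: establish a verified TLS connection using only the
    # Python standard library. The host below is a hypothetical placeholder.
    import socket
    import ssl

    HOST = "www.example.com"

    # create_default_context() enables certificate validation and hostname
    # checking, which is what encryption in transit relies on in practice.
    context = ssl.create_default_context()

    with socket.create_connection((HOST, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Negotiated protocol:", tls.version())          # e.g. TLSv1.3
            print("Certificate subject:", tls.getpeercert()["subject"])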

In addition, the LF Confidential Computing Consortium is dedicated to defining and accelerating the adoption of confidential computing. Confidential computing protects data in use (not just at rest and in transit) by performing computation in a hardware-based Trusted Execution Environment. These secure and isolated environments prevent unauthorized access or modification of applications and data while in use.

Supply Chain Integrity

The EO 4(e)(iii) states a requirement for

 “employing automated tools, or comparable processes, to maintain trusted source code supply chains, thereby ensuring the integrity of the code.” 

The LF has many projects that support supply chain integrity (a minimal integrity-check sketch follows the list below), in particular:

  • in-toto is a framework specifically designed to secure the integrity of software supply chains.
  • The Update Framework (TUF) helps developers maintain the security of software update systems, and is used in production by various tech companies and open source organizations.  
  • Uptane is a variant of TUF; it’s an open and secure software update system design which protects software delivered over-the-air to the computerized units of automobiles.
  • sigstore is a project to provide a public good / non-profit service to improve the open source software supply chain by easing the adoption of cryptographic software signing (of artifacts such as release files and container images) backed by transparency log technologies (which provide a tamper-resistant public log). 
  • OpenChain (ISO 5230) is the International Standard for open source license compliance. Application of OpenChain requires identification of OSS components. While OpenChain by itself focuses more on licenses, that identification is easily reused to analyze other aspects of those components once they’re identified (for example, to look for known vulnerabilities).
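
None of the projects above reduces to a single snippet, but the basic idea they all build on, checking that what you received matches a trusted published reference, can be shown in a minimal Python sketch. This is a generic digest comparison for illustration only, not the in-toto, TUF, Uptane, or sigstore workflow itself:

    # Minimal sketch: verify a downloaded artifact against a published SHA-256
    # digest. Real supply chain tooling (in-toto, TUF, sigstore) layers signed
    # metadata and transparency logs on top of this basic check.
    import hashlib
    import hmac
    import sys

    def sha256_of(path):
        """Return the hex SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify(path, expected_hex):
        """Compare the computed and published digests in constant time."""
        return hmac.compare_digest(sha256_of(path), expected_hex.lower())

    if __name__ == "__main__":
        artifact, expected = sys.argv[1], sys.argv[2]
        if verify(artifact, expected):
            print("OK: artifact matches the published digest")
        else:
            print("FAIL: digest mismatch; do not install")
            sys.exit(1)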

Software Bill of Materials (SBOMs) support supply chain integrity; our SBOM work is so extensive that we’ll discuss that separately.

Software Bill of Materials (SBOMs)

Many cyber risks come from using components with known vulnerabilities. Known vulnerabilities are especially concerning in key infrastructure industries, such as the national fuel pipelines,  telecommunications networks, utilities, and energy grids. The exploitation of those vulnerabilities could lead to interruption of supply lines and service, and in some cases, loss of life due to a cyberattack.

One-time reviews don’t help since these vulnerabilities are typically found after the component has been developed and incorporated. Instead, what is needed is visibility into the components of the software environments that run these key infrastructure systems, similar to how food ingredients are made visible.

A Software Bill of Materials (SBOM) is a nested inventory or a list of ingredients that make up the software components used in creating a device or system. This is especially critical as it relates to a national digital infrastructure used within government agencies and in key industries that present national security risks if penetrated. Use of SBOMs would improve understanding of the operational and cyber risks of those software components from their originating supply chain.

The EO has extensive text about requiring a software bill of materials (SBOM) and tasks that depend on SBOMs:

  • EO 4(e) requires providing a purchaser an SBOM “for each product directly or by publishing it on a public website” and “ensuring and attesting… the integrity and provenance of open source software used within any portion of a product.” 
  • It also requires tasks that typically require SBOMs, e.g., “employing automated tools, or comparable processes, that check for known and potential vulnerabilities and remediate them, which shall operate regularly….” and “maintaining accurate and up-to-date data, provenance (i.e., origin) of software code or components, and controls on internal and third-party software components, tools, and services present in software development processes, and performing audits and enforcement of these controls on a recurring basis.” 
  • EO 4(f) requires publishing “minimum elements for an SBOM,” and EO 10(j) formally defines an SBOM as a “formal record containing the details and supply chain relationships of various components used in building software…  The SBOM enumerates [assembled] components in a product… analogous to a list of ingredients on food packaging.”

The LF has been developing and refining SPDX for over ten years; SPDX is used worldwide and is in the process of being approved as ISO/IEC Draft International Standard (DIS) 5962.  SPDX is a file format that identifies the software components within a larger piece of computer software and metadata such as the licenses of those components. SPDX 2.2 already supports the current guidance from the National Telecommunications and Information Administration (NTIA) for minimum SBOM elements. Some ecosystems have ecosystem-specific conventions for SBOM information, but SPDX can provide information across all arbitrary ecosystems.

SPDX is real and in use today, with increased adoption expected in the future. For example:

  • An NTIA “plugfest” demonstrated ten different producers generating SPDX. SPDX supports acquiring data from different sources (e.g., source code analysis, executables from producers, and analysis from third parties). 
  • A corpus of some LF projects with SPDX source SBOMs is available. 
  • Various LF projects are working to generate binary SBOMs as part of their builds, including Yocto and Zephyr.
  • To assist with further SPDX adoption, the LF is paying to write SPDX plugins for major package managers.

Vulnerability Disclosure

No matter what, some vulnerabilities will be found later and need to be fixed. EO 4(e)(viii) requires “participating in a vulnerability disclosure program that includes a reporting and disclosure process.” That way, vulnerabilities that are found can be reported to the organizations that can fix them. 

The CII Best Practices badge passing criteria requires that OSS projects specifically identify how to report vulnerabilities to them. More broadly, the OpenSSF Vulnerability Disclosures Working Group is working to help “mature and advocate well-managed vulnerability reporting and communication” for OSS. Most widely-used Linux distributions have a robust security response team, but the Alpine Linux distribution (widely used in container-based systems) did not. The Linux Foundation and Google funded various improvements to Alpine Linux, including a security response team.

We hope that the US will update its Vulnerabilities Equities Process (VEP) to work more cooperatively with commercial organizations, including OSS projects, to share more vulnerability information. Every vulnerability that the US fails to disclose is a vulnerability that can be found and exploited by attackers. We would welcome such discussions.

Critical Software

It’s especially important to focus on critical software — but what is critical software? EO 4(g) requires the executive branch to define “critical software,” and 4(h) requires the executive branch to “identify and make available to agencies a list of categories of software and software products… meeting the definition of critical software.”

Linux Foundation and the Laboratory for Innovation Science at Harvard (LISH) developed the report ‘Vulnerabilities in the Core,’ a Preliminary Report and Census II of Open Source Software, which analyzed the use of OSS to help identify critical software. The LF and LISH are in the process of updating that report. The CII identified many important projects and assisted them, including OpenSSL (after Heartbleed), OpenSSH, GnuPG, Frama-C, and the OWASP Zed Attack Proxy (ZAP). The OpenSSF Securing Critical Projects Working Group has been working to better identify critical OSS projects and to focus resources on critical OSS projects that need help. There is already a first-cut list of such projects, along with efforts to fund such aid.

Internet of Things (IoT)

Unfortunately, internet-of-things (IoT) devices often have notoriously bad security. It’s often been said that “the S in IoT stands for security.” 

EO 4(s) initiates a pilot program to “educate the public on the security capabilities of Internet-of-Things (IoT) devices and software development practices [based on existing consumer product labeling programs], and shall consider ways to incentivize manufacturers and developers to participate in these programs.” EO 4(t) states that such “IoT cybersecurity criteria” shall “reflect increasingly comprehensive levels of testing and assessment.”

The Linux Foundation develops and is home to many of the key components of IoT systems. These include:

  • The Linux kernel, used by many IoT devices. 
  • The Yocto Project, which creates custom Linux-based systems for IoT and embedded systems. Yocto supports fully reproducible builds. 
  • EdgeX Foundry, which is a flexible OSS framework that facilitates interoperability between devices and applications at the IoT edge, and has been downloaded millions of times. 
  • The Zephyr project, which provides a real-time operating system (RTOS) used by many resource-constrained IoT devices and is able to generate SBOMs automatically during build. Zephyr is one of the few open source projects that is a CVE Numbering Authority.
  • The seL4 microkernel, which is the most assured operating system kernel in the world; it’s notable for its comprehensive formal verification.

Security Labeling

EO 4(u) focuses on identifying:

“secure software development practices or criteria for a consumer software labeling program [that reflects] a baseline level of secure practices, and if practicable, shall reflect increasingly comprehensive levels of testing and assessment that a product may have undergone [and] identify, modify, or develop a recommended label or, if practicable, a tiered software security rating system.”

The OpenSSF’s CII Best Practices badge project (noted earlier) specifically identifies best practices for OSS development, and is already tiered (passing, silver, and gold). Over 3,800 projects currently participate.

There are also a number of projects that relate to measuring security and/or broader quality:

Conclusion

The Linux Foundation (LF) has long been working to help improve the security of open source software (OSS), which powers systems worldwide. We couldn’t do this without the many contributions of time, money, and other resources from numerous companies and individuals; we gratefully thank them all.  We are always delighted to work with anyone to improve the development and deployment of open source software, which is important to us all.

David A. Wheeler, Director of Open Source Supply Chain Security at the Linux Foundation