Nearly a year after the Internet Engineering Task Force took up a plan to replace words that could be considered racist, the debate is still raging.

Anyone who joined a video call during the pandemic probably has a global volunteer organization called the Internet Engineering Task Force to thank for making the technology work. The group, which helped create the technical foundations of the internet, designed the language that allows most video to run smoothly online. It made it possible for someone with a Gmail account to communicate with a friend who uses Yahoo, and for shoppers to safely enter their credit card information on e-commerce sites.

Now the organization is tackling an even thornier issue: getting rid of computer engineering terms that evoke racist history, like “master” and “slave” and “whitelist” and “blacklist.”

But what started as an earnest proposal has stalled as members of the task force have debated the history of slavery and the prevalence of racism in tech. Some companies and tech organizations have forged ahead anyway, raising the possibility that important technical terms will have different meanings to different people — a troubling proposition for an engineering world that needs broad agreement so technologies work together.

While the fight over terminology reflects the intractability of racial issues in society, it is also indicative of a peculiar organizational culture that relies on informal consensus to get things done.

The Internet Engineering Task Force eschews voting, and it often measures consensus by asking opposing factions of engineers to hum during meetings. The hums are then assessed by volume and ferocity. Vigorous humming, even from only a few people, could indicate strong disagreement, a sign that consensus has not yet been reached.

The I.E.T.F. has created rigorous standards for the internet and for itself. Until 2016, it required the documents in which its standards are published to be precisely 72 characters wide and 58 lines long, a format adapted from the era when programmers punched their code into paper cards and fed them into early IBM computers.

“We have big fights with each other, but our intent is always to reach consensus,” said Vint Cerf, one of the founders of the task force and a vice president at Google. “I think that the spirit of the I.E.T.F. still is that, if we’re going to do anything, let’s try to do it one way so that we can have a uniform expectation that things will function.”

The group is made up of about 7,000 volunteers from around the world. It has two full-time employees, an executive director and a spokesman, whose work is primarily funded by meeting dues and the registration fees of dot-org internet domains. It cannot force giants like Amazon or Apple to follow its guidance, but tech companies often choose to do so because the I.E.T.F. has created elegant solutions for engineering problems.

Its standards are hashed out during fierce debates on email lists and at in-person meetings. The group encourages participants to fight for what they believe is the best approach to a technical problem.

While shouting matches are not uncommon, the Internet Engineering Task Force is also a place where young technologists break into the industry. Attending meetings is a rite of passage, and engineers sometimes leverage their task force proposals into job offers from tech giants.

In June, against the backdrop of the Black Lives Matter protests, engineers at social media platforms, coding groups and international standards bodies re-examined their code and asked themselves: Was it racist? Some of their databases were called “masters” and were surrounded by “slaves,” which received information from the masters and answered queries on their behalf, preventing them from being overwhelmed. Others used “whitelists” and “blacklists” to filter content.

Mallory Knodel, the chief technology officer at the Center for Democracy and Technology, a policy organization, wrote a proposal suggesting that the task force use more neutral language. Invoking slavery was alienating potential I.E.T.F. volunteers, and the terms should be replaced with ones that more clearly described what the technology was doing, argued Ms. Knodel and the co-author of her proposal, Niels ten Oever, a postdoctoral researcher at the University of Amsterdam. “Blocklist” would explain what a blacklist does, and “primary” could replace “master,” they wrote.

On an email list, responses trickled in. Some were supportive. Others proposed revisions. And some were vehemently opposed. One respondent wrote that Ms. Knodel’s draft tried to construct a new “Ministry of Truth.”

Amid insults and accusations, many members announced that the battle had become too toxic and that they would abandon the discussion.

The pushback didn’t surprise Ms. Knodel, who had proposed similar changes in 2018 without gaining traction. The engineering community is “quite rigid and averse to these sorts of changes,” she said. “They are averse to conversations about community comportment, behavior — the human side of things.”

In July, the Internet Engineering Task Force’s steering group issued a rare statement about the draft from Ms. Knodel and Mr. ten Oever. “Exclusionary language is harmful,” it said.

A month later, two alternative proposals emerged. One came from Keith Moore, an I.E.T.F. contributor who initially backed Ms. Knodel’s draft before creating his own. His cautioned that fighting over language could bottleneck the group’s work and argued for minimizing disruption.

The other came from Bron Gondwana, the chief executive of the email company Fastmail, who said he had been motivated by the acid debate on the mailing list.

“I could see that there was no way we would reach a happy consensus,” he said. “So I tried to thread the needle.”

Mr. Gondwana suggested that the group should follow the tech industry’s example and avoid terms that would distract from technical advances.

Last month, the task force said it would create a new group to consider the three drafts and decide how to proceed, and members involved in the discussion appeared to favor Mr. Gondwana’s approach. Lars Eggert, the organization’s chair and the technical director for networking at the company NetApp, said he hoped guidance on terminology would be issued by the end of the year.

The rest of the industry isn’t waiting. The programming community that maintains MySQL, a type of database software, chose “source” and “replica” as replacements for “master” and “slave.” GitHub, the code repository owned by Microsoft, opted for “main” instead of “master.”

In July, Twitter also replaced a number of terms after Regynald Augustin, an engineer at the company, came across the word “slave” in Twitter’s code and advocated change.

But while the industry abandons objectionable terms, there is no consensus about which new words to use. Without guidance from the Internet Engineering Task Force or another standards body, engineers decide on their own. The World Wide Web Consortium, which sets guidelines for the web, updated its style guide last summer to “strongly encourage” members to avoid terms like “master” and “slave,” and the IEEE, an organization that sets standards for chips and other computing hardware, is weighing a similar change.

Other tech workers are trying to solve the problem by forming a clearinghouse for ideas about changing language.

That effort, the Inclusive Naming Initiative, aims to provide guidance to standards bodies and companies that want to change their terminology but don’t know where to begin.

The group got together while working on an open-source software project, Kubernetes, which like the I.E.T.F. accepts contributions from volunteers. Like many others in tech, it began the debate over terminology last summer.

“We saw this blank space,” said Priyanka Sharma, the general manager of the Cloud Native Computing Foundation, a nonprofit that manages Kubernetes. Ms. Sharma worked with several other Kubernetes contributors, including Stephen Augustus and Celeste Horgan, to create a rubric that suggests alternative words and guides people through the process of making changes without causing systems to break. Several major tech companies, including IBM and Cisco, have signed on to follow the guidance.

Priyanka Sharma and several other tech workers in the Inclusive Naming Initiative came up with a rubric to suggest alternative words.

Although the Internet Engineering Task Force is moving more slowly, Mr. Eggert said it would eventually establish new guidelines. But the debate over the nature of racism — and whether the organization should weigh in on the matter — has continued on its mailing list.

In a subversion of an April Fools’ Day tradition within the group, several members submitted proposals mocking diversity efforts and the push to alter terminology in tech.

Two prank proposals were removed hours later because they were “racist and deeply disrespectful,” Mr. Eggert wrote in an email to task force participants, while a third remained up.

“We build consensus the hard way, so to speak, but in the end the consensus is usually stronger because people feel their opinions were reflected,” Mr. Eggert said. “I wish we could be faster, but on topics like this one that are controversial, it’s better to be slower.”

Kate Conger is a technology reporter in the San Francisco bureau, where she covers the gig economy and social media. @kateconger


Every month, a new software vulnerability seems to show up on social media, prompting open source program offices and security teams to query their inventories to see how the FOSS components they use may affect their organizations.

Frequently, this information is not available within an organization in a consistent format suitable for automated querying, and gathering it can require a significant amount of email and manual effort. By exchanging software metadata between organizations in a standardized software bill of materials (SBOM) format, automation within an organization becomes simpler, accelerating the discovery process and uncovering risk so that mitigations can be considered quickly.

In the last year, we’ve also seen standards like OpenChain (ISO/IEC 5230:2020) gain adoption in the supply chain. Customers have started asking their suppliers for a bill of materials as part of negotiation and contract discussions to conform to the standard. OpenChain focuses on ensuring that there is sufficient information for license compliance and, as a result, expects metadata for the distributed components as well. A software bill of materials can be used to support the systematic review and approval of each component’s license terms, clarifying the obligations and restrictions that apply to the distribution of the supplied software and reducing risk.

Kate Stewart, VP, Dependable Embedded Systems, The Linux Foundation, will host a complimentary mentorship webinar entitled Generating Software Bill Of Materials on Thursday, March 25 at 7:30 am PST. This session will work through the minimum elements included in a software bill of materials and explain why those elements are included. To register, please click here.

There are many ways this software metadata can be shared. The common SBOM document format options (SPDX, SWID, and CycloneDX) will be reviewed so that participants, especially those just getting started, can better understand what is available.
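For readers who have never looked inside one, a minimal SBOM in the SPDX tag-value format might look roughly like the sketch below. The document name, namespace, tool, package, and license values are purely illustrative placeholders, not output from any real tool:

    SPDXVersion: SPDX-2.2
    DataLicense: CC0-1.0
    SPDXID: SPDXRef-DOCUMENT
    DocumentName: example-app-1.0
    DocumentNamespace: https://example.com/spdxdocs/example-app-1.0
    Creator: Tool: example-sbom-generator
    Created: 2021-03-01T00:00:00Z

    PackageName: example-app
    SPDXID: SPDXRef-Package-example-app
    PackageVersion: 1.0.0
    PackageDownloadLocation: NOASSERTION
    FilesAnalyzed: false
    PackageLicenseConcluded: MIT
    PackageLicenseDeclared: MIT
    PackageCopyrightText: NOASSERTION

Even a record this small identifies the component and its license, which is the minimum an organization needs to begin automated vulnerability and license queries across its inventory.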

This mentorship session will work through some simple examples and then guide where to find the next level of details and further references. 

At the end of this session, participants will be on a secure footing and a path towards the automated generation of SBOMs as part of their build and release processes in the future. 

Jason Perlow, Director of Project Insights and Editorial Content at the Linux Foundation, had an opportunity to speak with Shuah Khan about her experiences as a woman in the technology industry. She discusses how mentorship can improve the overall diversity and makeup of open source projects, why software maintainers are important for the health of open source projects such as the Linux kernel, and how language inclusivity and codes of conduct can improve relationships and communication between software maintainers and individual contributors.

JP: So, Shuah, I know you wear many different hats at the Linux Foundation. What do you call yourself around here these days?

SK: <laughs> Well, I primarily call myself a Kernel Maintainer & Linux Fellow. In addition to that, I focus on two areas that are important to the continued health and sustainability of the open source projects in the Linux ecosystem. The first one is bringing more women into the Kernel community, and additionally, I am leading the mentorship program efforts overall at the Linux Foundation. And in that role, in addition to the Linux Kernel Mentorship, we are looking at how the Linux Foundation mentorship program is working overall, how it is scaling. I make sure the LFX Mentorship platform scales and serves diverse mentees and mentors’ needs in this role. 

The LF mentorship program includes several projects in the Linux kernel, LFN, Hyperledger, Open Mainframe, OpenHPC, and other technologies. The Linux Foundation’s Mentorship Programs are designed to help developers, many of whom are first-time open source contributors, gain the necessary skills to experiment, learn, and contribute effectively to open source communities.

The mentorship program has been successful in its mission to train new developers and make these talented pools of prospective employees, trained by experts, available to employers. Several graduated mentees have found jobs. New developers have improved the quality and security of various open source projects, including the Linux kernel. Several Linux kernel bugs were fixed, a new subsystem mentor was added, and a new driver maintainer is now part of the Linux kernel community. My sincere thanks to all our mentors for volunteering to share their expertise.

JP: How long have you been working on the Kernel?

SK: Since 2010 or 2011, when I got involved in the Android Mainlining project. My first patch removed the Android pmem driver.

JP: Wow! Is there any particular subsystem that you specialize in?

SK: I am a self-described generalist. I maintain the kernel self-test subsystem, the USB over IP driver, the usbip tool, and the cpupower tool. I contributed to the media subsystem, working on the Media Controller Device Allocator API to resolve shared device resource management problems across device drivers from different subsystems.

JP: Hey, I’ve actually used the USB over IP driver when I worked at Microsoft on Azure. And also, when I’ve used AWS and Google Compute. 

SK: It’s a small niche driver used in cloud computing. Docker and other containers use that driver heavily. That’s how they provide remote access to USB devices: devices on the server are exported so they can be imported and used by other systems.

JP: I initially used it for IoT kinds of stuff in the embedded systems space. Were you the original lead developer on it, or was it one of those things you fell into because nobody else was maintaining it?

SK: Well, twofold. I was looking at USB over IP because I like that technology. It just so happened the driver was being brought from the staging tree into the mainline kernel, and I volunteered at the time to maintain it. Over the last few years, we discovered some security issues with it, because it handles a lot of userspace data, so I had a lot of fun fixing all of those. <laugh>.

JP: What drew you into the Linux operating system, and what drew you into the kernel development community in the first place?

SK: Well, I have been doing kernel development for a very long time. I worked on the LynxOS RTOS a while back, and then HP-UX when I was working at HP, after which I transitioned into doing open source development — the OpenHPI project, to support HP’s rack server hardware — and that allowed me to work much more closely with Linux on the back end. And at some point, I decided I wanted to work with the kernel and become part of the Linux kernel community. I started as an independent contributor.

JP: Maybe it just displays my own ignorance, but you are the first female, hardcore Linux kernel developer I have ever met. I mean, I had met female core OS developers before — such as when I was at Microsoft and IBM — but not for Linux. Why do you suppose we lack women and diversity in general when participating in open source and the technology industry overall?

SK: So I’ll answer this question from my perspective, from what I have seen and experienced, over the years. You are right; you probably don’t come across that many hardcore women Kernel developers. I’ve been working professionally in this industry since the early 1990s, and on every project I have been involved with, I am usually the only woman sitting at the table. Some of it, I think, is culture and society. There are some roles that we are told are acceptable to women — even me, when I was thinking about going into engineering as a profession. Some of it has to do with where we are guided, as a natural path. 

There’s a natural resistance to choosing certain professions that you have to overcome first within yourself and externally. This process is different for everybody based on their personality and their origin story. And once you go through the hurdle of getting your engineering degree and figuring out which industry you want to work in, there is a level of establishing credibility in those work environments you have to endure and persevere. Sometimes when I would walk into a room, I felt like people were looking at me and thinking, “why is she here?” You aren’t accepted right away, and you have to overcome that as well. You have to go in there and say, “I am here because I want to be here, and therefore, I belong here.” You have to have that mindset. Society sends you signals that “this profession is not for me” — and you have to be aware of that and resist it. I consider myself an engineer that happens to be a woman as opposed to a woman engineer.

JP: Are you from India, originally?

SK: Yes.

JP: It’s funny; my wife really likes this Netflix show about matchmaking in India. Are you familiar with it?

SK: <laughs> Yes, I enjoyed the series, and also A Suitable Girl, a documentary film that follows three women as they navigate making decisions about their careers and family obligations.

JP: For many Americans, this is our first introduction to what home life is like for Indian people. But many of the women featured on this show are professionals, such as doctors, lawyers, and engineers. And they are very ambitious, but of course, the family tries to set them up in a marriage to find a husband for them that is compatible. As a result, you get to learn about the traditional values and roles they still want women to play there — while at the same time, many women are coming out of higher learning institutions in that country that are seeking technical careers. 

SK: India is a very fascinatingly complex place. But generally speaking, in a global sense, having an environment at home where your parents tell you that you may choose any profession you want to choose is very encouraging. I was extremely fortunate to have parents like that. They never said to me that there was a role or a mold that I needed to fit into. They have always told me, “do what you want to do.” Which is different; I don’t find that even here, in the US. Having that support system, beginning in the home to tell you, “you are open to whatever profession you want to choose,” is essential. That’s where a lot of the change has to come from. 

JP: Women in technical and STEM professions are becoming much more prominent in other countries, such as China, Japan, and Korea. For some reason, in the US, I tend to see more women enter the medical profession than hard technology — and it might be a level of effort and perceived reward thing. You can spend eight years becoming a medical doctor or eight years becoming a scientist or an engineer, and it can be equally difficult, but the compensation at the end may not be the same. It’s expensive to get an education, and it takes a long time and hard work, regardless of the professional discipline.

SK: I have also heard that women also like to enter professions where they can make a difference in the world — a human touch, if you will. So that may translate to them choosing careers where they can make a larger impact on people — and they may view careers in technology as not having those same attributes. Maybe when we think about attracting women to technology fields, we might have to promote technology aspects that make a difference. That may be changing now, such as the LF Public Health (LFPH) project we kicked off last year. And with LF AI & Data Foundation, we are also making a difference in people’s lives, such as detecting earthquakes or analyzing climate change. If we were to promote projects such as these, we might draw more women in.

JP: So clearly, one of the areas of technology where you can make a difference is in open source, as the LF is hosting some very high-concept and existential types of projects such as LF Energy, for example — I had no idea what was involved in it and what its goals were until I spoke to Shuli Goodman in-depth about it. With the mentorship program, I assume we need this to attract fresh talent — because as folks like us get older and retire, and they exit the field, we need new people to replace them. So I assume mentorship, for the Linux Foundation, is an investment in our own technologies, correct?

SK: Correct. Bringing new developers into the fold is the primary purpose, of course — and at the same time, I view the LF taking on mentorship as providing that neutral, level playing field across the industry for all open source projects. Secondly, we offer a self-service platform, LFX Mentorship, where anyone can come in and start their project. So when the COVID-19 pandemic began, we expanded this program to help displaced people — students, et cetera — and less visible projects. Not all projects typically get as much funding or attention as a Kubernetes or the Linux kernel, and several of those less visible projects are among the COVID mentorship program projects we are funding. I am particularly proud of supporting a climate change-related project, Using Machine Learning to Predict Deforestation.

The self-service approach allows us to fund and add new developers to projects where they are needed. The LF mentorships are remote work opportunities that are accessible to developers around the globe. We see people sign up for mentorship projects from places we haven’t seen before, such as Africa, and so on, thus creating a level playing field. 

The other thing that we are trying to increase focus on is how do you get maintainers? Getting new developers is a starting point, but how do we get them to continue working on the projects they are mentored on? As you said, someday, you and I and others working on these things are going to retire, maybe five or ten years from now. This is a harder problem to solve than training and adding new developers to the project itself.

JP: And that is core to our software supply chain security mission. It’s one thing to have this new, flashy project, and then all these developers say, “oh wow, this is cool, I want to join that,” but then, you have to have a certain number of people maintaining it for it to have long-term viability. As we learned in our FOSS study with Harvard, there are components in the Linux operating system that are like this. Perhaps even modules within the kernel itself, where I assume you might have only one or two people actively maintaining them for many years. And what happens if that person dies or can no longer work? What happens to that code? And if someone isn’t familiar with that code, it might become abandoned. That’s a serious problem in open source right now, isn’t it?

SK: Right. We have seen that with SSH and other security-critical areas. What if you don’t have the bandwidth to fix it? Or the money to fix it? I ended up volunteering to maintain a tool for a similar reason when the maintainer could no longer contribute regularly. It is true; we have many drivers where maintainer bandwidth is an issue in the kernel. So the question is, how do we grow that talent pool?

JP: Do we need a job board or something? We need X number of maintainers. So should we say, “Hey, we know you want to join the kernel project as a contributor, and we have other people working on this thing, but we really need your help working on something else, and if you do a good job, we know tons of companies willing to hire developers just like you?” 

SK: With the kernel, we are talking about organic growth; it is just like any other open source project. It’s not a traditional hire and talent placement scenario. Organically, they have to have credibility, and they have to acquire it through experience and relationships with people on those projects. We just talked about this at the previous Linux Plumbers Conference: we do have areas where we really need maintainers, and the MAINTAINERS file does show the areas where help is needed.

To answer your question, it’s not one of those things where we can seek people to fill that role, like LinkedIn or one of the other job sites. It has to be an organic fulfillment of that role, so the mentorship program is essential in creating those relationships. It is the double-edged sword of open source; it is both the strength and weakness. People need to have an interest in becoming a maintainer and also a commitment to being one, long term.

JP: So, what do you see as the future of your mentorship and diversity efforts at the Linux Foundation? What are you particularly excited about that is forthcoming that you are working on?

SK: I view the Linux Foundation mentoring as a three-pronged approach to provide unstructured webinars, training courses, and structured mentoring programs. All of these efforts combine to advance a diverse, healthy, and vibrant open source community. So over the past several months, we have been morphing our speed mentorship style format into an expanded webinar format — the LF Live Mentorship series. This will have the function of growing our next level of expertise. As a complement to our traditional mentorship programs, these are webinars and courses that are an hour and a half long that we hold a few times a month that tackle specific technical areas in software development. So it might cover how to write great commit logs, for example, for your patches to be accepted, or how to find bugs in C code. Commit logs are one of those things that are important to code maintenance, so promoting good documentation is a beneficial thing. Webinars provide a way for experts short on time to share their knowledge with a few hours of time commitment and offer a self-paced learning opportunity to new developers.

Additionally, I have started the Linux Kernel Mentorship forum for developers and their mentors to connect and interact with others participating in the Linux Kernel Mentorship program, and for graduated mentees to mentor new developers. We kicked off the Linux Kernel Mentorship for Spring 2021 and are planning for Summer and Fall.

A big challenge is that we are short on mentors to scale the structured program. Solving the problem requires help from LF member companies and others in encouraging their employees to mentor; as they say, “it takes a village.”

JP: So this webinar series and the expanded mentorship program will help developers cultivate both hard and soft skills, then.

SK: Correct. The thing about doing webinars is that, if we are talking about this from a diversity perspective, some developers might not have time for a full-length mentorship, typically a three-month or six-month commitment. This might help them expand their resources for self-study. When we ask developers for feedback about what else they need to learn new skill sets, we hear that they don’t have the resources or the time to do self-study and learn to become open source developers and software maintainers. This webinar series covers general open source software topics such as the Linux kernel and legal issues. It could also cover topics specific to other LF projects such as CNCF, Hyperledger, LF Networking, etc.

JP: Anything else we should know about the mentorship program in 2021?

SK: In my view,  attracting diversity and new people is two-fold. One of the things we are working on is inclusive language. Now, we’re not talking about curbing harsh words, although that is a component of what we are looking at. The English you and I use in North America isn’t the same English used elsewhere. As an example, when we use North American-centric terms in our email communications, such as when a maintainer is communicating on a list with people from South Korea, something like “where the rubber meets the road” may not make sense to them at all. So we have to be aware of that.

JP: I know that you are serving on the Linux kernel Code of Conduct Committee and actively developing the handbook. When I first joined the Linux Foundation, I learned what the Community Managers do and our governance model. I didn’t realize that we even needed to have codes of conduct for open source projects. I have been covering open source for 25 years, but I come out of the corporate world, such as IBM and Microsoft. Codes of Conduct are typically things that the Human Resources officer shows you during your initial onboarding, as part of reviewing your employee manual. You are expected to follow those rules as a condition of employment. 

So why do we need Codes of Conduct in an open source project? Is it because these are people who are coming from all sorts of different backgrounds, companies, and ways of life, and may not have interacted in this form of organized and distributed project before? Or is it about personalities, people interacting with each other over long distance, and email, which creates situations that may arise due to that separation?

SK: Yes, I come out of the corporate world as well, and of course, we had to practice those codes of conduct in that setting. But conduct situations arise that you have to deal with in the corporate world. There are always interpersonal scenarios that can be difficult or challenging to work with — the corporate world isn’t better than the open source world in that respect. It is just that all of that happens behind a closed setting.

But there is no accountability in the open source world because everyone participates out of their own free will. So on a small, traditional closed project, inside the corporate world, where you might have 20 people involved, you might get one or two people that could be difficult to work with. The same thing happens and is multiplied many times in the open source community, where you have hundreds of thousands of developers working across many different open source projects. 

The biggest problem with these types of projects when you encounter situations such as this is dealing with participation in public forums. In the corporate world, this can be addressed in private. But on a public mailing list, if you are being put down or talked down to, it can be extremely humiliating. 

These interactions are not always extreme cases; they could be simple as a maintainer or a lead developer providing negative feedback — so how do you give it? It has to be done constructively. And that is true for all of us.

JP: Anything else?

SK: In addition to bringing our learnings and applying them to the kernel project, I am also doing this on the ELISA project, where I chair the Technical Steering Committee and am bridging communication between experts from the kernel and safety communities, to make sure we can use the kernel in the best ways in safety-critical applications, in the automotive and medical industries, and so on. Many lessons can be learned in terms of connecting the dots, defining clearly what is essential to make Linux run effectively in these environments in terms of dependability. How can we think more proactively instead of being engaged in fire-fighting in terms of security or kernel bugs? As a result of this, I am also working on any kernel changes needed to support these safety-critical usage scenarios.

JP: Before we go, what are you passionate about besides all this software stuff? If you have any free time left, what else do you enjoy doing?

SK: I read a lot. COVID quarantine has given me plenty of opportunities to read. I like to go hiking, snowshoeing, and other outdoor activities. Living in Colorado gives me ample opportunities to be in nature. I also like backpacking — while I wasn’t able to do it last year because of COVID — I like to take backpacking trips with my son. I also love to go to conferences and travel, so I am looking forward to doing that again as soon as we are able.

Talking about backpacking reminded me of the two-day, 22-mile backpacking trip during the summer of 2019 with my son. You can see me in the picture above at the end of the road, carrying a bearbox, sleeping bag, and hammock. It was worth injuring my foot and hurting in places I didn’t even know I had.

JP: Awesome. I enjoyed talking to you today. So happy I finally got to meet you virtually.

Open Source Summit Europe, October 26, 2020 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the Software Developer Diversity and Inclusion (SDDI) project. SDDI will explore, evaluate, and promote best practices from research and industry to increase diversity and inclusion in software engineering. Founding contributors include Comcast, Facebook, GitHub, Intel and VMware and research professors from Beijing University of Posts and Telecommunications, Eindhoven University of Technology, Oregon State University, University of Auckland and University of Victoria.

According to Stack Overflow’s 2020 survey of more than 65,000 developers, 91.7 percent identify as male and 70.7 percent as white or of European descent. There is a tremendous amount of work to be done to create inclusive environments that can lead to a more diverse community building the software that is the foundation for our digital society. Research indicates that racially diverse groups make better decisions, that diverse open source projects are more productive, and that working on gender-diverse teams improves attitudes towards women.[1]

“While there are a variety of important diversity and inclusion initiatives in the technology industry, none are focused on increasing diversity across categories – race, gender, age and cognitive ability –  in software engineering and informed by science and research,” said Kate Stewart, senior director of strategic programs at Linux Foundation. “We have optimism about the future of the open source community and our collective ability to increase diversity and inclusion. The work we do today can influence the vibrancy of the community and effectiveness of our technologies tomorrow.”

SDDI will include a steering committee and working groups that explore, evaluate and promote best practices from research and industry to increase diversity and inclusion in software engineering. The steering committee will be responsible for prioritizing the initial working groups, which could address research methods, ethics, resources and data, as well as diversity in the areas of gender, age, cognitive ability and education.

Open source projects are encouraged to participate in SDDI to inform best practices and to benefit from the findings of the Project. Existing Linux Foundation projects – TODO, which focuses on open source program office best practices, and the CHAOSS Project, which identifies tooling and metrics for diversity and inclusion – will also work closely with the new SDDI Project.

Supporting Comments

“The Software Developer Diversity and Inclusion Project (SDDI) is an excellent initiative that complements the work of the CHAOSS Project. Through collaboration, we can accelerate progress towards building a better virtual workplace for all developers,” said Nicole Huesman, Governing Board Co-Chair, the CHAOSS Project. “We’re looking forward to the research and best practices that surface from this work, so we can implement it in our work on metrics and tooling.”

“Diversity and inclusion are the cornerstone of building long term sustainable open source communities and programs,” said Chris Aniszczyk, co-founder of the TODO Group and CTO, CNCF. “The TODO Group looks forward to collaborating with the SDDI to share lessons and best practices from corporate open source programs.”

“Inclusive Open Source is of vital importance to industry and academia. The Software Developer Diversity and Inclusion (SDDI) project is a great initiative to bring inclusivity to OSS projects and products. For example, gender biases are embedded in the very tools that OSS projects use and the way information is structured. I look forward to working with SDDI to bring down these barriers, one feature at a time,” said Dr. Anita Sarma, Associate Professor, Computer Science, School of EECS, Oregon State University.

“Software systems are responsible for all aspects of modern life. They help humans make critical short-term and long-term societal and personal decisions, and yet the diversity and values of the people designing software systems do not remotely represent the diversity and values of people on our planet. The SDDI initiative, an active collaboration between industry and academia, will drive essential and rigorous research towards understanding barriers to diversity and inclusion while also discovering and promoting best practices,” said Margaret-Anne Storey, University of Victoria, Canada.

“Despite significant efforts over recent years to increase diversity and inclusion in many software companies, little traction has been made. This signals that new ways of thinking are needed to better understand the barriers and best practices. This initiative can help to stimulate new understanding and develop improved diversity and inclusion practices, which will lead to more innovative and useful software products,” said Kelly Blincoe, University of Auckland, New Zealand.

“Diversity is essential not only to create products that address the needs of diverse groups of users but also to create sustainable and vibrant development teams. SDDI has the power and the promise to combine best industrial practices, insights from open source software development, and findings of academic research to bring change in the ways teams are organised and work together, and ultimately to result both in a more comfortable and sustainable working environment and in better software products,” said Alexander Serebrenik, Eindhoven University of Technology, The Netherlands.

“Diversity and inclusion in software development have broad impact beyond our industry, particularly for those who are living in low and medium HDI countries. For them, being included in the software development profession is often a life-changing opportunity. I believe SDDI, a strong collaboration between academia and industry, would benefit the disadvantaged groups around the world,” said Yi Wang, Professor, Beijing University of Posts and Telecommunications.

“Diversity of thought is a vital component for building sustainable and healthy open source communities. Individuals from diverse backgrounds injecting new and innovative ideas advances an inclusive and welcoming ecosystem for all. SDDI with its focus on best practices in increasing D&I will be instrumental in providing the right direction for all committed to increasing diversity,” said Shuah Khan, Kernel Maintainer & Fellow, the Linux Foundation.

“Without an intentional and coordinated effort like the SDDI, it will be hard to move the needle on more diversity in software engineering.   There are many great practices across open source, companies and universities that we need to aggregate, make easier to discover and put into action.  The Linux Foundation is at the center of all of these communities and can get us together to improve the state of diversity in tech,” said Nithya Ruff, Head of Comcast Open Source Program Office, Chair, Linux Foundation Board.

“At Intel, we believe diverse and inclusive teams are more creative and innovative. We continue to raise the bar in areas such as representation, pay equity, and inclusion initiatives. This year, we announced our 2030 goals, global challenges and RISE strategy to create a more responsible, inclusive, and sustainable world, enabled through technology and our collective actions. We welcome the Linux Foundation’s new SDDI initiative to focus on improving inclusion and representation in the Open Source community and look forward to furthering this effort,” said Melissa Evers-Hood, Vice President, General Manager of Software Business Strategy, Intel Architecture, Graphics and Software, Intel Corporation.

“Open source lifts all boats — creating innovation and opportunity for developers around the world. For Facebook, investing in open source is a way to empower developers as well as broader communities of individuals and businesses. To that end, we’re thrilled to support Linux Foundation’s SDDI effort which will not only help us invest in the next generation of open source developers but also promote increasing diversity in tech,” said Kathy Kam, Head of Open Source, Facebook.

“As home to most of the world’s open source software, GitHub believes deeply in the potential of a passionate, diverse open source community to move our world forward and accelerate human progress. GitHub is thrilled to collaborate on this project, which will allow us to “open source diversity and inclusion” for the benefit of us all. By making software development more accessible, inclusive, and sustainable, we can support the growth of a community where all developers — no matter who or where they are in the world — can learn, contribute, grow, and feel like they belong,” said Demetris Cheatham, Senior Director of Diversity, Inclusion and Belonging, GitHub.

“Innovation is a core tenet of VMware. We know that to make faster progress around Diversity and Inclusion we need to apply innovation and research the same way we do to technology problems. Supporting initiatives like this aligns with our values and is critical to the long term success of the technology industry as a whole,” said Shanis Windland, vice president, Diversity and Inclusion, VMware.

“SDDI will be an important initiative,” said Daniel Izquierdo, cofounder of Bitergia. “We at Bitergia do D&I research for customers and we look forward to sharing our experience and learning from others through SDDI.”

For more information about SDDI and to contribute, please visit: https://sddiproject.org/

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,500 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact
Jennifer Cloer
503-867-2304
pr@linuxfoundation.org

[1] Sommers, Samuel R. “On racial diversity and group decision making: identifying multiple effects of racial composition on jury deliberations.” Journal of personality and social psychology 90.4 (2006): 597. Vasilescu, Bogdan, et al. “Gender and tenure diversity in GitHub teams.” Proceedings of the 33rd annual ACM conference on human factors in computing systems. 2015. Wang, Oliver and Zhang, Min. “Reducing Implicit Gender Biases in Software Development: Does Intergroup Contact Theory Work?” Proceedings of Foundations of Software Engineering. 2020.

Accelerating Open Standards development with Community Specifications

Introduction

In an earlier post back in May, the Linux Foundation and the Joint Development Foundation (JDF) announced that JDF had been recognized as an ISO/IEC JTC 1 PAS submitter, giving it the ability to propose international standards, and that it had submitted its first standard, OpenChain, for international review. We also discussed why Open Standards are essential to the Linux Foundation’s efforts, just as Open Source projects are.

Today, we’re announcing a new way for communities to create Open Standards. We call it the Community Specification, and it allows communities to develop standards and specifications using the tools and approaches that are inspired and proven by open source developers. It’s standards development explicitly designed for Git-based workflows. The Community Specification brings the frictionless approach of open source collaborations to standards development.

It’s flexible, enabling small and large standards collaborations. And it’s built for growth. When or if the time is right, Community Specification projects can move to the Joint Development Foundation or another standards body. From there, the Joint Development Foundation can provide a path to international standardization.

Standards play a role in everyone’s life. Think about the things you touch every day, as simple as a power plug, the USB connector on your phone or laptop, or the WiFi that you use in your business and your home to connect your mobile devices wirelessly. All of these devices need to be able to interoperate with each other. 

Open Standards are best defined as specifications made available to the public, developed, and maintained via an inclusive, collaborative, transparent, and consensus-driven process. Open standards facilitate interoperability and data exchange among different products or services and are intended for widespread adoption.

Setting up a well-formed standards project is important. Items like due process, balance, inclusiveness, and intellectual property clarity are vital to developing technology that meets the needs of the broader community and can be implemented without intellectual property surprises.

The Community Specification builds on these best practices and brings them to the Git repository development environments that developers are already using. And it makes it easy to get started. You can start using the Community Specification by bringing its terms into your repository and getting to work — just like starting an open source project. 

Lowering the costs and reducing the level of effort of creating specifications

Starting a new standards effort is traditionally a time-consuming and expensive undertaking. It takes time, money, and effort — from negotiating multi-party agreements to dealing with the legal and corporate formalities to obtaining professional support.

The Joint Development Foundation created a much-streamlined alternative to setting up a traditional standards-setting activity. We created a standardized set of formation documents and procedures that allow the collaborators to choose from a predefined set of licensing terms. 

JDF took this expensive multi-month process and replaced it with a “check-the-box” approach that has already enabled over 13 communities like Open Manufacturing Platform, GraphQL, and Trust Over IP to get up and running quickly, allowing these communities to create technologies with worldwide impact.

For these projects, the JDF shortened the process of creating a new standards project from many months to as little as a few days and removed much of the ongoing legal overhead of creating a new non-profit company to host the project.

And while JDF has streamlined the creation of new standards organizations by providing a “standards organization in a box,” sometimes an even lighter-weight approach is desired. Today, the JDF is pleased to announce its latest innovation, the Community Specification.  

The Community Specification is the next step in reducing the friction of standards development.  By incorporating the Community Specification materials into a Git-based repository, communities can now start a standards development effort as quickly as an open source project, using proven standards-based best practices for governance and intellectual property. And it’s free. The Community Specification provides a “standards-organization-in-a-repo.” All you have to do is clone or copy the Community Specifications repository, fill in a few details, and get started.
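As a rough sketch of what getting started looks like in practice (the repository URL and file names below are illustrative placeholders rather than the official ones), the workflow is essentially the same as bootstrapping any open source project:

    # Copy the Community Specification template into a repository of your own
    git clone https://github.com/CommunitySpecification/1.0.git my-specification
    cd my-specification
    # Fill in the project name, scope, and licensing details in the template
    # documents, commit the changes, and begin drafting the specification,
    # accepting contributions through ordinary pull or merge requests.

From there, the normal Git review tools handle contributions, history, and versioning, just as they would for source code.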

JDF takes its inspiration from the developer community. We know the ultimate consumer of a specification is the implementer, and implementers are by and large developers. So it is no accident that the Community Specification relies on Git-based repositories like GitHub and GitLab as its platform for creating new standards. 

The tools that are natively available for managing contributions in a Git-based repository via an open and inclusive process are based on best practices from standards and open source development models. To make this process attractive to developers, we have adopted a single set of agreements for technical contributions, source code, governance, code of conduct, patents, and copyright. 

The Community Specification will allow communities to employ a fast and easy way to start a specification development process using software development-style tools and workflows that they already know. 

Conclusion

The new Community Specification process allows contributors to start a specification collaboration with a simple set of licenses and procedures at no cost. The Community Specification is efficient and runs using tools and approaches that lower the administrative burden on the organizers and ensure contribution integrity. The project can run as a repository-based collaboration or as a legal entity under JDF, depending on the project’s needs.

From this starting point, the collaboration can move seamlessly into a more structured JDF project that allows it to scale up support services for broader member participation, collection of membership dues, test events, and marketing services. As part of the Joint Development ecosystem, projects may also enjoy the benefits of being part of the world’s largest developer ecosystem at the Linux Foundation.

In the ultimate expression of a standard’s success, the project may apply to submit the specification to ISO/IEC JTC 1 through the JDF PAS submitter program, which allows the specification to reach national standards bodies worldwide.

The Community Specification can dramatically reduce the time developers spend on building and meeting spec requirements and ensure important work is not lost and time is not wasted. By democratizing the specification build process, developers have more time to innovate and build the technologies that differentiate their work from others. 

We invite interested projects and people with great ideas to benefit from an organized collaboration platform to reach out to the Joint Development Foundation. 

 

Linux Foundation & Harvard Announce Free/Libre and Open Source Software (FOSS) Contributor Survey

“Open source software is everywhere. Now, more than ever, we need to get a better understanding of it to help make it even more secure.” – David A. Wheeler, Director of Open Source Supply Chain Security, Linux Foundation

In 2020, given the wide proliferation of Free/Libre and Open Source Software (FOSS), we aim to identify how to improve the security and sustainability of the FOSS ecosystem, especially the FOSS systems heavily relied upon by organizations worldwide.

To do this, the Linux Foundation’s Core Infrastructure Initiative (CII) and the Laboratory for Innovation Science at Harvard (LISH) have developed a survey for contributors to FOSS. If you contribute to FOSS, we would love for you to participate in our study. This voluntary survey takes around 15-20 minutes to complete and allows you to advocate for the FOSS projects you care about. 

Please participate now; we intend to close the survey in early August. In appreciation of your participation, we would like to offer participants the option to have their names included in the overall results. If you opt to be attributed in the final report, you will still have the opportunity to keep your detailed survey responses confidential.

The CII takes a collaborative, pre-emptive approach to strengthening cybersecurity by improving open source software security. We aim to support, protect, and fortify open software, especially software critical to the global information infrastructure. We take a holistic view of security; we include security risks in critical projects that are inadequately sustained or vulnerable to supply chain attacks. We intend to use this survey information to help guide this approach.

To take the FOSS Contributor Survey, click the button below:

 

Why CII best practices gold badges are important

“A CII Best Practices badge, especially a gold badge, shows that an OSS project has implemented a large number of good practices to keep the project sustainable, counter vulnerabilities from entering their software, and address vulnerabilities when found.” – David A. Wheeler, Director of Open Source Supply Chain Security

Open source software (OSS) is now widely used by many organizations. But with that popularity, the security of OSS is more important than ever. The CII Best Practices badge project — including its top-ranked “gold” badge — helps improve that security.

In June 2020, two different projects managed to earn a gold badge: the Linux kernel and curl. Both are widely depended on, and yet in many other ways, they are radically different. The Linux kernel has a large number of developers, and as a kernel, it must directly interact with a variety of hardware. Curl has a far smaller set of developers and is a user-level application. They join other projects with gold badges, including the Zephyr kernel and the CII Best Practices badge application itself. Such radically different projects managed to earn a gold badge and thus demonstrated their commitment to security. It also shows that these criteria can be applied even to such fundamentally different programs.

But what are these badges? A Linux Foundation (LF) Core Infrastructure Initiative (CII) Best Practices badge is a way for Open Source Software (OSS) projects to show that they follow best practices. The badges let others quickly assess which projects are following best practices and are more likely to produce higher-quality secure software. It also helps OSS projects find areas where they can improve. Over 3,000 projects participate in the badging project, a number that grows daily.

There are three badge levels: passing, silver, and gold. Each level requires that the OSS project meet a set of criteria; for silver and gold that includes meeting the previous level. Each level requires effort from an OSS project, but the result is reduced risks from vulnerabilities for both projects and the organizations that use that project’s software.

The “passing” level captures what well-run OSS projects typically already do, and has 66 criteria grouped into six categories. For example, the passing level requires that the project publicly state how to report vulnerabilities to the project, that tests are added as functionality is added, and that static analysis is used to analyze software for potential problems. Getting a “passing” badge is an achievement, because while any particular criterion is met by many projects, meeting all the requirements often requires some improvements to any specific project. As of June 14, 2020, there were 3195 participating projects, and 443 had earned a passing badge.

The silver and gold level badges are intentionally more demanding. The silver badge is designed to be harder but possible for one-person projects. Here are examples of silver badge requirements (in addition to the passing requirements):

  • The project MUST have FLOSS automated test suite(s) that provide at least 80% statement coverage if there is at least one FLOSS tool that can measure this criterion in the selected language.
  • The project results MUST check all inputs from potentially untrusted sources to ensure they are valid (a whitelist) and reject invalid inputs if there are any restrictions on the data.
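
To make the test-coverage criterion above concrete, here is a minimal sketch, assuming a Python project measured with coverage.py (one FLOSS tool that can measure statement coverage); the package name "mypkg" and the "tests" directory are hypothetical examples, and projects in other languages would use their own coverage tools.

    # Minimal sketch: measure statement coverage with coverage.py while the
    # project's automated test suite runs, then compare against the 80% target.
    # "mypkg" and the "tests" directory are hypothetical example names.
    import coverage
    import unittest

    cov = coverage.Coverage(source=["mypkg"])  # limit measurement to the project's code
    cov.start()

    # Run the automated test suite while coverage is being recorded.
    suite = unittest.defaultTestLoader.discover("tests")
    unittest.TextTestRunner().run(suite)

    cov.stop()
    cov.save()

    # report() returns the total statement coverage as a percentage.
    total = cov.report()
    if total < 80.0:
        raise SystemExit(f"Statement coverage {total:.1f}% is below the 80% target")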

The gold badge adds additional requirements. Here are examples of gold badge requirements (in addition to the silver requirements):

  • The project MUST have a “bus factor” of 2 or more (a “bus factor” is the minimum number of project members that have to suddenly disappear from a project before the project stalls due to lack of knowledgeable or competent personnel).
  • The project MUST have at least 50% of all proposed modifications reviewed before release by a person other than the author.
  • The project MUST have a reproducible build. 
  • The project website, repository (if accessible via the web), and download site (if separate) MUST include key hardening headers with nonpermissive values.
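
As one illustration of the hardening-headers criterion, here is a minimal sketch assuming a project website served by Flask; the specific header values shown are illustrative nonpermissive choices, not the badge requirements themselves, and many projects set these headers in their web server configuration instead.

    # Minimal sketch, assuming a Flask-based project website: attach key HTTP
    # hardening headers with nonpermissive values to every response. The exact
    # values below are illustrative examples only.
    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def add_hardening_headers(response):
        # Only allow resources to load from this site itself.
        response.headers["Content-Security-Policy"] = "default-src 'self'"
        # Prevent MIME-type sniffing.
        response.headers["X-Content-Type-Options"] = "nosniff"
        # Disallow framing to reduce clickjacking risk.
        response.headers["X-Frame-Options"] = "DENY"
        return response

    @app.route("/")
    def index():
        return "Project home page"

    if __name__ == "__main__":
        app.run()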

Historically, the LF has focused on getting projects to the passing level, because projects not even at that level pose a higher risk. But many projects are widely depended on or are especially important for security, and we love to see them earning higher-level badges.

Of course, a gold badge doesn’t mean that there are no vulnerabilities in the existing code, or that the project’s development processes cannot be improved further. Perfection is rare in this life. But a CII Best Practices badge, especially a gold badge, shows that an OSS project has implemented a large number of good practices to keep the project sustainable, counter vulnerabilities from entering their software, and address vulnerabilities when found. Projects take many such steps to earn a gold badge, and it’s a good thing to see.

We hope other projects will be inspired to pursue — and earn — a gold badge. Of course, the real goal isn’t a badge — the real goal is to make our software much more secure. But good practices can help make our software more secure, and we want to praise and encourage projects to have good practices.

For more background information on the best practices badge, see the presentation “Core Infrastructure Initiative (CII) Best Practices Badge in 2019”.

OSS projects can go to the CII Best Practices badge website to begin the process of earning a badge. If you’re considering the use of some OSS, we encourage you to check that website to see which projects have earned a badge.

Those who wish to learn more are welcome to contact David A. Wheeler, Director of Open Source Supply Chain Security at The Linux Foundation, at dwheeler AT linuxfoundation DOT org.

Building a successful open source community

Why do you need program management as part of your open source project? We asked a few of the Linux Foundation’s program managers to tell us how they each approach the task.

How do coordination and facilitation help improve my project?

We tend to think of the primary goals of the Linux Foundation’s projects as producing open software, open hardware, open standards, or open data artifacts — the domain of participating programmers & engineers, system architects, and other technical contributors. 

However, successful projects that engage a broader ecosystem of commercial organizations, particularly when raising funds, benefit from active leadership beyond pure technical contributions. Contributors often have work outside the project that puts demands on their time. It takes real time to build and coordinate a commercial ecosystem, ensure stakeholders are engaged, recruit and onboard members, create a neutral governance culture (often among competing companies), and keep the various aspects of the ecosystem aligned, such as when end users begin to participate.

Many Linux Foundation projects fundraise to provide resources for their community. When the business ecosystem comes together to invest, it helps the technical community obtain the resources needed to build a thriving community and ecosystem. A typical fundraising model in our communities is an annual membership structure that provides a yearly fund for the project.

The Linux Foundation’s approach to governance separates decisions about funds and business affairs from the technical project’s governance. The companies contributing money to a project’s fund can decide how those funds are spent and any related business decisions. The technical community can operate independently with open source best practices and continue to make decisions about what code to accept, how to build releases, etc. based on the technical merit of decisions in front of them and not based on what companies contributed funding.

We will always have representation from the technical community involved in the budget and business decisions to ensure funding decisions are well informed. This is how the Linux Foundation model preserves the development best practices of open source while enabling a community to benefit from the commercial ecosystem dependent on their work.

Guidance for your community

Within a technical project, there are roles for organizing how releases are built. Often some committers decide which code is accepted, and maintainers decide what to put into a release.  When scaling the project to create an ecosystem around it, there are other key roles and responsibilities that a project needs to stay on track and to continue to scale. These functions include:

    • Planning and Building. Building a cohesive strategy is critical to the success of a project, and it requires investment in the outcomes that the core stakeholders want to see happen and choose to prioritize.
    • Measuring KPIs. Tracking a project’s mission, goals, and objectives while moving them through the swim lanes is key to iterating on things that work and addressing things that don’t.
    • Facilitating. To facilitate successfully, a coordinator must understand the landscape and remain neutral, not weighing in unless asked. This is often the most challenging part of the job.
    • Advising. Coordinators act as an experienced sounding board for the community. To mature an organization, you must craft mechanisms for self-governance and sustainability.
    • Iterating and Reflecting. Stakeholders in the community want to get things done, but when that happens without reflection, you lose sight of what you are doing and where you are going. It’s essential to see both the forest and the trees, especially from an above-the-canopy view.

In the past, a few of our communities have had respected, neutral leaders who provided these roles. The Xen Project is one example, where a member of the community has performed this role for many years. Making this work requires a significant time investment from the community’s leadership, and it is an excellent benefit for a community to have someone able and willing to spend their work time on this function.

Many other projects are not able to find someone in the community to fill this role. This is often where the Linux Foundation builds a support program, providing neutral coordination and facilitation professionals to the projects we host that need this help. We call the people who provide this support Program Managers (PMs). PMs are often the first point of contact for community participants and potential members, and they are usually involved in the following activities:

    • Program Managers help the governing and technical boards shape the project’s direction and goals.
    • Program Managers work with a project’s technical leadership to understand their technical goals.
    • They work with the members to fill positions such as Chair and Treasurer and are involved with the voting process.
    • They ensure that both the governing and technical boards act within the agreed-upon guidelines of the project’s charter.
    • They help onboard new members into the project community.
    • They engage resources from the Foundation’s Marketing, PR, Events, and Training teams to coordinate the support programs delivered for a project.
    • Program Managers also oversee the delivery of other support programs provided by the Foundation and any services provided by vendors or contractors.
    • Program Managers pull in the Foundation’s IT services team for a consultative discussion on the right development infrastructure, tools, and managed IT support programs based on the project community’s needs and roadmap.
    • Program Managers actively engage in community management and help the project’s leaders coordinate meetups, developer hackfests, and participation at events.

Setting strategic goals for your community

Identifying and articulating a project’s mission is as essential for an open source project as it is for any business activity. Setting concrete goals enables the participants in a project to discuss and align around a single narrative that can guide their activities and inform decisions.

Program Managers work with the project’s membership and technical leadership to define a strategy with goals, milestones, and metrics for the project. They coordinate discussions to assist the governing board in coming to a consensus on a budget that supports the technical community’s needs and aligns with the project strategy. 

For open source projects, the goals very often include maximizing the project’s footprint in order to help the most people. Goals are often articulated at a fine-grained level: enabling contributors to engage more easily, growing the membership from a particular sector of the ecosystem, or increasing contributions from end users.

The CHAOSS project is a community focused on defining community metrics around engagement, risk, and related areas, which are often helpful to project leaders in setting goals for measurably improving their ecosystem.

Implementing a project lifecycle for your community

Open source projects often have subprojects and various efforts to innovate on new ideas that may not be ready to be included in an official release or to stand as an independent release of their own. We often refer to these communities as using an “umbrella” model, with several coordinated sub-projects within the community. Within an umbrella community, the projects typically follow a lifecycle. The lifecycle generally follows a path from idea to planning, initial execution, expansion, and eventually maintenance and retirement.

Program managers often work with the technical leadership to codify this lifecycle according to milestones so that participants in the project can immediately understand where a project stands in terms of maturity and resources. CNCF, for example, has project phases that include Sandbox, Incubation, and Graduation. OpenJS Foundation has project phases that include Incubation, At-Large, Growth, Impact, and Emeritus, which map to the needs of their community.

A project lifecycle is an essential tool for a foundation to signal the maturity of multiple projects and to identify for the community what the path towards a fully mature project requires. It is both a pathway and a signal, acknowledging that projects grow and change and capturing what the community thinks a project should rely on to guide itself.

In most projects, there is an entry level, a mid level, and a graduate level. Entry-level projects indicate a promising start for an emerging project worth considering. Mid-level projects show growth and development for an audience that might consider using the project, and graduated projects indicate full maturity and a project that many in the ecosystem rely upon.

“Within the Cloud Native Computing Foundation, the various project stages have been beneficial for encouraging projects to grow, not only from a development standpoint but from a community standpoint. A project looking to graduate has to demonstrate both a strong codebase and a strong community.”

Amye Scavarda Perrin, CNCF Program Manager

Linux Foundation Networking (LFN) Program Manager Trishan de Lanerolle notes how the Technical Advisory Council plays an active role in a project’s lifecycle management:

“Linux Foundation Networking project (LFN) technical leadership (Technical Advisory Council) developed and published a model that lays out criteria and checkpoints for projects in various stages of maturity, including an LFN Entry review and evaluation for new candidate projects to the LFN umbrella. The entry process provides a mechanism to amicably and fairly assess upcoming projects. In LFN, that entails asking whether a proposed project: falls within the LFN scope, provides a snapshot into the status or health of the community, and ensures the project’s documented governance is clear, complete, and easily accessible.”

Through facilitating the work of the Strategy Subcommittee, whose primary goal is to assist the Governing Board with developing and implementing Continuous Delivery Foundation (CDF) strategic planning, Program Manager Dan Lopez was able to guide CDF toward sustainable, long-lasting strategic goals. 

“The immense value of a Program Manager lies in their ability to foster a space for progress to happen. It’s not their role to necessarily make the tough decisions, but rather be the ‘glue’ of a program, ask the tough questions, and spark inspiration and critical thinking within their stakeholder group to create, in this case, sustainable goals that will create long term value for the CDF.”

Dan was able to approach strategic planning as a neutral party who understood the landscape of the CDF, and to assist the Governing Board in creating well-aligned goals mapped to key performance indicators that can be measured and managed over time.

The importance of open governance in your community

The Program Manager is also a vital member of the leadership team, working collaboratively to facilitate and operationalize the wants, needs, and priorities of the governing bodies. Each Linux Foundation Program Manager works with each project community to establish a transparent, open governance model for the technical community.

In open governance, a project is managed by a group of people representing the stakeholders in a project — generally project members and leaders of the project’s technical efforts. The idea of conducting a major technical effort using an open form of governance, in which all stakeholders’ needs must be addressed and people are required to cooperate to get work done, is founded on the basic concept of democracy. It differs from closed or proprietary governance in the transparency and coordination required to reach consensus.

Open governance provides a balance that can never be found in a proprietary, restrictive environment — the dynamics of that activity drive creativity and innovation, and significantly increase the speed of development. Program managers and community managers often guide these processes and help keep governance bodies on track with each other.

DPDK’s Program Manager Trishan de Lanerolle discusses how his project is divided into two bodies of equal responsibility:

“DPDK is one model of open governance, with co-equal governing bodies; the Governing Board has ownership and oversight, over budget, marketing, lab resources, administrative, legal, and licensing issues, and a Technical Board with ownership and oversight on technical issues including approval of new sub-projects, deprecating old sub-projects, the project’s technical roadmap, recruiting maintainers, defining the processes for contributing, testing, and managing security. The Technical Board comprises individuals from various organizations, that are not necessarily corporate members of the project, recognized for their technical contributions. The governing board comprises representatives from member organizations, who financially support the project, working hand in hand to make the project mission a reality.” 

Other projects, such as LF Energy, take a somewhat different path in how their governance is structured.

LF Energy represents an example of open, representative governance within a rapidly growing open source foundation. LF Energy has a board of directors, like most foundations, made up of Premier members, and includes a representative from the General members and a representative from the Technical Advisory Council (TAC), which is made up of technical project leaders. No single company has more than one representative on the board, which provides corporate as well as cultural diversity and voices from all over the industry, not just focused on one niche. 

The Linux Foundation’s neutral program management support program can help

Active program management support is one of the main reasons why open source projects join an organization like the Linux Foundation. Our program management professionals provide a unique set of operational skills and capabilities that nearly all of our projects take advantage of, offloading operational and facilitation work from the community.

In summary, a successful project should have community coordination and program managers who can plan and build, measure the project’s performance, act as prime facilitators and advisers, and help project stakeholders iterate and reflect on their experiences in order to move the project forward.

“Managing Open source projects can be compared to nurturing a young sapling as it grows into a mature, healthy tree — or in this case, a community. Our job is to supply it with the right balance of nutrients and conditions for successful growth. Following proven governance models with strategic program management, helps increase the odds of nurturing a healthy community. Program Managers help clear the path, allowing communities to focus on the code and achieving technical goals. We are horticulturalists, toiling away in the background, and if we are doing our job correctly, you shouldn’t notice us.” 

Trishan de Lanerolle, Technical Program Manager & Community Architect, LF Networking

The TODO Group is a set of companies that collaborate on practices, tools, and other ways to run successful and productive open source projects and programs.

Open source program offices help set open source strategy and improve an organization’s software development practices. Every year, the TODO Group performs a survey to assess the state of open source programs across the industry, and today we are happy to launch the 2020 edition.

Last year, over 2,700 people participated in the survey. As a result, we were able to learn: 

  • Adoption of open source programs and initiatives is widespread and goes beyond the early adopters;
  • Hiring of open source developers is a prominent concern; and
  • Companies value their open source foundations.

In 2020, we want to learn from best practices in how companies create effective open source strategies, how their open source programs are structured, and how they measure success.

We are also asking how macroeconomic conditions and COVID-19 are affecting open source.

The Security of the Open Source Software Digital Supply Chain: Lessons Learned and Tools for Remediation

Introduction

It has been estimated that Free and Open Source Software (FOSS) constitutes 70-90% of any given modern software solution. FOSS is an increasingly vital resource in nearly all industries, in the public and private sectors, and among tech and non-tech companies alike. Therefore, ensuring the health and security of FOSS is critical to the future of nearly all industries in the modern economy.

In February of 2020, The Linux Foundation’s Core Infrastructure Initiative (CII), in partnership with the Laboratory for Innovation Science at Harvard (LISH), released the preliminary results of an ongoing study, ‘Vulnerabilities in the Core: a Preliminary Report and Census II of Open Source Software.’ This report represents the first steps towards understanding and addressing structural and security complexities in the modern-day supply chain where open source is pervasive but not always understood.

The initial report from the Census II study identifies the most commonly used free and open source software (FOSS) components in production applications. It begins to examine the components’ open source communities, which can inform actions to sustain the long-term security and health of FOSS. The stated objectives were:

  1. Identify the most commonly used free and open source software components in production applications.
  2. Examine for potential vulnerabilities in these projects due to:
    • Widespread use of outdated versions;
    • Understaffed projects; and
    • Known security vulnerabilities, among others.
  3. Use this information to prioritize investments/resources to support the security and health of FOSS.

What did the Linux Foundation and Harvard learn from the Census II study?

The study was the first of its kind to analyze the security risks of open source software used in production applications. This is in contrast to the earlier Census I study, which primarily relied on Debian’s public repository package data and on factors that would identify each package’s profile as a potential security risk.

In order to gain a better understanding of the commonality, distribution, and usage of open source software within large organizations, the study used software composition analysis (SCA) metadata supplied by Snyk and Synopsys. SCA automates visibility into the open source components used within software, and SCA tools are often used for risk management, security, and license compliance.

With this metadata, the study was able to create a baseline and unique identifiers for common packages and software components used by large organizations, which was then tied to a specific project. This baselining effort allowed the study to identify which packages and components were the most widely deployed.

The top-scoring, most widely deployed projects came under additional scrutiny and became the prime focus of the preliminary study; they were examined for total lines of code, total number of contributors, and frequency of commits during the 2018 calendar year.

Observations and analysis of these specific metrics led the study to come to certain preliminary conclusions. These were:

Software components need to be named in a standardized fashion for security strategies to be effective. The study found that the naming of packages and components across repositories was highly inconsistent, with no common conventions. Without industry participation in standardizing naming, any ongoing effort to create strategies for software security and transparency will have limited effect and will proceed more slowly.
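
As a purely illustrative sketch of why standardized naming matters, the snippet below maps the variant names under which one component might be reported back to a single canonical identifier in the Package URL (“purl”) style; purl is one emerging convention, not something the Census II report prescribes, and the alias table is a hypothetical example rather than data from the study.

    # Illustrative only: different tools and repositories often report the same
    # component under different names, which makes it hard to aggregate risk data.
    # The alias table below is a hypothetical example, not data from the study.

    # Variant names seen in different inventories, mapped to one canonical
    # Package URL ("purl") style identifier.
    ALIASES = {
        "commons-lang3": "pkg:maven/org.apache.commons/commons-lang3",
        "org.apache.commons:commons-lang3": "pkg:maven/org.apache.commons/commons-lang3",
        "Apache Commons Lang 3": "pkg:maven/org.apache.commons/commons-lang3",
    }

    def canonical_id(reported_name: str, version: str) -> str:
        """Return a single canonical identifier for a reported component name."""
        base = ALIASES.get(reported_name)
        if base is None:
            raise KeyError(f"No canonical mapping for {reported_name!r}")
        return f"{base}@{version}"

    print(canonical_id("commons-lang3", "3.12.0"))
    print(canonical_id("Apache Commons Lang 3", "3.12.0"))
    # Both print: pkg:maven/org.apache.commons/commons-lang3@3.12.0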

Developer accounts must be secured. The analysis of the software packages with the most dependents found that the majority were hosted under individual (personal) developer accounts. Lax developer security practices have considerable implications for the large organizations that use these software packages, because individual accounts have fewer protections and less granular permissions associated with them. For example, organizational accounts frequently employ multi-factor authentication (MFA), which individual developers might not, potentially exposing larger organizations to attack.

Project atrophy and contributor abandonment is a known issue with legacy open source software. The number of developer contributors who work on projects to ensure updates for feature improvements, security, and stability decreases over time as they prioritize other software development work in their professional lives or decide to leave the project for any number of reasons. Therefore, as time goes by, it is much more likely that these communities may face challenges without sufficient developers to act as maintainers.

Legacy open source is pervasive in commercial solutions. Many production applications incorporate legacy open source packages. This prevalence of legacy packages is an issue because they are often no longer supported or maintained by their developers, or they have known security vulnerabilities. They often lack updates for known security issues, both in their own codebase and in the codebases of the dependencies they require to operate. Legacy packages present a vulnerability to the companies deploying them in their environments. In essence, companies need to know what open source packages they have used and where, so that they can maintain and update these codebases over time.

What tools exist to better understand and mitigate potential problem areas in open source software development?

The Linux Foundation’s communities and other open source initiatives offer important standards, tooling, and guidance that help organizations and the overall open source community gain better insight into, and directly address, potential issues in their software supply chain.

Understand the vulnerability vectors of your software supply chain

Concurrent with the publication of the findings of the Census II study is the Open Source Supply Chain Security Whitepaper. This publication explores vulnerabilities in the open source software ecosystem through historical examples of weaknesses in known infrastructure components (such as lax developer security practices and end-user behavior, poorly secured dependency package repositories, package managers, and incomplete vulnerability databases) and provides a set of recommendations for organizations to navigate potential problem areas.

Focus on building security best practices into your open source projects

For open source software developers, the Linux Foundation develops and hosts the Core Infrastructure Initiative’s Best Practices badge program. This initiative was one of the first outputs produced as a result of Census I, completed in 2015. Since that time, over 3,000 open source software projects have engaged in, started, or completed the process of obtaining a CII Best Practices Badge.

Projects that conform to CII best practices can display a badge on their GitHub page, their web pages, and other material. In turn, consumers of the badge can quickly assess which FLOSS projects are following best practices and, as a result, are more likely to produce higher-quality, secure software. Additionally, a Badge API allows developers and organizations to query a project’s CII best practices status, such as passing, silver, or gold. This means any organization can add an API check to its workflow to see whether the communities behind the open source packages it uses have obtained a badge.
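
As a rough sketch of such a workflow check, the snippet below queries the badge application for one project’s status; the endpoint pattern and field names are assumptions based on the public badge application (project id 1, the badge application’s own entry, is used only as an example), so consult the Badge API documentation for the authoritative details.

    # Minimal sketch of querying the CII Best Practices badge application for a
    # project's badge status. Endpoint pattern and field names are assumptions;
    # see the Badge API documentation for authoritative details.
    import requests

    PROJECT_ID = 1  # example id; the badge application's own entry
    url = f"https://bestpractices.coreinfrastructure.org/projects/{PROJECT_ID}.json"

    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    project = resp.json()

    # Print whichever badge-related fields the API returns for this project.
    print(project.get("name"))
    print(project.get("badge_level"))         # e.g. "passing", "silver", or "gold"
    print(project.get("badge_percentage_0"))  # progress toward the passing level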

More information on the CII Best Practices Badging program, including background and criteria, is available on GitHub. Project statistics and criteria statistics are available. The projects page shows participating projects and supports queries (such as a list of projects that have a passing badge).

Gain better insights into the community developing your open source software

We encourage organizations and projects to join the CHAOSS community, whose work on tooling and risk metrics was leveraged in the Census II study. CHAOSS is a Linux Foundation project focused on creating analytics and metrics to help define the health of a software community. The community is working on open source tools, including:

Augur is a Python library and REST server used to mine metrics from git data, such as the number of committers and commits over a historical period.
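
Augur automates this kind of mining at scale; as a much simpler illustration of the underlying metric (not Augur’s actual API), the sketch below counts commits and distinct committers in a local git repository over a time window, with the repository path and date range as hypothetical examples.

    # Not Augur's API: a plain-git sketch of the kind of metric Augur automates,
    # counting commits and distinct committers over a time window.
    import subprocess

    def commit_stats(repo_path: str, since: str, until: str):
        """Return (commit_count, committer_count) for the given period."""
        emails = subprocess.run(
            ["git", "-C", repo_path, "log",
             f"--since={since}", f"--until={until}", "--pretty=%ce"],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()
        return len(emails), len(set(emails))

    # Hypothetical repository path; 2018 matches the window used in the study.
    commits, committers = commit_stats("/path/to/repo", "2018-01-01", "2018-12-31")
    print(f"{commits} commits by {committers} committers in 2018")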

GrimoireLab is an open development analytics platform that allows for automatic and incremental data gathering from almost any tool related to open source development, such as source code management, issue tracking systems, forums, and mailing lists. It also provides a data visualization dashboard that allows filtering by time range, project, repository, and contributor.

Equally important, the CHAOSS community has brought together academics and corporate practitioners of open source best practices to develop common CHAOSS Metrics that any organization can adopt and begin using. The Metrics project includes metrics focused on Risk (including Security), the Evolution of a project, and many more that leading open source organizations use to monitor their open source software supply chain.

Gain better insight into the open source software being used in your organization

The second major initiative by the Linux Foundation is Automating Compliance Tooling (ACT), launched in 2019, which comprises five major projects. As of today, these include:

FOSSology is an open source license compliance software system and toolkit that allows users to run license, copyright, and export control scans from the command line. A database and web UI are also included to support a compliance workflow, and license, copyright, and export control scanners are available to help with compliance activities.

OSS Review Toolkit (ORT) enables highly automated and customizable open source compliance checks of a project’s source code and dependencies by scanning it, downloading its sources, reporting any errors and violations against user-defined rules, and creating third-party attribution documentation. ORT is designed for the CI/CD world and supports a wide variety of package managers, including Gradle, Go modules, Maven, npm, and SBT.

QMSTR, also known as “Quartermaster”, creates an integrated open source toolchain that implements industry best practices for license compliance management. QMSTR integrates into build systems to learn about the software products, their sources, and their dependencies. Developers can run QMSTR locally to verify outcomes, review problems, and produce compliance reports. By integrating into DevOps CI/CD cycles, license compliance becomes a quality metric for software development.

Software Package Data Exchange (SPDX) is an open standard for communicating software bill of materials (SBOM) information that supports accurate identification of software components, explicit mapping of relationships between components, and the association of security and licensing information with each component.
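
To give a feel for what an SPDX document carries, here is a minimal, illustrative sketch that emits a fragment in SPDX tag-value style from Python; the package data is a hypothetical example, only a small subset of fields is shown, and a real SBOM should follow the full SPDX specification.

    # Minimal, illustrative sketch of SPDX tag-value output for one package.
    # The package data is hypothetical; a real SBOM has more required fields.
    package = {
        "name": "example-lib",
        "version": "1.2.3",
        "license_concluded": "Apache-2.0",
        "download": "https://example.com/example-lib-1.2.3.tar.gz",
    }

    lines = [
        "SPDXVersion: SPDX-2.2",
        "DataLicense: CC0-1.0",
        "SPDXID: SPDXRef-DOCUMENT",
        "DocumentName: example-lib-sbom",
        "",
        f"PackageName: {package['name']}",
        "SPDXID: SPDXRef-Package-example-lib",
        f"PackageVersion: {package['version']}",
        f"PackageDownloadLocation: {package['download']}",
        f"PackageLicenseConcluded: {package['license_concluded']}",
        "",
        # A relationship makes the component mapping explicit.
        "Relationship: SPDXRef-DOCUMENT DESCRIBES SPDXRef-Package-example-lib",
    ]
    print("\n".join(lines))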

Tern is an inspection tool to find the metadata of the packages installed in a container image. It provides a deeper understanding of a container’s bill of materials, so better decisions can be made about container-based infrastructure, integration, and deployment strategies.

The ACT projects are also working with initiatives within other open source communities to support accurate identification and sharing of software metadata in the ecosystem. Of particular note are:

ClearlyDefined, which is part of the Open Source Initiative (of which the Linux Foundation is an Associate member), is a shared repository of licensing and provenance information, built on a common database, that helps organizations understand the risks associated with using open source software.

Software Heritage, which is being developed in collaboration with UNESCO, is committed to collecting, indexing, preserving, and making readily available the source code of the software that lies at the heart of our culture. Like the Internet Archive (“Wayback Machine”), which seeks to preserve the history of the Internet and its content, Software Heritage is a historical preservation of open source software and packages that anyone can search and browse.

Conclusion

The preliminary findings of the Census II study strongly indicate that open source projects require supporting toolsets, infrastructure, staffing, and proper governance to act as a stable and healthy upstream project for your organization.

The Census II study shows that even the most widely deployed open source software packages can have issues with security practices, developer engagement, contributor exodus, and code abandonment.

Addressing these challenges is where an organization like the Linux Foundation, along with other nonprofit organizations, can offer significant assistance and support for a project community using a low-overhead model. The Linux Foundation is uniquely suited to providing not just tools, but also the governance, fundraising, and support programs that critical open source projects require in order to maintain a stable, secure, and reliable release model. These support programs include:

    • Providing funding support through membership models or securing one-off contributions through crowdfunding, leaving the complexities of managing the legal entity, financial oversight, and regulatory filings to professionals that are highly experienced and dedicated to their administration.
    • Providing base policies that offer a known framework for commercial organizations to collaborate, including an antitrust policy, trademark policy, templates for a code of conduct, and more.
    • Providing entity management for maintaining the core administrative support infrastructure that enables communities to interact, including hiring leadership and community support personnel, in order to facilitate and guide projects on an ongoing basis.
    • Supporting community events for face to face opportunities, as well as marketing and communications support to grow a project’s community audience and help people learn about the great things they contribute to.
    • Eliminating the burden of managing software releases through hiring neutral release engineers that support the maintainers.
    • Providing a free platform in the form of CommunityBridge to address common challenges with fundraising, mentoring, security vulnerability scanning, and managing automated Contributor License Agreements (“CLAs”).
    • Providing training and professional certification support that enables building an ecosystem of skilled professionals in order to use, implement, and manage solutions based on a project’s technology.
    • Providing support for license compliance, export control, and security by the routine scanning of project repositories in order to help the community identify license and security problems before an official release proliferates issues to downstream users.

In summary, the Linux Foundation supplies communities with a repeatable, proven governance model as well as value-added support programs to help communities maintain and scale. The ultimate goal is for our communities to become healthy, secure, and well-maintained upstream open source projects that your organization can rely on in its software supply chain.