This post originally appeared on LF Energy’s blog. LF Energy is a project at the Linux Foundation that provides a neutral, collaborative community to build the shared digital investments that will transform the world’s relationship to energy.

The energy sector is in the midst of a huge transformation that will affect the entire world, and grid operators need new innovations to keep pace.

That’s why we’re especially excited to see the recognition awarded to Antonello Monti, Director of the Institute for Automation of Complex Power Systems at RWTH Aachen University and Group Leader at the Center for Digital Energy, Fraunhofer FIT, for his leadership of SOGNO, the “Service-based Open-source Grid automation platform for Network Operation” of the future.

Monti received the Innovation Award of North Rhine-Westphalia, the second most prestigious award given by the German government. Awarded annually, this prize recognizes outstanding achievements and excellent research.

We are so proud of the work Monti, who also serves as the Technical Advisory Committee Chair for LF Energy, and Markus Mirz have undertaken. We also want to extend our congratulations to the many individuals, companies, and the European Commission, which funded the original work for SOGNO (meaning “dream” in Italian).

SOGNO is an LF Energy project that is creating plug-and-play, cloud-native microservices to implement our next generation of data-driven monitoring and control systems. It will simplify the life of distribution utilities by enabling them to optimize their network operations through open source and deliver cost-effective, seamless, and secure power to customers.

A breakthrough innovation is that SOGNO introduces the idea of grid automation as a modular system in which components can be added over time. This stands in contrast to classical monolithic solutions, which weren’t constructed with today’s energy landscape in mind.

Today, as more renewables come onto the grid, the flow of energy is no longer one way, as it was in the past, but moves both onto and off the grid. In the future, power system networks will be composed of assets whose profiles may shift between loads, resources, and providers of flexibility back to the grid.

Reinforcing the current system is not sufficient to deal with the increasing complexity of distribution systems. Rather, we are at the cusp of needing to deploy advanced distribution management systems that can be implemented as centralized architectures or, even better, as distributed ones.

We reiterate our deep gratitude and support for this project, and for the people and entities who are making it happen.

Read here for more information


Open source communities are driven by a mutual interest in collaboration and sharing around a common solution. They are filled with passion and energy. As a result, today’s world runs on open source software, which powers the Internet, databases, programming languages, and so much more. It is revolutionizing industries and tackling the toughest challenges. Just check out the projects fostered here at the Linux Foundation for a peek into what is possible.

What is the challenge? 

As the communities and the projects they support grow and mature, active engagement to recruit, mentor, and enable contributors is critical. Organizations are recognizing this as they become more and more dependent on open source communities. Yet, while the ethos of open source is transparency and collaboration, the toolchain to automate, visualize, analyze, and manage open source software production remains scattered, siloed, and of varying quality.

How do we address these challenges?

Involvement and engagement in open source communities now go beyond software developers and extend to engineers, architects, documentation writers, designers, Open Source Program Office professionals, lawyers, and more. To help everyone stay coordinated and engaged, a centralized source of information about their activities, tooling to simplify and streamline information from multiple sources, and a solution to visualize and analyze key parameters and indicators are critical. It can help:

  • Organizations wishing to better understand how to coordinate internal participation in open source and measure outcomes
  • CTOs and engineering leads looking to build a cohesive open source strategy 
  • Project maintainers needing to wrangle the legal and operational sides of the project
  • Individuals keeping track of their open source impact

Enter the Linux Foundation’s LFX Platform – LFX operationalizes this approach, providing tools built to facilitate every aspect of open source development and empowering projects to standardize, automate, analyze, and self-manage while preserving their choice of tools and development workflows in a vendor-neutral platform.

LFX tools do not disrupt a project’s existing toolchain; rather, they integrate a project’s community tools and ecosystem to provide a common control plane, with APIs drawing on numerous distributed data sources and operations tools. LFX also adds intelligence to drive outcome-focused KPIs and builds on a best-practices-driven, vendor-agnostic toolchain. It is the place to go for active community engagement and open source activity, enabling the already powerful open source movement to be even more successful.

How does it work? 

Much of the data and information that makes up the open source universe is, not surprisingly, open to see. For instance, GitHub and GitLab both offer APIs that allow third parties to track all activity on open projects. Social media and public chat channels, blog posts, documentation, and conference talks are also easily captured. For projects hosted at a foundation, such as the Linux Foundation, there is an opportunity to aggregate the public and semi-private data into a privacy-respecting, opt-in unified data layer.
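To make “open to see” concrete, here is a small illustrative Python sketch, not an LFX API, that uses the public GitHub REST API to tally commit activity per author for a public repository (the owner/repo below is just an arbitrary example). LFX Insights aggregates this kind of signal from many more sources, and at far larger scale.

```python
import collections

import requests

# Arbitrary example target; any public owner/repo works here.
OWNER, REPO = "todogroup", "ospology"
URL = f"https://api.github.com/repos/{OWNER}/{REPO}/commits"

commits_per_author = collections.Counter()
for page in range(1, 6):  # a few pages keeps us under unauthenticated rate limits
    resp = requests.get(
        URL,
        params={"per_page": 100, "page": page},
        headers={"Accept": "application/vnd.github+json"},
    )
    resp.raise_for_status()
    batch = resp.json()
    if not batch:
        break
    for commit in batch:
        # The git author name is always present; the linked GitHub account may be null.
        commits_per_author[commit["commit"]["author"]["name"]] += 1

for author, count in commits_per_author.most_common(10):
    print(f"{count:5d}  {author}")
```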

More specifically, for an individual organization or project, LFX is modular, extensible, and API-driven. It is pluggable and can easily integrate the data sources and tools already in use by organizations rather than force them to change their work processes. For instance:

  • Source control software (e.g. Git, GitHub, or GitLab)
  • CI/CD platforms (e.g. Jenkins, CircleCI, Travis CI, and GitHub Actions)
  • Project management (e.g. Jira, GitHub Issues)
  • Registries (e.g. Docker Hub)
  • Documentation (e.g. Confluence Wiki)
  • Marketing automation (e.g. social media and blogging platforms)
  • Event management platforms (e.g. physical event attendance, speaking engagements, sponsorships, webinar attendance, and webinar presentations)

This holistic and configurable view of projects, organizations, foundations, and more makes it much easier to understand what is happening in open source, from the most granular to the universal.

What do real-world users think? 

Part of LFX is a community forum to ask questions, share solutions, and more. Recently, Jessica Wagantall shared her experience with the Open Network Automation Platform (ONAP). She notes:

ONAP is part of the LF Networking umbrella and consists of 30+ components working together towards the same goal since 2017. Since then, we have faced situations where we have to evaluate if the components are getting enough support during release schedules and if we are identifying our key contributors to the project.

In this time, we have learned a lot as we grow, and we have had the chance to have tools and resources that we can rely on every step of the way. One of these tools is LFX Insights.

We rely on LFX Insights tools to guide the internal decisions and keep the project growing and the contributions flowing.

LFX Insights has become a potent tool that gives us an overview of the project as well as statistics of where our project stands and the changes that we have encountered when we evaluate release content and contribution trends.

Read Jessica’s full post for some specific examples of how LFX Insights helps her and the whole team. 

John Mertic is a seasoned open source project manager. One of his current roles is helping to manage the Academy Software Foundation. John shares:

The Academy Software Foundation was formed in 2018 in partnership with the Academy of Motion Pictures Arts and Sciences to provide a vendor-neutral home for open source software in the visual effects and motion picture industries.

A challenge this industry was having was that there were many key open source projects used in the industry, such as OpenVDB, OpenColorIO, and OpenEXR, that were cornerstones to production but lacked developers and resources to maintain them. These projects were predominantly single-vendor owned and led, and my experience with open source projects in other vertical and horizontal industries is that this situation leads to sustainability concerns, security issues, and a lack of future development and innovation.

As the project hit its 3rd anniversary in 2021, the Governing Board wanted to assess the impact the foundation has had on increasing the sustainability of these projects. There were three primary dimensions being assessed:

  • Contributor growth
  • Contribution growth
  • Contributor diversity

We at the LF know that seeing those metrics increasing is a good sign for a healthy, sustainable project.

Academy Software Foundation projects use LFX Insights as a tool for measuring community health. Using this tool enabled us to build some helpful charts which illustrated the impacts of being a part of the Academy Software Foundation.

We took the approach of looking at before-and-after data on contributor growth, contribution growth, and contributor diversity.

Here is one of the charts that John shared. You can view all of them in his post.


LFX dashboard example

Conclusion 

LFX will improve communication and collaboration, simplify management, surface the best projects and project leaders, and provide insightful guidance based on real data captured at scale, across the widest variety of projects ever collected into a single source of information. And it is available to you – all Linux Foundation members and projects have access to LFX. 

To learn more about what it can do for you and your organization and project(s), read our white paper (LINK), read posts in the LFX Community Forum, or just log in with your free LFID and give it a spin. And check back here on the LF Blog for more articles in the coming months on LFX – digging in deeper. 

If you would like to talk to someone at the Linux Foundation about LFX or membership, reach out to Jen Shelby at jshelby@linuxfoundation.org

OSPOlogy live workshops

As more and more organizations adopt open source initiatives and/or seek to mature their involvement in open source, they often face many challenges, such as educating developers on good open source practices, building policies and infrastructure, ensuring high-quality and frequent releases, engaging with developer communities, and contributing back to other projects effectively. They recognize that open source is a complex ecosystem, a community of communities. It doesn’t follow traditional corporate rules, so guidance is needed to navigate the cultural change involved.

To help address these challenges and take advantage of the opportunities, organizations are turning to open source program offices (OSPOs). An OSPO is designed to be the center of competency for an organization’s open source operations and structure. This can include setting code use, distribution, selection, auditing, and other policies, as well as training developers, ensuring legal compliance, and promoting and building community engagement that benefits the organization strategically. 

The Linux Foundation’s TODO Group’s mission is to help foster the adoption and improvement of OSPOs around the world. They are a tremendous resource, with extensive guides, a new mind map, an online course, case studies, and more. Check out their resources and community, and join their efforts.

Thanks in part to their efforts, the OSPO movement is expanding across industries and regions of all types and sizes. However, due to the wide range of responsibilities and ways to operate, OSPO professionals often find it difficult to implement OSPO best practices, policies, processes, or tools for their open source management efforts.

To help people with these challenges, the TODO Group is introducing a new framework for in-person OSPO workshops. The framework is publicly available in the ospology repository. This repo encapsulates a set of open initiatives (including an OSPO Mind Map 2.0, virtual global & regional meetings, an OSPO discussion forum, monthly OSPO News, and now, in-person workshops) that work in collaboration to study and discuss the status of OSPOs and, ultimately, make them even more effective.

TODO is piloting these in Europe first, and they are currently seeking collaborators to bring together the various communities involved in OSPO-specific topics and help organizations effectively implement OSPO programs based on the specific needs of the region.

Backing up a bit, let’s look at the OSPOlogy.live framework. 

OSPOlogy.live framework in a nutshell

  • Follows an “unconference style,” meaning it’s a participants-driven meeting
  • Adheres to the Chatham House Rule in order to share openly and learn from each other 
  • Connects OSPOs with various open source communities involved in the open source activities that matter to them (e.g. policies, tooling, standards, and community building)
  • Takes place over two days and is an in-person event
  • Consists of prepared presentations, hands-on workshops, and space for networking
  • Falls under the Linux Foundation’s policies and code of conduct
  • Held at a location provided by one of the participants for free
  • Each participant pays for their own food, travel, and lodging. Meals may be free if workshop organizers find sponsors.
  • Participants can register their interest to receive an invite via Linux Foundation’s community platform as seats are limited.

With that overview, let’s dig in a little on how the workshop is conducted.

Unconference style

Typically at an unconference, the agenda of the workshop portion is created by the attendees at the beginning of the meeting. Anyone who wants to initiate a discussion on a topic can claim a time and a space. OSPOlogy workshops are not fully an unconference as the first day is a series of prepared presentations, so you know what the sessions are before joining (1 or 2 will be chosen by the participants ahead of time). For Day 2, the workshops follow the unconference model. Participants vote on topics to be worked on that day. Participants may be asked to submit their topic before the workshop to accelerate/simplify the voting process.

Suggested workshop sections

  • OSPO USE CASES ➡️ Expert-led panels or talks to share experiences and case studies from specific OSPOs
  • OSPO ACCELERATORS ➡️ Presentations highlighting a specific activity within a specific project, such as outcomes of recent community activities. The aim is to give people insights on the various topics the communities are working on and to get their feedback or ask for contributions.
  • SHARED CHALLENGES ASSESSMENT ➡️ Identify OSPO shared challenges / pain points on the OSPO Mind Map 2.0 and let the audience vote for the areas of interest (working groups) for the workshop breakout groups. For instance, focus areas can be specific activities within OSPO responsibilities.
  • BREAKOUT SESSIONS ➡️ Define goals and identify pain points. Each breakout group aims to capture its challenges for the selected focus and, if possible, document experiences and solutions.
  • NETWORKING

Interested in becoming a collaborator?

We can’t do this alone! If you are part of an open source community involved in OSPO-specific topics, or an organization willing to help with the workshop planning and scheduling and/or provide a space to kick off the first meet-up in Europe, we need your help! Please contact:

And check out the FAQs below. 

Don’t live in Europe? Pencil us in for when this is expanded. 

Not involved in an OSPO yet? Take time to check out the TODO Group and join the community to start your OSPOlogy journey.

Also, consider joining OSPOCon North America next week, June 21-24, 2022, either in Austin, Texas during the Open Source Summit or virtually. Register here.



Frequently Asked Questions

What do we mean by communities involved in OSPO-specific topics?

OSPO-specific topics range from safely using open source to license compliance, sustainability, contributing back to the community, and more. For the full list of OSPO topics please see https://ospomindmap.todogroup.org/:

  • Develop and Execute Open Source Strategy
  • Oversee Open Source Compliance
  • Establish and Improve Open Source Policies and Processes
  • Prioritize and Drive Open Source Upstream Development
  • Collaborate with Open Source Organizations
  • Track Performance Metrics
  • Implement InnerSource Practices
  • Grow and Retain Open Source Talent Inside the Organization
  • Give Advice on Open Source
  • Manage Open Source IT Infrastructure

Some examples of OS communities highly involved in these topics are:

What are the necessary roles to set up an OSPOlogy.live workshop?

There are two ways in which you can play your part in the OSPOlogy.live setup: (1) the hosting party, who makes a meeting room available; and (2) the workshop organizer/facilitator, in charge of workshop activities and planning. (1) and (2) may be the same entity/individual. Further details can be found in the framework documentation.

Where can I register for the next OSPOlogy.live?

Efforts are already under way to organize the OSPOlogy workshops in different European countries each quarter. Once collaborators and dates are confirmed, registration details and schedules will be published via the OSPOlogy community platform.

For further updates, please subscribe to OSPONewsletter and join the TODO community.

SBOMs and food labels

Software Bills of Materials (SBOMs) are like ingredient labels on food. They are critical to keeping consumers safe and healthy, and they are somewhat standardized, but it is a lot more exciting to grow or make the food than to produce the label.

What is an SBOM?

What is an SBOM? In short, it is a way to tell another party all of the software that is used in the stack that makes up an application. One benefit of having an SBOM is that you know what is in there when a vulnerability comes up. You can easily determine if you are vulnerable and where.

As modern software is built on a base of software already written (no sense in reinventing the wheel), it is important that all of the components don’t get lost in the shuffle. It isn’t readily apparent what a particular piece of software utilizes. So, if a vulnerability in Software A arises, you need to know whether you have that piece of software somewhere in your ecosystem and, if so, where. Then you can remediate if you need to.
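As a rough illustration of that lookup, and not a recommendation of any particular tool, the sketch below scans an SPDX-formatted SBOM in its JSON form for a named component; the file name and component name are placeholders you would swap for your own.

```python
import json
import sys


def find_component(sbom_path: str, component: str):
    """Return (name, version) pairs for packages in an SPDX JSON SBOM matching component."""
    with open(sbom_path) as f:
        doc = json.load(f)
    hits = []
    for pkg in doc.get("packages", []):  # "packages" is a standard SPDX 2.x JSON field
        if pkg.get("name", "").lower() == component.lower():
            hits.append((pkg["name"], pkg.get("versionInfo", "unknown version")))
    return hits


if __name__ == "__main__":
    sbom_file, component = sys.argv[1], sys.argv[2]  # e.g. sbom.spdx.json log4j-core
    matches = find_component(sbom_file, component)
    if matches:
        for name, version in matches:
            print(f"FOUND {name} {version}: check whether this version is affected")
    else:
        print(f"{component} is not listed in {sbom_file}")
```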

I can’t take credit for the food label analogy used in my introduction. I heard it from Allan Friedman, a Senior Advisor and Strategist at the U.S. Cybersecurity and Infrastructure Security Agency (CISA) and a key SBOM advocate, when he presented about SBOMs at the RSA Conference 2022 with Kate Stewart, the VP of Dependable Embedded Systems here at the Linux Foundation. Allan made the point that food labels only provide information. The consumer needs to read and understand them and take appropriate action. For instance, if they are allergic to peanuts, they can look at an ingredient label and determine if they can safely eat the food. 

SBOMs are similar – they tell a person what software is used as an “ingredient” so someone can determine if they need to take action if a vulnerability arises. It isn’t a silver bullet, but it is a vital tool. Without SBOMs no one can track what component “ingredients” are in their software applications.

SBOMs and the Software Supply Chain

Supply chains are impacting our lives more than just restricting availability of consumer goods. Software supply chains are immensely more complicated now as software is built with pre-existing components. This makes software better, more effective, more powerful, etc. But it also introduces risk as more and more parties touch a particular piece of software. Much like our world has become so interdependent, so has our software. 

Understanding what is in the supply chain for our software helps us effectively secure it. When a new risk emerges, we know what we need to do. 

SBOMs and Software Security

SBOMs are increasingly being recognized as an important pillar in any comprehensive software security plan. A global survey conducted in 2021 Q3 by the Linux Foundation found that 78% of organizations responding plan to use SBOMs in 2022. Additionally, the recently published Open Source Software Security Mobilization Plan recommends SBOMs be universal and the U.S. Executive Order on Improving the Nation’s Cybersecurity requires SBOMs be provided for software purchased by the U.S. government. And, as Allan points out in his talk, “We buy everything.” The E.O. actually lays out a nice summary of SBOMs and their benefits: 

The term “Software Bill of Materials” or “SBOM” means a formal record containing the details and supply chain relationships of various components used in building software.  Software developers and vendors often create products by assembling existing open source and commercial software components.  The SBOM enumerates these components in a product.  It is analogous to a list of ingredients on food packaging.  An SBOM is useful to those who develop or manufacture software, those who select or purchase software, and those who operate software.  Developers often use available open source and third-party software components to create a product; an SBOM allows the builder to make sure those components are up to date and to respond quickly to new vulnerabilities.  Buyers can use an SBOM to perform vulnerability or license analysis, both of which can be used to evaluate risk in a product.  Those who operate software can use SBOMs to quickly and easily determine whether they are at potential risk of a newly discovered vulnerability.   A widely used, machine-readable SBOM format allows for greater benefits through automation and tool integration.  The SBOMs gain greater value when collectively stored in a repository that can be easily queried by other applications and systems.  Understanding the supply chain of software, obtaining an SBOM, and using it to analyze known vulnerabilities are crucial in managing risk.

Allan and Kate spent time in their talk going into the current state of SBOMs, challenges, benefits, tools available for creating and sharing SBOMs, what is a minimum SBOM, standards being developed, making them fully automated, and more. Look for some future LF Blog posts digging into these. 

But there are things you can do now. 

What can you and your organization do now?

Allan and Kate laid out several things you and your organization can do, beginning now. Starting within your organization:

Next week: Understand origins of software your organization is using

  • Commercial: can you ask for an SBOM?
  • Open source: do you have an SBOM for the binary or sources you’re importing? 

Three months: Understand what SBOMs your customers will require

  • Expectations: which standards, dependency depth, licensing info?

Six months: Prototype and deploy

  • Implement SBOMs using an OSS tool and/or start a conversation with your vendor

And participate in ongoing discussions to determine best practices for the ecosystem, and contribute any code developed to support SBOMs back to open source projects.

But there are also steps you can take as an individual:

Next week: Start playing with an open source SBOM tool and apply it to a repo

Three months: Have an SBOM strategy that explicitly identifies tooling needs

Six months

  • Begin SBOM implementation using an OSS tool or start a conversation with your vendor
  • Participate in a plugfest and try to consume another’s SBOM

And make sure to share any open source and commercial tools you find helpful, and work with the tool maintainers to help harden them, test and report bugs, and push them to scale.

How can you shape the future of SBOMs?

First, I want to highlight some upcoming opportunities they shared to help shape the future of SBOMs. CISA is running public Tooling & Implementation work stream discussions in July 2022. The sessions cover the same content but occur at different times to accommodate more time zones:

  • July 13, 2022 – 3:00-4:30 PM ET
  • July 21, 2022 – 9:30-11:00 AM ET 

If you want to participate, please email SBOM@cisa.dhs.gov

Additionally, there will be “plugfests” to be announced soon, and they suggested organizations already adopting SBOMs publish case studies and reference tooling workflows to help others. 

Conclusion

SBOMs are here to stay. If you aren’t already, get on the train now. It is pulling out of the station, but you still have an opportunity to help shape where it is going and how well the journey goes. 

Allan’s and Kate’s slides are available here. If you registered to attend the RSA Conference, you can now watch their full presentation on demand here.



The Software Package Data Exchange® (SPDX®)

The Linux Foundation hosts SPDX, an open standard for communicating software bill of materials information, including components, licenses, copyrights, and security references. SPDX reduces redundant work by providing a common format for companies and communities to share important data, thereby streamlining and improving compliance. The SPDX specification is an international open standard (ISO/IEC 5962:2021). Learn more at spdx.dev.

O3D community building a first-class, open-source 3D engine to advance development across gaming, the metaverse, and a variety of other applications

SAN FRANCISCO – June 15, 2022 – The Open 3D Foundation (O3DF), the home of a vibrant community focused on advancing the future of open 3D development, announces its growing ecosystem with the addition of LightSpeed Studios as a Premier member alongside Adobe, AWS, Huawei, Intel, Microsoft and Niantic.

Today’s top-quality 3D engines are as complex as operating systems, requiring significant time, cost, and human capital investments to keep pace with advancements. Open source has repeatedly proven to be the quickest path to innovation. The Open 3D Engine (O3DE) offers a high-fidelity, fully featured, open source alternative poised to revolutionize real-time 3D development across a variety of industries—from game development, the metaverse, AI, and digital twins, to automotive, healthcare, robotics and more.

As a Premier member, LightSpeed Studios will bring its leadership and wealth of experience in global research and development of high-quality games to help drive the development of O3DE’s specifications and initiatives. Tencent Senior Project Manager, Lanye Wang, will join the Open 3D Foundation’s Governing Board, helping shape the Foundation’s strategic direction and its stewardship of 3D visualization and simulation projects. 

“We are very excited to join the Open 3D Foundation, especially for the opportunity to leverage the connection with all of the other members to dive deep into the graphic technologies and build a top-level open source 3D engine community,” said Lanye Wang, representing LightSpeed Studios. “We look forward to working with you.”

LightSpeed Studios is one of the world’s most innovative and successful game developers, with teams around the world. Founded in 2008, LightSpeed Studios has created over 50 games across multiple platforms and genres for over 4 billion registered users. Comprised of passionate players who advance the art and science of game development through great stories, great gameplay and advanced technology, LightSpeed Studios is focused on bringing next-generation experiences to gamers who want to enjoy them anywhere, anytime across multiple genres and devices.

“It has been amazing to see the rapid growth of the O3D ecosystem, and we’re elated to welcome LightSpeed Studios to our community,” said Royal O’Brien, Executive Director of Open 3D Foundation and General Manager of Games and Digital Media at the Linux Foundation. “LightSpeed Studios has achieved a strong reputation as a leading global game developer, offering high-quality gaming experiences to hundreds of millions of users worldwide, and we are excited to collaborate with them as we enhance O3DE’s capabilities for global 3D developers.”

A Growing Community

LightSpeed Studios is one of 25 member companies that have joined since the public announcement of the Open 3D Foundation in July 2021. Other Premier members include Adobe, AWS, Huawei, Intel, Microsoft and Niantic.

In May, O3DE announced its latest release, focused on performance, stability and usability enhancements. With over 1,460 code merges, this new release offers several improvements aimed to make it easier to build 3D simulations for AAA games and a range of other applications. Significant enhancements include core stability, installer validation, motion matching, user-defined property (UDP) support for the asset pipeline, and automated testing advancements. The O3D Engine community is very active, averaging up to 2 million line changes and 350-450 commits monthly from 60-100 authors across 41 repos.

Where to See the O3D Engine Next

On October 17-19, the Open 3D Foundation will host O3Dcon, its flagship conference, bringing together technology leaders, indie and independent 3D developers, and the academic community to share ideas, discuss hot topics and foster the future of 3D development across a variety of industries and disciplines. For those interested in sponsoring this event, please contact sponsorships@linuxfoundation.org

Anyone interested in the O3D Engine is invited to get involved and connect with the community on Discord.com/invite/o3de and GitHub.com/o3de

About the Open 3D Engine (O3DE) project

O3D Engine is the flagship project managed by the Open 3D (O3D) Foundation. The open-source project is a modular, cross-platform 3D engine built to power anything from AAA games to cinema-quality 3D worlds to high-fidelity simulations. The code is hosted on GitHub under the Apache 2.0 license. To learn more, please visit o3de.org.

About the Open 3D Foundation

Established in July 2021, the mission of the Open 3D Foundation (O3DF) is to make an open-source, fully-featured, high-fidelity, real-time 3D engine for building games and simulations, available to every industry. The Open 3D Foundation is home to the O3D Engine project. To learn more, please visit o3d.foundation.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

Media Inquiries:

pr@o3d.foundation

# # #

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

A brief account of my experience with the Linux Foundation Mentorship.

The post originally appeared on deprov477’s blog. The author, Anubhav Choudhary, participated in the Linux Foundation’s Mentorship Program in 2022. The program is designed to equip developers — many of whom are first-time open source contributors — with the necessary skills and resources to learn, experiment, and contribute effectively to open source communities. By participating in a mentorship program, mentees have the opportunity to learn from experienced open source contributors as a segue to internship and job opportunities upon graduation. If you are interested, we invite you to learn more and apply today here.


Hi everyone, I recently completed my LFX Mentorship project. I was a mentee for the LFX Mentorship summer term of 2022 at Pixie, a CNCF sandbox project donated by New Relic.

In this blog, I will be sharing my experience of the mentorship. (TL;DR: just an awesome, one-of-a-kind experience <3) If you're also applying for this (which every open-source newbie should), or have any doubts, feel free to drop me a message. I’d be more than happy to help.

What is LFX Mentorship?

Let’s start by learning about The Linux Foundation. The Linux Foundation (LF) is a non-profit organization that supports the development of the Linux kernel and also promotes open source projects such as Kubernetes, GraphQL, Hyperledger, RISC-V, the Xen Project, etc.

The Linux Foundation Mentorship is a program run by LF, which helps developers gain the necessary skills and resources to learn and contribute to open source projects through a 3- or 6-month internship. During this period, the mentee works on a project and is guided through the development workflow and methodologies used by open source organizations.

Selection procedure

I’ve been involved in open source for some time and have been applying for the mentorship, but got rejected every time.

This time, too, I was going through the projects and found one that was particularly interesting. It was about parsing a protocol. This caught my eye because at that time I was learning networking and experimenting a lot with communications. So naturally, I got interested. After reading the project details, I went to the project’s Slack channel to find a mentor. Omid, one of Pixie’s founding engineers, was kind enough to reply to my message and asked for a quick call.

I talked to him and told him about my interest and how I had made a preliminary Mongo wire protocol parser using Node.js as preparation. He seemed satisfied with this and told me about the further steps and the time commitment.

Other formalities included submitting a cover letter and my resume.

A few days later, I got this:

LFX acceptance message: “Hi Anubhav”

Finally, after applying so many times, I got selected!!!

Month 1

The program started, and I was introduced to my mentor, Yaxiong Zhao, another founding engineer at Pixie. He told me what we were going to do over the next 3 months. He demoed the Pixie UI, explained to me how it works and how Pixie captures packets (hint: eBPF), and then sent me the AMQP spec sheet and explained how it needed to be implemented in C++.

Yes, the protocol changed from Mongo to AMQP, and the language from Node.js to C++. But I guess a very important survival skill in industry is being flexible.

So, in the first month, I gained theoretical knowledge of the AMQP wire spec and experimented with it by deploying a local RabbitMQ server and monitoring packets using Wireshark. My mentor also tried to help me build Pixie on my local machine, but we failed, even after switching distros. In the end, we were able to set up my dev environment inside a container.
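To give a flavour of what experimenting with the wire spec looked like, here is a minimal sketch of splitting a captured byte stream into AMQP 0-9-1 general frames. This is my own illustration in Python; the parser I actually worked on for Pixie is written in C++.

```python
import struct

# Frame-type and frame-end constants from the AMQP 0-9-1 specification.
FRAME_METHOD, FRAME_HEADER, FRAME_BODY, FRAME_HEARTBEAT = 1, 2, 3, 8
FRAME_END = 0xCE


def parse_frame(buf: bytes):
    """Parse one general frame: type (1 byte), channel (2), size (4), payload, frame-end."""
    if len(buf) < 7:
        return None  # need more data for the 7-byte frame header
    frame_type, channel, size = struct.unpack(">BHI", buf[:7])
    if len(buf) < 7 + size + 1:
        return None  # payload and trailing frame-end octet not fully buffered yet
    payload = buf[7:7 + size]
    if buf[7 + size] != FRAME_END:
        raise ValueError("malformed frame: missing frame-end octet")
    rest = buf[8 + size:]
    return frame_type, channel, payload, rest
```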

…quite a month

Month 2

In the first half of this month, I continued my research on AMQP (apparently, implementing a protocol requires a lot of extensive reading), found analogies between it and protocols I was already familiar with, and kept on manually experimenting with packet translation.

In the third week of the month, it was finally time for me to start writing some code. Okay, so this was the difficult part. Having very limited knowledge of C++, I continued forward anyway. My mentor was an angel at this point, very patiently explaining things, pointing me in the right direction, and making sure I understood every bit of syntax required. I started by implementing a data structure for storing packets and creating relations between them. After some effort, I finally got my PR merged.

AMQP types header file

Month 3

Continuing my coding work, I started building the parser code. Yaxiong was very patient and helpful during this time, sending me blogs and guides and clearing up every little doubt I had. Thanks to him, I was finally able to submit my preliminary parser code.

And the final task was to write tests. I learned Google’s C++ testing library, wrote the tests, and pushed the code.

Concluding the program

Like every good thing, this also came to an end. Twelve weeks just flew by, faster than you can imagine. The program opened up a new world of open source and introduced me to a lot of professional tools and etiquette. I appreciate the time and effort my mentor put into this program.

Completing this internship was a dream come true, achieved while dodging tonnes of problems: internet, college, placement preparation, exams, everything. At many points in the internship, I was very certain I wouldn’t be able to complete the project. But:

At some point, everything’s gonna go south on you… everything’s going to go south and you’re going to say, this is it. This is how I end. Now you can either accept that, or you can get to work. That’s all it is. You just begin. You do the math. You solve one problem… and you solve the next one… and then the next. And If you solve enough problems, you get to come home.

— Mark Watney, The Martian.

Juju and Charmed Operators for the FINOS Legend open source project

The article, by Srikrishna ‘Kris’ Sharma of Canonical, originally appeared on the FINOS Community Blog. It is another example of enterprises open sourcing their code so that they can “collectively solve common problems so they can separately innovate and differentiate on top of the common baseline.” Read more about Why Do Enterprises Use and Contribute to Open Source Software.

Orchestrating Legend with Juju

Goldman Sachs open sourced the code for its internally developed Legend data management platform and contributed it to FINOS in October 2020. Legend provides an end-to-end data platform experience covering the full data lifecycle. It encompasses a suite of data management and governance components known as the Legend Platform. Legend breaks down silos and builds a critical bridge over the historical divide between business and engineering, allowing companies to build data-driven applications and insightful business intelligence dashboards.

Accelerate FINOS Open Source Project Adoption

Ease and speed of deployment enable innovation and lower the barrier to entry for open source consumption and contribution. Engineering experience is about leveraging software ops automation to demonstrate the impact of an open source project to the community, and an awesome engineering experience is often what enables wider adoption of and contribution to an open source project.

Over the last few months, Canonical has been working closely with FINOS and its community members to offer a consistent way to deploy and manage enterprise applications using Juju and Charmed Operators, with a focus on Day 2 operations. The idea is to provide a software ops automation framework and toolkit that enables the DevOps teams at financial institutions to realise the benefits of rapid deployment/testing and application management using a platform that is 100% open source, vendor-agnostic and hybrid-multi-cloud ready.

What are Juju and Charmed Operators?

Charmed Operator:

A charmed operator (also known, more simply, as a “charm”) encapsulates a single application and all the code and know-how it takes to operate it, such as how to combine and work with other related applications or how to upgrade it. Charms are programmed to understand a single application, its operations, and its potential to integrate with other applications. A charm defines and enables the channels by which applications connect. Hundreds of charms are available at charmhub.io.
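For readers who have never seen charm code, here is a minimal, illustrative skeleton written with Canonical’s Python operator framework (ops). It is not one of the Legend charms; it only shows the general shape of a charm, which observes Juju lifecycle events and encodes the operational response to each. The log-level config option is a made-up example.

```python
from ops.charm import CharmBase
from ops.main import main
from ops.model import ActiveStatus, MaintenanceStatus


class DemoCharm(CharmBase):
    """A toy charm: reacts to Juju lifecycle events with application-specific logic."""

    def __init__(self, *args):
        super().__init__(*args)
        # Juju emits events (install, config-changed, relation-joined, ...);
        # the charm maps each one to the operational know-how for its application.
        self.framework.observe(self.on.install, self._on_install)
        self.framework.observe(self.on.config_changed, self._on_config_changed)

    def _on_install(self, event):
        self.unit.status = MaintenanceStatus("installing workload")
        # fetch and install the workload here

    def _on_config_changed(self, event):
        log_level = self.config.get("log-level", "info")  # hypothetical config option
        # render configuration, restart services, etc., then report status to Juju
        self.unit.status = ActiveStatus(f"ready (log-level={log_level})")


if __name__ == "__main__":
    main(DemoCharm)
```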

Juju Operator Lifecycle Manager (OLM) is a hybrid-cloud application management and orchestration system for installation and day 2 operations. It helps deploy, configure, scale, integrate, maintain, and manage Kubernetes native, container-native and VM-native applications—and the relations between them.

Juju allows anyone to deploy and operate charmed operators (charms) in any cloud, including Kubernetes, VMs and bare metal. Charms encapsulate the application plus deployment and operations knowledge in one single reusable artefact. Juju manages the lifecycle of applications and infrastructure stacks from cloud to the edge. Juju is cloud-vendor agnostic and hybrid-multi-cloud by nature: it can manage the lifecycle of applications in public clouds, private clouds, or on bare metal. Once bootstrapped, Juju offers the same deployment and operations experience regardless of the cloud vendor.

The Legend Charm Bundle

In the spirit of providing an enterprise-grade automated deployment and maintenance experience to FINOS members, Canonical created a charmed bundle for Legend and contributed it to FINOS.

The Legend Charm Bundle provides a simple, efficient and enterprise-ready way to deploy and orchestrate a Legend instance in various environments across the CI/CD pipeline, from developer’s workstation to production environment. The bundle includes several Charmed Operators, one for each Legend component.

Why a Legend Charm Bundle?

  1. A simple way to evaluate Legend
    One can spin up a Legend environment from scratch using a single command: juju deploy finos-legend-bundle
  2. An intuitive approach (for banks and other financial institutions) to spin up production environments
  3. Provides orchestration capabilities, not only deployment scripting
  4. Easily plugs into Legend release lifecycle and simplifies Legend FINOS instance maintenance

The Legend charm documentation resides in the finos/legend-integration-juju GitHub repository, along with links to the related repositories for its multiple components.

Detailed instructions are available for local and cloud installations if you would like to spin up your own Legend instance within a few minutes and start using Legend either locally or on AWS EKS.

Priyanka Sharma

In open source communities, we meet people every day.  We probably know their current role and responsibilities, but we don’t always have perspective on the history, education, and career path that made them who they are.  These are some of the untold stories of open source.  

At the Linux Foundation, we’re a couple of weeks away from launching a new podcast series, The Untold Stories of Open Source.  For our blog readers, you’re getting a sneak peek into a few of the stories that will kick off our series.  Today, we’ll share perspectives from episode 1, Priyanka Sharma.

After Graduating

Priyanka Sharma is an evangelist for the power of community in open source. Okay, she is much more than that, and we will get to that in a bit, but her passion and what drives all of her other successes in open source is the power of an inclusive, supportive community. 

Priyanka didn’t begin in open source. After graduating from Stanford University in 2009 with a degree in computer science, she started her career at Google in the online partnerships group, where she was a technical consultant onboarding new DoubleClick clients and acted as an interim project manager for internal insights tools. Following Google, she held roles at Outright and GoDaddy, including integrating the Outright product into the GoDaddy sales catalog. However, she had been bitten by the build-a-business bug years earlier. In 2014, she gathered up some ideas and funding and began experimenting with consumer products, but nothing stuck.

A Road to TechCrunch Disrupt

She realized that her business partner had built a time-tracking app for himself that was geared towards software developers. It was plugin based, so you could put it into your IDE and have time tracking at your fingertips. After all, who wants to track time? The easier you make it, the better.

All of the plugins were open source – introducing her to the world that she was about to live in. She noticed how people were drawn to the plugins, customizing them to work better for what they needed. She thought, “Maybe this is what we should focus on.” So, down a path she couldn’t have seen coming, she ended up getting into developer tools. The plugins were eventually used by 100,000 developers, featured at TechCrunch Disrupt, and chosen by Y Combinator.

Setting Out on Her Own

But, as she says, “All that glitters isn’t gold.” There were challenges every day, as with any startup, from fundraising to public visibility. Getting into Y Combinator was a pivotal moment, forcing the team to come to terms with what it would take to make a real commitment to the project together, as a team.

Priyanka thought back to that time, “I think you can overcome anything when you are part of a team when you jive with each other, where everyone is aligned on the final outcome. When that is not the case, it is very tricky because everyone is going towards different goals. That is the meta issue that led us to go our different ways.” 

Now out on her own, she realized that there were not many people who understood marketing developer tools or a go-to-market strategy for developer tools. So, she began working with Heavybit, an accelerator and incubator for developer products. “They really took me in and gave me opportunities to help their portfolio companies.” Her work helped Rainforest QA, Lightstep, LaunchDarkly, and Postman.

Reflecting on Ben’s Approach

She ended up joining the Lightstep team because she saw not only the value of their reputation but was also drawn to the top-notch team and what they could teach her. Part of the draw was Dapper, a tool built at Google to provide developers with a distributed tracing system for exploring the behavior of complex distributed systems. Dapper sparked many tools that weren’t anticipated by its initial developers. Ben Sigelman is a co-creator of Dapper and of the OpenTracing and OpenTelemetry projects, now part of the Cloud Native Computing Foundation (CNCF). “Ben’s approach was very much as an educator. There are lots of experts out there, but if they aren’t interested in teaching, I don’t get any value in it.”

As the second hire at Lightstep, she had a variety of roles, including developer relations, marketing, documentation, and more. 

The initial focus of the company was on OpenTracing. It was initially an independent open source project, but the team eventually decided to join the Cloud Native Computing Foundation to give it more firepower than “us by ourselves.”

Now, between her startup and Lightstep, she heard more and more about open source. She was drawn to the value placed on creation and collaboration. 

Evolving to Cloud Native

Priyanka attributes the growth of cloud native to the fact that the core group welcomed everyone. You can see that in person at KubeCon + CloudNativeCon, the largest open source events in the world. She recalls how nervous she was going to her first KubeCon, feeling out of her element, but as soon as she walked through the doors, everyone was so welcoming and inclusive.

Dan Kohn built CNCF into one of the most successful open source foundations in the world in large part because it was built on being an open and welcoming community. Priyanka recalls, “Dan baked DEI into everything at CNCF from day one. . . He set the example and put it into the structure.” 

Priyanka felt welcomed into the community and began asking for opportunities to participate. Sometimes the answer was yes, sometimes it was no thank you. But she still felt she had the support of the community. She had a sense of belonging for the first time in her career. 

In 2018, she joined GitLab as director of technical evangelism, where she formed the technical thought leadership team. She was also in charge of cloud native alliances. At the urging of her boss at GitLab, she put her name forward to be elected to the CNCF Board of Directors. 

While on the CNCF Board, she was energized by several other women on the Board. She said they set the bar high with a focus on the project’s good at all times. 

Fast forward. Now, Priyanka is the general manager of the CNCF, leading one of open source’s largest and most effective foundations. 

Seeking More Insight

You can listen to the full episode with her story on the Untold Stories of Open Source podcast and hear about the power of the CNCF community and its impact. 

The Untold Stories of Open Source is a new podcast from the Linux Foundation to share the stories behind those in open source. Take time to listen to all of the episodes and let us know what you think (or if you have suggestions of stories to be told). Look for the formal launch at Open Source Summit North America and OpenSSF Day on June 20, 2022. 

There are thousands of incredible open source stories to share and we’re looking forward to bringing more of them your way.  If you like what you hear, we encourage you to add the series to your playlist.  

For those seeking even more open source stories from across the Linux Foundation and the communities we serve, you might start with some of the other storytelling pioneers, including Open Source Stories, FinOpsPod, I am a Mainframer, and The Changelog. As we grow deeper roots in the podcasting arena, we’ll introduce more news about a network of open source podcasts.

Have even more time? Feedspot recently covered an additional 40 Open Source Podcasts worth listening to on your morning walk or commute home from the office.

There are some universal truths about open source software (OSS). It has revolutionized our world and become the foundation of our digital society, the backbone of our digital economy, and the basis of our digital existence. Every household and enterprise brand name in technology is built upon it, whether that name is Alexa or Android, Azure, or AWS. 

Open source software has played a significant part in everything from the internet and mobile apps we use every day to operating systems and programming languages used to construct the future. Even the systems we traditionally think of as being closed, such as Microsoft Windows and Apple’s Mac and iPhone, are developed using open source software.

Just as a powerful current drives a river, open source software is the force that propels our digital economy and allows for scientific and technological advancements that benefit our lives.

But only a few decades ago, few people had even heard of open source software, and its use was limited to a small group of enthusiastic devotees. Yet the concept of free and open source software (FOSS) has been around a long time, going back to the early days of the user communities for IBM mainframes and academic institutions. FOSS is software that anyone can use, study, modify, and distribute without restriction. The term “open source” was coined to describe this type of software, and the concept was formalized with the launch of the Open Source Initiative (OSI) in 1998.

Organizations involved in building products or services involving software, regardless of their specific industry or sector, are likely to adopt OSS and contribute to open source projects deemed critical to their products and services. Organizations are creating open source program offices (OSPOs) to manage their open source activities, from adopting OSS and compliance with applicable licenses to participating in open standards and foundations. 

Many new industries and thousands of businesses have joined the open source revolution. Those organizations that chose a deliberate OSS strategy, incorporating best practices,  methods, and engineering processes, emerged as leaders in their industries or verticals for open source initiatives.

And yet, many organizations have not embraced open source at all. Some see it as a risky undertaking: they lack a strategy to move forward, need pathways to see the value proposition of free and open source software, and must migrate from a risk point of view to a value point of view. In addition to challenges with open source consumption, many organizations prohibit their employees from making open source contributions, either on the organization’s behalf or personally in the employee’s spare time.

To help guide organizations through their own open source journeys, Ibrahim Haddad, Ph.D., Executive Director of LF AI & Data, has written a report that offers a practical and systematic approach to establishing an OSS strategy, which includes developing an implementation plan and accelerating an organization’s open source efforts. 

The past two decades have been critical for open source software in enterprise engagement and adoption. The challenge for organizations is their transition from ad hoc and incidental adoption to open source value delivered back to the business using a strategic and planned methodology. This report delivers on the promise of helping enterprises establish an open source strategy, develop and execute an implementation plan, and accelerate their open source efforts to support their business goals. 

Ibrahim Haddad, Ph.D.

This research is a collection of learnings and best practices that Dr. Haddad has developed, collaborating with the LF AI & Data community members who have pursued their own open source journeys for years.

Effective organizations have guided their open source usage through strategy, honed over time with communities such as LF AI & Data and the TODO Group to guide their ongoing use of OSS and their engagement with the open source ecosystem.

This report helps to address the fears of transitioning to open source and explore the many opportunities it offers by covering the following topics:

  • The business case for open source software
  • How to develop an open source strategy
  • Creating an open source program office
  • Implementing an open source strategy
  • Measuring success with open source
  • Best practices for organizational involvement in open source projects


When people find out I work at the Linux Foundation, they invariably ask what we do. Sometimes it is couched in the question, “As in the Linux operating system?” I explain open source software and try to capture its worldwide impact in 20 seconds before I lose their attention. If they happen to stick around for more, we often dig into the question, “Why would enterprises want to participate in open source software projects or use open source software?” The reality is they do, whether they know it or not. And thousands of companies donate their code to open source projects and invest time and resources helping to further develop and improve open source software.

How extensively used is open source software?

To quote from our recently released report, A Guide to Enterprise Open Source, “Open source software (OSS) has transformed our world and become the backbone of our digital economy and the foundation of our digital world. From the Internet and the mobile apps we use daily to the operating systems and programming languages we use to build the future, OSS has played a vital role. It is the lifeblood of the technology industry. Today, OSS powers the digital economy and enables scientific and technological breakthroughs that improve our lives. It’s in our phones, our cars, our airplanes, our homes, our businesses, and our governments. But just over two decades ago, few people had ever heard of OSS, and its use was limited to a small group of dedicated enthusiasts.”

Open source software (OSS) has transformed our world and become the backbone of our digital economy and the foundation of our digital world.

But what does this look like practically:

  • In vertical software stacks across industries, open source penetration ranges from 20 to 85 percent of the overall software used
  • Linux fuels 90%+ of web servers and Internet-connected devices
  • The Android mobile operating system is built on the Linux kernel
  • Immensely popular libraries and tools to build web applications, such as: AMP, Appium, Dojo, jQuery, Marko, Node.js and so many more are open source
  • The world’s top 100 supercomputers run Linux
  • 100% of mainframe customers use Linux
  • The major cloud-service providers – AWS, Google, and Microsoft – all utilize open-source software to run their services and host open-source solutions delivered through the cloud

Why do companies want to participate in open source software projects?

Companies primarily participate in open source software projects in three ways:

  1. They donate software they created to the open source community
  2. They provide direct funding and/or allocate software developers and other staff to contribute to open source software projects

The question often asked is, why wouldn’t they want to keep all of their software proprietary or only task their employees to work on their proprietary software?

The 30,000-foot answer is that it is about organizations coming together to collectively solve common problems so they can separately innovate and differentiate on top of the common baseline. They see that they are better off pooling resources to make the baseline better. Sometimes it is called “coopetition.” It generally means that while companies may be in competition with each other in certain areas, they can still cooperate on others.

It is about organizations coming together to collectively solve common problems so they can separately innovate and differentiate

Some old-school examples of this principle:

  • Railroads agreed on a common track size and build so they could all utilize the same lines and so equipment was interchangeable
  • Before digital cameras, companies innovated and differentiated on film and cameras, but they all agreed on the spacing for the sprockets to advance the film
  • The entertainment industry united around the VHS and Blu-Ray formats over their rivals

Now, we see companies, organizations, and individuals coming together to solve problems while simultaneously improving their businesses and products:

  • Let’s Encrypt is a free, automated, and open certificate authority with the goal of dramatically increasing the use of secure web protocols by making them much easier and less expensive to set up. They are serving 225+ million websites, issuing ~1.5 million certificates each day on average.
  • The Academy Software Foundation creates value in the film industry through collectively engineering software that powers much of the entertainment, gaming, and media industry productions and open standards needed for growth.
  • The Hyperledger Foundation hosts enterprise-grade blockchain software projects, notably using significantly fewer energy resources than other popular solutions.
  • LF Energy is making the electric grid more modular, interoperable, and scalable to help increase the use of renewable energy sources
  • Dronecode is enabling the development of drone software so companies can use their resources to innovate further
  • OpenSSF brings top technology companies together to strengthen the security and resiliency of open source software
  • Kubernetes was donated by Google and is the go-to solution for managing cloud-based software

These are just a small sampling of the open source software projects that enterprises are participating in. You can explore all of the ones hosted at the Linux Foundation here.

How can companies effectively use and participate in open source software projects?

Enterprises looking to better utilize and participate in open source projects can look to the Linux Foundation’s resources to help. Much of what organizations need to know is provided in the just-published report, A Guide to Enterprise Open Source. The report is packed with information and insights from open source leaders at top companies with decades of combined experience. It includes chapters on these topics:

  • Leveraging Open Source Software
  • Preparing the Enterprise for Open Source
  • Developing an Open Source Strategy
  • Setting Up Your Infrastructure for Implementation
  • Setting Up Your Talent for Success
  • Challenges

Additionally, the Linux Foundation offers many open source training courses, events throughout the year, the LFX Platform, and hosts projects that help organizations manage open source utilization and participation, such as:

  • The TODO Group provides resources to set up and run an open source program office, including their extensive guides
  • The OpenChain Project maintains an international standard for sharing which software package licenses are included in a larger package, including information on the various licensing requirements, so enterprises can ensure they are complying with all of the legal requirements
  • The FinOps Foundation is fostering an, “evolving cloud financial management discipline and cultural practice that enables organizations to get maximum business value by helping engineering, finance, technology, and business teams to collaborate on data-driven spending decisions.”
  • The Software Package Data Exchange (SPDX) is an open standard for communicating software bills of materials (SBOMs) so it is clear to every user which pieces of software are included in the overall package.

Again, this is just a snippet of the projects at the Linux Foundation that are working to help organizations adopt, utilize, contribute to, and donate open source projects.

The bottom line: Enterprises are increasingly turning to open source software projects to solve common problems and innovate beyond the baseline, and the Linux Foundation is here to help.