There is an exciting convergence in the networking industry around open source, and the energy is palpable. At LF Networking, we have a unique perspective as the largest open source initiative in the networking space with the broadest set of projects that make up the diverse and evolving open source networking stack. LF Networking provides platforms and building blocks across the networking industry that enable rapid interoperability, deployment, and adoption and is the nexus for 5G innovation and integration.
LF Networking has now tapped into this confluence of industry efforts to structure a new initiative to develop 5G Super Blueprints for the ecosystem. Major integrations between the building blocks are now underway: between ONAP and O-RAN, Akraino and Magma, Anuket and Kubernetes, and more.
“Super” means that we’re integrating multiple projects, umbrellas (such as LF Edge, Magma, CNCF, O-RAN Alliance, LF Energy, and more) with an end-to-end framework for the underlying infrastructure and application layers across edge, access, and core. This end-to-end integration enables top industry use cases, such as fixed wireless, mobile broadband, private 5G, multi-access, IoT, voice services, network slicing, and more. In short, 5G Super Blueprints are a vehicle to collaborate and create end-to-end 5G solutions.
Major industry verticals banking on this convergence and roadmap include the global telcos that you’d expect, but 5G knows no boundaries, and we’re seeing deep engagement from cloud service providers, enterprise IT, governments, and even energy.
5G is poised to modernize today’s energy grid with awareness monitoring across Distribution Systems and more.
This will roll out in three phases: the first encompasses 5G Core + Multi-access Edge Computing (MEC) using emulators; the second introduces commercial RANs for end-to-end 5G; and the third will integrate Open Radio Access Network (O-RAN).
Super Blueprints Integrate the 5G Open Source Stack from Core to Door (Arpit Joshipura, June 1, 2021)
One of those capabilities, SPDX, directly addresses the requirements of Executive Order sections 4(e), 4(f), and 10(j) for a Software Bill of Materials (SBOM). The SPDX specification is implemented as a file format that identifies the software components within a larger piece of computer software, along with metadata such as the licenses of those components.
SPDX is an open standard for communicating software bill of materials (SBOM) information, including components, licenses, copyrights, and security references. It has a rich ecosystem of existing tools and provides a common format for companies and communities to share important data, streamlining and improving the identification and monitoring of software.
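For readers new to the format, here is a minimal sketch of what the tag-value flavor of an SPDX 2.2 document looks like, assembled in Python. The package name, version, and license are hypothetical, and a real document would carry additional required fields (such as DocumentNamespace and creation information) beyond this sketch.

```python
# Minimal sketch of an SPDX 2.2 tag-value document for a single package.
# The package name, version, and license here are hypothetical examples.

def make_spdx_document(pkg_name, pkg_version, license_id):
    """Return a minimal SPDX tag-value document as a string."""
    lines = [
        "SPDXVersion: SPDX-2.2",
        "DataLicense: CC0-1.0",
        "SPDXID: SPDXRef-DOCUMENT",
        f"DocumentName: {pkg_name}-{pkg_version}-sbom",
        "",
        f"PackageName: {pkg_name}",
        "SPDXID: SPDXRef-Package",
        f"PackageVersion: {pkg_version}",
        f"PackageLicenseConcluded: {license_id}",
        "PackageDownloadLocation: NOASSERTION",
    ]
    return "\n".join(lines)

doc = make_spdx_document("libexample", "1.4.2", "Apache-2.0")
print(doc)
```

Because the format is plain tag-value pairs, both humans and tools can read the same document, which is what makes it practical to exchange across a supply chain.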
SBOMs have numerous use cases. They have frequently been used in areas such as license compliance but are equally useful in security, export control, and broader processes such as mergers and acquisitions (M&A) or venture capital investments. SPDX maintains an active community to support various uses, modeling its governance and activity on the same format that has successfully supported open source software projects over the past three decades.
The LF has been developing and refining SPDX for over ten years and has seen extensive uptake by companies and projects in the software industry. Notable recent examples are the contributions by companies such as Hitachi, Fujitsu, and Toshiba in furthering the standard via optional profiles like “SPDX Lite” in the SPDX 2.2 specification release and in support of the SPDX SBOMs in proprietary and open source automation solutions.
This de facto standard has been submitted to ISO via the Joint Development Foundation using the PAS Transposition process of Joint Technical Committee 1 (JTC1). It is currently in the enquiry phase of the process and can be reviewed on the ISO website as ISO/IEC DIS 5962.
A wide range of open source tooling is available today, and commercial tool options are emerging as well. Companies such as FOSSID and Synopsys have been working with the SPDX format for several years. Open source tools such as FOSSology (source code analysis), OSS Review Toolkit (generation from CI and build infrastructure), Tern (container content analysis), Quartermaster (build extensions), and ScanCode (source code analysis), in addition to the SPDX-tools project, have standardized on SPDX for interchange and participate in the Automated Compliance Tooling (ACT) project umbrella. ACT has been discussed as a community-driven solution for software supply chain security remediation as part of our synopsis of the findings in the Vulnerabilities in the Core study, published by the Linux Foundation and Harvard University LISH in February 2020.
One thing is clear: A software bill of materials that can be shared without friction between different teams and companies will be a core part of software development and deployment in this coming decade. The sharing of software metadata will take different forms, including manual and automated reviews, but the core structures will remain the same.
Standardization in this field, as in others, is the key to success. This domain has an advantage in that we are benefiting from an entire decade of prior work in SPDX. Therefore, the task becomes applying this standard across the various domains rather than creating, expanding, or refining new or budding approaches.
Start using the SPDX specification here: https://spdx.github.io/spdx-spec/. Development of the next revision is underway, so if there’s a use case you can’t represent with the current specification, open an issue; this is the right window for input.
SPDX: It’s Already in Use for Global Software Bill of Materials (SBOM) and Supply Chain Security (Kate Stewart, May 25, 2021)
Linux Foundation Editorial Director Jason Perlow had a chance to speak with Masato Endo, OpenChain Project Automotive Chair and Leader of the OpenChain Project Japan Work Group Promotion Sub Group, about the Japan Ministry of Economy, Trade and Industry’s (METI) recent study on open source software management.
ME: Hi, Jason-san! Thank you for such a precious opportunity. I’m a manager and scrum master in the planning and development department of new services at a Japanese automotive company. We were also working on building the OSS governance structure of the company, including obtaining OpenChain certification.
As an open source community member, I participated in the OpenChain project and was involved in establishing the OpenChain Japan Working Group and Automotive Working Group. Recently, as a leader of the Promotion SG of the Japan Working Group, I am focusing on promoting OSS license compliance in Japan.
In this project, I contribute as a bridge between the Ministry of Economy, Trade and Industry and the members of OSS community projects such as OpenChain.
For example, I recently gave a presentation about OpenChain at a task force meeting and introduced the companies that cooperated with the case study.
ME: METI has jurisdiction over the administration of the Japanese economy and industry. This case study was conducted by a task force within the Commerce and Information Policy Bureau’s Cybersecurity Division that examines software management methods for ensuring cyber-physical security.
ME: METI itself conducted this survey. The Task Force has been considering appropriate software management methods, vulnerability countermeasures, license countermeasures, and so on.
Meanwhile, as the importance of OSS utilization has increased in recent years, it concluded that sharing the knowledge of each company regarding OSS management methods helps solve each company’s problems.
JP: How do Japanese corporations differ from western counterparts in open source culture?
ME: Like Western companies, Japanese companies also use OSS in various technical fields, and OSS has become indispensable. In addition, more than 80 companies have participated in the Japan Working Group of the OpenChain project. As a result, the momentum to promote the utilization of OSS is increasing in Japan.
On the other hand, some survey results show that Japanese companies’ contribution process and support system are delayed compared to Western companies. So, it is necessary to promote community activities in Japan.
Challenge 1: License compliance
When developing software using OSS, it is necessary to comply with the license declared by each OSS component. If companies don’t conduct in-house licensing education and management appropriately, OSS license violations will occur.
Challenge 2: Long-term support
Since the development term of OSS depends on the community’s activities, the support term may be shorter than the product life cycle in some cases.
Challenge 3: OSS supply chain management
Recently, the software supply chain scale has expanded, and there are frequent cases where OSS is included in deliveries from suppliers. OSS information sharing in the supply chain has become important to implement appropriate vulnerability countermeasures and license countermeasures.
JP: Are there initiatives that are working to address these challenges?
ME: In this case study, many companies mentioned license compliance. It was found that each company has established a company-wide system and rules to comply with licenses and provides education to engineers. The best way to do this depends on the industry and the size of the company, but I believe the information from this case study is very useful for companies all over the world.
In addition, it was confirmed that the Software Bill of Materials (SBOM) is becoming more critical for companies from the viewpoint of both vulnerability response and license compliance. Regardless of whether companies are using OSS internally or exchanging software with an external partner, it’s important to clarify which OSS they are using. I recognize that this issue is a hot topic as “software transparency” in Western companies as well.
In this case study, several companies also mentioned OSS supply chain management. In addition to clarifying the rules between companies, it is characterized by working to raise the level of the entire supply chain through community activities such as OpenChain.
JP: What are the benefits of Japanese companies adopting standards such as OpenChain and SPDX?
ME: Companies need to do a wide range of things to ensure proper OSS license compliance, so some guidance is needed. The OpenChain Specification, which has become an ISO standard and serves as a guideline for this, is particularly useful. In fact, several companies that responded to this survey have built an OSS license compliance process based on the OpenChain Specification.
Also, from the perspective of supply chain management, it is thought that if each supply chain company obtains OpenChain certification, software transparency will increase, and appropriate OSS utilization will be promoted.
In addition, by participating in OpenChain’s Japan Working Group, companies can share the best practices of each company and work together to solve problems.
Since SPDX is a leading international standard for SBOM, it is very useful to use it when exchanging information about OSS in the supply chain from the viewpoint of compatibility.
Japanese companies use the SPDX standard and actively contribute to the formulation of SPDX specifications like SPDX Lite.
Nearly a year after the Internet Engineering Task Force took up a plan to replace words that could be considered racist, the debate is still raging.
Anyone who joined a video call during the pandemic probably has a global volunteer organization called the Internet Engineering Task Force to thank for making the technology work. The group, which helped create the technical foundations of the internet, designed the language that allows most video to run smoothly online. It made it possible for someone with a Gmail account to communicate with a friend who uses Yahoo, and for shoppers to safely enter their credit card information on e-commerce sites.
Now the organization is tackling an even thornier issue: getting rid of computer engineering terms that evoke racist history, like “master” and “slave” and “whitelist” and “blacklist.”
But what started as an earnest proposal has stalled as members of the task force have debated the history of slavery and the prevalence of racism in tech. Some companies and tech organizations have forged ahead anyway, raising the possibility that important technical terms will have different meanings to different people — a troubling proposition for an engineering world that needs broad agreement so technologies work together.
While the fight over terminology reflects the intractability of racial issues in society, it is also indicative of a peculiar organizational culture that relies on informal consensus to get things done.
The Internet Engineering Task Force eschews voting, and it often measures consensus by asking opposing factions of engineers to hum during meetings. The hums are then assessed by volume and ferocity. Vigorous humming, even from only a few people, could indicate strong disagreement, a sign that consensus has not yet been reached.
The I.E.T.F. has created rigorous standards for the internet and for itself. Until 2016, it required the documents in which its standards are published to be precisely 72 characters wide and 58 lines long, a format adapted from the era when programmers punched their code into paper cards and fed them into early IBM computers.
“We have big fights with each other, but our intent is always to reach consensus,” said Vint Cerf, one of the founders of the task force and a vice president at Google. “I think that the spirit of the I.E.T.F. still is that, if we’re going to do anything, let’s try to do it one way so that we can have a uniform expectation that things will function.”
The group is made up of about 7,000 volunteers from around the world. It has two full-time employees, an executive director and a spokesman, whose work is primarily funded by meeting dues and the registration fees of dot-org internet domains. It cannot force giants like Amazon or Apple to follow its guidance, but tech companies often choose to do so because the I.E.T.F. has created elegant solutions for engineering problems.
Its standards are hashed out during fierce debates on email lists and at in-person meetings. The group encourages participants to fight for what they believe is the best approach to a technical problem.
While shouting matches are not uncommon, the Internet Engineering Task Force is also a place where young technologists break into the industry. Attending meetings is a rite of passage, and engineers sometimes leverage their task force proposals into job offers from tech giants.
In June, against the backdrop of the Black Lives Matter protests, engineers at social media platforms, coding groups and international standards bodies re-examined their code and asked themselves: Was it racist? Some of their databases were called “masters” and were surrounded by “slaves,” which received information from the masters and answered queries on their behalf, preventing them from being overwhelmed. Others used “whitelists” and “blacklists” to filter content.
Mallory Knodel, the chief technology officer at the Center for Democracy and Technology, a policy organization, wrote a proposal suggesting that the task force use more neutral language. Invoking slavery was alienating potential I.E.T.F. volunteers, and the terms should be replaced with ones that more clearly described what the technology was doing, argued Ms. Knodel and the co-author of her proposal, Niels ten Oever, a postdoctoral researcher at the University of Amsterdam. “Blocklist” would explain what a blacklist does, and “primary” could replace “master,” they wrote.
On an email list, responses trickled in. Some were supportive. Others proposed revisions. And some were vehemently opposed. One respondent wrote that Ms. Knodel’s draft tried to construct a new “Ministry of Truth.”
Amid insults and accusations, many members announced that the battle had become too toxic and that they would abandon the discussion.
The pushback didn’t surprise Ms. Knodel, who had proposed similar changes in 2018 without gaining traction. The engineering community is “quite rigid and averse to these sorts of changes,” she said. “They are averse to conversations about community comportment, behavior — the human side of things.”
In July, the Internet Engineering Task Force’s steering group issued a rare statement about the draft from Ms. Knodel and Mr. ten Oever. “Exclusionary language is harmful,” it said.
A month later, two alternative proposals emerged. One came from Keith Moore, an I.E.T.F. contributor who initially backed Ms. Knodel’s draft before creating his own. His draft cautioned that fighting over language could bottleneck the group’s work and argued for minimizing disruption.
The other came from Bron Gondwana, the chief executive of the email company Fastmail, who said he had been motivated by the acid debate on the mailing list.
“I could see that there was no way we would reach a happy consensus,” he said. “So I tried to thread the needle.”
Mr. Gondwana suggested that the group should follow the tech industry’s example and avoid terms that would distract from technical advances.
Last month, the task force said it would create a new group to consider the three drafts and decide how to proceed, and members involved in the discussion appeared to favor Mr. Gondwana’s approach. Lars Eggert, the organization’s chair and the technical director for networking at the company NetApp, said he hoped guidance on terminology would be issued by the end of the year.
The rest of the industry isn’t waiting. The programming community that maintains MySQL, a type of database software, chose “source” and “replica” as replacements for “master” and “slave.” GitHub, the code repository owned by Microsoft, opted for “main” instead of “master.”
In July, Twitter also replaced a number of terms after Regynald Augustin, an engineer at the company, came across the word “slave” in Twitter’s code and advocated change.
But while the industry abandons objectionable terms, there is no consensus about which new words to use. Without guidance from the Internet Engineering Task Force or another standards body, engineers decide on their own. The World Wide Web Consortium, which sets guidelines for the web, updated its style guide last summer to “strongly encourage” members to avoid terms like “master” and “slave,” and the IEEE, an organization that sets standards for chips and other computing hardware, is weighing a similar change.
Other tech workers are trying to solve the problem by forming a clearinghouse for ideas about changing language.
That effort, the Inclusive Naming Initiative, aims to provide guidance to standards bodies and companies that want to change their terminology but don’t know where to begin.
The group got together while working on an open-source software project, Kubernetes, which like the I.E.T.F. accepts contributions from volunteers. Like many others in tech, it began the debate over terminology last summer.
“We saw this blank space,” said Priyanka Sharma, the general manager of the Cloud Native Computing Foundation, a nonprofit that manages Kubernetes. Ms. Sharma worked with several other Kubernetes contributors, including Stephen Augustus and Celeste Horgan, to create a rubric that suggests alternative words and guides people through the process of making changes without causing systems to break. Several major tech companies, including IBM and Cisco, have signed on to follow the guidance.
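The guidance described above, making changes without causing systems to break, often comes down to a familiar deprecation pattern: accept both the legacy name and its replacement during a transition period. A hedged sketch of that pattern, with hypothetical configuration keys not drawn from any specific project:

```python
# Sketch of a backward-compatible rename: a config reader that accepts
# both the legacy key ("blacklist") and its replacement ("blocklist"),
# so existing deployments keep working while the new term is adopted.
# The key names and warning text are illustrative.
import warnings

def get_blocklist(config):
    """Read the blocklist, honoring the deprecated legacy key."""
    if "blocklist" in config:
        return config["blocklist"]
    if "blacklist" in config:  # legacy key, still accepted for now
        warnings.warn("'blacklist' is deprecated; use 'blocklist'",
                      DeprecationWarning)
        return config["blacklist"]
    return []

# Old and new configurations both keep working during the migration.
print(get_blocklist({"blocklist": ["spam.example"]}))
print(get_blocklist({"blacklist": ["spam.example"]}))
```

Once consumers have migrated, the legacy branch (and its warning) can be removed in a later release.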
Although the Internet Engineering Task Force is moving more slowly, Mr. Eggert said it would eventually establish new guidelines. But the debate over the nature of racism — and whether the organization should weigh in on the matter — has continued on its mailing list.
In a subversion of an April Fools’ Day tradition within the group, several members submitted proposals mocking diversity efforts and the push to alter terminology in tech.
Two prank proposals were removed hours later because they were “racist and deeply disrespectful,” Mr. Eggert wrote in an email to task force participants, while a third remained up.
“We build consensus the hard way, so to speak, but in the end the consensus is usually stronger because people feel their opinions were reflected,” Mr. Eggert said. “I wish we could be faster, but on topics like this one that are controversial, it’s better to be slower.”
Kate Conger is a technology reporter in the San Francisco bureau, where she covers the gig economy and social media. @kateconger
‘Master,’ ‘Slave’ and the Fight Over Offensive Terms in Computing (Kate Conger, New York Times, April 13, 2021)
Over the past several decades, farmers have depended increasingly on groundwater to irrigate their crops due to climate change and reduced rainfall. Even in drought-prone areas, farmers continue to need to grow water-intensive crops because these crops have steady demand.
In 2019, as part of Call for Code, a team of IBMers came together and brainstormed on ideas they were passionate about – problems faced by farmers in developing countries due to more frequent drought conditions. The team designed an end-to-end solution that focuses on helping farmers gain insight into when to water their crops and help them optimize their water usage to grow healthy crops. This team, Liquid Prep, went on to win the IBM employee Call for Code Global Challenge.
Liquid Prep provides a mobile application that can obtain soil moisture data from a portable soil moisture sensor, fetch weather information from The Weather Company, and access crop data through a service deployed on the IBM Cloud. Their solution brings all this data together, analyzes it, and computes watering guidance to help the farmer decide whether to water their crops right now or conserve it for a better time.
To validate the Liquid Prep prototype, in December 2019, one of the team members traveled to India and interviewed several farmers in the village of Nuggehalli, near the town of Hirisave in the Hassan district of Karnataka, India. The interviews taught the team that the farmers did not have detailed information on when they should water their specific crops and by how much, as they didn’t know the specific needs on a plant-by-plant basis. They also let the water run freely if it was available from a nearby source, such as a river or stream, and some were entirely dependent on rainfall. The farmers expressed great interest in the described Liquid Prep solution, as it could empower them to make more informed decisions that could improve yields.
A prototype is born
After winning the challenge, the Liquid Prep team took the opportunity to turn the concept into a more complete prototype through an IBM Service Corps engagement. The team was expanded with dedicated IBM volunteers from across the company, who were assigned to optimize Liquid Prep from August through October 2020. During this time, the team developed the minimum viable product (MVP) for the mobile solution.
The prototype consists of three primary components:
A hardware sensor to measure soil moisture
A highly visual and easy-to-use mobile web application, and
A back-end data service to power the app.
It works like this: the mobile web application gets soil moisture data from the soil moisture sensor. The app requests environmental conditions from The Weather Company and crop data from the plant database via the backend service deployed on the IBM Cloud. The app analyzes and computes a watering schedule to help the farmer decide if they should water their crops now or at a later time.
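The decision step described above can be sketched very simply. This is an illustrative example only: the function name, thresholds, and rain cutoff below are hypothetical, and Liquid Prep's actual analysis combines sensor, weather, and crop data in more detail.

```python
# Illustrative sketch of the kind of watering guidance described above.
# Thresholds and the rain-probability cutoff are hypothetical.

def watering_advice(soil_moisture_pct, crop_min_moisture_pct,
                    rain_probability_pct):
    """Return simple advice from soil moisture, crop needs, and forecast."""
    if soil_moisture_pct >= crop_min_moisture_pct:
        return "no watering needed"
    if rain_probability_pct >= 70:  # rain likely; conserve water
        return "wait for forecast rain"
    return "water now"

# Dry soil, little chance of rain: advise watering now.
print(watering_advice(18, 30, 10))
```

The value of the app is in supplying the inputs: the sensor provides soil moisture, The Weather Company provides the forecast, and the crop database provides each plant's moisture needs.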
Liquid Prep has developed a great working relationship with partners SmartCone Technologies, Inc., and Central New Mexico Community College. Students in the Deep Dive Coding Internet of Things (IoT) Bootcamp at CNM are designing, developing, and producing a robust IoT sensor housed in the shape of a stick, which can be inserted into the soil and transfers the soil moisture data to the Liquid Prep mobile app via Bluetooth. The collaboration gives students important real-world experience before they enter the workforce.
“SmartCone is honored to be part of this project. This is a perfect example of technology teams working together to help make the world a better place,” said Jason Lee, Founder & CEO, SmartCone Technologies Inc.
Additionally, Liquid Prep will work with J&H Nixon Farms, which largely grows soybean and corn crops on about 2,800 acres of agricultural land in Ottawa, Canada. They have offered Liquid Prep the opportunity to pilot test the prototype on several plots of land with different soil conditions, which in turn can expand the breadth of recommendation options to a larger number of potential users.
Now available as open source
Liquid Prep is now available as an open source project hosted by the Linux Foundation. The goal of the project is to help farmers globally farm their crops with the least amount of water by taking advantage of real-time information that can help improve sustainability and build resiliency to climate change.
Participation is welcomed from software developers, designers, testers, agronomists/agri experts/soil experts, IoT engineers, researchers, students, farmers, and others who can help improve the quality and value of the solution for small farmers around the world. Key areas the team is interested in developing include localizing the mobile app, considering soil properties to improve the watering advice, updating project documentation, software and hardware testing, more in-depth research, and adding more crop data to the database.
Liquid Prep intelligent watering solution now hosted by the Linux Foundation as a Call for Code project (The Linux Foundation, March 22, 2021)
Every month, there seems to be a new software vulnerability showing up on social media, causing open source program offices and security teams to query their inventories to see how the FOSS components they use may impact their organizations.
Frequently this information is not available in a consistent format within an organization for automatic querying and may result in a significant amount of email and manual effort. By exchanging software metadata in a standardized software bill of materials (SBOM) format between organizations, automation within an organization becomes simpler, accelerating the discovery process and uncovering risk so that mitigations can be considered quickly.
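Once SBOM data is in a consistent format, the discovery step described above reduces to a simple lookup. A sketch of that idea, using made-up inventory data rather than any real product catalog:

```python
# Sketch of the automated query the paragraph describes: given SBOM
# data normalized into a common structure, finding which products ship
# an affected component becomes a simple lookup. Inventory is made up.

inventory = {
    "product-a": [("openssl", "1.1.1k"), ("zlib", "1.2.11")],
    "product-b": [("zlib", "1.2.8")],
}

def products_using(component, inventory):
    """Return the products whose SBOM lists the named component."""
    return sorted(
        product
        for product, components in inventory.items()
        if any(name == component for name, _ in components)
    )

# Which products would a zlib vulnerability affect?
print(products_using("zlib", inventory))
```

The hard part in practice is not this query but getting every supplier to deliver metadata in the same structure, which is exactly what a standardized SBOM format provides.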
In the last year, we’ve also seen standards like OpenChain (ISO/IEC 5230:2020) gain adoption in the supply chain. Customers have started asking for a bill of materials from their suppliers as part of negotiation and contract discussions to conform to the standard. OpenChain focuses on ensuring that there is sufficient information for license compliance and, as a result, expects metadata for the distributed components as well. A software bill of materials can be used to support the systematic review and approval of each component’s license terms, clarifying the obligations and restrictions that apply to the distribution of the supplied software and reducing risk.
Kate Stewart, VP, Dependable Embedded Systems, The Linux Foundation, will host a complimentary mentorship webinar entitled Generating Software Bill Of Materials on Thursday, March 25 at 7:30 am PST. This session will work through the minimum elements included in a software bill of materials and detail the reasoning behind why those elements are included. To register, please click here.
There are many ways this software metadata can be shared. The common SBOM document format options (SPDX, SWID, and CycloneDX) will be reviewed so that participants, especially those just starting out, can better understand what is available.
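To give a feel for how the formats differ, here is a sketch of a minimal CycloneDX-style BOM as JSON, alongside the tag-value style SPDX uses. The field names follow the CycloneDX JSON schema as of version 1.4, but the component details are hypothetical and a production BOM would include far more metadata.

```python
# Minimal CycloneDX-style JSON BOM assembled as a plain dict.
# Top-level field names follow the CycloneDX 1.4 JSON schema;
# the component listed is a hypothetical example.
import json

bom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {"type": "library", "name": "libexample", "version": "1.4.2"},
    ],
}
print(json.dumps(bom, indent=2))
```

Whichever format you choose, the goal is the same: a machine-readable component list that downstream consumers can ingest without manual effort.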
This mentorship session will work through some simple examples and then guide where to find the next level of details and further references.
At the end of this session, participants will be on a secure footing and a path towards the automated generation of SBOMs as part of their build and release processes in the future.
Generating a Software Bill of Materials (SBOM) with Open Source Standards and Tooling (The Linux Foundation, March 16, 2021)
In mid-February, the Linux Foundation announced it had signed a collaboration agreement with the Defense Advanced Research Projects Agency (DARPA), enabling US Government suppliers to collaborate on a common open source platform that will enable the adoption of 5G wireless and edge technologies by the government. Governments face similar issues to enterprise end-users — if all their suppliers deliver incompatible solutions, the integration burden escalates exponentially.
The first collaboration, Open Programmable Secure 5G (OPS-5G), currently in the formative stages, will be used to create open source software and systems enabling end-to-end 5G and follow-on mobile networks.
The road to open source influencing 5G: The First, Second, and Third Waves of Open Source
If we examine the history of open source, it is informative to observe it from the perspective of evolutionary waves. Many open-source projects began as single technical projects, with specific objectives, such as building an operating system kernel or an application. This isolated, single project approach can be viewed as the first wave of open source.
We can view the second wave of open source as creating platforms seeking to address a broad horizontal solution, such as a cloud or networking stack or a machine learning and data platform.
The third wave of open source collaboration goes beyond isolated projects and integrates them for a common platform for a specific industry vertical. Additionally, the third wave often focuses on reducing fragmentation — you commonly will see a conformance program or a specification or standard that anyone in the industry can cite in procurement contracts.
Industry conformance becomes important as specific solutions are taken to market and as cross-industry solutions are built, especially now that we have technologies requiring cross-industry interaction, such as end-to-end 5G, the edge, or even cloud-native applications and environments that span industry verticals.
The third wave of open source also seeks to provide comprehensive end-to-end solutions for enterprises and verticals, large institutional organizations, and government agencies. In this case, the community of government suppliers will build an open source 5G stack used in enterprise networking applications. End-to-end open source integration and collaboration, supported by commercial investment in innovative products, services, and solutions, accelerates technology adoption and transformation.
Why DARPA chose to partner with the Linux Foundation
DARPA, at the US Department of Defense, has tens of thousands of contractors supplying networking solutions for government facilities and remote locations. However, it doesn’t want dozens, hundreds, or thousands of unique and incompatible hardware and software solutions originating from its large contractor and supplier ecosystem. Instead, it wants a portable, open-access standard that provides transparency and enables advanced software tools and systems to be applied to a common code base that various groups in the government can build on. The goal is a common framework that decouples hardware and software requirements and enables adoption by more groups within the government.
Naturally, as a large end-user, the government wants its suppliers to focus on delivering secure solutions. A common framework can reduce security complexity compared with maintaining disparate, fragmented systems.
The support programs at the Linux Foundation provide the key foundations for a shared pool of community innovation. These programs include IP structure and legal frameworks, an open and transparent development process, neutral governance, conformance, and DevOps infrastructure for end-to-end project lifecycle and code management. The Linux Foundation is therefore uniquely suited to be the home for a community-driven effort to define an open source 5G end-to-end architecture, create and run the open source projects that embody that architecture, and support its integration to scale out and accelerate adoption.
The foundations of a complete open source 5G stack
The Linux Foundation has worked with the telecommunications industry since early in its existence, starting with the Carrier Grade Linux initiative to identify telco requirements and build the features the Linux kernel needed to address them. In 2013, the Linux Foundation’s open source networking platform started with bespoke projects such as OpenDaylight, the software-defined networking controller. OPNFV (now Anuket), the network function virtualization stack, was introduced in 2014-2015, followed by the first release of Tungsten Fabric, the automated software-defined networking stack. FD.io, the secure networking data plane, was announced in 2016 as a sister project of the Data Plane Development Kit (DPDK), which was released into open source in 2010.
At the time, the telecom/network and wireless carrier industry sought to commoditize and accelerate innovation across a specific piece of the stack as software-defined networking became part of their digital transformation. Since the introduction of these projects at LFN, the industry has seen heavy adoption and significant community contribution by the largest telecom carriers and service providers worldwide. This history is chronicled in detail in our whitepaper, Software-Defined Vertical Industries: Transformation Through Open Source.
The member companies’ work will require robust frameworks for ensuring that changes to these projects are contributed back upstream into the source projects. Upstreaming, a key benefit of open source collaboration, allows contributions specific to this 5G effort to roll back into their originating projects, improving the software for every end-user and effort that uses them.
The Linux Foundation networking stack continues to evolve and expand into additional projects due to an increased desire to innovate and commoditize across key technology areas through shared investments among its members. In February 2021, Facebook contributed the Magma project, which, unlike the platform infrastructure projects listed above, is a network function application that is core to 5G network operations.
The E2E 5G Super Blueprint is being developed by the LFN Demo working group. This is an open collaboration and we encourage you to join us. Learn more here.
Building through organic growth and cross-pollination of the open source networking and cloud community
Tier 2 operators, rural operators, and governments worldwide want to reap the benefits of economic innovation as well as potential cost-savings from 5G. How is this accomplished?
With this joint announcement and its DARPA supplier community collaboration, the Linux Foundation’s existing projects can help serve the requirements of other large end-users. Open source communities are advancing and innovating some of the most important and exciting technologies of our time. It’s always interesting to have an opportunity to apply the results of these communities to new use cases.
The Linux Foundation understands the critical dynamic of cross-pollination between community-driven open source projects needed to help make an ecosystem successful. Its proven governance model has demonstrated the ability to maintain and mature open source projects over time and make them all work together in one single, cohesive ecosystem.
As a broad set of contributors work on components of an open source stack for 5G, there will be cross-community interactions. For example, Project EVE, the cloud-native edge computing platform, could work with Project Zephyr, the scalable real-time operating system (RTOS) kernel, so that EVE can orchestrate Zephyr devices. It all depends on contributors’ self-interest and motivation to contribute functionality that enables these projects to work together. Similarly, ONAP, the network automation/orchestration platform, is tightly integrated with Akraino so that architectural deployment templates are built around network edge clouds and multi-edge clouds.
An open source platform has implications not just for new business opportunities for government suppliers but also for other institutions. The projects within an open source platform have open interfaces that can be integrated and used with other software, so other large end-users, such as the World Bank, can take validated and tested architectural blueprints and deploy effective 5G solutions in many host countries, giving them a turnkey stack. This will enable them to encourage providers, through competition or challenges native to their in-country commercial ecosystems, to implement those networks.
This is a truly solutions-oriented open source 5G stack for enterprises, governments, and the world.
How open source communities are driving 5G’s future, even within a large government like the US (The Linux Foundation, March 11, 2021)
Today, the Linux Foundation announced that it would be adding Rend-o-matic to the list of Call for Code open source projects that it hosts. The Rend-o-matic technology was originally developed as part of the Choirless project during a Call for Code challenge as a way to enable musicians to jam together regardless of where they are. Initially created to help musicians socially distance because of COVID-19, the application has many other benefits, including bringing together musicians from different parts of the world and allowing for multiple versions of a piece of music featuring various artist collaborations. The artificial intelligence powering Choirless ensures that the consolidated recording stays accurately synchronized even through long compositions, and this is just one of the pieces of software being released under the new Rend-o-matic project.
Created by a team of musically-inclined IBM developers, the Rend-o-matic project features a web-based interface that allows artists to record their individual segments via a laptop or phone. The individual segments are processed using acoustic analysis and AI to identify common patterns across multiple segments which are then automatically synced and output as a single track. Each musician can record on their own time in their own place with each new version of the song available as a fresh MP3 track. In order to scale the compute needed by the AI, the application uses IBM Cloud Functions in a serverless environment that can effortlessly scale up or down to meet demand without the need for additional infrastructure updates. Rend-o-matic is itself built upon open source technology, using Apache OpenWhisk, Apache CouchDB, Cloud Foundry, Docker, Python, Node.js, and FFmpeg.
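The project’s full pipeline relies on acoustic analysis and AI, but the core idea of syncing independently recorded takes can be illustrated with a much simpler, classic technique: cross-correlating two recordings to estimate their relative delay. The sketch below is a hypothetical simplification (the function names are not from the Rend-o-matic code base), assuming the segments have already been decoded, e.g. via FFmpeg, into NumPy sample arrays:

```python
import numpy as np

def estimate_offset(reference: np.ndarray, segment: np.ndarray) -> int:
    """Estimate how many samples `segment` lags behind `reference`
    by locating the peak of their cross-correlation."""
    corr = np.correlate(segment, reference, mode="full")
    # Shift the peak index so that 0 means "already aligned".
    return int(np.argmax(corr)) - (len(reference) - 1)

def align(reference: np.ndarray, segment: np.ndarray) -> np.ndarray:
    """Shift `segment` so it lines up with `reference`."""
    offset = estimate_offset(reference, segment)
    if offset > 0:
        return segment[offset:]  # segment starts late: trim the lead
    return np.concatenate([np.zeros(-offset), segment])  # early: pad with silence

# Toy example: a click at sample 100 in the reference, sample 130 in a late take.
ref = np.zeros(1000); ref[100] = 1.0
take = np.zeros(1000); take[130] = 1.0
aligned = align(ref, take)  # click now falls at sample 100
```

In practice the service must also cope with tempo drift, background noise, and many simultaneous parts, which is where the acoustic analysis and AI come in; a single cross-correlation peak is only the starting point.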
Since its creation, Choirless has been incubated and improved as a Call for Code project, with an enhanced algorithm, increased availability, real-time audio-level visualizations, and more. The solution has been released for testing, and as of January, users of the hosted Choirless service built upon the Rend-o-matic project – including school choirs, professional musicians, and bands – have recorded 2,740 individual parts forming 745 distinct performances.
Call for Code invites developers and problem-solvers around the world to build and contribute to sustainable, open source technology projects that address social and humanitarian issues while ensuring the top solutions are deployed to make a demonstrable difference. Learn more about Call for Code. You can learn more about Rend-o-matic, sample the technology, and contribute back to the project at https://choirless.github.io/
New open source project helps musicians jam together even when they’re not together (The Linux Foundation, March 11, 2021)
Today the Linux Foundation announced that it would be hosting seven projects that originated at Call for Code for Racial Justice, an initiative driven by IBM and Creator David Clark Cause to urge the global developer ecosystem and open source community to contribute to solutions that can help confront racial inequalities.
Launched by IBM in October 2020, Call for Code for Racial Justice facilitates the adoption and innovation of open source projects by developers, ecosystem partners, and communities across the world to promote racial justice across three distinct focus areas: Police & Judicial Reform and Accountability; Diverse Representation; and Policy & Legislation Reform.
The initiative builds upon Call for Code, created by IBM in 2018, which has grown to over 400,000 developers and problem solvers in 179 countries in partnership with Creator David Clark Cause, Founding Partner IBM, Charitable Partner United Nations Human Rights, and the Linux Foundation.
As part of today’s announcement, the Linux Foundation and IBM unveiled two new solution starters, Fair Change and TakeTwo:
Fair Change is a platform to help record, catalog, and access evidence of potentially racially charged incidents to enable transparency, reeducation, and reform as a matter of public interest and safety. For example, real-world video footage related to routine traffic stops, stop and search, or other scenarios may be recorded and accessed by the involved parties and authorities to determine whether the incidents were handled in a biased manner. Fair Change consists of a mobile application for iOS and Android built using React Native and an API for capturing data from various sources built using Node.js. It also includes a website with a geospatial map view of incidents built using Google Maps and React. Data can be stored in a cloud-hosted database and object store. Visit the tutorial or project page to learn more.
TakeTwo aims to help mitigate digital content bias, whether overt or subtle, focusing on text across news articles, headlines, web pages, blogs, and even code. The solution is designed to leverage directories of inclusive terms compiled by trusted sources like the Inclusive Naming Initiative, which the Linux Foundation and CNCF co-founded. The terminology is categorized to train an AI model to enhance its accuracy over time. TakeTwo is built using open source technologies, including Python, FastAPI, and Docker. The API can be run locally with a CouchDB backend database or IBM Cloudant database. IBM has already deployed TakeTwo within the existing IBM Developer tools used to publish new content produced by hundreds of IBMers each week, and is trialing it for IBM Developer website content. Visit the tutorial or project page to learn more.
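As described, TakeTwo pairs curated term directories with a trained model. The rule-based first pass over such a directory can be sketched roughly as below; the category names, term lists, and suggested replacements are illustrative stand-ins, not TakeTwo’s actual data model or API:

```python
import re

# Illustrative directory: category -> {flagged term: suggested replacement}.
# A real deployment would load curated lists, e.g. from the Inclusive
# Naming Initiative, rather than hard-coding them here.
DIRECTORY = {
    "master-slave": {"master": "primary", "slave": "replica"},
    "whitelist-blacklist": {"whitelist": "allowlist", "blacklist": "denylist"},
}

def scan(text: str) -> list[dict]:
    """Flag each occurrence of a directory term, recording its category,
    position in the text, and a suggested replacement."""
    findings = []
    for category, terms in DIRECTORY.items():
        for term, suggestion in terms.items():
            for match in re.finditer(rf"\b{re.escape(term)}\b", text, re.IGNORECASE):
                findings.append({
                    "term": match.group(0),
                    "category": category,
                    "start": match.start(),
                    "suggestion": suggestion,
                })
    return sorted(findings, key=lambda f: f["start"])

findings = scan("Add the new host to the whitelist before deploying.")
```

The machine-learning layer then matters for the cases a plain dictionary cannot judge, such as terms that are problematic only in certain contexts.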
In addition to the two new solution starters, The Linux Foundation will now host five existing and evolving open source projects from Call for Code for Racial Justice:
Five-Fifths Voter: This web app empowers minorities to exercise their right to vote and ensures their voice is heard by determining optimal voting strategies and limiting suppression issues.
Legit-Info: Local legislation can significantly impact areas as far-reaching as jobs, the environment, and safety. Legit-Info helps individuals understand the legislation that shapes their lives.
Incident Accuracy Reporting System: This platform allows witnesses and victims to corroborate evidence or provide additional information from multiple sources against an official police report.
Open Sentencing: To help public defenders better serve their clients and make a stronger case, Open Sentencing shows racial bias in data such as demographics.
Truth Loop: This app helps communities simply understand the policies, regulations, and legislation that will impact them the most.
These projects were built using open source technologies that include Red Hat OpenShift, IBM Cloud, IBM Watson, blockchain ledger, Node.js, Vue.js, Docker, Kubernetes, and Tekton. The Linux Foundation and IBM ask developers and ecosystem partners to contribute to these solutions by testing, extending, and implementing them, and by adding their own diverse perspectives and expertise to make them even stronger.
For more information and to begin contributing, please visit:
New Open Source Projects to Confront Racial Justice (The Linux Foundation, February 19, 2021)
Linux Foundation Newsletter February 2021: IT Training & Certification Sale, Shuah Khan & Mentorship, OpenSSF First Six Months (The Linux Foundation, February 12, 2021)