Posts

Open Source Leadership Summit

Share your knowledge, best practices, and strategies at Open Source Leadership Summit.

Open Source Leadership Summit (OSLS) is an invitation-only think tank where open source software and collaborative development thought leaders convene, discuss best practices, and learn how to manage today’s largest shared technology investments.

The Linux Foundation invites you to share your knowledge, best practices, and strategies with fellow open source leaders at OSLS.  

Tracks & Suggested Topics for Open Source Leadership Summit:

OS Program Office

  • Consuming and Contributing to Open Source
  • Driving Participation and Inclusiveness in Open Source Projects
  • Standards and Open Source
  • Managing Competing Corporate Interests while Driving Coherent Communities
  • How to Vet the Viability of OS Projects
  • Open Source + Startup Business Models
  • Project Planning and Strategy
  • Internal vs. External Developer Adoption

Best Practices in Open Source Development / Lessons Learned

  • Contribution Policies
  • Promoting Your Open Source Project
  • Open Source Best Practices
  • Open Source Program Office Case Studies and Success Stories
  • Standards and Open Source

Growing & Sustaining Project Communities / Metrics and Actions Taken

  • Collaboration Models to Address Security Issues
  • Metrics for Understanding Project Health

Automating Compliance / Gaps & Successes

  • Using Trademarks in Open Communities
  • Working with Regulators / Regulated Industries
  • Working with the Government on OS
  • How to Incorporate SPDX Identifiers in Your Project
  • Legal + Compliance
  • Licensing + Patents
  • Successfully Working Upstream & Downstream

Certifying Open Source Projects

  • Security
  • Safety
  • Export
  • Government Restrictions
  • Open Source vs. Open Governance
  • New Frontiers for Open Source in FinTech and Healthcare

Futures

  • Upcoming Trends
  • R&D via Open Source
  • Sustainability

Business Leadership

  • Cultivating Open Source Leadership
  • How to Run a Business that Relies on Open Source
  • How to be an Effective Board Member
  • How to Invest in Your Project’s Success
  • Managing Competing Corporate Interests while Driving Coherent Communities
  • Monetizing Open Source & Innovators Dilemma

View here for more details on suggested topics, and submit your proposal before the Jan. 21 deadline.

Get inspired! Watch keynotes from Open Source Leadership Summit 2017.

See all keynotes from OSLS 2017 »

OpenStack

The OpenStack Foundation team has been thinking about what “open” means for the project. Learn more.

In his keynote at OpenStack Summit in Australia, Jonathan Bryce (Executive Director of the OpenStack Foundation) stressed the meaning of both “Open” and “Stack” in the project’s name and focused on the importance of collaboration within the OpenStack ecosystem.

OpenStack has enjoyed unprecedented success since its early days. It has excited the IT industry about applications at scale and created new ways to consume the cloud. The adoption rate of OpenStack and the growth of its community have outpaced even those of the biggest open source project on the planet, Linux. In its short life of six years, OpenStack has achieved more than Linux did in a similar time span.

So, why does OpenStack need to redefine the meaning of the project and stress collaboration? Why now?

“We have reached a point where the technology has proven itself,” said Mark Collier, the CTO of the OpenStack Foundation. “You have seen all the massive use cases of OpenStack all around the globe.”

Collier said that the OpenStack community is all about solving problems. Although they continue to refine compute, storage, and networking, they also look beyond that.

With big adoption and big growth come new challenges. The OpenStack community and the OpenStack Foundation responded to those challenges, and the project transformed along with changing market dynamics, evolving from integrated release to big tent to composability.

OpenStack community

One of the things that the Foundation team has been doing this year is thinking about what “open” means for the project. In the past five years, OpenStack has built a great community around it. There are more than 82,000 people from around the globe who are part of this huge community. The big question for the Foundation was, what’s next for the coming five years? The first thing that they looked at was what got them to this position.

When you put this all into context, Bryce’s stress on openness and collaboration makes sense. In an interview with The Linux Foundation, Bryce said, “We haven’t really talked a lot about our attitude around openness. I think that it’s a little bit overdue because when you look into the technology industry right now you see the term ‘open’ thrown around constantly. The word open gets attached to different products, it gets attached to different vendor conferences because who doesn’t want something that’s open.”

“One of the key things has been those four opens that we use as the pillars of our community:  how we write our code, how we design our systems, how we manage our development process, and how we interact as a community,” said Bryce.

When you look at the stack part of OpenStack, there is no single component that builds the OpenStack cloud; there are many different components that come from different independent open source projects. These components are part of the stack. “We’re building a technology stack but it’s not a rigid stack and it’s not a single approach to doing things. It’s actually a flexible programmable infrastructure technology stack,” Bryce said.

What’s really interesting about these different open source projects is that in most cases they work in silos. Whether it’s KVM or Open vSwitch or Kubernetes, they are developed independently of each other.

“And that’s not a bad thing, actually,” Bryce said, “because you want experts in a topic who are focused on that. This expertise gives you a really good container orchestration system, a really good distributed storage system, a software defined networking system. But users don’t run those things independently. There isn’t a single OpenStack cloud on the planet that only runs software that we wrote in the OpenStack community.”

Staying in sync

One big problem the OpenStack community saw was the gaps between these different projects.

“There are issues to keep in sync between these different open source projects that have different release cadence,” said Bryce. “So far, we’ve left it to users to solve those problems, but we realized we can do better than that. And that’s where the focus is in terms of collaboration.”

The OpenStack community has been working with other communities from day one. Collaboration has always been at the core of the project. Bryce used the example of the KVM project, one of the many projects that OpenStack users rely on.

“When we started the OpenStack project, KVM was not widely considered a production-ready hypervisor,” said Bryce. “There were a lot of features that were new, unstable and totally unreliable. But OpenStack became a big driver for KVM usage. OpenStack developers contributed upstream to KVM and that combination ended up helping both Nova and KVM mature because we were jointly delivering real use cases.”

It’s happening all across the board now. For example, Bryce mentioned a report from 451 Research that found companies that already run OpenStack were adopting containers three times faster than those that don’t.

Yes, the collaboration has been happening, but there is huge potential in refining that collaboration. Collier said that the OpenStack community members who have been gluing these different projects together have gained expertise in doing so. The OpenStack Foundation plans to help members of the community share this expertise and experience with each other.

“The Open Source community loves to give back,” said Collier. “This collaboration is about sharing the playbook — both software and operational know-how — that allows you to take this innovation and put it into production.”

“Those are the missing links, the last mile of open infrastructure the users have had to do on their own. We’re bringing that into the community and that’s where I think the collaboration becomes critical,” added Collier.

“How do you deliver that collaboration?” said Bryce. “Writing software is hard, but it becomes less hard when you get people together. That’s something people forget in the open source community as we work remotely, collaborating online, from different parts of the world.”

Face to Face Collaboration

Physical events like OpenStack Summit, Open Source Summit, KubeCon, and many others bring these people together, face to face.

“Meeting each other in person is extremely valuable. It builds trust and when we go back to our remote location and collaborate online, that trust makes us even more productive,” said Bryce.

Going forward, the OpenStack Foundation plans to make its events inclusive of all the technologies that matter to OpenStack users. It has started events like OpenStack Days that include projects such as Ceph, Ansible, Kubernetes, Cloud Foundry, and more.

“When you meet people,  spend time with them and work together, you naturally start to understand each other better and figure out how to work together,” said Bryce. “And that to me is a really important part of how you actually make collaboration happen.”

Via the collaboration of a global, sustainable community, the ONAP Amsterdam release addresses real-world SDN, NFV, and VNF needs just in time for 5G

San Francisco, November 20, 2017 – The Open Network Automation Platform (ONAP) Project today announced the availability of its first platform release, ONAP “Amsterdam,” which delivers a unified architecture for end-to-end, closed-loop network automation. ONAP is transforming the service delivery lifecycle for network, cable and cloud providers. ONAP is the first open source project to unite the majority of operators (end users) with the majority of vendors (integrators) in building a real service automation and orchestration platform, and already, 55 percent of the world’s mobile subscribers are supported by its members.

“Amsterdam represents significant progress for both the ONAP community and the greater open source networking ecosystem at large,” said Arpit Joshipura, general manager, Networking and Orchestration, The Linux Foundation. “By bringing together member resources, Amsterdam is the first step toward realization of a globally shared architecture and implementation for network automation, based on open source and open standards. It’s exciting to see a new era of industry collaboration and architectural convergence – via a healthy, rapidly diversifying ecosystem – begin to take shape with the release of ONAP Amsterdam.”

The Amsterdam release provides a unified architecture which includes production-proven code from open source ECOMP and OPEN-O to provide design-time and run-time environments within a single, policy-driven service orchestration platform. Common, vendor-agnostic models allow users to quickly design and implement new services using best-of-breed components, even within existing brownfield environments. Real-time inventory and analytics support monitoring, end-to-end troubleshooting, and closed-loop feedback to ensure SLAs as well as rapid optimization of service design and implementations. Additionally, ONAP is able to manage and orchestrate both virtualized and physical network functions.

The entire platform has been explicitly architected to address current real-world challenges in operating tier-one networks. Amsterdam provides verified blueprints for two initial use cases, with more to be developed and tested in future releases. The first is VoLTE (Voice over LTE), which allows voice to be unified onto IP networks. By virtualizing the core network, ONAP is used to design, deploy, monitor and manage the lifecycle of a complex end-to-end VoLTE service. The second use case is Residential vCPE. With ONAP, all services are provided in-network, which means CSPs can add new services rapidly and on-demand to their residential customers to create new revenue streams and counter competitors.

“In six short months, the community has rallied together to produce a platform that transforms the service delivery lifecycle via closed-loop automation,” said Mazin Gilbert, ONAP Technical Steering Committee (TSC) chair and vice president, Advanced Technology, AT&T Labs. “This initial release provides blueprints for service provider use cases, representing the collaboration and innovation of the community.”

Ecosystem Growth Produces ONAP PoCs

With more than 55 percent of global mobile subscribers represented by member carriers, ONAP is poised to become the de facto automation platform for telecom carriers. This common, open platform greatly reduces development costs and time for VNF vendors, while allowing network operators to optimize their selection of best-of-breed commercial VNF offerings for each of their services. Standardized models and interfaces greatly simplify integration time and cost, allowing telecom and cloud providers to deliver new offerings quickly and competitively.

Member companies, which represent every aspect of the ecosystem (vendors, telecommunication providers, cable and cloud operators, NFV vendors, solution providers), are already leveraging ONAP for commercial products and services. Amsterdam code is also being integrated into proofs of concept.

Additionally, ONAP is part of a thriving global community; more than 450 people attended the recent Open Source Networking Days events to learn how ONAP and other open source networking projects are changing network operations.

More details on Amsterdam, including download information, white papers, solution briefs, and videos, are available here. Comments from members, including those who contributed technically to Amsterdam, can be found here.

What’s Next for ONAP

Looking ahead, the community is already beginning plans for the second ONAP release, “Beijing.” Scheduled for release in summer 2018, Beijing will include “S3P” (scale, stability, security and performance) enhancements, more use cases to support today’s service provider needs, key 5G features, and inter-cloud connectivity. Interest from large enterprises will likely further shape the platform and use cases in future releases.

ONAP will continue to evolve harmonization with SDOs and other open source projects, with a focus on aligning APIs/Information Models as well as OSS/BSS integration.

The ONAP Beijing Release Developer Forum will take place Dec. 11-13 in Santa Clara, California, and will include topics for end users, VNF providers, and the ONAP developer community via a variety of sessions, including presentations, panels, and hands-on labs.

ONAP community members and developers are encouraged to submit a proposal to share knowledge and expertise with the rest of the community: https://www.onap.org/event/submit-a-proposal-for-the-onap-beijing-release-developer-forum-santa-clara-ca

Additionally, ONAP will host a workshop on “Container Networking with ONAP” in conjunction with CloudNativeCon + KubeCon on December 5 in Austin, Texas. The workshop is designed to bring together networking and cloud application developers to discuss their needs, ideas and aspirations for automating the deployment of secure network services on demand. Details and registration information: https://www.onap.org/event/cfp-submit-a-proposal-to-onap-mini-summit-at-cloudnativecon-kubecon-north-america-tuesday-december-5-2017

About the Open Network Automation Platform

The Open Network Automation Platform (ONAP) Project brings together top global carriers and vendors with the goal of allowing end users to automate, design, orchestrate and manage services and virtual functions. ONAP unites two major open networking and orchestration projects, open source ECOMP and the Open Orchestrator Project (OPEN-O), with the mission of creating a unified architecture and implementation and supporting collaboration across the open source community. The ONAP Project is a Linux Foundation project. For more information, visit https://www.onap.org.

# # #

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

 

Additional Resources

Download ONAP Amsterdam

Amsterdam Architecture Overview

VoLTE Solution Brief

VCPE Solution Brief

Related videos

ONAP Blog

Join as a Member

 

Media Contact

Sarah Conway

The Linux Foundation

(978) 578-5300

sconway@linuxfoundation.org

Autodesk is undergoing a company-wide shift to open source and inner source. And that’s on top of the culture change that both development methods require.

Inner source means applying open source development practices and methodologies to internal projects, even if the projects are proprietary. And the culture change required to be successful can be a hard shift from a traditional corporate hierarchy to an open approach. Even though they’re connected, all three changes are distinct heavy lifts.

Autodesk began by hiring Guy Martin as Director of Open Source Strategy in the Engineering Practice, a group designed to transform engineering across the company. Naturally, open source would play a huge role in that effort, including spurring the use of inner source. But neither would flourish if the company culture didn’t change. And so the job title swiftly evolved to Director of Open @ADSK.

“I tend to focus a lot more on the culture change and the inner source part of my role even though I’m working through a huge compliance initiative right now on the open source side,” Martin said.

Autodesk’s open source transformation began shortly after the company started shifting all of its products to the cloud, including its AutoCAD architecture software, building information modeling with its Revit products, and its media and entertainment products. The company’s role in open source in entertainment is now so significant that Martin often speaks at the Academy of Motion Picture Arts and Sciences on open source. They want to hear about what Autodesk is doing as part of a larger collection of initiatives that the Academy is working on, Martin said.

At Autodesk, the goal is to spring engineers loose from their business silos and create a fully open source, cloud-centric company.

“Your primary identity detaches from being part of the AutoCAD team or part of the Revit team, or the 3ds Max or Inventor team or any of these products,” Martin explained. “It’s now shaping you into part of the Autodesk engineering team, and not your individual silo as a product organization in the company.”

Talent acquisition is among the top business goals for Open@Autodesk, especially given the company’s intense focus on innovation as well as making all of its products work seamlessly together. It takes talent skilled in open source methodologies and thinking to help make that happen. But it also means setting up the team dynamics so collaboration is more natural and less forced.

“With company cultures and some engineering cultures, the freedom to take an unconventional route to solve a problem doesn’t exist,” Martin said. “A lot of my job is to create that freedom so that smart and motivated engineers can figure out a way to put things together in a way that maybe they wouldn’t have thought of without that freedom and that culture.”

To help create an open source culture, the right tools must be in place and, oddly enough, those tools sometimes aren’t open source. For example, Martin created a single instance of Slack rather than use IRC, because Slack was more comfortable for users in other lines of the business who were already using it. The intent was to get teams to start talking across their organizational boundaries.

Another tool Martin is working with is Bitergia Analytics to monitor and manage Autodesk’s use of GitHub Enterprise.

Martin says the three key lessons he’s learned as an open source program manager are:

  1. Stay flexible, because change happens.
  2. Be humble but bold.
  3. Be passionate.

“I’ve been at Autodesk two years, but I’m still bootstrapping some of the things around culture. We have strong contributors in some projects, while in some projects we’re consuming. I think you have to do both, especially if you’re bootstrapping a new open source effort in a company.”

“The challenge is always balancing the needs of the product teams, who have to get a product out the door, and who (and as an engineer I can say this) will take shortcuts whenever possible. They want to know, ‘why should we be doing this for the community? All we care about is our stuff.’ And it’s getting them past that. Yes, we’re doing work that’s going to be used elsewhere, but in the end we’re going to get benefits from pulling work from other people who have done work that they knew was going to be used in the community.”

Read about featured Linux kernel developers in the 2017 Linux Kernel Development Report.

The recent Linux Kernel Development Report released by The Linux Foundation included information about several featured Linux kernel developers. According to the report, roughly 15,600 developers from more than 1,400 companies have contributed to the Linux kernel since 2005, when the adoption of Git made detailed tracking possible. Over the next several weeks, we will be highlighting some specific Linux kernel developers who agreed to answer a few questions about what they do and why they contribute to the kernel.

Linux kernel developer

Laura Abbott, a Fedora Kernel Engineer at Red Hat

In this article, we feature Laura Abbott, a Fedora Kernel Engineer at Red Hat.

The Linux Foundation: What role do you play in the community and what subsystem(s) do you work on?

Laura Abbott: My full-time job is working as one of two maintainers for the Fedora kernels. This means I push out kernel releases and fix/shepherd bugs. Outside of that role, I maintain the Ion memory management framework and do occasional work on arm/arm64 and KSPP (kernel hardening).

The Linux Foundation: What have you been working on this year?

Abbott: I did some major reworking on Ion this year and ripped out a lot of code (everyone’s favorite type of patch!). Hopefully, I’ll be able to report that Ion is out of staging in the next kernel report. Apart from that, I’ve spent a lot of time testing and reviewing patches for kernel hardening.

The Linux Foundation: What do you think the kernel community needs to work on in the upcoming year?

Abbott: As a general theme, there needs to be a focus on scaling the community. There’s always an ongoing discussion about how to attract new developers and there’s been a recent focus on how to grow contributors into maintainers. There’s still a lot of ‘tribal knowledge’ in pretty much every area which makes things difficult for everyone. I’d like to see the kernel community continue to make processes easier for new and existing developers. I’d also like to see the discussions about building an inclusive community continue.

The Linux Foundation: Why do you contribute to the Linux kernel?

Abbott: I’ve always found low-level systems fascinating and enjoy seeing how all the pieces work together. There’s always something new to learn about in the kernel, and I find the work challenging.

You can learn more about the Linux kernel development process and read more developer profiles in the full report. Download the 2017 Linux Kernel Development Report now.

participating in open source

The Linux Foundation’s free online guide Participating in Open Source Communities can help organizations successfully navigate open source waters.

As companies in and out of the technology industry move to advance their open source programs, they are rapidly learning about the value of participating in open source communities. Organizations are using open source code to build their own commercial products and services, which drives home the strategic value of contributing back to projects.

However, diving in and participating without an understanding of projects and their communities can lead to frustration and other unfortunate outcomes. Approaching open source contributions without a strategy can tarnish a company’s reputation in the open source community and incur legal risks.

The Linux Foundation’s free online guide Participating in Open Source Communities can help organizations successfully navigate these open source waters. The detailed guide covers what it means to contribute to open source as an organization and what it means to be a good corporate citizen. It explains how open source projects are structured, how to contribute, why it’s important to devote internal developer resources to participation, as well as why it’s important to create a strategy for open source participation and management.

One of the most important first steps is to rally leadership behind your community participation strategy. “Support from leadership and acknowledgement that open source is a business critical part of your strategy is so important,” said Nithya Ruff, Senior Director, Open Source Practice at Comcast. “You should really understand the company’s objectives and how to enable them in your open source strategy.”

Building relationships is good strategy

The guide also notes that building relationships at events can make a difference, and that including community members early and often is a good strategy. “Some organizations make the mistake of developing big chunks of code in house and then dumping them into the open source project, which is almost never seen as a positive way to engage with the community,” the guide notes. “The reality is that open source projects can be complex, and what seems like an obvious change might have far reaching side effects in other parts of the project.”

Through the guide, you can also learn how to navigate issues of influence in community participation. It can be challenging for organizations to understand how influence is earned within open source projects. “Just because your organization is a big deal, doesn’t mean that you should expect to be treated like one without earning the respect of the open source community,” the guide advises.

The Participating in Open Source Communities guide can help you with these strategies and more, and it explores how to weave community focus into your open source initiatives. It is one of a new collection of free guides from The Linux Foundation and The TODO Group that provide essential information for any organization running an open source program. The guides are available now to help you run an open source program office where open source is supported, shared, and leveraged. With such an office, organizations can efficiently establish and execute on their open source strategies.

These guides were produced based on expertise from open source leaders. Check out the guides and stay tuned for our continuing coverage.

Don’t miss the previous articles in the series:

How to Create an Open Source Program

Tools for Managing Open Source Programs

Measuring Your Open Source Program’s Success

Effective Strategies for Recruiting Open Source Developers

Open Source Summit EU

Going to Open Source Summit? Check out some featured conference presentations and activities below.

Going to Open Source Summit EU in Prague? While you’re there, be sure to stop by The Linux Foundation training booth for fun giveaways and a chance to win one of three Raspberry Pi kits.

Giveaways include The Linux Foundation branded webcam covers, The Linux Foundation projects’ stickers, Tux stickers, Linux.com stickers, as well as free ebooks: The SysAdmin’s Essential Guide to Linux Workstation Security, Practical GPL Compliance, and A Guide to Understanding OPNFV & NFV.

You can also enter the raffle for a chance to win a Raspberry Pi Kit. There will be 3 raffle winners: names will be drawn and prizes will be mailed on Nov. 2.

And, be sure to check out some featured conference presentations below, including how to deploy Kubernetes native applications, deploying and scaling microservices, opportunities for inclusion and collaboration, and how to build your open source career.

Session Highlights

  • Love What You Do, Everyday! – Zaheda Bhorat, Amazon Web Services
  • Detecting Performance Regressions In The Linux Kernel – Jan Kara, SUSE
  • Highway to Helm: Deploying Kubernetes Native Applications – Michelle Noorali, Microsoft
  • Deploying and Scaling Microservices with Docker and Kubernetes – Jérôme Petazzoni, Docker
  • printk() – The Most Useful Tool is Now Showing its Age – Steven Rostedt, VMware
  • Every Day Opportunities for Inclusion and Collaboration – Nithya Ruff, Comcast

Activities

  • Technical Showcase
  • Real-Time Summit
  • Free Day with Prague tour from local students
  • KVM Forum
  • FOSSology – Hands On Training
  • Tracing Summit

The Cloud Native Computing Foundation will also have a booth at OSSEU. Get your pass to Open Source Summit Europe and stop by to learn more! Use discount code OSSEULFM20 for 20% off your all-access attendee pass.

Check out the full list of co-located events on the website and register now.

Join the Apache Mesos community in Prague for town halls, MesosCon university, and a full-day hackathon.

Get the latest on Apache Mesos with Ben Hindman, Co-Creator of Apache Mesos, at MesosCon Europe taking place October 25-27, 2017 in Prague, Czech Republic. At the conference, you’ll hear insights by industry experts deploying Mesos clusters, learn about containerization and security in Mesos, and more.

This annual conference brings together users and developers to share and learn about the Mesos project and its growing ecosystem. The conference features two days of sessions focused on the Apache Mesos Core and related technologies, as well as a one-day hackathon, town halls, and MesosCon University.  

Highlights include:

  • SMACK in the Enterprise keynote panel: Hear how the SMACK stack is impacting the data analytics landscape at large enterprises. Panelists will be announced soon.
  • MesosCon University: Tutorial-style sessions will offer hands-on learning for building a stateful service, operating your cluster, or bootstrapping a secure Mesos cluster.
  • Town Halls: A community gathering to discuss pressing needs and issues. The town halls will begin at 7:00pm after the onsite reception on Thursday, and will include drinks and appetizers sponsored by Mesosphere. Have a town hall you think we should run? Reach out to events@linuxfoundation.org.
  • Hackathon: Come and work on new Mesos features, new demos, new documentation, and win great prizes! The Hackathon will take place on Wednesday, October 25, and is included with your conference registration.  

View the full schedule of sessions and activities here.

Get a preview of what to expect at MesosCon Europe. Watch videos from MesosCon North America 2017 here.

Register now and use discount code MCEULDC17 to save $25 off your pass to MesosCon Europe.

Through a collaborative effort from enterprises and communities invested in cloud, big data, and standard APIs, I’m excited to welcome the OpenMessaging project to The Linux Foundation. The OpenMessaging community’s goal is to create a globally adopted, vendor-neutral, and open standard for distributed messaging that can be deployed in cloud, on-premise, and hybrid use cases.

Alibaba, Yahoo!, Didi, and Streamlio are the founding project contributors. The Linux Foundation has worked with the initial project community to establish a governance model and structure for the long-term benefit of the ecosystem working on a messaging API standard.

As more companies and developers move toward cloud native applications, challenges are emerging at scale with messaging and streaming applications. These include interoperability issues between platforms, a lack of compatibility between wire-level protocols, and a lack of standard benchmarking across systems.

In particular, when data is transferred across different messaging and streaming platforms, compatibility problems arise, which means additional work and maintenance costs. Existing solutions lack standardized guidelines for load balancing, fault tolerance, administration, security, and streaming features. Current systems don’t satisfy the needs of modern cloud-oriented messaging and streaming applications. This can lead to redundant work for developers and makes it difficult or impossible to meet cutting-edge business demands around IoT, edge computing, smart cities, and more.

Contributors to OpenMessaging are looking to improve distributed messaging by:

  • Creating a global, cloud-oriented, vendor-neutral industry standard for distributed messaging
  • Facilitating a standard benchmark for testing applications
  • Enabling platform independence
  • Targeting cloud data streaming and messaging requirements with scalability, flexibility, isolation, and security built in
  • Fostering a growing community of contributing developers

You can learn more about the new project and how to participate here: http://openmessaging.cloud

These are some of the organizations supporting OpenMessaging:

“We have focused on the messaging and streaming field for years, during which we explored Corba notification, JMS and other standards to try to solve our stickiest business requirements. After evaluating the available alternatives, Alibaba chose to create a new cloud-oriented messaging standard, OpenMessaging, which is vendor-neutral and language-independent and provides industrial guidelines for areas like finance, e-commerce, IoT, and big data. Moreover, it aims to develop messaging and streaming applications across heterogeneous systems and platforms. We hope it can be open, simple, scalable, and interoperable. In addition, we want to build an ecosystem according to this standard, such as benchmark, computation, and various connectors. We would like to have new contributions and hope everyone can work together to push the OpenMessaging standard forward.” — Von Gosling, senior architect at Alibaba, co-creator of Apache RocketMQ, and original initiator of OpenMessaging

“As the sophistication and scale of applications’ messaging needs continue to grow, lack of a standard interface has created complexity and inflexibility barriers for developers and organizations. Streamlio is excited to work with other leaders to launch the OpenMessaging standards initiative in order to give customers easy access to high-performance, low-latency messaging solutions like Apache Pulsar that offer the durability, consistency, and availability that organizations require.” — Matteo Merli, software engineer at Streamlio, co-creator of Apache Pulsar, and member of Apache BookKeeper PMC

“Oath, a Verizon subsidiary of leading media and tech brands including Yahoo and AOL, supports open, collaborative initiatives and is glad to join the OpenMessaging project,” said Joe Francis, director, Core Platforms.

“In Didi, we have defined a private set of producer API and consumer API to hide differences among open source MQs such as Apache Kafka, Apache RocketMQ, etc. as well as to provide additional customized features. We are planning to release these to the open source community. So far, we have accumulated a lot of experience on MQs and API unification, and are willing to work in OpenMessaging to construct a common standard of APIs together with others. We sincerely believe that a unified and widely accepted API standard can benefit MQ technology and applications that rely on it.” — Neil Qi, architect at Didi

“There are many different open source messaging solutions, including Apache ActiveMQ, Apache RocketMQ, Apache Pulsar, and Apache Kafka. The lack of an industry-wide, scalable messaging standard makes evaluating a suitable solution difficult. We are excited to support the joint effort from multiple open source projects working together to define a scalable, open messaging specification. Apache BookKeeper has been successfully deployed in production at Yahoo (via Apache Pulsar) and Twitter (via Apache DistributedLog) as their durable, high-performance, low-latency storage foundation for their enterprise-grade messaging systems. We are excited to join the OpenMessaging effort to help other projects address common problems like low-latency durability, consistency and availability in messaging solutions.” — Sijie Guo, co-founder of Streamlio, PMC chair of Apache BookKeeper, and co-creator of Apache DistributedLog

Reuben Paul, co-founder of CyberShaolin, will speak at Open Source Summit in Prague, highlighting the importance of cybersecurity awareness for kids.

Reuben Paul is not the only kid who plays video games, but his fascination with games and computers set him on a unique journey of curiosity that led to an early interest in cybersecurity education and advocacy and the creation of CyberShaolin, an organization that helps children understand the threat of cyberattacks. Paul, who is now 11 years old, will present a keynote talk at Open Source Summit in Prague, sharing his experiences and highlighting insecurities in toys, devices, and other technologies in daily use.

Reuben Paul, co-founder of CyberShaolin

We interviewed Paul to hear the story of his journey and to discuss CyberShaolin and its mission to educate, equip, and empower kids (and their parents) with knowledge of cybersecurity dangers and defenses.  

Linux.com: When did your fascination with computers start?
Reuben Paul: My fascination with computers started with video games. I like mobile phone games as well as console video games. When I was about 5 years old (I think), I was playing the “Asphalt” racing game by Gameloft on my phone. It was a simple but fun game. I had to touch on the right side of the phone to go fast and touch the left side of the phone to slow down. I asked my dad, “How does the game know where I touch?”

He researched and found out that the phone screen was an xy coordinate system and so he told me that if the x value was greater than half the width of the phone screen, then it was a touch on the right side. Otherwise, it was a touch on the left side. To help me better understand how this worked, he gave me the equation to graph a straight line, which was y = mx + b and asked, “Can you find the y value for each x value?” After about 30 minutes, I calculated the y value for each of the x values he gave me.
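
That touch-side check and the straight-line equation translate directly into a few lines of code. Below is a minimal Python sketch of the logic described above; the screen width, slope, and intercept values are hypothetical, chosen only for illustration.

    # A small sketch of the logic described above (all values are illustrative).
    SCREEN_WIDTH = 1080  # hypothetical phone screen width in pixels

    def touch_side(x):
        """Return which side of the screen a touch at horizontal position x landed on."""
        # If x is greater than half the screen width, the touch is on the right side.
        return "right" if x > SCREEN_WIDTH / 2 else "left"

    def line_y(x, m=2, b=3):
        """Evaluate y = mx + b for a given x (example slope and intercept)."""
        return m * x + b

    # Example: one touch near the left edge, one near the right edge.
    print(touch_side(100), line_y(100))   # left 203
    print(touch_side(900), line_y(900))   # right 1803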

When my dad realized that I was able to learn some fundamental logic of programming, he introduced me to Scratch, and I wrote my first game called “Big Fish eats Small Fish” using the x and y values of the mouse pointer in the game. Then I just kept falling in love with computers.

Linux.com: What got you interested in cybersecurity?
Paul: My dad, Mano Paul, used to train his business clients on cybersecurity. Whenever he worked from his home office, I would listen to his phone conversations. By the time I was 6 years old, I knew about things like the Internet, firewalls, and the cloud. When my dad realized I had the interest and the potential for learning, he started teaching me security topics like social engineering techniques, cloning websites, man-in-the-middle attack techniques, hacking mobile apps, and more. The first time I got a meterpreter shell from a test target machine, I felt like Peter Parker who had just discovered his Spiderman abilities.

Linux.com: How and why did you start CyberShaolin?
Paul: When I was 8 years old, I gave my first talk, “InfoSec from the mouth of babes (or an 8 year old),” at DerbyCon. It was in September of 2014. After that conference, I received several invitations, and before the end of 2014, I had keynoted at three other conferences.

So, when kids started hearing me speak at these different conferences, they started writing to me and asking me to teach them. I told my parents that I wanted to teach other kids, and they asked me how. I said, “Maybe I can make some videos and publish them on channels like YouTube.” They asked me if I wanted to charge for my videos, and I said “No.” I want my videos to be free and accessible to any child anywhere in the world. This is how CyberShaolin was created.

Linux.com: What’s the goal of CyberShaolin?
Paul: CyberShaolin is the non-profit organization that my parents helped me found. Its mission is to educate, equip, and empower kids (and their parents) with knowledge of cybersecurity dangers and defenses, using videos and other training material that I develop in my spare time from school, along with kung fu, gymnastics, swimming, inline hockey, piano, and drums. I have published about a dozen videos so far on the www.CyberShaolin.org website and plan to develop more. I would also like to make games and comics to support security learning.

CyberShaolin comes from two words: Cyber and Shaolin. The word cyber, of course, comes from technology. Shaolin comes from the kung fu martial art form in which my dad and I are both second-degree black belt holders. In kung fu, we have belts to show our progress of knowledge, and you can think of CyberShaolin like digital kung fu, where kids can become Cyber Black Belts after learning and taking tests on our website.

Linux.com: How important do you think is it for children to understand cybersecurity?
Paul: We are living in a time when technology and devices are not only in our homes but also in our schools and pretty much any place you go. The world is also getting very connected with the Internet of Things, which can easily become the Internet of Threats. Children are among the main users of these technologies and devices. Unfortunately, these devices and the apps on them are not very secure and can cause serious problems for children and families. For example, I recently (in May 2017) demonstrated how I could hack into a smart toy teddy bear and turn it into a remote spying device.
Children are also the next generation. If they are not aware and trained in cybersecurity, then the future (our future) will not be very good. 

Linux.com: How does the project help children?
Paul: As I mentioned before, CyberShaolin’s mission is to educate, equip, and empower kids (and their parents) with knowledge of cybersecurity dangers and defenses.

As kids are educated about cybersecurity dangers like cyber bullying, man-in-the-middle, phishing, privacy, online threats, mobile threats, etc., they will be equipped with knowledge and skills, which will empower them to make cyber-wise decisions and stay safe and secure in cyberspace.
And, just as I would never use my kung fu skills to harm someone, I expect all CyberShaolin graduates to use their cyber kung fu skills to create a secure future, for the good of humanity.