Open Source Threatened By The Cyber Resilience Act

The proposed legislation has also been described as creating “an unnecessary economic and technological risk to the EU.”

Governments and society are struggling to adjust to a world rife with cybersecurity dangers. A good example is the EU CRA, or Cyber Resilience Act, a legislative proposal by the European Commission meant to protect consumers from cybercrime by building security into product design. Even if you don’t live in the EU, adoption of this legislation by the European Parliament will likely affect the products you buy and, perhaps, the things you make, thanks to today’s global marketplace. In a recent podcast, our own [Jonathan Bennett] and [Doc Searls] spoke with [Mike Milinkovich] from the Eclipse Foundation about the proposal and what they believe could be a near-fatal blow to open source software development. The podcast is available below.

For background, you can read the blog post from opensource.org describing the issues, as well as the EU’s now-closed request for comments. At the core of the problem is the requirement that organisations self-certify their conformity with the act. Since open source is frequently maintained by small, loose-knit groups of contributors, it is hard to see how this would work.

Here is the issue in a nutshell. Say you write a fun little C++ programme for your own use. You are not a business, and you are not trying to make money. Because you want to share your work, you publish the programme on GitHub under an open source licence. This happens all the time.

Meanwhile, the developer of a sizable open source project, say the imaginary GRID database server, decides to include your code. That is permitted. In fact, it is encouraged. This is how open source works.
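As a purely hypothetical sketch of what such shared code might look like, imagine a small header published under a permissive licence that a larger project could simply vendor into its source tree (the file name, author, and licence choice here are illustrative, not from the scenario above):

// sanitize.hpp -- hypothetical example, published under the MIT licence
// Copyright (c) 2023 A. Hobbyist
//
// A tiny helper that strips control characters from user input.
// A larger project (the imaginary GRID database) could copy or vendor
// this header under the terms of the licence, with no contract between
// the hobbyist and the downstream project.
#pragma once
#include <string>

inline std::string strip_control_chars(const std::string &in) {
    std::string out;
    out.reserve(in.size());
    for (unsigned char c : in) {
        // Keep printable characters only; drop ASCII control codes and DEL.
        if (c >= 0x20 && c != 0x7F) {
            out += static_cast<char>(c);
        }
    }
    return out;
}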

The trouble starts when the GRID database suffers a breach that results in a data leak, and it turns out the vulnerability is in your code. Under the new law, your non-commercial hobby project, which never earned you a cent, could well leave you holding the bag for a substantial sum of money. If other people contributed to your code, things get even messier. Was the breach caused by your code or by another contributor’s? Who “owns” the project? Is every contributor liable? Faced with this, most people would probably either stop contributing or adopt a licence that forbids using their code anywhere such laws apply.

Granted, that exact scenario is unlikely, since [Milinkovich] notes that hobbyists will probably be explicitly exempted. He points out, however, that most important open source software is not written by hobbyists. It is usually produced by paid programmers working under a foundation or a sponsoring company. The concern is that what the EU calls “commercial activity” will sweep in major projects like Apache, Linux, and other significant open source software.

There is general agreement that the EU does not intend to harm or kill open source, and there is still time for the act to be amended into something more workable. Other countries are considering similar measures. We recognise the need to protect consumers and critical systems from cybersecurity flaws, and [Mike] agrees it makes sense in some cases. But wiping out open source software would help no one. We hope that amendments to the act, and to similar initiatives elsewhere, will protect open source software so it can keep driving innovation.
