The Cyber Resilience Act: What it means for open source
Stephanie Domas
on 13 August 2024
Tags: CRA, cybersecurity, EU regulation, security & compliance
The Cyber Resilience Act (CRA) is nearly upon us. This wide-reaching piece of legislation will introduce new requirements, checks and balances on developers, retailers and device manufacturers; many of the looming demands haven’t gone down well in the open source community.
In this blog, I’ll examine the CRA’s impact on open source, give some expert insights into where the Act is a force for good and where it leaves some grey areas, and show you what you should be thinking about to prepare for its arrival if you use or create open source.
Why was the Cyber Resilience Act created?
If you’re not up to speed, the CRA is a piece of European Union legislation that aims to make devices safer by introducing more rigorous cybersecurity, documentation, and vulnerability reporting requirements across the EU’s IT industry. The legislation will apply to developers, distributors, manufacturers and retailers of hardware, devices, software, applications, and other “products with digital elements”.
I recently went into some depth in an earlier article on our website and for the Forbes Technology Council, discussing the legislation and what it means for those looking to sell digital products in the EU. If you want to read more about the Act and its goals, aims, and new requirements for developers and manufacturers, I recommend you check out my pieces there.
To summarise my earlier articles, the CRA has high ambitions for its regulations. It promises:
- harmonised rules when bringing to market products or software with a digital component;
- a framework of cybersecurity requirements governing the planning, design, development and maintenance of such products, with obligations to be met at every stage of the value chain;
- an obligation to provide duty of care for the entire lifecycle of such products.
In a way, the Act is a direct response to the IoT and devices market. Historically, in any tech sector, regulation follows long after the curve of market expansion. New inventions, platforms, services and entire industries explode into life, and then, a few years later, regulators follow with a slew of hard-hitting legislation.
The IoT and devices market was no exception. In the early days, manufacturers and device developers were caught up in the gold rush of the new market as devices became smaller, more powerful, and more connected. There was a race to be first; and as with all first-to-market pushes, security, robust design, and safety can fall by the wayside.
Just looking at tech news headlines about data breaches and device faults, it’s easy to see why the CRA came into existence. After all, IoT device controversy is in the news every other day: IoT grills being hacked to burn their meals to a crisp; Ring cameras with publicly accessible interfaces; printers with exposed IP addresses; and routers with publicly listed access credentials, or that are vulnerable to malicious firmware updates.
In a global context, the CRA is part of a broader push for tighter requirements and more secure systems. Whether it’s the US’s executive orders on cybersecurity, the EU’s replacement of the NIS Directive with NIS2, or similar cybersecurity regulations in the UK, the CRA is another instance of a global shift towards securing the devices available on the global marketplace.
What the CRA gets right
The legislation has its heart in the right place: it aims to make devices safer while protecting free open source and non-commercial development from onerous regulations. This is healthy; we need safer devices, and a marketplace where products for sale won’t fail and lose all of our personal data, or open up our networks to attackers.
Similarly commendable is the CRA’s creation of an open source steward role as a result of community feedback. This new role works as a sort of exemption from CRA liability, under specific circumstances (which I’ll touch on later). It enables those interested in open source purely for the pursuit of driving innovation to continue to do so without taking on the liability of the CRA, allowing contributors and foundations to keep playing crucial roles in the open source ecosystem.
The legislative process has also been collaborative, taking submissions and guidance from open source experts and the community at large. Many of the biggest criticisms of the initial drafts of the Act were worked into the final version, with the most painful clauses removed or amended. For example, the final text greatly reduced the scale of what is considered “commercial activity” and removed clauses that would have regulated projects receiving contributions from commercial organisations or their employees. Through consultation, the Act was changed to exclude non-profit organisations that sell open source software on the market but reinvest all the revenue in not-for-profit activities.
Criticisms of the CRA and where it misses the mark
However, the Act’s terminology has created a lot of uncertainty and potential conflict with common open source commercial mechanisms. Over the course of its development, the Act has been at loggerheads with the open source community over many issues, including its generalised terminology and clashing definitions. Let’s take a look at the open source community reaction.
The open source community reaction to the Cyber Resilience Act
The Cyber Resilience Act has been met with very mixed reactions across the open source community.
Many welcome elements of the new regulation in the interest of better cybersecurity. The view of the Act’s proponents is that, for too long, rush-to-market development approaches and a lack of focus on cybersecurity basics have sustained a significant global risk of breaches and data leaks from vulnerable, rushed, or poorly designed devices. In its public statements concerning the CRA, GitHub noted: “Cybersecurity reform is clearly needed. Too often products are shipped without adequate security and not maintained as new vulnerabilities come to light. Many of us have directly suffered as a result.”
However, the onerous requirements and uncertain terminology of the Cyber Resilience Act have left a large portion of the open source community in dismay. GitHub, The Linux Foundation, and many other leading voices in the open source community have noted the Act’s loose wording and heavy demands on open source and non-profit developers. The main criticisms relate to the Act’s lack of precision in defining open source, meaning that many open source projects could in fact fall into the scope of “commercial activities”. We’ll run through these concerns one by one in the next section.
There are even those who fear that it could mean the end of open source.
How the CRA may negatively impact enterprises’ ability to use open source, and the open source community
The broad aims and lengthy text of the CRA will have considerable impacts on enterprises looking to make use of open source software.
Someone has to take accountability for the open source you use
At its core, the CRA marks the end of manufacturers, suppliers, retailers, and developers not taking security accountability for their products. Compliance requirements such as the new CRA typically manifest in your supply chain by trickling down to your suppliers. For example, consider the CRA requirement that software must not be released with known exploited vulnerabilities: as a product vendor, I would normally work this into my contracts with suppliers.
But what happens if you’re using open source, as most organisations do? Research from Synopsys in 2023 shows that upwards of 90% of commercial applications contain open source code. Who takes security accountability for the open source components? If your product is subject to the CRA, the manufacturer is obliged to meet the regulation across all of its digital components. If you’re consuming open source from a commercial entity willing to step up and take CRA ownership, then you’re in great shape. But what if it’s a community-led project that is not willing to take responsibility for the CRA’s requirements? Or a commercial entity that released the open source as a boon to the world, but is not interested in taking on the workload or liability of the CRA? You will be left with the difficult decision of taking on that responsibility yourself, or finding new software to use.
More extensive documentation and red tape
As a baseline, the CRA means you’re going to be much, much busier with documentation and transparency efforts.
Under the CRA, you will need to understand and communicate your entire software supply chain. You’ll need in-depth documentation about the components and stack that make up your project. You’ll need to be tracking vulnerabilities and pushing security updates to your products for their entire lifecycle. What’s more, you’ll need to make much of this information freely and readily available in machine- and human-readable formats.
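To make the machine-readable side of this concrete, here is a minimal sketch of a software bill of materials (SBOM) in the CycloneDX style, assembled by hand in Python. The component names, versions, and output file name are illustrative assumptions, and the CRA does not prescribe this exact format; in practice you would generate an SBOM from your build or packaging tooling rather than by hand.

```python
import json

# Illustrative component inventory; in practice this list would come from your
# build or packaging tooling rather than being written by hand.
components = [
    {"type": "library", "name": "openssl", "version": "3.0.13",
     "purl": "pkg:generic/openssl@3.0.13"},
    {"type": "library", "name": "zlib", "version": "1.3.1",
     "purl": "pkg:generic/zlib@1.3.1"},
]

# A minimal CycloneDX-style SBOM document: machine-readable for downstream
# tooling, and straightforward to render into human-readable documentation.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": components,
}

with open("sbom.json", "w") as sbom_file:
    json.dump(sbom, sbom_file, indent=2)

print(f"Wrote sbom.json with {len(components)} components")
```

The same document can then feed vulnerability scanning tools or be rendered into the human-readable documentation you ship alongside a product.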
To get all this done, you’ll need robust new development and cybersecurity standards, including clear policies and processes for monitoring and reporting vulnerabilities.
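As a rough illustration of what one automated piece of such a monitoring process might look like, the sketch below checks a few components against the public OSV.dev advisory database. The component names, ecosystems, and versions are assumptions for the example; a real process would still need triage, remediation, and the CRA’s reporting steps, none of which a script can satisfy on its own.

```python
import json
import urllib.request

# Components to monitor; the names, ecosystems, and versions are illustrative.
components = [
    {"name": "requests", "ecosystem": "PyPI", "version": "2.25.0"},
    {"name": "lodash", "ecosystem": "npm", "version": "4.17.20"},
]

OSV_QUERY_URL = "https://api.osv.dev/v1/query"


def known_vulnerabilities(name, ecosystem, version):
    """Ask the public OSV.dev database for advisories affecting one component."""
    payload = json.dumps({
        "package": {"name": name, "ecosystem": ecosystem},
        "version": version,
    }).encode("utf-8")
    request = urllib.request.Request(
        OSV_QUERY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        # OSV returns an empty object when no advisories match.
        return json.load(response).get("vulns", [])


for component in components:
    vulns = known_vulnerabilities(**component)
    ids = [v["id"] for v in vulns] or ["none found"]
    print(f"{component['name']} {component['version']}: {', '.join(ids)}")
```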
Responsibilities for your downstream
The CRA assures us that 90% of devices will fall into its lowest, non-critical category, with the lowest (or no) requirements for compliance or certification. However, don’t breathe a sigh of relief just yet, because other sections of the CRA take a very broad approach to vulnerabilities, stating that, “Under certain conditions, all products with digital elements integrated in or connected to a larger electronic information system can serve as an attack vector for malicious actors”.
And while changes to the Act have freed open source projects from direct regulation, the Act will still affect every commercial product in the EU that uses that open source software. This will undoubtedly push open source projects to prioritise helping their downstream users get to market.
What’s more, it’s nearly impossible to anticipate how general purpose technology or libraries will be used, so there will be additional pressure on developers to ensure their products are compliance-ready for an even wider range of use cases.
The regulation of non-profit open source projects
Open source development is a tough space, often specifically because it has no direct monetary reward. Early versions of the CRA would have regulated anyone accepting donations, grants, or other monetary gain for their software. This has since changed: the final text has a more grounded definition of commercial activity, and assesses “commercial activity” in terms of an organisation’s broader mission and activities rather than simply its economic model.
However, its current form will still affect a great many open source projects due to their financial models. Non-profit models are still likely to fall under the CRA if:
- Not all income or revenue relating to the product or service is reinvested into the project
- Their software is free but support for it isn’t
- Their software is open source but it is offered on a platform that monetises other services
- Their software is free but they financially gain from data or other assets generated by the software
- They do not charge for the product, but they make monetary gain from support or open source consulting services
The risks to coordinated vulnerability disclosure
The CRA aims to resolve flaws in the current upstream-downstream security fix model by requiring developers to report actively exploited vulnerabilities to the European Union Agency for Cybersecurity (ENISA) within 24 hours of becoming aware of them. There are often situations where, for good reason, a vulnerability is deliberately not published while a fix is being created, or while coordination is happening between multiple vendors. Such a significant change to the current modus operandi for vulnerability fixes could put open source projects more at risk; it could also pressure developers to make frequent, very minor updates to address individual vulnerabilities rather than deeper patches that better improve a device’s overall security. Frequent quick code changes can actually introduce more risk into the system than what was being fixed.
More red tape could mean less open source
In the late 2000s, the discovery of unsafe amounts of lead in children’s toys in the US led to a panic over safety. Predictably, regulations and testing requirements were soon written into law, with the aim of stopping lead-laced toys from ending up in children’s hands. To an extent, the heavier testing reduced the number of cases; however, an unintended result was a stifling effect on small businesses and local toy producers. While large manufacturers and importers were able to absorb the extra cost quite easily (paid for by seizing the market share previously held by companies that could not), the smaller players found themselves at a disadvantage.
One of the greatest misconceptions about regulation is that it guarantees positive outcomes; it does not, and worse, it can be terminally disruptive to small businesses and innovation. There are warranted fears that the higher barriers to entry, the increased pressure to improve the compliance-readiness of software for downstream users, and the onerous requirements of conformity assessment, documentation, and disclosure will discourage and disincentivise small-scale development and innovation, serving only to cement the dominance of large-scale proprietary and commercial software.
Canonical’s commitment to the CRA
At Canonical, we believe in the security of open source for all, and in enabling enterprises to rely on open source. The CRA will create new pressures for developers, manufacturers, and the thousands of people who work with technology to meet its stringent requirements, and we are focused on making that process as easy as possible across our entire range of products and services. To that end, we’ve chosen to meet the challenges and requirements of the CRA head-on, allowing all of our customers who consume open source through us to benefit from our commitments to the CRA and focus on building their products without having to take on extra responsibility themselves. By taking on these requirements, we strive to make products that use open source more secure for everyone.
Whatever the debate, the regulatory landscape is shifting
Like it or not, the CRA is on its way. Cybersecurity compliance and rigorous development protocols are a win in the long term, and the open source community has secured some concessions of its own, such as the addition of the steward role. With this new role, open source developers will have access to high-quality guidance and resources for developing more secure devices and products that customers and end users can trust with their networks and data.
However, there are still some grey areas where I believe friction will arise with the creation and consumption of open source. The advent of the CRA is also a call for all open source communities to share their voice and join active working groups, such as the Eclipse Foundation’s working group focusing on the CRA, in order to take part in shaping these regulations and guidance, and to make sure open source can continue to flourish.