Cyber Insights 2025: Cybersecurity Regulatory Mayhem – Source: www.securityweek.com

Source: www.securityweek.com – Author: Kevin Townsend

Cyber Insights 2025 examines expert opinions on the expected evolution of more than a dozen areas of cybersecurity interest over the next 12 months. We spoke to hundreds of individual experts to gain their insights. Here we discuss what to expect with cybersecurity regulations.

Regulations are facing a tipping point. There are too many and they are too complex to manage – and it’s getting worse.

Background

The purpose of cyber regulations is to protect confidentiality and physical assets. The confidentiality could relate to personal information, commercial intellectual property, or national security. Threats to both could come from local businesses or from foreign aggressors. Regulations therefore primarily address two areas: the prevention of local abuse or misuse of computer systems and their content, and better protection from foreign aggressors. This means placing limitations on how companies can use data, and insisting on minimum levels of cybersecurity.

Abiding by these regulations is known as compliance.

Regulations and compliance are necessary evils. They place additional requirements on security teams, but seek to ensure a minimum level of data protection. As the cyber world grows in scope and complexity, regulations increase in quantity and complexity. 

But there is always a conundrum: politicians must protect the people (their voters), while not damaging innovation (the economy). A good example is the emergence of mainstream artificial intelligence. It is a new technology, and authorities are struggling with how to regulate against its misuse without harming its progress.

Today there is an additional complication emerging: the global growth of conservative politics, with its philosophical preference for small government. The effect is like two weather systems, one warm and one cool, coming together; such collisions usually produce turbulence. All of this is exacerbated by technical globalization: global technology in what remains primarily a tribal world.

The two leading regulators in the free world are the United States (US) and the European Union (EU). The EU favors monolithic regulation – basically one single regulation per subject governing everything for everyone. This inevitably results in a long process that comes with loopholes and contradictions that seek to appease different national views and capabilities, and different business lobbyists.

The individual states in the US have greater legislative autonomy than the individual nations in the EU – and the partisan nature of US politics ensures that the states cling to that autonomy. Monolithic regulation for the entire US is almost impossible: different states have different regulations, sometimes colored by partisan politics.

The result of these different regulatory pressures is a complex compliance landscape – and one that is only likely to worsen over the next few years.

Overview

2025 will see considerable activity in regulatory requirements, both in organizational response to recent regulations and in the development of new ones. We will focus on two primary areas – AI and the US political scene – and then look at wider regulatory concerns for cybersecurity.

Regulations on the use of artificial intelligence

Artificial intelligence is a new technology that clearly needs some form of regulatory control to prevent its misuse. The EU is already demonstrating its monolithic approach with the AI Act. The US continues with its own approach to regulations: federally with a combination of Executive Orders and agency requirements, and supplemented by separate state level laws.

The EU AI Act entered into force on August 1, 2024 – but most of its provisions are not yet enforced. It is expected, but not guaranteed, that transparency and data governance obligations, and specific requirements for high-risk systems, will be enforced during 2025, with more following in 2026. However, there are already concerns that the regulation is overly complex and difficult to understand.

“The US appears to be moving towards a light touch regulatory regime around the ethics, security and privacy of AI and focusing more on open innovation,” comments David Ferbrache, MD at Beyond Blue and former head of cyber at the UK MoD, “while the EU has taken a more prescriptive and cautious approach with its EU AI Act, which enters into force in 2025.”

He adds, “Another challenge is the proliferation of cyber regulatory regimes across different countries, with many countries regulating to protect their national sovereignty in cyber space, not least in the processing of sensitive information or their dependency on overseas technology providers.”

One of the main problems for the EU is the popular belief that AI and existing regulations such as GDPR (data privacy) and the Digital Single Market Directive (copyright protection) are fundamentally incompatible. People believe that their own data should be private, and their own creations should be protected. While this has never been completely true (there have always been business-based exceptions), the sheer scale of AI model training (everything on the entire internet) takes this to a different level.

The result is a classic regulatory fudge. Past AI indiscretions are effectively excused, while future indiscretions will be monitored. One key element introduced is transparency: AI model developers must be transparent about their training data and the processes used to generate output. Once consumed, however, training data becomes unrecognizable, and it is difficult to see how transparency can be ensured and enforced.

Nevertheless, most security professionals believe the EU AI Act is a positive force. “We’re seeing a model for thoughtful regulation that could inspire global standards,” suggests Dana Simberkoff, chief risk, privacy and information security officer at AvePoint. “The key is finding that sweet spot where regulation provides enough structure to build trust and safety while maintaining the flexibility needed for continued innovation.”

The US, meanwhile, is continuing its piecemeal and more targeted approach to regulation. There is no current equivalent to the monolithic EU AI Act. Instead, there are presidential actions (such as Biden’s executive order on safe, secure, and trustworthy AI, and the White House ‘Blueprint for an AI Bill of Rights’), agency-specific advice and regulations targeted at different industry sectors, and state-level initiatives.

This evolution will continue to gather pace through 2025 and beyond – retaining and multiplying the regulators’ primary dilemma: safety versus innovation. “While these regulations aim to increase trust and safety, they may also impose compliance burdens that could inhibit smaller businesses from innovating at scale,” warns John Lynch, director at Kiteworks.

And Eric Schmidt’s prescient concern from 15 years ago echoes, amplified, in the new world of AI: “The Internet is the first thing that humanity has built that humanity doesn’t understand, the largest experiment in anarchy that we have ever had. Governments will not be able to keep up with the speed of innovation.” What was true of the internet in 2010 is even more true of artificial intelligence in 2025.

Conservative politics and the Chevron effect

On June 28, 2024, the Supreme Court struck down a legal principle known as the Chevron Doctrine, which for 40 years had given federal agencies deference over how to interpret laws relating to their own responsibilities. Striking down Chevron passed the advantage from the agencies to those they were pursuing: at worst, it allows defendant organizations to shop around for more favorable courts; at best, it reduces agency actions to a courtroom lottery.

We simply do not know how this will play out in 2025 under an administration that favors small government.

However, given the general perception that the new administration will seek to rein in perceived federal agency regulatory excesses, we can expect the unexpected. While still president-elect, Trump made his feelings clear – signaling upcoming federal de-regulation. “We will probably see less administrative rulemaking by the SEC, HHS, FTC and other federal agencies traditionally active in privacy and data protection spheres at the federal level,” suggests Ilia Kolochenko (CEO at ImmuniWeb, partner at Platt Law LLP, and an adjunct professor of Cyber Law & Cybersecurity).

“The death of the Chevron doctrine could result in challenges to the data privacy regulations employed by the FTC, the FCC, and the HHS,” agrees Dante Stella at the Dykema law firm. “If such challenges succeed, they would put more pressure on Washington to enact new sectorial laws – or one comprehensive data privacy law to rule them all.”

Kolochenko feels the pressure is more likely to find an outlet at the state level. “US states will retain a broad leeway to enact state laws on any matters, to the extent permitted by the Constitution. Therefore,” he says, “we should expect even more state laws on privacy, data protection and AI – making US-wide compliance a pretty burdensome and cost-prohibitive exercise.”

It is, of course, conjecture at this stage – but we can be sure that the Chevron effect and conservative politics will have a dramatic effect on the regulatory landscape in 2025 and subsequent years.

Cyber Incident Reporting for Critical Infrastructure Act (CIRCIA)

CIRCIA is a US federal law, enacted in 2022, requiring critical infrastructure bodies to report significant cyber incidents to CISA. CISA published a Notice of Proposed Rulemaking (NPRM) on April 4, 2024, with the final rule expected 18 months later. CIRCIA is consequently expected to come into force in late 2025.

However, it is worth noting that CIRCIA is potentially a target for conservative government trimming. David White, co-founder and president at Axio, comments, “In an effort to streamline and reduce regulation, I predict that the Trump administration will weaken or dismantle the SEC cyber disclosure rules, CIRCIA (Cyber Incident Reporting for Critical Infrastructure Act of 2022), or both.”

The EU Digital Operational Resilience Act (DORA)

DORA is designed to strengthen the operational resilience of the EU financial sector. It is expected to come into effect on January 17, 2025, and like other major EU regulations, it will impact non-EU organizations that deal with the EU.

Leigh Glasper, director, cyber advisory at BlueVoyant, comments, “Although DORA is specifically targeted at financial institutions, it includes a mandate for financial firms to only work with third parties that can also demonstrate they are compliant, due to the commonality of attack types that target the supply chain.”

If history repeats itself, we will see other nations following the EU lead in stricter financial sector regulations just as happened with privacy regulations following the introduction of the EU’s GDPR. However, the widespread political movement toward smaller government may also come into play.

“We may see an exodus of certain US tech companies from the European market because of unclear and relentlessly expanding laws and regulations,” cautions Kolochenko. “If the EU continues its current strategy of over-regulation in virtually all tech sectors, Europe may sooner or later find itself isolated from the rest of the world.” 

It gets worse. “Over-regulation is a global phenomenon that may provoke most companies to knowingly ignore and violate laws, since the payment of fines will be much less expensive than ensuring compliance.” He warns that the tension between laudable regulation and stifled innovation is real: “Ultimately giving foreign companies undue competitive advantage – let alone cybercrime groups who don’t care a fig about any laws. Laws and regulations are essential instruments to protect society and to safeguard human rights, but they must be enacted with care and scrupulous analysis of collateral effects on the global scale.”

The UK Cyber Security and Resilience Bill

The Cyber Security and Resilience Bill is expected to be presented to the UK Parliament during 2025. It is designed to strengthen cybersecurity within the UK, but will also apply to foreign companies operating within the UK. In some ways it can be seen as a UK version of the EU’s DORA. It is being introduced following major incidents, such as attacks against the NHS and critical infrastructure, and in the general belief that regulations inherited from the EU (prior to Brexit) are no longer adequate for the UK.

It will expand regulatory oversight to a wider range of services, will give regulators stronger enforcement powers, and will introduce stricter incident reporting requirements.

The effect of US State-level cybersecurity laws

The federal government in the US regularly swings between conservatives and liberals because the electoral margin between them is traditionally just a few percentage points. As a result, there is always a sizable proportion of individual states that do not agree with the basic politics of the national government. For this reason alone, the sovereignty granted by the Constitution to the individual states is jealously guarded, and different states develop their own regulations, inevitably colored by local political views regardless of those of the current federal administration.

This in turn means that cybersecurity in the US is subject to multiple separate state-level regulations. Matthew Hays of the Dykema law firm gives an example. “Political ideology tends to impact the focus of the regulation: for example, the California CCPA’s sweeping, parental and hand-holding approach compared with the Florida Digital Bill of Rights, which is laser-targeted on big tech.”

It is always dangerous to generalize, but liberals tend to believe in regulatory control, while conservatives prefer minimal government intrusion. “Political ideology can also create challenges, with Republican states typically leaning toward less regulation and Democratic states more toward consumer protection, data privacy and cybersecurity regulations,” says Gaurav Kapoor, co-CEO of MetricStream. “For an organization with business in multiple states, this can result in the need to accommodate vastly different compliance requirements in the same system – for example, if one state requires a different data breach notification time.”
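
To make Kapoor’s example concrete, here is a minimal sketch (in Python) of how a compliance system might track divergent state notification windows. The state codes and day counts are illustrative assumptions only, not actual statutory values.

```python
from datetime import datetime, timedelta

# Hypothetical notification windows in days. Real statutory deadlines vary
# by state and by incident type; treat every value here as a placeholder.
STATE_NOTIFICATION_DAYS = {
    "CA": 30,
    "FL": 30,
    "NY": 60,
    "TX": 60,
}

def notification_deadlines(discovered_at, affected_states):
    """Map each affected state to its (hypothetical) notification deadline."""
    return {
        state: discovered_at + timedelta(days=STATE_NOTIFICATION_DAYS[state])
        for state in affected_states
        if state in STATE_NOTIFICATION_DAYS
    }

# An incident affecting residents of three states yields three different
# deadlines; the binding one for planning purposes is the earliest.
deadlines = notification_deadlines(datetime(2025, 3, 1), ["CA", "NY", "TX"])
earliest = min(deadlines, key=deadlines.get)
print(f"Notify {earliest} first, by {deadlines[earliest]:%Y-%m-%d}")
```

Multiply this table by fifty states, several incident categories, and annual legislative change, and the maintenance burden Kapoor describes becomes clear.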

Current US politics provides a breeding ground for multiple unaligned state-level cybersecurity regulations. “The incoming administration has indicated a severe reduction in federal oversight and enforcement budgets,” continues Hays. “State legislators and attorneys general in blue states are taking the federal government’s new face as a challenge and will react strongly by passing more proscriptive consumer protection regulation or strictly enforcing existing regulation.”

Melissa Ventrone at Clark Hill Law agrees. “We expect more states will propose and enact comprehensive privacy laws in 2025 as part of their efforts to ensure companies protect consumer data. We will also see more states taking a closer look at when companies are using automated decision making (ADM) technology and when notice of such use is required.”

Boris Bohrer-Bilowitzki, CEO of Concordium, points out, “Several American states are bringing in their own cybersecurity laws, including Delaware, Nebraska, New Hampshire, New Jersey, Iowa, Maryland and Tennessee. It’s clear that, in 2025, American businesses will need to take a more proactive approach to cybersecurity and privacy, or they could face large fines.”

This nuanced conflict between conservative and liberal attitudes (we say ‘nuanced’ because cybersecurity is not in itself a partisan subject) will also apply to emerging AI regulations. “We do expect to see a lighter federal regulatory touch with regard to the development of AI – perhaps, including revocation or partial revocation of President Biden’s Executive Order on AI,” suggests Philip N Yannella, a partner at Blank Rome, LLP. “During the next Congress, we expect that any new federal AI laws will again have a light touch,” he adds.

Any tardiness at the federal level will be matched by equal haste at the state level. “State lawmakers introduced nearly 700 AI-related bills across 45 states in 2024 alone,” comments Bill Wright, global head of government affairs at Elastic. “Of these, 113 have been enacted. Each state’s approach reflects its unique priorities, but the resulting fragmentation risks stifling innovation while leaving gaps in security and oversight.”

The patchwork, fragmented nature of US cybersecurity regulations – divided between state and federal laws, exacerbated by new and little-understood technologies such as AI, and complicated by the traditional over-regulation emanating from the EU – will place a huge burden on business in general, and on security teams in particular, throughout 2025 and beyond.

“It’s always been messy, and it’s about to get messier,” says Dan Ortega, security strategist at Anomali. 

Effect on cybersecurity leaders and teams

There is a growing feeling that regulations are approaching their own singularity: a black hole of complexity from which reason cannot escape and in which business cannot continue. Conservative politics may begin to rein this in within the US, but there is little sign of restraint from the EU.

Echoing earlier comments from Kolochenko, Wright says: “Some may limit or withdraw their presence from entire states or markets where regulatory barriers are prohibitively high – particularly if those markets impose compliance costs or require resources that exceed returns. This translates to specific regions lagging when it comes to cybersecurity readiness and advancement.”

The alternative is either to limit the effort spent on regulatory compliance, or to increase internal resources to comply. For the former, continues Wright, “Organizations may choose to comply with the strictest regulations as a baseline, based on the assumption that this will satisfy all other requirements. While this could be perceived as pragmatic, this approach can significantly increase operational costs and create constraints that stifle innovation across all the markets they serve.”
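
The ‘strictest baseline’ strategy Wright describes can be sketched as a simple fold over per-jurisdiction requirements. The schema and values below are hypothetical, invented for illustration rather than drawn from any actual regulation.

```python
from dataclasses import dataclass
from typing import FrozenSet

# A toy model of per-jurisdiction requirements; fields and values are
# invented for illustration and do not reflect any real regulation.
@dataclass(frozen=True)
class Requirements:
    breach_notice_days: int                # smaller is stricter
    min_password_length: int               # larger is stricter
    mandatory_disclosures: FrozenSet[str]  # more items is stricter

def strictest_baseline(regimes):
    """Fold several regimes into one 'comply once, satisfy all' baseline."""
    return Requirements(
        breach_notice_days=min(r.breach_notice_days for r in regimes),
        min_password_length=max(r.min_password_length for r in regimes),
        mandatory_disclosures=frozenset().union(
            *(r.mandatory_disclosures for r in regimes)
        ),
    )

eu_style = Requirements(3, 12, frozenset({"regulator", "data_subjects"}))
state_style = Requirements(30, 8, frozenset({"attorney_general"}))
print(strictest_baseline([eu_style, state_style]))
```

The cost Wright warns about is visible even in this toy example: the merged baseline is stricter than any single market actually demands.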

The latter approach may simply be out of reach for all but the larger and better-resourced organizations. The threat of over-regulation impinging on the small-company innovation engine may come to a head in 2025.

Related: California Advances Unique Safety Regulations for AI Companies Despite Tech Firm Opposition

Related: The European Union’s World-First Artificial Intelligence Rules Are Officially Taking Effect

Related: Risk and Regulation: Preparing for the Era of Cybersecurity Compliance

Related: CISA Moving Forward With Cyber Incident Reporting Rules Impacting 316,000 Entities

Original Post URL: https://www.securityweek.com/cyber-insights-2025-cybersecurity-regulatory-mayhem/

Category & Tags: Government,Artificial Intelligence,Government Policy,Regulations – Government,Artificial Intelligence,Government Policy,Regulations
