The EU ended 2025 with a legislative blitz – AI, cybersecurity, data use, data protection, and medical devices were all subject to proposals promising to shake up the status quo. On paper, it looks like progress: extended grace periods for high-risk AI, lighter admin under the AI Act, and clearer personal data rules. But scratch the surface, and the story changes. Coordination between proposals? Missing. Clarity? Questionable. And the biggest headache? Conflicting proposals on whether medical device AI (MDAI) is in or out of AI Act conformity assessment.

2026 will be the stress test for the EU’s promise to “simplify.” We expect big changes as Parliament and Council scrutinise the Commission’s proposals.

The Big Three: What Was Published?

  • Digital Omnibus on AI: amendments to the AI Act (available here).
  • Digital Omnibus: changes to data protection, data use, and cybersecurity through amendments to the GDPR, Data Act, NIS 2, and ePrivacy Directive (available here).
  • MDR / IVDR Proposal: a bid to streamline medical device rules (read our blog post here for full details).

Here’s what pharma and medtech need to know.

1. Medical Device AI: In or Out?

Manufacturers are stuck in limbo. Do you prepare for AI Act conformity assessment, or gamble that you won’t need it? Disappointingly, the Digital Omnibus on AI and the MDR / IVDR Proposal take wildly different positions:

  • Digital Omnibus on AI: Keeps MDAI in scope as high-risk AI, but the good news is that AI Act and MDR / IVDR compliance would be assessed through a single, integrated conformity assessment under the MDR / IVDR. However, manufacturers would still need a gap analysis for AI Act-specific obligations, such as automatic log generation, training data quality, and human oversight measures.
  • MDR / IVDR Proposal: This swings the other way, and MDAI is out of scope for the AI Act conformity assessment process. This is facilitated by an amendment to the AI Act so that the MDR and IVDR would be placed in Section B of Annex I, instead of the current Section A of Annex I. What’s the impact of this change? If the MDR and IVDR are in Section B, MDAI manufacturers only need to note a few token obligations under the AI Act for high-risk AI:
    • Article 6(1): Identifies MDAI as high-risk but imposes no obligations.
    • Articles 102–109: Amendments to existing EU legislation.
    • Article 112: Commission power to redefine high-risk categories.

If the MDR / IVDR Proposal wins out, the EU would be ‘righting a wrong’ – medical devices should never have been within scope of the EU AI Act in the first place. It’s true that the AI Act addresses AI-specific issues such as quality of training data and bias more expressly than the MDR / IVDR. But the existing MDR and IVDR have the broad infrastructure in place to address many AI-specific concerns that apply to medical devices through targeted and specific guidance for MDAI. Product-agnostic legislation like the AI Act was never the answer. But for now? Manufacturers must plan for every scenario. Welcome to regulatory roulette.

2. Extended Timelines: Grace or Guesswork?

If MDAI stays in scope for AI Act conformity assessment, there is a saving grace – the Digital Omnibus on AI extends grace periods for high-risk AI systems. Compliance obligations would no longer apply from fixed dates (2 August 2026 for Annex III systems; 2 August 2027 for Annex I systems). Instead, grace periods would hinge on when the Commission finalises the necessary infrastructure (standards and common specifications):

  • Annex III AI systems: 6 months after infrastructure is ready, with a long-stop date of 2 December 2027.
  • Annex I AI systems (like MDAI): 12 months after infrastructure is ready, with a long-stop date of 2 August 2028.
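For planning purposes, the proposed timing rule – grace period ends the earlier of "X months after infrastructure is ready" and the long-stop date – can be sketched as a simple date calculation. This is purely illustrative; the infrastructure-ready dates used below are placeholder assumptions, not predictions.

```python
from datetime import date

def go_live(infrastructure_ready: date, grace_months: int, long_stop: date) -> date:
    """Earlier of (infrastructure-ready date + grace period) and the long-stop date."""
    # Naive month addition (adequate for a planning sketch with day-of-month 1).
    years, months = divmod(infrastructure_ready.month - 1 + grace_months, 12)
    candidate = infrastructure_ready.replace(
        year=infrastructure_ready.year + years, month=months + 1
    )
    return min(candidate, long_stop)

# Annex I systems (like MDAI): 12-month grace, long-stop 2 August 2028.
# If standards were ready on 1 March 2027 (assumed), obligations bite 1 March 2028;
# if they slipped to 1 March 2028, the long-stop of 2 August 2028 applies instead.
early_scenario = go_live(date(2027, 3, 1), 12, date(2028, 8, 2))
late_scenario = go_live(date(2028, 3, 1), 12, date(2028, 8, 2))
```

Either way, the go-live date cannot be known today, which is precisely the planning problem for industry.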

Unfortunately, this places industry’s compliance planning in limbo – will compliance obligations go live in 2027 or 2028? Industry will need to plan for all scenarios.

One further change: if a manufacturer places one unit of a high-risk AI system on the EU market before the above cut-off, they may keep supplying units of the same type and model without AI Act certification, as long as the design of the high-risk AI system remains unchanged. What does this mean? First-to-market may be a compliance strategy under the Digital Omnibus on AI.

3. Notified Bodies: Bottleneck Busting

The EU finally addressed the elephant in the room – designating notified bodies is painfully slow. Fixes include:

  • A single application and assessment for MDR / IVDR and AI Act designation for notified bodies.
  • Existing MDR / IVDR notified bodies must apply for AI Act designation within 18 months of the Digital Omnibus on AI entering into force.
  • New classification codes for AI systems to ensure competence.

4. Admin Relief: AI Literacy & Registration

AI literacy would no longer be a requirement for deployers and providers. Instead, the Commission and Member States should “encourage” a sufficient level of AI literacy amongst staff of deployers and providers. The Digital Omnibus on AI also removes the obligation to register AI systems in the EU database where providers determine that their systems are not high-risk under Article 6(3) because they perform only narrow procedural or preparatory tasks.

5. Real-World Testing: Gap Plugged

Pre-market clinical investigations of MDAI are facilitated under the MDR, but it’s not clear that these are compliant with the AI Act. The Digital Omnibus on AI addresses this by expanding the scope of ‘real-world testing’ to cover Annex I product legislation like the MDR and IVDR (rather than only Annex III systems, as is currently the case). This plugs an otherwise ugly gap in the AI Act for medical devices.

6. Personal Data: A Relative Concept

The Commission proposed an addition to the definition of ‘personal data’ under the GDPR: “Information relating to a natural person is not necessarily personal data for every other person or entity, merely because another entity can identify that natural person. Information shall not be personal for a given entity where that entity cannot identify the natural person to whom the information relates, taking into account the means reasonably likely to be used by that entity. Such information does not become personal for that entity merely because a potential subsequent recipient has means reasonably likely to be used to identify the natural person to whom the information relates”.

At first glance, this codifies CJEU case law and could unlock coded clinical data for research. The proposal centres on the concept of identifiability, i.e. the idea that in one party’s hands, data may be coded (or pseudonymised). However, if a third party receives such coded data but does not have the key to re-identify individuals, this coded data is not personal data in that third party’s hands, unless that third party has the “means reasonably likely to be used” to identify individuals.
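The key-holder/recipient split at the heart of this identifiability test can be sketched in code. This is illustrative only – the HMAC-based coding scheme, field names, and record layout are assumptions for the sketch, not anything prescribed by the GDPR or the proposal:

```python
import hmac
import hashlib

def pseudonymise(patient_id: str, secret_key: bytes) -> str:
    """Derive a stable coded identifier; only the key holder can link it back."""
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

# The controller (e.g. a healthcare provider) holds the key and the linking table.
key = b"controller-only-secret"
record = {"patient_id": "NHS-1234567", "hba1c": 48}

coded_record = {
    "subject_code": pseudonymise(record["patient_id"], key),
    "hba1c": record["hba1c"],
}

# A recipient sees only coded_record. Without the key (or linking table),
# it arguably lacks the "means reasonably likely to be used" to re-identify
# the patient – and under the proposal, the data may not be personal data
# in that recipient's hands at all.
```

In the controller’s hands the key makes the data personal; in a keyless recipient’s hands, under the proposal, it may not be.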

But the devil is in the details. The use of the term “entity” instead of the GDPR concepts of “controller” and “processor” creates uncertainty. The proposal may allow processors to argue that they fall outside the GDPR if they lack re-identification means, undermining the GDPR’s core controller/processor framework.

Example: If a device manufacturer acting as processor receives coded data from a healthcare provider (controller), it may argue that local laws on medical secrecy or patient confidentiality (or even contractual restrictions) prevent it from re-identifying the individual. That processor may argue that the coded data falls outside the GDPR altogether, as it does not have “the means reasonably likely to be used” to reidentify an individual.

Unsurprisingly, this provision has been subject to pushback from privacy advocates like NOYB.

7. Scientific Research Defined

The Digital Omnibus defines “scientific research” under the GDPR to include innovation (including technological development and demonstration), and notes that it may include commercial interests. It emphasises that scientific research is carried out with “the aim of contributing to the growth of society's general knowledge and wellbeing” and should “adhere to ethical standards in the relevant research area” – these are core tenets of traditional research for pharma and medtech.

A new Article 13(5) takes this further – controllers are exempt from informing data subjects of processing for scientific research purposes where: (i) they collected data directly from subjects; and (ii) this “proves impossible or would involve a disproportionate effort”. Where sponsors are unable to provide data subjects with transparency information as they access only pseudonymised information and are unable to contact patients, this may prove useful.

8. Processing of Special Category Data for AI: The Non-Exemption

The Digital Omnibus on AI dangles an exemption for incidental processing of special category data during AI development and operation, but then guts it with conditions that make it almost unusable. The conditions attached to the exemption are why it may amount to “regulatory theatre”:

  • Avoidance Mandate: Controllers must put in place organisational and technical measures to “avoid the collection and otherwise processing of special categories of personal data”. This negates the exemption – if you have to avoid special category processing in the first place, what’s the point of an exemption?
  • Mandatory Removal: “Where, despite the implementation of such measures, the controller identifies special categories of personal data in the datasets used for training, testing or validation or in the AI system or AI model, the controller shall remove such data.” Again, is this an exemption if the controller can’t keep the data?
  • Disproportionate Effort Clause: If removal is “disproportionate,” the controller must prevent the data from being used to produce outputs and from being disclosed or otherwise made available to third parties.

In short, this is an exemption that isn’t, and amendments will be needed to make this work.

9. Pseudonymisation vs Anonymisation: Implementing Acts

The Digital Omnibus empowers the Commission, together with the EDPB, to issue guidance on this issue. This includes technical criteria, state-of-the-art methods and risk-based indicators for assessing re-identification. While this may sound helpful, there is overlap with similar guidance expected under the European Health Data Space, which could easily create conflicting standards. This is a real concern for pharma and medtech, who already face differing interpretations of this issue across data protection authorities.

10. DSARs: A Weapon No More, or Just Sharpened Differently?

Data Subject Access Requests (DSARs) have become a favourite tactical weapon, especially in HR disputes and disputes with healthcare professionals (HCPs), where employees and HCPs use them to dig for evidence in grievance or litigation scenarios. Current GDPR exemptions for “manifestly unfounded or excessive” requests have been interpreted so narrowly that controllers get little relief.

Enter the Digital Omnibus: where the data subject “abuses” this right for purposes other than the protection of their data, the controller may either:

  • charge a reasonable fee based on admin costs; or
  • refuse the request outright.

Sounds great at first glance, but the burden is on controllers to prove abuse, which means diving into the murky waters of a data subject’s intent. Was the DSAR really about data protection, or a grievance strategy? Drawing that line may prove messy, subjective, and a litigation magnet. Controllers might take a bullish stance and refuse, but expect challenges in court.

This change looks like a win for controllers, but in practice, it’s a legal grey zone that could create more headaches than it solves.

Bottom Line

The EU’s digital proposals promise clarity but deliver complexity. For pharma and medtech, 2026 is about navigating uncertainty, hedging bets, and staying agile. The question isn’t if these rules will change – it’s how fast you can adapt when they do.

Author

Jaspreet Takhar is a Counsel in Baker McKenzie's London office and advises market-leading tech and healthcare companies on issues at the cutting-edge of digital health.

Author

Julia Gillert is Of Counsel at Baker McKenzie's London office, and has shaped her practice to focus exclusively on regulatory matters affecting the Healthcare & Life Sciences industry.