How does the EU AI Act affect medicines R&D? The short answer: not much at all. That position stems from an exemption for scientific R&D in the AI Act, as explained in a recent statement from the European Federation of Pharmaceutical Industries and Associations (EFPIA). This will be welcome news for the pharmaceutical industry, but questions remain for the medical device industry, where the position at the research and development stage is far murkier. We’ve set out our top takeaways below.

  1. Exemption for R&D: AI systems developed and put into service for the sole purpose of scientific research and development are excluded from the scope of the EU AI Act (Articles 2(6) and 2(8)). EFPIA considers that this exemption applies to AI-based drug development tools used in the research and development of medicines.
  2. No high-risk use cases in medicines development: Even if the exemption did not apply, EFPIA highlights that most uses of AI software in medicines R&D are neither regulated under the legal frameworks listed in Annex II of the EU AI Act (including those for medical devices) nor caught by the Annex III high-risk uses. They are therefore not “high risk” under the AI Act. (If, however, an AI system qualifies as a medical device, or is intended to be used as a safety component of a medical device, it would be caught as a high-risk system under the Act.)
  3. Sufficient existing regulation for medicines R&D, with more guidance on the way: EFPIA’s view is that medicines development in Europe is already regulated closely enough to address the use of AI in developing medicines. What’s more, the European Medicines Agency plans to issue guidance later this year on the use of AI across the medicines lifecycle, and the European Medicines Regulatory Network is developing guidance and provisions for oversight. EFPIA stresses that the pharmaceutical industry needs “dynamic, flexible, future-proof guidance which takes into account the specifics of intended uses and context, and includes appropriate human oversight.”

This means the position is clear for medicines, but devices are another story, particularly at the clinical investigation stage. The difficulty flows from an inconsistency between the EU Medical Devices Regulation (MDR) and the AI Act in respect of AI as a medical device (AIaMD), i.e. an AI system that itself qualifies as a medical device, or that is intended to be used as a safety component of a medical device:

  • the MDR effectively mandates a clinical investigation of AIaMD before that device can be made available or put into service on the EU market, and it expressly permits such an investigation prior to conformity assessment by excluding investigational devices from the definitions of “making available”, “placing on the market” and “putting into service”; whereas
  • the EU AI Act deems AIaMD to be a “high-risk” system that requires conformity assessment and CE marking, but contains no provision equivalent to the MDR’s that permits clinical investigation prior to conformity assessment.

It’s unclear whether the exemption for scientific R&D is intended to help device manufacturers resolve this inconsistency. As things stand, there is an argument that a clinical investigation involving AIaMD prior to conformity assessment is required by the MDR yet in breach of the AI Act. The medtech industry awaits guidance from industry bodies and legislators as to how this issue will be resolved.

Authors

Julia Gillert is Of Counsel at Baker McKenzie's London office, and has shaped her practice to focus exclusively on regulatory matters affecting the Healthcare & Life Sciences industry.

Jaspreet Takhar is a senior associate in Baker McKenzie’s London office and advises market-leading tech and healthcare companies on issues at the cutting edge of digital health.