MedTech Europe response to the open public consultation on the Proposal for an Artificial Intelligence Act (COM/2021/206)

Posted on 06.08.2021

MedTech Europe, the European trade association representing the medical technology industry, including diagnostics, medical devices and digital health, would like to provide its response to the European Commission’s open public consultation on the proposed Artificial Intelligence Act (AIA). Artificial Intelligence (AI) technology is increasingly used in healthcare and in recent years has greatly enhanced the workflows and decision-making processes of healthcare providers.

The medical technology industry would like to stress the importance of a robust regulatory framework that provides legal coherence, certainty and clarity to all actors. In particular, issues in interpreting the new rules for AI that comprises, or is incorporated in, a medical technology should be addressed. We call for particular attention to be paid to misalignments between provisions of the AI Act and the Medical Device Regulation (MDR)1 and In-Vitro Diagnostics Regulation (IVDR)2, as well as the General Data Protection Regulation (GDPR)3. Addressing these misalignments is essential to ensure the legal coherence, certainty and clarity needed to foster innovation, citizens’ access to quality care, and the competitiveness of industry.

MedTech Europe would like to point out that the proposed broad definition of AI, combined with the proposed risk classification, will result in virtually any medical device software (whether placed on the market or put into service as a stand-alone product or as a component of a hardware medical device) falling within the scope of the AI Act and being considered a high-risk AI system, since most medical device software requires a conformity assessment by a Notified Body.

Duplication and potential conflicts arising from misalignment between the AIA and existing obligations under MDR/IVDR must be avoided in order to ensure legal coherence, certainty and clarity. The sectoral regulations MDR/IVDR lay down some of the most stringent rules in the world on the safety and performance of medical technologies, including those medical technologies that comprise, or incorporate, AI. These include, for instance, dedicated rules on risk management, quality management, technical documentation, and conformity assessment with Notified Bodies. Obligations in the AIA are thematically similar to the requirements in MDR/IVDR but differ in their details, which may lead to complex interpretation issues. Although we acknowledge that duplication is not the Commission’s intended vision, there are concerns that the AIA would in effect require manufacturers to undertake duplicative certification and conformity assessment, via two Notified Bodies, and to maintain two sets of technical documentation, should the misalignments between the AIA and MDR/IVDR not be resolved. Duplication of this kind would lead to unnecessary overlaps in the regulatory approval of AI as, or in, medical technology, which could have a negative effect on the timely access of citizens and patients to highly innovative and fairly priced AI medical technology in the EU.

The AIA appears to provide a legal basis for processing certain categories of personal data. For instance, Article 10(5) states that providers of high-risk AI systems “may process special categories of personal data referred to in Article 9(1)” of the GDPR, where doing so is “strictly necessary for the purposes of ensuring bias monitoring, detection, and correction,” subject to the additional safeguards set out in that paragraph. While MedTech Europe welcomes this positive development, the AIA nevertheless does not provide a sufficient legal basis for processing personal data in general under the GDPR (as stated in Recital 41). As such, providers of medical technology are likely to face numerous challenges in ensuring that the steps they take to comply with the AIA do not conflict with their obligations under the GDPR.

The full response to the consultation is below.