Conformity assessment under the AI Act: Latest Insights for Medical Devices

To some industries, ‘conformity assessment’ may be new; to others it is an established process. The AI Act classifies AI systems into different levels of risk, and those considered ‘high-risk’ per Annex II Section A or per Annex III will require conformity assessment and the affixing of a CE mark in order to access the European Union market.

For the industries listed in Annex II Section A (such as medical devices per MDR 2017/745 and in-vitro diagnostic medical devices per IVDR 2017/746), conformity assessment is not new, and these sectors already undergo sectoral conformity assessment procedures. For Annex III systems (biometric systems, critical infrastructure, education systems, employment systems, etc.), conformity assessment under ‘Product Legislation’ such as the AI Act may be entirely new.

For those organizations, the implementation of a Quality Management System, Technical Documentation, post-market monitoring and product labeling are new concepts that should not be taken lightly, especially since non-conformity with these requirements may render them liable for any damages caused under the new AI Liability Directive. Penalties under the AI Act for non-compliance with the requirements applicable to providers of high-risk AI may run up to 15 million Euro or 3% of the total worldwide annual turnover for the preceding financial year, whichever is higher.
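The ‘whichever is higher’ penalty cap above is a simple maximum of two figures. A minimal sketch of that arithmetic (the function name and inputs are our own, purely for illustration):

```python
def max_penalty_eur(worldwide_annual_turnover_eur: float) -> float:
    """Upper bound of the AI Act fine for non-compliance with the
    high-risk AI requirements: EUR 15 million or 3% of total worldwide
    annual turnover for the preceding financial year, whichever is higher."""
    FIXED_CAP_EUR = 15_000_000
    return max(FIXED_CAP_EUR, 0.03 * worldwide_annual_turnover_eur)

# For a provider with EUR 2 billion turnover, 3% (EUR 60 million)
# exceeds the fixed cap, so the higher figure applies.
print(max_penalty_eur(2_000_000_000))  # 60000000.0
```

In other words, the fixed cap only bites for providers whose turnover is below 500 million Euro; above that, the 3% figure governs.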

Conformity Assessment routes

In line with the above, it is critical for providers to understand whether their AI systems are considered high-risk, and if so, to ensure they follow the requirements set out for high-risk AI systems in Title III Chapters 2 & 3, irrespective of the conformity assessment route to be followed by the provider.

Image 1 below provides an overview of the conformity assessment routes described in the AI Act (Article 43). As shown, there are four different routes to obtain CE marking under the AI Act.

Image 1. ‘High-Risk AI’ conformity assessment routes under the AI Act. Note that these routes apply only to devices considered high-risk AI; e.g. Class I medical devices not undergoing Notified Body conformity assessment are not considered high-risk AI.

Conformity assessment routes explained

Route 1. Incorporation of the Notified Body assessment into the assessment under Sectoral law (assessment of the QMS + Technical Documentation)

This route applies to providers of AI systems who already undergo Notified Body conformity assessment under the sectoral legislation set out in Annex II Section A. These providers will be required to obtain a conformity assessment under their sectoral law, which is mandated to include an assessment against the requirements set out in the AI Act, performed by a Notified Body that has fulfilled the obligations set out in Article 43.3, second paragraph.

Route 2. Opt-out from Notified Body assessment

Developers under Annex II Section A who are capable of demonstrating compliance with their sectoral law using harmonized standards or common specifications are allowed to do so under the AI Act as well.

Route 3. Conformity assessment through internal control

This would apply to all AI systems covered under Annex III points 2 through 8, whose providers will need to demonstrate compliance with the AI Act through internal control and draw up the Declaration of Conformity per Annex V.

Route 4. Notified Body assessment under the AI Act (assessment of the QMS + Technical Documentation under the AI Act)

This would apply to AI systems (Annex III point 1) that are currently not governed by existing legislation and which cannot demonstrate compliance through harmonized standards or common specifications, or where the provider decides to involve a Notified Body to obtain CE certification. This would require the organization to follow Annex VII and Annex V (Declaration of Conformity) of the AI Act.
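The four routes above can be summarized as a simple decision procedure. The sketch below is an illustrative simplification of the route selection described above, not legal advice; the function and parameter names are our own:

```python
def conformity_route(annex_ii_section_a: bool,
                     annex_iii_point_1: bool,
                     harmonised_standards_applied: bool,
                     opts_for_notified_body: bool = False) -> str:
    """Simplified sketch of Article 43 route selection for high-risk AI.

    Assumes the system has already been classified as high-risk; real
    classification and route selection require reading the Act itself.
    """
    if annex_ii_section_a:
        # Sectoral products (e.g. MDR/IVDR devices) already subject to
        # Notified Body assessment under their sectoral legislation.
        if harmonised_standards_applied:
            return "Route 2: opt-out from Notified Body assessment"
        return "Route 1: AI Act assessment incorporated into the sectoral assessment"
    if annex_iii_point_1:
        # Biometric systems: Notified Body assessment (Annex VII) is needed
        # where harmonised standards/common specifications are not applied,
        # or where the provider voluntarily involves a Notified Body.
        if not harmonised_standards_applied or opts_for_notified_body:
            return "Route 4: Notified Body assessment under the AI Act (Annex VII)"
    # Annex III points 2-8 (and point 1 with harmonised standards applied).
    return "Route 3: internal control (Annex VI + Annex V declaration)"
```

For example, `conformity_route(annex_ii_section_a=True, annex_iii_point_1=False, harmonised_standards_applied=False)` yields Route 1, matching the medical-device scenario discussed throughout this article.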

Conformity assessment by a Notified Body (Annex VII)

When a Notified Body conformity assessment is required, per Route 4 described in the overview above, the Notified Body will be required to assess both the Quality Management System and the contents of the Technical Documentation. 

The developer will need to lodge an application with an accredited Notified Body, and consequently, the Notified Body will be required to assess the details of the Quality Management System and the Technical Documentation. 

Warning: 

All organizations (under either Route 1 or Route 4) that require Notified Body assessment shall make their full training, validation and testing datasets available to their Notified Bodies (per Annex VII 4.3). If such data is not anonymised, organizations will need to enter into data processing agreements with their Notified Bodies; moreover, if the data is not owned by the organization, they must ensure they have permission to share it with their Notified Bodies. Similarly, per 4.4 and 4.5 (also applicable to Routes 1 and 4), Notified Bodies may execute tests on AI systems as they deem necessary, and may request access to the trained models, including their relevant parameters.

Changes to the QMS (Annex VII, 3.4)

Of specific interest in Annex VII is the low threshold for interaction with a Notified Body after initial approval of the Quality Management System and of the product range it covers. As specified in Annex VII:

‘Any intended change to the approved quality management system or the list of AI Systems covered by the latter shall be brought to the attention of the Notified Body by the provider.’

Hopefully, the European Commission will clarify the exact threshold for change notifications to the Quality Management System, rather than requiring providers to inform the Notified Body of ‘any’ intended change to the system.

Changes to the AI system (Annex VII, 4.7)

Changes to AI systems shall be reported to the Notified Body where they can affect compliance with the requirements of the AI Act or the intended purpose of the AI system. Such changes shall be reported prior to the release of the updated version onto the market, and the Notified Body will decide whether a new conformity assessment procedure is required and whether a new CE certificate must be issued.

Continuous surveillance (Annex VII, 5)

The Notified Body is required to exercise surveillance over the approved Quality Management System and to execute periodic audits to ensure that the provider maintains and applies it.

Note that the Notified Body may carry out additional tests of the AI system for which a CE certificate was issued.

Applicability to Annex II Section A

The change reporting requirements described above apply solely to conformity assessments per Route 4 of the AI Act. For industries falling under Annex II Section A and following Route 1, these sections do not apply; the thresholds set out in the sectoral legislation should be followed instead.

Notified Bodies that audit manufacturers (providers under the AI Act) of Annex II Section A products will have to undergo a limited review by their relevant competent authorities, to ensure they have the right level of independence and in-house competence for assessing AI systems, before being able to audit these manufacturers.

There will be a three-year transition period for Notified Bodies to upgrade their systems, obtain the relevant expertise and extend the certification of devices under the sectoral law, e.g. medical devices and in-vitro diagnostic medical devices. This may seem like a long time, but practice tells us it is not.

Conformity Assessment through Internal Control (Annex VI)

First of all, the conformity assessment procedure per Annex VI is poorly described within the AI Act. Annex VI is divided into four points, where the first point (1) directly and only refers to points 2 to 4, and seems rather ‘pointless’.

In essence, this route merely requires organizations to assess whether their QMS and Technical Documentation comply with the requirements set out in Chapters 2 & 3 of Title III and the post-market monitoring requirements set out in Article 61.

Annex VI does not require any explicit documentation of such an internal assessment; however, it is strongly recommended to document the steps undertaken to review compliance. Learn more about how Matrix Requirements can help you on your journey to conformity with our QMS.

One strategy is to request an independent external organization to assess the level of compliance and to have an assessment report with documented evidence available. This is particularly relevant given the sensitivity of the use of AI, for example in employment processes, where insufficient fulfillment of the AI Act’s requirements opens organizations up to large penalties and potentially liability claims (e.g. from people who experience discrimination as a result of the use of AI).

Declaration of conformity (Annex V)

Commonly, Product Legislation explicitly refers to the act of declaring conformity against the regulation involved. The AI Act oddly does not clarify this need within Article 43, but rather demands it as part of the Technical Documentation. It is unclear whether this is intentional, or an omission from Article 43 that may need to be resolved prior to publication or in the future.

The Declaration of Conformity states the provider’s claim that its AI system complies with the requirements of the AI Act. Where the provider makes use of Harmonised Standards to evidence compliance (e.g. in Routes 2 and 3, though this might also apply to 1 and 4), conformity with such standards (or common specifications) must be declared on the Declaration of Conformity. For providers who are required to draw up a Declaration of Conformity under sectoral law, the requirements set out in Annex V shall be integrated into their existing Declarations of Conformity.

A final note worth mentioning, which introduces complexity (whether intentionally or not on the part of the European institutions), is the added requirement to claim conformity with GDPR 2016/679 if the AI system processes personal data. At first glance this may seem logical; however, the consequences of this addition are obscure to say the least. Inherently, the Declaration of Conformity forms part of the Technical Documentation, and non-compliance with the Technical Documentation requirements can render penalties under the AI Act. If the provider incorrectly claims compliance with GDPR 2016/679, they would not just be liable and subject to penalties under the GDPR, but in addition also under the AI Act and the AI Liability Directive.

Now that the final text is available to the public, there is finally clarity regarding the exact conformity assessment routes. The final text includes some serious risks for providers of AI systems to consider in relation to conformity assessment.

For example, poor conformity assessment may result in unknown gaps against the AI Act, especially if a third-party conformity assessment has not been executed. The law itself introduces penalty clauses, and third parties can claim damages once the AI Liability Directive is published. Organizations need to be cautious under the AI Act, as it may open them up to significant financial risks.

It is also concerning that a Notified Body may require access to the training, validation and testing data, and to the AI system itself, e.g. to execute such testing on that system as they deem necessary. For medical devices, it is questionable whether third parties who may own the data would agree to such distribution to a party other than the developer (the medical device manufacturer). If not, such training or testing data may no longer be useful for the manufacturer, since it cannot be used to demonstrate compliance.

Another risk is Notified Body capacity and availability. If harmonized standards are not available in due time before the end of the transition period (2 or 3 years after publication in the Official EU Journal), a multitude of providers will require Notified Body assessment per Route 1 or 4.

Finally, the Declaration of Conformity seems to have been poorly integrated into Article 43; however, it is still mandatory to issue one prior to entering the market. If an AI system also processes personal data, compliance of the AI system with GDPR 2016/679 must be claimed on top of compliance with the AI Act.

About the Author
Leon Doorn
Independent Consultant