SafeAI Pro

AI Conformity Assessment

By Katya Kamlovskaya

Do you think AI systems should be subject to rigorous conformity assessment? The EU AI Act makes it a cornerstone of its regulatory framework, particularly for applications deemed to carry significant risks. This article summarises key insights from the AI Standards Hub webinar last week - it was a very informative session!

What is conformity assessment?

Conformity assessment is about checking (and proving) that something measures up to a set of predefined standards. It is not a new concept: it has been widely used across industries to ensure quality, safety, and compliance. But with the rise of AI, especially in high-stakes applications like medical devices, the need for robust conformity assessment has become even more critical, and conformity assessment bodies now play a key role in this process, as they have to adapt to the unique challenges posed by AI systems.

How is it usually done?

Conformity assessment involves a range of activities designed to verify that a product, service, process, system, or organisation meets specified requirements, standards, or regulations. Here’s an elaboration on each of the common methods:

  1. Testing: Testing involves subjecting a product, system, or component to specific procedures to determine its characteristics or performance against predefined criteria. For AI systems, testing could involve performance testing, bias and fairness testing, or security and safety testing. The aim is to generate objective data about the attributes and behaviour of the AI system, proving that it meets technical specifications and functional requirements.

  2. Inspection: Inspection is the examination of a product, process, service, or installation to determine its conformity with specific requirements. Unlike testing, which often involves active manipulation or input, inspection often involves a more visual, systematic check of features, documentation, or procedures. For AI products and processes, inspection might involve documentation review or process/lifecycle check.

  3. Certification: This is a third-party attestation related to products, processes, systems, or persons. An independent certification body, after conducting an assessment (which may include testing and inspection), issues a certificate stating that the object of conformity assessment meets specified requirements.

  4. Supplier’s Declaration of Conformity (SDoC): This is a first-party conformity assessment where the supplier (manufacturer, importer, or responsible party) themselves declares that their product, process, or service conforms to specified requirements.

  5. Management system assessment (like ISO 42001 for AI management systems): This type of assessment focuses on an organisation’s management system rather than a specific product. It verifies that an organisation has put in place a systematic approach (a “management system”) to achieve certain objectives, such as quality, environmental performance, or in our case, responsible AI development and deployment. For AI, this would primarily involve auditing against ISO/IEC 42001:2023 - Artificial Intelligence Management System (AIMS): an independent auditor assesses the organisation’s processes related to AI system lifecycle, including data governance, risk management, ethical considerations, transparency, human oversight, and accountability, and checks for mechanisms that ensure continuous improvement of the AIMS.

  6. Accreditation:

    Accreditation is the formal recognition by an independent, authoritative body (an “accreditation body”) that a conformity assessment body (like a testing laboratory, inspection body, or certification body) is competent, impartial, and operates consistently according to international standards (e.g., ISO/IEC 17025 for testing labs, ISO/IEC 17020 for inspection bodies, ISO/IEC 17021 for certification bodies). An accreditation body (e.g., ENAC in Spain, UKAS in the UK) thoroughly assesses a conformity assessment body’s technical competence, management systems, impartiality, and consistent operation. Accreditation provides an additional layer of confidence and credibility to the entire conformity assessment system. When a product is certified by an accredited certification body, or tested by an accredited laboratory, it signifies that the body performing the assessment itself has been rigorously evaluated and deemed competent.
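The bias and fairness testing mentioned in item 1 can be made concrete with a small sketch. Below is a minimal, hypothetical example that computes the demographic parity difference (the gap in positive-prediction rates between groups) for a model's outputs and compares it to a tolerance. The function name, toy data, and the 0.2 threshold are all illustrative assumptions, not values taken from any standard or regulation.

```python
# Minimal sketch of one fairness test: demographic parity difference.
# All names, data, and the tolerance below are illustrative assumptions,
# not requirements from any standard or regulation.

def demographic_parity_difference(predictions, groups):
    """Absolute gap in positive-prediction rates between the groups."""
    rates = {}
    for pred, group in zip(predictions, groups):
        n, pos = rates.get(group, (0, 0))
        rates[group] = (n + 1, pos + (1 if pred == 1 else 0))
    positive_rates = [pos / n for n, pos in rates.values()]
    return max(positive_rates) - min(positive_rates)

# Toy example: binary model predictions for applicants in two groups.
preds  = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

dpd = demographic_parity_difference(preds, groups)
print(f"Demographic parity difference: {dpd:.2f}")  # 0.75 - 0.25 = 0.50

tolerance = 0.2  # illustrative acceptance threshold, not from any standard
print("PASS" if dpd <= tolerance else "FAIL")       # prints "FAIL" here
```

A real test suite would of course cover more metrics (equalised odds, calibration, robustness under distribution shift) and document the chosen thresholds, since it is exactly this evidence that a conformity assessment body would review.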

Why is AI conformity assessment important?

The EU AI Act mandates conformity assessment for high-risk AI systems before they are placed on the market or put into service. This process ensures these systems meet the Act’s stringent requirements.

For most high-risk AI systems (listed in Annex III), providers can generally perform an internal conformity assessment (self-assessment), assuming they apply relevant harmonised standards. However, for certain critical high-risk AI systems (e.g., biometric systems used in law enforcement, or where no harmonised standards are used), a third-party conformity assessment by a designated “Notified Body” is mandatory.

The conformity assessment process under the AI Act involves demonstrating compliance with requirements such as risk management, data and data governance, technical documentation, record-keeping, transparency and provision of information to users, human oversight, and accuracy, robustness and cybersecurity.

Upon successful completion of the conformity assessment, the provider issues an EU Declaration of Conformity and, for high-risk systems, is required to affix the CE marking.

Of course, conformity assessment is important not only for regulatory compliance - it ensures safety, mitigates risks, builds trust, fosters greater accountability, promotes a culture of continuous improvement, and even facilitates international trade by creating globally recognised standards and conformity assessment schemes.

What is the role of the World Trade Organisation Committee on Technical Barriers to Trade (TBT)?

The World Trade Organisation (WTO) Committee on Technical Barriers to Trade (TBT) keeps international trade flowing while protecting legitimate objectives such as human health or safety, animal or plant life or health, the environment, and national security requirements. It does this by ensuring that technical regulations, standards, and conformity assessment procedures do not create unnecessary obstacles to trade. Topics it works on include regulatory impact assessments, medical device regulation, and critical emerging technologies such as biofuels, clean energy generation and storage, semiconductors - and, of course, AI. In 2024, the TBT Committee adopted guidelines (G/TBT/54) to support regulators in the choice and design of conformity assessment procedures, ensuring that measures to verify compliance with technical regulations and standards do not create unnecessary obstacles to international trade.

What about national regulators?

National regulators in WTO member countries develop and enact regulations, design and implement conformity assessment procedures, and carry out market surveillance. In Luxembourg, for example, we have the Luxembourg Institute of Standardisation, Accreditation, Safety and Quality of Products and Services (ILNAS), but also the Luxembourg Agency for Medicines and Health Products (ALMPS), which regulates medical devices, and the National Commission for Data Protection (CNPD), which covers AI systems and AI Act compliance. Australia, in turn, has a shared responsibility model for product safety regulation involving federal, state, and territory governments: the Australian Competition and Consumer Commission (ACCC) oversees general consumer product safety, the Department of Infrastructure, Transport, Regional Development, Communications and the Arts is responsible for vehicle safety, and the Australian Communications and Media Authority (ACMA) regulates communications products and services. Such regulators use WTO TBT guidelines to ensure their regulations and conformity assessment procedures are effective and facilitate trade while protecting public interests.

What is the ISO Conformity Assessment (CASCO) toolbox?

It is a comprehensive collection of international standards and guides developed by the ISO Committee on Conformity Assessment (CASCO). It can be used by assessment and accreditation bodies, as well as by businesses - to understand how to comply with requirements, select competent conformity assessment providers, and even implement their own internal assessment processes.

What are conformity assessment schemes?

The ISO Committee on Conformity Assessment has existed for a long time - but it now has a new task: developing conformity assessment schemes - sets of requirements and procedures that help conformity assessment bodies certify a system or a product against a specific standard. We see such schemes applied in the wild every day: just look at the packaging of a household appliance or an electrical device you own - there will be a logo of some certification scheme, which means the product has passed tests to ensure it is safe and will perform as expected. In other words, the committee will develop such certification schemes to check AI products and systems and allow their providers to label them as “certified” (that is, safe and suitable for their intended use).

Developing such conformity assessment schemes for AI is challenging for a number of reasons: AI systems can continue to learn and change after deployment, their behaviour depends heavily on data and the context of use, and the supporting standards landscape is still maturing.

All this necessitates the development of new, more dynamic and adaptive conformity assessment methods and frameworks. This is what the working group is currently doing. The aim is to publish the schemes in May 2027, so there is plenty of time for experts to get involved in drafting.

Will there be more support for certification bodies and auditors?

Yes! A new document, currently under publication, will soon provide more support to certification bodies and auditors: ISO/IEC 42006 - “Information technology — Artificial intelligence — Requirements for bodies providing audit and certification of artificial intelligence management systems”. The new standard will ensure that certification bodies operate with the competence and rigour necessary to assess organisations developing, deploying or offering AI systems. AI systems present unique challenges in areas like ethics, data quality, risk, and transparency. To certify that an organisation responsibly manages these challenges, auditors themselves need specialised knowledge and clear rules for conducting assessments.


Conformity assessment is not merely a bureaucratic hurdle but a fundamental pillar for the responsible and trustworthy development and deployment of AI, especially for high-stakes applications like AI-enabled medical devices. It provides the essential mechanisms to verify that AI systems meet predefined standards for safety, performance, ethics, and regulatory compliance.

Join your national standards body: https://www.iso.org/about/members


Image: Kittens and Cats; a Book of Tales, by Eulalie Osgood Grover; 1911; Boston, Houghton Mifflin. — Source.