
EU AI Act Compliance Checklist 2026: Complete Step-by-Step Guide


This EU AI Act compliance checklist guides organizations through the mandatory requirements ahead of the August 2, 2026 deadline. It covers risk classification, technical documentation, quality management systems, post-market monitoring, and compliance verification for high-risk AI systems.


Phase 1: Pre-Compliance Assessment (Months 1-3)

Before implementing compliance measures, organizations must understand their AI landscape and regulatory obligations. This phase establishes the foundation for all subsequent compliance activities.

1.1 Organizational Readiness Assessment

All items in this phase are high priority:

  • ☐ Establish an AI Act compliance team and assign responsibilities
  • ☐ Allocate budget for compliance activities (documentation, tools, training)
  • ☐ Review the EU AI Act (Regulation (EU) 2024/1689) and understand its requirements
  • ☐ Identify key stakeholders (legal, technical, compliance, risk management)
  • ☐ Create a compliance project timeline with milestones leading to August 2, 2026

1.2 AI Systems Inventory

  • ☐ Create a complete inventory of all AI systems in use or in development
  • ☐ Document AI system purposes, use cases, and deployment environments
  • ☐ Identify AI systems procured from third parties versus developed in-house
  • ☐ Document AI systems used by subsidiaries or partners in the EU market

Phase 2: Risk Classification and Inventory (Months 2-4)

Organizations must classify each AI system according to the EU AI Act's risk-based framework. This classification determines which compliance requirements apply.

2.1 Risk Classification Checklist

  • ☐ Identify prohibited AI practices, such as social scoring and manipulative techniques (Article 5)
  • ☐ Check whether AI systems fall under the high-risk use cases of Annex III (Article 6, Annex III)
  • ☐ Determine whether AI systems are used in products subject to EU harmonization legislation (Article 6(2))
  • ☐ Classify limited-risk AI systems subject to transparency obligations (Article 50)
  • ☐ Document risk classification decisions with their rationale
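The classification steps above can be sketched as a simple triage helper. This is an illustrative simplification under assumed flags, not a substitute for the legal analysis the Act requires; the practice and use-area categories below are assumptions for the sketch.

```python
from dataclasses import dataclass, field

# Illustrative (non-exhaustive) categories; the real analysis follows
# Article 5, Article 6 / Annex III, and Article 50 of the AI Act.
PROHIBITED_PRACTICES = {"social_scoring", "subliminal_manipulation"}
ANNEX_III_AREAS = {"biometrics", "employment", "education", "law_enforcement"}

@dataclass
class AISystem:
    name: str
    practices: set = field(default_factory=set)   # practices the system performs
    use_areas: set = field(default_factory=set)   # Annex III areas it touches
    interacts_with_humans: bool = False           # triggers Article 50 transparency

def classify(system: AISystem) -> str:
    if system.practices & PROHIBITED_PRACTICES:
        return "prohibited"      # Article 5: may not be placed on the market
    if system.use_areas & ANNEX_III_AREAS:
        return "high-risk"       # Articles 8-15 and 17 obligations apply
    if system.interacts_with_humans:
        return "limited-risk"    # Article 50 transparency obligations
    return "minimal-risk"

print(classify(AISystem("support-bot", interacts_with_humans=True)))  # limited-risk
print(classify(AISystem("cv-screener", use_areas={"employment"})))    # high-risk
```

Recording the matched rule alongside each decision gives you the documented rationale the last checklist item asks for.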

Phase 3: Mandatory Requirements Implementation (Months 4-12)

High-risk AI systems must comply with eight mandatory requirements under Articles 9-15 and Article 17. This phase covers the implementation of each requirement.

3.1 Risk Management System (Article 9)

  • ☐ Establish a continuous risk identification and evaluation process
  • ☐ Implement risk mitigation measures for identified risks
  • ☐ Document risk management activities and decisions
  • ☐ Update the risk management system throughout the AI system lifecycle

3.2 Data Governance (Article 10)

  • ☐ Ensure training, validation, and testing data is relevant and representative
  • ☐ Implement bias detection and mitigation for datasets
  • ☐ Document data collection, preparation, and annotation processes
  • ☐ Establish data quality management procedures
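As one concrete example of a dataset bias check, the sketch below computes the demographic parity difference: the gap in positive-label rates between two groups. A real data governance process would use a dedicated fairness toolkit and several metrics; the records here are fabricated for illustration.

```python
# Demographic parity difference: |P(label=1 | group A) - P(label=1 | group B)|.
# A large gap flags the dataset for review; the threshold is a policy choice.

def positive_rate(records, group):
    members = [r for r in records if r["group"] == group]
    return sum(r["label"] for r in members) / len(members)

def demographic_parity_diff(records, group_a, group_b):
    return abs(positive_rate(records, group_a) - positive_rate(records, group_b))

data = [
    {"group": "A", "label": 1}, {"group": "A", "label": 1},
    {"group": "A", "label": 0}, {"group": "A", "label": 1},
    {"group": "B", "label": 1}, {"group": "B", "label": 0},
    {"group": "B", "label": 0}, {"group": "B", "label": 0},
]

gap = demographic_parity_diff(data, "A", "B")
print(f"parity gap: {gap:.2f}")  # 0.75 - 0.25 = 0.50
```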

3.3 Technical Documentation (Article 11)

Each element below is required under Annex IV:

  • ☐ General description of the AI system and its intended purpose
  • ☐ System architecture and design specifications
  • ☐ Training methodologies and datasets used
  • ☐ Performance metrics and testing results
  • ☐ Risk assessment and mitigation measures
  • ☐ Generate an AI-BOM (AI Bill of Materials) compliant with SPDX 3.0
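An AI-BOM entry can be emitted as machine-readable JSON. The sketch below builds a minimal SPDX 3.0-style JSON-LD document; the type and field names follow the SPDX 3.0 model and its AI profile as an assumption, so verify the exact layout against the SPDX 3.0 specification before relying on it.

```python
import json

def make_ai_bom(model_name: str, version: str, dataset_names: list) -> dict:
    """Assemble a minimal, illustrative SPDX 3.0-style AI-BOM document."""
    return {
        "@context": "https://spdx.org/rdf/3.0.1/spdx-context.jsonld",
        "@graph": [
            {   # the AI model/package itself
                "type": "ai_AIPackage",
                "spdxId": f"urn:spdx:{model_name}-{version}",
                "name": model_name,
                "packageVersion": version,
            },
            *[  # one entry per training/validation dataset
                {
                    "type": "dataset_DatasetPackage",
                    "spdxId": f"urn:spdx:dataset-{d}",
                    "name": d,
                }
                for d in dataset_names
            ],
        ],
    }

bom = make_ai_bom("fraud-detector", "1.2.0", ["transactions-2024"])
print(json.dumps(bom, indent=2))
```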

3.4 Record Keeping (Article 12)

  • ☐ Implement automated logging system for AI system operations
  • ☐ Ensure logs capture input data, output data, and system behavior
  • ☐ Establish log retention policies compliant with Article 12 requirements
  • ☐ Enable log access for auditing and regulatory inspections
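The logging items above can be sketched with a thin wrapper that records each prediction as structured JSON with a timestamp, input, and output. This is a minimal illustration; retention policies, access control, and tamper protection (also required by the checklist) need separate controls, and the model function here is a stand-in.

```python
import json
import logging
from datetime import datetime, timezone

# Emit one machine-readable audit record per prediction.
logger = logging.getLogger("ai_audit")
logger.setLevel(logging.INFO)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(message)s"))
logger.addHandler(handler)

def logged_predict(model_fn, input_data):
    output = model_fn(input_data)
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input": input_data,
        "output": output,
    }
    logger.info(json.dumps(record))  # route to durable, access-controlled storage
    return output

# Stand-in model for illustration.
result = logged_predict(lambda x: {"score": 0.87}, {"applicant_id": "A-102"})
```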

3.5 Transparency and User Information (Article 13)

  • ☐ Provide clear information to users about AI system capabilities and limitations
  • ☐ Inform users that they are interacting with an AI system
  • ☐ Disclose AI system purpose, expected level of accuracy, and potential risks
  • ☐ Ensure transparency requirements are met before system deployment

3.6 Human Oversight (Article 14)

  • ☐ Design appropriate human oversight mechanisms for high-risk AI systems
  • ☐ Enable human intervention during system operation when needed
  • ☐ Train human overseers on AI system capabilities and limitations
  • ☐ Document human oversight procedures and decision-making processes

3.7 Accuracy, Robustness, and Cybersecurity (Article 15)

  • ☐ Design AI systems to achieve appropriate levels of accuracy
  • ☐ Implement robustness testing against adversarial attacks
  • ☐ Establish cybersecurity measures appropriate to system risks
  • ☐ Document accuracy metrics and security measures implemented

3.8 Quality Management System (Article 17)

  • ☐ Establish a Quality Management System (QMS) aligned with a recognized standard (e.g., ISO/IEC 42001 or, for medical-device AI, ISO 13485)
  • ☐ Document QMS procedures for design, development, and production
  • ☐ Implement QMS processes for quality control and quality assurance
  • ☐ Conduct regular QMS audits and reviews

Phase 4: Documentation and Compliance Verification (Months 10-15)

Before the August 2, 2026 deadline, organizations must complete all documentation and verify compliance with all mandatory requirements.

4.1 Compliance Documentation Checklist

  • ☐ Complete technical documentation per Article 11 and Annex IV
  • ☐ Prepare EU Declaration of Conformity (Annex V)
  • ☐ Register high-risk AI systems in the EU database (Article 49)
  • ☐ Complete conformity assessment procedures where required
  • ☐ Document compliance with all Articles 8-15 requirements
  • ☐ Prepare audit trail documentation for regulatory inspections

4.2 Compliance Verification

  • ☐ Conduct internal compliance audit against all mandatory requirements
  • ☐ Verify QMS implementation and effectiveness
  • ☐ Test risk management system functionality
  • ☐ Validate technical documentation completeness and accuracy
  • ☐ Review post-market monitoring system readiness
  • ☐ Document any compliance gaps and remediation plans
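A toy gap analysis over the verification items above: compare the articles a high-risk system must satisfy against those with verified evidence, and list what remains. The article set and evidence records are illustrative assumptions.

```python
# Articles 9-15 plus Article 17 carry the mandatory high-risk requirements.
REQUIRED_ARTICLES = {
    "Art 9", "Art 10", "Art 11", "Art 12", "Art 13", "Art 14", "Art 15", "Art 17",
}

def compliance_gaps(verified: set) -> set:
    """Return the articles still lacking verified compliance evidence."""
    return REQUIRED_ARTICLES - verified

# Hypothetical audit state: evidence collected for four requirements so far.
evidence = {"Art 9", "Art 10", "Art 11", "Art 12"}
print(sorted(compliance_gaps(evidence)))  # ['Art 13', 'Art 14', 'Art 15', 'Art 17']
```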

Phase 5: Ongoing Monitoring and Maintenance (Post-August 2026)

Compliance is not a one-time activity. Organizations must maintain continuous compliance through post-market monitoring and regular updates.

5.1 Post-Market Monitoring (Article 72)

  • ☐ Monitor AI system performance in real-world conditions (continuous)
  • ☐ Collect and analyze incident reports and user feedback (ongoing)
  • ☐ Report serious incidents to market surveillance authorities (within 15 days, per Article 73)
  • ☐ Update technical documentation when system changes occur (as needed)
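The 15-day serious-incident window translates directly into a deadline calculation: given the date the provider became aware of the incident, compute the latest date for notifying market surveillance authorities. A sketch (Article 73 also expects reporting without undue delay, so treat this as an outer bound, not a target):

```python
from datetime import date, timedelta

REPORTING_WINDOW_DAYS = 15  # serious-incident window under Article 73

def reporting_deadline(awareness_date: date) -> date:
    """Latest calendar date for notifying market surveillance authorities."""
    return awareness_date + timedelta(days=REPORTING_WINDOW_DAYS)

print(reporting_deadline(date(2026, 9, 1)))  # 2026-09-16
```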

Complete Compliance Checklist Summary

This summary provides a quick reference for all mandatory compliance requirements, their legal basis, and their deadlines:

  • ☐ Prohibited AI Practices (Article 5): Feb 2, 2025
  • ☐ Risk Management System (Article 9): Aug 2, 2026
  • ☐ Data Governance (Article 10): Aug 2, 2026
  • ☐ Technical Documentation (Article 11): Aug 2, 2026
  • ☐ Record Keeping (Article 12): Aug 2, 2026
  • ☐ Transparency (Article 13): Aug 2, 2026
  • ☐ Human Oversight (Article 14): Aug 2, 2026
  • ☐ Accuracy & Cybersecurity (Article 15): Aug 2, 2026
  • ☐ Quality Management System (Article 17): Aug 2, 2026
  • ☐ Post-Market Monitoring (Article 72): Aug 2, 2026
  • ☐ EU Database Registration (Article 49): Aug 2, 2026

Source: EU AI Act - Regulation (EU) 2024/1689

Next Steps

Organizations should use this checklist to track their compliance progress and ensure all requirements are met before the August 2, 2026 deadline. Regular reviews and updates to this checklist are recommended as the AI Act implementation evolves.

Need Help with Compliance?

ActProof.ai automates EU AI Act compliance through AI-BOM generation, Policy-as-Code validation, bias monitoring, and automated documentation. Our platform helps organizations complete this checklist efficiently and meet the 2026 deadline. Contact us to learn how we can help.


Related Articles

Complete Guide to EU AI Act Compliance: What You Need to Know by 2026

Comprehensive guide covering all aspects of EU AI Act compliance.

How to Build a Quality Management System for EU AI Act Compliance 2026

Learn how to implement QMS for AI Act compliance.