EU AI Act Compliance Checker

Use this interactive tool, provided by the Future of Life Institute, to determine whether your AI system is subject to obligations under the EU AI Act and to understand what those obligations mean for your organization.

About This Tool

This compliance checker is developed by the Future of Life Institute and is based on the official EU AI Act text (Regulation (EU) 2024/1689). The tool helps you understand:

  • Whether your AI system falls within the scope of the EU AI Act
  • What risk category your system belongs to (unacceptable, high-risk, limited, or minimal risk)
  • What obligations apply to your organization as a Provider, Deployer, Importer, or Distributor
  • Specific requirements for high-risk AI systems

Note: For legal clarity and compliance decisions, we recommend seeking professional legal advice and following national guidance. This tool provides general guidance based on the EU AI Act text.

Launch the checker & ActGuard quick guides

Open the official assessment in a new tab, then use the guides below so you already know what Articles 12 and 14 expect—before you answer a single question.

Official tool

Open Compliance Checker (opens in a new tab)

Why this matters

Surfacing Articles 12 and 14 before the external checker shows that ActGuard understands the substance of the assessment—not only the form. The third-party tool is the official questionnaire; ActGuard is your guide to what it is asking.


Article 12 · Record keeping

Cheat sheet: High-risk systems must support logging that lets you trace how the system behaved in operation (not just design docs). You need enough retained logs to reconstruct serious incidents, respect GDPR minimisation, and keep records accessible for the period authorities expect. If you cannot show what the system did when it mattered, you are not demonstrating Article 12 readiness.
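As a concrete illustration of what "logs that let you reconstruct an incident" can look like, here is a minimal sketch of an append-only traceability logger. The field names (`model_version`, `input_sha256`, `operator_id`, and so on) are our own assumptions for illustration, not a schema the AI Act prescribes; hashing the input instead of storing it raw is one way to respect data minimisation.

```python
import hashlib
import json
import time


def log_event(path, model_version, input_text, decision, operator_id):
    """Append one traceability record to a JSONL file.

    Stores a hash of the input rather than the raw content,
    so the log supports incident reconstruction without
    retaining personal data it does not need.
    """
    record = {
        "ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "model_version": model_version,
        "input_sha256": hashlib.sha256(input_text.encode()).hexdigest(),
        "decision": decision,
        "operator_id": operator_id,
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

Because each line is a self-contained JSON record with a timestamp and a model version, you can later filter the log to the window around a serious incident and show exactly what the system decided and under which model release.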


Article 14 · Human oversight (deployer)

Five-point checklist for deployers

  • Assign competent people with authority to intervene and override the system when needed.
  • Ensure operators understand limitations and failure modes; avoid blind automation.
  • Define when and how a human must intervene before harm or fundamental rights impact.
  • Document oversight decisions and incidents linked to human judgment.
  • Keep training and instructions updated as the system or context changes.
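The fourth checklist point (document oversight decisions) is easiest when every intervention is captured in one consistent record. The sketch below shows one possible shape for such a record; the class and field names are illustrative assumptions, not terminology from the Act.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class OversightRecord:
    """One human-oversight event: who intervened, why, and what they did.

    Field names are illustrative; adapt them to your own
    governance vocabulary.
    """
    system_id: str
    operator_id: str   # someone with authority to override the system
    trigger: str       # what prompted intervention, e.g. an edge case
    action: str        # "override", "halt", or "approve"
    rationale: str
    ts: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


rec = OversightRecord(
    system_id="credit-scoring-v2",
    operator_id="op-17",
    trigger="applicant flagged as edge case",
    action="override",
    rationale="model lacked recent income data",
)
```

Keeping these records in the same store as the system's operational logs links human judgment to machine behaviour, which is exactly the trail an authority would ask to see.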

Why there is no embedded version on this page

The official site at artificialintelligenceact.eu sends security headers (such as `X-Frame-Options` or a Content-Security-Policy `frame-ancestors` directive) that forbid embedding the page in other websites, so the checker must be used in its own tab or window.
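For the curious, here is a small sketch of how those anti-embedding headers work. The function below checks a plain headers dict for the two standard mechanisms; it illustrates the general rule, not the exact headers that particular site sends.

```python
def embedding_blocked(headers):
    """Return True if the response headers forbid framing by other origins."""
    xfo = headers.get("X-Frame-Options", "").upper()
    csp = headers.get("Content-Security-Policy", "")

    # X-Frame-Options: DENY blocks all framing; SAMEORIGIN still
    # blocks any third-party site such as ActGuard.
    if xfo in ("DENY", "SAMEORIGIN"):
        return True

    # A frame-ancestors directive limited to 'none' or 'self'
    # likewise blocks third-party iframes.
    for directive in csp.split(";"):
        d = directive.strip()
        if d.startswith("frame-ancestors"):
            allowed = d.split()[1:]
            if set(allowed) <= {"'none'", "'self'"}:
                return True
    return False
```

Either header alone is enough: the browser, not the embedding site, enforces the restriction, which is why no workaround short of a separate tab exists.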

Regulatory Intelligence

Curated EU AI Act article snapshots — updated as you extend the JSON feed.

EU

Latest EU AI Office updates

Placeholder: automated briefings of new EU AI Office publications can be connected here (e.g. via a secure ActGuard server route that calls DeepSeek or another model to summarize official pages). Until then, rely on the article cards below and the official checker.
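If you do extend the JSON feed behind the article cards, a little validation keeps a malformed entry from breaking the page. The sketch below assumes a hypothetical feed shape (`article`, `title`, `summary`, `url`); the real field names are whatever your feed defines.

```python
import json

# Hypothetical shape of one entry in the article-card feed;
# adjust REQUIRED_FIELDS to match your actual schema.
REQUIRED_FIELDS = {"article", "title", "summary", "url"}


def load_cards(raw_json):
    """Parse the feed and keep only entries carrying every required field."""
    entries = json.loads(raw_json)
    return [e for e in entries if REQUIRED_FIELDS <= e.keys()]


feed = (
    '[{"article": 12, "title": "Record keeping", '
    '"summary": "Logging and traceability duties.", '
    '"url": "https://artificialintelligenceact.eu/article/12/"}]'
)
```

Dropping incomplete entries rather than raising means one bad record in the feed degrades a single card, not the whole Regulatory Intelligence section.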


About ActGuard

Our platform helps organizations document and manage AI incidents in compliance with the EU AI Act. After using the compliance checker, consider how ActGuard can support your governance needs.

Learn more about our platform →

Legal Disclaimer

This compliance checker is provided by the Future of Life Institute and is based on the EU AI Act text. ActGuard is not affiliated with the Future of Life Institute or the European Union. This tool is provided for informational purposes only and does not constitute legal advice. For compliance decisions, please consult with qualified legal counsel and follow national guidance from your Member State's competent authority.