Assessment Dashboard

Assess Your AI Systems

Follow the EU AI Act methodology to evaluate your AI portfolio and identify compliance requirements.


Interactive Assessment Journey

Walk through the complete EU AI Act assessment — from system identification to compliance scoring — in one interactive wizard. Try it with sample data to see how it works.

First: Determine Your Role

Article 3

Your obligations under the EU AI Act depend on your role. Most organizations are Deployers (using AI), but you may also be a Provider (developing AI) or both. Identify your role for each AI system.

Provider

Chapter III, Section 3

Develops an AI system (or has one developed) and places it on the market or puts it into service under its own name or trademark

Conformity assessment

CE marking

Deployer

Chapter III, Section 3

Uses an AI system under its authority, except where the use is a personal, non-professional activity

Use according to instructions

Human oversight

Importer

Chapter III, Section 3

Places on the EU market an AI system bearing the name or trademark of a person established in a third country

Verify conformity assessment

Check CE marking

Distributor

Chapter III, Section 3

Makes an AI system available on the EU market without being the provider or the importer

Verify CE marking present

Check documentation
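The four role cards above can be sketched as a simple screening helper. This is a hypothetical simplification (the function and parameter names are illustrative, and the checks are ordered so more specific roles win); actual role determination under the Act needs legal review, and one organization can hold several roles at once, one per AI system.

```python
# Illustrative role-screening sketch based on the four role cards.
# Not legal advice: real classification under the EU AI Act requires
# assessing each system against the Article 3 definitions.

def determine_role(develops: bool, places_on_market: bool,
                   under_own_name: bool, provider_outside_eu: bool,
                   uses_system: bool) -> str:
    """Return the most likely EU AI Act role for one AI system."""
    if develops and under_own_name:
        return "Provider"       # develops and markets under own name/trademark
    if places_on_market and provider_outside_eu:
        return "Importer"       # places third-country system on EU market
    if places_on_market:
        return "Distributor"    # makes it available, not provider or importer
    if uses_system:
        return "Deployer"       # uses it under its own authority
    return "No role identified"


# Example: an organization that only uses a purchased AI tool
print(determine_role(develops=False, places_on_market=False,
                     under_own_name=False, provider_outside_eu=False,
                     uses_system=True))  # Deployer
```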

Not sure which role applies to you?

Take our quick assessment to determine your EU AI Act role for each AI system.

Start Role Assessment
Current Focus: Step 1

AI System Inventory

Catalog all AI systems in your organization. Identify what AI you use, develop, or deploy.

Key Actions
  1. List all AI systems and tools
  2. Document purpose and use cases
  3. Identify stakeholders and owners
  4. Record vendor information
Get Started

Your Progress

  • AI Systems: 0
  • Classified: 0/0
  • High Risk: 0
  • Compliance Score: 0%

EU AI Act Risk Categories

The EU AI Act classifies AI systems into four risk categories. Understanding these categories is essential for determining your compliance obligations.

Prohibited

AI practices that are banned under the EU AI Act

Social scoring

Real-time remote biometric identification in publicly accessible spaces (for law enforcement)

Emotion recognition in the workplace or in education

High Risk

AI systems subject to strict requirements before market placement

Recruitment tools

Credit scoring

Medical devices

Limited Risk

AI systems with transparency obligations

Chatbots

Emotion recognition (outside prohibited settings)

Deep fakes

Minimal Risk

AI systems with no specific obligations (voluntary codes)

Spam filters

AI video games

Inventory management
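The example systems in the four categories above can be expressed as a lookup table. This sketch only repeats the lists verbatim; a real classification must be made against Article 5 and Annex III of the Act, not by string matching.

```python
# Illustrative mapping of the example use cases to the four
# EU AI Act risk tiers shown above. Not a classification tool.
RISK_TIERS: dict[str, list[str]] = {
    "Prohibited": [
        "social scoring",
        "real-time biometric id in public spaces",
        "emotion recognition in the workplace",
    ],
    "High Risk": ["recruitment tools", "credit scoring", "medical devices"],
    "Limited Risk": ["chatbots", "emotion recognition", "deep fakes"],
    "Minimal Risk": ["spam filters", "ai video games", "inventory management"],
}


def classify(use_case: str) -> str:
    """Return the risk tier for a known example use case."""
    for tier, examples in RISK_TIERS.items():
        if use_case.lower() in examples:
            return tier
    return "Unclassified - assess against Article 5 and Annex III"
```

Note that emotion recognition appearing twice is deliberate: it is prohibited in the workplace and in education, but only carries transparency obligations elsewhere.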

After assessment is complete

Implement compliance measures

Continue to Implement