aicomply
Lesson · 10 min · Chapter 3 of 9

Key Definitions

Master the essential terminology used throughout the regulation.

Learning Objectives

By the end of this chapter, you will be able to:

  • Precisely define what constitutes an "AI system" under the regulation
  • Distinguish between the six categories of actors in the AI value chain
  • Understand critical lifecycle concepts (placing on market, putting into service)
  • Identify when substantial modifications trigger new compliance obligations
  • Apply definitions accurately in compliance assessments

Article 3 of the AI Act contains 68 legal definitions that form the interpretive foundation of the entire regulation. Mastering these definitions is essential—the difference between compliance and violation often turns on precise definitional analysis.

The AI System Definition (Article 3(1))

The definition of "AI system" is the most consequential in the regulation:

'AI system' means a machine-based system designed to operate with varying levels of autonomy, that may exhibit adaptiveness after deployment and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.

What is an AI System? (Article 3)

  • Machine-Based: uses computational systems
  • Autonomy: varying levels of independence
  • Output Generation: produces predictions, recommendations, decisions

All three elements must be present for a system to qualify as an "AI system" under the Act.

Breaking Down the Definition

| Element | Meaning | Significance |
| --- | --- | --- |
| Machine-based system | Software-based, potentially with hardware components | Excludes purely human decision-making |
| Varying levels of autonomy | From minimal to full autonomy | Not limited to fully autonomous systems |
| May exhibit adaptiveness | Can change behaviour after deployment | Includes but doesn't require learning systems |
| Infers | Derives outputs not explicitly programmed | Distinguishes from deterministic rule-based systems |
| Generates outputs | Predictions, content, recommendations, decisions | Broad output categories covered |
| Influences environments | Physical or virtual impact | Covers both digital and real-world effects |
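One hedged way to internalise these cumulative elements is as a simple screening checklist. The sketch below is illustrative only (the class, function, and field names are invented, and a real classification requires legal analysis); it treats each definitional element as a boolean check and deliberately omits adaptiveness, since the definition says a system only "may" exhibit it.

```python
from dataclasses import dataclass

@dataclass
class SystemProfile:
    """Illustrative profile of a candidate system (field names are invented)."""
    machine_based: bool           # computational system, not purely human judgement
    has_autonomy: bool            # operates with some level of independence
    infers_outputs: bool          # derives outputs rather than replaying fixed rules
    generates_outputs: bool       # predictions, content, recommendations, decisions
    influences_environment: bool  # outputs affect physical or virtual environments

def screens_as_ai_system(p: SystemProfile) -> bool:
    """All cumulative Article 3(1) elements must hold; adaptiveness is
    deliberately absent because the definition says 'may exhibit'."""
    return all([p.machine_based, p.has_autonomy, p.infers_outputs,
                p.generates_outputs, p.influences_environment])

# A deterministic database lookup fails the 'infers' element:
lookup = SystemProfile(True, False, False, True, True)
print(screens_as_ai_system(lookup))  # prints False
```

The point of the sketch is the conjunction: failing any single element (here, inference and autonomy) takes a system outside the definition, which is why borderline cases turn on the "infers" analysis discussed below.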

Expert Insight

The definition was significantly revised from the Commission's original proposal. The final version aligns with the OECD's AI definition but adds the "inference" element, which helps distinguish AI from traditional software while remaining technology-neutral.

What IS and IS NOT an AI System

Likely AI Systems:

  • Machine learning models (supervised, unsupervised, reinforcement)
  • Large language models and generative AI
  • Computer vision systems
  • Recommendation engines using ML
  • Autonomous decision-making systems
  • Predictive analytics using inference

Likely NOT AI Systems:

  • Simple rule-based expert systems (no inference)
  • Basic statistical analysis
  • Traditional database queries
  • Deterministic algorithms
  • Conventional software without inference capability

Compliance Note

The boundary is not always clear. When uncertain, conduct a detailed technical analysis focusing on whether the system "infers" outputs rather than following pre-programmed rules.

Actor Definitions: The AI Value Chain

Provider (Article 3(3))

'Provider' means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge.

Key Elements:

  • Develops OR has developed (covers outsourced development)
  • Places on market OR puts into service
  • Under own name or trademark (takes responsibility)
  • Regardless of payment (free AI included)

Provider Triggers:

  • You build AI and sell/deploy it under your brand
  • You commission AI development and release it as your product
  • You significantly modify another's AI and rebrand it

Deployer (Article 3(4))

'Deployer' means a natural or legal person, public authority, agency or other body using an AI system under its authority, except where the AI system is used in the course of a personal non-professional activity.

Key Elements:

  • Uses AI "under its authority" (operational control)
  • Excludes personal/non-professional use
  • Includes internal business use
  • Does not require market placement

Deployer Examples:

  • Company using AI for HR decisions
  • Bank using AI for credit scoring
  • Hospital using AI diagnostic tool
  • Retailer using AI for pricing

Importer (Article 3(6))

'Importer' means a natural or legal person located or established in the Union that places on the market an AI system that bears the name or trademark of a natural or legal person established in a third country.

Key Elements:

  • Must be EU-established
  • Brings non-EU AI to EU market
  • AI bears third-country provider's name

Distributor (Article 3(7))

'Distributor' means a natural or legal person in the supply chain, other than the provider or the importer, that makes an AI system available on the Union market.

Key Elements:

  • In supply chain but not provider/importer
  • Makes AI available (resellers, marketplaces)

Authorised Representative (Article 3(5))

'Authorised representative' means a natural or legal person located or established in the Union who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to, respectively, perform and carry out on its behalf the obligations and procedures established by this Regulation.

Purpose: Enables non-EU providers to fulfil EU obligations through an EU-established representative
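The actor definitions above are overlapping role tests rather than mutually exclusive boxes. The following sketch is an assumption-laden illustration (all parameter names are invented, and the real tests involve more conditions than shown) of how one organisation can hold several roles at once:

```python
def value_chain_roles(*, develops_or_commissions: bool,
                      markets_under_own_name: bool,
                      uses_under_own_authority: bool,
                      personal_use_only: bool = False,
                      eu_established: bool = False,
                      places_third_country_ai_on_market: bool = False,
                      resells_others_ai: bool = False) -> list[str]:
    """Roles are cumulative: the same entity may hold several at once."""
    roles = []
    if develops_or_commissions and markets_under_own_name:
        roles.append("provider")      # Article 3(3)
    if uses_under_own_authority and not personal_use_only:
        roles.append("deployer")      # Article 3(4)
    if eu_established and places_third_country_ai_on_market:
        roles.append("importer")      # Article 3(6)
    if resells_others_ai and "provider" not in roles and "importer" not in roles:
        roles.append("distributor")   # Article 3(7): supply chain, not provider/importer
    return roles

# A bank that builds its own credit-scoring AI and also uses it internally:
print(value_chain_roles(develops_or_commissions=True,
                        markets_under_own_name=True,
                        uses_under_own_authority=True))
# prints ['provider', 'deployer']
```

The bank example shows why role analysis matters in practice: the same institution owes provider obligations for the system it markets and deployer obligations for its internal use.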

Critical Lifecycle Definitions

AI System Lifecycle

Design → Development → Testing → Deployment → Operation → Monitoring

Compliance requirements apply throughout the entire AI lifecycle.

Placing on the Market (Article 3(9))

The first making available of an AI system or a general-purpose AI model on the Union market.

Triggered by: First commercial availability in the EU
Significance: Triggers most provider obligations

Putting into Service (Article 3(11))

The supply of an AI system for first use directly to the deployer or for own use in the Union for its intended purpose.

Triggered by: First operational use for intended purpose
Significance: Alternative trigger for obligations; relevant for internal deployments

Substantial Modification (Article 3(23))

A change to an AI system after its placing on the market or putting into service which is not foreseen or planned in the initial conformity assessment carried out by the provider and as a result of which the compliance of the AI system with the requirements set out in Chapter III, Section 2 is affected or the intended purpose for which the AI system has been assessed is modified.

Substantial Modification Triggers:

  • Change affects compliance with requirements
  • Change modifies assessed intended purpose
  • Change was not foreseen in original conformity assessment

Important: A substantial modification can convert a deployer into a provider, triggering full provider obligations for the modified system.
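The trigger logic of Article 3(23) is a conjunction of an unforeseen change with at least one of two effects, which a hedged sketch can make explicit (parameter names are invented; this is not legal advice):

```python
def is_substantial_modification(foreseen_in_conformity_assessment: bool,
                                affects_section_2_compliance: bool,
                                modifies_intended_purpose: bool) -> bool:
    """Article 3(23) paraphrased as logic: the change must NOT have been
    foreseen in the initial conformity assessment, AND it must either affect
    compliance with Chapter III, Section 2 or modify the assessed purpose."""
    return (not foreseen_in_conformity_assessment) and (
        affects_section_2_compliance or modifies_intended_purpose)

# A deployer retrains a model for a new purpose not covered by the original
# conformity assessment; this can convert the deployer into a provider:
print(is_substantial_modification(False, False, True))  # prints True
```

Note the converse: a change that was foreseen and planned in the original conformity assessment is not "substantial", no matter how large, because the first condition fails.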

Purpose and Misuse Definitions

Intended Purpose (Article 3(12))

The use for which an AI system is intended by the provider, including the specific context and conditions of use, as specified in the information supplied by the provider in the instructions for use, promotional or sales materials and statements, as well as in the technical documentation.

Compliance Implication: Provider obligations are assessed against intended purpose

Reasonably Foreseeable Misuse (Article 3(13))

The use of an AI system in a way that is not in accordance with its intended purpose, but which may result from reasonably foreseeable human behaviour or interaction with other systems, including other AI systems.

Compliance Implication: Risk management must address foreseeable misuse, not just intended use

Definitions Quick Reference

| Term | Article | One-Line Summary |
| --- | --- | --- |
| AI system | 3(1) | Machine-based system that infers outputs with autonomy |
| Provider | 3(3) | Develops/has developed AI, places it on the market under own name |
| Deployer | 3(4) | Uses AI under own authority (not personal use) |
| Authorised representative | 3(5) | EU-based agent mandated by a non-EU provider |
| Importer | 3(6) | EU entity bringing non-EU AI to the market |
| Distributor | 3(7) | Makes AI available; not provider/importer |
| Placing on market | 3(9) | First making AI available in the EU |
| Putting into service | 3(11) | First use for intended purpose |
| Intended purpose | 3(12) | Use intended by provider per documentation |
| Reasonably foreseeable misuse | 3(13) | Predictable non-intended use |
| Substantial modification | 3(23) | Change affecting compliance or intended purpose |

What You Learned

Key concepts from this chapter

  • The AI system definition is **broad and technology-neutral**, focusing on inference capability
  • **Six actor categories** exist in the value chain, each with distinct obligations
  • The **provider** has the most extensive obligations; it develops AI and places it on the market under its own name
  • A **deployer** uses AI under its authority for business purposes
  • **Placing on market** and **putting into service** trigger different compliance obligations

Chapter Complete

AI Act Fundamentals
