aicomply · Chapter 2 of 9 · 8 min

Scope and Territorial Application

Understanding who the AI Act applies to and its extraterritorial reach.

Learning Objectives

By the end of this chapter, you will be able to:

  • Determine whether the AI Act applies to your organization
  • Understand the extraterritorial reach and "output used in the EU" rule
  • Identify which activities and sectors are excluded from scope
  • Navigate the nuanced open-source AI exemption
  • Assess your obligations as a non-EU entity

Understanding the AI Act's scope is the essential first step in any compliance assessment. Article 2 establishes both who the regulation applies to (personal scope) and where it applies (territorial scope), with significant extraterritorial reach.

Territorial Scope: The Three Application Scenarios

Article 2(1) establishes three distinct scenarios where the AI Act applies:

Does the EU AI Act Apply to You?

  1. Are you established in the EU? Yes → the AI Act applies. No → continue to the next question.
  2. Do you place AI on the EU market? Yes → the AI Act applies. No → continue to the next question.
  3. Is the AI system's output used in the EU? Yes → the AI Act applies. No → the AI Act may not apply.

Scenario 1: Providers Placing AI on the EU Market (Article 2(1)(a))

Providers placing AI systems on the EU market or putting them into service in the EU, regardless of whether those providers are established in the EU or a third country.

Scenario 2: EU-Based Deployers (Article 2(1)(b))

Deployers of AI systems that have their place of establishment or are located within the EU.

Scenario 3: The "Output Rule" (Article 2(1)(c))

Providers and deployers of AI systems that have their place of establishment or are located in a third country, where the output produced by the AI system is used in the Union.

Compliance Note

Article 2(1)(c) means that a company with no EU presence whatsoever can still be subject to the AI Act if EU residents or entities use the outputs of their AI systems. This "output rule" is the broadest extraterritorial provision.

Territorial Application Matrix

| Scenario | Provider Location | Deployer Location | AI Output Used | AI Act Applies? |
|---|---|---|---|---|
| 1 | EU or Non-EU | Any | EU Market | Yes |
| 2 | Any | EU | Any | Yes |
| 3 | Non-EU | Non-EU | In EU | Yes |
| 4 | Non-EU | Non-EU | Outside EU | No |
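
The matrix above can be sketched as a minimal applicability check. This is an illustrative simplification for orientation only, not legal analysis; the function name and boolean inputs are assumptions, and a real scoping assessment must work through the nuances discussed in the rest of this chapter.

```python
def ai_act_applies(placed_on_eu_market: bool,
                   deployer_in_eu: bool,
                   output_used_in_eu: bool) -> bool:
    """Illustrative sketch of Article 2(1)'s three territorial triggers.

    Any one trigger is sufficient; the provider's or deployer's location
    outside the EU does not matter once a trigger fires.
    """
    # Scenario 1 (Art. 2(1)(a)): AI placed on the EU market or put into
    # service in the EU, regardless of where the provider is established.
    if placed_on_eu_market:
        return True
    # Scenario 2 (Art. 2(1)(b)): deployer established or located in the EU.
    if deployer_in_eu:
        return True
    # Scenario 3 (Art. 2(1)(c)): the "output rule" - output used in the EU.
    if output_used_in_eu:
        return True
    # Scenario 4: no EU nexus at all.
    return False

# Matrix rows 3 and 4: non-EU provider and non-EU deployer
print(ai_act_applies(False, False, True))   # output used in EU -> True
print(ai_act_applies(False, False, False))  # no EU nexus -> False
```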

Personal Scope: Who is Covered?

The AI Act applies to these categories of actors throughout the AI value chain:

AI Value Chain

  • Provider: develops AI or places it on the market
  • Deployer: uses AI under their own authority
  • Affected Person: subject to AI-driven decisions

Each role has distinct obligations under the EU AI Act.

Primary Actors (Article 2(1))

| Actor | Definition (Summary) | Primary Obligations |
|---|---|---|
| Provider | Develops AI or has it developed, places on market under own name/trademark | Most extensive: conformity assessment, documentation, registration |
| Deployer | Uses AI under own authority (excluding personal use) | Ensure appropriate use, human oversight, incident reporting |
| Importer | EU entity placing non-EU provider's AI on market | Verify provider compliance before import |
| Distributor | Makes AI available on market (not provider/importer) | Verify conformity marking, storage conditions |
| Authorised Representative | Designated by non-EU provider to act on their behalf | Fulfill provider obligations on behalf of non-EU provider |
| Product Manufacturer | Places product containing AI on market | Ensure AI component compliance |

Expert Insight

A single entity can hold multiple roles simultaneously. For example, a company that develops AI, places it on the market, and also uses it internally would be both a provider AND a deployer, with cumulative obligations.
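
The cumulative-roles point above can be illustrated with a small sketch. The function and flag names here are hypothetical simplifications; actual role determination turns on the full definitions in Article 3.

```python
def determine_roles(develops_ai: bool,
                    places_on_market: bool,
                    uses_internally: bool,
                    imports_non_eu_ai: bool = False,
                    distributes: bool = False) -> set:
    """Illustrative role mapping: one entity can accumulate several roles,
    and the obligations of each accumulated role apply cumulatively."""
    roles = set()
    # Simplified: developing AI or marketing it under your own name/trademark
    # makes you a provider.
    if develops_ai or places_on_market:
        roles.add("provider")
    # Using AI under your own authority (outside personal use) makes you
    # a deployer - even for your own product.
    if uses_internally:
        roles.add("deployer")
    if imports_non_eu_ai:
        roles.add("importer")
    if distributes:
        roles.add("distributor")
    return roles

# The example from the insight: a company that develops AI, places it on
# the market, and also uses it internally.
print(sorted(determine_roles(True, True, True)))  # ['deployer', 'provider']
```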

Activities and Sectors Excluded from Scope

Article 2(3)-(12) establishes several important exclusions:

Military and National Security Exclusion (Article 2(3))

AI systems developed or used exclusively for military, defence, or national security purposes are outside scope, regardless of the type of entity carrying out those activities.

⚠️ Limitation: This exclusion is narrow—the word "exclusively" means dual-use AI (military AND civilian) remains in scope for its civilian applications.

Third Country Law Enforcement Cooperation (Article 2(4))

AI systems used by public authorities in third countries, or by international organisations, in the framework of law enforcement and judicial cooperation agreements with the EU.

Research and Development Exclusions

Scientific R&D (Article 2(6)): AI systems and models specifically developed and put into service for the sole purpose of scientific research and development are excluded from the AI Act.

Pre-Market R&D (Article 2(8)): The AI Act does not apply to AI systems and models specifically developed and put into service for the sole purpose of research, testing, and development before being placed on the market or put into service. This includes:

  • Research activities before any placement on market
  • Development without intent for market placement
  • Testing and validation during pre-deployment phases

However: Testing in real-world conditions is not covered by this exclusion, and once an AI system transitions from R&D to deployment or market placement, full obligations apply.

Personal/Non-Professional Use (Article 2(10))

Article 2(10) exempts natural persons using AI systems for purely personal, non-professional activities from deployer obligations only. The AI Act's other provisions (e.g., provider obligations, prohibited practices, transparency requirements) continue to apply in full.

Examples of excluded deployer obligations for personal use:

  • Using a personal AI assistant at home
  • Hobbyist AI experimentation
  • Personal creative AI tools

The Open-Source AI Exemption: A Nuanced Carve-Out

Article 2(12) provides a partial exemption for certain open-source AI, but it is narrower than many expect:

The Exemption (Article 2(12))

"This Regulation does not apply to AI systems released under free and open-source licences, unless they are placed on the market or put into service as high-risk AI systems or as an AI system that falls under Article 5 or 50."

What is NOT Exempt (Even if Open-Source)

  • High-risk AI systems placed on the market or put into service
  • Prohibited practices (Article 5)
  • Transparency obligations (Article 50)
  • GPAI model obligations (Chapter V)

Open-Source Exemption Decision Tree

  1. Is it high-risk under Article 6/Annex III? Yes → full obligations apply. No → continue.
  2. Is it a prohibited practice under Article 5? Yes → prohibited regardless. No → continue.
  3. Is it a GPAI model? Yes → Chapter V obligations apply. No → continue.
  4. Does it require transparency under Article 50? Yes → Article 50 transparency obligations apply. No → continue.
  5. Is it placed on the market under the provider's name? Yes → full provider obligations apply. No → may be exempt from most obligations.
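
The decision tree above can be expressed as a short sketch. As before, this is an illustrative simplification: the function name and boolean inputs are assumptions, and the real analysis turns on the legal tests in Articles 5, 6, and 50 and in Chapter V.

```python
def open_source_next_step(high_risk: bool,
                          prohibited: bool,
                          gpai_model: bool,
                          needs_transparency: bool,
                          on_market_under_own_name: bool) -> str:
    """Walks the Article 2(12) decision tree for an AI system released
    under a free and open-source licence, in the order shown above."""
    if high_risk:                     # Article 6 / Annex III
        return "full obligations apply"
    if prohibited:                    # Article 5
        return "prohibited regardless"
    if gpai_model:                    # Chapter V
        return "Chapter V obligations apply"
    if needs_transparency:            # Article 50
        return "Article 50 transparency obligations apply"
    if on_market_under_own_name:
        return "full provider obligations apply"
    return "may be exempt from most obligations"

# An open-source research component, not marketed, not high-risk:
print(open_source_next_step(False, False, False, False, False))
```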

Expert Insight

The open-source exemption is primarily beneficial for foundational AI research and open-source components that are integrated into larger systems by others. Once an open-source AI is deployed in a high-risk context, full obligations apply to whoever places it into that context.

Non-EU Entities: Special Considerations

If you are located outside the EU, special rules apply:

Authorised Representative Requirement (Article 22)

Non-EU providers of high-risk AI systems must appoint an authorised representative established in the EU before placing their AI on the EU market. The authorised representative must:

  • Be designated in writing with defined tasks
  • Possess adequate knowledge and resources
  • Make documentation available to authorities
  • Cooperate with enforcement activities

The "Output Rule" in Practice

For Article 2(1)(c) to apply, the AI system's output must be used in the EU. This includes:

  • EU residents receiving AI-generated recommendations
  • EU companies using AI analysis in their operations
  • AI-generated content consumed by EU audiences
  • Decisions affecting EU data subjects

Practical Compliance Implications

Self-Assessment Questions

Review your understanding with these questions

  1. Where is your organization established?
  2. Where are your AI systems placed on the market?
  3. Who are your users/deployers and where are they located?
  4. Where are the outputs of your AI systems ultimately used?
  5. Do any exclusions apply (military, R&D, personal use)?
  6. If open-source, does the exemption apply to your use case?

What You Learned

Key concepts from this chapter

The AI Act has **broad territorial scope** with significant extraterritorial reach

The "output rule" (Article 2(1)(c)) means non-EU companies may be subject to the regulation

Exclusions exist for military/national security, R&D, and personal use—but are narrowly construed

The open-source exemption is **partial**—high-risk, GPAI, and transparency obligations still apply

Non-EU providers of high-risk AI must appoint an EU-based authorised representative

Chapter Complete

AI Act Fundamentals
