
Provider Obligations

Complete requirements for AI system providers.

Provider Obligations (Articles 16-22)

Learning Objectives

By the end of this chapter, you will be able to:

  • Identify all provider obligations under the AI Act
  • Design and implement a compliant quality management system
  • Establish effective post-market monitoring programmes
  • Execute incident reporting and corrective action procedures
  • Manage authorised representatives for non-EU providers
  • Navigate the relationship between providers and downstream operators

Providers of high-risk AI systems bear the most extensive obligations under the AI Act. This chapter provides a comprehensive guide to all provider responsibilities across the entire AI system lifecycle.

Understanding Provider Role and Responsibility

Who is a Provider?

Article 3(3) Definition: A provider is any natural or legal person, public authority, agency, or other body that:

  • Develops an AI system or a general-purpose AI model, OR
  • Has an AI system or GPAI model developed for it, AND
  • Places that system/model on the market or puts it into service under its own name or trademark, whether for payment or free of charge

When Do You Become a Provider?

Scenario | Provider Status | Rationale
Develop AI internally, deploy under own brand | Provider | Meets all criteria
Commission AI development, brand as your own | Provider | "Has developed" + own name
Substantially modify third-party AI | May become provider | Article 25 modification rules
Deploy third-party AI without modification | Deployer (not provider) | Original provider remains responsible
White-label/rebrand third-party AI | Provider | Own name/trademark trigger

Compliance Note

The provider role carries the heaviest compliance burden. Incorrectly classifying your role can result in non-compliance with fundamental obligations.

Core Provider Obligations (Article 16)

The Complete Obligation Framework

Obligation | Reference | Timing
(a) Ensure Chapter III, Section 2 compliance | Article 16(a) | Before market placement
(b) Indicate name, address, contact on system | Article 16(b) | Before market placement
(c) Quality management system | Article 16(c), Article 17 | Continuous
(d) Technical documentation | Article 16(d), Article 11 | Before assessment
(e) Logging (when under provider control) | Article 16(e), Article 12 | During operation
(f) Conformity assessment | Article 16(f), Article 43 | Before market placement
(g) EU declaration of conformity | Article 16(g), Article 47 | Before market placement
(h) CE marking | Article 16(h), Article 48 | Before market placement
(i) Registration in EU database | Article 16(i), Article 49 | Before market placement
(j) Corrective action for non-conformity | Article 16(j) | When non-conformity identified
(k) Cooperation with authorities | Article 16(k) | Upon reasoned request
(l) Accessibility requirements | Article 16(l) | Throughout lifecycle

Immediate Action Items for Providers

Before Market Placement:

  1. Complete all Chapter III, Section 2 requirements (Articles 8-15)
  2. Prepare Annex IV technical documentation
  3. Implement quality management system per Article 17
  4. Conduct conformity assessment per Article 43
  5. Issue EU declaration of conformity per Article 47
  6. Affix CE marking per Article 48
  7. Register in EU database per Article 49

Ongoing Obligations:

  1. Maintain quality management system
  2. Operate post-market monitoring system
  3. Keep logs (when AI is under provider control)
  4. Report serious incidents
  5. Take corrective actions when needed
  6. Cooperate with competent authorities
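
For teams that want the two lists above to be auditable rather than tracked in prose, they can be kept as a small machine-readable register that flags open pre-market items before placement. A minimal sketch in Python; the class and field names are illustrative, not terms from the Act:

```python
from dataclasses import dataclass, field

@dataclass
class Obligation:
    """One Article 16 obligation and its completion status."""
    reference: str                 # e.g. "Article 16(f), Article 43"
    description: str
    pre_market: bool               # must be complete before market placement?
    completed: bool = False
    evidence: list[str] = field(default_factory=list)  # links to supporting records

@dataclass
class ProviderRegister:
    system_name: str
    obligations: list[Obligation]

    def blocking_gaps(self) -> list[Obligation]:
        """Pre-market obligations that are still open."""
        return [o for o in self.obligations if o.pre_market and not o.completed]

register = ProviderRegister(
    system_name="Example high-risk AI system",
    obligations=[
        Obligation("Article 16(f), Article 43", "Conformity assessment", pre_market=True),
        Obligation("Article 16(i), Article 49", "EU database registration", pre_market=True),
        Obligation("Article 16(c), Article 17", "Maintain quality management system", pre_market=False),
    ],
)

gaps = register.blocking_gaps()
if gaps:
    print("Not ready for market placement:", [o.description for o in gaps])
```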

Quality Management System (Article 17)

QMS Requirements Framework

Article 17(1) mandates a QMS that ensures compliance in a systematic and documented manner. The QMS must address:

QMS Element | Article 17(1) | Description | Documentation Required
Compliance Strategy | (a) | Strategy for regulatory compliance | Strategy document, roles, responsibilities
Design & Development | (b) | Design, design control, and design verification techniques | Development processes, design reviews
Quality Control | (c) | Development, quality control, and quality assurance procedures | QC procedures, acceptance criteria
Testing & Validation | (d) | Examination, test, and validation procedures | Test protocols, validation plans
Technical Standards | (e) | Technical specifications and standards applied | Standards list, implementation evidence
Data Management | (f) | Systems and procedures for data management | Data procedures, quality criteria
Risk Management | (g) | Implementation of risk management system (Article 9) | Risk management plan, assessment records
Post-Market Monitoring | (h) | Implementation of post-market monitoring (Article 72) | PMS plan, data collection procedures
Incident Reporting | (i) | Procedures for serious incident reporting (Article 73) | Incident procedures, reporting protocols
Communication | (j) | Communication with authorities, notified bodies, customers, and other interested parties | Communication logs, correspondence records
Record-Keeping | (k) | Systems and procedures for record-keeping of all relevant documentation and information | Record-keeping systems, document registers
Resource Management | (l) | Resource management, including security-of-supply measures | Resource allocation, training records
Accountability Framework | (m) | Accountability framework for management and other staff | Accountability matrix, role assignments
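
One practical way to show that the QMS addresses every point of Article 17(1) is to map each controlled procedure document to the points it covers and flag anything left uncovered. A minimal sketch; the document names and mapping are illustrative assumptions:

```python
# Article 17(1) points (a)-(m) the QMS must address (abbreviated labels)
REQUIRED_QMS_ELEMENTS = {
    "a": "Compliance strategy",
    "b": "Design and development",
    "c": "Quality control and assurance",
    "d": "Testing and validation",
    "e": "Technical specifications and standards",
    "f": "Data management",
    "g": "Risk management (Article 9)",
    "h": "Post-market monitoring (Article 72)",
    "i": "Serious incident reporting (Article 73)",
    "j": "Communication with interested parties",
    "k": "Record-keeping",
    "l": "Resource management",
    "m": "Accountability framework",
}

# Hypothetical register: procedure document -> Article 17(1) points it covers
qms_documents = {
    "QMS-PROC-001 Design and development procedure": {"b", "c", "d"},
    "QMS-PROC-002 Risk management procedure": {"g"},
    "QMS-PROC-003 Post-market monitoring plan": {"h"},
}

covered = set().union(*qms_documents.values())
for point in sorted(set(REQUIRED_QMS_ELEMENTS) - covered):
    print(f"Article 17(1)({point}) not yet covered: {REQUIRED_QMS_ELEMENTS[point]}")
```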

QMS Documentation Structure

Level 1 - Policy Documents:

  • AI quality policy statement
  • Compliance strategy
  • Organisational chart with responsibilities

Level 2 - Procedures:

  • Design and development procedures
  • Testing and validation procedures
  • Risk management procedures
  • Post-market monitoring procedures
  • Incident reporting procedures
  • Corrective action procedures
  • Document control procedures

Level 3 - Work Instructions:

  • Specific technical instructions
  • Testing protocols
  • Data handling guidelines
  • Logging specifications

Level 4 - Records:

  • Design review records
  • Test results
  • Risk assessments
  • Incident reports
  • Training records

💡 Expert Tip: Align your AI QMS with ISO 9001 (quality management) and ISO/IEC 42001 (AI management systems). While not required, these standards provide proven frameworks that facilitate compliance.

QMS Implementation Checklist

  • Define QMS scope and boundaries
  • Establish quality policy and objectives
  • Assign management representative and responsibilities
  • Document all required procedures
  • Implement document control system
  • Establish competence and training programme
  • Create internal audit schedule
  • Define management review process
  • Implement continuous improvement mechanism

Technical Documentation (Article 11 / Annex IV)

Documentation Requirements Summary

Technical documentation must be drawn up before market placement and kept up-to-date throughout the lifecycle.

Section | Content | Key Elements
General Description | System overview | Name, version, intended purpose, deployer instructions
Detailed Description | Technical specifications | Architecture, algorithms, data requirements
Development Process | Design and development | Design decisions, development methods, third-party tools
Monitoring & Control | Operational aspects | Performance metrics, oversight implementation
Risk Management | Article 9 compliance | Risks identified, measures taken, residual risks
Changes Log | Modification history | Substantial modifications, impact assessments
Standards Applied | Compliance evidence | Harmonised standards, common specifications
EU Declaration | Conformity statement | Declaration of conformity copy

Documentation Retention

Document Type | Retention Period | Reference
Technical documentation | 10 years from market placement | Article 18
EU declaration of conformity | 10 years from market placement | Article 47
QMS documentation | 10 years from market placement | Article 17
Logs | At least 6 months | Article 19(1) (provider), Article 26(6) (deployer)
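
Because the 10-year clock runs from market placement and the minimum log-retention period from log creation, disposal dates can be computed rather than tracked by hand. A minimal sketch, assuming retention is measured in calendar years and months:

```python
from datetime import date, timedelta

def documentation_retention_end(market_placement: date, years: int = 10) -> date:
    """Earliest date the Article 18 documentation set may be disposed of."""
    try:
        return market_placement.replace(year=market_placement.year + years)
    except ValueError:  # system placed on the market on 29 February
        return market_placement.replace(year=market_placement.year + years, day=28)

def minimum_log_retention_end(log_created: date) -> date:
    """At least six months from creation (Article 19(1)); 184 days always
    covers six calendar months, so this errs on the long side."""
    return log_created + timedelta(days=184)

print(documentation_retention_end(date(2026, 8, 2)))  # 2036-08-02
print(minimum_log_retention_end(date(2026, 8, 2)))    # 2027-02-02
```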

Post-Market Monitoring (Article 72)

PMS System Requirements

Article 72(1) requires providers to establish and document a post-market monitoring system that is proportionate to:

  • The nature of the AI technology
  • The risks of the specific high-risk AI system

PMS Data Collection

Data Category | Purpose | Collection Method
Performance data | Monitor accuracy and drift | Automated logging, user feedback
Usage data | Understand deployment contexts | Analytics, deployer reports
Incident data | Identify safety issues | Incident reports, complaint systems
Feedback data | Capture user experience | Surveys, support tickets
Compliance data | Verify ongoing conformity | Self-assessments, audits

PMS Plan Elements

The PMS plan must include:

  1. Data collection strategy — What data, how collected, from whom
  2. Analysis methodology — How data will be evaluated
  3. Trigger thresholds — When corrective action required
  4. Corrective action procedures — How issues will be addressed
  5. Communication protocols — How findings shared with authorities
  6. Review schedule — How often PMS plan updated
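
In practice, the trigger-threshold element above is often implemented as an automated comparison of monitored performance against the accuracy level declared in the instructions for use. A minimal sketch of that idea; the metric, tolerance, and messages are illustrative assumptions:

```python
def pms_accuracy_check(recent_accuracy: float,
                       declared_accuracy: float,
                       tolerance: float = 0.02) -> str:
    """Decide whether monitored accuracy drift should trigger the
    corrective-action procedure defined in the PMS plan."""
    drift = declared_accuracy - recent_accuracy
    if drift <= 0:
        return "OK: performance at or above the declared level"
    if drift <= tolerance:
        return "WATCH: minor degradation, increase monitoring frequency"
    return "TRIGGER: open a corrective action and assess reporting duties"

# Example: declared accuracy 0.94, last monitoring window measured 0.90
print(pms_accuracy_check(recent_accuracy=0.90, declared_accuracy=0.94))
```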

Compliance Note

For high-risk AI systems, the PMS plan forms part of the technical documentation under Annex IV. It must be available to market surveillance authorities upon request.

Incident Reporting (Article 73)

Serious Incident Definition

A serious incident is any incident or malfunctioning of an AI system that directly or indirectly leads to, or is likely to lead to, any of the following:

Incident Type | Examples
Death | AI system failure contributing to fatality
Serious damage to health | Physical injury, psychological harm
Serious damage to property | Significant property destruction
Serious damage to environment | Environmental contamination
Serious and irreversible disruption | Critical infrastructure failure
Serious breach of fundamental rights | Discrimination, privacy violation

Reporting Timeline

Event | Action | Deadline
Serious incident occurs | Report to the market surveillance authority of the Member State(s) where the incident occurred | Not later than 15 days after the provider or deployer becomes aware of the serious incident
Death of a person | Report to market surveillance authority | Not later than 10 days after becoming aware
Widespread infringement or serious and irreversible disruption of critical infrastructure | Report to market surveillance authority | Not later than 2 days after becoming aware
Initial report incomplete | Provide additional information | As available, without undue delay
Investigation complete | Submit final report | Without undue delay

Note: The 15-day deadline runs from the date the provider or deployer becomes aware of the serious incident, not from the date a causal link is established.
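
As a rough illustration, the latest permissible reporting date can be derived from the awareness date and the incident category in the table above. A minimal sketch (reporting earlier is always allowed):

```python
from datetime import date, timedelta

# Maximum reporting periods, in days after becoming aware (table above)
REPORTING_DEADLINES = {
    "serious_incident": 15,
    "death_of_a_person": 10,
    "widespread_infringement_or_critical_infrastructure": 2,
}

def latest_report_date(aware_on: date, category: str) -> date:
    """Latest date for the initial report to the market surveillance authority."""
    return aware_on + timedelta(days=REPORTING_DEADLINES[category])

print(latest_report_date(date(2026, 3, 1), "death_of_a_person"))  # 2026-03-11
```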

Reporting Requirements

Reports must include:

  • AI system identification (name, version, registration number)
  • Provider identification
  • Description of incident
  • Date and location of incident
  • Assessment of causal link to AI system
  • Description of immediate actions taken
  • Analysis of root cause (if known)
  • Proposed corrective measures
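
Keeping a provider-side incident record that mirrors these fields means the initial report can be filed even while root-cause analysis is still running. A minimal sketch; the field names are illustrative and not an official reporting template:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class SeriousIncidentRecord:
    # AI system and provider identification
    system_name: str
    system_version: str
    eu_database_registration: str
    provider_name: str
    # Incident details
    incident_date: date
    incident_location: str
    description: str
    causal_link_assessment: str
    immediate_actions: str
    # Often unknown at the time of the initial report
    root_cause: Optional[str] = None
    proposed_corrective_measures: Optional[str] = None

    def missing_for_initial_report(self) -> list[str]:
        """Mandatory text fields that are still empty."""
        mandatory = ["system_name", "system_version", "eu_database_registration",
                     "provider_name", "incident_location", "description",
                     "causal_link_assessment", "immediate_actions"]
        return [name for name in mandatory if not getattr(self, name)]
```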

Corrective Actions (Article 16(j))

Non-Conformity Response Framework

When an AI system does not conform to requirements:

Situation | Required Action | Timeline
Non-conformity identified | Immediate corrective action | Without delay
Immediate risk | Inform authorities | Immediately
Risk to fundamental rights | Notify affected persons | Without undue delay
Product recall needed | Execute recall | As circumstances require
System withdrawal needed | Withdraw from market | As circumstances require

Corrective Action Procedure

  1. Identify — Detect non-conformity through monitoring, feedback, or audit
  2. Assess — Evaluate severity, scope, and impact
  3. Contain — Implement immediate containment measures
  4. Investigate — Determine root cause
  5. Correct — Implement corrective measures
  6. Verify — Confirm effectiveness of corrections
  7. Prevent — Implement preventive measures
  8. Document — Record all actions taken
  9. Report — Notify authorities if required
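
These nine steps can be handled as an ordered workflow in a case-tracking tool, so that a non-conformity cannot be closed before verification, documentation, and the reporting decision are recorded. A minimal sketch of that idea; the class and stage names simply follow the list above:

```python
CORRECTIVE_ACTION_STAGES = [
    "identify", "assess", "contain", "investigate",
    "correct", "verify", "prevent", "document", "report",
]

class CorrectiveAction:
    """Tracks one non-conformity through the procedure above, in order."""

    def __init__(self, reference: str):
        self.reference = reference
        self.completed: list[str] = []

    def complete(self, stage: str) -> None:
        if self.closed:
            raise ValueError("All stages already completed")
        expected = CORRECTIVE_ACTION_STAGES[len(self.completed)]
        if stage != expected:
            raise ValueError(f"Next stage must be '{expected}', not '{stage}'")
        self.completed.append(stage)

    @property
    def closed(self) -> bool:
        return len(self.completed) == len(CORRECTIVE_ACTION_STAGES)

case = CorrectiveAction("NC-2026-001")
case.complete("identify")
case.complete("assess")
print(case.closed)  # False: containment through reporting still outstanding
```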

Authorised Representatives (Article 22)

When Required

Providers established outside the EU must appoint, by written mandate, an authorised representative established in the Union before making their high-risk AI systems available on the EU market.

Authorised Representative Obligations

The representative must:

Obligation | Description
Verify compliance | Confirm EU declaration and technical documentation exist
Maintain documentation | Keep copies available for 10 years
Cooperate with authorities | Respond to information requests
Act on behalf of provider | For enforcement communications
Terminate mandate | If provider acts contrary to AI Act

Mandate Requirements

The written mandate must specify:

  • Representative's identity and contact details
  • Provider's identity and contact details
  • AI systems covered by the mandate
  • Tasks delegated to representative
  • Representative's authority limits

Provider Relationships in the Value Chain

Provider-Deployer Interface

Provider Responsibility | Deployer Expectation
Provide instructions for use | Receive clear operational guidance
Declare accuracy levels | Know system capabilities and limitations
Specify intended purpose | Understand permitted uses
Enable human oversight | Have functional oversight tools
Enable logging access | Access logs per Article 26

Provider-Importer-Distributor Obligations

Actor | Key Checks | Action on Non-Conformity
Importer | CE marking, declaration, documentation present | Do not import
Distributor | CE marking visible, storage conditions met | Do not distribute
Both | Provider contact details accessible | Report to authorities

Compliance Checklist: Provider Obligations

Documentation:

  • Technical documentation complete per Annex IV
  • EU declaration of conformity prepared
  • Instructions for use drafted
  • QMS documented

Systems:

  • Quality management system implemented
  • Post-market monitoring system operational
  • Incident reporting procedures established
  • Corrective action procedures defined

Market Placement:

  • Conformity assessment completed
  • CE marking affixed
  • EU database registration done
  • Authorised representative appointed (if non-EU)

Ongoing:

  • Performance monitoring active
  • Log retention operational
  • Incident reporting functional
  • Authority cooperation readiness confirmed

What You Learned

Key concepts from this chapter

  • Providers bear the most comprehensive obligations under the AI Act
  • The quality management system must be documented and cover all required elements
  • Technical documentation must be complete before market placement and maintained for 10 years
  • Post-market monitoring is mandatory throughout the AI system lifecycle
  • Serious incidents must be reported to authorities within 15 days of becoming aware (2 days for widespread infringements or critical infrastructure disruption, 10 days for deaths)
