Provider Obligations (Articles 16-22)
Complete requirements for AI system providers.
Learning Objectives
By the end of this chapter, you will be able to:
- Identify all provider obligations under the AI Act
- Design and implement a compliant quality management system
- Establish effective post-market monitoring programmes
- Execute incident reporting and corrective action procedures
- Manage authorised representatives for non-EU providers
- Navigate the relationship between providers and downstream operators
Providers of high-risk AI systems bear the most extensive obligations under the AI Act. This chapter provides a comprehensive guide to all provider responsibilities across the entire AI system lifecycle.
Understanding Provider Role and Responsibility
Who is a Provider?
Article 3(3) Definition: A provider is any natural or legal person, public authority, agency, or other body that:
- Develops an AI system or a general-purpose AI model, OR
- Has an AI system or GPAI model developed for it, AND
- Places that system/model on the market or puts it into service under its own name or trademark, whether for payment or free of charge
When Do You Become a Provider?
| Scenario | Provider Status | Rationale |
|---|---|---|
| Develop AI internally, deploy under own brand | Provider | Meets all criteria |
| Commission AI development, brand as your own | Provider | "Has developed" + own name |
| Substantially modify third-party AI | May become Provider | Article 25 modification rules |
| Deploy third-party AI without modification | Deployer (not provider) | Original provider responsible |
| White-label/rebrand third-party AI | Provider | Own name/trademark trigger |
Compliance Note
The provider role carries the heaviest compliance burden. Incorrectly classifying your role can result in non-compliance with fundamental obligations.
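A first-pass classification of the scenarios in the table above can be sketched as a small decision function. This is an illustrative simplification, not legal advice: Article 25's substantial-modification and rebranding rules in particular are far more nuanced than single flags.

```python
def classify_role(developed_or_commissioned: bool,
                  own_brand: bool,
                  substantial_modification: bool) -> str:
    """Rough first-pass role classification (illustrative sketch only).

    developed_or_commissioned -- develops, or has the system developed for it
    own_brand -- places on market / puts into service under own name or trademark
    substantial_modification -- substantially modifies a marketed system
    """
    # Article 3(3): develop (or commission) AND place under own name -> provider
    if developed_or_commissioned and own_brand:
        return "provider"
    # Article 25 (simplified): rebranding a marketed high-risk system, or
    # substantially modifying it, can make you the provider for Article 16 purposes
    if own_brand or substantial_modification:
        return "provider"
    # Otherwise: using someone else's system unmodified -> deployer
    return "deployer"
```

In practice the boundary cases (especially what counts as a "substantial modification") require case-by-case legal analysis.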
Core Provider Obligations (Article 16)
The Complete Obligation Framework
| Obligation | Reference | Timing |
|---|---|---|
| (a) Ensure Chapter III, Section 2 compliance | Article 16(a) | Before market placement |
| (b) Indicate name, address, contact on system | Article 16(b) | Before market placement |
| (c) Quality management system | Article 16(c), Article 17 | Continuous |
| (d) Keep technical documentation | Article 16(d), Articles 11 & 18 | Before market placement; retain 10 years |
| (e) Logging (when under provider control) | Article 16(e), Article 12 | During operation |
| (f) Conformity assessment | Article 16(f), Article 43 | Before market placement |
| (g) EU declaration of conformity | Article 16(g), Article 47 | Before market placement |
| (h) CE marking | Article 16(h), Article 48 | Before market placement |
| (i) Registration in EU database | Article 16(i), Article 49 | Before market placement |
| (j) Corrective action for non-conformity | Article 16(j), Article 20 | When non-conformity identified |
| (k) Demonstrate conformity to authorities | Article 16(k) | Upon reasoned request |
| (l) Accessibility requirements | Article 16(l) | Throughout lifecycle |
Immediate Action Items for Providers
Before Market Placement:
- Complete all Chapter III, Section 2 requirements (Articles 8-15)
- Prepare Annex IV technical documentation
- Implement quality management system per Article 17
- Conduct conformity assessment per Article 43
- Issue EU declaration of conformity per Article 47
- Affix CE marking per Article 48
- Register in EU database per Article 49
Ongoing Obligations:
- Maintain quality management system
- Operate post-market monitoring system
- Keep logs (when AI is under provider control)
- Report serious incidents
- Take corrective actions when needed
- Cooperate with competent authorities
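The pre-market items above can be treated as a hard gate: nothing is placed on the market until every step is complete. A minimal sketch, where the step names are our own labels rather than statutory terms:

```python
# Pre-market placement steps from the checklist above (labels are illustrative)
PRE_MARKET_STEPS = [
    "section_2_requirements",      # Articles 8-15
    "technical_documentation",     # Article 11 / Annex IV
    "quality_management_system",   # Article 17
    "conformity_assessment",       # Article 43
    "declaration_of_conformity",   # Article 47
    "ce_marking",                  # Article 48
    "eu_database_registration",    # Article 49
]

def ready_for_market(completed: set) -> tuple:
    """Return (ready, missing): the gate passes only when nothing is missing."""
    missing = [step for step in PRE_MARKET_STEPS if step not in completed]
    return (not missing, missing)
```

A release pipeline could call `ready_for_market` as a blocking check, surfacing the `missing` list to the compliance team.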
Quality Management System (Article 17)
QMS Requirements Framework
Article 17(1) mandates a QMS that ensures compliance in a systematic and documented manner. The QMS must address:
| QMS Element | Article 17(1) | Description | Documentation Required |
|---|---|---|---|
| Compliance Strategy | (a) | Strategy for regulatory compliance | Strategy document, roles, responsibilities |
| Design & Development | (b) | Design, design control, and design verification techniques | Development processes, design reviews |
| Quality Control | (c) | Development, quality control, and quality assurance procedures | QC procedures, acceptance criteria |
| Testing & Validation | (d) | Examination, test, and validation procedures | Test protocols, validation plans |
| Technical Standards | (e) | Technical specifications and standards applied | Standards list, implementation evidence |
| Data Management | (f) | Systems and procedures for data management | Data procedures, quality criteria |
| Risk Management | (g) | Implementation of risk management system (Article 9) | Risk management plan, assessment records |
| Post-Market Monitoring | (h) | Implementation of post-market monitoring (Article 72) | PMS plan, data collection procedures |
| Incident Reporting | (i) | Procedures for serious incident reporting (Article 73) | Incident procedures, reporting protocols |
| Communication | (j) | Communication with authorities, notified bodies, customers, and other interested parties | Communication logs, correspondence records |
| Record-Keeping | (k) | Systems and procedures for record-keeping of all relevant documentation and information | Record-keeping systems, document registers |
| Resource Management | (l) | Resource management including security-of-supply measures | Resource allocation, training records |
| Accountability Framework | (m) | Accountability framework for management and other staff | Accountability matrix, role assignments |
QMS Documentation Structure
Level 1 - Policy Documents:
- AI quality policy statement
- Compliance strategy
- Organisational chart with responsibilities
Level 2 - Procedures:
- Design and development procedures
- Testing and validation procedures
- Risk management procedures
- Post-market monitoring procedures
- Incident reporting procedures
- Corrective action procedures
- Document control procedures
Level 3 - Work Instructions:
- Specific technical instructions
- Testing protocols
- Data handling guidelines
- Logging specifications
Level 4 - Records:
- Design review records
- Test results
- Risk assessments
- Incident reports
- Training records
💡 Expert Tip: Align your AI QMS with ISO 9001 (quality management) and ISO/IEC 42001 (AI management systems). While not required, these standards provide proven frameworks that facilitate compliance.
QMS Implementation Checklist
- Define QMS scope and boundaries
- Establish quality policy and objectives
- Assign management representative and responsibilities
- Document all required procedures
- Implement document control system
- Establish competence and training programme
- Create internal audit schedule
- Define management review process
- Implement continuous improvement mechanism
Technical Documentation (Article 11 / Annex IV)
Documentation Requirements Summary
Technical documentation must be drawn up before market placement and kept up-to-date throughout the lifecycle.
| Section | Content | Key Elements |
|---|---|---|
| General Description | System overview | Name, version, intended purpose, deployer instructions |
| Detailed Description | Technical specifications | Architecture, algorithms, data requirements |
| Development Process | Design and development | Design decisions, development methods, third-party tools |
| Monitoring & Control | Operational aspects | Performance metrics, oversight implementation |
| Risk Management | Article 9 compliance | Risks identified, measures taken, residual risks |
| Changes Log | Modification history | Substantial modifications, impact assessments |
| Standards Applied | Compliance evidence | Harmonised standards, common specifications |
| EU Declaration | Conformity statement | Declaration of conformity copy |
Documentation Retention
| Document Type | Retention Period | Reference |
|---|---|---|
| Technical documentation | 10 years from market placement | Article 18 |
| EU declaration of conformity | 10 years from market placement | Article 47 |
| QMS documentation | 10 years from market placement | Article 17 |
| Logs | At least 6 months (provider: Art. 19(1); deployer: Art. 26(6)) | Article 19(1), Article 26(6) |
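The retention periods in the table can be turned into a simple deadline helper. This is a sketch: the six-month figure is the statutory minimum of Article 19(1) and runs from log generation, and contracts or other Union or national law may require longer retention.

```python
from datetime import date, timedelta

# Document classes retained for ten years from market placement (labels illustrative)
TEN_YEAR_DOCS = {"technical_documentation", "declaration_of_conformity", "qms"}

def retention_end(start: date, doc_type: str) -> date:
    """Earliest date a record may be discarded (illustrative only).

    'start' is the market-placement date for ten-year documents and the
    log-generation date for logs; six months is approximated as 182 days.
    """
    if doc_type in TEN_YEAR_DOCS:
        # Ten years from market placement (Articles 17, 18, 47)
        return start.replace(year=start.year + 10)
    if doc_type == "logs":
        # At least six months under Article 19(1) -- a statutory minimum
        return start + timedelta(days=182)
    raise ValueError(f"unknown document type: {doc_type}")
```
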
Post-Market Monitoring (Article 72)
PMS System Requirements
Article 72(1) requires providers to establish and document a post-market monitoring system that is proportionate to:
- The nature of the AI technology
- The risks of the specific high-risk AI system
PMS Data Collection
| Data Category | Purpose | Collection Method |
|---|---|---|
| Performance data | Monitor accuracy and drift | Automated logging, user feedback |
| Usage data | Understand deployment contexts | Analytics, deployer reports |
| Incident data | Identify safety issues | Incident reports, complaint systems |
| Feedback data | Capture user experience | Surveys, support tickets |
| Compliance data | Verify ongoing conformity | Self-assessments, audits |
PMS Plan Elements
The PMS plan must include:
- Data collection strategy — What data, how collected, from whom
- Analysis methodology — How data will be evaluated
- Trigger thresholds — When corrective action required
- Corrective action procedures — How issues will be addressed
- Communication protocols — How findings shared with authorities
- Review schedule — How often PMS plan updated
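Trigger thresholds for corrective action can be implemented as a rolling-window check over performance data. A minimal sketch follows; the window size and accuracy threshold are hypothetical choices for illustration, not values prescribed by Article 72.

```python
from collections import deque

class AccuracyMonitor:
    """Rolling-window performance monitor with a corrective-action trigger.

    Illustrative sketch: window and threshold are deployment-specific
    choices, not regulatory values.
    """

    def __init__(self, window: int = 100, threshold: float = 0.95):
        self.results = deque(maxlen=window)  # oldest outcomes drop automatically
        self.threshold = threshold

    def record(self, correct: bool) -> bool:
        """Record one outcome; return True if corrective action is triggered."""
        self.results.append(correct)
        if len(self.results) < self.results.maxlen:
            return False  # not enough data for a stable estimate yet
        accuracy = sum(self.results) / len(self.results)
        return accuracy < self.threshold
```

A trigger would feed the corrective-action procedure and be logged as part of the PMS records.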
Compliance Note
For high-risk AI systems, the PMS plan forms part of the technical documentation under Annex IV. It must be available to market surveillance authorities upon request.
Incident Reporting (Article 73)
Serious Incident Definition
A serious incident is any incident or malfunctioning that directly or indirectly leads to, or is likely to lead to:
| Incident Type | Examples |
|---|---|
| Death | AI system failure contributing to fatality |
| Serious damage to health | Physical injury, psychological harm |
| Serious damage to property | Significant property destruction |
| Serious damage to environment | Environmental contamination |
| Serious and irreversible disruption | Critical infrastructure failure |
| Serious breach of fundamental rights | Discrimination, privacy violation |
Reporting Timeline
| Event | Action | Deadline |
|---|---|---|
| Serious incident occurs | Report to market surveillance authority of the Member State(s) where the incident occurred | Not later than 15 days after the provider or deployer becomes aware of the serious incident |
| Death of a person | Report to market surveillance authority | Not later than 10 days after becoming aware |
| Widespread infringement, or serious and irreversible disruption of critical infrastructure (Article 3(49)(b)) | Report to market surveillance authority | Not later than 2 days after becoming aware |
| Initial report incomplete | Provide additional information | As available, without undue delay |
| Investigation complete | Submit final report | Without undue delay |
Note: The 15-day deadline runs from the date the provider or deployer becomes aware of the serious incident, not from the date a causal link is established.
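The outer reporting limits can be computed mechanically from the awareness date. A sketch, with category keys that are our own labels: reporting should happen immediately once a causal link (or reasonable likelihood) is established, so these dates are maximums, not targets.

```python
from datetime import date, timedelta

# Article 73 maximum reporting deadlines, in days from awareness (sketch)
DEADLINES = {
    "serious_incident": 15,
    "death": 10,
    "widespread_or_critical_infrastructure": 2,
}

def report_deadline(aware: date, incident_type: str) -> date:
    """Latest permissible reporting date for a given incident category."""
    return aware + timedelta(days=DEADLINES[incident_type])
```
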
Reporting Requirements
Reports must include:
- AI system identification (name, version, registration number)
- Provider identification
- Description of incident
- Date and location of incident
- Assessment of causal link to AI system
- Description of immediate actions taken
- Analysis of root cause (if known)
- Proposed corrective measures
Corrective Actions (Article 20)
Non-Conformity Response Framework
When an AI system does not conform to requirements:
| Situation | Required Action | Timeline |
|---|---|---|
| Non-conformity identified | Immediate corrective action | Without delay |
| Immediate risk | Inform authorities | Immediately |
| Risk to fundamental rights | Notify affected persons | Without undue delay |
| Product recall needed | Execute recall | As circumstances require |
| System withdrawal needed | Withdraw from market | As circumstances require |
Corrective Action Procedure
- Identify — Detect non-conformity through monitoring, feedback, or audit
- Assess — Evaluate severity, scope, and impact
- Contain — Implement immediate containment measures
- Investigate — Determine root cause
- Correct — Implement corrective measures
- Verify — Confirm effectiveness of corrections
- Prevent — Implement preventive measures
- Document — Record all actions taken
- Report — Notify authorities if required
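The nine-step procedure above is strictly ordered, which lends itself to a simple state machine that refuses out-of-order steps. This is a process sketch, not a legal template:

```python
# Ordered steps from the corrective action procedure above
STEPS = ["identify", "assess", "contain", "investigate", "correct",
         "verify", "prevent", "document", "report"]

class CorrectiveAction:
    """Track one corrective action through the ordered procedure."""

    def __init__(self):
        self.completed = []

    def complete(self, step: str) -> None:
        """Mark a step done; reject steps attempted out of sequence."""
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected step '{expected}', got '{step}'")
        self.completed.append(step)

    @property
    def closed(self) -> bool:
        return len(self.completed) == len(STEPS)
```

In a real system each `complete` call would also capture evidence (who, when, what records), since the Document step requires a full audit trail.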
Authorised Representatives (Article 22)
When Required
Non-EU providers must appoint an authorised representative before placing AI systems on the EU market.
Authorised Representative Obligations
The representative must:
| Obligation | Description |
|---|---|
| Verify compliance | Confirm EU declaration and technical documentation exist |
| Maintain documentation | Keep copies available for 10 years |
| Cooperate with authorities | Respond to information requests |
| Act on behalf of provider | For enforcement communications |
| Terminate mandate | If provider acts contrary to AI Act |
Mandate Requirements
The written mandate must specify:
- Representative's identity and contact details
- Provider's identity and contact details
- AI systems covered by the mandate
- Tasks delegated to representative
- Representative's authority limits
Provider Relationships in the Value Chain
Provider-Deployer Interface
| Provider Responsibility | Deployer Expectation |
|---|---|
| Provide instructions for use | Receive clear operational guidance |
| Declare accuracy levels | Know system capabilities and limitations |
| Specify intended purpose | Understand permitted uses |
| Enable human oversight | Have functional oversight tools |
| Enable logging access | Access logs per Article 26 |
Provider-Importer-Distributor Obligations
| Actor | Key Checks | Action on Non-Conformity |
|---|---|---|
| Importer | CE marking, declaration, documentation present | Do not import |
| Distributor | CE marking visible, storage conditions met | Do not distribute |
| Both | Provider contact details accessible | Report to authorities |
Compliance Checklist: Provider Obligations
Documentation:
- Technical documentation complete per Annex IV
- EU declaration of conformity prepared
- Instructions for use drafted
- QMS documented
Systems:
- Quality management system implemented
- Post-market monitoring system operational
- Incident reporting procedures established
- Corrective action procedures defined
Market Placement:
- Conformity assessment completed
- CE marking affixed
- EU database registration done
- Authorised representative appointed (if non-EU)
Ongoing:
- Performance monitoring active
- Log retention operational
- Incident reporting functional
- Authority cooperation readiness confirmed
What You Learned
Key concepts from this chapter:
- Providers bear the most comprehensive obligations under the AI Act
- The quality management system must be documented and cover all required elements
- Technical documentation must be complete before market placement and retained for 10 years
- Post-market monitoring is mandatory throughout the AI system lifecycle
- Serious incidents must be reported to authorities within 15 days of becoming aware (10 days for a death; 2 days for widespread infringements or critical-infrastructure disruption)