AI Quality Management Standard

Document Type: Standard
Standard ID: STD-AI-009
Standard Title: AI Quality Management Standard
Version: 1.0
Effective Date: 2025-08-01
Next Review Date: 2026-08-01
Review Frequency: Annually or upon regulatory change
Parent Policy: POL-AI-001 - Artificial Intelligence Policy
Owner: Quality Director
Approved By: AI Governance Committee Chair
Status: Draft
Classification: Internal Use Only


TABLE OF CONTENTS

  1. Document History
  2. Objective
  3. Scope and Applicability
  4. Control Standard
  5. Supporting Procedures
  6. Compliance
  7. Roles and Responsibilities
  8. Exceptions
  9. Enforcement
  10. Key Performance Indicators (KPIs)
  11. Training Requirements
  12. Definitions
  13. Link with AI Act and ISO42001

DOCUMENT HISTORY

Version | Date | Author | Changes | Approval Date | Approved By
0.1 | 2025-07-08 | Robert Martinez, Quality Director | Initial draft | - | -
0.2 | 2025-07-22 | Robert Martinez, Quality Director | Added Article 17 details | - | -
0.3 | 2025-08-01 | Robert Martinez, Quality Director | Incorporated stakeholder feedback | - | -
1.0 | 2025-08-01 | Robert Martinez, Quality Director | Final version approved - GRC restructured | 2025-07-25 | Jane Doe, AI Governance Committee Chair

OBJECTIVE

This standard defines requirements for establishing and maintaining a Quality Management System (QMS) for high-risk AI systems in compliance with EU AI Act Article 17.

Primary Goals:

  • Establish comprehensive QMS framework per Article 17(1)
  • Implement systematic design and development controls per Article 17(1)(b)
  • Implement quality assurance and testing per Article 17(1)(c) and (d)
  • Implement corrective and preventive actions (CAPA) per Article 17(1)(a)
  • Ensure QMS effectiveness through management review and continuous improvement

SCOPE AND APPLICABILITY

2.1 Mandatory Applicability

This standard is mandatory for:

  • All high-risk AI systems (EU AI Act Article 17)
  • All lifecycle phases (design, development, testing, deployment, operation)

2.2 Recommended Applicability

This standard is recommended for:

  • All AI systems for quality assurance
  • Limited-risk and minimal-risk AI systems (voluntary QMS)

2.3 QMS Coverage

  • Quality policies and objectives
  • Design and development controls
  • Quality assurance and testing
  • Corrective and preventive actions (CAPA)
  • Management review
  • Continuous improvement

2.4 Out of Scope

  • General enterprise quality management (covered by enterprise QMS)
  • Non-AI system quality (covered by other quality standards)
  • Quality outside EU AI Act scope

CONTROL STANDARD

Control QMS-001: Quality Management System Documentation

Control ID: QMS-001
Control Name: QMS Framework Documentation
Control Type: Preventive
Control Frequency: Initial establishment, annual review
Risk Level: High

Control Objective

Document the QMS in a systematic and orderly manner per Article 17(1), ensuring a comprehensive quality management framework is established, maintained, and continuously improved.

Control Requirements

CR-001.1: QMS Documentation Structure

Create comprehensive QMS documentation following hierarchical structure.

QMS Documentation Structure:

Document Level | Document Type | Examples | Purpose
Level 1: Quality Manual | QMS overview | AI Quality Manual | High-level QMS description
Level 2: Procedures | How to perform activities | Design Control Procedure, Testing Procedure | Process descriptions
Level 3: Work Instructions | Detailed step-by-step instructions | Test execution instructions | Detailed guidance
Level 4: Records | Evidence of activities | Test reports, review records | Evidence of compliance

Mandatory Actions:

  • Create AI Quality Manual
  • Document all QMS procedures
  • Develop work instructions
  • Define record requirements
  • Obtain executive approval
  • Review and update annually

QMS Documentation Requirements (Article 17(1)(a-m)):

Article 17 Element | Documentation Required | Document Type
(a) Regulatory compliance strategy | Compliance procedures, change management, CAPA | Procedures
(b) Design control | Design procedures, verification procedures | Procedures
(c) Development and QA | Development procedures, QA procedures | Procedures
(d) Testing and validation | Test procedures, validation procedures | Procedures
(e) Technical specifications | Data specs, computational resource specs | Specifications
(f) Data management | Data governance procedures | Procedures
(g) Risk management | Risk management procedures (integrated) | Procedures
(h) Post-market monitoring | Post-market monitoring procedures (integrated) | Procedures
(i) Serious incident reporting | Incident reporting procedures per Article 73 | Procedures
(j) Communication with authorities | Procedures for communication with competent authorities, notified bodies, operators, customers | Procedures
(k) Record-keeping systems | Systems and procedures for record-keeping of all relevant documentation and information | Procedures
(l) Resource management | Resource management procedures including security-of-supply measures | Procedures
(m) Accountability framework | Accountability framework setting out responsibilities of management and other staff | Framework
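As an illustration only (not part of the standard), the completeness expectation behind this table can be sketched as a simple checklist script. The dictionary keys mirror Article 17(1)(a)-(m) above; the function name `documentation_completeness` and the example set of documented elements are invented for demonstration.

```python
# Illustrative sketch: the element list mirrors the Article 17(1)(a)-(m)
# table above; the function and example inputs are hypothetical.

ARTICLE_17_ELEMENTS = {
    "a": "Regulatory compliance strategy",
    "b": "Design control",
    "c": "Development and QA",
    "d": "Testing and validation",
    "e": "Technical specifications",
    "f": "Data management",
    "g": "Risk management",
    "h": "Post-market monitoring",
    "i": "Serious incident reporting",
    "j": "Communication with authorities",
    "k": "Record-keeping systems",
    "l": "Resource management",
    "m": "Accountability framework",
}

def documentation_completeness(documented: set[str]) -> tuple[float, list[str]]:
    """Return (% of Article 17(1) elements documented, missing element keys)."""
    missing = sorted(set(ARTICLE_17_ELEMENTS) - documented)
    covered = len(ARTICLE_17_ELEMENTS) - len(missing)
    return 100.0 * covered / len(ARTICLE_17_ELEMENTS), missing

# Example: elements (a)-(f) documented, (g)-(m) still open.
pct, missing = documentation_completeness({"a", "b", "c", "d", "e", "f"})
```

A check of this shape could feed a "QMS Documentation Completeness" metric, where the target is 100% of elements documented.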

Evidence Required:

  • AI Quality Manual (MANUAL-AI-QMS-001)
  • QMS procedure library
  • Work instructions
  • Record templates
  • Approval records
  • Annual review records

Audit Verification:

  • Verify QMS documentation exists
  • Confirm all Article 17 elements documented
  • Check documentation structure follows hierarchy
  • Validate executive approval obtained
  • Verify annual review completed

Control QMS-002: Quality Policy and Objectives

Control ID: QMS-002
Control Name: Quality Policy and Objectives Management
Control Type: Preventive
Control Frequency: Annual review
Risk Level: Medium

Control Objective

Establish quality policy and measurable quality objectives to guide QMS implementation and provide direction for quality improvement.

Control Requirements

CR-002.1: Quality Policy and Objectives

Establish and maintain quality policy and objectives.

Quality Policy Requirements:

  • Aligned with organizational strategy
  • Commitment to EU AI Act compliance
  • Commitment to continuous improvement
  • Communicated to all staff
  • Reviewed annually
  • Approved by executive management

Quality Objectives:

Objective Category | Example Objectives | Target | Measurement
Compliance | 100% high-risk AI compliant | 100% | % compliant
Quality | Defect escape rate < 5% | < 5% | Defect metrics
Performance | Test pass rate ≥ 95% | ≥ 95% | Test metrics
Improvement | CAPA closure rate ≥ 90% | ≥ 90% | CAPA metrics
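As a hedged sketch of how such objectives could be monitored, the example below pairs each objective with a comparison direction matching the table's targets. The objective names and measured values are invented examples, not reported figures.

```python
# Illustrative sketch: comparing hypothetical measured values against the
# objective targets tabulated above. Names and values are examples only.
import operator

OBJECTIVES = {
    "high_risk_compliance": (operator.ge, 100.0),  # 100% compliant
    "defect_escape_rate":   (operator.lt, 5.0),    # < 5%
    "test_pass_rate":       (operator.ge, 95.0),   # >= 95%
    "capa_closure_rate":    (operator.ge, 90.0),   # >= 90%
}

def objectives_met(measured: dict[str, float]) -> dict[str, bool]:
    """Evaluate each measured value against its target comparator."""
    return {name: compare(measured[name], target)
            for name, (compare, target) in OBJECTIVES.items()}

status = objectives_met({
    "high_risk_compliance": 100.0,
    "defect_escape_rate": 3.2,
    "test_pass_rate": 96.5,
    "capa_closure_rate": 88.0,   # below the 90% target
})
```

Storing the comparator alongside the target keeps "lower is better" metrics (like defect escape rate) from being evaluated backwards.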

Mandatory Actions:

  • Define quality policy
  • Set quality objectives (cascaded to functions)
  • Communicate to organization
  • Monitor objective achievement
  • Review and update annually
  • Report to management

Evidence Required:

  • AI Quality Policy (POLICY-AI-QMS-001)
  • Quality Objectives (OBJ-AI-QMS-001)
  • Communication records
  • Objective tracking dashboard
  • Annual review records

Audit Verification:

  • Verify quality policy exists and approved
  • Confirm quality objectives set and measurable
  • Check objectives communicated
  • Validate objectives monitored
  • Verify annual review completed

Control QMS-003: QMS Roles and Responsibilities

Control ID: QMS-003
Control Name: QMS Organizational Structure and Roles
Control Type: Preventive
Control Frequency: Annual review
Risk Level: Medium

Control Objective

Define clear roles and responsibilities for QMS to ensure accountability and effective QMS implementation.

Control Requirements

CR-003.1: QMS Organizational Structure

Define and document QMS organizational structure with clear roles.

Key QMS Roles:

Role | Responsibility | Authority | Competence Required
Quality Director | Overall QMS accountability | Full QMS authority | Quality management, EU AI Act
Quality Manager | Day-to-day QMS management | QMS operational authority | Quality management, process improvement
Design Quality Engineer | Design control and verification | Design review authority | Design control, verification
Test Quality Engineer | Testing and validation | Test approval authority | Testing methodologies, validation
Quality Auditor | Internal QMS audits | Audit authority | Audit expertise, QMS knowledge
CAPA Coordinator | CAPA system management | CAPA authority | Root cause analysis, CAPA
Document Control | QMS documentation management | Document control authority | Document management

Mandatory Actions:

  • Define QMS organizational structure
  • Document roles and responsibilities
  • Assign qualified personnel
  • Provide QMS training
  • Review assignments annually
  • Update as needed

Evidence Required:

  • QMS organizational chart
  • Role descriptions
  • Assignment records
  • Training records
  • Annual review records

Audit Verification:

  • Verify QMS structure defined
  • Confirm roles documented
  • Check qualified personnel assigned
  • Validate training provided
  • Verify annual review completed

Control QMS-004: Design Planning

Control ID: QMS-004
Control Name: Design and Development Planning
Control Type: Preventive
Control Frequency: Per AI system
Risk Level: High

Control Objective

Plan and control AI system design and development per Article 17(1)(b) to ensure systematic design process with appropriate reviews, verification, and validation.

Control Requirements

CR-004.1: Design and Development Plan

Create comprehensive design and development plan.

Design Planning Elements:

Element | Description | Required
Design Stages | Phases of design process | YES
Review Points | Design review stages | YES
Verification Activities | Design verification activities | YES
Validation Activities | Design validation activities | YES
Responsibilities | Roles and responsibilities | YES
Resources | Resource requirements | YES
Interfaces | Interface management | YES
Outputs | Required design outputs | YES

Mandatory Actions:

  • Create design and development plan
  • Define design phases
  • Identify review/verification/validation points
  • Assign responsibilities
  • Allocate resources
  • Obtain approval
  • Update as needed

Design Phases:

Phase | Activities | Review Points | Outputs
Concept | Concept development, feasibility | Conceptual Design Review | Concept document
Preliminary Design | High-level design, architecture | Preliminary Design Review | Architecture document
Detailed Design | Detailed specifications | Critical Design Review | Detailed design specs
Development | Implementation, unit testing | Development reviews | Code, unit tests
Integration | Integration, integration testing | Integration review | Integrated system
Validation | System validation | Validation review | Validated system

Evidence Required:

  • Design and Development Plan (PLAN-AI-DESIGN-XXX)
  • Design phase definitions
  • Responsibility matrix
  • Resource allocation
  • Approval records

Audit Verification:

  • Verify design plan created for all high-risk AI
  • Confirm all required elements included
  • Check review points defined
  • Validate approval obtained

Control QMS-005: Design Inputs

Control ID: QMS-005
Control Name: Design Input Requirements
Control Type: Preventive
Control Frequency: Per AI system, per major update
Risk Level: High

Control Objective

Define and document design inputs to ensure all requirements are captured, reviewed, and approved before design begins.

Control Requirements

CR-005.1: Design Input Documentation

Gather, document, and approve all design inputs.

Design Input Categories:

Category | Description | Examples | Source
Functional Requirements | What system must do | Features, capabilities | User needs
Performance Requirements | How well system must perform | Accuracy, speed, throughput | User needs, standards
Regulatory Requirements | EU AI Act and other regulations | Article 9, 10, 11, etc. | EU AI Act
Risk Management Requirements | Risk-based requirements | Risk controls, mitigations | Risk assessment
User Needs | User requirements | Use cases, user stories | Users, stakeholders
Interface Requirements | System interfaces | APIs, data formats | System architecture
Constraints | Design constraints | Resources, time, technology | Project constraints

Mandatory Actions:

  • Gather design inputs from stakeholders
  • Document all requirements
  • Review for completeness and clarity
  • Resolve ambiguities
  • Obtain approval
  • Maintain traceability
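The "maintain traceability" action above implies a requirements traceability matrix linking every design input to at least one design output. A minimal sketch of that check follows; the requirement and specification IDs are invented for illustration and do not refer to real records.

```python
# Hypothetical traceability check: every design input should map to at
# least one design output. All IDs below are invented examples.

def untraced_inputs(trace_matrix: dict[str, list[str]]) -> list[str]:
    """Return design-input IDs that have no linked design output."""
    return sorted(req for req, outputs in trace_matrix.items() if not outputs)

matrix = {
    "REQ-001": ["SPEC-ARCH-01"],                  # traced to architecture spec
    "REQ-002": ["SPEC-TEST-04", "SPEC-DATA-02"],  # traced to two outputs
    "REQ-003": [],                                # gap: no design output yet
}
gaps = untraced_inputs(matrix)
```

Running such a check before design review would surface traceability gaps as audit-ready evidence rather than leaving them to manual inspection.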

Evidence Required:

  • Design Input Specification (SPEC-AI-DESIGN-XXX)
  • Requirements traceability matrix
  • Review records
  • Approval records

Audit Verification:

  • Verify design inputs documented for all high-risk AI
  • Confirm all categories covered
  • Check inputs reviewed
  • Validate approval obtained
  • Verify traceability maintained

Control QMS-006: Design Outputs

Control ID: QMS-006
Control Name: Design Output Verification
Control Type: Preventive
Control Frequency: Per design phase
Risk Level: High

Control Objective

Define and verify design outputs meet design inputs to ensure design is complete, correct, and ready for development.

Control Requirements

CR-006.1: Design Output Creation and Verification

Create design outputs and verify against design inputs.

Design Output Requirements:

  • Meet design input requirements
  • Provide appropriate information for development
  • Contain or reference acceptance criteria
  • Specify characteristics essential for safe use
  • Enable traceability to design inputs

Design Outputs:

Output Type | Description | Required
System Architecture | High-level system design | YES
Detailed Design Specifications | Detailed component specifications | YES
Interface Specifications | Interface definitions | YES
Test Specifications | Test requirements and plans | YES
Risk Control Specifications | Risk control implementations | YES
Data Specifications | Data requirements and formats | YES

Mandatory Actions:

  • Create design outputs
  • Verify against design inputs
  • Conduct design reviews
  • Document verification
  • Obtain approval
  • Maintain traceability

Evidence Required:

  • Design Output Documentation (DOC-AI-DESIGN-XXX)
  • Design verification records
  • Design review records
  • Approval records
  • Traceability matrix

Audit Verification:

  • Verify design outputs created
  • Confirm outputs verified against inputs
  • Check design reviews conducted
  • Validate approval obtained
  • Verify traceability maintained

Control QMS-007: Design Review

Control ID: QMS-007
Control Name: Systematic Design Reviews
Control Type: Preventive
Control Frequency: Per design phase
Risk Level: High

Control Objective

Conduct systematic design reviews at appropriate stages to ensure design quality, identify issues early, and enable informed decisions.

Control Requirements

CR-007.1: Design Review Process

Conduct design reviews per defined schedule.

Design Review Types:

Review Type | When | Purpose | Participants | Approval Required
Conceptual Design Review | After concept phase | Validate concept feasibility | All stakeholders | YES
Preliminary Design Review | After preliminary design | Verify design approach | Technical team, QA | YES
Critical Design Review | Before development | Approve final design | All stakeholders | YES
Design Verification Review | After verification | Confirm design meets requirements | QA, Technical team | YES

Review Participants:

  • AI System Owner
  • Technical Lead
  • Quality Engineer
  • Risk Manager
  • Subject matter experts
  • Legal (if needed)

Mandatory Actions:

  • Schedule design reviews
  • Prepare review materials
  • Conduct reviews
  • Document findings and actions
  • Track action closure
  • Obtain approval to proceed
  • Block progression if issues not resolved
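The final action above, blocking progression until issues are resolved, can be expressed as a simple gate. The sketch below is illustrative only; the record shape (`ActionItem`, `may_proceed`) is an invented example, not a mandated schema.

```python
# Minimal gating sketch for the "block progression" action above:
# progression requires review approval AND every action item closed.
# The data shape is illustrative, not part of this standard.
from dataclasses import dataclass

@dataclass
class ActionItem:
    item_id: str
    closed: bool

def may_proceed(action_items: list[ActionItem], review_approved: bool) -> bool:
    """Approval to proceed requires approval and no open action items."""
    return review_approved and all(item.closed for item in action_items)

review_actions = [ActionItem("ACT-01", closed=True),
                  ActionItem("ACT-02", closed=False)]
```

Encoding the gate this way makes the blocking rule testable: one open action item is enough to withhold approval to proceed.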

Evidence Required:

  • Design Review Records (REC-AI-REVIEW-XXX)
  • Review findings
  • Action item tracking
  • Approval to proceed records

Audit Verification:

  • Verify design reviews conducted per schedule
  • Confirm all required participants attend
  • Check findings documented
  • Validate actions tracked to closure
  • Verify approval obtained before progression

Control QMS-008: Design Verification

Control ID: QMS-008
Control Name: Design Verification Activities
Control Type: Preventive
Control Frequency: Per design phase
Risk Level: High

Control Objective

Verify design outputs meet design inputs to ensure design correctness and completeness before proceeding to next phase.

Control Requirements

CR-008.1: Design Verification Methods

Conduct design verification using appropriate methods.

Verification Methods:

Method | Description | When to Use | Evidence
Design Calculations | Mathematical verification | When calculations involved | Calculation records
Alternative Calculations | Independent verification | For critical calculations | Alternative calculation records
Comparison with Proven Designs | Comparison with similar systems | When similar systems exist | Comparison analysis
Testing and Demonstrations | Test or demonstrate design | When testable | Test results
Document Reviews | Review design documents | For all designs | Review records

Mandatory Actions:

  • Plan verification activities
  • Execute verification
  • Document verification results
  • Address any failures
  • Obtain verification approval
  • Block progression if verification fails

Evidence Required:

  • Design Verification Plan (PLAN-AI-VERIFY-XXX)
  • Verification test results
  • Verification reports
  • Approval records

Audit Verification:

  • Verify design verification conducted
  • Confirm appropriate methods used
  • Check verification results documented
  • Validate failures addressed
  • Verify approval obtained

Control QMS-009: Design Validation

Control ID: QMS-009
Control Name: Design Validation Activities
Control Type: Preventive
Control Frequency: Before deployment, after substantial modifications
Risk Level: High

Control Objective

Validate AI system meets user needs and intended use to ensure system is fit for purpose before deployment.

Control Requirements

CR-009.1: Design Validation Process

Conduct design validation per defined requirements.

Validation Requirements:

  • Conducted under defined operating conditions
  • Uses representative data
  • Includes user feedback
  • Covers all intended use cases
  • Performed before deployment
  • Documented with evidence

Mandatory Actions:

  • Plan validation activities
  • Execute validation in realistic conditions
  • Collect and analyze results
  • Obtain user feedback
  • Document validation
  • Obtain approval
  • Block deployment if validation fails

Validation Test Matrix:

Use Case | Test Scenario | Success Criteria | Status
[Use Case 1] | [Scenario] | [Criteria] |
[Use Case 2] | [Scenario] | [Criteria] |
[Use Case N] | [Scenario] | [Criteria] |

Evidence Required:

  • Design Validation Plan (PLAN-AI-VALID-XXX)
  • Validation test results
  • User feedback
  • Validation report
  • Approval records

Audit Verification:

  • Verify design validation conducted before deployment
  • Confirm realistic conditions used
  • Check all use cases validated
  • Validate user feedback obtained
  • Verify approval obtained
  • Check deployment blocked if validation fails

Control QMS-010: Design Transfer

Control ID: QMS-010
Control Name: Design Transfer to Development/Production
Control Type: Preventive
Control Frequency: Per AI system
Risk Level: Medium

Control Objective

Transfer design to development/production with appropriate controls to ensure design is correctly implemented.

Control Requirements

CR-010.1: Design Transfer Process

Execute design transfer with verification.

Transfer Requirements:

  • Design outputs complete and approved
  • Development/production capabilities verified
  • Transfer plan documented
  • Transfer verification performed
  • Acceptance criteria met

Mandatory Actions:

  • Create design transfer plan
  • Verify development readiness
  • Execute transfer
  • Verify transfer success
  • Obtain acceptance
  • Document transfer

Evidence Required:

  • Design Transfer Plan (PLAN-AI-TRANSFER-XXX)
  • Transfer verification records
  • Acceptance records

Audit Verification:

  • Verify design transfer planned
  • Confirm development readiness verified
  • Check transfer executed
  • Validate acceptance obtained

Control QMS-011: Design Change Control

Control ID: QMS-011
Control Name: Design Change Management
Control Type: Preventive
Control Frequency: As needed
Risk Level: Medium

Control Objective

Control and document design changes per Article 17(1)(a) to ensure changes are properly assessed, approved, and implemented.

Control Requirements

CR-011.1: Design Change Control Process

Manage design changes through formal change control process.

Change Control Process:

  1. Change request submitted
  2. Impact assessment conducted
  3. Risk assessment updated
  4. Change reviewed and approved
  5. Change implemented
  6. Change verified
  7. Documentation updated
  8. Stakeholders notified
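The eight steps above form a strictly ordered workflow, which can be sketched as a linear state machine. This is an illustrative example; the step names paraphrase the numbered list and are not canonical identifiers.

```python
# Sketch of the eight-step change control process above as a linear
# state machine. Step names paraphrase the numbered list; a change
# request may only advance one step at a time, in order.

CHANGE_STEPS = [
    "submitted", "impact_assessed", "risk_updated", "approved",
    "implemented", "verified", "documentation_updated",
    "stakeholders_notified",
]

def advance(current: str) -> str:
    """Move a change request to the next step; error at the final step."""
    i = CHANGE_STEPS.index(current)
    if i == len(CHANGE_STEPS) - 1:
        raise ValueError("change request already complete")
    return CHANGE_STEPS[i + 1]
```

Modeling the process as an ordered list makes out-of-order transitions (e.g. implementing before approval) structurally impossible rather than merely discouraged.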

Mandatory Actions:

  • Submit change requests
  • Assess impact and risks
  • Review and approve changes
  • Implement changes
  • Verify changes
  • Update documentation
  • Notify stakeholders

Change Classification:

Change Type | Impact Assessment | Approval Required | Verification Required
Major Change | Comprehensive | AI Governance Committee | Full verification
Minor Change | Standard | AI System Owner | Standard verification
Emergency Change | Post-implementation | AI System Owner (immediate) | Post-verification

Evidence Required:

  • Design Change Requests (DCR-AI-XXX)
  • Impact assessments
  • Change approvals
  • Verification records
  • Documentation updates

Audit Verification:

  • Verify change control process followed
  • Confirm impact assessments conducted
  • Check changes approved
  • Validate changes verified
  • Verify documentation updated

Control QMS-012: Quality Assurance Program

Control ID: QMS-012
Control Name: Quality Assurance Program Implementation
Control Type: Preventive
Control Frequency: Continuous
Risk Level: Medium

Control Objective

Establish comprehensive quality assurance program per Article 17(1)(c) to ensure quality throughout AI system lifecycle.

Control Requirements

CR-012.1: QA Program Elements

Implement comprehensive quality assurance program.

QA Program Elements:

Element | Description | Implementation
Quality Planning | Plan quality activities | Quality plans
Process Audits | Audit processes for compliance | Process audit schedule
Product Audits | Audit products for quality | Product audit schedule
Supplier Quality Management | Manage supplier quality | Supplier quality procedures
Quality Metrics and Reporting | Monitor and report quality | Quality dashboards, reports

Mandatory Actions:

  • Define QA program
  • Conduct process audits
  • Conduct product audits
  • Monitor quality metrics
  • Report quality status
  • Implement improvements

Evidence Required:

  • QA Program Plan (PLAN-AI-QA-001)
  • Audit schedules and reports
  • Quality metrics dashboard
  • Quality reports

Audit Verification:

  • Verify QA program defined
  • Confirm audits conducted
  • Check quality metrics monitored
  • Validate reports generated

Control QMS-013: Testing Strategy and Execution

Control ID: QMS-013
Control Name: Comprehensive Testing Program
Control Type: Preventive
Control Frequency: Per AI system, per release
Risk Level: High

Control Objective

Implement comprehensive testing strategy per Article 17(1)(d) to ensure AI systems are tested before, during, and after development.

Control Requirements

CR-013.1: Testing Strategy and Execution

Define and execute comprehensive testing strategy.

Testing Levels:

Test Level | Purpose | When | Success Criteria
Unit Testing | Test individual components | During development | All unit tests pass
Integration Testing | Test component interactions | After unit testing | All integration tests pass
System Testing | Test complete system | After integration | All system tests pass
Acceptance Testing | Validate user requirements | Before deployment | All acceptance tests pass
Regression Testing | Verify no unintended changes | After modifications | All regression tests pass

Mandatory Actions:

  • Define testing strategy
  • Create test plans
  • Execute tests
  • Document results
  • Track defects
  • Obtain test approval
  • Block deployment if tests fail
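The "block deployment if tests fail" rule, combined with the five testing levels above, can be sketched as a deployment gate. The counts below are hypothetical examples, not real test figures.

```python
# Illustrative deployment gate: deployment is allowed only if every
# required testing level reports all tests passed. Counts are examples.

REQUIRED_LEVELS = ["unit", "integration", "system", "acceptance", "regression"]

def deployment_allowed(results: dict[str, tuple[int, int]]) -> bool:
    """results maps level -> (passed, total); every level must fully pass."""
    return all(
        level in results and results[level][0] == results[level][1]
        for level in REQUIRED_LEVELS
    )

results = {
    "unit": (120, 120),
    "integration": (45, 45),
    "system": (60, 60),
    "acceptance": (20, 20),
    "regression": (199, 200),  # one regression failure blocks deployment
}
allowed = deployment_allowed(results)
```

Note that a missing level blocks deployment just as a failed test does, so an incomplete test campaign cannot slip through the gate.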

Test Planning Requirements:

Element | Description | Required
Test Objectives | What to test | YES
Test Scope | What is covered | YES
Test Approach | How to test | YES
Test Environment | Test environment setup | YES
Test Data | Test data requirements | YES
Entry/Exit Criteria | When to start/stop testing | YES
Test Schedule | Testing timeline | YES
Resources | Resources required | YES

Evidence Required:

  • Testing Strategy (STRATEGY-AI-TEST-001)
  • Test plans (PLAN-AI-TEST-XXX)
  • Test results (TEST-AI-TEST-XXX)
  • Defect logs
  • Test reports
  • Approval records

Audit Verification:

  • Verify testing strategy defined
  • Confirm test plans created
  • Check tests executed
  • Validate test results documented
  • Verify approval obtained
  • Check deployment blocked if tests fail

Control QMS-014: Corrective and Preventive Actions (CAPA)

Control ID: QMS-014
Control Name: CAPA System Implementation
Control Type: Corrective/Preventive
Control Frequency: Continuous
Risk Level: High

Control Objective

Implement systematic CAPA system per Article 17(1)(a) to eliminate causes of nonconformities and prevent recurrence.

Control Requirements

CR-014.1: CAPA System Establishment

Establish and maintain CAPA system.

CAPA System Elements:

  • Nonconformity identification
  • Root cause analysis
  • Corrective action planning
  • Preventive action planning
  • Implementation and verification
  • Effectiveness review

Corrective Action Process:

  1. Identify nonconformity
  2. Contain immediate issue
  3. Investigate root cause
  4. Plan corrective action
  5. Implement corrective action
  6. Verify effectiveness
  7. Update documentation
  8. Close CAPA
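The eight corrective-action steps above can be sketched as a CAPA record that enforces step order, so a record cannot be closed before effectiveness is verified. The field and step names below are illustrative, not a mandated schema.

```python
# Sketch of the corrective action process above as an ordered CAPA record.
# Step and field names are illustrative paraphrases of the numbered list.
from dataclasses import dataclass, field

CAPA_STEPS = [
    "identified", "contained", "root_cause_found", "action_planned",
    "action_implemented", "effectiveness_verified", "documented", "closed",
]

@dataclass
class CapaRecord:
    capa_id: str
    completed: list[str] = field(default_factory=list)

    def complete_step(self, step: str) -> None:
        """Steps must be completed strictly in order; skipping raises."""
        expected = CAPA_STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected step '{expected}', got '{step}'")
        self.completed.append(step)

    @property
    def closed(self) -> bool:
        return self.completed == CAPA_STEPS
```

Enforcing the order in the record itself means "verify effectiveness before closure" is a structural property, which simplifies the audit verification listed later in this control.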

Preventive Action Process:

  1. Identify potential nonconformity
  2. Assess risk
  3. Plan preventive action
  4. Implement preventive action
  5. Verify effectiveness

Root Cause Analysis Methods:

  • 5 Whys
  • Fishbone diagram
  • Fault tree analysis
  • Pareto analysis

Mandatory Actions:

  • Define CAPA process
  • Train staff on CAPA
  • Manage CAPA workflow
  • Track CAPA to closure
  • Review CAPA effectiveness
  • Report CAPA metrics

Evidence Required:

  • CAPA Procedure (PROC-AI-CAPA-001)
  • CAPA tracking system
  • CAPA records (CAPA-AI-XXX)
  • Root cause analysis records
  • Action plans
  • Verification records
  • Training records
  • CAPA effectiveness reviews

Audit Verification:

  • Verify CAPA system established
  • Confirm CAPA process followed
  • Check root cause analysis conducted
  • Validate actions implemented
  • Verify effectiveness reviewed

Control QMS-015: Management Review and Continuous Improvement

Control ID: QMS-015
Control Name: QMS Management Review and Improvement
Control Type: Detective
Control Frequency: Annually minimum
Risk Level: Medium

Control Objective

Ensure QMS effectiveness through management review and drive continuous improvement to enhance QMS and AI system quality.

Control Requirements

CR-015.1: Management Review

Conduct periodic management review of QMS.

Review Frequency: At least annually, or more frequently if needed

Review Inputs:

  • Quality policy and objectives status
  • Quality metrics and KPIs
  • Audit results (internal and external)
  • Customer/user feedback
  • Process performance
  • Product/service conformity
  • Nonconformities and CAPA status
  • Changes affecting QMS
  • Improvement opportunities

Review Outputs:

  • Decisions on improvements
  • Resource allocation decisions
  • Quality objective updates
  • QMS changes needed

Mandatory Actions:

  • Schedule management reviews
  • Prepare review materials
  • Conduct review meeting
  • Document decisions
  • Track action items
  • Communicate outcomes
  • Implement decisions

Evidence Required:

  • Management Review Records (REC-AI-MGT-REVIEW-XXX)
  • Review presentations
  • Decisions and action items
  • Follow-up tracking

CR-015.2: Continuous Improvement

Drive continuous improvement of QMS and AI systems.

Improvement Sources:

  • CAPA system
  • Management review
  • Internal audits
  • Process metrics
  • Lessons learned
  • Industry best practices

Mandatory Actions:

  • Identify improvement opportunities
  • Prioritize improvements
  • Plan and implement improvements
  • Measure improvement effectiveness
  • Share lessons learned

Evidence Required:

  • Improvement initiatives log
  • Improvement project plans
  • Effectiveness measurements
  • Lessons learned database

Audit Verification:

  • Verify management review conducted annually
  • Confirm all inputs reviewed
  • Check decisions documented
  • Validate action items tracked
  • Verify improvements implemented

SUPPORTING PROCEDURES

This standard is implemented through the following detailed procedures:

Procedure PROC-AI-QMS-001: QMS Framework Implementation Procedure

Purpose: Define step-by-step process for establishing QMS
Owner: Quality Director
Implements: Controls QMS-001, QMS-002, QMS-003

Procedure Steps:

  1. Create Quality Manual - Control QMS-001
  2. Define quality policy and objectives - Control QMS-002
  3. Define QMS structure and roles - Control QMS-003
  4. Document all procedures
  5. Obtain executive approval
  6. Communicate to organization

Outputs:

  • Quality Manual
  • Quality Policy
  • Quality Objectives
  • QMS structure

Procedure PROC-AI-QMS-002: Design Control Procedure

Purpose: Define process for controlling design
Owner: Quality Director
Implements: Controls QMS-004, QMS-005, QMS-006, QMS-007, QMS-008, QMS-009, QMS-010, QMS-011

Procedure Steps:

  1. Create design plan - Control QMS-004
  2. Gather design inputs - Control QMS-005
  3. Create design outputs - Control QMS-006
  4. Conduct design reviews - Control QMS-007
  5. Verify design - Control QMS-008
  6. Validate design - Control QMS-009
  7. Transfer design - Control QMS-010
  8. Manage design changes - Control QMS-011

Outputs:

  • Design plans
  • Design inputs/outputs
  • Review records
  • Verification/validation records

Procedure PROC-AI-QMS-003: Quality Assurance Testing Procedure

Purpose: Define process for QA testing
Owner: Quality Director
Implements: Controls QMS-012, QMS-013

Procedure Steps:

  1. Define QA program - Control QMS-012
  2. Define testing strategy - Control QMS-013
  3. Plan tests
  4. Execute tests
  5. Report results

Outputs:

  • QA program
  • Test plans
  • Test results

Procedure PROC-AI-CAPA-001: Corrective and Preventive Action Procedure

Purpose: Define process for CAPA
Owner: Quality Director
Implements: Control QMS-014

Procedure Steps:

  1. Identify nonconformity/potential issue
  2. Conduct root cause analysis
  3. Plan actions
  4. Implement actions
  5. Verify effectiveness
  6. Close CAPA

Outputs:

  • CAPA records
  • Root cause analysis
  • Action plans

Procedure PROC-AI-QMS-004: Management Review Procedure

Purpose: Define process for management review
Owner: Quality Director
Implements: Control QMS-015

Procedure Steps:

  1. Prepare review materials
  2. Conduct review
  3. Document decisions
  4. Track action items
  5. Implement improvements

Outputs:

  • Management review records
  • Decisions
  • Action items

COMPLIANCE

5.1 Compliance Monitoring

Monitoring Approach: Continuous automated monitoring supplemented by monthly manual reviews and quarterly comprehensive audits.

Compliance Metrics:

Metric | Target | Measurement Method | Frequency | Owner
QMS Documentation Completeness | 100% | % of Article 17 elements documented | Monthly | Quality Director
Design Review Completion | 100% | % of required reviews completed | Monthly | Quality Director
Test Pass Rate | ≥95% | % of tests passed first time | Per release | Quality Director
CAPA Closure Rate | ≥90% | % of CAPAs closed on time | Monthly | Quality Director
Defect Escape Rate | <5% | % of defects found post-release | Per release | Quality Director
Audit Findings Closure | 100% | % of audit findings closed on time | Quarterly | Quality Director
Management Review Completion | 100% | % of scheduled reviews completed | Annually | Quality Director
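A hedged sketch of how these compliance metrics could be evaluated for breaches is shown below; metric keys paraphrase the table, targets match its thresholds, and the current values are hypothetical examples.

```python
# Illustrative breach report for the compliance metrics table above.
# "min" means the value must be at or above target; "max" means the
# value must stay below target (lower is better). Values are examples.

METRIC_TARGETS = {
    "qms_documentation_completeness": ("min", 100.0),
    "design_review_completion":       ("min", 100.0),
    "test_pass_rate":                 ("min", 95.0),
    "capa_closure_rate":              ("min", 90.0),
    "defect_escape_rate":             ("max", 5.0),
    "audit_findings_closure":         ("min", 100.0),
    "management_review_completion":   ("min", 100.0),
}

def breaches(current: dict[str, float]) -> list[str]:
    """Return the names of metrics that miss their targets."""
    out = []
    for name, (direction, target) in METRIC_TARGETS.items():
        value = current[name]
        if direction == "min" and value < target:
            out.append(name)
        if direction == "max" and value >= target:
            out.append(name)
    return sorted(out)

flagged = breaches({
    "qms_documentation_completeness": 100.0,
    "design_review_completion": 100.0,
    "test_pass_rate": 93.0,          # below the 95% target
    "capa_closure_rate": 92.0,
    "defect_escape_rate": 4.0,       # within the < 5% target
    "audit_findings_closure": 100.0,
    "management_review_completion": 100.0,
})
```

A breach list like this could feed the monthly compliance reports and the quarterly AI Governance Committee reviews listed below.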

Monitoring Tools:

  • QMS Dashboard
  • Quality Metrics Dashboard
  • Monthly compliance reports
  • Quarterly AI Governance Committee reviews

5.2 Internal Audit Requirements

Audit Frequency: Annually (minimum)

Audit Scope:

  • QMS documentation completeness
  • Process compliance
  • Design control effectiveness
  • Testing effectiveness
  • CAPA effectiveness
  • Management review effectiveness
  • Controls effectiveness (QMS-001 through QMS-015)

Audit Activities:

  • Review 100% of QMS documentation
  • Sample 20% of processes for compliance testing
  • Test design control process
  • Test the testing and validation process
  • Review CAPA system
  • Review management review
  • Interview key personnel
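
The 20% process sample above can be drawn reproducibly. This sketch assumes simple random sampling with a fixed seed so the selection is auditable; the process IDs and the round-up convention are illustrative assumptions:

```python
import math
import random

def sample_processes(process_ids, fraction=0.20, seed=None):
    """Select a fraction of processes for compliance testing (rounded up)."""
    n = math.ceil(len(process_ids) * fraction)
    rng = random.Random(seed)  # fixed seed makes the sample reproducible
    return sorted(rng.sample(process_ids, n))

processes = [f"PROC-{i:03d}" for i in range(1, 26)]  # 25 hypothetical processes
picked = sample_processes(processes, seed=42)
print(len(picked))  # 5
```

Recording the seed alongside the audit working papers lets a reviewer regenerate exactly the same sample.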

Audit Outputs:

  • Annual QMS Audit Report
  • Findings and recommendations
  • Corrective action plans for deficiencies

5.3 External Audit / Regulatory Inspection

Preparation:

  • Maintain audit-ready QMS documentation at all times
  • Designate Quality Director and Legal as regulatory liaisons
  • Prepare standard response procedures for authority requests

Provide to Auditors/Regulators:

  • Quality Manual
  • QMS procedures
  • Design control records
  • Test records
  • CAPA records
  • Management review records
  • Internal audit reports
  • Evidence of controls execution

Authority Request Response:

  • Acknowledge request within 1 business day
  • Provide requested documentation within 5 business days
  • Coordinate through Legal and Quality Director
  • Document all interactions with authorities
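
The 1- and 5-business-day response deadlines can be computed mechanically. This sketch counts Monday through Friday only and ignores public holidays (an assumption; a production calendar would subtract holidays too):

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Return the date `days` business days (Mon-Fri) after `start`."""
    current = start
    added = 0
    while added < days:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4
            added += 1
    return current

received = date(2025, 8, 1)                # a Friday, chosen for illustration
ack_due = add_business_days(received, 1)   # acknowledge within 1 business day
docs_due = add_business_days(received, 5)  # documentation within 5 business days
print(ack_due, docs_due)
```

For a Friday receipt, the acknowledgement falls on the following Monday and the documentation deadline on the following Friday, since weekends do not count.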

ROLES AND RESPONSIBILITIES

6.1 RACI Matrix

| Activity | Quality Director | Quality Manager | Design Quality Engineer | Test Quality Engineer | AI System Owner | Technical Lead | CAPA Coordinator |
|---|---|---|---|---|---|---|---|
| QMS Framework | R/A | R | I | I | I | I | I |
| Quality Policy | R/A | C | I | I | I | I | I |
| Design Planning | R | C | R | I | A | R | I |
| Design Inputs | R | C | R | I | A | R | I |
| Design Outputs | R | C | R | I | A | R | I |
| Design Review | R | C | R | C | A | R | I |
| Design Verification | R | C | R | C | A | R | I |
| Design Validation | R | R | C | R | A | C | I |
| Testing | R | C | I | R | A | C | I |
| CAPA | R | C | I | I | A | I | R/A |
| Management Review | R/A | R | I | I | C | I | I |

RACI Legend:

  • R = Responsible (does the work)
  • A = Accountable (ultimately answerable)
  • C = Consulted (provides input)
  • I = Informed (kept up-to-date)
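
Because RACI rows are easy to mis-maintain, a small consistency check can be sketched. The two sample rows restate the matrix above; the rule enforced (exactly one accountable role per activity) is a common RACI convention used here for illustration, not a requirement stated by this standard:

```python
# Roles in the column order of the RACI matrix above.
ROLES = ["Quality Director", "Quality Manager", "Design Quality Engineer",
         "Test Quality Engineer", "AI System Owner", "Technical Lead",
         "CAPA Coordinator"]

RACI = {
    "Design Validation": ["R", "R", "C", "R", "A", "C", "I"],
    "Testing":           ["R", "C", "I", "R", "A", "C", "I"],
}

def check_raci(matrix):
    """Flag activities that do not have exactly one accountable (A or R/A) role."""
    issues = []
    for activity, row in matrix.items():
        accountable = [entry for entry in row if "A" in entry]
        if len(accountable) != 1:
            issues.append(activity)
    return issues

print(check_raci(RACI))  # [] -> each sampled activity has exactly one A
```

Running such a check whenever the matrix is edited catches rows where accountability was accidentally dropped or duplicated.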

6.2 Role Descriptions

Quality Director

  • Primary Responsibility: Owns QMS framework, ensures compliance
  • Key Activities:
    • Establishes QMS framework
    • Approves quality policy
    • Oversees QMS implementation
    • Reports to management
  • Required Competencies: EU AI Act Article 17, ISO 9001, quality management

Quality Manager

  • Primary Responsibility: Day-to-day QMS management
  • Key Activities:
    • Manages QMS operations
    • Coordinates quality activities
    • Monitors quality metrics
    • Reports quality status
  • Required Competencies: Quality management, process improvement

Design Quality Engineer

  • Primary Responsibility: Design control and verification
  • Key Activities:
    • Reviews design processes
    • Verifies design outputs
    • Conducts design reviews
  • Required Competencies: Design control, verification

Test Quality Engineer

  • Primary Responsibility: Testing and validation
  • Key Activities:
    • Plans testing
    • Executes tests
    • Reports test results
  • Required Competencies: Testing methodologies, validation

AI System Owner

  • Primary Responsibility: Accountable for quality of their AI system
  • Key Activities:
    • Ensures QMS followed
    • Approves design decisions
    • Participates in reviews
  • Required Competencies: AI system knowledge, quality awareness

CAPA Coordinator

  • Primary Responsibility: CAPA system management
  • Key Activities:
    • Manages CAPA workflow
    • Coordinates root cause analysis
    • Tracks CAPA to closure
  • Required Competencies: Root cause analysis, CAPA

EXCEPTIONS

7.1 Exception Philosophy

Quality management is a critical regulatory compliance activity for high-risk AI systems. Exceptions are granted restrictively and only where compensating controls adequately mitigate risks.


7.2 Allowed Exceptions

The following exceptions may be granted with proper justification and approval:

| Exception Type | Justification Required | Maximum Duration | Approval Authority | Compensating Controls |
|---|---|---|---|---|
| Simplified QMS (Minimal-Risk AI) | AI system clearly minimal-risk; simplified QMS sufficient | Permanent | Quality Director | Document rationale; Annual re-confirmation |
| Extended Design Review Timeline | Resource constraints prevent timely review | 30 days | Quality Director + AI System Owner | Interim review; Accelerated plan |

7.3 Prohibited Exceptions

The following exceptions cannot be granted under any circumstances:

  • Skipping QMS for high-risk AI - Mandatory per Article 17, no exceptions
  • Skipping design reviews - Required for design quality
  • Skipping design verification - Required for design correctness
  • Skipping design validation - Required before deployment
  • Skipping testing - Required per Article 17(1)(d)
  • Skipping CAPA - Required per Article 17(1)(a)


7.4 Exception Request Process

Step 1: Submit Exception Request

  • Complete Exception Request Form (FORM-AI-EXCEPTION-001)
  • Include business justification
  • Propose compensating controls
  • Specify duration requested
  • Attach risk assessment

Step 2: Risk Assessment

  • Quality Director assesses risk of granting exception
  • Evaluates adequacy of compensating controls
  • Documents residual risk

Step 3: Approval

  • Route to appropriate approval authority based on exception type
  • Quality Director approval: Minor exceptions
  • Quality Director + AI Governance Committee: Significant exceptions
  • AI Governance Committee: Critical exceptions

Step 4: Documentation and Monitoring

  • Document exception in Exception Register
  • Assign exception owner
  • Set review date
  • Monitor compensating controls
  • Report exceptions quarterly to AI Governance Committee

Step 5: Exception Review and Closure

  • Review exception at specified review date
  • Assess if exception still needed
  • Close exception when normal QMS completed
  • Document lessons learned

ENFORCEMENT

8.1 Non-Compliance Consequences

| Violation | Severity | Consequence | Remediation Required |
|---|---|---|---|
| High-risk AI without QMS | Critical | Immediate suspension until QMS established | Establish QMS within 30 business days; Root cause analysis |
| Skipping design reviews | High | Escalation to AI Governance Committee | Complete reviews within 10 business days |
| Skipping design validation | Critical | Block deployment | Complete validation before deployment |
| Skipping testing | Critical | Block deployment | Complete testing before deployment |
| CAPA not implemented | High | Escalation to management | Implement CAPA within 10 business days |
| Management review not conducted | Medium | Written warning | Complete review within 30 business days |

8.2 Escalation Procedures

Level 1: Quality Director

  • Minor procedural violations
  • Documentation deficiencies
  • Timeline delays < 5 days
  • Action: Written warning, corrective action required

Level 2: Quality Director + AI Governance Committee

  • Repeated violations
  • Missing design reviews
  • Missing testing
  • Action: Formal review, corrective action plan, management notification

Level 3: AI Governance Committee

  • High-risk AI without QMS
  • Missing design validation
  • Critical compliance failures
  • Action: Immediate AI system suspension, investigation, disciplinary action

Level 4: Executive Management + Legal

  • Potential regulatory enforcement action
  • Significant legal liability
  • Reputational risk
  • Action: Executive crisis management, legal strategy, regulatory engagement

8.3 Immediate Escalation Triggers

Escalate immediately to AI Governance Committee + Legal if:

  • ⚠️ High-risk AI system operating without QMS
  • ⚠️ Design validation not completed before deployment
  • ⚠️ Testing not completed before deployment
  • ⚠️ Regulatory inquiry or inspection related to QMS
  • ⚠️ Critical quality issues affecting safety

8.4 Disciplinary Actions

Individuals responsible for QMS violations may be subject to:

  • Verbal or written warning
  • Mandatory retraining
  • Performance improvement plan
  • Reassignment of responsibilities
  • Suspension (with pay during investigation)
  • Termination (for egregious violations, e.g., knowingly deploying without validation)

Factors Considered:

  • Intent (knowing violation vs. honest mistake)
  • Severity of violation
  • Impact (actual or potential)
  • Cooperation with remediation
  • Prior violation history

KEY PERFORMANCE INDICATORS (KPIs)

9.1 Quality Management KPIs

| KPI ID | KPI Name | Definition | Target | Measurement Method | Frequency | Owner | Reporting To |
|---|---|---|---|---|---|---|---|
| KPI-QMS-001 | QMS Documentation Completeness | % of Article 17 elements documented | 100% | (# elements documented / # total elements) × 100 | Monthly | Quality Director | AI Governance Committee |
| KPI-QMS-002 | Design Review Completion | % of required reviews completed | 100% | (# reviews completed / # required reviews) × 100 | Monthly | Quality Director | Management |
| KPI-QMS-003 | Test Pass Rate | % of tests passed first time | ≥95% | (# tests passed / # total tests) × 100 | Per release | Quality Director | Management |
| KPI-QMS-004 | CAPA Closure Rate | % of CAPAs closed on time | ≥90% | (# CAPAs closed on time / # total CAPAs) × 100 | Monthly | Quality Director | AI Governance Committee |
| KPI-QMS-005 | Defect Escape Rate | % of defects found post-release | <5% | (# defects post-release / # total defects) × 100 | Per release | Quality Director | Management |
| KPI-QMS-006 | Audit Findings Closure | % of audit findings closed on time | 100% | (# findings closed on time / # total findings) × 100 | Quarterly | Quality Director | AI Governance Committee |
| KPI-QMS-007 | Management Review Completion | % of scheduled reviews completed | 100% | (# reviews completed / # scheduled reviews) × 100 | Annually | Quality Director | Executive Management |
| KPI-QMS-008 | Design Verification Completion | % of designs verified | 100% | (# designs verified / # total designs) × 100 | Monthly | Quality Director | Management |
| KPI-QMS-009 | Design Validation Completion | % of designs validated before deployment | 100% | (# designs validated / # designs deployed) × 100 | Monthly | Quality Director | AI Governance Committee |
| KPI-QMS-010 | QMS Effectiveness Score | Composite QMS effectiveness score | ≥90% | Weighted average of effectiveness metrics | Quarterly | Quality Director | AI Governance Committee |
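
Most KPIs above share the same formula shape, (count meeting the criterion / total) × 100. A minimal sketch, with KPI-QMS-004 as the worked example; the zero-denominator convention and rounding are assumptions, not defined by this standard:

```python
def kpi_percent(numerator: int, denominator: int) -> float:
    """Generic KPI formula: (count meeting criterion / total) × 100."""
    if denominator == 0:
        return 100.0  # assumption: nothing in scope counts as fully compliant
    return round(numerator / denominator * 100, 1)

# KPI-QMS-004: CAPA Closure Rate, target ≥ 90%
closed_on_time, total_capas = 18, 20  # hypothetical monthly counts
rate = kpi_percent(closed_on_time, total_capas)
print(rate, rate >= 90)  # 90.0 True
```

The same helper serves KPI-QMS-001 through 009; only the numerator and denominator definitions change per the Measurement Method column.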

9.2 KPI Dashboards and Reporting

Real-Time Dashboard (Quality Director access)

  • Current QMS status
  • Design review status
  • Testing status
  • CAPA status
  • Quality metrics

Monthly Management Report

  • KPI-QMS-001, 002, 003, 004, 005, 008, 009
  • Trend analysis (vs. previous month)
  • Issues and risks
  • Planned actions

Quarterly AI Governance Committee Report

  • All KPIs
  • QMS effectiveness assessment
  • Audit findings status
  • Internal audit findings (if conducted)
  • Exception register review

Annual Executive Report

  • Full-year KPI performance
  • QMS maturity assessment
  • Strategic recommendations
  • Regulatory outlook

9.3 KPI Thresholds and Alerts

| KPI | Green (Good) | Yellow (Warning) | Red (Critical) | Alert Action |
|---|---|---|---|---|
| QMS Documentation Completeness | 100% | 95-99% | < 95% | Red: Immediate escalation to AI Governance Committee Chair |
| Design Review Completion | 100% | 90-99% | < 90% | Red: Escalate to AI Governance Committee |
| Test Pass Rate | ≥95% | 90-94% | < 90% | Red: Block deployments until improved |
| CAPA Closure Rate | ≥90% | 80-89% | < 80% | Yellow: Accelerate CAPA; Red: Escalate to AI Governance Committee |
| Defect Escape Rate | <5% | 5-10% | > 10% | Red: Escalate to AI Governance Committee |
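
The green/yellow/red bands can be encoded as floors for the higher-is-better KPIs. The threshold values restate the table above; everything else (the data structure, the banding function) is an illustrative assumption, and a lower-is-better KPI such as Defect Escape Rate would invert the comparisons:

```python
# (green_floor, yellow_floor) per KPI; value below yellow_floor is red.
THRESHOLDS = {
    "QMS Documentation Completeness": (100.0, 95.0),
    "Design Review Completion":       (100.0, 90.0),
    "Test Pass Rate":                 (95.0, 90.0),
    "CAPA Closure Rate":              (90.0, 80.0),
}

def band(kpi: str, value: float) -> str:
    """Classify a higher-is-better KPI value into green/yellow/red."""
    green_floor, yellow_floor = THRESHOLDS[kpi]
    if value >= green_floor:
        return "green"
    if value >= yellow_floor:
        return "yellow"
    return "red"

print(band("Test Pass Rate", 92.0))   # yellow
print(band("CAPA Closure Rate", 75))  # red -> escalate per the table's Alert Action
```

Encoding the bands once, centrally, keeps the dashboard and the alerting logic from drifting apart.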

TRAINING REQUIREMENTS

10.1 Training Program Overview

All personnel involved in quality management must complete role-specific training to ensure competency in EU AI Act Article 17 requirements, QMS processes, and quality procedures.


10.2 Role-Based Training Requirements

| Role | Training Course | Duration | Content | Frequency | Assessment Required |
|---|---|---|---|---|---|
| Quality Director | QMS Management Expert Training | 20 hours | EU AI Act Article 17; ISO 9001; QMS framework; Management review | Initial + annually | Yes - Written exam (≥90%) |
| Quality Manager | QMS Operations Training | 16 hours | QMS processes; Quality assurance; Testing; CAPA | Initial + annually | Yes - Written exam (≥90%) |
| Design Quality Engineer | Design Control Training | 12 hours | Design control; Design review; Verification; Validation | Initial + annually | Yes - Practical exercise |
| Test Quality Engineer | Testing and Validation Training | 12 hours | Testing methodologies; Test planning; Test execution | Initial + annually | Yes - Practical exercise |
| CAPA Coordinator | CAPA Training | 8 hours | CAPA process; Root cause analysis; Action planning | Initial + annually | Yes - Practical exercise |
| AI System Owners | QMS Overview | 4 hours | QMS requirements; Responsibilities; Design control | At onboarding + annually | Yes - Knowledge check (≥80%) |
| All AI Development Staff | Quality Awareness | 2 hours | Quality basics; QMS awareness; Quality requirements | At onboarding + annually | Yes - Knowledge check (≥80%) |

10.3 Training Content by Topic

EU AI Act Article 17 Requirements

  • QMS establishment (Article 17(1))
  • QMS elements (Article 17(1)(a-m))
  • Compliance obligations

Design Control

  • Design planning
  • Design inputs/outputs
  • Design review
  • Design verification
  • Design validation
  • Design change control

Quality Assurance and Testing

  • QA program
  • Testing strategy
  • Test planning
  • Test execution
  • Defect management

CAPA

  • CAPA process
  • Root cause analysis
  • Corrective actions
  • Preventive actions

Management Review

  • Review process
  • Review inputs/outputs
  • Continuous improvement

10.4 Training Delivery Methods

Initial Training:

  • Instructor-led classroom or virtual training
  • Includes interactive exercises and case studies
  • Hands-on practice with QMS tools
  • Group discussions of complex scenarios

Annual Refresher:

  • E-learning modules for core content review
  • Live update sessions for regulatory changes
  • Case study reviews of recent QMS activities
  • Knowledge assessment

On-the-Job Training:

  • Mentoring for new quality staff
  • Job shadowing during QMS activities
  • Supervised QMS work for first 3 AI systems

Just-in-Time Training:

  • Quick reference guides and job aids
  • Video tutorials on specific topics
  • Help desk support from experienced quality staff

10.5 Training Effectiveness Measurement

Assessment Methods:

  • Written exams for knowledge retention
  • Practical exercises for skill application
  • On-the-job observations for competency validation
  • Feedback surveys for training quality

Competency Validation:

  • Quality Directors: Must demonstrate ability to establish QMS for 1 sample AI system with 100% compliance before independent work
  • All staff: Must pass knowledge assessments with minimum required scores

Training Metrics:

| Metric | Target | Frequency |
|---|---|---|
| Training completion rate | 100% | Quarterly |
| Assessment pass rate (first attempt) | ≥ 90% | Per training |
| Training effectiveness score (survey) | ≥ 4.0/5.0 | Per training |
| Time to competency (Quality Directors) | < 45 days | Per person |

10.6 Training Records

Records Maintained:

  • Training attendance records
  • Assessment scores
  • Competency validations
  • Refresher training completion
  • Individual training transcripts

Retention: 10 years (to align with EU AI Act documentation retention)

Access: HR, Quality Director, Internal Audit, Competent Authorities (upon request)


DEFINITIONS

| Term | Definition | Source |
|---|---|---|
| Quality Management System (QMS) | Documented system for ensuring AI systems meet quality requirements | EU AI Act Article 17 |
| Quality Policy | Organization's intentions and direction related to quality | ISO 9001:2015 |
| Quality Objective | Measurable quality target | ISO 9001:2015 |
| Nonconformity | Failure to meet a requirement | ISO 9001:2015 |
| Corrective Action | Action to eliminate cause of nonconformity | ISO 9001:2015 |
| Preventive Action | Action to eliminate cause of potential nonconformity | ISO 9001:2015 |
| CAPA | Corrective and Preventive Action system | This Standard |
| Design Input | Requirements that form the basis for design | ISO 9001:2015 |
| Design Output | Results of design process | ISO 9001:2015 |
| Design Verification | Confirmation that design outputs meet design inputs | ISO 9001:2015 |
| Design Validation | Confirmation that design meets user needs and intended use | ISO 9001:2015 |
| Design Review | Systematic review of design at appropriate stages | ISO 9001:2015 |

LINK WITH AI ACT AND ISO42001

12.1 EU AI Act Regulatory Mapping

This standard implements the following EU AI Act requirements:

| EU AI Act Provision | Article | Requirement Summary | Implemented By (Controls) |
|---|---|---|---|
| Quality Management System | Article 17 | QMS for high-risk AI | All controls (QMS-001 through QMS-015) |
| QMS Documentation | Article 17(1) | Documented QMS | QMS-001 |
| Regulatory Compliance Strategy | Article 17(1)(a) | Compliance procedures | QMS-001, QMS-014 |
| Design Control | Article 17(1)(b) | Design procedures | QMS-004 through QMS-011 |
| Development and QA | Article 17(1)(c) | Development and QA procedures | QMS-012, QMS-013 |
| Testing and Validation | Article 17(1)(d) | Testing procedures | QMS-013 |
| Technical Specifications | Article 17(1)(e) | Technical specifications | QMS-005 |
| Data Management | Article 17(1)(f) | Data management procedures | QMS-001 (integrated with STD-AI-003) |
| Risk Management | Article 17(1)(g) | Risk management integration | QMS-001 (integrated with STD-AI-002) |
| Post-Market Monitoring | Article 17(1)(h) | Post-market monitoring integration | QMS-001 (integrated with STD-AI-012) |

12.2 ISO/IEC 42001:2023 Alignment

This standard aligns with ISO/IEC 42001:2023 as follows:

| ISO 42001 Clause | Requirement | Implementation in This Standard |
|---|---|---|
| Clause 4.4: AI management system | Establish AI management system | QMS-001 |
| Clause 5.2: AI policy | Establish AI policy | QMS-002 |
| Clause 6.2: AI objectives | Establish AI objectives | QMS-002 |
| Clause 7.5: Documented information | Maintain documented information | QMS-001 |
| Clause 8.1: Operational planning and control | Plan and control operations | QMS-004, QMS-005, QMS-006 |
| Clause 8.2: AI risk assessment | Risk assessment | QMS-001 (integrated) |
| Clause 9.2: Internal audit | Conduct internal audits | Compliance Section 5.2 |
| Clause 9.3: Management review | Conduct management reviews | QMS-015 |
| Clause 10.1: Continual improvement | Continually improve | QMS-015 |
| Clause 10.2: Nonconformity and corrective action | Address nonconformities | QMS-014 |

12.3 ISO 9001:2015 Alignment

This standard aligns with ISO 9001:2015 as follows:

| ISO 9001 Clause | Requirement | Implementation in This Standard |
|---|---|---|
| Clause 4.4: Quality management system | Establish QMS | QMS-001 |
| Clause 5.2: Quality policy | Establish quality policy | QMS-002 |
| Clause 6.2: Quality objectives | Establish quality objectives | QMS-002 |
| Clause 8.3: Design and development | Design and development control | QMS-004 through QMS-011 |
| Clause 8.5: Production and service provision | Control production | QMS-012, QMS-013 |
| Clause 8.7: Control of nonconforming outputs | Control nonconformities | QMS-014 |
| Clause 9.2: Internal audit | Conduct internal audits | Compliance Section 5.2 |
| Clause 9.3: Management review | Conduct management reviews | QMS-015 |
| Clause 10.2: Nonconformity and corrective action | Address nonconformities | QMS-014 |

12.4 Relationship to Other Standards

This quality management standard integrates with other AI Act standards:

| Related Standard | Integration Point | Rationale |
|---|---|---|
| STD-AI-001: Classification | Classification determines if QMS required | High-risk AI requires Article 17 QMS |
| STD-AI-002: Risk Management | Risk management integrated in QMS (Article 17(1)(g)) | QMS includes risk management |
| STD-AI-003: Data Governance | Data management integrated in QMS (Article 17(1)(f)) | QMS includes data management |
| STD-AI-004: Technical Documentation | Technical specifications in QMS (Article 17(1)(e)) | QMS includes technical specifications |
| STD-AI-012: Post-Market Monitoring | Post-market monitoring integrated in QMS (Article 17(1)(h)) | QMS includes post-market monitoring |

12.5 References and Related Documents

EU AI Act (Regulation (EU) 2024/1689):

  • Article 17: Quality Management System
  • Article 17(1): QMS Requirements
  • Article 17(1)(a-m): QMS Elements (13 mandatory elements)

ISO/IEC Standards:

  • ISO/IEC 42001:2023: Information technology — Artificial intelligence — Management system
  • ISO 9001:2015: Quality management systems — Requirements
  • ISO 13485:2016: Medical devices — Quality management systems (if applicable)

Internal Documents:

  • POL-AI-001: Artificial Intelligence Policy (parent policy)
  • STD-AI-001: AI System Classification Standard
  • STD-AI-002: AI Risk Management Standard
  • STD-AI-003: AI Data Governance Standard
  • STD-AI-004: AI Technical Documentation Standard
  • STD-AI-012: AI Post-Market Monitoring Standard
  • PROC-AI-QMS-001, -002, -003, -004: QMS procedures
  • PROC-AI-CAPA-001: CAPA procedure

APPROVAL AND AUTHORIZATION

| Role | Name | Title | Signature | Date |
|---|---|---|---|---|
| Prepared By | Robert Martinez | Quality Director | _________________ | |
| Reviewed By | David Lee | Chief Technology Officer | _________________ | |
| Reviewed By | Sarah Johnson | AI Act Program Manager | _________________ | |
| Reviewed By | Jane Doe | Chief Strategy & Risk Officer | _________________ | |
| Approved By | Jane Doe | AI Governance Committee Chair | _________________ | |

Effective Date: 2025-08-01
Next Review Date: 2026-08-01
Review Frequency: Annually or upon regulatory change


END OF STANDARD STD-AI-009


This standard is a living document. Feedback and improvement suggestions should be directed to the Quality Director.
