aicomply · Lesson · 15 min · Chapter 7 of 8

Case Study: Sandbox Success

Real-world example of successful regulatory sandbox participation.

Learning Objectives

By the end of this chapter, you will be able to:

  • Apply sandbox participation strategies through a detailed realistic scenario
  • Identify key success factors and common pitfalls in sandbox engagement
  • Understand the practical timeline and resource requirements
  • Recognise how sandbox outcomes translate to market readiness
  • Extract lessons applicable to your own sandbox planning

Introduction: Learning from Experience

This comprehensive case study follows a fictional start-up through the entire sandbox process—from initial application through successful market entry. While fictional, the scenario draws on common patterns and challenges observed in real regulatory sandbox participation.

Note: This case study is illustrative. Actual sandbox processes vary by Member State and will evolve as the AI Act is implemented.


Company Profile: RareFind Diagnostics

Company Overview

| Attribute | Details |
| --- | --- |
| Company name | RareFind Diagnostics GmbH |
| Location | Munich, Germany |
| Founded | 2021 |
| Employees | 32 (at sandbox application) |
| Funding | Series A, €8M raised |
| Focus | AI-assisted diagnosis of rare diseases |

The AI System: RareFind Engine

| System Attribute | Description |
| --- | --- |
| Function | Analyses patient symptoms, medical history, and test results to suggest possible rare disease diagnoses |
| Technology | Multi-modal machine learning combining NLP, structured data analysis, and image interpretation |
| Users | Specialist physicians in rare disease centres |
| Output | Ranked list of possible diagnoses with confidence scores and supporting evidence |
| Role | Decision support—physician makes final diagnosis |

Why Sandbox?

RareFind faced significant uncertainty:

| Challenge | Concern |
| --- | --- |
| Classification | Clearly high-risk (medical device route under Article 6(1) and Annex I, via the MDR), but novel approach |
| Data governance | Training on rare disease data raises unique challenges |
| Conformity path | Medical device integration path unclear under AI Act |
| Novel technology | No precedent for similar systems |
| Resource constraints | Start-up resources, couldn't afford compliance missteps |

Phase 1: Preparation (Months 1-2)

Decision to Apply

RareFind's leadership evaluated options:

| Option | Pros | Cons | Decision |
| --- | --- | --- | --- |
| Direct market entry | Fastest if successful | High risk of non-compliance, expensive mistakes | ❌ Rejected |
| Sandbox participation | Regulatory guidance, reduced uncertainty | Takes time, resource commitment | ✅ Selected |
| Wait and see | Learn from others' experience | Lose first-mover advantage, delayed market entry | ❌ Rejected |

Sandbox Selection

| Consideration | Analysis |
| --- | --- |
| National sandbox (Germany) | BfArM (Federal Institute for Drugs and Medical Devices) likely authority; familiar jurisdiction, German language |
| Joint sandbox | No suitable multi-state sandbox for medical AI at time of application |
| AI Office sandbox | Not applicable—system is not GPAI |

Decision: Apply to German national sandbox once established.

Internal Preparation

| Activity | Owner | Deliverables |
| --- | --- | --- |
| System documentation | CTO | Technical documentation draft, architecture diagrams |
| Risk assessment | Head of Product | Preliminary FMEA, risk classification rationale |
| Data governance | Data Lead | Training data documentation, GDPR compliance evidence |
| Testing plan | Head of Clinical | Proposed hospital partner, testing methodology |
| Safeguards design | CTO + Legal | Human oversight design, informed consent process |

Phase 2: Application (Month 3)

Application Package

RareFind submitted:

| Component | Content | Page Count |
| --- | --- | --- |
| Executive summary | Company overview, system purpose, sandbox objectives | 3 |
| System description | Technical architecture, algorithm description, intended use | 15 |
| Risk classification | Analysis against Article 6 and Annex I, rationale for high-risk | 5 |
| Compliance questions | Specific guidance sought (10 questions) | 3 |
| Testing proposal | Methodology, hospital partner, safeguards | 12 |
| Safeguards plan | Informed consent, data protection, human oversight | 8 |
| Resource commitment | Team allocation, timeline, budget | 4 |
| SME evidence | Financial statements, employee count | 2 |

Key Compliance Questions Submitted

  1. Does our human-in-the-loop design satisfy Article 14 human oversight requirements?
  2. What data governance documentation is required for rare disease training data?
  3. How should we approach conformity assessment given MDR/AI Act intersection?
  4. Are our transparency mechanisms sufficient for Article 13?
  5. What monitoring requirements apply post-deployment?
  6. How should we document model uncertainty for rare conditions?
  7. What validation approaches are appropriate given rare disease data limitations?
  8. How does our logging approach align with Article 12 requirements?
  9. What is the appropriate scope for bias testing with rare disease populations?
  10. How should we structure instructions for use for specialist physicians?

Application Outcome

| Timeline | Event |
| --- | --- |
| Week 1 | Application submitted |
| Week 3 | Acknowledgment received, assigned case officer |
| Week 5 | Clarification questions from authority |
| Week 6 | RareFind responds with additional detail |
| Week 8 | Invitation to application interview |
| Week 10 | Interview conducted |
| Week 12 | Acceptance notification with conditions |

Phase 3: Plan Negotiation (Month 4)

Sandbox Plan Development

| Plan Element | RareFind Proposal | Authority Feedback | Final Agreement |
| --- | --- | --- | --- |
| Duration | 9 months | Recommended 12 months | 12 months |
| Testing scope | 500 patients | Increase for statistical validity | 1,000 patients minimum |
| Hospital partners | 1 rare disease centre | Recommend multi-site | 3 centres across Germany |
| Supervision | Monthly reports | Add quarterly meetings | Monthly reports + quarterly meetings |
| Safeguards | Standard informed consent | Enhanced consent for AI | Enhanced AI-specific consent |
| Exit criteria | Performance validation | Add documentation assessment | Performance + documentation + gap assessment |

Agreed Milestones

| Milestone | Month | Deliverables |
| --- | --- | --- |
| M1: Onboarding complete | 1 | Sandbox plan signed, team assigned |
| M2: Testing infrastructure | 2 | Hospital partnerships active, consent process operational |
| M3: Initial testing | 3-5 | 300 patient cases processed |
| M4: Interim review | 6 | Performance analysis, gap identification |
| M5: Extended testing | 7-9 | 1,000 cases complete, bias analysis |
| M6: Documentation finalisation | 10 | All documentation updated |
| M7: Exit assessment | 11-12 | Final evaluation, exit report |

Phase 4: Active Participation (Months 5-16)

Month 5-6: Infrastructure and Testing Setup

| Activity | Outcome |
| --- | --- |
| Hospital onboarding | 3 centres signed data sharing agreements |
| Consent process | AI-specific consent form developed and approved |
| Technical integration | RareFind Engine integrated with hospital systems |
| Monitoring setup | Real-time performance dashboard operational |
| Training | Physician users trained on system use |

Month 7-9: Initial Testing Phase

| Metric | Target | Actual | Status |
| --- | --- | --- | --- |
| Cases processed | 300 | 347 | ✅ On track |
| System availability | 99% | 99.2% | ✅ Met |
| Physician satisfaction | >80% | 87% | ✅ Exceeded |
| Diagnostic accuracy | Baseline TBD | 73% match with final diagnosis | ⚠️ For review |
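The accuracy figure above is a match rate: the share of cases where the physician's final diagnosis appears among the system's ranked suggestions. A minimal sketch of how such a metric could be computed, assuming hypothetical per-case records (the field names `suggested` and `final` are illustrative, not from any real RareFind interface):

```python
# Hypothetical per-case records: the AI's ranked suggestions and the
# physician's final diagnosis. Field names are illustrative only.
cases = [
    {"suggested": ["Fabry disease", "Gaucher disease"], "final": "Fabry disease"},
    {"suggested": ["Pompe disease", "Fabry disease"], "final": "Wilson disease"},
    {"suggested": ["Gaucher disease"], "final": "Gaucher disease"},
]

def top_k_match_rate(cases, k=5):
    """Share of cases whose final diagnosis appears in the top-k suggestions."""
    hits = sum(1 for c in cases if c["final"] in c["suggested"][:k])
    return hits / len(cases)

print(f"Match rate: {top_k_match_rate(cases):.0%}")  # 2 of 3 illustrative cases match
```

In a real deployment the choice of k, and whether partial or differential diagnoses count as matches, would need to be agreed with the supervising authority and documented as part of the validation methodology.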

Month 10: Interim Review (Key Meeting)

Agenda:

  1. Performance analysis
  2. Compliance gap assessment
  3. Guidance on outstanding questions
  4. Plan adjustments

Key Findings:

| Area | Finding | Authority Guidance |
| --- | --- | --- |
| Human oversight | Design deemed appropriate | Minor documentation enhancement needed |
| Accuracy | 73% match rate acceptable for rare diseases | Document expected performance range |
| Data governance | Documentation gaps identified | Specific additions required |
| Transparency | Explanation quality variable | Improve consistency of explanations |
| Bias testing | Limited by rare disease demographics | Accept methodology limitations with documentation |

Plan Adjustment: Extended testing phase by 1 month to address documentation gaps.

Month 11-14: Extended Testing and Refinement

| Activity | Deliverables |
| --- | --- |
| Continued testing | 1,047 total cases processed |
| Documentation updates | 23 documentation gaps addressed |
| System improvements | Explanation consistency improved |
| Bias analysis | Comprehensive analysis across available demographics |
| Physician feedback integration | 12 usability improvements implemented |

Monthly Reporting Example

```
RAREFIND DIAGNOSTICS - SANDBOX MONTHLY REPORT
Month: 11 (March 2026)

TESTING PROGRESS
Cases processed this month: 142
Cumulative cases: 789
Target: 1,000 (79% complete)

PERFORMANCE METRICS
Accuracy (vs. final diagnosis): 74%
System availability: 99.4%
Physician satisfaction: 89%

INCIDENTS
None this period

COMPLIANCE ACTIVITIES
- Data governance documentation v2.1 completed
- Explanation templates refined
- Article 13 transparency checklist addressed

NEXT MONTH FOCUS
- Complete 1,000 case target
- Finalise bias analysis
- Begin conformity assessment preparation
```
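A report like this can be generated mechanically from a handful of tracked counters, which keeps monthly submissions consistent and reduces transcription errors. A minimal sketch, assuming the metrics are tracked as plain numbers (the function and its parameters are illustrative, not a prescribed reporting format):

```python
def render_monthly_report(month, cases_this_month, cumulative, target,
                          accuracy, availability, satisfaction):
    """Render a plain-text sandbox progress summary from tracked metrics."""
    pct = round(100 * cumulative / target)
    return (
        f"Month: {month}\n"
        f"Cases processed this month: {cases_this_month}\n"
        f"Cumulative cases: {cumulative}\n"
        f"Target: {target} ({pct}% complete)\n"
        f"Accuracy (vs. final diagnosis): {accuracy:.0%}\n"
        f"System availability: {availability:.1%}\n"
        f"Physician satisfaction: {satisfaction:.0%}"
    )

# Figures from the Month 11 example above.
print(render_monthly_report(11, 142, 789, 1000, 0.74, 0.994, 0.89))
```

Automating the numeric sections still leaves the narrative parts (incidents, compliance activities, next-month focus) to be written by the team each period.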

Month 15-16: Exit Assessment

| Assessment Area | Evidence Reviewed | Outcome |
| --- | --- | --- |
| Article 9 Risk Management | RMS documentation, risk register | ✅ Satisfactory |
| Article 10 Data Governance | Data documentation, bias testing | ✅ Satisfactory (with recommendations) |
| Article 11 Documentation | Technical documentation package | ✅ Complete, minor updates noted |
| Article 12 Logging | Logging design, sample logs | ✅ Satisfactory |
| Article 13 Transparency | Instructions for use, explanations | ✅ Satisfactory |
| Article 14 Human Oversight | Oversight design, physician workflow | ✅ Satisfactory |
| Article 15 Accuracy/Robustness | Validation results, security review | ✅ Satisfactory |

Phase 5: Exit and Market Entry (Month 17-20)

Exit Report Summary

| Section | Key Content |
| --- | --- |
| Compliance assessment | System meets AI Act high-risk requirements |
| Recommendations | 5 minor recommendations for enhancement |
| Conditions | None—unconditional pathway to market |
| Outstanding issues | Ongoing monitoring approach to be implemented |
| EU-wide validity | Exit report valid throughout the Union |

Post-Sandbox Activities

| Activity | Timeline | Owner |
| --- | --- | --- |
| Implement recommendations | Month 17 | CTO |
| Conformity assessment (internal) | Month 17-18 | Quality Lead |
| CE marking preparation | Month 18 | Regulatory Affairs |
| Declaration of Conformity (DoC) drafting | Month 18 | Legal |
| Market entry | Month 20 | Commercial |

Resource Investment Summary

| Resource | Investment |
| --- | --- |
| Internal team time | ~2,400 person-hours over 16 months |
| External legal support | €45,000 |
| Hospital partnerships | €60,000 (data access fees) |
| Technology infrastructure | €25,000 |
| Documentation effort | ~800 person-hours |
| Total direct cost | ~€130,000 + internal time |
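The stated total can be checked by tallying the three monetary line items (figures taken from the table; the dictionary is just a convenience structure for the check):

```python
# Direct-cost line items from the resource investment table (EUR).
direct_costs = {
    "External legal support": 45_000,
    "Hospital partnerships (data access fees)": 60_000,
    "Technology infrastructure": 25_000,
}

total = sum(direct_costs.values())
print(f"Total direct cost: €{total:,}")  # €130,000, matching the table
```

The person-hour figures are tracked separately because they represent internal opportunity cost rather than cash outlay.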

Success Factors Analysis

What Worked Well

| Factor | Impact |
| --- | --- |
| Early, thorough preparation | Strong application, smooth onboarding |
| Specific compliance questions | Targeted, actionable guidance |
| Multi-site testing | Robust validation, statistical validity |
| Regular communication | Early identification of issues |
| Documentation-first approach | Efficient exit assessment |
| Cross-functional team | Integrated compliance and development |

Challenges Overcome

| Challenge | How Addressed |
| --- | --- |
| Rare disease data limitations | Transparent documentation of limitations, accepted by authority |
| MDR/AI Act intersection | Clarified through sandbox guidance |
| Explanation consistency | System improvements during testing phase |
| Timeline extension | Plan amendment negotiated professionally |

What Could Have Gone Better

| Issue | Lesson |
| --- | --- |
| Initial documentation gaps | More thorough self-assessment before application |
| Underestimated testing scale | Research authority expectations earlier |
| Late data governance updates | Address known gaps before sandbox entry |

Outcomes and Impact

For RareFind Diagnostics

| Outcome | Value |
| --- | --- |
| Market entry achieved | Revenue generation begins |
| Compliance certainty | Confidence in regulatory position |
| Documentation assets | Reusable for future products |
| Regulatory relationships | Ongoing constructive engagement |
| Competitive advantage | First compliant rare disease AI to market |
| Investor confidence | Series B raised partly due to compliance achievement |

For the Ecosystem

| Outcome | Benefit |
| --- | --- |
| Guidance developed | Authority published guidance on healthcare AI based on sandbox learnings |
| Best practices shared | RareFind presented at industry conference |
| Template development | Documentation templates made available to sector |
| Precedent established | Pathway clarified for similar systems |

Lessons for Your Sandbox Participation

Before Applying

| Lesson | Action |
| --- | --- |
| Preparation quality determines outcome | Invest in thorough pre-application preparation |
| Know your questions | Develop specific, actionable compliance questions |
| Assess readiness honestly | Don't apply before system is ready for meaningful testing |
| Understand resource requirements | Budget 12-18 months, significant team time |

During Participation

| Lesson | Action |
| --- | --- |
| Communication is key | Proactive, regular engagement with authority |
| Document continuously | Don't leave documentation to the end |
| Embrace feedback | View authority guidance as valuable input |
| Build relationships | Regulators become allies, not adversaries |

After Exit

| Lesson | Action |
| --- | --- |
| Follow through on recommendations | Implement all exit report recommendations |
| Maintain compliance | Sandbox exit is beginning, not end |
| Share learnings | Contribute to ecosystem development |
| Leverage assets | Reuse documentation and processes |

Case Study Application Checklist

Preparation Phase

  • Conducted thorough internal readiness assessment
  • Developed specific compliance questions
  • Prepared all application components
  • Identified testing partners and resources
  • Allocated dedicated team capacity

Application Phase

  • Submitted complete application package
  • Responded promptly to clarification requests
  • Prepared thoroughly for application interview
  • Negotiated realistic sandbox plan

Participation Phase

  • Established testing infrastructure per plan
  • Maintained regular reporting schedule
  • Documented all activities comprehensively
  • Addressed guidance and feedback promptly
  • Managed plan amendments professionally

Exit Phase

  • Prepared thoroughly for exit assessment
  • Reviewed and accepted exit report
  • Implemented all recommendations
  • Completed conformity assessment
  • Achieved market entry

What You Learned

Key concepts from this chapter

**Thorough preparation** led to smooth sandbox participation—invest upfront

**Specific compliance questions** generated actionable guidance—don't be vague

**Regular communication** built trust and identified issues early—over-communicate

**Documentation-first approach** made exit assessment efficient—don't leave it to the end

**Testing scale matters**—authorities expect robust validation, plan accordingly

Chapter Complete

Innovation Pathways
