Case Study: Sandbox Success
Real-world example of successful regulatory sandbox participation.
Learning Objectives
By the end of this chapter, you will be able to:
- Apply sandbox participation strategies through a detailed realistic scenario
- Identify key success factors and common pitfalls in sandbox engagement
- Understand the practical timeline and resource requirements
- Recognise how sandbox outcomes translate to market readiness
- Extract lessons applicable to your own sandbox planning
Introduction: Learning from Experience
This comprehensive case study follows a fictional start-up through the entire sandbox process—from initial application through successful market entry. While fictional, the scenario draws on common patterns and challenges observed in real regulatory sandbox participation.
Note: This case study is illustrative. Actual sandbox processes vary by Member State and will evolve as the AI Act is implemented.
Company Profile: RareFind Diagnostics
Company Overview
| Attribute | Details |
|---|---|
| Company name | RareFind Diagnostics GmbH |
| Location | Munich, Germany |
| Founded | 2021 |
| Employees | 32 (at sandbox application) |
| Funding | Series A, €8M raised |
| Focus | AI-assisted diagnosis of rare diseases |
The AI System: RareFind Engine
| System Attribute | Description |
|---|---|
| Function | Analyses patient symptoms, medical history, and test results to suggest possible rare disease diagnoses |
| Technology | Multi-modal machine learning combining NLP, structured data analysis, and image interpretation |
| Users | Specialist physicians in rare disease centres |
| Output | Ranked list of possible diagnoses with confidence scores and supporting evidence |
| Role | Decision support—physician makes final diagnosis |
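The output described in the table (a ranked list of possible diagnoses with confidence scores and supporting evidence) can be sketched as a simple data structure. All names below are illustrative assumptions for this fictional system, not RareFind's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class DiagnosisSuggestion:
    """One candidate diagnosis in the decision-support output."""
    condition: str    # rare disease name
    confidence: float # model confidence score, 0.0-1.0
    evidence: list = field(default_factory=list)  # supporting findings

def rank_suggestions(suggestions):
    """Return suggestions sorted by descending confidence.

    The physician, not the system, makes the final diagnosis;
    this ranked list is decision support only.
    """
    return sorted(suggestions, key=lambda s: s.confidence, reverse=True)

# Illustrative usage with made-up clinical findings
candidates = [
    DiagnosisSuggestion("Fabry disease", 0.42, ["angiokeratomas", "low alpha-Gal A"]),
    DiagnosisSuggestion("Gaucher disease", 0.71, ["splenomegaly", "thrombocytopenia"]),
]
ranked = rank_suggestions(candidates)
```

Keeping the evidence attached to each suggestion, rather than returning bare scores, is what makes the "supporting evidence" part of the output and the Article 13 transparency discussion later in the chapter possible.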
Why Sandbox?
RareFind faced significant uncertainty:
| Challenge | Concern |
|---|---|
| Classification | Clearly high-risk under Article 6(1) (AI system that is a medical device covered by the MDR, listed in Annex I, and subject to third-party conformity assessment), but novel approach |
| Data governance | Training on rare disease data raises unique challenges |
| Conformity path | Medical device integration path unclear under AI Act |
| Novel technology | No precedent for similar systems |
| Resource constraints | Limited start-up resources; could not afford compliance missteps |
Phase 1: Preparation (Months 1-2)
Decision to Apply
RareFind's leadership evaluated options:
| Option | Pros | Cons | Decision |
|---|---|---|---|
| Direct market entry | Fastest if successful | High risk of non-compliance, expensive mistakes | ❌ Rejected |
| Sandbox participation | Regulatory guidance, reduced uncertainty | Takes time, resource commitment | ✅ Selected |
| Wait and see | Learn from others' experience | Lose first-mover advantage, delayed market entry | ❌ Rejected |
Sandbox Selection
| Consideration | Analysis |
|---|---|
| National sandbox (Germany) | BfArM (Federal Institute for Drugs and Medical Devices) likely authority; familiar jurisdiction, German language |
| Joint sandbox | No suitable multi-state sandbox for medical AI at time of application |
| AI Office sandbox | Not applicable—system is not GPAI |
Decision: Apply to German national sandbox once established.
Internal Preparation
| Activity | Owner | Deliverables |
|---|---|---|
| System documentation | CTO | Technical documentation draft, architecture diagrams |
| Risk assessment | Head of Product | Preliminary FMEA, risk classification rationale |
| Data governance | Data Lead | Training data documentation, GDPR compliance evidence |
| Testing plan | Head of Clinical | Proposed hospital partner, testing methodology |
| Safeguards design | CTO + Legal | Human oversight design, informed consent process |
Phase 2: Application (Month 3)
Application Package
RareFind submitted:
| Component | Content | Page Count |
|---|---|---|
| Executive summary | Company overview, system purpose, sandbox objectives | 3 |
| System description | Technical architecture, algorithm description, intended use | 15 |
| Risk classification | Analysis against Article 6 and Annex I (MDR), rationale for high-risk classification | 5 |
| Compliance questions | Specific guidance sought (10 questions) | 3 |
| Testing proposal | Methodology, hospital partner, safeguards | 12 |
| Safeguards plan | Informed consent, data protection, human oversight | 8 |
| Resource commitment | Team allocation, timeline, budget | 4 |
| SME evidence | Financial statements, employee count | 2 |
Key Compliance Questions Submitted
- Does our human-in-the-loop design satisfy Article 14 human oversight requirements?
- What data governance documentation is required for rare disease training data?
- How should we approach conformity assessment given the MDR/AI Act intersection?
- Are our transparency mechanisms sufficient for Article 13?
- What monitoring requirements apply post-deployment?
- How should we document model uncertainty for rare conditions?
- What validation approaches are appropriate given rare disease data limitations?
- How does our logging approach align with Article 12 requirements?
- What is the appropriate scope for bias testing with rare disease populations?
- How should we structure instructions for use for specialist physicians?
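One of the questions above concerns Article 12 record-keeping. Article 12 requires high-risk AI systems to automatically record events over their lifetime; a minimal sketch of the kind of structured event record such logging might capture follows. The field names are assumptions chosen for illustration, not the regulation's wording or RareFind's actual format:

```python
import json
from datetime import datetime, timezone

def log_inference_event(case_id, model_version, top_suggestions, physician_id):
    """Build one structured, machine-readable log entry for an inference event.

    The exact fields are illustrative; the point is that each event is
    timestamped, traceable to a model version, and linked to the
    reviewing physician to support human-oversight audits.
    """
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": "inference",
        "case_id": case_id,                   # pseudonymised case reference
        "model_version": model_version,
        "output_summary": top_suggestions,    # e.g. top conditions + scores
        "reviewing_physician": physician_id,  # who exercised oversight
    })

entry = log_inference_event("case-0042", "engine-2.3.1",
                            [("Gaucher disease", 0.71)], "phys-17")
```

Recording the model version alongside each event is what lets a post-deployment incident be traced back to the exact system state that produced it.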
Application Outcome
| Timeline | Event |
|---|---|
| Week 1 | Application submitted |
| Week 3 | Acknowledgment received, assigned case officer |
| Week 5 | Clarification questions from authority |
| Week 6 | RareFind responds with additional detail |
| Week 8 | Invitation to application interview |
| Week 10 | Interview conducted |
| Week 12 | Acceptance notification with conditions |
Phase 3: Plan Negotiation (Month 4)
Sandbox Plan Development
| Plan Element | RareFind Proposal | Authority Feedback | Final Agreement |
|---|---|---|---|
| Duration | 9 months | Recommended 12 months | 12 months |
| Testing scope | 500 patients | Increase for statistical validity | 1,000 patients minimum |
| Hospital partners | 1 rare disease centre | Recommend multi-site | 3 centres across Germany |
| Supervision | Monthly reports | Add quarterly meetings | Monthly reports + quarterly meetings |
| Safeguards | Standard informed consent | Enhanced consent for AI | Enhanced AI-specific consent |
| Exit criteria | Performance validation | Add documentation assessment | Performance + documentation + gap assessment |
Agreed Milestones
| Milestone | Month | Deliverables |
|---|---|---|
| M1: Onboarding complete | 1 | Sandbox plan signed, team assigned |
| M2: Testing infrastructure | 2 | Hospital partnerships active, consent process operational |
| M3: Initial testing | 3-5 | 300 patient cases processed |
| M4: Interim review | 6 | Performance analysis, gap identification |
| M5: Extended testing | 7-9 | 1,000 cases complete, bias analysis |
| M6: Documentation finalisation | 10 | All documentation updated |
| M7: Exit assessment | 11-12 | Final evaluation, exit report |
Phase 4: Active Participation (Months 5-16)
Month 5-6: Infrastructure and Testing Setup
| Activity | Outcome |
|---|---|
| Hospital onboarding | 3 centres signed data sharing agreements |
| Consent process | AI-specific consent form developed and approved |
| Technical integration | RareFind Engine integrated with hospital systems |
| Monitoring setup | Real-time performance dashboard operational |
| Training | Physician users trained on system use |
Month 7-9: Initial Testing Phase
| Metric | Target | Actual | Status |
|---|---|---|---|
| Cases processed | 300 | 347 | ✅ On track |
| System availability | 99% | 99.2% | ✅ Met |
| Physician satisfaction | >80% | 87% | ✅ Exceeded |
| Diagnostic accuracy | Baseline TBD | 73% match with final diagnosis | ⚠️ For review |
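Metrics like those in the table can be computed directly from per-case records. A sketch under assumed record fields (the field names and sample data are illustrative, not RareFind's):

```python
def accuracy_match_rate(cases):
    """Fraction of cases where any system suggestion matched the
    physician's final diagnosis (the 'match with final diagnosis' metric)."""
    matched = sum(1 for c in cases if c["final_diagnosis"] in c["suggested"])
    return matched / len(cases)

def availability(uptime_minutes, total_minutes):
    """System availability as a percentage over the reporting period."""
    return 100.0 * uptime_minutes / total_minutes

# Two made-up case records for illustration
cases = [
    {"suggested": ["Gaucher disease", "Fabry disease"],
     "final_diagnosis": "Gaucher disease"},   # system matched
    {"suggested": ["Pompe disease"],
     "final_diagnosis": "Wilson disease"},    # system missed
]
```

Note that "match" here means the final diagnosis appeared anywhere in the ranked list, which is a deliberately generous definition; how the match criterion is defined is exactly the kind of methodological detail the interim review below examines.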
Month 10: Interim Review (Key Meeting)
Agenda:
- Performance analysis
- Compliance gap assessment
- Guidance on outstanding questions
- Plan adjustments
Key Findings:
| Area | Finding | Authority Guidance |
|---|---|---|
| Human oversight | Design deemed appropriate | Minor documentation enhancement needed |
| Accuracy | 73% match rate acceptable for rare diseases | Document expected performance range |
| Data governance | Documentation gaps identified | Specific additions required |
| Transparency | Explanation quality variable | Improve consistency of explanations |
| Bias testing | Limited by rare disease demographics | Accept methodology limitations with documentation |
Plan Adjustment: Testing phase extended by one month to address documentation gaps.
Month 11-14: Extended Testing and Refinement
| Activity | Deliverables |
|---|---|
| Continued testing | 1,047 total cases processed |
| Documentation updates | 23 documentation gaps addressed |
| System improvements | Explanation consistency improved |
| Bias analysis | Comprehensive analysis across available demographics |
| Physician feedback integration | 12 usability improvements implemented |
Monthly Reporting Example
```
RAREFIND DIAGNOSTICS - SANDBOX MONTHLY REPORT
Month: 11 (March 2026)

TESTING PROGRESS
Cases processed this month: 142
Cumulative cases: 789
Target: 1,000 (79% complete)

PERFORMANCE METRICS
Accuracy (vs. final diagnosis): 74%
System availability: 99.4%
Physician satisfaction: 89%

INCIDENTS
None this period

COMPLIANCE ACTIVITIES
- Data governance documentation v2.1 completed
- Explanation templates refined
- Article 13 transparency checklist addressed

NEXT MONTH FOCUS
- Complete 1,000 case target
- Finalise bias analysis
- Begin conformity assessment preparation
```
Month 15-16: Exit Assessment
| Assessment Area | Evidence Reviewed | Outcome |
|---|---|---|
| Article 9 Risk Management | RMS documentation, risk register | ✅ Satisfactory |
| Article 10 Data Governance | Data documentation, bias testing | ✅ Satisfactory (with recommendations) |
| Article 11 Documentation | Technical documentation package | ✅ Complete, minor updates noted |
| Article 12 Logging | Logging design, sample logs | ✅ Satisfactory |
| Article 13 Transparency | Instructions for use, explanations | ✅ Satisfactory |
| Article 14 Human Oversight | Oversight design, physician workflow | ✅ Satisfactory |
| Article 15 Accuracy/Robustness | Validation results, security review | ✅ Satisfactory |
Phase 5: Exit and Market Entry (Month 17-20)
Exit Report Summary
| Section | Key Content |
|---|---|
| Compliance assessment | System meets AI Act high-risk requirements |
| Recommendations | 5 minor recommendations for enhancement |
| Conditions | None—unconditional pathway to market |
| Outstanding issues | Ongoing monitoring approach to be implemented |
| EU-wide validity | Exit report can be used to demonstrate compliance and must be taken into account by authorities across the Union |
Post-Sandbox Activities
| Activity | Timeline | Owner |
|---|---|---|
| Implement recommendations | Month 17 | CTO |
| Conformity assessment (internal) | Month 17-18 | Quality Lead |
| CE marking preparation | Month 18 | Regulatory Affairs |
| EU Declaration of Conformity (DoC) drafting | Month 18 | Legal |
| Market entry | Month 20 | Commercial |
Resource Investment Summary
| Resource | Investment |
|---|---|
| Internal team time | ~2,400 person-hours over 16 months |
| External legal support | €45,000 |
| Hospital partnerships | €60,000 (data access fees) |
| Technology infrastructure | €25,000 |
| Documentation effort | ~800 person-hours |
| Total direct cost | ~€130,000 + internal time |
Success Factors Analysis
What Worked Well
| Factor | Impact |
|---|---|
| Early, thorough preparation | Strong application, smooth onboarding |
| Specific compliance questions | Targeted, actionable guidance |
| Multi-site testing | Robust validation, statistical validity |
| Regular communication | Early identification of issues |
| Documentation-first approach | Efficient exit assessment |
| Cross-functional team | Integrated compliance and development |
Challenges Overcome
| Challenge | How Addressed |
|---|---|
| Rare disease data limitations | Transparent documentation of limitations, accepted by authority |
| MDR/AI Act intersection | Clarified through sandbox guidance |
| Explanation consistency | System improvements during testing phase |
| Timeline extension | Plan amendment negotiated professionally |
What Could Have Gone Better
| Issue | Lesson |
|---|---|
| Initial documentation gaps | More thorough self-assessment before application |
| Underestimated testing scale | Research authority expectations earlier |
| Late data governance updates | Address known gaps before sandbox entry |
Outcomes and Impact
For RareFind Diagnostics
| Outcome | Value |
|---|---|
| Market entry achieved | Revenue generation begins |
| Compliance certainty | Confidence in regulatory position |
| Documentation assets | Reusable for future products |
| Regulatory relationships | Ongoing constructive engagement |
| Competitive advantage | First compliant rare disease AI to market |
| Investor confidence | Series B raised partly due to compliance achievement |
For the Ecosystem
| Outcome | Benefit |
|---|---|
| Guidance developed | Authority published guidance on healthcare AI based on sandbox learnings |
| Best practices shared | RareFind presented at industry conference |
| Template development | Documentation templates made available to sector |
| Precedent established | Pathway clarified for similar systems |
Lessons for Your Sandbox Participation
Before Applying
| Lesson | Action |
|---|---|
| Preparation quality determines outcome | Invest in thorough pre-application preparation |
| Know your questions | Develop specific, actionable compliance questions |
| Assess readiness honestly | Don't apply before system is ready for meaningful testing |
| Understand resource requirements | Budget 12-18 months, significant team time |
During Participation
| Lesson | Action |
|---|---|
| Communication is key | Proactive, regular engagement with authority |
| Document continuously | Don't leave documentation to the end |
| Embrace feedback | View authority guidance as valuable input |
| Build relationships | Regulators become allies, not adversaries |
After Exit
| Lesson | Action |
|---|---|
| Follow through on recommendations | Implement all exit report recommendations |
| Maintain compliance | Sandbox exit is beginning, not end |
| Share learnings | Contribute to ecosystem development |
| Leverage assets | Reuse documentation and processes |
Case Study Application Checklist
Preparation Phase
- Conducted thorough internal readiness assessment
- Developed specific compliance questions
- Prepared all application components
- Identified testing partners and resources
- Allocated dedicated team capacity
Application Phase
- Submitted complete application package
- Responded promptly to clarification requests
- Prepared thoroughly for application interview
- Negotiated realistic sandbox plan
Participation Phase
- Established testing infrastructure per plan
- Maintained regular reporting schedule
- Documented all activities comprehensively
- Addressed guidance and feedback promptly
- Managed plan amendments professionally
Exit Phase
- Prepared thoroughly for exit assessment
- Reviewed and accepted exit report
- Implemented all recommendations
- Completed conformity assessment
- Achieved market entry
What You Learned
Key concepts from this chapter:
- **Thorough preparation** led to smooth sandbox participation—invest upfront
- **Specific compliance questions** generated actionable guidance—don't be vague
- **Regular communication** built trust and identified issues early—over-communicate
- **Documentation-first approach** made exit assessment efficient—don't leave it to the end
- **Testing scale matters**—authorities expect robust validation, plan accordingly