Medical AI Regulation
Cheat Sheet for Developers
Everything you need to know about getting your AI/ML medical device through FDA, EU MDR, MHRA, and TGA. Risk classification, submission pathways, clinical validation requirements, timelines, costs, and the mistakes that sink startups.
Disclaimer: This guide is educational content for developers exploring the medical AI regulatory landscape. It is not legal or regulatory advice. Regulations change frequently and vary by jurisdiction. Always consult qualified regulatory counsel and/or a Regulatory Affairs professional before making submission decisions. Do not rely on this page as a substitute for professional guidance.
Is Your AI a Medical Device?
The first question every developer needs to answer. The answer determines whether you need regulatory clearance at all. Work through this decision tree:
1. Does your software diagnose, treat, prevent, or monitor a disease or medical condition?
   - Yes → Likely a medical device (SaMD)
   - No → Probably exempt
2. Does it provide patient-specific recommendations that drive clinical decisions?
   - Yes → Almost certainly regulated
   - No → May fall under a Clinical Decision Support (CDS) exemption
3. Can a clinician independently review the basis for the recommendation?
   - Yes → May qualify for the CDS exemption (21st Century Cures Act)
   - No → Regulated as SaMD
4. Is it purely administrative, billing, or general wellness software?
   - Yes → Exempt from device regulation
   - No → Continue assessment with regulatory counsel
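As a sanity check, the tree above can be expressed as a toy triage function. This is illustrative only -- the field names and boolean simplifications are hypothetical, not legal tests, and the real determination depends on your precise intended use statement:

```python
from dataclasses import dataclass

@dataclass
class SoftwareProfile:
    medical_purpose: bool             # diagnoses/treats/prevents/monitors disease
    patient_specific_recs: bool       # recommendations drive clinical decisions
    clinician_can_review_basis: bool  # basis is independently reviewable
    admin_or_wellness_only: bool      # purely administrative/billing/wellness

def triage(p: SoftwareProfile) -> str:
    """Rough first-pass triage mirroring the decision tree above -- not a legal test."""
    if p.admin_or_wellness_only:
        return "Likely exempt from device regulation"
    if not p.medical_purpose:
        return "Probably not a medical device -- continue assessment with counsel"
    if p.patient_specific_recs and p.clinician_can_review_basis:
        return "May qualify for the CDS exemption (21st Century Cures Act)"
    if p.patient_specific_recs:
        return "Regulated as SaMD"
    return "Likely SaMD -- confirm classification with regulatory counsel"
```

Treat the output as a prompt for the conversation with your regulatory specialist, not as a classification.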
Key concept -- Software as a Medical Device (SaMD): Defined by the International Medical Device Regulators Forum (IMDRF) as software intended to be used for medical purposes without being part of a hardware medical device. This includes AI/ML algorithms that analyze medical images, predict clinical outcomes, or recommend treatments.
FDA Pathways (United States)
As of early 2026, the FDA has authorized over 950 AI/ML-enabled medical devices. The three main pathways and when each applies:
510(k) Premarket Notification
Most common pathway for AI/ML devices (the large majority of authorizations to date)
When to use
- A substantially equivalent (predicate) device already exists
- Your device has the same intended use as the predicate
- Same or similar technological characteristics
- Different technology, but no new questions of safety or effectiveness
What you need
- Predicate device identification and comparison
- Performance testing (sensitivity, specificity, AUC)
- Software documentation (IEC 62304 compliance)
- Cybersecurity documentation + SBOM
- Biocompatibility (if applicable)
Cost estimate: $200K-$1M+ (regulatory consulting, testing; the FDA 510(k) user fee itself is roughly $22K-$24K at the standard rate, with a ~75% discount for qualifying small businesses)
De Novo Classification
For novel, low-to-moderate risk devices with no predicate
When to use
- No substantially equivalent predicate device exists
- Device is novel but low-to-moderate risk
- You want to create a new regulatory classification
- Your granted device becomes a predicate for future 510(k)s
What you need
- Risk-benefit analysis
- Clinical performance data (often a prospective study)
- Proposed special controls for the new classification
- Complete software lifecycle documentation
- Cybersecurity documentation + SBOM
Cost estimate: $500K-$2M+ (more extensive clinical data, longer FDA review, higher consulting fees)
PMA (Premarket Approval)
Highest burden -- for Class III, high-risk devices
When to use
- High-risk device (Class III classification)
- Life-sustaining or life-supporting
- Of substantial importance in preventing impairment of health
- Presents a potential unreasonable risk of illness or injury
What you need
- Prospective clinical trials (often multi-site)
- Manufacturing facility inspection
- Complete QMS (21 CFR Part 820)
- Annual reports + post-market studies
- PMA supplements for any changes
Cost estimate: $5M-$50M+ (multi-site clinical trials, FDA user fee ~$450K, ongoing annual reporting costs). Rarely used for AI SaMD.
Predetermined Change Control Plan (PCCP)
Critical for ML developers: the FDA's PCCP framework (draft guidance 2023, finalized 2024) allows you to define anticipated modifications to your AI/ML model in your original submission. If future updates fall within your pre-specified plan -- including retraining on new data, performance improvements, or expanded input specifications -- you can implement them without a new regulatory submission. This is essential for any continuously learning or regularly updated model. Include your PCCP in your 510(k) or De Novo submission.
EU MDR Requirements (European Union)
The EU Medical Device Regulation (2017/745) replaced the old Medical Device Directive in May 2021. It is significantly more demanding, especially for software.
Key Requirements
1. CE Marking -- mandatory before placing a device on the EU market. Requires conformity assessment by a Notified Body (for Class IIa and above).
2. Technical Documentation -- Annex II/III: device description, design and manufacturing information, risk management (ISO 14971), clinical evaluation.
3. Clinical Evaluation -- systematic review of clinical data. For AI, this includes algorithm validation studies and real-world performance data.
4. Post-Market Clinical Follow-up (PMCF) -- mandatory ongoing clinical data collection after market launch.
5. QMS (ISO 13485) -- required quality management system; must cover the full software development lifecycle.
6. UDI System -- Unique Device Identification for traceability in the EUDAMED database.
EU AI Act Intersection
The EU AI Act (effective August 2024, with phased enforcement through 2027) adds another layer. Medical AI devices classified as "high-risk" under the AI Act must also comply with:
- Risk management system specific to AI
- Data governance and training data quality requirements
- Transparency obligations (users must know they interact with AI)
- Human oversight mechanisms
- Accuracy, robustness, and cybersecurity requirements
- Registration in the EU AI database
Key date: most standalone high-risk AI systems must comply by August 2026; high-risk AI embedded in products covered by sectoral legislation such as the MDR has until August 2027. Either way, the clock is ticking.
MHRA (United Kingdom)
Post-Brexit regulatory framework
- UKCA marking replaces CE marking for the UK market
- Currently recognizes CE marks during transition (extended to 2030)
- Software and AI-specific guidance published in 2023
- Approved Bodies (UK equivalent of Notified Bodies) still limited in capacity
- MHRA exploring an adaptive regulation framework for AI/ML
- Regulatory sandbox (MHRA AI Airlock) available for innovative devices
TGA (Australia)
Therapeutic Goods Administration
- Inclusion in the Australian Register of Therapeutic Goods (ARTG) required
- Classification aligned with EU MDR (Rule 11 for software)
- Conformity assessment through TGA or recognized bodies
- Accepts some international evidence (EU, FDA) to streamline review
- Post-market monitoring requirements similar to EU
- Updated SaMD guidance issued in 2024 covering AI/ML specifics
Risk Classification Comparison
How each regulator classifies medical devices by risk level. Most AI/ML SaMD falls into Class II (FDA) or Class IIa/IIb (EU MDR).
FDA (US)
| Class | Risk Level | Examples | Review Type |
|---|---|---|---|
| Class I | Low | Elastic bandages, tongue depressors | Generally exempt |
| Class II | Moderate | Most AI/ML SaMD, powered wheelchairs | 510(k) or De Novo |
| Class III | High | Life-sustaining devices, novel high-risk AI | PMA (Premarket Approval) |
EU MDR
| Class | Risk Level | Examples | Review Type |
|---|---|---|---|
| Class I | Low | Non-measuring, non-sterile software | Self-certification |
| Class IIa | Low-Medium | Diagnostic imaging AI, monitoring | Notified Body audit |
| Class IIb | Medium-High | Treatment-driving AI, implant-related SW | Notified Body + clinical evaluation |
| Class III | High | Life-critical or novel technology | Full Notified Body + clinical investigation |
MHRA (UK)
| Class | Risk Level | Examples | Review Type |
|---|---|---|---|
| Class I | Low | Administrative health software | Self-declaration (UKCA) |
| Class IIa | Medium | Diagnostic support AI | Approved Body review |
| Class IIb/III | High | Autonomous diagnostic AI | Full Approved Body review |
TGA (Australia)
| Class | Risk Level | Examples | Review Type |
|---|---|---|---|
| Class I | Low | General wellness software | Manufacturer self-assessment |
| Class IIa/IIb | Medium | Most clinical AI/ML | Conformity assessment + ARTG inclusion |
| Class III | High | High-risk autonomous systems | Full TGA review + ARTG |
Clinical Validation Requirements
What regulators expect you to prove about your AI/ML model. The depth of evidence scales with risk classification.
Analytical Validation (Does the algorithm work technically?)
Performance Metrics
- Sensitivity / Specificity
- AUC-ROC and AUC-PR
- Positive/Negative Predictive Value
- Calibration curves
- Confidence intervals
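To make the expected evidence concrete, here is a minimal sketch of these point estimates plus a bootstrap confidence interval, on hypothetical simulated data in plain NumPy (reviewers generally expect intervals, not bare point estimates):

```python
import numpy as np

# Hypothetical simulated reader study: 500 cases with binary ground truth
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)
y_score = y_true * 0.6 + rng.normal(0.2, 0.25, size=500)  # model output scores
y_pred = (y_score >= 0.5).astype(int)                     # operating threshold

tp = int(((y_true == 1) & (y_pred == 1)).sum())
tn = int(((y_true == 0) & (y_pred == 0)).sum())
fp = int(((y_true == 0) & (y_pred == 1)).sum())
fn = int(((y_true == 1) & (y_pred == 0)).sum())

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

# AUC-ROC via the Mann-Whitney rank formulation (continuous scores, no ties)
order = y_score.argsort()
ranks = np.empty(len(y_score))
ranks[order] = np.arange(1, len(y_score) + 1)
n_pos, n_neg = (y_true == 1).sum(), (y_true == 0).sum()
auc = (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Bootstrap 95% CI for sensitivity (2000 resamples)
boot = []
for _ in range(2000):
    idx = rng.integers(0, len(y_true), size=len(y_true))
    pos = y_true[idx] == 1
    if pos.any():
        boot.append((y_pred[idx][pos] == 1).mean())
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
```

In a real submission each metric is reported per site and per subgroup, at the locked operating threshold named in the intended use.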
Robustness Testing
- Multi-site validation
- Cross-scanner/device testing
- Edge case analysis
- Adversarial input testing
- Performance under degraded conditions
Bias & Fairness
- Performance across demographics
- Age, sex, ethnicity subgroup analysis
- Dataset representativeness report
- Mitigation strategy documentation
- Ongoing monitoring plan
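A subgroup analysis of the kind listed above can be sketched as follows. The data is simulated and the 5-point flagging threshold is an illustrative choice, not a regulatory standard:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
sex = rng.choice(["F", "M"], size=n)   # hypothetical subgroup label
y_true = rng.integers(0, 2, size=n)
# Simulated model that is slightly noisier on one subgroup
noise_sd = np.where(sex == "F", 0.30, 0.22)
y_pred = ((y_true * 0.6 + rng.normal(0.2, noise_sd)) >= 0.5).astype(int)

report = {}
for group in np.unique(sex):
    m = sex == group
    pos, neg = m & (y_true == 1), m & (y_true == 0)
    report[str(group)] = {
        "n": int(m.sum()),
        "sensitivity": float((y_pred[pos] == 1).mean()),
        "specificity": float((y_pred[neg] == 0).mean()),
    }

# Flag subgroups whose sensitivity falls >5 points below overall (illustrative cutoff)
overall_sens = float((y_pred[y_true == 1] == 1).mean())
flagged = [g for g, r in report.items() if overall_sens - r["sensitivity"] > 0.05]
```

The same loop extends naturally to age bands, ethnicity, and acquisition device, and the per-group `n` column is exactly what the representativeness report should justify.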
Clinical Validation (Does it improve patient outcomes?)
Study Design Requirements
- Prospective or retrospective clinical study
- Comparison to clinical standard of care
- Pre-specified primary endpoints
- Adequate sample size with power analysis
- IRB/Ethics committee approval
- Independent test set (not used in training)
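For the sample-size item, a common starting point for diagnostic accuracy studies is the normal-approximation formula for estimating sensitivity to a desired confidence-interval width. This is a sketch only; a biostatistician should set the real design parameters:

```python
import math

def cases_needed(target_sens: float, ci_half_width: float, z: float = 1.96) -> int:
    """Diseased cases needed so the sensitivity estimate's 95% CI stays within
    +/- ci_half_width (normal-approximation formula)."""
    return math.ceil(z ** 2 * target_sens * (1 - target_sens) / ci_half_width ** 2)

def total_enrollment(target_sens: float, ci_half_width: float, prevalence: float) -> int:
    """Scale diseased cases up by disease prevalence to get total subjects."""
    return math.ceil(cases_needed(target_sens, ci_half_width) / prevalence)

# Example: supporting a 90% sensitivity claim with a +/-5% CI at 10% prevalence
cases = cases_needed(0.90, 0.05)              # 139 diseased cases
n_total = total_enrollment(0.90, 0.05, 0.10)  # 1390 subjects overall
```

Note how low prevalence dominates the budget: halving prevalence doubles total enrollment, which is why enriched or retrospective designs are often negotiated in the Q-Sub meeting.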
Documentation Standards
- IEC 62304 (software lifecycle)
- ISO 14971 (risk management)
- ISO 13485 (quality management)
- IEC 82304-1 (health software)
- AAMI TIR45 (agile practices in medical device software)
- Good Machine Learning Practice (GMLP)
Common Pitfalls
Mistakes that delay submissions, drain budgets, and kill medical AI startups. Learn from others' expensive lessons.
Treating regulation as an afterthought
Regulatory strategy should start at product concept, not after you build. Retrofitting a QMS onto an existing codebase is 3-5x more expensive than designing for compliance from day one.
Insufficient training data documentation
You need to document your dataset composition, demographics, labeling methodology, and inter-annotator agreement. FDA increasingly scrutinizes dataset bias and representativeness.
Ignoring post-market surveillance
Clearance is not the finish line. You must have a plan for monitoring real-world performance, collecting complaints, and reporting adverse events. For AI, this includes model drift detection.
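One widely used drift signal is the Population Stability Index (PSI) computed on model inputs or output scores. A minimal sketch follows; the 0.2 threshold is a common industry rule of thumb, not a regulatory requirement:

```python
import numpy as np

def psi(reference: np.ndarray, production: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a locked reference distribution
    (e.g. validation-set inputs or scores) and live production data.
    Rule of thumb: PSI > 0.2 suggests meaningful drift."""
    edges = np.quantile(reference, np.linspace(0, 1, bins + 1))

    def fractions(x: np.ndarray) -> np.ndarray:
        # Bucket by interior edges; out-of-range values fall into the end bins
        idx = np.searchsorted(edges[1:-1], x, side="right")
        return np.bincount(idx, minlength=bins) / len(x)

    eps = 1e-6  # guard against log(0) for empty buckets
    ref, prod = fractions(reference) + eps, fractions(production) + eps
    return float(np.sum((prod - ref) * np.log(prod / ref)))

rng = np.random.default_rng(2)
reference = rng.normal(0.0, 1.0, 5000)   # feature distribution at validation time
stable = rng.normal(0.0, 1.0, 5000)      # production data, same population
shifted = rng.normal(0.5, 1.3, 5000)     # production after a scanner/protocol change
```

Running this per feature on a rolling window, with the alert threshold and escalation path written into your post-market surveillance plan, covers the "model drift detection" expectation.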
Predicate device mismatch (510(k))
Choosing the wrong predicate is one of the top reasons 510(k) submissions get rejected. Your AI must have substantially equivalent intended use AND technology to the predicate.
No Predetermined Change Control Plan
If your model learns or updates, you need a PCCP. Without one, every significant model update can require a new regulatory submission. The FDA issued draft PCCP guidance in 2023 and finalized it in 2024.
Underestimating EU MDR complexity
EU MDR (effective 2021) is significantly more demanding than the old MDD. Clinical evaluation requirements are stricter, post-market clinical follow-up is mandatory, and Notified Body capacity is limited.
Skipping cybersecurity requirements
Both FDA and EU MDR now require cybersecurity documentation. SBOM (Software Bill of Materials) submission became mandatory for FDA in 2023. Threat modeling is expected.
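In practice an SBOM is generated by tooling (e.g. cyclonedx-py or syft) rather than written by hand, but a minimal CycloneDX-style skeleton looks roughly like this. Field names follow the public CycloneDX 1.5 JSON schema -- verify against the current spec before filing, and the component shown is an arbitrary example:

```python
import json
import uuid
from datetime import datetime, timezone

# Minimal CycloneDX-style SBOM skeleton (illustrative, not submission-ready)
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "serialNumber": f"urn:uuid:{uuid.uuid4()}",
    "version": 1,
    "metadata": {"timestamp": datetime.now(timezone.utc).isoformat()},
    "components": [
        {
            "type": "library",
            "name": "numpy",
            "version": "1.26.4",
            "purl": "pkg:pypi/numpy@1.26.4",
        },
    ],
}
print(json.dumps(sbom, indent=2))
```

A real submission enumerates every third-party component down to transitive dependencies, which is why SBOM generation belongs in your CI pipeline, not in a spreadsheet.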
Assuming "research use only" shields you
If your software is used clinically in practice, labeling it "research use only" does not exempt you from regulation. Regulators look at actual intended use, not just labels.
Timeline & Cost Estimates
Realistic ranges based on industry data. Your actual costs depend on device complexity, clinical data needs, and regulatory strategy.
| Pathway | Timeline | Cost Range | Clinical Data |
|---|---|---|---|
| FDA 510(k) | 6-12 months | $200K - $1M+ | Retrospective usually sufficient |
| FDA De Novo | 12-18 months | $500K - $2M+ | Often prospective study needed |
| FDA PMA | 2-5 years | $5M - $50M+ | Multi-site prospective trials |
| EU MDR (Class IIa) | 12-18 months | $300K - $1.5M | Clinical evaluation report |
| EU MDR (Class IIb/III) | 18-30 months | $500K - $3M+ | Clinical investigation may be required |
| MHRA (UKCA) | 6-18 months | $150K - $1M | Similar to EU MDR requirements |
| TGA (Australia) | 6-12 months | $100K - $800K | Accepts international evidence |
Budget tip: Many startups pursue FDA 510(k) first, then use that clearance to streamline EU MDR and TGA submissions. The US market is typically largest and the 510(k) provides a credibility foundation for other regulators. Pre-submission meetings with the FDA (Q-Sub) are free and highly recommended -- they can save months of back-and-forth.
FDA-Authorized AI/ML Devices: Notable Examples
Real devices that made it through. Study their regulatory strategies.
IDx-DR (Digital Diagnostics)
Autonomous detection of diabetic retinopathy. First FDA-authorized autonomous AI diagnostic device. No physician needed for interpretation.
Caption Health (Caption AI)
AI-guided ultrasound. Assists users with limited training to capture diagnostic-quality cardiac images.
Viz.ai ContaCT
Automated large vessel occlusion detection in CT scans. Alerts stroke teams for faster intervention.
Paige Prostate
First FDA-authorized AI-based pathology product. Identifies areas of interest on prostate biopsies for pathologist review.
Aidoc (multiple products)
Triage and notification for pulmonary embolism, cervical spine fractures, intracranial hemorrhage, and more.
GE HealthCare Critical Care Suite
Real-time detection of pneumothorax and chest tube positioning on X-rays at point of care.
The FDA maintains a public database of all authorized AI/ML-enabled devices, updated quarterly. As of early 2026, radiology accounts for ~75% of all authorized AI devices, followed by cardiology (~10%) and pathology (~4%).
Quick Reference: Your First 90 Days
If you are starting a medical AI project today, here is the regulatory checklist for the first three months.
Strategy & Classification
- Define intended use precisely
- Determine if your software is a SaMD
- Identify target markets (US, EU, UK, AU)
- Hire or contract a Regulatory Affairs specialist
- Map risk classification per market
- Identify predicate devices (for 510(k))
- Schedule FDA Pre-Submission (Q-Sub) meeting
Foundation Building
- Establish QMS (ISO 13485 compliant)
- Set up design controls from day one
- Document training data provenance
- Begin risk management file (ISO 14971)
- Start software lifecycle docs (IEC 62304)
- Plan clinical validation study
- Implement cybersecurity framework
Execution & Evidence
- Lock training/validation/test split
- Run analytical validation studies
- Compile SBOM (Software Bill of Materials)
- Draft PCCP for model updates
- Begin clinical study (if required)
- Prepare regulatory submission draft
- Engage with Notified Body (EU) if needed
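The "lock training/validation/test split" item can be made deterministic and leakage-safe with a hashed patient-level assignment. This is a sketch; the ID format and split fractions are illustrative:

```python
import hashlib

def assign_split(patient_id: str, test_frac: float = 0.15, val_frac: float = 0.15) -> str:
    """Deterministic patient-level split: hashing the patient ID keeps every
    image/visit from one patient in the same partition (no train/test leakage),
    and the assignment never changes between runs or dataset refreshes."""
    digest = hashlib.sha256(patient_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # pseudo-uniform value in [0, 1]
    if bucket < test_frac:
        return "test"
    if bucket < test_frac + val_frac:
        return "val"
    return "train"

# Hypothetical patient IDs -- real IDs should be de-identified before hashing
splits = {pid: assign_split(pid) for pid in ("PT-0001", "PT-0002", "PT-0003")}
```

Splitting on patient ID rather than image ID is the property regulators care about: the independent test set stays independent even as new scans accumulate for existing patients.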
Reminder: This page is for educational purposes only and does not constitute legal, regulatory, or medical advice. Regulatory requirements vary by jurisdiction and change over time. Before making any regulatory decisions, consult with qualified regulatory affairs professionals and legal counsel experienced in medical device law.