HEALTHCARE IT

European digital health investment hit $3.5 billion in 2024. The EU AI Act classifies most medical AI as high-risk, and MDR Notified Bodies still carry a certification backlog. The regulatory complexity is not slowing the market down.

Healthcare IT is where technical due diligence and regulatory due diligence are inseparable. A well-built clinical AI platform with no viable CE mark pathway is not a commercial product. A workflow tool that inadvertently qualifies as a medical device has a different cost structure than the pitch assumes. We find these gaps before they appear in the investment.

Why it’s different

Healthcare software has a regulatory layer that makes or breaks the commercial model.

Healthcare IT sits at the intersection of clinical workflow, patient data, and a regulatory framework that treats software as a medical device when it influences clinical decisions. The European regulatory environment — MDR/IVDR, the EU AI Act, the European Health Data Space — is simultaneously expanding in scope and creating new compliance obligations for companies that previously operated under simpler frameworks. What was a digital health workflow tool in 2022 may be a regulated medical device in 2026. Most founders have not done this classification analysis formally.

01

Software as a Medical Device classification is not optional — and it changes the cost structure completely

Under EU MDR 2017/745, software that influences clinical decision-making — diagnosis support, treatment recommendations, clinical risk scoring — qualifies as a medical device and requires CE marking through a Notified Body. This process takes 18–36 months and costs €300,000–€1 million or more depending on the risk class. A company whose financial model shows revenue in 18 months from a product that has not started the CE mark pathway is either not going to hit that timeline or does not realise it needs one. We assess SaMD classification as a standard first step.

02

Clinical AI that isn't validated on the right population is a patient safety risk and a liability

The EU AI Act classifies AI used for medical diagnosis, prognosis, or treatment support as high-risk. The requirements — clinical validation, bias testing across demographic groups, human oversight mechanisms, ongoing monitoring — are layered on top of MDR/IVDR requirements. An AI system validated on a predominantly Northern European patient population may underperform on patient cohorts with different genetic backgrounds, disease presentations, or comorbidity patterns. This is not only a scientific problem — it is a liability problem.

03

Hospital procurement cycles are long, complex, and dependent on EHR integration

Clinical software sales into hospital systems take 12–36 months and require integration with the incumbent EHR — Epic, Cerner, Dedalus, CompuGroup Medical — through certified interfaces. A product that takes a hospital IT team six months to integrate is not a product that will achieve the growth trajectory in the financial model. We assess integration architecture depth specifically against the EHR systems most common in the target customer markets.
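The certified-interface point can be made concrete. Below is a minimal sketch, assuming a FHIR R4 endpoint (the interface standard Epic, Cerner, and most European EHRs expose), of the resource an integration layer would emit to write a risk score back into the record. The code system and the "sepsis-risk" code are hypothetical placeholders, not real registered codes.

```python
import json

def build_risk_observation(patient_id: str, score: float) -> dict:
    """Build a FHIR R4 Observation resource wrapping a numeric risk score,
    ready to POST to any conformant FHIR endpoint."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/observation-category",
                "code": "survey",
            }]
        }],
        "code": {
            "coding": [{
                # Hypothetical code system for illustration only.
                "system": "http://example.org/risk-scores",
                "code": "sepsis-risk",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {"value": score, "unit": "score"},
    }

obs = build_risk_observation("12345", 0.82)
print(json.dumps(obs, indent=2))
```

A product built this way deploys against any FHIR-conformant EHR; a product wired to one vendor's proprietary API turns every new hospital into a custom integration project.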

Assessment Areas

Where we focus in Healthcare IT engagements.

SaMD classification & CE mark status

Intended use analysis, MDR/IVDR risk class, Notified Body relationship, CE mark timeline

Whether the product requires regulatory approval the company hasn't factored into its commercial timeline and cost base

EU AI Act compliance posture

High-risk classification for medical AI, clinical validation documentation, human oversight implementation

Whether the AI layer meets the 2027 enforcement requirements — and what the compliance gap costs

EHR integration depth

HL7 FHIR, SMART on FHIR, Direct API integrations with Epic, Cerner, and European EHR systems

Whether the product can actually be deployed in a hospital's technical environment — or whether each deployment is a custom integration project

Clinical data quality & GDPR

Patient data processing agreements, EHDS compliance roadmap, cross-border data transfer

Whether the training data and operational data processing are compliant with the GDPR's special-category health data provisions

Clinical validation methodology

Study design, population diversity, sample size, outcome metric selection

Whether the performance claims are supported by evidence that would withstand regulatory and clinical scrutiny

Security & data protection

NHS DSP Toolkit or equivalent, penetration testing, access controls for patient data

Whether the security posture is appropriate for clinical data — including the specifics required by health system procurement
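The sample-size line of the clinical validation assessment above can be sketched as a back-of-envelope check. All figures here are illustrative assumptions, not thresholds we apply: an expected sensitivity of 0.90, a ±5% precision target at 95% confidence (z = 1.96), and 10% disease prevalence in the study population.

```python
import math

def required_sample(sensitivity: float, precision: float,
                    prevalence: float, z: float = 1.96) -> tuple:
    """Positives needed to bound a sensitivity estimate within
    +/-precision, and the total enrolment implied by the prevalence."""
    positives = math.ceil(z**2 * sensitivity * (1 - sensitivity) / precision**2)
    total = math.ceil(positives / prevalence)
    return positives, total

positives, total = required_sample(0.90, 0.05, 0.10)
print(positives, total)  # 139 positive cases, 1390 patients enrolled
```

A validation study whose headline sensitivity rests on a few dozen positive cases cannot support the precision its marketing claims — this arithmetic is one of the first things a Notified Body reviewer will run.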

AI in Healthcare IT

Medical AI is one of the most heavily regulated categories under the EU AI Act. The opportunity is real. So is the compliance cost.

AI is creating genuine breakthroughs in clinical decision support, diagnostic imaging, and pathway optimisation. The EU AI Act classifies medical AI as high-risk, with full enforcement from August 2027 for products already under MDR/IVDR regulation. The companies that will win in medical AI are those that have built compliance into their architecture from the start, not those that are retrofitting it.

Opportunities we verify

Clinical decision support that demonstrably improves outcomes. The most defensible medical AI systems are those where the clinical evidence is strong, the patient safety case is clear, and the validation methodology is robust enough to withstand regulatory review. In diagnostic imaging, triage support, and sepsis detection, AI is showing consistent and measurable outcome improvements. We assess whether the clinical evidence is genuine — peer-reviewed, multi-site, prospective.

EHDS as a data access opportunity for well-positioned platforms. The European Health Data Space creates a structured framework for secondary use of health data for research, analytics, and AI training. Companies whose data architecture is aligned with EHDS data governance requirements are positioned to access health data pools that competitors building outside this framework cannot.

Workflow AI with clear ROI that sidesteps the SaMD definition. Clinical documentation automation, administrative triage, coding assistance, and staff scheduling tools that support clinical efficiency without directly influencing clinical decisions can deliver measurable hospital ROI without triggering MDR/IVDR classification. We verify SaMD classification carefully — the boundary between administrative AI and clinical AI is regularly misunderstood.

Risks we surface

The SaMD classification trap. A product described as a 'workflow efficiency tool' that surfaces patient risk scores for clinical review almost certainly qualifies as a medical device under EU MDR. Founders regularly misclassify products to avoid the CE mark cost and timeline. We conduct a formal intended use analysis as part of standard DD — because a misclassified product is either non-compliant, or requires 18–36 months of regulatory work before commercial deployment.

AI validated on the wrong patient population. Clinical AI trained on data from specific geographies, hospital types, or patient demographics may underperform in the deployment context a European enterprise customer represents. A radiology AI trained predominantly on US imaging data may not generalise to European equipment configurations and patient populations.
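The subgroup analysis we expect a validation report to contain can be sketched as follows: per-cohort sensitivity computed from confusion counts, with cohorts flagged when they trail the pooled figure by more than a tolerance. The cohort names, counts, and the 5% tolerance are invented for illustration.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true positives among all actual positives."""
    return tp / (tp + fn)

def audit_subgroups(counts: dict, tolerance: float = 0.05) -> dict:
    """counts: {cohort: (true_positives, false_negatives)}.
    Returns cohorts whose sensitivity trails the pooled sensitivity
    by more than `tolerance`."""
    pooled_tp = sum(tp for tp, fn in counts.values())
    pooled_fn = sum(fn for tp, fn in counts.values())
    pooled = sensitivity(pooled_tp, pooled_fn)
    return {
        cohort: round(sensitivity(tp, fn), 3)
        for cohort, (tp, fn) in counts.items()
        if pooled - sensitivity(tp, fn) > tolerance
    }

counts = {
    "northern_eu": (180, 20),  # sensitivity 0.90
    "southern_eu": (150, 50),  # sensitivity 0.75
}
print(audit_subgroups(counts))  # flags the underperforming cohort
```

A report that shows only a pooled sensitivity figure, with no per-cohort breakdown, is a red flag in its own right: the pooled number can mask exactly the population gap described above.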

EHDS compliance as a moving target. The EHDS is in ongoing implementation, with specific secondary use provisions still being operationalised across member states. Companies building healthcare data strategies around EHDS access need an architecture that can adapt to evolving national implementation rules — not just the current regulatory text.

Know what you’re backing before you commit.

X-Ray delivers a full product and tech verdict on any healthcare IT target in one business day — assessing SaMD classification, EU AI Act compliance posture, EHR integration depth, and clinical validation methodology.

250+ European engagements · 100% partner repeat rate