AI & Regulation · April 6, 2026 · 6 minutes

The EU AI Act Creates a New Diligence Obligation. Most European VCs Aren't Ready.

36–39% of European deal flow in 2025 involves AI-first companies. The EU AI Act's high-risk provisions take full effect in August 2026. Here's what every European investor needs to assess — and what happens if they don't.

Wouter Neyndorff

CEO

Here is a number that should concern every European investor: fewer than 15% of VC firms have a formal AI data practices assessment framework. At the same time, 36–39% of European deal flow in 2025 involves AI-first companies.

That gap — between what's in your portfolio and what you're equipped to assess — is about to become materially more expensive.

What the EU AI Act actually requires

The EU AI Act's full obligations land in stages, with high-risk provisions fully applicable from August 2026. For investors, the immediate implication is this: any company you back that operates in a high-risk AI category carries a compliance cost that must be factored into the investment case.

High-risk categories include HR and recruitment systems, credit scoring, education assessment tools, medical devices, and any AI used in critical infrastructure. For each, the compliance cost runs €50,000–100,000 for initial implementation, plus €20,000–50,000 annually ongoing.
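Taken at face value, those figures imply a multi-year cost that belongs in the investment model. A minimal sketch, using the ranges above and assuming (an assumption, not guidance) that the annual cost starts in year two:

```python
def high_risk_compliance_cost(years: int,
                              initial=(50_000, 100_000),
                              annual=(20_000, 50_000)):
    """Low/high EUR estimate for one high-risk AI system over `years` years.

    Uses the cost ranges quoted above; the year-two start for annual
    costs is an illustrative assumption.
    """
    low = initial[0] + annual[0] * (years - 1)
    high = initial[1] + annual[1] * (years - 1)
    return low, high

# Over a typical five-year hold:
print(high_risk_compliance_cost(5))  # (130000, 300000)
```

In other words, a single high-risk portfolio company carries roughly €130,000–300,000 of compliance cost over a five-year hold — material enough to price into the round.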

Unacceptable-risk applications — social scoring, real-time biometric surveillance in public spaces — are simply banned. If a company's product is in this category, that's a dealbreaker that needs surfacing before the term sheet, not after.

Why this is harder than GDPR was

GDPR's compliance burden was primarily about data handling — process and documentation changes that most companies could address with legal support. The AI Act's requirements go to the product itself: how models are trained, what data was used, how outputs are governed, how humans remain in the loop.

You can't audit AI Act compliance by reading a privacy policy. You need to look at the actual system — the model architecture, the training data provenance, the inference pipeline, the governance layer. That requires technical access and operator-level judgment, not just legal review.

European businesses are already shifting their procurement accordingly: 72% now prioritise data sovereignty when choosing tech vendors, up from 58% in 2022. SaaS providers with EU-sovereign deployment command 15–30% higher contract values. The companies that get this right early have a genuine pricing advantage.

What you need to assess before backing an AI company

Every AI-first company in your deal flow now warrants, at minimum, four checks before the term sheet:

  • Risk classification: which category does the product fall into under the Act? The classification determines the compliance obligation — and the cost.
  • AI governance: is there documentation of how models are trained, updated, and monitored? Absence of governance is both a compliance risk and an operational signal.
  • Training data provenance: was the training data legally obtained? Copyright exposure from unlicensed training data is a live issue in European courts.
  • Human oversight mechanisms: does the system have adequate human-in-the-loop controls for its risk category? Automated decision-making without oversight is the most common source of high-risk classification failures.
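The four checks above can be sketched as a simple pre-term-sheet scorecard. The class, field names, and flag wording below are illustrative assumptions, not a compliance tool; the risk tiers and the dealbreaker/copyright logic come from the Act's categories as described in this post.

```python
from dataclasses import dataclass
from enum import Enum

class RiskCategory(Enum):
    """EU AI Act risk tiers discussed above (illustrative subset)."""
    UNACCEPTABLE = "unacceptable"          # e.g. social scoring -- banned outright
    HIGH = "high"                          # e.g. HR/recruitment, credit scoring
    LIMITED_OR_MINIMAL = "limited_or_minimal"

@dataclass
class AIDiligenceCheck:
    """Pre-term-sheet checklist for an AI-first target (hypothetical fields)."""
    company: str
    risk_category: RiskCategory
    has_governance_docs: bool      # model training/update/monitoring documented?
    training_data_licensed: bool   # training data provenance verified?
    human_oversight: bool          # human-in-the-loop controls in place?

    def flags(self) -> list[str]:
        issues = []
        if self.risk_category is RiskCategory.UNACCEPTABLE:
            issues.append("dealbreaker: product falls in a banned category")
        if not self.has_governance_docs:
            issues.append("no AI governance documentation")
        if not self.training_data_licensed:
            issues.append("unverified training data provenance (copyright exposure)")
        if not self.human_oversight and self.risk_category is RiskCategory.HIGH:
            issues.append("high-risk system without human-in-the-loop controls")
        return issues

check = AIDiligenceCheck("ExampleCo", RiskCategory.HIGH,
                         has_governance_docs=True,
                         training_data_licensed=False,
                         human_oversight=True)
print(check.flags())  # ['unverified training data provenance (copyright exposure)']
```

An empty flag list isn't a clean bill of health — each field still has to be verified against the actual system, which is the point of the next paragraph.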

None of these questions can be answered by simply asking the CTO. They require technical access — read-only at minimum — and someone who understands both the regulatory framework and the engineering reality.

The regulatory layer is now a permanent feature of European deal risk

GDPR reshaped data handling. DORA — fully applicable from January 2025 — covers 22,000+ EU financial entities with operational resilience requirements that extend to their tech suppliers. NIS2 covers 18 critical sectors. The EU AI Act adds a fourth regulatory dimension that cuts across all sectors.

The funds that build regulatory fluency into their technical assessment process will make better-informed decisions than those treating it as legal overhead. A portfolio company that crosses a regulatory line post-investment is an expensive problem. One that's never backed because the compliance cost wasn't assessed pre-LOI is a missed return.

What we assess

Every X-Ray includes an AI Readiness pillar: classification under the EU AI Act, governance practices, data provenance assessment, and a benchmark against other AI-first companies we've assessed across Europe. If a company is building on AI and you're considering backing them, this is not optional diligence — it's the diligence.

August 2026 is closer than it looks. The time to build this into your deal process is now, not when the first portfolio company triggers a compliance review.
