AI Strategy · 8 min read · February 2026

EU AI Act Compliance: A Practical Checklist for Dutch Organisations

The EU AI Act is in force. Dutch organisations in high-risk sectors have mandatory obligations. This checklist covers the key requirements and where most organisations are currently falling short.

IITS Team
International IT Solutions B.V., The Hague

The EU AI Act is in force. For Dutch organisations operating AI systems in regulated sectors, mandatory obligations are no longer theoretical. This is a practical checklist based on our work helping clients prepare — and an honest assessment of where most organisations currently stand.

What the Act Actually Requires

The EU AI Act creates a risk-tiered framework. Most AI systems fall into the 'limited risk' or 'minimal risk' categories, requiring only transparency measures (disclosing that users are interacting with AI). The compliance burden falls significantly on 'high-risk' systems — specific categories defined in Annex III of the Act.

High-risk systems face requirements across seven domains: risk management systems, data governance, technical documentation, record-keeping and logging, transparency and information to users, human oversight, and accuracy and robustness.

Is Your System High-Risk?

Annex III lists eight categories. Dutch organisations are most frequently affected in these areas:

  • Credit scoring and creditworthiness assessment (banking and lending)
  • Employment and worker management — CV screening, performance monitoring, task allocation
  • Access to benefits, essential services, and emergency services (public sector)
  • Biometric identification and categorisation
  • AI used in education and vocational training outcomes
  • Systems making decisions in the administration of justice

The Compliance Checklist

For each high-risk system, you need to be able to demonstrate:

  1. A documented risk management system covering the full lifecycle of the AI system
  2. Data governance procedures covering training, validation, and testing datasets — including data quality criteria
  3. Technical documentation sufficient for a conformity assessment body to evaluate compliance
  4. Automatic logging of events during operation (inputs, outputs, human oversight actions)
  5. Clear user-facing information about the AI system's capabilities and limitations
  6. Defined human oversight measures — specific roles, intervention capabilities, and authority to override
  7. Documented accuracy, robustness, and cybersecurity measures with performance benchmarks
  8. A conformity assessment completed before market deployment (for the highest-risk categories)

Where Most Dutch Organisations Are Falling Short

Based on our assessments, the most common gaps are not in the technical systems themselves but in documentation and process:

  • No AI system inventory exists. Organisations cannot identify all deployed AI systems, let alone classify them by risk level.
  • Existing documentation was written for IT governance, not AI Act compliance. It typically lacks risk management and data governance specifics.
  • Human oversight is nominal. Systems have a "human in the loop" checkbox but no defined procedure for exercising that oversight or authority to override.
  • Logging is insufficient. Many systems log outputs but not inputs, model versions, or confidence levels — all required for meaningful audit trails.
  • Third-party systems are not assessed. The Act applies to deployers, not just developers. Buying a high-risk AI system from a vendor does not transfer compliance responsibility.

Immediate Actions to Take

  1. Conduct an AI system inventory across all departments. Include both internally-built and vendor-supplied systems.
  2. Classify each system using the Annex III criteria. When in doubt, treat it as high-risk.
  3. For high-risk systems: appoint an owner, initiate a risk management process, and start documentation.
  4. Review all AI vendor contracts. Ensure your agreements include provisions for the documentation and audit access you need to comply.
  5. Appoint or designate an AI Act compliance lead — internal or external — before regulators begin active enforcement.

The window for comfortable compliance preparation is narrowing. The Dutch Authority for Digital Infrastructure is actively building enforcement capacity. Document what you have before an assessment is requested.

Ready to apply this in your organisation?
Book a free strategy session with our team in The Hague.