Compliance · Mar 28, 2026 · 6 min read

The EU AI Act Is Now Enforced. What Does It Mean for Your Business?

Full enforcement has begun. Most small businesses are not at risk — but there are three things you should check before your next AI tool purchase.


The EU AI Act, the world's first comprehensive legal framework for artificial intelligence, is now in full enforcement. After years of debate, drafts, and delays, the obligations are real and the penalties are substantial: fines reach up to €35 million or 7% of global annual turnover for the most serious violations. The question for business owners across Europe is simple: does this affect me, and if so, how?

The short answer for most small businesses: you are probably fine, but you need to do a quick check.

How the Act categorises risk

The EU AI Act uses a tiered risk model. Not all AI is treated the same. The tier your AI tools fall into determines your obligations.

Unacceptable risk — banned outright

This tier covers social scoring by governments, real-time biometric identification in public spaces, and subliminal manipulation. None of it applies to standard business software. You can ignore this category.

High risk — strict obligations

AI used in hiring and HR decisions, credit scoring, educational assessment, critical infrastructure, and law enforcement falls here. If you use AI to automatically screen CVs and make hiring decisions, or if your AI system influences credit terms for customers, you are in this tier. You need conformity assessments, human oversight mechanisms, and detailed documentation.

Limited risk — transparency requirements

Chatbots and AI-generated content tools fall here. The main obligation: users must know they are interacting with AI. If your website has a chat widget powered by AI, it needs a disclosure. This is minimal effort.

Minimal risk — no obligations

Spam filters, AI writing assistants, recommendation engines, analytics tools — the vast majority of AI software used by small businesses. No new obligations apply here.
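As a rough mental model (not legal advice), the tiered structure above can be sketched as a simple lookup. The tier names and examples below come straight from the descriptions in this post; a real classification turns on legal analysis of the use case, not keyword matching.

```python
# Illustrative sketch of the AI Act's tiered risk model.
# The categories and examples mirror this post's descriptions;
# an actual determination requires legal review, not a lookup table.

RISK_TIERS = {
    "unacceptable": {   # banned outright
        "social scoring", "real-time biometric surveillance",
        "subliminal manipulation",
    },
    "high": {           # conformity assessment, human oversight, documentation
        "cv screening", "credit scoring", "educational assessment",
        "critical infrastructure", "law enforcement",
    },
    "limited": {        # transparency: users must know they face an AI
        "chatbot", "ai-generated content",
    },
    "minimal": {        # no new obligations
        "spam filter", "writing assistant",
        "recommendation engine", "analytics",
    },
}

def risk_tier(use_case: str) -> str:
    """Return the risk tier for a known example use case."""
    for tier, examples in RISK_TIERS.items():
        if use_case.lower() in examples:
            return tier
    return "unknown: review against the Act's annexes"

print(risk_tier("CV screening"))  # high
print(risk_tier("chatbot"))       # limited
```

The point of the sketch is the shape of the question: it is the use case, not the underlying technology, that determines the tier. The same language model can sit in "minimal" as a writing assistant and in "high" as a CV screener.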

Three things to check before your next AI tool purchase

1. Audit what AI you actually use, including AI features embedded in software you already run.
2. Check whether any of it touches consequential decisions about people: hiring, credit, or access to services. If it does, you may be in the high-risk tier.
3. Make sure any customer-facing AI, such as a chat widget, discloses that users are interacting with AI.

What about general-purpose AI tools?

Tools like ChatGPT, Claude, and Gemini fall under the Act's general-purpose AI (GPAI) provisions. The obligations here rest primarily on the model providers, not on businesses using those models via APIs or standard interfaces. You benefit from compliance they are already required to maintain.

Where your responsibility kicks in: if you build a product on top of these APIs that serves customers in a high-risk context, the downstream obligations are yours.

The practical takeaway

For most small and medium businesses, the EU AI Act requires an hour of review, not a compliance overhaul. Audit what AI you use, identify if any of it touches consequential decisions about people, and make sure customer-facing AI is disclosed. That covers the vast majority of cases.

If you operate in a sector where AI informs credit, hiring, or access to services, get proper legal advice — the high-risk tier has real teeth.

If you want a straightforward review of your current AI stack against the Act's requirements, that is something we can walk through together in a single session.

Not sure where your AI tools sit in the risk framework?

We can walk through it with you.

Start the conversation