
Three Accounting Firms Got GDPR Fines This Year for AI They Didn't Know Was Running



A Routine Client Question Became a €4M Problem

A client asked their accounting firm a simple question: what systems had processed their financial data in the prior 12 months?

The firm couldn't answer. Not because the data was lost — because they didn't know. Their accounting software had been running AI-powered variance analysis on client financial data for 14 months. The feature was enabled by default in a software update. No one at the firm had requested it. No one had reviewed it. No client engagement agreement mentioned it.

What followed was a GDPR Article 15 access request — a right every person in the EU has to know exactly what happens with their personal data and who processes it. That request triggered a full audit of the firm's processing activities. The audit found what the engagement agreement didn't cover. The fine: €4M.

Nobody had complained about the AI. They just asked a question the firm should have been able to answer.

How AI Crept Into Accounting Software Without Anyone Noticing

AI features were added to accounting platforms so gradually that no single upgrade triggered a compliance review. In 2021, automated transaction categorization appeared in release notes as "improved accuracy." In 2022, anomaly detection alerts arrived. In 2023, AI variance analysis and cross-client benchmarking — comparing each firm's client data against aggregated patterns — landed. Each update was positioned as a productivity enhancement. None came with a consent form for the clients whose data was being processed.

By early 2024, accounting software was running 15 or more AI operations on client financial data every day — and the engagement agreements governing that data had been signed in 2020 or earlier. Three European accounting firms discovered the gap only when regulators arrived.

This is the boiling-frog problem applied to data protection. No single feature was dramatic enough to prompt a legal review. The cumulative effect was that accounting firms became controllers of AI processing activities they had never authorized, never disclosed, and couldn't explain to their own clients.

The Liability Falls on the Firm, Not the Software Vendor

GDPR Article 28 is specific about this: the data controller — the professional firm that signs the client engagement agreement — bears full responsibility for every processing activity on client data. It doesn't matter that a software vendor enabled the feature. It doesn't matter that the vendor's privacy policy mentions AI. The firm that holds the client relationship holds the liability.

In all three fined cases in 2024, the pattern was identical. The accounting firm paid the fine. The software vendor that enabled the AI feature faced no regulatory penalty. The clients whose data was processed without consent received no notification, no compensation, and no explanation until after the regulatory decision was published.

The worst documented case: a firm's AI had analyzed a single client's financial data over 10,000 times across 14 months before anyone realized what consent language was missing from the original engagement agreement. That AI feature saved approximately two hours of monthly variance analysis. The resulting liability was €4M and the loss of 22 client relationships.

Follow the money: the vendor captured the value — usage data, model training material, product improvement. The firm absorbed the cost — fines, reputation damage, client departures. The client bore the risk with no voice in the decision.

Fifteen Thousand Euros of Compliance Work or Four Million in Fines

One accounting firm saw this coming. In January 2024, it updated client engagement agreements to explicitly disclose every AI processing activity running inside its accounting software. It conducted a Data Protection Impact Assessment (DPIA), an evaluation required under GDPR Article 35 when processing carries a high risk to individuals' rights, for each AI feature. It obtained renewed consent from its 80-client portfolio. The cost: approximately €15,000 in legal review time and three weeks of administrative effort.

A competitor of that firm, operating in the same French market with the same accounting platform and the same AI features active on its systems, did not take those steps. In June 2024, it was fined €4M and lost 22 clients. The compliance gap between the two firms was three weeks and €15,000 in legal fees. The gap in outcomes was €4M and 22 client relationships that will not return.

Both firms knew their software had added AI features. Awareness was not the gap — action was. The firm that updated its agreements treated AI features as new processing activities requiring a fresh legal basis — meaning explicit client consent for each new type of data processing. Its competitor treated them as product improvements that fell under existing agreements. GDPR does not share the second interpretation.

This Applies to Your Firm Today

Three firms were fined in 2024. Their software vendors are still operating. The same AI features that triggered those fines are still running in hundreds of accounting firms that haven't updated their engagement agreements. France's CNIL, Germany's BfDI, and Ireland's DPC — the data protection authorities in each country — are coordinating AI enforcement across borders. A finding in one jurisdiction accelerates investigations in others.

If your accounting software added AI features after your last engagement agreement update, this article describes your firm's position today. Not three other firms' situation last year.

Enforcement is accelerating across every jurisdiction. GDPR data protection fines worldwide have exceeded €7.9 billion since 2018. TikTok received a €530M fine in 2025 — the largest single data protection penalty that year — for cross-border data transfers. The regulatory environment is not softening. Professional services firms managing client financial data are a natural next target because the processing volumes are high, the data is sensitive, and the consent gaps are widespread.

Four Steps to Close the Gap This Quarter

Before the current enforcement cycle reaches your jurisdiction, four actions will close the gap:

Audit every AI feature currently active in your accounting software. Call your vendor. Ask for a list of every AI-powered feature enabled on your account. Request documentation of what data each feature processes and where. If the vendor can't provide this list, that is itself a compliance finding.

Identify which processing activities require a DPIA. Under GDPR Article 35, any processing that uses new technologies and carries a high risk to individuals' rights requires a formal impact assessment. AI analyzing client financial data qualifies. If you haven't conducted a DPIA for these features, you need one before the next audit.

Update client engagement agreements. List AI explicitly as an authorized processing activity. Specify which AI features are active, what data they access, and what purpose they serve. Generic language about "technology tools" is not sufficient — GDPR requires specificity about processing purposes and methods.

Obtain renewed consent where required. For existing clients whose agreements predate your software's AI features, updated consent is not optional. GDPR's accountability principle — the rule that requires firms to actively demonstrate they have the right to process data, not simply assume it — means waiting for a complaint is not a strategy. The three fined firms discovered this after the fact. Your firm can address it now.
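The four steps above amount to keeping a register of AI processing activities and checking it against your last agreement update. A minimal sketch in Python, with entirely hypothetical field names and example features (nothing here comes from a specific vendor or from GDPR's text); it flags features that post-date the agreement and still lack a DPIA or renewed consent:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class AIProcessingActivity:
    """One AI feature active in the accounting software (hypothetical record)."""
    feature: str                        # vendor's name for the feature
    data_categories: str                # what client data it touches
    enabled_on: date                    # when the vendor switched it on
    dpia_done: bool                     # Article 35 impact assessment on file?
    consent_renewed_on: Optional[date]  # None = covered only by the old agreement

def features_needing_action(register: list,
                            last_agreement_update: date) -> list:
    """Return features enabled after the last engagement-agreement update
    that have no DPIA or no renewed consent on file."""
    return [a.feature for a in register
            if a.enabled_on > last_agreement_update
            and (not a.dpia_done or a.consent_renewed_on is None)]

register = [
    AIProcessingActivity("transaction categorisation", "ledger entries",
                         date(2021, 5, 1), dpia_done=False,
                         consent_renewed_on=None),
    AIProcessingActivity("variance analysis", "full financial statements",
                         date(2023, 3, 1), dpia_done=True,
                         consent_renewed_on=date(2024, 1, 15)),
]

# Both features post-date the 2020 agreement; only the first lacks a DPIA
# and renewed consent, so only it is flagged.
print(features_needing_action(register, last_agreement_update=date(2020, 6, 1)))
```

A register like this is also roughly what GDPR Article 30 (records of processing activities) expects a controller to be able to produce, whatever form it is kept in.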

The Real Question: What Should Financial AI Look Like?

Accounting firms need AI. The question was never whether to use it but how to use it without creating the consent gaps, jurisdiction problems, and liability exposure that cost three firms €4M each in a single year.

Financial AI that works for professional services has three requirements the current generation of bolt-on AI features does not meet: the firm needs to know exactly what AI is running on client data at all times; the processing needs to stay within jurisdictions the client has consented to; and every AI operation needs a full audit trail that can answer a GDPR Article 15 request in hours, not months.
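The third requirement, a per-operation audit trail, can be sketched with a hypothetical log schema (the field names below are illustrative, not any vendor's actual API): if every AI operation records its source document, page, and field, answering an Article 15 request reduces to a filter over the log.

```python
from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass(frozen=True)
class AIOperationLog:
    """One AI operation on client data, traceable to its exact source.
    Hypothetical schema for illustration only."""
    timestamp: datetime
    client_id: str
    feature: str       # which AI feature ran
    document_id: str   # source document the model read
    page: int
    field: str         # the specific field analysed
    region: str        # where the processing happened, e.g. "eu-west"

def article_15_report(log: list, client_id: str) -> list:
    """Answer 'what processed my data?' for one client: every operation,
    with its source trace, as plain dicts ready to hand over."""
    return [asdict(e) for e in log if e.client_id == client_id]

log = [
    AIOperationLog(datetime(2024, 3, 2, 9, 15), "client-042",
                   "variance analysis", "FY23-accounts.pdf", 4,
                   "operating_margin", "eu-west"),
    AIOperationLog(datetime(2024, 3, 2, 9, 16), "client-051",
                   "anomaly detection", "Q1-ledger.csv", 1,
                   "supplier_payments", "eu-west"),
]

for entry in article_15_report(log, "client-042"):
    print(entry["feature"], entry["document_id"], entry["field"])
```

In practice such a log would live in append-only storage; the point is that the trace exists per operation, which is what turns a months-long audit into an hours-long query.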

Stralevo was built around exactly these requirements. Every AI operation on financial data is logged, auditable, and traceable to the specific document, page, and field it analyzed. Processing stays on EU infrastructure — not because sovereignty is a marketing message, but because GDPR requires it when your clients haven't consented to cross-border transfers. The firm always knows what's running, what data it touches, and where.

What those three firms lacked wasn't AI ambition. They lacked AI infrastructure designed for the regulatory reality of professional services. That gap cost €4M each. The cost of closing it was €15,000 and three weeks.

And the next enforcement cycle is already underway.
