Roughly 80% of AI projects in finance fail to deliver their expected value. That statistic should make every CFO pause. Not because the technology doesn't work. It does. The large language models are impressive. The automation platforms are capable. The dashboards are beautiful.
They fail because the finance function underneath them isn't ready. And most post-mortems blame the wrong things.
The assumption vs the reality
When an AI project stalls in finance, the default narrative is that the technology wasn't right, or the vendor oversold, or the use case was too ambitious. Sometimes those things are true. But in my experience, after 25 years of building and fixing finance functions, the actual reasons are far more mundane.
Data quality. The AI needs clean, consistent, well-structured data. Most finance functions don't have that. They have data scattered across multiple systems, inconsistent naming conventions, manual workarounds that introduce errors, and reconciliations that exist precisely because the data can't be trusted.
Process immaturity. AI automates processes. If your processes are inconsistent, poorly documented, or dependent on individual knowledge, there's nothing stable to automate. You're asking AI to learn a process that changes depending on who's doing it and what day of the month it is.
Organisational unreadiness. Even when the data is clean and the processes are solid, the organisation needs to be ready to work differently. That means trust in the output, willingness to change workflows, and leadership that understands what AI can and cannot do.
These three factors account for the vast majority of AI failures in finance. And none of them are technology problems.
The perception gap
There's a fascinating disconnect in how organisations see their own AI readiness. Research consistently shows that around 51% of CFOs claim their organisations have achieved full AI adoption. But when you ask the controllers and finance managers — the people actually doing the work — only about 19% agree.
That's not a small gap. That's a chasm. And it tells you something important: the people at the top think AI is working. The people in the middle know it isn't.
The CFO sees the dashboard. The controller sees the manual workaround that feeds it.
I've seen this play out repeatedly. A PE-backed business invests in an AI-powered forecasting tool. The CFO presents the output at board meetings. What nobody mentions is that the management accountant spends two days each month manually adjusting the inputs because the underlying data isn't structured in a way the tool can consume. The AI is technically running. It's just not doing what anyone thinks it's doing.
What "AI-ready" actually means
When I assess a finance function's AI readiness, I look at six dimensions. This isn't a theoretical framework. It's what I've found actually determines whether AI will work in practice.
1. Data foundation. Is your data clean, consistent, and accessible? Do you have a single chart of accounts across all entities? Are your naming conventions standardised? Can you extract data from your systems without manual intervention? Most finance functions score poorly here, especially those that have grown through acquisition.
2. Process maturity. Are your core finance processes documented, standardised, and repeatable? Could someone new follow them without guidance from the person who usually does the work? If your month-end close depends on institutional knowledge rather than documented procedures, you're not ready.
3. Controls environment. Do you have proper controls around data input, processing, and output? AI doesn't just need good data — it needs data you can prove is good. That means audit trails, approval workflows, and exception handling that's systematic rather than ad hoc.
4. Systems architecture. Are your systems integrated, or are you running parallel systems with manual bridges between them? AI works best when it can access data through clean APIs and automated feeds, not when it depends on someone exporting a CSV and reformatting it.
5. People and skills. Does your team understand what AI does and doesn't do? Are they willing to work alongside automated processes? Do you have anyone who can evaluate AI output critically, or will people blindly trust whatever the system produces?
6. Governance framework. How will you manage AI-related risks? Who's responsible for the accuracy of AI-generated output? How do you handle the EU AI Act requirements that are already affecting financial services? Most finance functions haven't even started thinking about this.
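For readers who want to turn the six dimensions above into something trackable, here is a minimal sketch of a self-assessment scorecard. The 1-5 scoring scheme and the "readiness is capped by your weakest dimension" framing are my assumptions for illustration, not a formal methodology:

```python
# Hypothetical 1-5 self-scores against the six readiness dimensions.
DIMENSIONS = [
    "data_foundation", "process_maturity", "controls_environment",
    "systems_architecture", "people_and_skills", "governance_framework",
]

def readiness_summary(scores):
    """Given {dimension: score 1-5}, report the floor and the weakest areas.

    Assumption: overall readiness is set by the lowest-scoring dimension,
    not averaged away by the strongest ones.
    """
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"Unscored dimensions: {missing}")
    floor = min(scores.values())
    weakest = sorted(d for d, s in scores.items() if s == floor)
    return {"floor": floor, "weakest": weakest}
```

A function like this does nothing clever; its value is forcing an honest score against every dimension rather than the ones where the organisation already looks good.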
The Marie Myers principle
Marie Myers, the former CFO of HP, articulated something I think about constantly: "The first move wasn't to switch on AI, but to redesign the work."
That's the insight most organisations miss. They think AI adoption is about choosing the right tool and implementing it. It's actually about redesigning how work gets done so that AI can be effective. That means standardising processes before you automate them. Cleaning data before you feed it to algorithms. Building controls before you hand decisions to machines.
The organisations that succeed with AI in finance almost always follow this sequence: fix the foundation, then add the intelligence. The ones that fail do it the other way round.
Why this matters for PE and VC specifically
If you're in a PE- or VC-backed business, AI readiness isn't just an efficiency question. It's a value question.
Compliance risk gets priced into exits. If your finance function can't demonstrate proper controls around AI-generated output, that's a risk factor in due diligence. If you're using AI for forecasting or reporting and you can't explain how the numbers are produced, buyers will discount your projections.
Operating partners are already asking about AI readiness. I'm seeing it in every portfolio review I'm involved with. Not because they want to see flashy AI demos, but because AI readiness is a proxy for finance function maturity. A finance function that's AI-ready is, by definition, one with clean data, standardised processes, proper controls, and capable people. That's what creates value.
The flip side is also true.
A finance function that's deployed AI on top of messy foundations is a liability. It looks modern, but it's fragile. And sophisticated buyers can see through it.
The foundation is the work
Most of what it takes to be AI-ready is straightforward finance operations. Not technology. Not data science. The unglamorous work of process standardisation, data governance, controls design, and documentation.
Standardise your chart of accounts across all entities. Document your close process so anyone can follow it. Clean up your master data: customer names, cost centres, GL codes. Build proper reconciliation processes with audit trails. Create controls around data input that prevent errors rather than just catching them.
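To make "controls that prevent errors rather than just catching them" concrete, here is a minimal sketch of input validation on master data. The field names, the 4-digit GL code convention, and the cost-centre reference list are all hypothetical; the point is that bad records are rejected at entry instead of surfacing later in a reconciliation:

```python
import re

# Hypothetical master-data records; field names are illustrative only.
ledger_rows = [
    {"gl_code": "4000", "cost_centre": "UK-OPS", "customer": "Acme Ltd"},
    {"gl_code": "40A0", "cost_centre": "UK-OPS", "customer": "ACME LTD"},
    {"gl_code": "5100", "cost_centre": "FR-SALES", "customer": "Borel SARL"},
]

VALID_COST_CENTRES = {"UK-OPS", "FR-SALES"}   # assumed reference list
GL_CODE_PATTERN = re.compile(r"^\d{4}$")      # assumed 4-digit convention

def validate_row(row):
    """Return a list of rule violations for one master-data row."""
    errors = []
    if not GL_CODE_PATTERN.match(row["gl_code"]):
        errors.append(f"GL code '{row['gl_code']}' is not a 4-digit code")
    if row["cost_centre"] not in VALID_COST_CENTRES:
        errors.append(f"Unknown cost centre '{row['cost_centre']}'")
    return errors

def find_duplicate_customers(rows):
    """Flag customer names that differ only by case or spacing."""
    seen, duplicates = {}, []
    for row in rows:
        key = " ".join(row["customer"].lower().split())
        if key in seen and seen[key] != row["customer"]:
            duplicates.append((seen[key], row["customer"]))
        seen.setdefault(key, row["customer"])
    return duplicates

# Reject bad input at the point of entry rather than reconciling it later.
violations = {i: validate_row(r) for i, r in enumerate(ledger_rows) if validate_row(r)}
dupes = find_duplicate_customers(ledger_rows)
```

Rules this simple catch a surprising share of the data problems that otherwise end up fed to an AI tool, and they are exactly the kind of systematic exception handling described above.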
This work isn't exciting. It doesn't make for good conference presentations. But it's what makes AI actually work.
The technology layer (selecting tools, integrating systems, training models) is easier than it's ever been. The tools are mature, the integration options are extensive, and the vendors have largely figured out how to deploy in mid-market finance functions. The technology isn't the hard part any more. The foundation is.
Three questions before you buy any AI tool
Before you sign that contract or approve that budget, ask yourself:
1. Can I describe the process this will automate in a standard operating procedure? If you can't write down exactly how the work gets done today — step by step, with no "and then Sarah does her thing" gaps — then you don't have a process mature enough to automate. AI can't standardise a process for you. It can only execute one that already exists.
2. If I gave the AI's input data to a new hire, could they produce the right output manually? This tests both data quality and process clarity. If a competent accountant couldn't produce the right answer from the data the AI will receive, the AI won't either. Machines don't have the institutional knowledge your team uses to compensate for bad data.
3. Who will be accountable for the AI's output, and how will they verify it? If the answer is "nobody" or "we'll trust the system," you're not ready. Every AI output in finance needs a human owner who understands what they're looking at and can spot when something's wrong. That requires both skills and governance structures.
If you can't answer all three clearly, you're not ready to buy. You're ready to fix your foundation.
Realistic timeline expectations
The vendors will tell you 7-12 months to full deployment and ROI. In my experience, for a PE- or VC-backed mid-market business, the realistic timeline is more like 2-4 years for meaningful return.
That breaks down roughly as:
- Months 1-6: Foundation work. Data cleanup, process standardisation, controls design, team training. This is where the real value starts. Your finance function gets better before AI even enters the picture.
- Months 6-12: Pilot phase. One or two carefully chosen use cases, usually in areas where the data is cleanest and the process most standardised. Accounts payable automation or basic reporting are common starting points.
- Months 12-24: Expansion. Apply what you learned in the pilot to additional use cases. Adjust processes that didn't work as expected. Build internal capability to manage AI tools.
- Months 24-36: Maturity. AI is embedded in core finance processes. The team trusts and understands the output. You can demonstrate to buyers or auditors exactly how AI-generated numbers are produced.
That timeline might sound long. But compare it to the alternative: buying a tool tomorrow, spending 12 months trying to make it work with messy data and immature processes, then writing it off as a failed project and starting over. I've seen that cycle repeat three times in a single organisation.
Fix the foundation first
The technology works. I want to be clear about that. AI in finance is not hype. It's capable of transforming how finance functions operate. Automated reconciliations, predictive analytics, natural language reporting, anomaly detection — these things work when the foundation is right.
But the foundation has to be right first. Clean data. Standard processes. Proper controls. Capable people. Clear governance.
That's the bulk of the work. And it's work that makes your finance function better regardless of whether you ever deploy AI.
A finance function with clean data, standardised processes, and strong controls creates value at exit whether it uses AI or not.
Fix the foundation first. Then the AI works.
If you're trying to figure out whether your finance function is actually ready for AI — or what needs to happen first — the AI Readiness Assessment is a focused 2–3 week engagement designed specifically for this. No vendor bias, no technology agenda. Just a clear view of where you stand and a practical roadmap to get AI-ready.
