Following the session on “AI and SMPs: What’s Changing, What Matters, What’s Next”, the message is clear: AI is no longer optional, but it must be adopted with discipline, safeguards, and professional oversight.
The accountancy sector is ahead of many industries in digital adoption. However, most firms still use AI mainly for support tasks such as drafting emails, summarising documents, or research. The real productivity gains will come when AI is embedded into core audit and accounting workflows.
Practical areas ready for AI adoption include risk flagging, reconciliation, revenue recognition review, financial statement drafting, and analysis of past PMP/PLP findings. For example, AI can analyse full transaction populations instead of samples, identify anomalies, automate bank-to-ledger matching, and generate first drafts of financial statements. These tools can reduce manual formatting and review time, but they do not replace professional judgement, partner sign-off, or complex audit assessments such as going concern and valuation.
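To make the reconciliation use case concrete, the following is a minimal sketch (not a production tool) of naive bank-to-ledger matching over a full transaction population, with unmatched items flagged for manual review. All field names and the tolerance value are assumptions for illustration, not part of any firm's actual workflow.

```python
# Illustrative sketch: match each bank transaction to an unused ledger
# entry with the same date and (within tolerance) the same amount.
# Anything unmatched is flagged as an anomaly for a human reviewer.
from collections import defaultdict

def match_bank_to_ledger(bank_txns, ledger_entries, amount_tol=0.01):
    unused = defaultdict(list)          # date -> available ledger entries
    for entry in ledger_entries:
        unused[entry["date"]].append(entry)

    matched, unmatched = [], []
    for txn in bank_txns:
        candidates = unused.get(txn["date"], [])
        hit = next((e for e in candidates
                    if abs(e["amount"] - txn["amount"]) <= amount_tol), None)
        if hit:
            candidates.remove(hit)      # each ledger entry matches once
            matched.append((txn, hit))
        else:
            unmatched.append(txn)       # flag for manual review
    return matched, unmatched

bank = [{"date": "2024-03-01", "amount": 1200.00},
        {"date": "2024-03-01", "amount": 55.50},
        {"date": "2024-03-02", "amount": 980.00}]
ledger = [{"date": "2024-03-01", "amount": 1200.00, "ref": "INV-101"},
          {"date": "2024-03-02", "amount": 975.00, "ref": "INV-102"}]

matched, unmatched = match_bank_to_ledger(bank, ledger)
print(len(matched), len(unmatched))  # 1 matched, 2 flagged for review
```

The point of the sketch is the shape of the workflow: the machine clears the routine matches across the entire population, and human attention is spent only on the flagged residue.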
Confidentiality remains a critical risk. Firms should use paid or enterprise AI platforms, disable data training where possible, approve only reputable providers, and avoid uploading identifiable client information unless necessary. Sensitive data should be anonymised or desensitised before use. For highly confidential engagements, firms should consider private or on-premise AI solutions. The guiding principle is simple: once data is uploaded to a cloud AI model, it should be treated as no longer fully under the firm’s control.
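One way to picture the desensitisation step is a simple pre-upload redaction pass. The sketch below masks known client names and obvious identifier patterns before any text leaves the firm; the patterns and placeholders are examples only, and real redaction rules would need to be firm-approved.

```python
# Illustrative sketch of pre-upload desensitisation: mask identifiers
# before text is sent to a cloud AI tool. Patterns here are examples,
# not a complete or approved redaction rule set.
import re

PATTERNS = [
    (re.compile(r"\b\d{8,12}\b"), "[ACCOUNT-NO]"),          # long digit runs
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email addresses
]

def desensitise(text, client_names):
    """Replace known client names, then pattern-matched identifiers."""
    for name in client_names:
        text = text.replace(name, "[CLIENT]")
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

note = "Acme Pte Ltd paid 123456789 via finance@acme.com on 3 March."
print(desensitise(note, ["Acme Pte Ltd"]))
# [CLIENT] paid [ACCOUNT-NO] via [EMAIL] on 3 March.
```

Note that pattern-based masking is only a first line of defence; it cannot catch every identifier, which is why the session's guiding principle treats anything uploaded as no longer fully under the firm's control.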
AI adoption should be problem-led, not tool-led. A genuine use case starts with a manual, repeatable, time-consuming task and delivers clear savings or quality improvements. Hype begins with a tool searching for a problem. Firms should assess each AI use case based on cost, consistency, integration into existing workflows, and reliability of outputs.
Because AI can hallucinate or produce inconsistent results, validation is mandatory. Calculations must be checked against source data, regulatory references verified against original standards, risk flags reviewed manually, and financial statement wording checked for accuracy and compliance. For regulatory inspection, firms should document the prompt used, the AI output, reviewer challenges, amendments made, and confirmation that professional judgement determined the final decision.
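The documentation items listed above can be captured in a simple structured record. The sketch below is one possible shape, with field names that are assumptions for illustration rather than any regulatory standard.

```python
# Illustrative sketch of an AI-assistance work record covering the
# documentation items named above: prompt, output, reviewer challenges,
# amendments, and confirmation of professional judgement.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class AIWorkRecord:
    engagement: str
    prompt: str                 # exact prompt sent to the AI tool
    ai_output: str              # output as received, before editing
    reviewer: str
    challenges: list            # points the reviewer questioned
    amendments: list            # changes made to the AI output
    judgement_confirmed: bool   # final decision rested on professional judgement
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

record = AIWorkRecord(
    engagement="FY2024 audit - revenue recognition",
    prompt="Summarise unusual revenue journal entries in Q4.",
    ai_output="Three entries posted after year-end cut-off...",
    reviewer="Audit senior",
    challenges=["Verify cut-off dates against source invoices"],
    amendments=["Removed one false positive after invoice check"],
    judgement_confirmed=True,
)
print(asdict(record)["judgement_confirmed"])  # True
```

Keeping records in a consistent structure like this makes it straightforward to demonstrate, at inspection, that every AI-assisted conclusion passed through human challenge and sign-off.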
Audit staff will need new competencies, including prompt engineering, AI governance, output validation, process redesign, and ethical judgement. Staff should be trained not only in how to use AI, but also in when not to use it.
In the next 30 days, the firm should adopt an AI usage policy, pilot a paid AI tool with selected audit seniors and managers, choose one high-impact process such as financial statement drafting or reconciliation, train staff, and document all AI-assisted work in audit files.
AI will not replace the accountant’s role in trust, judgement, and compliance. But firms that start small, validate rigorously, and scale only proven use cases will gain a significant advantage.