Consumer Duty and AI: what the FCA expects
The Consumer Duty, which came into full effect in July 2024, represents the most significant shift in FCA regulatory expectations in over a decade.[1] At its core, the Duty requires firms to deliver good outcomes for retail customers — and to be able to demonstrate that they are doing so.[2]
For firms adopting AI to assist with financial advice, the Consumer Duty creates both an opportunity and a challenge. The opportunity is clear: AI can help firms deliver more consistent, better-researched advice at scale. The challenge is proving to the regulator that this AI-assisted advice meets the same standard of care as traditional human advice.
The four outcomes and AI advice
The Consumer Duty is built around four outcomes: products and services, price and value, consumer understanding, and consumer support.[2] Each has direct implications for firms using AI.
Products and services: AI tools that generate investment recommendations or suitability reports must be designed and monitored to ensure they produce appropriate advice for the client's circumstances. The firm cannot delegate this responsibility to the AI provider.[4]
Price and value: If AI enables firms to produce advice more efficiently, the FCA will expect those efficiencies to be reflected in the value delivered to clients. A firm that significantly reduces its costs through AI cannot keep charging the same fees unless it can demonstrate that clients still receive proportionate value.[3]
Consumer understanding: Clients must be able to understand the advice they receive, regardless of how it was produced. AI-generated reports that are technically accurate but incomprehensible to the client do not meet this standard.[2]
Consumer support: Firms must ensure that clients can get help when they need it. An AI-first advice process still requires human oversight and accessible support channels.
The accountability gap
The FCA has not banned AI-assisted advice. What it has done is make clear that the firm — not the AI, not the technology provider — is accountable for every piece of advice it delivers.[4] This creates what we call the accountability gap: the distance between what the AI produces and what the firm can prove about its oversight process.
Consider a scenario where a client complains that the advice they received was unsuitable. The firm needs to demonstrate that the advice was properly reviewed before it reached the client. With traditional processes, this might involve pulling up an email chain or a note in the CRM. With AI-generated advice at scale, firms need something more systematic.
What the FCA will look for
Based on published guidance and recent supervisory activity,[3] the FCA is likely to focus on several areas when examining AI-assisted advice:
Governance and oversight: Does the firm have clear processes for reviewing AI-generated advice before it reaches clients? Are these processes documented and consistently followed?[5]
Record-keeping: Can the firm produce a complete audit trail showing what advice was generated, who reviewed it, what changes were made, and when it was approved? Are these records reliable and tamper-proof?
Monitoring and testing: Is the firm actively monitoring the quality of AI-generated advice? Are there processes for identifying and correcting errors?
Accountability: Can the firm clearly identify who is responsible for each piece of advice? Is there a qualified, FCA-authorised individual who has reviewed and approved the advice?[5]
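The record-keeping expectations above can be sketched in code: each audit entry captures what was generated, who acted on it, and when, and each entry commits to the hash of the one before it, so any retroactive edit or deletion breaks the chain and is detectable on verification. This is an illustrative sketch only, not a description of any particular product's implementation; all class, field, and method names here are hypothetical.

```python
import hashlib
import json
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    advice_id: str
    action: str          # e.g. "generated", "edited", "approved"
    actor: str           # reviewer identity
    detail: str          # summary of the change or decision
    timestamp: str       # UTC timestamp, set when the entry is recorded
    prev_hash: str       # hash of the previous entry, chaining the records
    entry_hash: str = ""

class AuditTrail:
    """Append-only, hash-chained log of actions on a piece of advice."""

    def __init__(self) -> None:
        self.entries: list[AuditEntry] = []

    def record(self, advice_id: str, action: str, actor: str, detail: str) -> AuditEntry:
        prev = self.entries[-1].entry_hash if self.entries else "genesis"
        entry = AuditEntry(
            advice_id=advice_id,
            action=action,
            actor=actor,
            detail=detail,
            timestamp=datetime.now(timezone.utc).isoformat(),
            prev_hash=prev,
        )
        entry.entry_hash = self._hash(entry)
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; an edited or deleted entry breaks it."""
        prev = "genesis"
        for e in self.entries:
            if e.prev_hash != prev or e.entry_hash != self._hash(e):
                return False
            prev = e.entry_hash
        return True

    @staticmethod
    def _hash(e: AuditEntry) -> str:
        payload = json.dumps(
            [e.advice_id, e.action, e.actor, e.detail, e.timestamp, e.prev_hash]
        )
        return hashlib.sha256(payload.encode()).hexdigest()
```

In use, a firm would record an entry for every generation, edit, and approval step, and run the verification check when producing records for a complainant or the regulator: a chain that verifies shows the records are exactly as written at the time.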
Building the evidence base
The firms that will navigate this landscape most successfully are those building their evidence base now, before the regulator comes asking. This means investing in infrastructure that creates automatic, verifiable records of every step in the advice process.
Bedrock was built specifically for this purpose. Every piece of AI-generated advice is recorded in an immutable ledger the moment it enters the system. Every review action is tracked with timestamps, reviewer identity, and full audit trail. Tamper-proof certificates provide independent verification that the process was followed correctly.
The Consumer Duty does not require firms to avoid AI. It requires them to use it responsibly, with proper oversight, and with the ability to prove that oversight to anyone who asks.[6] The firms that get this right will be the ones that thrive as AI transforms financial services.
References
1. FCA, "Consumer Duty", FCA Handbook, PRIN 2A
2. FCA, "PS22/9: A new Consumer Duty", Policy Statement, July 2022
3. FCA, "FG22/5: Final non-Handbook Guidance for the Consumer Duty", 2022
4. FCA, "AI Update", 2024
5. FCA, "Senior Managers and Certification Regime"
6. Bank of England & FCA, "Artificial intelligence in UK financial services", Third joint survey, 2024
Ready to build your compliance infrastructure?