MPs warn UK regulators are lagging on AI risks in finance

Britain’s financial regulators are failing to keep pace with the risks posed by artificial intelligence, leaving consumers and the financial system exposed, according to a hard-hitting report from the Treasury Select Committee.

In its fifteenth report of the current Parliament, MPs say AI adoption in UK financial services is accelerating faster than in almost any other sector, with around 75% of firms already using the technology, particularly insurers and large international banks.

While AI promises faster services, improved fraud detection and efficiency gains, the committee concludes that regulators have taken an overly cautious “wait-and-see” approach.

The report finds that the Financial Conduct Authority and the Bank of England rely on existing rules rather than AI-specific regulation, arguing this creates uncertainty for firms and weakens consumer protection.

RISK OF FRAUD

MPs highlight risks including opaque AI-driven credit and insurance decisions, the potential for financial exclusion, unregulated AI-generated financial advice and an increased risk of fraud.

Concerns were also raised about financial stability. MPs warn that heavy reliance on a small number of US technology firms for AI and cloud services creates concentration risks, while AI-driven trading could amplify market shocks. Despite this, regulators do not currently run AI-specific stress tests.

The committee is particularly critical of delays in implementing the Critical Third Parties regime, which would give regulators oversight of major cloud and AI providers.

More than a year after the regime was established, no firms have yet been designated, despite high-profile outages at major cloud providers affecting UK banks.

PRACTICAL GUIDANCE

The report calls on the FCA to publish clear, practical guidance by the end of 2026 on how existing consumer protection rules apply to AI and how accountability under the Senior Managers and Certification Regime should work.

It also urges the Bank of England and FCA to introduce AI-specific stress testing, and presses HM Treasury to designate major AI and cloud providers as critical third parties within the same timeframe.

Without faster action, MPs warn, the UK risks undermining confidence in one of its most important industries at a time when ministers are counting on AI to drive economic growth.

SAFETY MECHANISMS NEEDED

Dame Meg Hillier, Chair of the Treasury Select Committee, said: “Firms are understandably eager to try and gain an edge by embracing new technology, and that’s particularly true in our financial services sector which must compete on the global stage.

“The use of AI in the City has quickly become widespread and it is the responsibility of the Bank of England, the FCA and the Government to ensure the safety mechanisms within the system keep pace.

“Based on the evidence I’ve seen, I do not feel confident that our financial system is prepared if there was a major AI-related incident and that is worrying. I want to see our public financial institutions take a more proactive approach to protecting us against that risk.”

FRAUD RISK
Phil Cotter, CEO of SmartSearch, said: “The Treasury Select Committee has highlighted a growing risk that poorly governed AI could accelerate fraud, increase consumer harm and weaken trust in the financial system.

“In areas such as AML and financial crime prevention, AI must never operate as a black box.

“While AI can dramatically improve the detection of suspicious activity and reduce false positives, these benefits disappear if firms cannot explain decisions, evidence compliance to regulators or intervene when systems go wrong.”

LACK OF CLARITY

He added: “A lack of clarity around accountability and oversight creates real risks for vulnerable consumers and opens the door to greater financial crime.

“Regulatory clarity, explainable AI and human oversight are essential if AI is to strengthen – rather than undermine – the UK’s defences against fraud and financial crime.”
