FCA lays out approach to AI regulation

UK financial watchdog the Financial Conduct Authority (FCA) has called for the creation of a strong regulatory framework around the use of artificial intelligence in financial services.

Responding to the UK government’s consultation on a pro-innovation approach to AI regulation, the FCA has published its own update setting out its focus on how firms can safely and responsibly adopt the technology, as well as on understanding the impact AI innovations are having on consumers and markets.

The announcement comes ahead of a 30 April government deadline for UK regulators to outline their strategic approach to AI.

In the update’s foreword, Jessica Rusu, the FCA’s chief data, information and intelligence officer, notes that in the two years since the FCA published its Discussion Paper on AI, the technology has been “propelled to the forefront of agendas across the economy with unprecedented speed” and that “AI could significantly transform the way [financial institutions] serve their customers and clients.”

The report goes on to outline the regulator’s approach to AI and responds to the five key principles laid out by the government: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.

Looking ahead, the FCA said it plans to deepen its understanding of AI deployment in UK financial markets to ensure that “any potential future regulatory adaptations are proportionate to the risks, whilst creating a framework for beneficial innovation”.

The FCA added that it is currently carrying out diagnostic work on the deployment of AI across UK financial markets, and that it is running a third edition of its machine learning survey jointly with the Bank of England. The FCA said it is also collaborating with the Payment Systems Regulator (PSR) to consider AI across systems areas.

To ensure that AI is used in a way that is safe and responsible, the FCA said it is assessing opportunities to “pilot new types of regulatory engagement as well as environments in which the design and impact of AI on consumers and markets can be tested and assessed without harm materialising”.

As for its own use of AI, the regulator said the technology can help identify fraud and bad actors, noting that it uses web scraping and social media tools to detect, review and triage potential scam websites. The regulator said it plans to invest further in these technologies and is exploring additional use cases, including natural language processing to aid triage decisions, AI-generated synthetic data, and the use of large language models (LLMs) to analyse and summarise text.
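The FCA does not describe its tooling in any detail, but the detect-review-triage workflow it refers to can be illustrated with a minimal sketch. The Python snippet below is purely hypothetical: the keyword list (SCAM_PATTERNS) and scoring function (triage_page) are stand-ins for the trained NLP models a regulator would actually use, and it relies only on the widely used requests and BeautifulSoup libraries to fetch a page and strip its text.

```python
import re
import requests
from bs4 import BeautifulSoup

# Hypothetical scam indicators; a production system would use trained NLP
# classifiers rather than a fixed keyword list.
SCAM_PATTERNS = [
    r"guaranteed returns?",
    r"risk[- ]free investment",
    r"act now",
    r"double your (money|crypto)",
]


def triage_page(url: str) -> dict:
    """Fetch a web page and score its text against simple scam indicators."""
    response = requests.get(url, timeout=10)
    text = BeautifulSoup(response.text, "html.parser").get_text(" ", strip=True)
    hits = [p for p in SCAM_PATTERNS if re.search(p, text, flags=re.IGNORECASE)]
    return {
        "url": url,
        "matched_indicators": hits,
        # Pages with any matches are routed to a human reviewer first.
        "priority": "review" if hits else "low",
    }


if __name__ == "__main__":
    print(triage_page("https://example.com"))
```

In practice the triage score would feed a queue for human analysts, which is the kind of decision-support role the FCA describes for NLP in its update.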

The report concludes that while AI can “make a significant contribution to economic growth,” this will require “a strong regulatory framework that adapts, evolves and responds to the new challenges and risks that technology brings”.

Reacting to the FCA’s response, Karim Haji, global and UK head of financial services at KPMG, pointed to the sector’s continued focus on how AI will be regulated.

“The regulation of AI will continue to be a big issue for the financial services sector this year,” Haji commented. “A recent poll we conducted found 81 per cent of sector leaders ranked policies aimed at balancing the opportunities with the risks of AI as important when it comes to government policy ahead of a general election.”


