Smarsh survey reveals AI trust issues in UK finance

A recent survey by Smarsh has uncovered a sharp rise in the use of AI tools within the UK financial services sector, but also highlights a lack of clarity and control around how these tools are governed.

The report, based on insights from 2,000 employees across financial services and insurance, shows that over a third are using AI frequently, yet many remain uneasy about transparency, regulation, and data privacy.

The findings reveal a striking disconnect between usage and oversight. While AI tools are becoming a regular fixture in day-to-day operations, more than half of employees say they have received no formal training on how to use them responsibly. This is compounded by the fact that 38% of respondents are unsure whether their company has any systems in place to monitor or capture AI-generated outputs. A further 21% confirmed that their organisations do not have such controls at all.

Despite this, there is a strong appetite among employees for more robust governance. Nearly 70% of those surveyed said they would feel more confident using AI if all outputs were transparently monitored.

Smarsh president of enterprise business Tom Padgett said, “AI adoption in financial services has accelerated rapidly, with employees embracing these tools to boost productivity.

“But with innovation comes responsibility. Firms must establish the right guardrails to prevent data leaks and misconduct. The good news is that employees are on board—welcoming a safe, compliant AI environment that builds trust and unlocks long-term growth.”

Smarsh’s research also draws attention to the growing use of AI agents, automated systems that handle tasks such as customer service and even investment decisions. The survey found that 43% of organisations are deploying such tools in customer communications, while 22% are applying them in investment-related functions. However, these advances are not without concern. A third of employees questioned their firm’s ability to ensure these agents comply with regulatory requirements, while 29% expressed unease about the handling of potentially sensitive data.

Smarsh vice president of product Paul Taylor warned, “Using public AI tools without controls is digital negligence. You’re effectively feeding your crown jewels into a black box you don’t own, where the data can’t be deleted, and the logic can’t be explained. It’s reckless. Private tools like Microsoft 365 Copilot and ChatGPT Enterprise are a step in the right direction. Still, if companies aren’t actively capturing and auditing usage, they’re not securing innovation; they’re sleepwalking into a compliance nightmare.”

Smarsh offers several AI-powered compliance tools to help regulated firms bridge this gap, including solutions tailored for Microsoft 365 Copilot and ChatGPT Enterprise. These tools enable companies to embrace AI without sacrificing oversight, future-proofing their operations while reducing regulatory risk.
