Technology & Innovation

Britain is using AI, but wants a human boss


  • Saskia Koopman
  • May 6, 2026


Brits are rapidly folding AI into everyday life, but remain wary of letting it make decisions on its own, according to new EY research that points to a trust problem at the heart of the UK’s AI push.

EY’s recent AI sentiment index found 74 per cent of UK consumers have used AI in the past six months, in applications ranging from customer support and route planning to health queries and financial services.

But just 14 per cent said they would be comfortable relying on fully autonomous AI systems, highlighting a clear divide between using AI as a tool and trusting it to act independently.

The findings land as ministers and businesses pour money into AI adoption, with UK firms racing to prove the technology can deliver productivity gains rather than just shiny pilots.

Matthew Ringelheim, EY UK and Ireland AI leader, said: “AI adoption in the UK is rapidly advancing, but trust is not keeping pace with technological capability.”


“Whilst consumers are engaging with AI every day, many still want greater clarity about who is accountable when decisions are made on their behalf.”

Trust still lags adoption

EY’s survey found AI is already part of routine life for millions, with 35 per cent using it for customer support, 31 per cent for travel routes and 26 per cent to help identify possible medical symptoms.

Half of UK respondents said they had used AI in health or wellness experiences in the last six months, while 35 per cent had used it in financial activities.


Yet only 43 per cent trust companies to manage AI-related data effectively, while 41 per cent trust governments. Almost three quarters are worried AI systems could be hacked or breached.

Britain’s AI sector has strong momentum, with startups valued at more than £45bn and billions flowing into the market, but adoption inside large firms is increasingly being slowed by governance, data quality and accountability concerns.

OneStream research recently found nearly half of senior executives had made a material business decision using inaccurate or outdated data in the past year.

Ringelheim said: “Trust must be embedded through strong data foundations, clear accountability and visible human oversight.”

“Organisations that can clearly demonstrate how autonomy is governed, and how people retain meaningful control, will be best positioned to scale AI responsibly.”

The EY report also points to a skills gap, with just 23 per cent of UK consumers saying they had received significant AI training or education.

Ringelheim added: “Training also better equips users to better spot errors, challenge outputs and make more informed decisions on when to rely on AI and when to escalate human judgement.”


