Consumers don't trust AI

Issue 4 2023 News & Events

Despite the hype, many consumers admit they don't trust artificial intelligence, with a significant proportion also expressing cynicism about the benefits it brings.

Research conducted by Herbert Smith Freehills reveals that just 5% of UK consumers are unconcerned about the growing presence of AI in everyday life. Only 20% say they have a high level of faith that AI systems are trustworthy.

Undertaken to mark the launch of the firm's Emerging Tech Academy, the research explored views among 1000 consumers between the ages of 18 and 80. Respondents were asked about the type of AI systems they use today, expectations about future usage, and comfort levels with the way machines gather data and operate. Key findings include:

Manipulative machines: just over half (56%) do not accept that AI can be impartial. Additionally, more than one-third of respondents (37%) fear the outputs of AI systems could be biased against specific groups, and over half (53%) fear AI will make decisions that directly affect them based on incorrect information.

Responsive, but not responsible: while 60% accept that AI will make the world run more efficiently by offering solutions quickly, just over half (53%) say they are concerned about a lack of accountability in AI systems. Almost one-third (31%) also see AI tools failing to meet ethical expectations as a problem.

Modern, yet outdated: although a significant proportion accept that AI can help reduce human errors (44%), just 16% believe AI tools give accurate information. More than one-third (38%) also fear that AI systems use out-of-date information.

"Artificial intelligence can undoubtedly benefit consumers, but there is clearly still work to do to win their trust and overcome cynicism. The AI market risks being seen as the 'wild west' so, as policymakers define their strategies to address the risks of AI, they must ensure they are creating a system that delivers certainty and confidence now, while being flexible enough to promote and account for future innovations," says Alexander Amato-Cravero, Regional Head of Herbert Smith Freehills' Emerging Technology Group.

Based on the findings and ahead of the UK hosting the first major global summit on AI safety, Herbert Smith Freehills' Emerging Tech Academy has identified three steps which, taken together, can foster an environment in which consumer and business confidence in AI will improve. These are:

Accelerating the development and implementation of legally binding AI rules: the sooner policymakers can plug the gaps in the current patchwork of rules that apply to AI with laws, regulations, guidance, and principles that are fit for purpose and have the force of law, the sooner consumers and businesses will be comfortable engaging with AI systems.

Increasing alignment among domestic and global policymakers on AI: the risks associated with AI are overseen by multiple regulators and authorities. A harmonised approach is needed to address gaps in the existing collection of laws and regulations. With consumers engaging with businesses around the world, this discourse must go beyond domestic policy and address global alignment and interoperability as well.

Improving dialogue and better educating consumers and markets on AI risks: despite excitement about the possibilities of AI systems now and in the future, consumers' fear and distrust will be minimised through balanced dialogue about the benefits and risks.

Amato-Cravero concludes, "The key to long-term success is dialogue rather than fanfare. It's easy to get caught up in the hype, but building confidence in AI requires cutting through the noise with sharp focus on the opportunities and risks. At the same time, policymakers must deliver certainty to consumers and businesses by clarifying the patchwork of existing laws and regulations."

The research was conducted during May and June 2023 and is based on 1000 respondents, including individuals in full-time employment and education, drawn from across 11 UK regions.


Additional statistics

• Only 34% of respondents think AI is reliable.

• Those aged 55+ are less likely (21%) than those aged 35 or under (34%) to say AI helps them make better decisions.

• Fewer women (48%) than men (55%) are comfortable with the idea of companies using AI to diagnose health problems.

• More people are uncomfortable with the idea of AI being used to settle legal disputes (53%) than think it is a good idea (29%).






















© Technews Publishing (Pty) Ltd. | All Rights Reserved.