Tech experts express concern over deepfakes and misinformation around General Election
Tech experts are concerned about the impact of misinformation and deepfakes heading into the General Election, according to the results of a new poll conducted by Swindon-headquartered BCS, The Chartered Institute for IT.
62 per cent of IT professionals polled believe that the current ban on media coverage of the General Election on polling day should include an exception allowing mainstream media to refute fraudulent misinformation.
The current regulations, set by Ofcom, state that discussion and analysis of election issues must finish when the polling stations open, and not resume until they close.
The regulations also stipulate that whilst people are voting, broadcasters must not publish the results of any opinion polls.
Deepfakes ‘pose a risk to elections’
In the same poll of 1,200 tech experts, 65 per cent of respondents expressed concern that deepfakes will have an influence on the result of the upcoming UK General Election – and are calling for technical and policy solutions.
And 92 per cent of tech experts said political parties should agree to publicise when and how they are using AI in their campaigns.
Public education and technical solutions – such as watermarking and labelling – will be the two most effective measures for limiting the damage deepfakes do to democracy, the poll found.
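To illustrate the watermarking idea in the loosest possible terms, the toy sketch below hides a short provenance label (for example, "AI") in the least-significant bits of an image's pixel values and then recovers it. This is purely illustrative: the function names and approach are the author's assumptions, and real provenance schemes such as C2PA-style signed metadata are far more robust than this kind of simple steganographic mark.

```python
def embed_label(pixels, label):
    """Hide each bit of `label` (as UTF-8 bytes) in the least-significant
    bit of successive pixel values. Returns a new pixel list."""
    bits = [(byte >> i) & 1 for byte in label.encode() for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small to carry the label")
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite the LSB with our bit
    return out


def extract_label(pixels, length):
    """Read `length` bytes back out of the pixel LSBs and decode them."""
    data = bytearray()
    for byte_idx in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_idx * 8 + i] & 1) << i
        data.append(value)
    return data.decode()


# Usage: mark a (fake, flat) 64-pixel image and recover the label.
image = list(range(64))
marked = embed_label(image, "AI")
print(extract_label(marked, 2))  # prints "AI"
```

A mark like this survives lossless copying but is destroyed by recompression or cropping, which is one reason practitioners favour cryptographically signed metadata over bit-level tricks.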
A group of major tech companies signed a pact in February to adopt ‘reasonable precautions’ to prevent AI from being used to disrupt democratic elections around the world.
But only eight per cent of computing professionals in the BCS poll think this agreement will be effective.
Adam Leon Smith, BCS fellow and international AI standards expert, said: “As we approach the General Election, it is essential that broadcasters are more active in the fight against misinformation and disinformation, especially when it comes to those misleading the electorate.”
“By enabling reputable media outlets to fact-check and correct misleading content in real time, they can provide the public with accurate information, thereby fostering a more informed electorate and upholding democratic values.”
Rashik Parmar MBE, chief executive of BCS, The Chartered Institute for IT added: “Technologists are seriously worried about the impact of deepfakes on the integrity of the election – but there are things politicians can do to help the public and themselves.
“Parties should agree between them to clearly state when and how they are using AI in their campaigns.
“Official sources are just one part of the problem. Bad actors outside the UK and independent activists inside can do even more to destabilise things.
“We need to increase public awareness of how to spot deepfakes, double-check sources and think critically about what we’re seeing.
“We can support that with technical solutions, and the most popular in the poll was a clear labelling consensus where possible – and it would be ideal if this could be done globally with the US election coming too.”
Pictured: Prime minister Rishi Sunak and Labour leader Keir Starmer in front of a burning Houses of Parliament. This AI-generated image is not designed to mislead – but more convincing examples could sway voters