The warnings have grown louder and more urgent as 2024 approaches: The rapid advance of artificial intelligence tools threatens to amplify misinformation in next year's presidential election at a scale never seen before.
Most adults in the U.S. feel the same way, according to a new poll from The Associated Press-NORC Center for Public Affairs Research and the University of Chicago Harris School of Public Policy.
The poll found that nearly 6 in 10 adults (58%) think AI tools — which can micro-target political audiences, mass-produce persuasive messages, and generate realistic fake images and videos in seconds — will increase the spread of false and misleading information during next year's elections.
By comparison, 6% think AI will decrease the spread of misinformation while one-third say it won't make much of a difference.
“Look what happened in 2020 — and that was just social media,” said 66-year-old Rosa Rangel of Fort Worth, Texas.
Rangel, a Democrat who said she had seen a lot of “lies” on social media in 2020, said she thinks AI will make things even worse in 2024 — like a pot “brewing over.”
Just 30% of American adults have used AI chatbots or image generators, and fewer than half (46%) have heard or read at least something about AI tools. Still, there's a broad consensus that candidates shouldn't be using AI.
When asked whether it would be a good or bad thing for 2024 presidential candidates to use AI in certain ways, clear majorities said it would be bad for them to create false or misleading media for political ads (83%), to edit or touch up photos or videos for political ads (66%), to tailor political ads to individual voters (62%), and to answer voters' questions via chatbot (56%).
The sentiments are supported by majorities