In news sure to crush the souls of writers everywhere, it seems that ChatGPT and other artificial intelligence tools write just as well as humans do—at least, to a level that the average American adult finds indistinguishable.
The results come from ToolTester, a site for finding the best web builders, hosts, and e-commerce tools. It ran two surveys. The first, conducted in late February 2023 with 1,920 American adults, compared 75 pieces of text that were written by humans, generated by AI, or generated by AI and then edited by humans; the AI used was ChatGPT running GPT-3.5. After GPT-4 launched, ToolTester surveyed another 1,394 people in late March on the same queries and topics, this time with fresh AI-generated copy produced from the same prompts.
Over half the respondents thought the GPT-3.5 copy was written by a human. That figure rose to 63.5% with GPT-4. In other words, copy from GPT-4 (the model behind the paid version of ChatGPT) is at least 16.5% more convincing than copy created with the older GPT-3.5.
The AI copy can be harder to detect depending on the type of writing. Health-related articles generated by AI, for example, are the best at duping people into thinking humans wrote them. ToolTester's breakdown shows the percentage of respondents who believed each piece of ChatGPT-generated text came from an AI, a human, or an AI edited by a human.
The more familiar people are with using AI tools, the less likely they are to be fooled, though even they are misled 48% of the time. Among those who say they have never used a generative AI tool, the ability to correctly identify AI writing drops to 40.8%, meaning they are fooled nearly 60% of the time.
The wisdom that comes with age also helps. Those aged 65 and older are the most likely to correctly ID AI-generated content, whereas 18- to
Read more on pcmag.com