Researchers have identified limits to ChatGPT's ability to provide location-specific information about environmental justice issues, which they said may point to geographic biases in the chatbot. The researchers at Virginia Tech in the US asked ChatGPT to answer a prompt about the environmental justice issues in each of the 3,108 counties across the country. ChatGPT is a generative artificial intelligence (AI) tool, developed by OpenAI, that is trained on massive amounts of natural-language data. Such AI tools, also called large language models, can process, manipulate and generate textual responses based on users' requests, or "prompts".
The researchers found that while ChatGPT could identify location-specific environmental justice challenges in large, densely populated areas, the tool had limitations when it came to such local issues in many other counties.
They said that the AI model could provide location-specific information for only about 17 per cent, or 515, of the 3,108 counties it was asked about. Their findings are published in the journal Telematics and Informatics.
"We need to investigate the limitations of the technology to ensure that future developers recognise the possibilities of biases. That was the driving motivation of this research," said Junghwan Kim, assistant professor at the Virginia Tech University and the study's corresponding author.
The researchers said they chose environmental justice as the subject for the investigation to expand the range of questions typically used to test the performance of generative AI tools.
The US Department of Energy describes environmental justice as the "fair treatment and meaningful involvement of all people, regardless of race, colour, national origin or income" in the development, implementation and enforcement of environmental laws, regulations and policies.