Driverless cars reportedly can't detect children and darker-skinned pedestrians as well as they can lighter-skinned adults.
Researchers at King’s College London tested eight AI-powered pedestrian-detection systems against more than 8,000 images and found that detection accuracy was roughly 20% higher for adults than for children. The study also found the software was about 7.5% more accurate for lighter-skinned pedestrians than for those with darker skin.
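To make that kind of comparison concrete, here is a minimal sketch, not the study's actual code, of how per-group detection rates can be scored once each labeled pedestrian carries a group annotation. The group names, box format, IoU threshold, and toy data below are all illustrative assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def detection_rates(ground_truth, predictions, iou_thresh=0.5):
    """Fraction of labeled pedestrians matched by a prediction, per group.

    ground_truth: list of (image_id, box, group) tuples
    predictions:  dict mapping image_id -> list of predicted boxes
    """
    hits, totals = {}, {}
    for image_id, box, group in ground_truth:
        totals[group] = totals.get(group, 0) + 1
        matched = any(iou(box, p) >= iou_thresh
                      for p in predictions.get(image_id, []))
        hits[group] = hits.get(group, 0) + matched
    return {g: hits.get(g, 0) / totals[g] for g in totals}

# Toy example: the detector finds the adult but misses the child.
gt = [("img1", (10, 10, 50, 120), "adult"),
      ("img1", (200, 40, 230, 100), "child")]
preds = {"img1": [(12, 11, 49, 118)]}
print(detection_rates(gt, preds))  # {'adult': 1.0, 'child': 0.0}
```

A gap between the per-group rates, aggregated over thousands of images, is the kind of disparity the study reports.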
The researchers attribute much of the problem to training data: the AI models learn from photo sets that feature more light-skinned people than dark-skinned people, so the software becomes better at detecting the pedestrians it has seen most often.
“While the impact of unfair AI systems has already been well documented, from AI recruitment software favoring male applicants to facial-recognition software being less accurate for black women than white men, the danger that self-driving cars can pose is acute,” Dr. Jie Zhang, one of the scientists who performed the study, said in a statement. “Before, minority individuals may have been denied vital services, now they might face severe injury.”
The study also found that the bias against darker-skinned pedestrians grows significantly worse in low-contrast and low-brightness scenes, suggesting that AI-powered pedestrian-detection systems could be even more dangerous at night.
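As a rough illustration of how such a lighting effect can be measured, the hypothetical sketch below buckets detection outcomes by each image's mean brightness; the bucket edges and sample data are invented for the example, not taken from the paper.

```python
def miss_rate_by_brightness(records, edges=(85, 170)):
    """records: list of (mean_brightness_0_to_255, was_detected) pairs."""
    buckets = {"low": [0, 0], "medium": [0, 0], "high": [0, 0]}
    for brightness, detected in records:
        name = ("low" if brightness < edges[0]
                else "medium" if brightness < edges[1] else "high")
        buckets[name][0] += not detected   # count misses
        buckets[name][1] += 1              # count totals
    return {k: (m / n if n else None) for k, (m, n) in buckets.items()}

# Invented outcomes: every miss happens in the darkest bucket.
records = [(40, False), (60, False), (120, True), (200, True), (210, True)]
print(miss_rate_by_brightness(records))
# {'low': 1.0, 'medium': 0.0, 'high': 0.0}
```

Splitting the same analysis by skin tone within each brightness bucket would show whether the darkness penalty falls unevenly across groups, which is the pattern the study describes.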
Car manufacturers don’t release details of the pedestrian-detection software they use. But the researchers say those systems are typically built on the same open-source models they tested in their study.
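For a sense of what such open-source systems look like in practice, the snippet below runs one widely available pretrained detector, torchvision's Faster R-CNN, and keeps only confident "person" boxes. This is a representative sketch under the assumption that torchvision is installed, not a reproduction of the study's exact models or settings.

```python
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

# Load a COCO-pretrained detector and put it in inference mode.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

# Random stand-in for a real street image (3 channels, values in [0, 1]);
# a real evaluation would load dashcam or benchmark images here.
image = torch.rand(3, 480, 640)

with torch.no_grad():
    output = model([weights.transforms()(image)])[0]

# COCO class 1 is "person"; keep reasonably confident pedestrian boxes.
person = (output["labels"] == 1) & (output["scores"] > 0.5)
print(output["boxes"][person])
```

Because models like this can be downloaded and run by anyone, researchers can probe them for demographic performance gaps even without access to any carmaker's proprietary stack.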
Read more on pcmag.com