A coalition of attorneys general is calling on Congress to study how AI is used to exploit children and to find ways to crack down on the practice.
In short, AI is being used to create child sexual abuse material (CSAM). Attorneys general from all 50 US states, along with Washington, D.C., and three US territories, want Congress to create an expert commission to study how AI can be used to exploit and harm children.
The attorneys general commended Congress for its existing work to regulate AI misuse in other sectors but said lawmakers also need to focus more on child exploitation. Current laws prohibit child exploitation online, but the attorneys general say those restrictions should be expanded to cover AI-generated content specifically, making such crimes easier to prosecute.
As the AGs note, "AI has the potential to be used to identify someone’s location, mimic their voice, and generate deepfakes." The letter cites one instance in which scammers faked a kidnapping using an AI-generated version of a child's voice and another in which AI-generated, sexually explicit deepfakes were used to extort victims.
They acknowledge that these crimes are already possible with tools like Photoshop. However, AI puts the creation of such content within reach of far less skilled users, resulting in quicker access and a greater potential for abuse.
“We are engaged in a race against time to protect the children of our country from the dangers of AI. Indeed, the proverbial walls of the city have already been breached. Now is the time to act," the attorneys general wrote in the letter. "This will ensure prosecutors have the tools they need to protect our children."