The ESRB recently filed a request with the FTC seeking approval for a "verifiable parental consent mechanism" called Privacy-Protective Facial Age Estimation, which would enable people to use selfies to prove that they are actually adults who can legally provide parental consent on behalf of their children. It struck me and many others as not a great idea—literally inviting Big Brother into your home and all that—but in a statement sent to PC Gamer, the ESRB said the system is not actually facial recognition at all, and is "highly privacy protective."
The filing, submitted jointly by the ESRB, digital identity company Yoti, and "youth digital media" company SuperAwesome, was made on June 2 but only came to light recently thanks to the FTC's request for public comment. It describes a system in which parents can opt to submit a photo of themselves through an "auto face capture module," which would then be analyzed to estimate the age of the person in question. Assuming an adult is detected, they could then grant whatever permissions they feel are appropriate for their children.
It's basically a photo-verified age gate, then, not terribly different from showing your driver's license to the guy behind the counter before you buy booze—except that the guy behind the counter is a faceless machine, and you're not flashing government-issued ID, you're handing over a live image snapped within the confines of your own home. At a time when corporate interests around the world are racing to develop increasingly complex AI systems, while experts are warning us about the dangers inherent in that race, the idea of willingly submitting one's face for machine analysis understandably raised some hackles.
There were some misunderstandings, however, which the ESRB has since sought to clear up.
Read more on pcgamer.com