The European Commission has released a proposal intended to protect children from sexual abuse, but critics say it could undermine the privacy of everyone who uses popular online services, even those secured with end-to-end encryption.
The commission says it wants to establish new rules requiring companies to scan their platforms for child sexual abuse material (CSAM) as well as "grooming," which it defines as "the solicitation of children," in order to help prevent children from being exploited.
"With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive," the commission says. "The COVID-19 pandemic has exacerbated the issue, with the Internet Watch foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year."
If that sounds familiar, it's probably because Apple planned to roll out automated CSAM detection in 2021. That plan proved controversial because it meant trading user privacy for a scanning program that, despite Apple's assurances, wouldn't necessarily be limited to CSAM.
The European Commission's proposal could be even more invasive because it would require companies to scan the contents of text-based messages in addition to checking for known CSAM. That requirement has the potential to make end-to-end encryption all but meaningless.
"This document is the most terrifying thing I’ve ever seen," Johns Hopkins University professor Matthew Green tweeted. "It is proposing a new mass surveillance system that will read private text messages, not to detect CSAM, but to detect 'grooming'."
The commission seems to be
Read more on pcmag.com