YouTube's algorithms have a mind of their own, according to new research from the Mozilla Foundation. The nonprofit this week reported that YouTube's "Dislike" and "Not interested" buttons do little to curb unwanted recommendations, misinformation, or violent content. In fact, they often frustrate and confuse users.
Two years ago, amid a global pandemic and ahead of the 2020 US election, Mozilla launched the RegretsReporter browser extension to crowdsource research into what it called YouTube's "recommendation problem."
More than 20,000 participants and 500 million videos later, the Foundation today published its findings about YouTube's "ineffective user controls." Spoiler alert: They don't work.
A majority of people (62%) said the buttons—Dislike, Don't recommend channel, Not interested, and Remove from watch history—"did nothing" or had "mixed results" when it came to steering the algorithm's recommendations. One user who asked YouTube to stop suggesting firearm videos was later recommended more gun content, while another asked for an end to cryptocurrency get-rich-quick clips, only to be offered more crypto videos.
"Our study found that YouTube's user controls have a negligible impact on preventing unwanted recommendations, leaving people at the mercy of YouTube's recommender system," Mozilla data scientist Jesse McCrosky said in a statement. "As a result, YouTube continues to recommend videos that people have clearly signaled they do not want to see, including war footage and gruesome horror clips."
Of the four controls, the most effective, according to the study, is "Don't recommend channel," which, compared to not employing the buttons at all, prevents
Read more on pcmag.com