Over the past few years, Meta has faced heavy criticism from parents concerned about child safety on its platforms. Now, to protect its teenage user base, Meta is testing a nudity protection feature for Instagram's direct messages that detects and blurs nude images shared by other users. The feature is meant to keep users safe from potential scams and harmful content on Instagram. The move follows sustained pressure from the US and European governments. Read on to know how the nudity protection feature on Instagram will work.
Meta is protecting teens from sextortion and intimate image abuse by introducing new features to Instagram. Sextortion is a fast-growing scam that involves sexual blackmail: threatening to share a victim's intimate images unless demands are met. According to the Meta blog, the upcoming feature will make it harder for scammers and criminals to contact and manipulate teen users. The company is also working on new measures to help young people recognise sextortion scams and understand how to protect themselves. Currently, Instagram is testing nudity protection in DMs, which automatically blurs an image when nudity is detected. The feature uses on-device machine learning to detect and analyse images sent in a DM on Instagram.
Meta highlighted that once the feature is fully ready, it will be turned on by default for users under the age of 18, while adults will see a notification encouraging them to enable it for their own safety. Meta said, “Because the images are analysed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta won't have access to these images – unless someone chooses to report them to us.” The tool is expected to roll out in the coming weeks, making Meta's platforms a safer space for its teenage users.
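Meta has not published implementation details, but the on-device flow it describes can be sketched as a local classifier followed by a conditional blur. The Python snippet below is a minimal illustration only; the nudity_score stub, the threshold, and the Pillow-based blur are assumptions for the sake of the example, not Meta's actual code.

```python
from PIL import Image, ImageFilter

# Hypothetical threshold: scores above this are treated as nudity.
NUDITY_THRESHOLD = 0.8

def nudity_score(image: Image.Image) -> float:
    """Stand-in for an on-device classifier.

    Meta has not disclosed its model; a real implementation would run a
    compact image-classification network locally and return a probability.
    """
    return 0.0  # placeholder: always "safe" in this sketch

def protect_incoming_image(path: str) -> Image.Image:
    """Blur an incoming DM image when the local classifier flags nudity."""
    image = Image.open(path)
    if nudity_score(image) >= NUDITY_THRESHOLD:
        # Heavy Gaussian blur so the recipient sees only an obscured preview.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```

Running the check locally is what lets such a feature coexist with end-to-end encryption: the image never has to leave the device for analysis, so the blur decision can be made without Meta's servers ever seeing the content.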