Instagram has rolled out new features aimed at increasing parental control and enhancing safety for teenagers on the platform. These updates, announced on Tuesday, include several changes designed to restrict how teens use the app and what they encounter online. Despite these efforts, critics argue that the updates may not fully address concerns regarding adolescent safety and well-being.
Meta, the company behind Instagram, has introduced a “teen accounts” program. This initiative includes new measures to limit screen time, control the content minors can view, and manage interactions with strangers. The program also expands parental monitoring options, allowing parents to oversee their children's activity on the platform more closely.
Antigone Davis, Meta's global head of safety, explained the rationale behind the changes. “We are altering the experience for millions of teenagers using our app,” Davis said. “We are rethinking the online parent-child dynamic in response to feedback from parents about their preferences and needs,” she added, as reported by The Washington Post.
Under the new guidelines, Instagram will make all accounts belonging to users under 18 private by default. Both new and existing teen accounts will require the account holder's approval before new followers can view, like, or comment on their posts, a move aimed at strengthening privacy for younger users.
The update also mutes notifications for teenagers between 10 p.m. and 7 a.m. Additionally, the platform will restrict sensitive content, such as nudity and discussions of self-harm, and block direct messages from users the account holder does not follow.
Another significant change is the introduction of content themes, which let teens select areas of interest to shape what they see on the platform.