SAN FRANCISCO, Sept 17 — Meta on Tuesday announced the creation of “Teen Accounts,” designed to better protect underage users from the dangers associated with Instagram.
Many experts and authorities accuse the hugely popular photo-sharing app of damaging the mental health of its youngest users through addiction, bullying, and body-image and self-esteem issues.
“'Teen Accounts' is a significant update, designed to give parents peace of mind,” Antigone Davis, Meta vice-president in charge of safety issues, told AFP.
Under the new policy, users aged 13 to 17 will have private accounts by default, with tighter safeguards on who can contact them and what content they can see.
Thirteen to 15-year-olds who want a more public profile and fewer restrictions — because they want to become influencers, for example — will need to obtain permission from their parents. The new rules apply to both existing and new users of the platform.
“This is a big change. It means making sure that we do this really well,” Davis said.
Three billion IDs
For the past year, pressure has been building across the globe against the social media giant founded by Mark Zuckerberg and its rivals.
Last October, some forty US states filed a complaint against Meta's platforms, accusing them of harming the “mental and physical health of young people,” citing risks of addiction, cyber-bullying and eating disorders.
Australia, meanwhile, will soon set a minimum age for social media use at between 14 and 16.
For the time being, Meta refuses to verify the age of all of its users, citing privacy concerns.
“When we have a strong signal that someone's age is wrong, we're going to ask them to verify their age, but we don't want to make three billion people have to provide IDs,” Davis said.
In her opinion, it would be simpler and more effective if age checks were carried out at the level of the smartphone's mobile operating system, i.e. Google's Android or Apple's iOS.
“They actually have significant information about the age of users. And if they were to share that broadly across all the apps that teens use, that would provide peace of mind for parents.”
It's not clear whether the new protections will be enough to reassure governments and online safety advocates, such as Matthew Bergman, founder of the Social Media Victims Law Center.
“Instagram is addictive. Instagram leads kids down dangerous rabbit holes, where they are shown not what they want to see, but what they can't look away from,” he said.
His group represents 200 parents whose children committed suicide “after being encouraged to do so by videos recommended by Instagram or TikTok.”
Bergman points to the many cases where young girls have developed serious eating disorders.
Meta now prevents the promotion of extreme diets on its platforms, among other measures taken in recent years.
These measures are “baby steps, but nevertheless, steps in the right direction,” he told AFP.
In his view, all that's needed is for the companies to make their platforms less addictive — “and therefore a little less profitable.”
This can be done without the platforms losing their quality for users, he said. — AFP