Attorneys general from 44 US states and territories have pressed Facebook to abandon its plans to release a version of Instagram designed for under-13s, warning that young children are “not equipped to navigate the challenges of having a social media account.”
The National Association of Attorneys General has written a letter to Mark Zuckerberg, Facebook’s chief executive, claiming that social media use can have detrimental effects on children’s health and wellbeing.
“Further, Facebook has historically failed to protect the welfare of children on its platforms,” the attorneys wrote. “The attorneys general have an interest in protecting our youngest citizens, and Facebook’s plans to create a platform where kids under the age of 13 are encouraged to share content online is contrary to that interest.”
Instagram confirmed it was working on a version of the app for children under 13 in March, with chief executive Adam Mosseri announcing the company did not have “a detailed plan yet” for how the app would work. Currently, children under the age of 13 are not allowed to sign up to the platform.
The attorneys cited research published in the CMAJ (Canadian Medical Association Journal) highlighting a link between young people’s use of social media and an increase in mental distress, self-harming and suicide-related behaviour. They added that an online monitoring firm had regularly flagged Instagram as hosting content linked to suicidal ideas, depression and body image concerns.
Privacy and children’s safety campaigners have voiced concerns that the new app may not be adequately equipped to help children navigate the web safely while protecting their privacy.
Ian Russell, whose 14-year-old daughter Molly took her own life in November 2017 after viewing suicide and self-harm content on Instagram, said it was “very hard” to see how its safety could be guaranteed.
Mr Russell said that to accept that Instagram and its parent company Facebook could make an app safe for children, “you have to ask the question, why haven’t they done more on the current platform?”
“If someone under the age of 13 finds their way onto the adult version of the platform, as they do now all too frequently, they then still wouldn’t be protected in the same way that Facebook claims they will be protecting under-13-year-olds on the new platform,” he told i in March.
“It’s fairly widely accepted that social media is designed to be addictive. I don’t see how you can square that with a young usership under the age of 13.”
The attorneys pointed out that Facebook Messenger Kids, the version of Facebook’s messaging app created specifically for children, contained a security flaw that allowed its users to join group chats with people outside the list of contacts pre-approved by their parents or carers.
“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” they said.
“In short, an Instagram platform for young children is harmful for myriad reasons. The attorneys general urge Facebook to abandon its plans to launch this new platform.”
A spokesperson for Facebook said the company had only just started exploring how to create a version of Instagram for children, and had committed not to show any adverts on it.
“We agree that any experience we develop must prioritise their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it. We also look forward to working with legislators and regulators, including the nation’s attorneys general,” the spokesperson told CNBC.