Attorneys general for 44 states and territories have come out in opposition to Facebook’s plans for a version of Instagram for children under 13.
Facebook has been planning to roll out a version of Instagram tailored to children under the age of 13, a group that enjoys special protection under the law. The company is believed to be in the early stages of planning, and no concrete timeline has been announced.
Nonetheless, AGs for Massachusetts, Nebraska, Vermont, Tennessee, Alaska, California, Connecticut, Delaware, District of Columbia, Guam, Hawaii, Idaho, Illinois, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Mississippi, Missouri, Montana, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, Northern Mariana Islands, Ohio, Oklahoma, Oregon, Puerto Rico, Rhode Island, South Carolina, South Dakota, Texas, Utah, Virginia, Washington, Wisconsin and Wyoming are voicing their opposition.
In a letter to Facebook CEO Mark Zuckerberg, the AGs outlined their concerns, not the least of which was the impact early exposure to social media has on young minds.
The letter's first concern is that research increasingly demonstrates social media can be harmful to the physical, emotional, and mental well-being of children. "In the last decade, increasing mental distress and treatment for mental health conditions among youth in North America has paralleled a steep rise in the use of smartphones and social media by children and adolescents." Research shows a link between young people's use of social media and the "increase in mental distress, self-injurious behavior and suicidality among youth." In fact, an online-monitoring company tracking the activity of 5.4 million children found that "Instagram was frequently flagged for suicidal ideation, depression and body image concerns."
The second major concern is the risk of cyberbullying: the letter highlights that 42% of young Instagram users have experienced cyberbullying, the highest rate of any social media platform.
The AGs also took Facebook to task for its track record on protecting young users and their privacy.
Third, Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls. Reports from 2019 showed that Facebook's Messenger Kids app, intended for kids between the ages of six and 12, contained a significant design flaw that allowed children to circumvent restrictions on online interactions and join group chats with strangers who had not been approved by the children's parents. More recently, a "mistake" with Instagram's algorithm promoted diet content to users with eating disorders: the app's search function recommended terms including "appetite suppressants" and "fasting" to vulnerable people at risk of relapsing. These failures cast doubt on Facebook's ability to protect children on its proposed Instagram platform and to comply with relevant privacy laws such as the Children's Online Privacy Protection Act (COPPA).
It remains to be seen whether Facebook will change course or press ahead. If it does continue, it may face significant legal challenges given the opposition it is already encountering.