As the US Senate examines the company’s lapses in child protection, Meta urges legislation mandating parental consent for app downloads.
On Wednesday, Meta urged US lawmakers to regulate Google and Apple’s app stores to better protect children. On the same day, the Senate opened an investigation into Meta’s own shortcomings in shielding children on its platforms.
Antigone Davis, Meta’s global head of safety, advocated for federal legislation in a blog post titled “Parenting in a Digital World Is Hard. Congress Can Make It Easier.” The proposed law would compel app stores to notify parents and seek their approval whenever a child aged 13 to 16 downloads an app. The blog post doesn’t explicitly name Google or Apple, but the two companies operate the world’s largest smartphone app stores, the Play Store for Android and the App Store for iOS, so any legislation regulating children’s app downloads would affect both platforms.
Davis contended that this would be a more effective approach to overseeing smartphone and internet use than laws requiring parental approval for a child to create a social media account. Utah, for instance, enacted a requirement in March that parents of individuals under 18 consent to their use of TikTok, Instagram, Facebook, and other apps, a measure intended to protect the mental health of young people, according to the state’s governor, Spencer Cox.
Davis’s plea coincided with a letter from the Senate judiciary committee to Mark Zuckerberg, Meta’s CEO, demanding documents relating to what senior executives knew about the mental and physical health risks associated with its platforms, including Facebook and Instagram. The letter set a deadline of November 30 for the documents. At the time of publication, neither Google nor Apple had issued statements.
Following the publication of this story, Meta released a statement expressing support for internet regulation, especially where young people are concerned. The company said, however, that it was worried about a patchwork of differing laws across US states producing inconsistent online experiences for teens, and maintained that it has backed legislation setting “clear industry standards” for parental supervision and age verification since 2021.
The Senate committee’s inquiry follows testimony from a former senior Meta employee who appeared before its members a week earlier. He described the harm Instagram could inflict on children, including his own daughter, and said Meta’s leadership dismissed his concerns when he raised them internally.