EU Commission Launches Formal Investigation Into Meta Over Child Safety on Facebook, Instagram

EU officials are also concerned about the age verification methods put in place by Meta.
A smartphone and a computer screen displaying the logos of the social network Facebook and its parent company Meta in Toulouse, southwestern France, on Jan. 12, 2023. (Lionel Bonaventure/AFP via Getty Images)
Katabella Roberts
5/17/2024 | Updated: 5/17/2024

The European Union has launched an investigation into Meta and its platforms Facebook and Instagram amid concerns that the tech giant is failing to protect children online, in possible breach of EU law.

The European Commission, the EU’s executive branch, said in a press release on Thursday that it had opened formal proceedings against Meta to assess whether the Mark Zuckerberg-founded company may have breached the EU’s Digital Services Act (DSA) as it pertains to protecting minors.

That act requires very large online platforms and search engines to put in place various measures to protect children online, including by preventing them from accessing inappropriate content and ensuring a high level of privacy and safety.

Failure to comply with the rules could result in such companies being fined up to 6 percent of their global annual revenue.

The commission said it is concerned that the systems behind both the Facebook and Instagram platforms, including the algorithms that recommend videos and posts, could lead to addiction among children and create so-called “rabbit-hole effects.”

Additionally, EU officials are concerned about the age verification methods put in place by Meta, which include requiring users to upload an image of their government-issued ID, record a video selfie, or ask mutual friends to verify their age.

Among other things, the investigation will focus on Meta’s compliance with DSA obligations regarding the assessment and mitigation of risks caused by the “design of Facebook’s and Instagram’s online interfaces,” which the commission said “may exploit the weaknesses and inexperience of minors and cause addictive behavior.”

“Such an assessment is required to counter potential risks for the exercise of the fundamental right to the physical and mental well-being of children as well as to the respect of their rights,” the commission noted.

‘Another Step to Ensure Safety’ for Young Online Users

The probe will also focus on Meta’s compliance with DSA requirements relating to mitigation measures to prevent young users from accessing inappropriate content, notably its age-verification tools, which the commission said “may not be reasonable, proportionate and effective.”

Additionally, the investigation will look into whether Meta complied with DSA obligations to ensure a high level of privacy, safety, and security for minors, particularly with regard to their default privacy settings.

The probe into Meta is based on a preliminary analysis of a risk assessment report sent by Meta in September 2023, the commission said.

In a statement on Thursday, Margrethe Vestager, executive vice president for a Europe Fit for the Digital Age, said the EU Commission had concerns that Facebook and Instagram “may stimulate behavioral addiction and that the methods of age verification that Meta has put in place on their services is not adequate.”

“Today we are taking another step to ensure safety for young online users,” Ms. Vestager said. “We want to protect young people’s mental and physical health.”

Elsewhere, Thierry Breton, the European commissioner for the internal market, said officials are “not convinced” that Meta has done enough to comply with its obligations under the DSA to mitigate the risks of negative physical and mental health effects of its platforms on young Europeans.

Mr. Breton promised an “in-depth” investigation into the potentially addictive nature of Meta’s platforms, as well as its age verification tools and the level of privacy afforded to minors.

“We are sparing no effort to protect our children,” he said.

Mark Zuckerberg, CEO of Meta, speaks to victims and their family members as he testifies during the US Senate Judiciary Committee hearing "Big Tech and the Online Child Sexual Exploitation Crisis" in Washington on Jan. 31, 2024. (Brendan Smialowski/AFP via Getty Images)

Meta’s Legal Challenges Mount

In a statement to The Epoch Times, a Meta spokesperson said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools, features and resources designed to protect them.

“This is a challenge the whole industry is facing, which is why we’re continuing to advance industry-wide solutions to age-assurance that are applied to all apps teens access. We look forward to sharing details of our work with the European Commission.”

The latest probe comes shortly after the European Union launched a separate investigation into whether the company was failing to counter disinformation ahead of EU elections in June.

In February, the commission also began probing video-sharing platform TikTok over potential non-compliance with the Digital Services Act and possible failures to address the negative effects of its app on young people.

As with the Meta probe, that investigation will focus on whether the platform, which is owned by Chinese parent company ByteDance, ensures safety and privacy for minors, whether its algorithms may stimulate addiction, and whether TikTok complies with various DSA requirements, including an obligation to conduct and submit a risk assessment report prior to deploying certain functionalities.

Prior to the latest probe, Meta came under scrutiny in the United States, where it has been the subject of multiple lawsuits, including one filed by the attorneys general of 33 states claiming the social media giant routinely ignored reports of underage users on its platforms and “coveted and pursued” the underage Instagram user demographic for years.

Meta said at the time that the complaint misrepresented its work over the past decade to bolster the safety of teenagers online, pointing to the various support tools it has in place for parents and young users.

Other lawsuits filed against Meta argue the company’s apps promote body dysmorphia and expose underage users to potentially harmful content.

In February, during a hearing on Capitol Hill, Mr. Zuckerberg apologized to the families and parents of children who had been harmed by his social media platforms, adding that Meta would continue investing in industry-wide efforts to bolster child protections online.