The European Commission has opened an investigation into the owner of Facebook and Instagram over concerns that the platforms are creating addictive behaviour among children and damaging mental health.
The EU executive said Meta may have breached the Digital Services Act (DSA), a landmark law passed by the bloc last summer that makes digital companies large and small liable for disinformation, shopping scams, child abuse and other online harms.
“Today we open formal proceedings against Meta,” the EU commissioner for the internal market, Thierry Breton, said in a statement. “We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram.”
The investigation will explore the platforms’ potentially addictive effects, known as “rabbit hole” effects, where an algorithm feeds young people negative content, such as material promoting unrealistic body images. It will also look at the effectiveness of Meta’s age verification tools and its privacy protections for minors. “We are sparing no effort to protect our children,” Breton said.
A Meta spokesperson said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
Last month the commission opened an inquiry into Meta under the DSA over its handling of political content amid concerns that it was not doing enough to counter Russian disinformation before the EU elections in June.
Under the DSA, platforms are obliged to protect the privacy and safety of children. Following a preliminary investigation, EU officials are concerned that Facebook and Instagram “may exploit the weaknesses and inexperience of minors and cause addictive behaviour”.
They are also sceptical about the platforms’ age verification tools. Users are meant to be at least 13 years old to open an account on Facebook or Instagram.
One official said it was “so obviously easy to circumvent some controls” that the commission wanted to know how Meta had ever assessed these measures as effective and appropriate.
An EU official said on Thursday that the commission wanted to use the bloc’s European digital identity wallet for age verification. The wallet, which is still at the testing stage, is intended to make it easier for people across the 27-country union to prove who they are, whether they are opening a bank account, applying to university or applying for a job.
The commission has also begun two investigations into TikTok, which led the Chinese-owned video-sharing platform to voluntarily withdraw its TikTok Lite reward-to-watch service in France and Spain last month.
This followed the launch of DSA proceedings against X over alleged hate speech and against the online commerce site AliExpress over its advertising transparency and complaint handling.
The latest Meta investigation is similar to the TikTok case as both are examining the potentially addictive nature of online platforms. Breton has previously said the TikTok Lite service could be “as toxic and addictive as cigarettes”.
The DSA, which became fully applicable in February to platforms operating in Europe, was intended to force powerful online platforms that were “too big to care” to take responsibility for online safety.
If the commission is not satisfied with Meta’s response, it can impose a fine of up to 6% of the company’s global annual turnover. More immediately, it can carry out on-site investigations and interview company executives; no deadline has been publicly fixed for completing the investigation.