Meta faces EU investigation over child safety risks
Meta Platforms' (META.O) social media sites Facebook and Instagram will be investigated for potential breaches of EU online content rules relating to child safety, EU regulators said on Thursday, a move that could lead to hefty fines.
Tech companies are required to do more to tackle illegal and harmful content on their platforms under the European Union's landmark Digital Services Act (DSA), which kicked in last year.
The European Commission said it had decided to open an in-depth investigation into Facebook and Instagram due to concerns that Meta had not adequately addressed risks to children, despite a risk assessment report the company submitted in September.
"The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects'," the EU executive said in a statement.
"In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta." The regulator's concerns centre on children gaining access to inappropriate content.
Meta is already in the EU's crosshairs over election disinformation, a key concern ahead of crucial European Parliament elections next month. DSA violations can lead to fines of as much as 6% of a company's annual global turnover.