The European Commission has preliminarily found that Meta breached EU law by failing to keep children under the age of 13 off its platforms. The company’s handling of child safety has come under increasing scrutiny.
The commission announced Wednesday that a preliminary investigation had concluded that Meta violates the EU’s digital services law because the minimum age requirement of 13 on Instagram and Facebook is not adequately enforced.
According to the commission, minors can enter false dates of birth when creating accounts, and there are no controls in place to verify this.
Additionally, the commission said the tool for reporting minor accounts is “difficult to use” and requires up to seven clicks to access the form. Even when accounts of minors are reported, the commission found, there is often no appropriate follow-up or action taken to remove minors from the platform.
“We believe that Instagram and Facebook need to change their risk assessment methodologies to assess what risks they pose and how they manifest in the European Union,” the commission said in a statement.
“We disagree with these preliminary findings,” a Meta spokesperson told CNBC. “It’s clear that Instagram and Facebook are targeted at users 13 and older, and we take steps to detect and remove accounts under that age.”
“We continue to invest in technology to find and remove underage users, and will provide further information next week about additional measures that will soon be rolled out. Age tracking is an industry-wide challenge and requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue.”
Meta can now review the commission’s preliminary findings and respond in writing.
If the commission’s preliminary findings are confirmed in the final investigation, Meta could be fined up to 6% of its annual worldwide turnover.
The commission’s preliminary findings follow two high-profile U.S. court rulings in March, one finding that aspects of the company’s platform design contributed to addiction and mental health harm among teenagers, and the other concluding that the company misled users about the safety of children on its platform.
