Brussels: The European Union on Friday said Meta and TikTok had breached their transparency obligations after an investigation that could result in billions of dollars in fines.
The inquiry found that both companies had violated the Digital Services Act, the EU’s trailblazing digital rule book, which imposes strict requirements designed to keep internet users safe online. These include making it easier to report counterfeit or unsafe goods, making it easier to flag harmful or illegal content such as hate speech, and a ban on ads targeted at children. “We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society,” said Henna Virkkunen, the EU’s executive vice president for tech sovereignty, security and democracy, in a post on X. “Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.”
The 27-nation bloc launched investigations into both Meta and TikTok in 2024. The investigations found that neither company gave researchers easy access to its data, and that Meta’s Instagram and Facebook did not make it easy for users to flag illegal content or to effectively challenge moderation decisions.
“Allowing researchers access to platforms’ data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health,” the European Commission said in a statement. The inquiry also found that Facebook and Instagram deployed “dark patterns,” or deceptive interface designs, in their procedures for flagging illegal content such as child sexual abuse or terrorist content.