Ireland Probes TikTok And LinkedIn Over Suspected Breaches Of EU Digital Services Rules
Ireland’s media regulator, Coimisiún na Meán, has turned its attention to TikTok and LinkedIn as concerns grow over how major platforms allow users to report illegal content.
The two investigations, announced this week, follow the regulator’s recent scrutiny of Elon Musk’s X—marking a decisive phase in Ireland’s enforcement of the European Union’s Digital Services Act (DSA).
Are TikTok And LinkedIn Making It Too Hard To Flag Illegal Content?
At the centre of the probes are allegations that both platforms may be failing to meet DSA requirements for accessible, anonymous and user-friendly reporting systems.
Officials believe the tools provided by TikTok and LinkedIn might involve “deceptive interface designs,” or dark patterns, that could mislead users during the reporting process.
According to the regulator, these designs may cause people to mistakenly submit a report under platform rules rather than flagging suspected illegal material—an important distinction in the DSA framework.
Digital Services Commissioner John Evans said:
“Providers are also obliged to not design, organise or operate their interfaces in a way which could deceive or manipulate people, or which materially distorts or impairs the ability of people to make informed decisions.”
The investigation will examine whether both platforms allow anonymous reporting of child sexual abuse material (CSAM), which the DSA requires, and whether their reporting tools are genuinely easy to access.
What Powers Does Ireland Have Under The DSA?
The DSA divides enforcement responsibility between the European Commission and the regulator of the country where a platform is headquartered.
Because both TikTok (owned by ByteDance) and LinkedIn (owned by Microsoft) base their EU operations in Ireland, Coimisiún na Meán holds authority over several of their compliance obligations.
Platforms found in breach of the law can face penalties of up to 6% of their global annual revenue.
The watchdog has already pressured other companies into revamping their illegal content reporting systems, signalling a stricter regulatory environment.
This is also not the first time TikTok and LinkedIn have faced action from Irish authorities.
TikTok was fined €530 million in May 2025 for violating GDPR, while LinkedIn was fined around €310 million by Ireland’s Data Protection Commission in 2024 over GDPR breaches tied to its advertising practices.
A TikTok spokesperson said the company “will review it in full and engage with (the regulator) as required.”
X Already Under Irish Scrutiny For Moderation Failures
The investigations into TikTok and LinkedIn follow Coimisiún na Meán’s November 2025 decision to open a DSA probe into X over claims it is failing to remove user-reported illegal content.
The regulator is examining whether X’s internal complaint-handling system meets DSA standards, including the right to appeal moderation decisions.
Henna Virkkunen, the European Commission’s Executive Vice-President for Tech Sovereignty, Security and Democracy, noted that while automated moderation is allowed, “online platforms must be transparent about its use and accuracy.”
The inquiry is supported by nonprofit group HateAid, which previously acted on behalf of a researcher who had been repeatedly banned from X.
Separately, Ireland’s Data Protection Commission is investigating X over concerns it trained its Grok AI model on user posts—an alleged GDPR breach that could carry fines of up to 4% of global revenue.
Are More Enforcement Cases Coming For Big Tech?
With the DSA now firmly in force, regulators across Europe appear increasingly prepared to challenge major online platforms.
Ireland, home to many tech giants’ EU headquarters, is becoming the focal point for enforcement.
While the European Commission oversees most action against very large platforms, national regulators like Coimisiún na Meán hold growing influence over specific obligations such as reporting systems.
A Digital Future Built On Accountability Or Avoidance
Coinlive observes that these investigations raise deeper questions about how platforms shape user behaviour—intentionally or not.
If reporting illegal content becomes confusing, hidden or discouraging, the result is a platform ecosystem that appears safe on the surface while quietly weakening the protections users rely on.
The DSA aims to restore balance, but its effectiveness will depend on regulators staying firm and platforms embracing responsibility rather than relying on clever interface design.
This clash between regulatory expectations and business-driven UX choices may well define the next phase of Europe’s online governance.