Ireland was the first member state to trigger an alert under the EU's powerful new legislation on online hate speech in the immediate aftermath of last week’s riots in Dublin, RTÉ News has learned.

As a result of the move, the European Commission became directly involved in contacting social media giants such as Meta, YouTube, X (formerly Twitter), TikTok, Instagram and others within hours of the riots, to warn them of their obligations under the new legislation.

EU sources have also said that the shortage of Irish-speaking moderators at some of the biggest social media firms may have led far-right activists to post deliberately in Irish in order to avoid having their content removed.

X, YouTube and TikTok have all reported employing no, or very few, Irish-language content moderators.

"In the case of the riots in Dublin we saw that those spreading hatred, illegal and harmful content 'exploited’ the lack of Gaelic speaking moderators," said a Commission official.

"This is an issue on X and Google. Meta instead was much better prepared."

Under the new Digital Services Act (DSA), large social media platforms in particular must be much more vigilant in combating hate speech and harmful content online.

The DSA requires each member state to set up a national regulator - in Ireland's case Coimisiún na Meán - to police large online platforms and marketplaces more robustly over harmful content.

National regulators can order social media companies to take down illegal content, while the Commission can get involved if companies are flagrantly evading their responsibilities.

In a statement the day after the riots, Coimisiún na Meán reminded tech companies that under the DSA they were "obliged to assess and mitigate a series of risks from the use of their services, including negative effects for public security.

"They are also obliged to notify law enforcement authorities if they become aware of information about an actual or potential criminal offence involving public safety."

By triggering an alert under the DSA, the Irish authorities ensured that the European Commission became directly involved in warning the digital platforms of their obligations under the new Europe-wide legislation.

"The incident protocol under the DSA was launched for the first time - by the Irish authorities - and it worked," said a Commission official.

"This enabled three-way discussions between the Irish authorities, the Commission and the platforms.

"We see that platforms are more responsive when the Commission is part of the discussion. We also saw this in the context of exchanges with [social media] platforms prior to elections recently (ie, Slovakia and the Netherlands)."

Under the DSA, online platforms with more than 45 million monthly active users in the EU are obliged to publish transparency reports on how they are complying with the legislation.

Meta Platforms Ireland Ltd published its first report on October 27.

It showed that the company was subject to 12 so-called "authority orders" to provide information on content that was potentially illegal.

Meta’s report also showed that it employed 42 Irish-language content moderators.

By contrast, X’s transparency report did not spell out how many Irish-speaking content moderators it employed, but it admitted that the company collected "very little data" on Irish-language tweets.

In Google's transparency report for YouTube, also published on October 27, the company revealed that it employed no Irish-language moderators to monitor content on the platform.

TikTok’s transparency report also showed that the company employed no Irish-speaking content moderators.

It’s understood Irish officials will hold another meeting with the European Commission tomorrow.