The European Commission has given TikTok and Meta one week to explain what measures they have taken to contain, counter, and eliminate the spread of terrorist and violent content and hate speech on their platforms (via Reuters).
The Commission sent the formal request to the two companies after researchers pointed to a proliferation of disinformation following Hamas’ attack on Israel.
If the companies fail to provide satisfactory explanations and cannot make a compelling case that this will not happen again, the EU’s executive body can open investigations that may quickly result in fines.
Under the new online content rules known as the Digital Services Act (DSA), which recently came into force, major online platforms such as TikTok and Meta’s Facebook and Instagram are required to do more to take down illegal and harmful content or risk fines of up to 6% of their global turnover.
Meta and TikTok “must provide the requested information to the Commission by 25 October 2023 for questions related to the crisis response and by 8 November 2023 on the protection of the integrity of elections”, the Commission said.
It’s nice that the EU doesn’t want to see “terrorist, violent content” on TikTok and Facebook. But what about pro-terrorist, violent acts and protests in real life, on the streets of major European cities? Who do you impose a 6% fine on then?