Meta is taking steps to stop misinformation during the Israel-Hamas conflict


Following Hamas’ terrorist attacks on Israel, misinformation has flooded social media apps like Instagram and Facebook, with users supporting one side or the other. The conflict has made moderation even more difficult for Meta, which says it has seen a substantial surge in misinformation and has removed a corresponding wave of content in Hebrew and Arabic that violated its Dangerous Organizations and Individuals policy.

While the company did not explicitly reference the European Union (EU) or its Digital Services Act, Meta’s announcement aligns with an open letter from European Commissioner Thierry Breton highlighting the need to combat disinformation and illegal content on social media apps. The commissioner has sent similar letters to other social media companies, including X, stressing the urgent need to address the situation.

Meta’s measures

In response to the spread of misinformation, Meta has implemented various measures, including the creation of a dedicated Special Operations Center staffed with experts proficient in Hebrew and Arabic and the Israel-Hamas conflict. This specialized team aims to bolster the company’s capability to promptly identify and address content that violates its policies, including violent and graphic content, hate speech, harassment, and activities promoting harm.

Additionally, Meta says it took down over 795,000 pieces of content for policy violations in the three days following October 7th. The company has also banned Hamas, the group behind the attacks, from all of its platforms.

Meta is also blocking certain hashtags and prioritizing reports related to the crisis on Facebook and Instagram Live. Finally, the company is “temporarily expanding” its violence and incitement policy, removing posts that identify hostages even when the intent is to raise awareness.

Despite these efforts, members of Meta’s Trusted Partner Program, which flags content for removal, have raised concerns about the company’s slow response to reports, which can sometimes take months.
