Musk given 24 hours to address graphic images of Hamas attacks

MicroWave@lemmy.world to World News@lemmy.world – 581 points –
politico.eu

Elon Musk has until the end of Wednesday to respond to demands from Brussels to remove graphic images and disinformation linked to the violence in Israel from his social network X — or face the full force of Europe's new social media rules.

Thierry Breton, the European Union commissioner who oversees the bloc's Digital Services Act (DSA) rules, wrote to the owner of X, formerly Twitter, to warn Musk of his obligations under the bloc's content rules.

If Musk fails to comply, the EU's rules state X could face fines of up to 6 percent of its revenue for potential wrongdoing. Under the regulations, social media companies are obliged to remove all forms of hate speech, incitement to violence and other gruesome images or propaganda that promote terrorist organizations.

Since Hamas launched its violent attacks on Israel on October 7, X has been flooded with images, videos and hashtags depicting — in graphic detail — how hundreds of Israelis have been murdered or kidnapped. Under X's own policies, such material should also be removed immediately.

Traumatizing ≠ making people take war seriously.

Believe it or not, journalism and educating people involve much more than uploading graphically disturbing images to some website and leaving them there.

> Traumatizing ≠ making people take war seriously.

That's actually not true, and most people who watch these videos aren't 'traumatized', so it's not really an argument.

> Believe it or not, journalism and educating people involve much more than uploading graphically disturbing images to some website and leaving them there.

Who said it isn't? They should include the footage with their articles. That way people can see for themselves instead of just being told.

If they don't want to look, then there should be explicit content warnings.

There have been plenty of studies about gore and death content suggesting it can cause trauma similar to PTSD, and some people are affected more than others. On X right now, you're pretty likely to be presented with extremely violent images if you go looking for information about what is happening, so you can't really avoid it except by avoiding X entirely. Plenty of these images and videos aren't even related to this conflict, and are just misinformation or ragebait.

Can you show me what studies you're talking about?

I have a feeling you're referring specifically to studies that focus on people who are paid to moderate this content. If you share what studies you're talking about we can know for sure.

You really don't need to look further than the clinical data on PTSD. A sufficient amount of any form of trauma can cause mental health issues including but not limited to PTSD. Watching an execution video has a large potential to cause a severe trauma response, especially if the victims are people you know or love, or are members of your community.

There are plenty of real-world examples of content moderation teams at social media companies suffering from their exposure to extreme content.

Traumatizing people is one of the core goals of terrorism, because it does damage.

Thanks for not linking to a single study like I asked.

Sorry, but I won't take you seriously until you do. You mentioned 'studies.' Show us those studies.

I've seen studies that disprove your studies.