EU Investigating Musk’s X Platform Over Handling of Israel–Hamas War Content

By Ryan Morgan
December 18, 2023
The logo of social media platform X, formerly Twitter, on July 24, 2023. (Dado Ruvic/Illustration/Reuters)

The European Union is investigating X, the social media platform formerly known as Twitter, over its content moderation practices amid the ongoing Israel–Hamas conflict.

The European Commission announced on Monday that it’s specifically investigating whether X’s content moderation practices violate the EU’s Digital Services Act (DSA). In particular, the investigation will focus on whether X allowed illegal content to spread “in the context of Hamas’ terrorist attacks against Israel” after Hamas terrorists breached the Israel–Gaza barrier on Oct. 7 and killed hundreds of people.

Under existing EU laws, illegal online content spans a number of categories, including content that incites or otherwise contributes to terrorism, as well as hate speech and incitement to violence. The European Commission did not specify exactly what content shared on X after the Hamas attacks might qualify as illegal, but images of the attacks are known to have been captured and shared on the platform.

The European Commission said its investigation will also examine whether X’s Community Notes feature has been an effective tool in combating information manipulation on the platform.

Margrethe Vestager, the European Commission’s executive vice president for a Europe Fit for the Digital Age, said the commission has enough evidence “to formally open a proceeding against X.”

In April, the European Commission designated X as one of 19 “very large online platforms” (VLOPs) required to comply with the new EU law, with compliance requirements taking effect in August.

“The higher the risk large platforms pose to our society, the more specific the requirements of the Digital Services Act are. We take any breach of our rules very seriously,” Ms. Vestager said.

The investigation marks the first time the European Commission has opened formal proceedings under the DSA, which was enacted in October 2022.

“Today’s opening of formal proceedings against X makes it clear that, with the DSA, the time of big online platforms behaving like they are ‘too big to care’ has come to an end,” Commissioner Thierry Breton said. “We now have clear rules, ex ante obligations, strong oversight, speedy enforcement, and deterrent sanctions and we will make full use of our toolbox to protect our citizens and democracies.”

Musk Has Faced Past Warnings

The European Commission’s decision to launch the investigation into X comes as the platform’s owner, Elon Musk, has pushed back on the EU’s content moderation requests.

In May, Mr. Musk withdrew X from the EU’s Code of Practice on Disinformation, which is a set of voluntary EU-prescribed content moderation practices. The Code of Practice entails 44 commitments and 128 specific measures, including calls for platforms to share data with disinformation researchers, and to demonetize site users accused of spreading disinformation.

Following Mr. Musk’s decision to pull X from the EU disinformation agreement, Mr. Breton posted a warning on X that the platform must still comply with EU content moderation rules.

“You can run but you can’t hide,” Mr. Breton’s May 26 warning states. “Beyond voluntary commitments, fighting disinformation will be legal obligation under #DSA as of August 25. Our teams will be ready for enforcement.”

On Oct. 10, Mr. Breton once again called on Mr. Musk to account for his platform’s content moderation policies.

“You need to be very transparent and clear on what content is permitted under your terms and consistently and diligently enforce your own policies,” Mr. Breton’s Oct. 10 letter to Mr. Musk reads. “This is particularly relevant when it comes to violent and terrorist content that appears to circulate on your platform. Your latest changes in public interest policies that occurred overnight left many European users uncertain.”

X CEO Linda Yaccarino responded to Mr. Breton in an Oct. 11 letter, insisting that X had taken action to stop illegal content from spreading on the platform. As of Oct. 14, Ms. Yaccarino said, X had identified and suspended hundreds of Hamas-affiliated accounts, handled more than 80 law enforcement requests to remove content, and applied its Community Notes fact-checking feature to more than 700 unique posts and thousands of reposts related to the Oct. 7 attacks.

“X is committed to serving the public conversation, especially in critical moments like this and understands the importance of addressing any illegal content that may be disseminated through the platform,” Ms. Yaccarino wrote in October. “There is no place on X for terrorist organizations or violent extremist groups and we continue to remove such accounts in real time, including proactive efforts.”