

Article

18 October 2023

Author:
Sam Biddle, The Intercept

Meta allegedly censors Gaza Hospital bombing image over nudity concerns

"Instagram Censored image of Gaza hospital bombing, claims it’s too sexual", 18 October 2023

INSTAGRAM AND FACEBOOK users attempting to share scenes of devastation from a crowded hospital in Gaza City claim their posts are being suppressed, despite previous company policies protecting the publication of violent, newsworthy scenes of civilian death.

Since Hamas’s surprise attack against Israel on October 7 and amid the resulting Israeli bombardment of Gaza, groups monitoring regional social media activity say censorship of Palestinian users is at a level not seen since May 2021...

Two years ago, Meta blamed the abrupt deletion of Instagram posts about Israeli military violence on a technical glitch. On October 15, Meta spokesperson Andy Stone again attributed claims of wartime censorship to a “bug” affecting Instagram. (Meta could not be immediately reached for comment.)

“It’s censorship mayhem like 2021,” Marwa Fatafta, a policy analyst with the digital rights group Access Now, told The Intercept. “But it’s more sinister given the internet shutdown in Gaza.”

In other cases, users have successfully uploaded graphic imagery from al-Ahli to Instagram, suggesting that takedowns are not due to any formal policy on Meta’s end, but a product of the company’s at times erratic combination of outsourced human moderation and automated image-flagging software.

Alleged Photo of Gaza Hospital Bombing

According to screenshots shared with The Intercept by Fatafta, Meta platform users who shared this image had their posts removed, or were prompted to remove the posts themselves, because the picture violated policies forbidding “nudity or sexual activity.” Mona Shtaya, nonresident fellow at the Tahrir Institute for Middle East Policy, confirmed she had also received reports of two instances in which this same image was deleted.
