Meta fires engineer over fix for bug blocking pro-Palestinian posts; he sues the company

Tech

Published: 2024-06-05 14:02

Last Updated: 2024-07-04 19:36


Mark Zuckerberg, the CEO of Meta (Photo: Getty Images)

Ferras Hamad, a Palestinian-American former Meta engineer, filed a lawsuit against the company on Tuesday alleging discrimination and wrongful termination, claiming that Meta fired him for trying to help fix bugs that were suppressing Palestinian content.

Hamad's lawsuit says that Meta exhibited bias against Palestinian content, even going so far as to delete internal employee communications about relatives who died in Gaza. It adds that the company conducted investigations into employees' use of the Palestinian flag emoji, scrutiny that was not applied to those using the “Israeli” or Ukrainian flag emojis.


Also Read: Users report controversial shadow banning of pro-Palestine content


The incident that led to Hamad’s termination occurred in December and involved an emergency procedure to address severe platform issues, known within Meta as a SEV, or "site event." Hamad discovered irregularities in SEV policies affecting Palestinian Instagram accounts, with posts being restricted from searches and feeds. One notable instance cited in the lawsuit involves a video by Palestinian photojournalist Motaz Azaiza that was mislabeled as pornographic content despite showing a destroyed building in Gaza.

Hamad said he received conflicting guidance from colleagues about whether he had the authority to address the SEV. Although he had worked on similar SEVs related to “Israel”, Gaza, and Ukraine, Meta told him he was fired for violating a policy that bars employees from working on accounts of people they know personally. Hamad asserts he had no personal connection to Azaiza.

- Oversight and controversy -

This incident follows criticism from Meta's independent oversight board, which in December rebuked the company for removing posts depicting human suffering in Gaza. The board overturned several of the company's decisions, including the removal of an Instagram video showing the aftermath of a strike near Al-Shifa Hospital in Gaza, which depicted injured or killed Palestinians, including children.

Oversight board co-chair Michael McConnell stated that while the board focuses on protecting freedom of expression, it also aims to prevent content that incites violence or hatred. The board urged Meta to preserve removed posts that might serve as evidence of human rights violations.

- Policy on “zionist” discussions -

In February, sources revealed that Meta was considering stricter rules for discussions involving the term "Zionist" on its platforms. An email from Meta policy personnel indicated the company was revisiting its hate speech policy concerning the term. Civil society sources expressed concern, with Dani Noble from Jewish Voice for Peace stating that conflating the political ideology of Zionism with Jewish ethno-religious identity was troubling.

Meta’s internal guidelines, first made public by The Intercept in 2021, instructed moderators to remove posts if the term "Zionist" was used as a proxy for "Jewish" or "Israeli," both of which are protected categories under the company's hate speech policies. The proposed policy change could lead to more aggressive moderation and removal of posts critical of “Israeli nationalism”.

Hamad’s lawsuit adds to the growing scrutiny of Meta’s content moderation practices, raising questions about bias and the company’s commitment to impartiality in handling politically sensitive topics.

- Instagram’s shadowban history -

Instagram users have long voiced frustration about being shadowbanned for posting content related to Palestine, with complaints dating back years before October 7.

Many have reported that their posts, stories, and hashtags about Palestinian issues often do not appear in their followers' feeds, are excluded from search results, and fail to receive typical engagement levels. This perceived suppression has sparked accusations of bias and censorship, particularly during times of heightened aggression on Gaza.

Users argue that this treatment stifles important conversations and awareness about Palestinian experiences and human rights violations, undermining the platform's role as a space for free expression and social activism.

Across the platform, users have circulated tips on how to outsmart the shadowban, suggesting strategies such as encouraging followers to engage with their content, using alternative spellings, and periodically mixing in unrelated posts (for example, some inventive users have alternated between screenshots of casualties in Gaza and cat pictures).

On Dec. 22, 2023, Human Rights Watch (HRW) released a 51-page report titled "Meta’s Broken Promises: Systemic Censorship of Palestine Content on Instagram and Facebook," highlighting how Meta's content moderation policies have increasingly silenced voices supporting Palestine in the wake of the war on Gaza.

The report documents over 1,050 instances of post removals and suspensions across Instagram and Facebook between October and November 2023, alongside other suppressive actions against content advocating for Palestinian human rights and condemning ongoing violations. This widespread censorship spans more than 60 countries, underscoring the global reach of Meta’s flawed policies.

HRW attributes this systematic censorship to several factors: inconsistent policy implementation, overreliance on automated moderation tools, and undue influence from governments over content removals.

The organization identifies six key patterns in Meta’s censorship tactics, including the removal of posts, stories, and comments; account suspensions; restrictions on user engagement; limitations on the use of live features and monetization; and shadowbanning, which significantly decreases the visibility of users’ content without notification.

Deborah Brown, a senior digital rights researcher at HRW, condemned Meta’s actions, stating, “Meta’s censorship of content in support of Palestine adds insult to injury at a time of unspeakable atrocities and repression already stifling Palestinians’ expression. Social media is an essential platform for people to bear witness and speak out against abuses, while Meta’s censorship is furthering the erasure of Palestinians’ suffering.”

The HRW report details numerous instances where Meta’s algorithm misapplied its Community Guidelines, falsely categorizing pro-Palestinian content as violations related to violent and graphic content, incitement, hate speech, and nudity. The company has also inconsistently enforced its “newsworthy allowance” policy, removing content documenting Palestinian injuries and deaths that holds significant news value.

Despite acknowledging its flawed policy enforcement, Meta has a history of neglecting its human rights responsibilities. HRW points to past incidents, such as the 2021 censorship during the Sheikh Jarrah neighborhood conflict in occupied East Jerusalem, where Meta faced similar accusations of arbitrarily silencing pro-Palestinian content.


Also Read: Human Rights Watch accuses Meta of systematic censorship of Palestine


HRW has called on Meta to permit protected expression on its platforms, including the documentation of human rights abuses and political movements.

“Instead of tired apologies and empty promises, Meta should demonstrate that it is serious about addressing Palestine-related censorship once and for all by taking concrete steps toward transparency and remediation,” Brown emphasized.