However, no willful bias was found at Meta, either by the company as a whole or by individual employees. The report’s authors said they found “no evidence of racial, ethnic, national, or religious animus in government teams,” noting that Meta “has staffers who represent diverse viewpoints, nationalities, races, ethnicities, and religions that support this.”
Rather, the report identified numerous instances of unintentional bias that violated the rights of Palestinian and Arabic-speaking users.
In response, Meta said it plans to implement some of the report’s recommendations, including improving its Hebrew “classifiers,” which help automatically remove offending posts using artificial intelligence.
“Many of these recommendations don’t have quick fixes overnight, as BSR makes clear,” the Menlo Park, Calif.-based company said in a blog post Thursday. “Although we have already made significant changes as a result of this exercise, this process will take time – including time to understand how best to address some of these recommendations and whether they are technically feasible.”
Meta, the report confirmed, also made serious enforcement errors. For example, as the Gaza war raged last May, Instagram briefly banned the hashtag #AlAqsa, a reference to the Al-Aqsa Mosque in Jerusalem’s Old City, a focal point of the conflict.
Meta, which owns Instagram, later apologized and said its algorithms had confused Islam’s third holiest site with the militant group Al-Aqsa Martyrs Brigade, an armed affiliate of the secular Fatah party.
The report repeated issues raised in internal documents by Facebook whistleblower Frances Haugen last fall, showing that the company’s problems are systemic and long-known within Meta.
A major shortcoming is the lack of moderators in languages other than English, including Arabic — one of the most common languages on Meta’s platforms.
For users in Gaza, Syria and other conflict-torn regions of the Middle East, the issues addressed in the report are nothing new.
Israeli security agencies and watchdogs, for example, have been monitoring Facebook and bombarding it with thousands of orders to delete Palestinian accounts and posts while trying to crack down on incitement.
“They are flooding our system and completely overwhelming it,” Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region, who left in 2017, told The Associated Press last year. “It forces the system to make mistakes in favor of Israel.”
Israel experienced an outburst of violence in May 2021, with weeks of tension in East Jerusalem escalating into an 11-day war with Hamas militants in the Gaza Strip. The violence spilled over into Israel itself, which witnessed the worst communal violence between Jewish and Arab citizens in years.
In an interview with the Yediot Ahronot daily, Kobi Shabtai, Israel’s national police chief, said he believes social media fueled the communal fighting. He said he had suggested last year that social media be blocked to calm the unrest, and called for shutting the platforms down again if similar violence broke out.
“I’m talking about shutting down the grids completely, calming the situation on the ground and turning them back on when it’s calm,” he was quoted as saying. “We are a democratic country, but there is a limit.”
The remarks caused an uproar and police issued a clarification that his proposal was for extreme cases only. Omer Barlev, the cabinet minister who oversees the police, also said Shabtai had no authority to impose such a ban.
Associated Press reporter Josef Federman contributed from Jerusalem.