Meta, the parent company of Instagram, has issued an apology after the word “terrorist” was inaccurately inserted into the biographies of Instagram users identifying as Palestinian. The issue stemmed from an error affecting Arabic translations in some of Meta’s products. The company acknowledged the mistake, stating, “We sincerely apologize that this happened.” The translation error has intensified scrutiny of Meta’s handling of content related to the Israeli-Palestinian conflict, including allegations that the platform suppresses pro-Palestinian voices.
Accusations of Content Suppression
In addition to the translation error, Instagram has faced allegations of “shadow banning” users who expressed support for Palestinians during the Israel-Gaza conflict. “Shadow banning” refers to the practice of quietly limiting the visibility of a user’s posts so they appear less often, or not at all, in others’ feeds. Some users reported that Stories referencing the conflict received fewer views and that their accounts became harder to find in searches. Meta acknowledged that a bug had affected Stories but emphasized that the issue was unrelated to the content’s subject matter.
Instagram Translation Error Brings Back Past Controversies
The translation error, first exposed by an Instagram user, raises fresh concerns about Instagram’s content moderation policies. The platform has previously faced accusations of removing content related to the Israeli-Palestinian conflict. In response, Instagram cited the removal of content containing “hate speech or symbols” and made adjustments to its algorithm. These concerns prompted Meta to commission an independent review of its content moderation practices during the 2021 Israel-Palestinian conflict.
The review, conducted by the consultancy Business for Social Responsibility (BSR), concluded that Meta’s actions had an adverse impact on the rights of Palestinian users, including freedom of expression and political participation. Although BSR found no evidence of intentional bias, it recommended that Meta provide clearer explanations to users whose content is removed and improve its staff’s proficiency in Hebrew and in Arabic dialects.