Meta, the owner of Facebook, issued its first annual human rights report on Thursday, following years of accusations that it neglected online abuses that contributed to real-world violence in regions such as India and Myanmar.
The report, which covers due diligence performed in 2020 and 2021, includes a summary of a contested human rights impact assessment of India that Meta commissioned the law firm Foley Hoag to conduct.
Human rights organizations such as Amnesty International and Human Rights Watch demanded the full publication of the India review in a January letter and accused Meta of delaying its release.
Meta said in its summary that the law firm had highlighted “serious human rights issues” about Meta’s platforms, such as “hate advocacy that incites hostility, discrimination, or violence.”
The summary also noted that the study did not examine “prejudice allegations in content moderation.”
Ratik Asokan, a representative of India Civil Watch International who participated in the study and drafted the joint letter, told Reuters that the summary appeared to be an attempt by Meta to “whitewash” the law firm’s findings.
“This is as clear evidence as you can get that they are truly uncomfortable with the information in that report,” he continued. “Share at least the executive summary so we can see what the unbiased law firm has to say.”
Deborah Brown, a researcher with Human Rights Watch, characterized the summary as “selective” and argued that it “brings us no closer” to understanding the company’s role in the dissemination of hate speech in India or its vows to address the issue.
Human rights organizations have raised concerns about anti-Muslim hate speech fueling tensions in India, Meta’s largest overseas user market, for years.
In 2020, Meta’s top public policy officer in India resigned following a Wall Street Journal article claiming she had declined to apply the company’s rules to Hindu nationalist figures who had been flagged internally for instigating violence.
Meta noted in its report that it was reviewing the India assessment’s recommendations, but did not commit to implementing them as it had with past human rights assessments.
Asked about the distinction, Miranda Sissons, Meta’s director of human rights, pointed to United Nations principles that warn against risks to “affected stakeholders, workers, or real business secrecy requirements.”
Sissons told Reuters that a variety of factors, including security concerns, can influence the method of reporting.
Sissons, who joined Meta in 2019, said that her team now comprises eight members, while around one hundred more people work on human rights within related teams.
In addition to country-level assessments, the report detailed her team’s work on Meta’s COVID-19 response and Ray-Ban Stories smart glasses, including the identification of potential privacy issues and negative consequences for vulnerable groups.
Sissons noted that the evaluation of augmented and virtual reality technologies, which Meta has emphasized with its bet on the “metaverse,” is occurring primarily this year and would be covered in future reports.