Facebook, WhatsApp Parent Meta in India Found Exposed to Human Rights Risks Due to Third Party Action

Facebook parent Meta's HRIA involved interviews with 40 civil society stakeholders, academics, and journalists.

Meta faced criticism and potential reputational risks, according to its own report

Highlights
  • The project was undertaken by law firm Foley Hoag
  • The report is based on an independent human rights impact assessment
  • Project was launched in March 2020, experienced limitations due to COVID
Meta Platforms, which includes Facebook and WhatsApp, has been found exposed to human rights risks such as "restrictions of freedom of expression and information" and "hatred that incites hostility" due to the actions of third parties, the social media giant's first human rights report has said.

The report is based on an independent human rights impact assessment (HRIA) commissioned in 2019 by Meta on potential human rights risks in India and other countries related to its platforms.

The project was undertaken by law firm Foley Hoag.

"The HRIA noted the potential for Meta's platforms to be connected to salient human rights risks caused by third parties, including: restrictions of freedom of expression and information; third party advocacy of hatred that incites hostility, discrimination, or violence; rights to non-discrimination; as well as violations of rights to privacy and security of person," the report said.

The HRIA involved interviews with 40 civil society stakeholders, academics, and journalists.

The report found that Meta faced criticism and potential reputational risks related to hateful or discriminatory speech by end users.

The assessment also noted a difference between company and external stakeholder understandings of content policies.

"It noted persistent challenges relating to user education; difficulties of reporting and reviewing content; and challenges of enforcing content policies across different languages. In addition, the assessors noted that civil society stakeholders raised several allegations of bias in content moderation. The assessors did not assess or reach conclusions about whether such bias existed," the report said.

According to the report, the project was launched in March 2020 and experienced limitations caused by COVID-19, with a research and content end date of June 30, 2021.

The assessment was conducted independently of Meta, the report said.

The HRIA developed recommendations for Meta around implementation and oversight, content moderation, and product interventions, which Meta is studying and will consider as a baseline to identify and guide related actions, the report said.
