Digital Rights Monitor
in DRM Exclusive, Top story

Facebook Parent Meta Releases First Human Rights Report

by DRM
July 15, 2022

Photo: AP

July 15, 2022 – Meta Platforms, Inc., the parent organisation of leading social media platforms such as Facebook, Instagram and WhatsApp, has released its first annual report detailing the impacts of its products and policies on human rights globally and the company’s efforts to tackle growing online challenges, including misinformation and hate speech, across its platforms.

The report, released on Thursday, covers 2020 and 2021 and provides details on Meta’s overall approach to managing human rights risk, according to an official statement.

“The report includes insights and actions from our human rights due diligence on products, countries and responses to emerging crises,” states Meta. “It also discusses our work protecting the privacy of journalists and human rights defenders, increasing youth safety on Instagram, fighting exploitation across our apps and protecting free and fair elections around the world.”

Foley Hoag, the law firm commissioned to conduct the human rights impact assessment of India, notes in its summary the potential for “salient human rights risks”, including “advocacy of hatred that incites hostility, discrimination, or violence” involving Meta’s platforms. The assessment, however, did not investigate “accusations of bias in content moderation”.

Meta’s controversial and discriminatory approaches to content moderation have repeatedly raised concerns and questions as to how the tech corporation handles hate speech and misinformation across its social media platforms. The 2021 Facebook Papers revealed that Facebook struggles to moderate content in non-English languages, which leaves the platform vulnerable to abuse and hate speech, especially in the Global South. About 87 percent of the company’s global budget for classifying misinformation was allocated to the US, while the rest of the world received the remaining 13 percent.

Meta is frequently called out for its failure to contain the spread of sensitive and inflammatory content, particularly hate speech and misinformation, which has caused serious harm in countries such as India (Meta’s largest market by number of users), Myanmar, and Ethiopia. Recently, UK-based nonprofit groups Global Witness and Foxglove tested how well Facebook could detect hate speech in advertisements submitted to the platform; the company once again failed to flag advertisements inciting violence. Earlier this year, Facebook failed a similar test run by Global Witness involving hate speech against the Rohingya people in Myanmar. In both cases, Facebook did not detect the hateful and inflammatory content, and its systems subsequently approved the advertisements for publication.

Ratik Asokan, a representative of India Civil Watch International who participated in the assessment, has called Meta’s summary of its India assessment an attempt to “whitewash” the law firm’s findings.

“It’s as clear evidence as you can get that they’re very uncomfortable with the information that’s in that report,” said Asokan. “At least show the courage to release the executive summary so we can see what the independent law firm has said.”

Similarly, Human Rights Watch researcher Deborah Brown called the summary “selective” and remarked that it “brings us no closer” to understanding Meta’s role in spreading hate speech in India or the commitments the company will make to address the problem.

In addition to being criticised for placing profits over public good, Meta has faced accusations of narrowing the scope of its human rights impact assessment as well as delaying its completion. The corporation has also been called out for not taking action against violence fuelled by ethnic, political and religious intolerance across its platforms, especially in India and Myanmar.

Tags: Facebook, Human Rights Report, Meta
About Digital Rights Monitor

This website reports on digital rights and internet governance issues in Pakistan and collates related resources and publications. The site is a part of Media Matters for Democracy’s Report Digital Rights initiative that aims to improve reporting on digital rights issues through engagement with media outlets and journalists.

About Media Matters for Democracy

Media Matters for Democracy is a Pakistan-based not-for-profit geared towards independent journalism and media and digital rights advocacy. Founded by a group of journalists, MMfD works for innovation in media and journalism through the use of technology, research, and advocacy on media and internet related issues. MMfD works to ensure that expression and information rights and freedoms are protected in Pakistan.
