Tuesday, October 21, 2025
Digital Rights Monitor
Online Harassment Leaves Lasting Scars on Women’s Lives

By Muhammad Hussain
October 17, 2025

Imagine waking up to find a video of yourself spreading online — a video showing you doing things you never did, a video you know isn’t yours. The fear crawls under your skin, the panic rises with every share. This is just a glimpse of the terrifying impact deepfake videos have on the lives of countless women.

Deepfakes are images or videos manipulated or generated through artificial intelligence to appear real. They have become a tool not only for defamation but also for intimidation. For women in particular, these fabrications are designed to discredit their voices and ultimately drive them out of online spaces, silencing their digital participation altogether. They are used to humiliate, blackmail, silence dissent, and exact revenge.

While law enforcement agencies struggle to define, trace, and prosecute such cases, a woman’s reputation, mental health, safety, and even career prospects are compromised before any legal remedy is pursued.

In regions such as Balochistan, every decision a woman makes—her clothing, her movements, her relationships—is tightly tethered to her family’s name and so-called ‘honour’, a patriarchal construct that many are now actively challenging. In such a context, online harassment, fake videos, and manipulated pictures can have devastating consequences. Yet the harassment never seems to stop.

A student from the University of Balochistan, choosing to remain anonymous, shared an experience from 2022 which has continued to shape her life. She described a period of intense fear and rejection after being harassed online. She even lost faith in safety: “I thought I had lost my life. I could not tell anyone. Even when I shared it with my own sister, she insulted me.”

The mental health fallout was significant: months of anxiety, panic, and withdrawal from online spaces. She changed her social media accounts and stopped using apps like WhatsApp and Facebook. Years later, she still lives with uncertainty online. Her story reflects what many victims endure privately: fear of not being believed, fear of silence, fear of further harm.

Human rights activist Amida Noor described how the advancement of technology has outpaced the growth of legal protections and societal awareness. Young women, she noted, are especially vulnerable. Where there are no rules, no clear laws, and no strong support networks, harassment becomes normalised.

She added that the rise of deepfake videos is becoming a major issue. As digital technology advances, the problems for women, especially young girls who use the internet, are increasing significantly. This is because this space largely lacks any rules or laws. Consequently, women and girls frequently become targets of online harassment.

As our dependence on artificial intelligence (AI) and its innovations continues to grow, the creation of deepfake pictures and videos has caused great concern and numerous problems for women. Our society is extremely conservative, narrow-minded, and patriarchal. In such an environment, even if the content is fake, its mere appearance causes immense damage to women. Here, not only women in politics but any woman who holds a differing opinion is targeted, Noor said.

If you disagree with someone—whether politically, economically, or socially—using these videos to harass and defame them has become very common and is dangerously on the rise. This trend is deeply concerning for us as activists, she remarked. Women have worked extremely hard to create a space for themselves in society and in the country, striving to work collectively despite many restrictions. Now, even in these spaces, if someone disagrees with them, incidents of using videos for online harassment have increased dramatically. If this continues, it will create serious problems for women, as they still do not have the same space and respect afforded to women in developed countries. Women already carry the heavy burdens of society; when such videos emerge, it creates additional obstacles for those trying to move forward, she added.

Senior advocate Habib Tahir, who has represented cybercrime victims, said that as deepfake videos become increasingly common, the issue of fake videos and online harassment, which previously targeted prominent figures, is now regularly used to harass and blackmail ordinary people, especially girls. Their morphed pictures are posted, or they are blackmailed privately. This is becoming very common in our country, which is a very sad state of affairs, he remarked.

Meena Majeed’s deepfake video

On September 29, 2024, a video began circulating online showing a female politician hugging a provincial chief minister, with the caption: “Shamelessness has no limits. This is an insult to Baloch culture.”

The video showed a woman in a green shalwar kameez, identified as Balochistan MPA Meena Majeed, hugging Sarfraz Bugti, the Chief Minister of Balochistan; both are members of the Pakistan Peoples Party. AFP conducted a fact-check of the video.

In a press conference, Majeed clarified that the video attributed to her is actually of another girl, who has since passed away. The original video, however, continues to circulate on the internet.

She expressed profound concern for the families affected, asking what the deceased girl’s family, as well as her own family, her father, brother, daughter, and husband, must be enduring due to the malicious republication of the video.

Majeed said the video was first shared by the admin of a group named G Mehring and is now being propagated by Baloch Yakjehti Committee (BYC) activists and supporters. She characterised this as a failed attempt to defame her, motivated by the fact that her political stance frightens them. She stated she is being targeted because she is a pro-Pakistan Baloch who believes in Pakistan’s Constitution, advocates for peace, speaks about the development of Balochistan, and holds political disagreements with the BYC’s ideology.

She directly challenged the BYC, questioning why a committee named for “Baloch Solidarity” does not show solidarity with her, a Baloch woman, by condemning this vile act and restraining its workers. She argued that if they disagree with her views, they should engage in reasoned debate instead of resorting to character assassination, which proves they lack a moral or logical counter-argument.

Majeed then posed a critical question to the Baloch nation: “Is the honour and respect of every Baloch woman sacred, or only that of specific leaders like Maharang Baloch and Sammi Deen Baloch?” She urged Baloch society to reflect on this, warning that if she can be targeted today, any other Baloch daughter could be targeted tomorrow. She declared that she is standing up not only for herself but for all women of Balochistan, confident that those behind these attacks will soon be held accountable by the law.

She firmly stated that she will not be silenced and will continue to express her views. Emphasising that this is a broader issue, she condemned the very existence of such private videos of Baloch women online and vowed to fight for the dignity of all Baloch women.

She concluded by confirming that a case has been registered with the Cyber Crime unit, and legal action will be pursued to ensure no one involved is spared. She reiterated that this is a question of the honour of all Baloch daughters and women, and it is a battle she will continue to wage on their behalf.

Cyber crimes against women in Pakistan

Understanding the scope of digital harassment and deepfake abuse in Pakistan is challenging due to underreporting, social stigma, and the technical difficulty of tracing content. However, available data paints a worrying picture.

Between 2020 and 2024, the Federal Investigation Agency (FIA)’s Cyber Crime Wing (which previously investigated these cases) registered over 639,000 complaints across Pakistan. Of these, around 414,000 were officially verified, leading to 73,825 formal enquiries. Yet only 5,713 cases reached the courts. These numbers are more than mere statistics; they represent human stories of violation, fear, and, often, no resolution.
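The attrition these figures imply can be made explicit with a little arithmetic. The sketch below (stage labels are ours; the numbers are those reported above) computes the stage-to-stage conversion rates:

```python
# Case-attrition funnel from the FIA Cyber Crime Wing figures cited
# above (2020-2024). Stage labels are illustrative; the percentages
# are simple ratios between consecutive stages.
funnel = [
    ("complaints registered", 639_000),
    ("complaints verified", 414_000),
    ("formal enquiries", 73_825),
    ("cases reaching court", 5_713),
]

# Print each stage as a share of the stage before it.
for (prev_label, prev_n), (label, n) in zip(funnel, funnel[1:]):
    print(f"{label}: {n:,} ({n / prev_n:.1%} of {prev_label})")

# Overall share of complaints that ever reached court.
overall = funnel[-1][1] / funnel[0][1]
print(f"overall: {overall:.1%} of complaints reached court")
```

On these numbers, under one percent of registered complaints made it to court over the five-year period.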

In 2023 alone, the FIA recorded approximately 46,000 enquiries, and in 2024 registered cases rose to over 10,000. Yet conviction rates remain far lower than the number of incidents reported. That gap between reporting, enquiry, registration, prosecution, and conviction is where much of the injustice lies.

Civil society also bears witness. The Digital Rights Foundation (DRF) recorded 2,224 cases of cyber harassment in 2023 through its harassment helpline, many of which were referred to the FIA for action. Yet many complainants remain anonymous, do not proceed with FIRs, or withdraw complaints before hearings due to fear of social or family repercussions.

According to the Digital Rights Foundation (DRF), over 63% of the complaints received by its Cyber Harassment Helpline in 2023 were from women, with manipulated images and fake accounts forming a rising category of abuse.

The National Cyber Crime Investigation Agency (NCCIA), which replaced the FIA’s Cyber Crime Wing, has also confirmed an increase in cases of politically motivated online harassment.

In Balochistan, the official numbers are much murkier. There is no public, reliable, province-wise breakdown for deepfake instances. Helpline data suggests that most complaints come only from urban centers where reporting infrastructure exists. Remote districts rarely show up in national statistics, though activists believe harassment is widespread there too. Underreporting and lack of digital access or awareness contribute significantly to this “data desert.”

Legal landscape in Pakistan

Pakistan’s principal law dealing with cybercrimes is the Prevention of Electronic Crimes Act (PECA) 2016. The law criminalises a range of online harms, including unauthorised interception of or access to data, cyberstalking, harassment, and dissemination of sexually explicit content without consent. Under PECA:

  • Section 21 addresses offences against the modesty of a natural person;
  • Section 24 defines cyberstalking;
  • Section 24A deals with dissemination of sexually explicit images or videos.

These sections offer legal tools for victims. However, PECA was drafted before the widespread availability of AI tools that enable seamless creation of deepfakes. It does not explicitly use the term “deepfake,” nor does it clearly define AI-manipulated imagery as a separate category. Advocates argue this ambiguity hampers legal clarity and slows prosecution of deepfake-based abuse.

The National Cyber Crime Investigation Agency (NCCIA) is the lead law-enforcement body under PECA. It is tasked with receiving complaints, gathering evidence, and cooperating with platform owners (both local and global) to take down harmful content.

Yet, despite rising complaints, many victims do not see justice. Delays due to insufficient technical capacity, procedural bottlenecks, reluctance of platforms to cooperate or preserve data, territorial jurisdiction issues when perpetrators are overseas, and social stigma often result in either case withdrawals, out-of-court settlements, or cases that drag on for years without meaningful resolution.

Beyond PECA, Pakistan has other legal instruments that are relevant to digital harassment, although they too face limitations.

The Criminal Law (Amendment) Act, various sections of the Pakistan Penal Code (PPC), and provincial-specific protections around gender-based violence (for example, provincial laws protecting women and children from abuse) can sometimes be invoked, especially in cases of defamation, stalking, or illegally distributing intimate images.

On the international side, Pakistan is party to treaties like the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW), which obliges it to protect women from discrimination and violence. While these treaties do not always directly address deepfakes, their frameworks give civil society and legal actors a normative basis for insisting that digital violence against women be treated with seriousness.

The UN Human Rights Council has passed resolutions recognising online gender-based abuse as a human rights violation, while the Budapest Convention on Cybercrime remains an international benchmark for combating cyber offences, though Pakistan is not a signatory to it.

Legal challenges in Pakistan

Legal frameworks are only as good as their implementation. In Pakistan, several overlapping challenges prevent victims from accessing justice or protection.

Advocate Tahir remarked that the FIA’s Cyber Crime Wing did not have the capacity to handle all complaints of cyber harassment. It received hundreds of complaints daily via email and other means but was unable to respond to them all. Its equipment and resources were very limited and insufficient.

He also said that when content is posted on platforms like Facebook, the FIA needs access to those platforms to trace it. The process of sending letters and obtaining data takes weeks and months to determine where a video originated and where action should be taken. Furthermore, many perpetrators operate from outside the country, and taking legal action against them is a slow and cumbersome process, Advocate Tahir added.

In addition, cybercrime law covers defamation, and the country has specific defamation laws as well. However, the problem lies in the implementation of these laws, he pointed out. The legal procedures and mechanisms are so slow that it takes years for people to get justice. There is a need to streamline these procedures and expedite cases. For instance, when a complaint is filed, arrests should be made within a few days, and trials should begin promptly, leading to convictions. So far, we have not heard of many such punishments; if any, they are very few. Currently, hundreds of such videos are circulating, he added.

He further said until legislation is improved and capacity is enhanced, these issues will not be controlled. Exemplary punishments, fines, and media exposure of the culprits are necessary to create a deterrent effect. This will help develop a public perception that this is a crime and should not be committed, the lawyer remarked.

Noor also said that cyber cases remain under trial in the country for a long period of time. The common citizen is suffering, and the implementation of laws is a major issue. Access to the few institutions that exist is not available to everyone, and the cases drag on for very long periods, causing the victims and their families immense hardship, she said.

Anyone with a social media profile may have some awareness, but the nature of that awareness and how it is applied in real life is a big challenge for us. Every social media user needs to know their rights and their limits, as well as the boundaries of how others can approach or speak to them, she remarked.

To improve this, workshops for the youth, sessions on gender-based violence and gender sensitivity, training programs, and media campaigns are essential, she recommended.

When working on any platform, gender sensitivity must be kept in mind. Online harassment and everything that comes with it should not be taken lightly. One must be aware of their own limits and should also respect the boundaries of others, so we can build a better and healthier society, the activist said.

Global perspectives on cyber crimes

Countries worldwide are scrambling to regulate the rapid escalation of online, AI-fueled harassment. While there is no universal standard yet, several legislative and regulatory efforts stand out.

In the European Union, the Digital Services Act (DSA) requires digital platforms to remove illegal or harmful content rapidly and imposes penalties on platforms that fail to comply. Similarly, Germany, France, and Spain have introduced laws or regulations specifically addressing the non-consensual distribution of intimate images and deepfake content. These laws typically mandate faster takedowns, transparency in moderation, and stronger user protections.

The United Kingdom’s Online Safety Act imposes duties on tech companies to limit the spread of harmful content and protect users, especially minors, against non-consensual intimate imagery. The Act also creates legal responsibilities for platforms to remove or restrict deepfake or manipulated content when it causes serious harm.

In the United States, several states have enacted specific laws against revenge porn and non-consensual deepfake pornography. Virginia, for example, has made it illegal to distribute deepfake sexual content without consent. California and Texas have passed statutes penalising deepfakes used for political disinformation or personal defamation. At the federal level, discussions are ongoing, but broad regulation of AI-generated imagery is still fragmented.

In South Korea, in the wake of scandals like the Nth Room (an online sex abuse ring in which women and underage girls were blackmailed into sharing sexual content that was distributed to paying members), the government passed laws criminalising the creation and distribution of sexually explicit content, including manipulated or deepfake imagery. Authorities have also set up specialised investigative units and victim hotlines, along with measures for rapid content removal in cooperation with platform providers and investigators.

On the intergovernmental level, the United Nations Human Rights Council has repeatedly called for recognition of digital gender-based violence. Under advocacy from global human rights organisations, there is increasing pressure on governments to ensure that offline rights (privacy, dignity, bodily autonomy) are protected online. Some international treaties and agreements have started to treat non-consensual intimate content, especially manipulated content, as harmful and in need of regulation.

The story has been edited by Yasal Munim who works as Senior Manager Programs at Media Matters for Democracy. 

Tags: cyberbullying, Deepfakes, Gender, gender and tech, online gender-based violence, online harassment
