How a Yazidi teen inspired a community-led crackdown on digital violence
It began quietly enough, with a single TikTok post exposing the vulnerability of a minority Yazidi girl from the northwestern Iraqi town of Shingal. But what followed was far from quiet.
We first shared Dina’s story in January: Someone had posted a photo on TikTok falsely portraying the 16-year-old as soliciting sex online. That post carried devastating weight in Dina’s conservative community, and she soon felt the walls closing in. Fortunately, Dina found relief and justice—thanks to the swift intervention of Baghdad Women Association (BWA) and INSM Network for Digital Rights. Working together, they managed to close the attacker’s account and offer Dina the psychosocial support she needed.
That outcome was early validation for the ecosystem response to digital violence against women that the SecDev Foundation is piloting in Iraq and Jordan, with partners like BWA and INSM, and with funding from the International Development Research Centre (IDRC).
As it turns out, Dina’s story did not end there.
After taking this case public, BWA and INSM say they were overwhelmed by the scale of the response from across the Yazidi community and beyond. Very quickly, families, peers, and survivors began reaching out—many with eerily similar stories of digital exploitation.
Hundreds of Dinas
As a longstanding TikTok “trusted partner,” INSM has helped shape the platform’s regional approach to online safety. Their latest collaboration, Safe Together, promotes digital awareness among Iraqi students, parents and educators. This includes co-developing culturally relevant digital-safety resources and shaping content-moderation mechanisms. Early outreach sessions in schools and universities have already surfaced promising community-led efforts.
That’s how INSM connected with Maher Murad Shammu, a second-year network engineering student and the only person from the Yazidi community studying cybersecurity. In February, he created Tiqaniyat Shams, a Kurmancî-language Instagram page promoting digital safety, teaming up with Sabah Atte, an influencer with nearly 160,000 followers. And when they learned how INSM’s help desk had helped Dina, they began forwarding the digital-abuse reports they were receiving.
“We were astonished by TikTok’s rapid response to the cases we reported via INSM,” Maher says. “One girl was saved from the brink of suicide after a harassing account was shut down within 24 hours.”
Struck by the sheer volume of similar-sounding reports, INSM launched its own investigation. What they found was alarming: a web of coordinated digital attacks targeting Yazidi girls using stolen images, fake accounts, suggestive captions, and manipulated audio. In some cases, attackers were pocketing hundreds of dollars monthly through blackmail.
“This wasn’t random,” explains Haidar Hamzoz, INSM’s executive director. “We saw patterns—language misuse, repeated tactics—that pointed to a network systematically targeting a vulnerable minority.”
Armed with this analysis, INSM got to work. Over a span of two months, they helped shut down more than 175 malicious accounts, primarily on TikTok, but also on Instagram and Facebook. And they were able to connect some victims to support in the community.
One account, run by a blackmailer with 65,000 followers, was shut down after Maher obtained a screenshot of him bragging about extorting girls for $1,200 a month. Another had threatened to murder Yazidi girls who might defy social norms—invoking the 2007 Qahtaniyah bombings that killed hundreds of Yazidis, stoking communal trauma and risking renewed violence.
According to Hamzoz:
“It was like opening Pandora’s box: each lead uncovered another, revealing just how vast and coordinated the network of offenders was. They were preying on vulnerable local girls, exploiting platform blind spots. In one case, we discovered a perpetrator using a suggestive audio clip paired with a girl’s photo. We wondered: what if we isolated the audio and checked where else it had been used? We uncovered thousands of similar posts that weren’t even on our radar. We reported the file to TikTok, and they muted every account using it. With a single action, we may have protected thousands of girls.”
Growing the community response
With its trusted-partner status, INSM’s report-response times at TikTok average 48 hours—much faster than public reporting mechanisms. TikTok itself acknowledged its reviewers’ limitations in handling cases involving unfamiliar languages or subtle cues. That’s why local partners are so valuable: they can identify and explain culturally specific threats.
“We play complementary roles,” says Asia Anwar, INSM Project Coordinator. “Local actors and initiatives like Tiqaniyat Shams have access and trust. At INSM, we verify, escalate, and resolve. Within days, girls were sending screenshots of fake accounts using their names or photos. Many were too scared or ashamed to speak up initially—until they saw what happened with Dina.”
Because keeping up with offenders is exhausting work, INSM is negotiating with TikTok to block accounts based on IP addresses, making it harder for offenders to simply open new accounts. Most importantly, they are working to build local initiatives’ capacity to prevent and respond to digital violence against girls like Dina.
For instance, INSM is training Tiqaniyat Shams volunteers in approaches for gathering evidence and documenting cases. That’s helping this local, volunteer-run initiative independently handle new incoming reports of abuse. And word is getting out: their Instagram following grew by 450 percent this year. They’ve even received referrals from Iraq’s National Security Service.
“This has been a steep learning curve,” says Tiqaniyat Shams’ Sabah. “We’re proud of the impact. Our aim is to eradicate digital violence against women in our community, and we’re getting closer.”
INSM has also published an Arabic-language community guide to responding to digital violence, built for families as well as teachers and community leaders. At the same time, they’ve been busy raising awareness through their social media channels, website and partner network. And their content is drawing attention not only from the Yazidi community but also from broader digital rights networks. Since January 2025, INSM’s website traffic has increased by 400 percent and their Facebook page alone reached over 5 million users.
Backlash and resilience
The Dina-inspired pushback campaign was public by design. At one point, Tiqaniyat Shams celebrated a fresh batch of account closures on their Instagram page, publicly acknowledging their partnership with INSM. That’s when the threats started, against everyone involved. Many were personally intimidating. One message simply vowed to “open a thousand accounts for every one INSM shuts down.”
INSM also faced a coordinated attack on its website. More than 1,800 toxic comments flooded the platform in just 30 minutes, overwhelming the server. The team pulled the site offline for one week as a precaution. “We expected resistance,” says Hamzoz, “but the scale and coordination of the attack shocked us. This wasn’t just harassment; it was an organized attempt to silence our work.”
As the adage goes, you can often measure the effectiveness of a campaign by the backlash it sparks. So rather than back down, INSM used the downtime to strengthen the digital resilience of its systems and team. The website is back online, the public awareness campaign has resumed, and an expanded site is in the works.
Dina’s story was a seed that’s grown into a local movement.
This one pivotal case motivated many more Yazidi women and girls to come forward, exposing both the harm and the hope embedded in Iraq’s digital landscape. Above all, it showed what’s possible when community stakeholders come together as deliberate ecosystems of support for women. That’s the idea at the heart of our IDRC-supported pilot project in Iraq and Jordan.
Dina’s case marked the start of a local shift in how digital harm is addressed—collaboratively and quickly. And its ripples are being felt not just in policy and platform accountability, but in the confidence of a community that now knows: they are not alone.
