I work on the Revenge Porn Helpline (a UK-based service operated by SWGfL that supports any adult affected by intimate image abuse). In recent years, our work has been brought into the limelight through high-profile cases, such as Georgia Harrison's remarkable advocacy following the trial of her perpetrator. Whether it's Taylor Swift being targeted with deepfake content or a close friend being blackmailed by an ex-partner, intimate image abuse is happening all around us, and unfortunately, it's happening in many different forms.
Our helpline may be called the ‘Revenge Porn Helpline’, but the more accurate term is intimate image abuse. We must never shy away from what this behaviour is – it’s abuse. Last year alone, the helpline received nearly 19,000 reports of adults being targeted by perpetrators, a 106% increase on the previous year. Looking across the board, reports have increased more than tenfold since 2015, when the helpline began.
While UK legislation has greatly improved in recent years to offer stronger protections for victims, including jail sentences for perpetrators, we still face challenges. This is a global issue, and it can’t be tackled effectively by countries acting individually. Adults across the globe are experiencing this devastating form of abuse, and in many cases, whether because of cultural backgrounds or minimal or non-existent legal protections, some people do not receive the same level of support as others.
We knew that a helpline structure could never combat this issue effectively worldwide – having practitioners pick up the phone to support adults from around the world was just not practical. But our deep understanding of the complexities surrounding this global issue showed us that the solution lay with technology. Although devices are the primary vehicle that enable perpetrators to engage in intimate image abuse, we found we could use technology to flip the narrative and take their control away. In other words, we could fight fire with fire.
StopNCII.org
StopNCII.org was built in partnership between the Revenge Porn Helpline team at SWGfL and Meta. It allows any adult in the world to protect their images from being shared online through the innovative use of hashing technology. One of the more challenging aspects of this work was to ensure that people’s privacy remained intact. Asking people to send their intimate photos to an external platform for escalation and review raised ethical concerns: how could someone feel their content was truly safe if photos were still being shared, whether it was through a reputable support network or not?
The hashing technology we use provided the solution we needed.
It ensures that photos themselves are never shared with the StopNCII.org tool; instead, it creates a ‘digital fingerprint’ (or hash) of the image in question, which is then shared with some of the most widely used platforms in the world to detect and block the image if anyone attempts to upload it. The hash is simply a piece of code, represented as a string of letters and numbers, that can be matched against any corresponding photo someone tries to share.
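The idea can be illustrated with a minimal sketch. Note an important simplification: StopNCII.org actually generates a perceptual hash on the user's own device (Meta has open-sourced perceptual hashing algorithms such as PDQ for this kind of matching), whereas the cryptographic SHA-256 hash below only matches byte-identical files. The sketch is just to show the principle that only the fingerprint, never the image itself, needs to leave the person's device.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex 'digital fingerprint' of an image.

    Illustrative stand-in: the real system uses perceptual hashing,
    which can match visually similar images; SHA-256 (used here for
    simplicity) matches only exact copies of the same bytes.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A platform can hold a block-list of hashes and check new uploads
# against it without ever seeing the original image.
blocked_hashes = {fingerprint(b"example-private-image-bytes")}

def should_block(upload: bytes) -> bool:
    """Check an uploaded file's fingerprint against the block-list."""
    return fingerprint(upload) in blocked_hashes

print(should_block(b"example-private-image-bytes"))  # True
print(should_block(b"some-unrelated-upload"))        # False
```

The key design point survives the simplification: the matching works entirely on fingerprints, so a reputable platform never needs to receive or store the intimate image itself.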
To date, we have onboarded 12 industry partners including Facebook, Instagram, TikTok, OnlyFans, Reddit, Pornhub, Snap, Threads, Niantic, RedGifs, Playhouse and Microsoft Bing, who are preventing NCII across their platforms through their implementation of StopNCII.org.
Since 2021, StopNCII.org has supported hundreds of thousands of people on a global scale. It is a free initiative with an open invitation for any legally operated organisation that permits the sharing of image-based content to join the ongoing fight against intimate image abuse.
Making NCII Illegal
In our work to prevent intimate image abuse, there is always more that can be done. We have highlighted that the Revenge Porn Helpline holds 30,000 URLs containing non-consensually shared content that we are unable to remove. This is because the current legal framework in the UK does not require websites to take NCII content down, so it can remain online.
This gap could be closed if Internet Service Providers (ISPs) were able to block the content that we can’t remove. The new UN Cybercrime Convention requires countries to have laws preventing NCII, and we recommend amendments to UK law that reflect the need to adopt hashing technology. It is an essential step, and we are working hard to encourage Parliament to act on it to strengthen protections for victims.
While it may seem like there is a long road ahead, the significant progress made in quite a short space of time gives us hope that we can one day eradicate intimate image abuse for good. One thing is for sure, though: without the innovative use of technology, we would not be where we are today. It is at the heart of combating this abuse, and we will continue to work with it as the digital landscape evolves.