Though the scale of the problem is potentially enormous, affecting around 32% of women of all ages and 62% of women aged 18-34, it is also largely underestimated. Women largely do not report online violence when they experience it, either to the platform on which it takes place or to external law enforcement services.
This is influenced by many factors: most immediately, a lack of universal education on online safety and on the rights people have to be safe online; and, closely following, the fact that when women do report their experiences, they are largely underserved by the services best positioned to provide remediation. Amongst a breadth of similar studies, a 2017 study of 800 police officers in the UK found that 94.7% had not received any formal training on how to conduct investigations into Intimate Image-Based Abuse; this helps explain recently highlighted phenomena such as extraordinarily low conviction rates for crimes such as sexual assault and harassment. Structural issues, such as a lack of training on Online Harms and on the unique intersectional experiences of these harms, contribute to findings such as the World Wide Web Foundation's in 2016 that, in 74% of the 86 Web Index countries surveyed, law enforcement agencies and courts are failing to take appropriate action against Online VAWG.
Within this context, an ecosystem of support providers has developed distinct services for victim-survivors of VAWG. However, the lack of research in the area, and the breadth and complexity of the issue, mean that these services struggle to innovate and counter harm before it happens.
PUBLIC has been researching the problem of Online VAWG, and the main technical approaches to tackling it, as part of the Online Safety Data Initiative. We've focused on understanding the dynamics and challenges of the problem and, from this, how the Safety Tech Industry and large social media platforms can act to counter it. Below, we've spotlighted the main Safety By Design approaches and Safety Tech solutions to countering this problem.
Technical Solutions for Countering Online VAWG
There is a range of emerging technical solutions for countering Online VAWG, from startup products and services to Safety By Design principles embedded in larger platforms.
Safety By Design approaches seek to combat potential harm on platforms by altering aspects of how the platform operates. There is a huge diversity of solutions in this space, but three key access points stand out:
Online VAWG is a nascent area for Safety Tech approaches. However, in certain fields, such as Intimate Image-Based Abuse or hateful speech, a range of innovative solutions is emerging:
1. Hash matching
a. Machine learning identifies images that include nudity and matches them against a database of provided non-shareable images
b. For example, StopNCII, a platform operated by the Revenge Porn Helpline, digitally hashes private sexual content to prevent it being shared across the platforms of participating industry partners (a minimal hash-matching sketch follows this list)
2. NLP
a. AI solutions determine whether content constitutes harassment, abuse, or a threat to share intimate images (see the toy classifier sketch after this list)
b. For example, Moonshot CVE has developed an open-source methodology, called The Redirect Method, that uses targeted advertising to connect people searching online for harmful content with constructive alternative messages
3. Nudity Blocking
a. Artificial intelligence determines whether an image includes nudity (see the screening sketch after this list)
b. For example, ImageAnalyzer provides artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users
4. Metadata analysis/Threat Intelligence
a. Analysis of metadata from communications (e.g. message size, sender, frequency) helps determine intent (see the metadata heuristic after this list)
b. For example, Insikt develops AI solutions for combating online terror, hate and disinformation, such as its Spotlight OSINT software, which helps law enforcement agencies gather insights about actors in online communities and how they influence the overall level of threat on social media
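To make the hash-matching approach (item 1) more concrete, here is a minimal sketch in Python using the open-source imagehash library. It is purely illustrative and is not a description of StopNCII's actual pipeline: the placeholder hash, file path and matching threshold are all assumptions.

```python
# Illustrative perceptual-hash matching (not StopNCII's actual pipeline).
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hypothetical database of hashes of images that must not be re-shared;
# in a real service these hashes would come from victim-survivors or trusted partners.
blocked_hashes = {
    imagehash.hex_to_hash("d1c4a8f0e2b35977"),  # placeholder value, not a real case
}

HAMMING_THRESHOLD = 5  # assumed tolerance for near-duplicates (crops, re-encodes)

def is_blocked(image_path: str) -> bool:
    """Return True if an uploaded image is a near-duplicate of a blocked image."""
    upload_hash = imagehash.phash(Image.open(image_path))  # 64-bit perceptual hash
    # imagehash overloads subtraction to give the Hamming distance between hashes.
    return any(upload_hash - blocked < HAMMING_THRESHOLD for blocked in blocked_hashes)

if __name__ == "__main__":
    print(is_blocked("incoming_upload.jpg"))  # placeholder path
```

The design choice worth noting is that matching happens on hashes rather than on the images themselves, so the sensitive content does not need to be stored or circulated by the matching service.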
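The classification step behind item 2 can be illustrated with a toy text classifier. The sketch below uses scikit-learn on a handful of invented examples; it is not Moonshot's Redirect Method or any production abuse classifier, just the general pattern of scoring a message and flagging likely abuse for human review.

```python
# Toy illustration of NLP-based abuse detection (not a production classifier).
# Requires: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, invented training set; real systems use large, carefully labelled datasets.
texts = [
    "send me that photo or I will post the ones I have",  # threat to share images
    "you deserve everything that's coming to you",        # harassment
    "see you at the meeting tomorrow",                    # benign
    "thanks for sharing the report, looks great",         # benign
]
labels = [1, 1, 0, 0]  # 1 = potentially abusive, 0 = benign

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

def flag_for_review(message: str, threshold: float = 0.7) -> bool:
    """Flag a message for human moderator review if abuse probability exceeds the threshold."""
    prob_abusive = model.predict_proba([message])[0][1]
    return prob_abusive >= threshold

print(flag_for_review("delete that or I'll share your pictures with everyone"))
```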
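Item 3 follows the same classify-and-gate pattern applied to images. The sketch below is a hedged illustration rather than a description of ImageAnalyzer's product: it assumes the open-source opennsfw2 package (a reimplementation of Yahoo's Open NSFW model), and the thresholds and file path are invented.

```python
# Illustrative nudity-screening gate (not ImageAnalyzer's product; the library,
# thresholds and path below are assumptions for the sake of a runnable sketch).
# Requires: pip install opennsfw2
import opennsfw2 as n2

BLOCK_THRESHOLD = 0.8   # assumed confidence above which an upload is held back
REVIEW_THRESHOLD = 0.5  # assumed band routed to a human moderator instead

def moderate_upload(image_path: str) -> str:
    """Return a moderation decision for an uploaded image."""
    nsfw_probability = n2.predict_image(image_path)  # 0.0 (safe) .. 1.0 (explicit)
    if nsfw_probability >= BLOCK_THRESHOLD:
        return "block"
    if nsfw_probability >= REVIEW_THRESHOLD:
        return "send_to_human_review"
    return "allow"

if __name__ == "__main__":
    print(moderate_upload("incoming_upload.jpg"))  # placeholder path
```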
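Finally, item 4 can be illustrated without any machine learning at all: simple rules over communication metadata (who messages whom, how often, and whether contact is one-sided) can already surface patterns worth investigating. The heuristic below is hypothetical and is not Insikt's Spotlight software; the data model and thresholds are invented for illustration.

```python
# Hypothetical metadata heuristic for flagging one-sided, high-frequency contact
# (not Insikt's Spotlight; the data model and thresholds are invented).
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class MessageEvent:
    sender: str
    recipient: str
    timestamp: datetime
    size_bytes: int
    replied_to: bool  # did the recipient ever reply to this sender?

def harassment_signals(events: list[MessageEvent],
                       window: timedelta = timedelta(hours=1)) -> list[str]:
    """Return sender IDs showing a one-sided, high-frequency contact pattern."""
    flagged = []
    for sender in {e.sender for e in events}:
        sent = sorted((e for e in events if e.sender == sender), key=lambda e: e.timestamp)
        # Signal 1: bursts of messages arriving within a short window of each other.
        bursts = sum(1 for a, b in zip(sent, sent[1:]) if b.timestamp - a.timestamp < window)
        # Signal 2: the contact is entirely one-sided (no replies at all).
        one_sided = bool(sent) and not any(e.replied_to for e in sent)
        if bursts >= 5 and one_sided:
            flagged.append(sender)
    return flagged
```

In practice, signals like these feed human-led threat assessment rather than triggering automated enforcement on their own.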
As the scale and pressing nature of this issue become more evident, and more actors begin to prioritise countering Online Violence Against Women and Girls, it is important for the ecosystem to convene around this issue, in order to align on findings and best-practice solutions.
If you would like to learn more about the research, please reach out to us at jessica@public.io