The majority of online content-sharing service companies do not publish transparency metrics on child sexual abuse and exploitation.
Online Content Sharing Services Transparency Metrics & Reporting on CSEA
A critical step in improving the response to and monitoring of online child sexual exploitation and abuse is to standardise the data that are collected, stored, and analysed, and to create uniform metrics that effectively measure change in online CSEA over time.
This review was undertaken with the specific objective of bridging the existing evidence gap on metrics. To accomplish this, we conducted a thorough examination of the metrics currently employed by online content-sharing services in relation to CSEA. Content-sharing services can be categorised into three areas, following the approach used in the OECD's benchmarking of transparency reporting on terrorist and violent extremist content (TVEC):
- Social media, video sharing services and online communications services
- Cloud-based file sharing services
- An “Other” category that includes popular digital services for content management and online reference
Transparency refers to the openness and accountability of the data, algorithms and decision-making processes these services use when addressing online CSEA on their platforms. Transparency also underpins public trust in those responses.
Our prevention and safeguarding responses to online CSEA are only as good as our evidence base. Greater transparency across the metrics these services produce will improve data collection and strengthen the safeguarding of children at scale.
Most tech companies do not publish transparency metrics on child sexual abuse and exploitation. Of those that do, differing metrics mean that data are often not directly comparable across companies. A total of 20 technology companies provide transparency reporting on online CSEA, and this study reviewed the metrics in 19 of their transparency reports. Of these, only three provide any time-related metrics, including:
- Removal before any views
- Removal within 24 hours
- Reach of content deactivated for child sexual exploitation
Time-related metrics are essential for understanding how much reach and how many views content receives before takedown, and how quickly companies remove that content.
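To make these metric definitions concrete, the sketch below computes the three time-related metrics from a handful of hypothetical moderation records. The field names and values are illustrative assumptions only; they do not reflect any company's actual data schema or reporting pipeline.

```python
from datetime import datetime, timedelta

# Hypothetical moderation records (illustrative only).
removals = [
    {"detected_at": datetime(2023, 5, 1, 9, 0), "removed_at": datetime(2023, 5, 1, 10, 30), "views": 0},
    {"detected_at": datetime(2023, 5, 2, 8, 0), "removed_at": datetime(2023, 5, 3, 12, 0), "views": 42},
    {"detected_at": datetime(2023, 5, 4, 7, 0), "removed_at": datetime(2023, 5, 4, 20, 0), "views": 3},
]

total = len(removals)

# Metric 1: share of items removed before receiving any views.
removed_before_views = sum(1 for r in removals if r["views"] == 0) / total

# Metric 2: share of items removed within 24 hours of detection.
removed_within_24h = sum(
    1 for r in removals if r["removed_at"] - r["detected_at"] <= timedelta(hours=24)
) / total

# Metric 3: total reach (views) of content deactivated for child sexual exploitation.
total_reach = sum(r["views"] for r in removals)

print(f"Removed before any views: {removed_before_views:.0%}")
print(f"Removed within 24 hours:  {removed_within_24h:.0%}")
print(f"Reach of deactivated content: {total_reach} views")
```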
Figure: Breakdown of the frequency of metrics reported by companies on CSEA
Furthermore, this study found that tech companies provided very little metadata. Only one company provided detailed technical notes on the numerators and denominators of its metrics and the challenges it experienced with measurement. The authors recommend that all companies include technical notes detailing the data points that feed into each metric, how they are identified, and any limitations.
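As a minimal sketch of the kind of technical note recommended here, a published metric might be accompanied by structured documentation along the following lines. The fields and wording are assumptions for illustration, not an established reporting standard.

```python
# Hypothetical technical note accompanying a single published metric.
metric_technical_note = {
    "metric": "Proportion of CSEA content removed within 24 hours",
    "numerator": "Items actioned for CSEA and removed within 24 hours of detection",
    "denominator": "All items actioned for CSEA in the reporting period",
    "identification": "Automated detection plus human review",
    "limitations": [
        "Detection time is logged at first automated flag, not at upload",
        "Re-uploads of the same item are counted as separate removals",
    ],
}
```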
Tech companies employ a number of different methods to detect and combat the sharing of abusive content, including hash matching, AI algorithms, user reports, image analysis, collaboration with authorities, keyword filtering, and text analysis. The most recent report from Australia's eSafety Commissioner investigated the responses and actions on CSEA by five major tech companies, found variations in their approaches to detection and reporting, and emphasised the importance of human moderators alongside technological detection methods. The authors will further examine how these detection methods, and the gaps in their reporting, appear in transparency reports on identifying images related to child exploitation and abuse.
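To illustrate the hash-matching approach mentioned above, the sketch below flags an upload whose hash appears in a known-hash set. The hash list here is hypothetical and uses exact cryptographic hashing for simplicity; production systems typically match perceptual hashes (for example PhotoDNA or PDQ) against industry-maintained lists so that modified copies can still be detected.

```python
import hashlib

# Hypothetical known-hash set; real services match against industry-maintained
# lists and usually use perceptual rather than exact cryptographic hashes.
KNOWN_ABUSE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_image(data: bytes) -> str:
    """Return a SHA-256 digest of the raw bytes (exact matching only)."""
    return hashlib.sha256(data).hexdigest()

def is_known_abuse_material(data: bytes) -> bool:
    """Flag an upload if its hash appears in the known-hash list."""
    return hash_image(data) in KNOWN_ABUSE_HASHES

# An exact copy of a known file is flagged; any altered copy would not be,
# which is why perceptual hashing and other signals are used in practice.
print(is_known_abuse_material(b"test"))  # True for this illustrative hash
```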
Overall, this study found an urgent need for harmonised metrics reporting that takes account of differing service functions but allows for comparability across the industry. Without this, we have no objective measure of progress and effectiveness in tackling online child sexual exploitation and abuse.
More information
Lu, M., Lamond, M., & Fry, D. (2023). A Scoping Review of Technology Companies' Transparency Metrics & Reporting. In Searchlight 2023 – Childlight's Annual Flagship Report. Edinburgh: Childlight – Global Child Safety Institute.
- Researchers: Dr Mengyao Lu, Ms Maria Lamond, and Prof Deborah Fry
- Registered Study Protocol: https://osf.io/fsp5n
- Dataset: https://osf.io/j2ges/files/osfstorage/654e5f32253a74090da0e8fd