Safety by Design for Generative AI: Preventing Child Sexual Abuse - Future of Trust & Safety Insights

All Tech Is Human was privileged to host The Future of Trust & Safety, a landmark gathering of emerging and established Trust & Safety professionals on May 14, 2024, in New York City. The Future of Trust & Safety featured two panels focused on empowering the field.

The first panel, Safety By Design for Generative AI, focused on current efforts to reduce risk and exploitation of minors in online spaces. It featured Dr. Rebecca Portnoff (Head of Data Science, Thorn), Afrooz Kaviani Johnson (Child Protection Specialist, UNICEF), Sean Litton (President and Chief Executive Officer, Tech Coalition), Juliet Shen (Community Advisory Board, Integrity Institute; Research Associate for Columbia University's Trust and Safety Tools Consortium), and was moderated by Matthew Soeth (Head of Trust & Safety and Global Affairs, All Tech Is Human). After the panel, Soeth took time to write key insights and highlights.

By Matthew Soeth

Collaboration around Child Sexual Abuse Material (CSAM) is an ongoing effort for platforms. One of the earliest tech efforts was PhotoDNA, developed by Microsoft in partnership with Hany Farid, a system that creates digital signatures of images so that platforms can scan for and remove known CSAM. The tool was made available to all platforms that host images to implement as part of their moderation tooling. While that effort launched over a decade ago, tech and tooling have evolved along with our ability to design online environments. As we enter the age of Generative AI (gAI), we can apply our knowledge of Safety by Design and build on existing best practices for preventing and moderating CSAM content.

One of the biggest shifts in the past few years is to actively solicit feedback from young people about their experiences online. One of the ideas that pushed for this shift was the United Nations Convention on the Rights of the Child. These rights call for access to tech as a human right, as well as protection from harmful content like CSAM and grooming. “Number one, it really shifts the paradigm so it's not just that charity lens or looking at children as objects of protection, but really as individuals with rights. Secondly, it is the most ratified human rights treaty in the world. So it really provides a nearly universally accepted framework,” shared Afrooz Kaviani Johnson, a Child Protection Specialist at UNICEF.

Children are more than a group to be protected; they are a group that should be empowered through their access to technology, and that access should be free of harmful content. This is reflected in the rise of youth advisory boards, such as those at UNICEF and TikTok, which solicit feedback and build a better understanding of how young people use tech in their day-to-day lives.

Another area driving this need for better online experiences is the Safety By Design framework. This framework provides a series of principles for platforms designing these experiences:

  1. Service provider responsibility

  2. User empowerment and autonomy

  3. Transparency and accountability

This framework, published by the eSafety Commissioner's Office in Australia, provides a human-centric approach to online safety. While new at the time, it built upon years of knowledge, experience, and current research to give platforms guidance on how to implement tooling and safety mechanisms. This same thinking was applied to developing safeguards around gAI.

In June of 2023, All Tech Is Human and Thorn began collaborating on Safety By Design Principles for Generative AI, working, as Dr. Rebecca Portnoff of Thorn shared, with “…many of the leading generative AI companies to collaboratively define, align on, and then commit to a set of safety by design principles and associated mitigations, to enact those principles.” Considering that gAI was still fairly new to the public, this was a major goal, “…to not miss out on this window of opportunity, to focus on those organizations that we believe have both the opportunity and the responsibility to be acting into this moment.”

One common thread throughout child safety is collaboration between platforms and external partners such as Thorn, All Tech Is Human, UNICEF, and the Tech Coalition; there is a lot of work to be done. Speaking at the recent All Tech Is Human Safety by Design Summit in New York, trust and safety product expert Juliet Shen shared, “All four of these organizations have definitely influenced product design, user experience, system architecture, data retention and how we operate and design workflows.”

This collaboration makes platforms, and the people working to solve these hard problems, more efficient and better at reducing online harms. A great example of operationalizing this collaborative effort is the Tech Coalition’s Lantern, the first cross-platform signal-sharing program for child safety. Bad actors often operate across platforms, making their behavior difficult to detect. Lantern makes it easy for platforms to share signals and tactics of bad actors, helping disrupt coordinated efforts to harm children; and it’s free to participate.

“The collaboration has to drive real outcomes, otherwise people won't continue with it,” shared Sean Litton, CEO of the Tech Coalition. As we evaluate and implement more ways to reduce risk when it comes to child safety, it takes more than just tech. This is the benefit of Safety by Design: we not only look at ways to respond to harmful incidents, we also design platforms and systems to avoid and mitigate those anticipated risks. This can only be done through open communication, collaboration, and transparency between industry and experts as we share best practices for online safety.

About All Tech Is Human

All Tech Is Human is a non-profit committed to building the world’s largest multistakeholder, multidisciplinary network in Responsible Tech. This allows us to tackle wicked tech & society issues while moving at the speed of tech, leverage the collective intelligence of the community, and diversify the traditional tech pipeline. Together, we work to solve tech & society’s thorniest issues.
