In an interview with TechCrunch, Dr. Rebecca Portnoff of Thorn shares her journey and innovative approaches in harnessing AI for the protection of children against sexual abuse.
Dr. Rebecca Portnoff, the vice president of data science at Thorn, a non-profit organisation that leverages technology to combat child sexual abuse, recently shared insights into her work in an interview with TechCrunch. As part of TechCrunch’s Women in AI series, which aims to highlight the contributions of women in the artificial intelligence field, Dr. Portnoff discussed both her career journey and her current endeavours in using AI for social good.
Dr. Portnoff’s academic background rests on a solid foundation: an undergraduate degree from Princeton University, followed by a PhD in computer science from the University of California, Berkeley. She joined Thorn in 2016 as a volunteer research scientist and has since advanced to lead a highly skilled team dedicated to developing machine learning and AI technologies designed to safeguard children from sexual abuse.
Her interest in the field was sparked during her time at Princeton when a book recommendation from her sister, “Half the Sky” by Nicholas Kristof and Sheryl WuDunn, led her to explore the issue of child sexual abuse and inspired her PhD dissertation, which focused on applying AI and machine learning to combat such abuses.
Dr. Portnoff’s team at Thorn is at the forefront of identifying victims and preventing the spread of child sexual abuse material online. One notable project under her leadership is the Safety by Design initiative, launched in collaboration with All Tech Is Human last year. The project focuses on preventing the misuse of generative AI technologies to harm children, and aims to align industry leaders around shared standards to curb the creation and distribution of abusive material.
The growing sophistication of AI in generating nonconsensual sexual imagery has become a significant issue, and the United States still lacks comprehensive federal regulation addressing it. However, some states, such as Florida, Louisiana, and New Mexico, have enacted laws specifically targeting AI-generated child abuse material. “One in 10 minors report they knew of cases where their peers had generated nude imagery of other kids,” Dr. Portnoff shared, highlighting the urgency of addressing this challenge in AI’s evolution.
Thorn is actively advocating for technology companies to adopt and commit to safety-by-design principles and work transparently to show how they are preventing AI technology misuse in child exploitation. This involves collaboration with expert organisations such as the Institute of Electrical and Electronics Engineers (IEEE) and the National Institute of Standards and Technology (NIST) to establish standards for auditing the industry’s progress. Dr. Portnoff emphasises the importance of legislation that compels all companies and stakeholders to cooperate on this critical issue.
Reflecting on her experiences as a woman in a predominantly male field, Dr. Portnoff recounted instances where her expertise was questioned. She attributes her success to preparation, confidence, and a positive outlook, advising women interested in AI to recognise their worth and contribution to the field. “As ML/AI becomes more integrated into our human systems, all of us need to work together to ensure it’s done in a way that builds up our collective flourishing and prioritizes the most vulnerable among us,” she remarked.
In discussing ethical AI development, she underscored the importance of engaging with a broad range of stakeholders outside the immediate technical community to ensure transparency, fairness, reliability, and safety in AI systems. Dr. Portnoff sees a vital role for investors in the responsible development of AI by evaluating a company’s ethical commitments during the due diligence stage and enforcing standards that prevent harm. She believes that while there is much work to be done, positive change is achievable with concerted effort and collaboration.
Source: Noah Wire Services