When Leah Juliett was just 14, nude images the teenager sent to a boy on Facebook were shared online, at first with other children at school and then on anonymous internet messaging boards.
Juliett, who identifies as non-binary and uses they/them pronouns, told DailyMail.com: ‘It started on an iPhone. It was then circulated on Facebook.
‘Then my abuse images essentially found their permanent home on an anonymous image board called Anon IB. That website still exists today.’
Now 27, the advocate says the experience was devastating – and inspired their journey to become an activist fighting against what they see as Big Tech’s inaction to prevent image-based child sex abuse.
Leah Juliett, now 27, has become an advocate against abuse enabled by technology
They continued: ‘When this happened to me, when I was 14 years old, I wanted to die. I tried to die. I can’t state that enough. I’ve made it my mission to fight for big tech accountability and survivor justice.’
In 2017, they launched the March Against Revenge Porn across the Brooklyn Bridge, beginning a journey as an advocate against abuse enabled by technology that eventually took Juliett to the White House.
Juliett now campaigns for the Heat Initiative, which aims to make Apple accountable for the spread of abuse images on the company’s iCloud.
They said: ‘I really used my shame as a force for social good. But I’m only 27 years old. I didn’t want or expect this to be my life. When I was little, I wanted to be a singer.
‘But because this is my life, and because it sadly still continues to be for so many vulnerable teenagers and children across our country and around the world, I still very much carry my trauma with me.
‘It’s very much a deeply rooted part of who I am and an integral reason for why I do the work that I do. But I’m stronger now. I have built a toolbox – a toolbox to reclaim the shame that I experienced and use it for good.’
Juliett told this website that since 2017, the language around the subject has changed enormously.
Juliett said: ‘The whole landscape of the [revenge porn] issue has changed from… when I first marched across the Brooklyn Bridge.
‘We don’t use that term anymore. Because there’s nothing that I did to warrant revenge against my body. And non-consensual nudity [is] not pornography.
‘We say image-based sexual abuse and child sexual abuse material. Those are more accurate terms to describe the real crimes that happen to kids every day around the country.’
They added that ‘millions’ of internet users worldwide fall victim to similar abuse, and ‘the phone is the delivery mechanism.’
Key to defeating image-based abuse, they told DailyMail.com, is bipartisan legislation and education.
But Big Tech is also part of the problem.
Juliett said: ‘It is an important moment for us to look upstream, and recognize that we can’t fix the problem at the watering hole. We have to fix it up at the source. And in my work, and in my experience over the last decade as a survivor and an expert in this field, I’ve recognized that source as the iPhone.
‘What people don’t realize is that these tech companies, Apple included, especially Apple, are not just labs of innovation, as Apple often likes to refer to itself; they are companies offering products and services.’
Juliett believes there is little legislation constraining Big Tech, unlike food stores, for example, which are not allowed to sell products that poison people.
They added: ‘They are companies that offer services to people and people are experiencing severe harm at the hands of their products.
‘I personally think that there’s a lot of things that they can be doing to prevent these sorts of harms. And there’s a very clear reason why they don’t and that’s because they continuously choose profit over people.’
Data from the National Center for Missing and Exploited Children (NCMEC) suggested Apple had documented 267 cases of child sexual abuse material (CSAM) worldwide between April 2022 and March 2023.
The number of iPhone users worldwide is estimated to be more than a billion.
When Juliett was 14, nude images they sent to a boy were shared online
Juliett told this website: ‘They could offer a more robust reporting mechanism on their platforms. For instance, we know that Meta has a robust reporting record to the National Center for Missing and Exploited Children.
‘In contrast, Apple does not, in any stretch of the imagination, have nearly as significant of a reporting record. But we know that the abuse is happening in iCloud.’
In 2021, Apple said it would implement ‘NeuralHash,’ an algorithm designed to detect known CSAM in iCloud.
But several months later, the program was paused over privacy concerns.
Juliett said: ‘The most basic thing that they could do today is initiate basic hash-matching detection in iCloud, which basically converts a piece of known CSAM into a unique string of numbers through an algorithm. It makes the image into a digital fingerprint of sorts, which is then compared against a list of other digital fingerprints.
‘They could do that. Initiate that today and save children’s lives today by detecting known child sexual abuse images.’
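The hash-matching process Juliett describes can be sketched in a few lines. This is a simplified illustration only: production systems such as Microsoft’s PhotoDNA or Apple’s proposed NeuralHash use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic SHA-256 hash used here as a stand-in only matches byte-identical files, and the ‘known list’ below is entirely hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Convert a file's bytes into a fixed-length 'digital fingerprint'."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known abuse images; in practice
# such hash lists are maintained by organizations like NCMEC.
known_hashes = {fingerprint(b"example-known-image-bytes")}

def is_known_match(data: bytes) -> bool:
    """Compare an uploaded file's fingerprint against the known list."""
    return fingerprint(data) in known_hashes

print(is_known_match(b"example-known-image-bytes"))  # True: fingerprint is on the list
print(is_known_match(b"some-other-image-bytes"))     # False: no fingerprint match
```

The key point of the design is that only fingerprints, not the images themselves, need to be compared, which is why advocates argue the check can be run without inspecting the content of every file.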
In a company response to the Heat Initiative regarding its reversal, Apple’s director of user privacy and child safety Erik Neuenschwander said: ‘Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it.’
However, he said, after working with privacy and security experts, digital rights groups and child safety advocates, Apple determined it couldn’t proceed with its CSAM-scanning mechanism, even one specifically built to protect privacy.
Neuenschwander wrote: ‘Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit. It would also inject the potential for a slippery slope of unintended consequences.
‘Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.’
Juliett, now 27, said the experience was devastating
DailyMail.com reached out to Apple for comment and was directed to a previous statement from Apple to the Heat Initiative.
The statement said: ‘Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it.
‘We’re proud of the contributions we have made so far and intend to continue working collaboratively with child safety organizations, technologists, and governments on enduring solutions that help protect the most vulnerable members of our society.
‘With respect to helping kids stay safe, we have made meaningful contributions toward this goal by developing a number of innovative technologies.
‘As you note, we decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago, for a number of good reasons.
‘After having consulted extensively with child safety advocates, human rights organizations, privacy and security technologists, and academics, and having considered scanning technology from virtually every angle, we concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.
‘Scanning of personal data in the cloud is regularly used by companies to monetize the information of their users. While some companies have justified those practices, we’ve chosen a very different path — one that prioritizes the security and privacy of our users. Scanning every user’s privately stored iCloud content would in our estimation pose serious unintended consequences for our users.’
Juliett campaigns against image-based abuse in various ways, including through poetry
But Juliett said they will continue to fight.
They told DailyMail.com: ‘I do a lot of my storytelling through poetry. And I will continue using my voice telling my story and screaming my poems… wherever the wind takes me until I see big tech reform.
‘When I started the March against revenge porn in 2016, it felt like a very lonely fight for me. But 10 years later, I realized that I didn’t have to be alone. I don’t have to be alone.
‘I am now a part of an incredible group of survivors and allies. And if I lead the same march today, I know that I would have hundreds of survivors by my side, friends. Being public about my story has been incredibly hard. But I know that this is what I was born for.’