I'm a victim of CSAM. Apple's CEO should be testifying about child safety in Washington.

The leaders of five major tech companies have been called before the Senate for a hearing on child safety. Here's why Tim Cook's absence is so frustrating.

If you Google my name, the phrase “revenge porn” is bound to pop up. Go ahead and do it, I’ll wait. There’s a good reason for this association: I led a protest march across the Brooklyn Bridge called “March Against Revenge Porn,” and just a few years ago, I spoke at a TEDx event about my journey from victim to activist.

But today, I’d never use those words to describe my experience. I’m not a victim of “revenge porn” — I’m a victim of child sexual abuse material, or CSAM, and image-based sexual violence, or IBSV.

And these distinctions matter. Using the correct terms is crucial to raising awareness of a problem that is still traumatizing thousands, and to getting lawmakers and tech companies to take action. Pornography is produced by consenting adults and has nothing in common with CSAM, which depicts the sexualization, rape, and assault of children and teenagers.

As I’ve come to understand how to accurately categorize my abuse, I’ve also learned more about the broader landscape of CSAM. It’s not an exaggeration to describe this as an epidemic of harm that is shattering childhoods worldwide. In 2022, the National Center for Missing and Exploited Children’s CyberTipline received 32 million reports of CSAM. The vast majority of those reports were submitted by Google, WhatsApp, Facebook, Instagram and Omegle. (Omegle shut down in 2023.) In 2023, NCMEC says it received a record 36 million reports.  

Missing from this list is Apple and its almost entirely unregulated iCloud.

In 2022, while other large tech companies reported and worked to remove millions of pieces of CSAM, Apple reported just 234 pieces of content. That’s because, unlike many of its competitors, the company refuses to voluntarily monitor for such content and flag it when it is uploaded. Reporting indicates that Apple’s disclosures are merely the tip of an iceberg of abuse happening under the surface.

When I was a teenager, nude photos of me were shared on the internet, many of which were passed around on iPhones and, given their proliferation, likely stored in iCloud.

To this day, many of these photos remain online. 

Like many survivors, I live with complex post-traumatic stress disorder from compounded childhood trauma — trauma that was exacerbated by the public shaming that I endured after identifying as a survivor.

In 2021, I felt hopeful when Apple announced that it was taking action to flag and remove child sexual abuse material from iCloud. Maybe this would mean my images would no longer be circulated, I remember thinking. Maybe this would be the end of the abuse that I suffered at the hands of the boys who first exploited me and the adults who found — and continue to find — sexual gratification in images of my childhood body.

But the company was criticized by privacy advocates and others. Thousands of people signed a petition against the moves. Unfortunately, rather than finding a way to address these concerns — while still monitoring for child abuse — the company simply abandoned its proposal.

In September 2023, Erik Neuenschwander, Apple’s director of user privacy and child safety, sent a letter to child safety activists attempting to explain the company’s actions and claiming there was no practical way to search iCloud photos for CSAM without “imperiling the security and privacy of our users.” The result could be a “slippery slope,” he wrote. “Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems,” he continued, while calling child sexual abuse material “abhorrent.”

But I know tech companies can do better.

I recently traveled to Washington, D.C., with other child sexual violence survivors and advocates to urge lawmakers to hold companies accountable for the role they play in the dissemination of CSAM. I was gratified by the positive response from legislators — and I’m hopeful that they will help us change a system that is enabling further victimization.

This is especially urgent as the leaders of five major tech companies testify in front of the Senate today about child safety on their platforms. I will be in attendance as a reminder of the human cost of negligence. Although Apple CEO Tim Cook was not asked to testify, the hearing is a necessary step. Without action, companies risk complicity.

And Apple is one of many incredibly influential companies that have the power to prevent generations of users from experiencing the same trauma that I endured. 

When I was a teenager, technology did not exist to protect me. Today, that technology exists — and it’s time we put it to use. Not just for me, but for young people everywhere whose bodies have been stolen from them, and stored out of reach.
