Dear Taylor Swift, we’re sorry about those explicit deepfakes

I can only imagine how you must be feeling after sexually explicit deepfake videos of you went viral on X. Disgusted. Distressed, perhaps. Humiliated, even.

I’m really sorry this is happening to you. No one deserves to have their image exploited like that. But if you aren’t already, I’m asking you to be furious.

Furious that this is happening to you and to so many other women and marginalized people around the world. Furious that our current laws are woefully inept at protecting us from violations like this. Furious that men (because let’s face it, it’s mostly men doing this) can violate us in such an intimate way and walk away unscathed and unidentified. Furious that the companies that enable this material to be created and shared widely face no consequences either, and may even profit from such a horrendous use of their technology.

Deepfake porn has been around for years, but its latest incarnation is its worst yet. Generative AI has made it ridiculously easy and cheap to create realistic deepfakes. And nearly all deepfakes are made for porn. A single image plucked off social media is enough to generate something passable. Anyone who has ever posted or had a photo of themselves published online is a sitting duck.

First, the bad news: right now, we have no good ways to fight this. I just published a story on three ways we can combat nonconsensual deepfake porn, which include watermarks and data-poisoning tools. But the fact is that there is no neat technical fix for this problem. The fixes we do have are still experimental and haven’t been adopted widely by the tech sector, which limits their power.

The tech sector has so far been unwilling or unmotivated to make changes that would prevent such material from being created with their tools or shared on their platforms. That is why we need regulation.

People with power, like yourself, can fight back with money and lawyers. But low-income women, women of color, women fleeing abusive partners, women journalists, and even children are all seeing their likeness stolen and pornified, with no way to seek justice or support. Any one of your fans could be hurt by this development.

The good news is that the fact that this happened to you means politicians in the US are listening. You have a rare opportunity, and momentum, to push through real, actionable change.

I know you fight for what is right and aren’t afraid to speak up when you see injustice. There will be intense lobbying against any rules that would affect tech companies. But you have a platform and the power to convince lawmakers across the board that rules to combat these kinds of deepfakes are a necessity. Tech companies and politicians need to know that the days of dithering are over. The people creating these deepfakes must be held accountable.

You once caused an actual earthquake. Winning the fight against nonconsensual deepfakes would have an even more earth-shaking impact.
