Alarms are blaring about artificial intelligence deepfakes that manipulate voters, like the robocall sounding like President Biden that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.
Yet there’s actually a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike.
One recent study found that 98 percent of deepfake videos online were pornographic and that 99 percent of those targeted were women or girls.
Francesca Mani, a 14-year-old high school sophomore in New Jersey, told me she was in class in October when the loudspeaker summoned her to the school office. There she learned that a faked naked image of her had been circulated; the boys responsible had made naked images of a number of other sophomore girls as well.