New York CNN — Meta failed to remove an explicit, AI-generated image of an Indian public figure until it was questioned by its Oversight Board, the board said Thursday in a report that calls on the tech giant to do more to address non-consensual, nude deepfakes on its platforms.
The report is the result of an investigation the Meta Oversight Board announced in April into Meta’s handling of deepfake pornography, including two specific instances where explicit images were posted of an American public figure and an Indian public figure.
In the case of the Indian public figure, however, the image was reported to Meta twice, yet the company did not remove it from Instagram until the Oversight Board took up the case.
“Meta determined that its original decision to leave the content on Instagram was in error and the company removed the post for violating the Bullying and Harassment Community Standard,” the Oversight Board said in its report.
The push to fight non-consensual deepfakes is just part of Meta’s larger efforts to prevent the sexual exploitation of its users.