Instagram's recommendation algorithms have been connecting and promoting accounts that facilitate and sell child sexual abuse content, according to an investigation published Wednesday.
Meta's photo-sharing service stands out from other social media platforms and "appears to have a particularly severe problem" with accounts showing self-generated child sexual abuse material, or SG-CSAM, Stanford University researchers wrote in an accompanying study.
Alex Stamos, a former Meta chief security officer who is now director of the Stanford Internet Observatory, said a similar problem has persisted on Twitter since Elon Musk acquired the platform late last year.
"They then cut off our API access," he added, referring to the software that lets researchers access Twitter data to conduct their studies.
Earlier this year, NBC News reported that multiple Twitter accounts offering or selling CSAM had remained available for months, even after Musk pledged to address problems with child exploitation on the service.