Search results for: "NCMEC"


13 mentions found


Instagram to crack down on teen sextortion
  2024-04-11 | by Samantha Murphy Kelly | edition.cnn.com | time to read: +4 min
The company announced on Thursday it is testing new features to curb an alarming trend called financial sextortion, which often targets kids and teenagers. Scammers coax victims into sending nude images, then claim they’ll post them online, either on public websites or on feeds where their friends will see them, unless the victims send money or gift cards. In the coming weeks, and among a subset of users, Instagram said it will roll out new features such as blurring nude images sent in direct messages and informing users when they’ve interacted with someone who engaged in financial sextortion. The FBI recently said it has seen an increase in financial sextortion cases initiated by strangers, often scammers overseas. Meta said it is also working on ways to identify accounts that may be engaging in sextortion scams by detecting and monitoring likely sextortion behavior.
The NCMEC has not yet published the total number of child abuse content reports from all sources that it received in 2023, but in 2022 it received reports of about 88.3 million files. "We are receiving reports from the generative AI companies themselves, (online) platforms and members of the public. It's absolutely happening," said John Shehan, senior vice president at NCMEC, which serves as the national clearinghouse to report child abuse content to law enforcement. Content flagged as AI-generated is becoming "more and more photorealistic," making it challenging to determine if the victim is a real person, said Fallon McNulty, director of NCMEC's CyberTipline, which receives reports of online child exploitation. OpenAI, creator of the popular ChatGPT, has set up a process to send reports to NCMEC, and the organization is in conversations with other generative AI companies, McNulty said.
NCRI, a nonprofit, found cybercriminals used the social apps Instagram, Snapchat and Wizz to find and connect with their marks. Social media platforms should include a distinct category to report sextortion, as Snapchat did in early 2023. Parents and educators should "combat the belief that photos sent on Snapchat disappear, which can create a false sense of security," the NCRI study recommends. The NCRI study also strongly criticized Wizz, concluding: "Sextortion on Wizz is pervasive and dangerous." Apple's App Store and Google Play can also help, the NCRI study suggested, by carefully monitoring complaints about sextortion associated with social media apps and enforcing their existing policies.
A former Pornhub video moderator said content that should've been removed "stayed up for months." The ex-MindGeek worker made the claims in the new Netflix documentary Money Shot: The Pornhub Story. If you're a victim of exploitation and in need of support, please contact the National Center for Missing and Exploited Children online at https://www.missingkids.org/. A former MindGeek moderator said many Pornhub videos that should've been taken down remained on the platform for months. "Many videos that should have been taken down stayed up for months," he said in a new Netflix documentary, Money Shot: The Pornhub Story, which is being released Wednesday. "Any insinuation that we do not have enough moderators to thoroughly review all uploaded content is categorically false."
Meta is cracking down on revenge porn targeting children under the age of 18 on its platforms. It funded a new tool for removing explicit images online, released by a child protection organization. Over 20 million images of child sexual abuse material were detected on Facebook and Instagram in 2020. Users can select images and videos on their devices that they don't want to be posted online, or that have already been posted online.
CNN —Meta is taking steps to crack down on the spread of intimate images of teenagers on Facebook and Instagram. To create a hash of an explicit image, a teen can visit the website TakeItDown.NCMEC.org to install software onto their device. Meanwhile, President Biden demanded in his latest State of the Union address more transparency about tech companies’ algorithms and how they impact their young users’ mental health. Meta recommends teens who have multiple copies of the image or edited versions make a hash for each one. “There’s no one panacea for the issue of sextortion or the issue of the non-consensual sharing of intimate images,” Davis said.
Federal law enforcement officers are cracking down on a scheme that aims to extort sexual imagery from children and teens after a dramatic increase in incidents over the past year. Sometimes, a predator shares imagery regardless of whether a victim meets payment demands, according to federal officials. Law enforcement officials say prevention is the best weapon against sextortion. The sextortion cycle generally ends when a victim tells an adult or the offender is discovered by law enforcement. "We will continue to partner with federal, state and local law enforcement to protect children from sexual exploitation in all its despicable forms."
Meanwhile, Twitter’s resources to fight child sexual exploitation content online (and what is sometimes called child pornography or child sexual abuse materials) are thin, following layoffs, mass-firings and resignations from the company. Child sexual exploitation content has remained a problem for Twitter, though most major social media platforms continue to deal with it in some form or another. Moderation of this content usually relies on a combination of automated detection systems and specialized internal teams and external contractors to identify child abuse content and remove it. “So, I mean, that is disheartening.” It’s unclear how many Twitter employees remain to work on child safety issues. A search on LinkedIn for current Twitter employees who say they work on child safety turned up only a few accounts.
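The automated-detection side of the moderation pipeline mentioned above typically works by matching uploads against databases of hashes of known abuse material, with matches routed to human reviewers. A minimal sketch of that matching step, under stated assumptions: the hash set and `should_flag` helper are hypothetical names, and real platforms use perceptual hashing against industry-shared databases rather than the exact SHA-256 match shown here.

```python
import hashlib

# Hypothetical hash list. The single entry is the SHA-256 of empty input,
# used purely as a placeholder; real systems match against large shared
# databases of known-abuse hashes.
KNOWN_ABUSE_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def should_flag(upload_bytes: bytes) -> bool:
    """Flag an upload for human review if its digest matches a known hash."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES
```

Hash matching only catches previously identified material, which is why platforms pair it with the specialized review teams the snippet describes.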
Less than a month after taking control of Twitter, Elon Musk said addressing child sexual exploitation content on the social media platform was "Priority #1." "It is a crime that they refused to take action on child exploitation for years!" Meanwhile, Twitter's resources to fight child sexual exploitation content online (and what is sometimes called child pornography or child sexual abuse materials) are thin, following layoffs, mass-firings and resignations from the company. Twitter's imperfect efforts fighting child sexual exploitation content were well documented. Stroppa said he felt Twitter's previous efforts were lacking and that it now moves quickly to find and suspend accounts that post child sexual exploitation content.
CNN —Apple is abandoning its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM) following backlash from critics who decried the feature’s potential privacy implications. Apple first announced the feature in 2021, with the goal of helping combat child exploitation and promoting safety, issues the tech community has increasingly embraced. Apple was criticized in 2021 for its plan to offer a different tool that would start checking iOS devices and iCloud photos for child abuse imagery. Many child safety and security experts praised the attempt, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also called the efforts “deeply concerning,” stemming largely from how part of Apple’s checking process for child abuse images is done directly on user devices.
Crypto exchanges enabled online child sex-abuse profiteer
  2022-11-23 | www.reuters.com | time to read: +22 min
These sites often included links for users to pay via crypto exchanges, the IWF told Reuters, declining to name companies. “For those people looking to make money from child sexual abuse, crypto has lowered the barrier,” said Dan Sexton, the IWF’s chief technology officer. The Dark Scandals website, owned by Michael Mohammad, instructs users to send tokens to a Dark Scandals digital wallet to purchase content. While banks and payment platforms demanded more details from online merchants, many crypto exchanges for years requested little or no information from clients. The IWF received more reports last year of websites selling child abuse imagery for crypto than any year prior.
The U.S. Justice Department, in a report this September, said many crypto exchanges still "make little or no effort to comply" with know-your-customer requirements. These sites often included links for users to pay via crypto exchanges, the IWF told Reuters, declining to name companies. While banks and payment platforms demanded more details from online merchants, many crypto exchanges for years requested little or no information from clients. Asked at his trial for his opinion of crypto, Mohammad noted, "Privacy is something that a lot of users value." The IWF received more reports last year of websites selling child abuse imagery for crypto than any year prior.
Instagram is rolling out a feature to help locate missing children. Shareable AMBER Alerts will display in a user's feed when the user is within the search area for a missing child. The alerts are triggered by law enforcement and cover only the area where there is an active search. "Photos are the most important tool in the search for missing children," said John Bischoff of the Missing Children Division at the National Center for Missing & Exploited Children.
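The "within the search area" check described above is a geofencing problem: decide whether a user's location falls inside a circular alert region. A minimal sketch using the haversine great-circle distance, under stated assumptions: the function name and circular-region model are illustrative only, since law enforcement defines the actual alert areas and Instagram's delivery logic is not public.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius, spherical approximation

def within_search_area(user_lat: float, user_lon: float,
                       alert_lat: float, alert_lon: float,
                       radius_km: float) -> bool:
    """True if the user's location is inside the alert's circular search area.

    Uses the haversine formula for great-circle distance on a sphere.
    """
    dlat = radians(alert_lat - user_lat)
    dlon = radians(alert_lon - user_lon)
    a = (sin(dlat / 2) ** 2
         + cos(radians(user_lat)) * cos(radians(alert_lat)) * sin(dlon / 2) ** 2)
    distance_km = 2 * EARTH_RADIUS_KM * asin(sqrt(a))
    return distance_km <= radius_km
```

Only users for whom this check passes would see the alert, which matches the snippet's point that alerts cover only the area of an active search.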
Total: 13