The rise of non-consensual intimate imagery (NCII) as a monetized business on the Internet has pushed pornography into the realm of "undressing" anyone you like.
With this latest AI-driven development, nothing is sacred anymore. According to research firm Graphika's report A Revealing Picture, NCII services manipulate existing photos and video footage of real individuals to make them appear nude without their consent. Graphika identified 34 unique NCII providers in existence today, and interest is soaring: their websites have attracted 24 million unique visitors.
According to the report, the volume of referral link spam for these services has increased by 2,000% since the beginning of 2023, with social media platforms serving as the primary medium.
We've talked about sextortion scams that worked with little more than a years-old username and password to convince a would-be victim to pay up. Now consider what this technology can do to those scams when individuals within organizations can be blackmailed with fabricated, illicit image or video content that portrays them in less-than-businesslike situations.
Unfortunately, there's not much a user can do to stop someone from taking images or video posted publicly on the web and pushing them through an NCII service. But users who receive sextortion scams, which may one day include NCII content, can learn to simply ignore them; this is something we teach in our new-school security awareness training. After all, it's far more likely that a scam like this is run in bulk across thousands of users, with only those who respond getting the scammer's attention.
KnowBe4 empowers your workforce to make smarter security decisions every day. Over 65,000 organizations worldwide trust the KnowBe4 platform to strengthen their security culture and reduce human risk.