Google to Use Hashes from StopNCII to Remove Nonconsensual Intimate Imagery from Search Results
Google has announced a new initiative to combat the spread of nonconsensual intimate imagery (NCII) through its search engine, partnering with StopNCII.org to implement a proactive detection system. Over the coming months, Google will begin using hashes provided by StopNCII to identify and remove such content from search results. Hashes, unique digital fingerprints generated from images and videos, allow Google to detect abusive material without storing or sharing the content itself. StopNCII uses the PDQ algorithm for images and MD5 for videos to create these identifiers, enabling efficient, privacy-preserving detection across the web.

The move comes amid growing criticism that Google has been slower than other major tech platforms to adopt such measures. In its announcement, Google acknowledged the urgency of the issue: “We have also heard from survivors and advocates that given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it.”

Other companies have already taken similar steps: Facebook, Instagram, TikTok, and Bumble joined StopNCII as early as 2022, and Microsoft integrated the system into Bing in September 2024. Google’s latest effort marks a significant step in its commitment to protecting users from the harms of NCII, particularly as search engines remain a common pathway for the unauthorized distribution of such content.
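The hash-matching approach described above can be sketched minimally in Python using MD5, the algorithm StopNCII applies to videos. This is an illustrative sketch, not Google's or StopNCII's actual implementation; the hash list and file bytes below are hypothetical:

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """Return the MD5 hex digest of raw file bytes."""
    return hashlib.md5(data).hexdigest()

def matches_hash_list(data: bytes, known_hashes: set[str]) -> bool:
    """Check a file's fingerprint against a shared hash list.

    Only digests are compared, so the checking party never needs to
    store or transmit the underlying image or video itself.
    """
    return md5_hex(data) in known_hashes

# Hypothetical hash list of the kind a service like StopNCII distributes
known = {md5_hex(b"example-video-bytes")}

print(matches_hash_list(b"example-video-bytes", known))    # matching file
print(matches_hash_list(b"unrelated-video-bytes", known))  # non-matching file
```

Note that MD5 is an exact-match digest: changing a single byte of the file yields a completely different hash. Perceptual hashes such as PDQ, which StopNCII uses for images, instead produce similar fingerprints for visually similar images, so they can tolerate minor edits like resizing or recompression.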
