Sep 6, 2024 - 01:30
Microsoft joins coalition to scrub revenge and deepfake porn from Bing

Microsoft announced it has partnered with StopNCII to help remove non-consensual intimate images — including deepfakes — from its Bing search engine.

When a victim opens a "case" with StopNCII, the service creates a digital fingerprint, also called a "hash," of an intimate image or video stored on that individual's device — without the victim ever needing to upload the file itself. The hash is then sent to participating industry partners, who can search for matching content and remove it from their platforms if it violates their content policies. The process also applies to AI-generated deepfakes of a real person.
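The privacy-preserving idea behind this flow can be sketched in a few lines. The example below is a simplified illustration, not StopNCII's actual implementation: it uses a cryptographic SHA-256 digest, whereas systems like StopNCII rely on perceptual hashes (such as Meta's open-sourced PDQ) that tolerate minor edits like resizing or re-compression. The function and variable names here are illustrative assumptions.

```python
import hashlib


def fingerprint(path: str) -> str:
    """Compute a digest of a file locally; only this short string
    ever leaves the victim's device, never the image itself."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Stream the file in chunks so large videos don't load into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def matches(candidate_hash: str, reported_hashes: set[str]) -> bool:
    """A partner platform checks an uploaded file's hash against the
    shared list of reported fingerprints."""
    return candidate_hash in reported_hashes
```

The key property is that the shared database holds only opaque fingerprints: partners can detect re-uploads of a reported image without anyone ever transmitting or storing the image centrally.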

Several other tech companies have agreed to work with StopNCII to scrub intimate images shared without permission. Meta helped build the tool and uses it on its Facebook, Instagram and Threads platforms; other services that have partnered with the effort include TikTok, Bumble, Reddit, Snap, Niantic, OnlyFans, Pornhub, Playhouse and Redgifs.

Absent from that list is, strangely, Google. The tech giant has its own set of tools for reporting non-consensual images, including AI-generated deepfakes. However, failing to participate in one of the few centralized places for scrubbing revenge porn and other private images arguably places an additional burden on victims to take a piecemeal approach to recovering their privacy.

In addition to efforts like StopNCII, the US government has taken some steps this year to specifically address the harms done by the deepfake side of non-consensual images. The US Copyright Office called for new legislation on the subject, and a group of Senators moved to protect victims with the NO FAKES Act, introduced in July.

If you believe you've been the victim of non-consensual intimate image-sharing, you can open a case with StopNCII here and Google here; if you're under the age of 18, you can file a report with NCMEC here. This article originally appeared on Engadget at https://www.engadget.com/big-tech/microsoft-joins-coalition-to-scrub-revenge-and-deepfake-porn-from-bing-195316677.html?src=rss

Viral News Code whisperer by profession, narrative alchemist by passion. With 6 years of tech expertise under my belt, I bring a unique blend of logic and imagination to ViralNews360. Expect everything from tech explainers that melt your brain (but not your circuits) to heartwarming tales that tug at your heartstrings. Come on in, the virtual coffee's always brewing!