Instagram will test nudity protection in messages to fight sextortion
Instagram is far from a shining example of protecting young people online, having failed to prevent its algorithm from promoting child sexual abuse material. But new features bring some hope, however modest, that the platform could become a bit safer. Meta announced it's rolling out new tools meant to protect users against intimate image abuse and sextortion, in which a person is digitally blackmailed under threat of sharing intimate media.
One of the most significant updates is that nudity protection is coming to private messages. Meta first confirmed it was building this technology back in 2022, and the tool will be switched on by default for users under 18. Once enabled, an on-device machine learning model will detect images it suspects contain nudity and blur them before the recipient sees them. Because the analysis happens on the user's device, messages should remain end-to-end encrypted without Meta ever having access to them. Recipients will still have the option to view the image, alongside a pop-up message from Meta reminding them that they shouldn't feel pressured to respond, a safety tips button and an option to block the sender.
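Meta hasn't published how its classifier works, but the general pattern it describes (running the check locally on the recipient's device so the encrypted message never needs to be inspected by a server) can be sketched roughly. The Python below is only an illustration: the nudity_score stub stands in for whatever on-device model Meta actually uses, and the prepare_for_recipient helper and the 0.8 threshold are hypothetical names chosen for the example.

```python
# Illustrative sketch only; not Meta's implementation.
# Assumes a hypothetical on-device classifier, stubbed out here so the
# example runs end to end.
from PIL import Image, ImageFilter  # pip install Pillow


def nudity_score(image: Image.Image) -> float:
    """Placeholder for an on-device ML classifier.

    A real system would run a small vision model locally and return the
    probability that the image contains nudity. Hard-coded here to keep
    the sketch self-contained.
    """
    return 0.92  # pretend the local model flagged this image


def prepare_for_recipient(path: str, threshold: float = 0.8) -> Image.Image:
    """Blur a received image locally when the classifier flags it.

    Because the check runs on the recipient's device, the message's
    contents never have to be exposed to a server for scanning.
    """
    image = Image.open(path)
    if nudity_score(image) >= threshold:
        # Show a heavily blurred preview; the original file is untouched,
        # so the user can still choose to view it from the warning screen.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```

The point of the design, as Meta frames it, is that the decision to blur is made entirely on-device, which is what lets the feature coexist with end-to-end encryption.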
Meta's new tool — which it will start testing "soon" — also detects if a person is sending a nude image and warns them to "take care when sharing sensitive photos" while outlining potential risks. Plus, it reminds users that they can delete a message before anyone sees it. Then there's the final warning: a reminder to be responsible and respectful appears when someone tries to forward a message with detected nudity (though it still lets the image be forwarded).
Then there are the tools designed to detect potential scammers or sextortionists and make it more difficult for them to approach teens. Message requests from these possible bad actors should now go to hidden requests, and anyone already in a conversation with one will see a warning with reminders about setting boundaries and steps for reporting the account. As for young people, Meta previously barred people from messaging users 16 or under if they weren't mutually connected, even if the other account claimed to be the same age. Now, these potential scammers won't see the option to message a teen even if the two accounts follow each other.