
Instagram plans to blur nudes DMed to teens, hoping to fight sextortion scams

Apr 11, 2024 | Hi-network.com

[Image: Instagram blurring a nude photo sent in a message. Credit: Meta]

Meta is taking more steps to try to combat sextortion crimes that target teenagers on Instagram. The company on Thursday announced plans to test blurring nude photos in DMs sent to teens and prevent potential scammers from even contacting teenagers.

By default, the nudity protection feature would be turned on globally for messages sent to teens under 18. Instagram also plans to display a notification for adults urging them to enable the feature. Further, anyone who tries to send a nude photo will be asked to exercise caution when sharing such images and told that they can unsend the photos if they change their mind.

With sextortion and other scams affecting people on social media, Meta hopes the blurring feature will protect teens and adults from seeing unwanted nudity in their messages. The feature is also designed to specifically safeguard teenagers from scammers who send nude photos as a way to trick them into responding with nude pictures of themselves.

Also: Instagram DMs get an edit feature and Threads gets gestures

Anyone who receives a message with a blurred nude image will be shown a warning screen, allowing them to choose whether or not to view the unblurred version. Recipients will also see a message telling them not to feel any pressure to respond, along with options to block and report the sender.

People who send or receive images with nudity will see safety tips about the possible risks. For example, malicious individuals may screenshot or forward such images to others, a tactic used in sextortion scams. The tips will also provide links to different resources, including Meta's Safety Center, support helplines, StopNCII.org for adults over 18, and Take It Down for teens under 18.

Detecting nudity in an image is quite the technical challenge. Instagram's nudity protection feature will use on-device machine learning to analyze the image, Meta said. Since the images are evaluated on the device and not online, the feature will even work with end-to-end encrypted chats.
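Meta has not published how its classifier works, so the following is only a minimal sketch of what on-device nudity detection could look like, assuming a hypothetical pre-trained binary classifier shipped with the app as a TorchScript file. The model file name, input size, and decision threshold are all illustrative, not Meta's.

```python
# Hypothetical sketch of on-device nudity classification -- NOT Meta's actual
# implementation. Assumes a pre-trained binary classifier exported to
# TorchScript ("nudity_classifier.pt") that ships with the app.
import torch
from torchvision import transforms
from PIL import Image

PREPROCESS = transforms.Compose([
    transforms.Resize((224, 224)),          # assumed model input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def should_blur(image_path: str,
                model_path: str = "nudity_classifier.pt",
                threshold: float = 0.8) -> bool:
    """Classify the image locally; it never leaves the device, which is why
    the check can still run inside end-to-end encrypted chats."""
    model = torch.jit.load(model_path, map_location="cpu")
    model.eval()
    img = Image.open(image_path).convert("RGB")
    batch = PREPROCESS(img).unsqueeze(0)    # shape: (1, 3, 224, 224)
    with torch.no_grad():
        # Assumes the model emits a single logit; sigmoid maps it to a probability.
        prob_nude = torch.sigmoid(model(batch)).item()
    return prob_nude >= threshold           # blur only above the threshold
```

Because the decision is made from pixels already present on the recipient's device, no plaintext content needs to be sent back to a server, which is what keeps the feature compatible with end-to-end encryption.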

Meta said that it's also working on a way to use certain signals to identify accounts that may be engaging in sextortion scams. Though the signals wouldn't necessarily be proof of such a scam, the idea is to take precautionary steps to keep such accounts away from teenagers. Specifically, Instagram would put any message requests from potential sextortion accounts into the recipient's hidden requests folder (where they would remain unseen), show Safety Notices to teens already exchanging messages with possible sextortion accounts, and hide the Message button on teenagers' profiles from potential sextortion accounts. A rough sketch of this pattern follows.
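Meta has not disclosed which signals it uses, so the sketch below only illustrates the general pattern the announcement describes: soft signals that are not proof of wrongdoing trigger precautionary friction rather than removal. Every field and threshold here is a made-up example, not anything Meta has confirmed.

```python
# Illustrative only: hypothetical signals mapped to the precautionary steps
# the article lists. None of these signals or thresholds come from Meta.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    recently_created: bool       # hypothetical signal
    mass_messages_teens: bool    # hypothetical signal
    prior_reports: int           # hypothetical signal

def precautions_for(signals: AccountSignals) -> list[str]:
    """Return the precautionary steps to apply; an empty list means none."""
    suspicious = (
        (signals.recently_created and signals.mass_messages_teens)
        or signals.prior_reports >= 3        # arbitrary illustrative threshold
    )
    if not suspicious:
        return []
    return [
        "route message requests to the hidden requests folder",
        "show Safety Notices in existing chats with teens",
        "hide the Message button on teen profiles",
    ]
```

The key design point is that the output is friction applied around teen interactions, not a ban, which matches Meta's framing that the signals alone are not proof of a scam.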

Also: 41 US states sue Meta over claims that Instagram and Facebook are harmful

Meta said that it's also testing new pop-up messages for users who may have chatted with an account since removed for sextortion. The messages would direct people to contact a friend and to various resources, including the Stop Sextortion Hub, support helplines, StopNCII.org for those over 18, and Take It Down for those under 18.

Given that AI and ML are far from perfect, one has to wonder: what signals is Meta feeding its system for detecting possible sextortion accounts? And how will people falsely flagged this way be able to appeal? In its post, Meta didn't explain exactly how its identification process would work.
