Meta will soon launch a new safety feature that blurs nude images in Instagram messages to protect minors.
The feature will be enabled by default for teenage Instagram users, identified by the birthday information on their accounts. Adult users will receive a notification encouraging them to turn it on.
Nude photos sent in Instagram’s direct messages will be detected and analyzed by on-device machine learning technology. Meta, however, says it will have access to these images only if they are reported.
The feature also reminds young users who receive nude photographs that they are under no obligation to respond, and offers options to block and report the sender.
“This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Meta said.
According to the Wall Street Journal, the blurring feature will undergo testing in the upcoming weeks and be launched globally in the next few months.