Instagram is working on a safety feature that will shield users from unwanted nude photographs in direct messages (DMs). While social media has made communication easier, it has also made harassment easier. Cyberflashing is reportedly one such offence, affecting millions of women worldwide. Instagram's new feature is designed to filter out unsolicited nude photographs so that users are not exposed to graphic content.
Meta confirmed to The Verge that the feature is still in its early stages of development. The new "Nudity Protection" option is comparable to Instagram's "Hidden Words" feature, introduced last year, which automatically filters direct message requests containing offensive content.
New Instagram Safety Feature Arriving Soon: Here’s What To Expect
According to reports, Meta will employ machine learning to detect and block nude photographs sent on Instagram. "We're collaborating with experts to guarantee that these new capabilities protect people's privacy while providing them control over the communications they receive," a Meta spokesperson said.
App developer Alessandro Paluzzi also offered a sneak preview of the feature on Twitter. The screenshot he shared on the microblogging site reads: "Instagram is working on nudity protection for chats. Technology on your device covers photos that may contain nudity in chats. Instagram can't access the photos."
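The screenshot suggests the detection runs entirely on the user's device, so the image never has to be analysed by Instagram's servers. As a purely illustrative sketch of that idea, the snippet below models an on-device filter; the function names, scoring logic, and 0.8 threshold are all hypothetical, since Meta has not published any implementation details.

```python
# Conceptual sketch of on-device DM image filtering (hypothetical:
# Meta has not disclosed how Nudity Protection actually works).
from dataclasses import dataclass


@dataclass
class IncomingImage:
    sender: str
    pixels: bytes  # raw image data; in this model it never leaves the device


def nudity_score(image: IncomingImage) -> float:
    """Stand-in for a local ML classifier (hypothetical).

    A real implementation would run an on-device neural network and
    return a probability that the image contains nudity. Here we use
    a toy heuristic purely so the sketch is runnable."""
    return 0.9 if image.pixels.startswith(b"NSFW") else 0.1


def filter_dm_image(image: IncomingImage, threshold: float = 0.8) -> str:
    """Decide locally whether to cover the image.

    Because the decision is made on the device, the server only ever
    learns the verdict, not the pixels."""
    if nudity_score(image) >= threshold:
        return "blurred"  # shown behind a tap-to-reveal cover
    return "visible"
```

For example, `filter_dm_image(IncomingImage("sender", b"NSFW data"))` returns `"blurred"`, while an ordinary photo passes through as `"visible"`.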
Cases of cyberflashing and bullying on social media networks have risen in recent years, and Instagram is one of the platforms where such incidents have become increasingly common.
According to a YouGov poll conducted in 2017, more than 40 per cent of young women had received unsolicited images of a man's private parts or other unwanted graphic photos. A Glitch poll found that 17 per cent of women received unsolicited pornography during June and July 2020.