Instagram is working on a tool to filter out nude photos sent over DMs. Earlier this week, app researcher Alessandro Paluzzi spotted the development of a nudity protection filter. He shared a screenshot in a tweet showing the feature's description: “Technology on your device covers photos that may contain nudity in chats. Instagram CAN’T access photos.”

Instagram Confirms the Development of Nudity Protection Filter

Instagram spokesperson Liz Fernandez confirmed the development: “We’re developing a set of optional user controls to help people protect themselves from unwanted DMs, like photos containing nudity.” She added: “This technology doesn’t allow Meta to see anyone’s private messages, nor are they shared with us or anyone else. We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive.”

A study by the Center for Countering Digital Hate analyzed direct messages sent to five high-profile women: actor Amber Heard, Sharan Dhaliwal, Bryony Gordon, Rachel Riley, and Jamie Klingler. The researchers logged abuse sent by 253 accounts and reported them through the Instagram app or website. A follow-up audit found that 227 of those accounts remained active at least a month after the reports were filed.

Keyword-Based Filters

Last year, Instagram launched a feature called ‘Hidden Words’ that lets users filter abusive messages in DM requests based on keywords. If a message request contains any filter word you’ve chosen, it’s automatically moved to a hidden folder that you never have to open. The feature was widely welcomed, but it doesn’t address unsolicited nude photos, which have become one of the biggest problems on social media. A study by London College found that 75.8 percent of 150 young people aged 12-18 had been sent unsolicited nude images.
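To illustrate the general idea behind a keyword-based filter like Hidden Words (this is a hypothetical sketch, not Instagram's actual implementation), a message request can simply be routed to a hidden folder whenever it contains any of the user's chosen filter words:

```python
# Illustrative sketch of keyword-based DM filtering.
# The word list and routing logic are hypothetical examples, not Meta's code.

HIDDEN_WORDS = {"spam", "scam"}  # hypothetical user-chosen filter words


def route_dm_request(text: str, hidden_words: set = HIDDEN_WORDS) -> str:
    """Return the folder a DM request should land in: 'hidden' or 'requests'."""
    # Normalize to lowercase and strip common punctuation before matching.
    words = {w.strip(".,!?").lower() for w in text.split()}
    return "hidden" if words & hidden_words else "requests"


print(route_dm_request("Huge SCAM offer just for you"))  # -> hidden
print(route_dm_request("Hi, loved your latest post"))    # -> requests
```

The nudity filter Paluzzi spotted works differently: rather than matching text, it would use on-device image detection, so the photo never needs to leave the phone for analysis.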

