Instagram is developing a nudity filter for direct messages
Instagram has been working on a way to block unsolicited nude pictures sent through direct messages, using a “nudity protection” filter to screen out and hide such photos.
Although the social media giant confirmed that the feature was being developed, it did not give a time frame. “We are working closely with experts in order to ensure these new features preserve individuals’ privacy while giving them control of the messages they receive,” said a spokesperson from Meta, Instagram’s parent company.
Instagram says the “nudity protection” feature will be optional. It will work similarly to Hidden Words, an opt-in feature Instagram introduced last year that filters abusive DM requests based on keywords and diverts them to a hidden folder so that users don’t have to see them.
The company confirmed the feature after Alessandro Paluzzi, an app researcher, posted a screenshot of the in-development message-protection functions on Twitter earlier this week. According to the Instagram notice in the screenshot, “Technology on your phone covers photos that may contain nudity in chats. Instagram can’t access the photos.” Instagram will hide photos if it detects nudity in a DM unless the user opts to view them.
A study released in April by the non-profit Center for Countering Digital Hate analyzed direct messages sent to five high-profile women, including actor Amber Heard, who was the target of hateful attacks online during her legal battle with Johnny Depp. The study found 125 instances of image-based sexual abuse, and according to the organization, Instagram failed to respond to any of them within 48 hours.
In a 2020 study conducted by Indiana University’s Kinsey Institute, nearly half of the women surveyed reported having received an unsolicited picture of male genitals.