Instagram’s Selective Blurring Of Nudity Falls Woefully Short Of Protecting Kids

Instagram is finally taking action against sexual exploitation on its platform, just one day after being called out in the National Center on Sexual Exploitation’s (NCOSE) Dirty Dozen List. Instagram, which is owned by Meta, will use artificial intelligence to automatically blur images of nudity in the direct messages (DMs) of users under 18 years old.

While the new policy may seem like a welcome step in the right direction, it’s far from enough. Minors can still click “view image anyway” and easily bypass the blurring on an explicit direct message, and many children will be tempted to click on a blurred image just to see what it is. In fact, the change differs little from Instagram’s existing policy banning the posting of nude images, which is easily circumvented or overridden.
