Instagram to Blur Nudity in Messages

At this point, we have seen far too many horror stories of minors being sexually exploited and blackmailed after being convinced to send nude photos to abusive people over social media. This practice, which has come to be termed "sextortion," ruins lives and in some cases has led to suicides. Of the many poisons that the advent of the internet has injected into the body of society, this is clearly among the worst. But what, if anything, can be done about it? This week, Instagram, owned by Meta, announced what it believes may be a partial solution: it will begin blurring nude images sent to or from accounts registered to users under the age of 18. However, while the intent may seem laudable, the policy raises a number of questions about free expression, and its potential efficacy appears dubious at best. (Associated Press)

Instagram says it’s deploying new tools to protect young people and combat sexual extortion, including a feature that will automatically blur nudity in direct messages.

The social media platform said in a blog post Thursday that it’s testing out the features as part of its campaign to fight sexual scams and other forms of “image abuse,” and to make it tougher for criminals to contact teens.

Sexual extortion, or sextortion, involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or engages in sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teen boys and young men in Michigan, including one who took his own life, and a Virginia sheriff’s deputy who sexually extorted and kidnapped a 15-year-old girl.

As I said, this may appear to be a step in the right direction at first glance, but a deeper look into the details raises all sorts of questions. First of all, Meta is only applying this technology to direct messages sent on Instagram involving accounts registered to minors. The policy will not apply to messages sent on Facebook or WhatsApp, so anyone wishing to send or receive such images can simply switch platforms. Even on Instagram, children have discovered any number of ways to get around age limitations when accessing online content. This probably won't thwart all that many people.

Even setting those considerations aside, a closer look at the policy reveals that they aren't really blocking or blurring nude images completely. When the system detects a nude image coming to a minor's account, it will be blurred. But the user will receive a warning with an option to view the unblurred image anyway. This may protect some people from an unwanted "surprise" in their direct messages, but anyone who has already consented to such activity, including minors, will be able to quickly skip past the blurring feature. Also, the feature will not be turned on for accounts registered to adults. They will simply receive an advisory to consider turning it on.
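
To lay the flow out plainly, here is a rough sketch of the decision logic as described in the reporting. The account fields, function names, and return strings are illustrative assumptions for the sake of the sketch, not Meta's actual implementation.

```python
# A rough sketch of the blurring flow as described in the reporting.
# Account fields, names, and the blur/override steps are illustrative
# assumptions, not Meta's actual implementation.

from dataclasses import dataclass

@dataclass
class Account:
    age: int                      # self-reported age on the account
    nudity_filter_enabled: bool   # adults can opt in; minors get it by default

def handle_incoming_image(recipient: Account, image_contains_nudity: bool) -> str:
    """Return what the recipient sees for an incoming direct-message image."""
    if not image_contains_nudity:
        return "show image normally"

    # Minors get the blur by default; adults only if they opted in.
    filter_on = recipient.age < 18 or recipient.nudity_filter_enabled
    if filter_on:
        # The image is blurred but not blocked: the recipient is warned
        # and can still tap through to view the original.
        return "blur image, show warning, offer 'view anyway' option"

    # Adults without the filter just get a prompt suggesting they enable it.
    return "show image, advise recipient to consider enabling the filter"

if __name__ == "__main__":
    teen = Account(age=15, nudity_filter_enabled=False)
    adult = Account(age=34, nudity_filter_enabled=False)
    print(handle_incoming_image(teen, image_contains_nudity=True))
    print(handle_incoming_image(adult, image_contains_nudity=True))
```

As the sketch makes clear, nothing in the flow ever actually blocks an image; every path ends with the recipient able to see it.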

Next there is the question of the underlying technology being used to accomplish this. How does the system distinguish nude images from all of the other random selfies and travel photos that people exchange endlessly? Instagram currently has more than two billion users, the vast majority of whom use the direct messaging function. There is no way that human beings are reviewing all of those messages, so the screening must be done with artificial intelligence. How is that AI system being trained? Can it distinguish between a picture of a nude person and someone wearing a skimpy bikini at the beach? Would it flag a photo of Michelangelo's David?
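
To make that concern concrete, here is a minimal sketch assuming a generic image classifier that outputs a nudity-confidence score between 0 and 1 and a cutoff above which an image gets blurred. The threshold value and the scores below are invented stand-ins, not output from any real model; the point is that the edge cases raised above come down to where such scores land relative to the cutoff.

```python
# A minimal sketch of threshold-based filtering, assuming a generic
# classifier that outputs a nudity-confidence score between 0 and 1.
# The threshold and scores below are invented stand-ins, not any real model.

BLUR_THRESHOLD = 0.8  # hypothetical cutoff; where it sits drives the trade-off

# Invented scores for the kinds of edge cases raised above.
sample_scores = {
    "explicit nude photo": 0.97,
    "bikini photo at the beach": 0.62,
    "photo of Michelangelo's David": 0.71,
    "ordinary travel selfie": 0.04,
}

for description, score in sample_scores.items():
    action = "blur" if score >= BLUR_THRESHOLD else "leave alone"
    print(f"{description}: score {score:.2f} -> {action}")

# Lower the threshold and the bikini shot and the statue start getting blurred
# (false positives); raise it and some genuinely explicit images slip through
# (false negatives). How the model is trained determines where those scores fall.
```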

Finally, while the sexual exploitation and sextortion of children are undeniably evil, adults are still supposed to have control over their own bodies and communications. I have consistently maintained that it is unwise to launch naked pictures of yourself out into the wilds of the internet, but everyone has to make such decisions for themselves. Are we really comfortable with a private company deciding what sort of images we can or can't send? This well-intentioned policy seems to be simultaneously ineffective and censorial in a troubling fashion. The true responsibility for policing such online situations should fall to a combination of parents and law enforcement, not Meta.
