Constantly expanding its scope, Instagram continues to offer new features – sometimes even taking a step back after users’ reactions. Recently, the social media giant has once again caused controversy with its “Sensitive Content Control” feature.
Instagram typically labels photos that don’t violate its Community Guidelines but might upset some people or be traumatic as sensitive content. But how does Instagram decide which photos are sensitive or inappropriate?
The answer is that an advanced artificial intelligence system checks photos against the instructions it has been given. In addition to this algorithm, more than 15,000 content reviewers working for Facebook and Instagram also flag thousands of photos one by one every day. The photos selected appear in pixelated form with a sensitive-content tag on users’ Explore or home page. With the new feature that Instagram introduced on July 20, users over the age of 18 can now choose to view this sensitive content without warnings or censorship.
Users now have the choice between three options: Allow, Limit, and Limit Even More. But the filter is not to everyone’s liking, especially content creators.
“Over the past 24 hours, I’ve had many conversations with artists and other creators who are incredibly upset to see their work hidden,” wrote Phillip Miner, artist and creator of the queer hobby magazine Natural Pursuits, in a post. “Conversely, people are frustrated that they cannot find the content they want to see.” Miner pointed out that grouping all sensitive content under a single label is the biggest problem, and that the platform should let users customize the filter to their individual interests.
In a thread on Twitter, Instagram detailed how Sensitive Content Control works and explained that its purpose is to personalize users’ experience on the platform. According to Instagram, the feature is no cause for concern for content creators, since the social network already limited such content by default before the filter was introduced.
This wasn’t Instagram’s first controversial decision on censorship. Its algorithm had already been criticized in recent months for racist decisions and double standards.
In May, Instagram came under severe criticism for removing pro-Palestine content during the conflict affecting the Gaza Strip and neighboring territories. Many stories highlighting the conflict disappeared due to what Instagram described as a glitch – a failure of its automated systems.
In the midst of the escalating violence in Gaza, pro-Palestinian posts about the eviction of Palestinians living in Sheikh Jarrah, a neighborhood in east Jerusalem, were hidden on the social network for hours. Adam Mosseri, the head of Instagram, apologized on Twitter for what happened, attributing it to a technical error.
That same day, Instagram also deleted many stories posted by activists who demonstrated in support of the national awareness day for missing and killed Native American women. Throughout the day, families, friends, supporters and nonprofits posted illustrations and messages, as well as photos and stories of loved ones who have disappeared or been murdered.
However, once again, Instagram deleted the posts.
In response, Instagram repeatedly insisted there was no censorship, only a misbehavior of the algorithm that decides which content is shown to users. The social network said the algorithm is programmed to give more weight to original content in Stories than to reshared content.
As the conflict continued in the Gaza Strip in mid-May, a group of Instagram employees formally protested that many more pro-Palestinian posts were not visible on the platform. They argued that Instagram’s algorithm discriminates against posts made by Arabs and Muslims, citing the mistaken removal of several posts as an example. Although they acknowledged that the censorship was not deliberate, they maintained that moderation at scale is biased against marginalized groups.
After these allegations, Instagram confirmed that it would introduce a series of changes to its algorithm. From now on, Instagram will rank Stories giving equal weight to a user’s original posts and to content reshared from others.
What do you think about the removal of these posts and stories? Was it intentional or was it a glitch? What about Sensitive Content Control? Do you think it reduces the visibility of content creators? Let us know which side you’re on in the comments section below and don’t forget to follow us on our socials!