A Danish research group has found that Meta’s Instagram platform may not be providing the safeguards needed to protect vulnerable teenagers from self-harm. The study deemed the platform’s moderation “extremely inadequate”, as none of the content posted during the study was removed.
Over the course of a month, the researchers posted 85 pieces of self-harm-related content, gradually ratcheting up its severity; the images included depictions of gore and razor blades. Instagram removed none of it.
Those working on the project also found that once an account had connected with another in the test group, it was easy to expand the circle. In this way, groups encouraging self-harm could proliferate in a short amount of time.
Instagram – and Meta at large – has been struggling to keep on top of its removal of such content. In March, Lotte Rubæk left the company and said that it was “turning a blind eye” to self-harm content on its apps. While the use of Facebook has begun to dwindle amongst younger users, Instagram remains incredibly popular for teens.
In January, it introduced a new teen-focused layer of protection, including restrictions on direct messages.
Ex-Meta staff shocked at Instagram’s lack of moderation

Speaking with The Guardian, Rubæk said she was shocked that Meta failed to flag any of the content posted. “They have repeatedly said in the media that all the time they are improving their technology… This proves, even though on a small scale, that this is not true.”
Meta has shifted to using artificial intelligence to help moderate its content. While it still employs human reviewers, AI now handles the bulk of the work. The company claims that 99% of this type of content is removed before it ever reaches human eyes, but the findings of the Danish researchers, Digitalt Ansvar, tell a very different story.
Last year, a report from The Washington Post found that the company’s TikTok competitor was regularly serving violent and gore-filled videos to users.
Featured image: Instagram, Pexels