Aiming to reduce minors' exposure to self-harm and disturbing content on its platform, Instagram has rolled out a "sensitivity screen" feature that blurs questionable pictures and video thumbnails in the app until the user opts to view them.
The feature is already available to select Indian users, blocking images of cutting and self-harm that could surface in search results, recommendations or hashtags and put minors at risk of physical harm.
The move comes soon after British Health Secretary Matt Hancock warned Instagram to improve protections for young people on its platform or face legal action.
Head of Instagram Adam Mosseri announced the introduction of "sensitivity screens" in an op-ed he wrote for The Telegraph, where he also expressed grief over the suicide of British teenager Molly Russell.
Russell's parents have blamed Instagram for exposing their daughter to self-harm and suicide-related content.
‘We are not yet where we need to be on issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe. We already offer help and resources to people who search for such hashtags, but we are working on more ways to help,’ Mosseri wrote in the op-ed.
The photo-sharing app has also begun removing inauthentic likes, follows and comments from accounts that use third-party apps to boost their popularity. The company uses machine learning tools to identify accounts that use such services.
While Instagram is not removing likes or followers that accounts have already gained, the company wants to prevent this fake behaviour from happening in the future.