Instagram lets users filter out abusive messages


Instagram has announced the launch of a tool to enable users to automatically filter out abusive messages from those they do not follow on the platform.

It follows a number of footballers speaking out about experiencing racist, sexist and other abuse on Instagram.

Direct messages (DMs) containing words or emojis deemed offensive will be removed from view.

The tool will be available in the UK, France, Ireland, Germany, Australia, New Zealand and Canada within weeks.

More countries will then receive the relevant update in the coming months.

“Because DMs are private conversations, we don’t proactively look for hate speech or bullying the same way we do elsewhere,” Instagram said in a blog post.

The tool focuses on message requests from people users do not already follow, “because this is where people usually receive abusive messages”, it added.

Monkey emojis

Instagram consulted with anti-discrimination and anti-bullying groups to curate a list of terms, phrases and emojis deemed offensive.

For example, Liverpool Football Club criticised the platform after some of its players were sent racist monkey emojis.

But users can also add their own definitions to this list, through the Hidden Words section of the app’s privacy settings.

This feature already exists to filter out abuse in comments on Instagram posts.
