Taking a step forward in combating bullying on the platform, Instagram has launched two new features that use AI to warn users before they post something offensive and to restrict problematic followers.
Social media platform Instagram announced the two new tools yesterday. The first uses AI to warn users if a comment they are about to post might be offensive, giving them a chance to rethink and undo the comment before it goes live.
During tests, Instagram found that the warning encouraged people to reflect on their comments, undo them, and write something less hurtful instead.
The second feature, called ‘Restrict’, will use AI to let users rein in troublesome followers. When a user restricts a follower, that follower’s comments will not appear publicly unless the user approves them. Restricted followers also will not be able to see when the user is active or whether the user has read their direct messages.
Both tools, as reported by Engadget, are part of the social media firm’s ongoing effort to fight bullying. Adam Mosseri, Head of Instagram, wrote in a blog post, “We are committed to leading the industry in the fight against online bullying, and we are rethinking the whole experience of Instagram to meet that commitment.”