Image caption: There are fears over the impact of self-harm and suicide content on young and vulnerable people
Instagram has launched new technology to recognise self-harm and suicide content on its app in the UK and Europe.
The new tools can identify both images and words that break its rules on harmful posts.
The system will make such posts less visible in the app and, in extreme cases, remove them automatically.
Facebook, which owns Instagram, said it was an "important step" but the company wanted to do "a lot more".
Head of Instagram Adam Mosseri detailed the new system, which uses artificial intelligence, in a blog post on the company's website.