By Helen-Ann Smith, news reporter

A young woman who says graphic images of self-harm on social media encouraged her to make multiple suicide attempts has criticised Instagram's reporting process.

She has reported hundreds of images to the picture-sharing site that she felt were harmful or triggering, and estimates only 30% were ever taken down.

At least one of these images was reported after Instagram's pledge last month to remove all graphic self-harm content from its platform.

Following the pledge, Sky News reported a number of very graphic images to the site.

A month later, the content had still not been taken down.

Image: Content reported by Sky News remained on the platform two months after it was reported

It raises questions about the scale of the challenge facing Instagram: if reported content isn't coming down swiftly, how difficult will it be for the company to find and remove content that is harder to locate?

Anna Hodge is now 18 years old. She was first diagnosed with OCD at the age of 11 and became increasingly depressed as she entered her teenage years. As well as struggling with anorexia, she began self-harming, and was eventually diagnosed with borderline personality disorder and depression.


"You feel such a build-up you need to do something and self-harm was that something for me," says Anna.

"It was a way of punishing myself and it felt like a good relief at the time, but looking back it didn't help at all.

At first she found the mental health community online a helpful source of support. But there was a darker side to some of the accounts she was following.

"The more people you follow the more chances there are of seeing things that aren't appropriate and Instagram suggests accounts to follow and you go down a bit of a rabbit hole. Once you've talked to these people and you feel like you've bonded you don't want to unfollow when they post harmful things because you don't want to upset them

"Once you're reading about all these people thinking such negative things and telling you what they are , if you've had some of those thoughts too it reinforces that pattern of negative thinking in your head, it kind of encourages it."

After a number of years in hospital Anna is now back at home and studying for a degree.

She also runs a recovery account on Instagram encouraging body positivity and good mental health.

As part of her recovery she regularly reports content to the site and is frustrated by how often things aren't removed.

Image: Instagram is a photo-sharing platform with upwards of 375 million users

"They just say 'we've reviewed your content and it's not against our guidelines'," says Anna.

"I think they need to listen to people who have been in that vulnerable space and know how easy it is to access that content. They need to listen to people with these issues, listen to what they think is harmful and not what the big Instagram bosses think is harmful."

Social media companies have faced criticism over the secrecy that surrounds their reporting and monitoring processes.

The images Sky News reported to Instagram showed graphic self-harm as well as quotes encouraging suicide.

The site's policy has always been that it does not allow content that encourages or promotes self-harm or suicide. Despite this, the content had not been removed a month after it was reported.

For charities like Samaritans, which are helping to advise Instagram, those goals will be hard to achieve unless the platform becomes more open.

"We definitely need it to be much more transparent," says Harriet Edwards, Samaritans policy manager.

"Critically they need to ensure they're bringing people with them, making sure the public know these improvements are being made and making sure they know why these changes are being made."
