YouTube star Logan Paul is currently weathering a storm of unprecedented proportions after he posted a video to his channel seemingly depicting the body of someone who had died by suicide in Japan's Aokigahara forest.
The vlog, titled We found a dead body in the Japanese Suicide Forest, was uploaded on 31 December and shared with his 15 million subscribers. While the man's face was blurred out, Logan repeatedly zoomed in on him, his hands and his pockets as he and his friends commented on what they were seeing.
The video was eventually taken down – seemingly by Logan himself – 24 hours later, but by then it had already garnered 6.3 million views, according to Vulture.
Many called for YouTube to remove the footage not long after it was uploaded, and now people are calling for Logan's channel to be permanently removed over the 15-minute video.
Aaron Paul and Game Of Thrones actress Sophie Turner have since criticised the YouTuber for posting the video, with the former labelling Logan ‘pure trash’.
But how did it stay online for so long in the first place?
When it comes to ‘violent or graphic content’, YouTube’s policy states:
‘It’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational or gratuitous. If a video is particularly graphic or disturbing, it should be balanced with additional context and information.
‘If posting graphic content in a news, documentary, scientific or artistic context, please be mindful to provide enough information to help people understand what’s going on. In some cases, content may be so violent or shocking that no amount of context will allow that content to remain on our platforms. Lastly, don’t encourage others to commit specific acts of violence.’
This content is shocking and it’s definitely gratuitous.
While it's not a violent video, per se, suicide must be reported on with respect and caution, so as not to encourage copycats.
On that front, YouTube says: 'Videos that incite others to commit acts of violence are strictly prohibited on YouTube. If your video asks others to commit an act of violence or threatens people with serious acts of violence, it will be removed from the site.'
It's simple to flag such insensitive material, with YouTube saying: 'Staff review flagged videos 24 hours a day, seven days a week. A video can be flagged at any time once uploaded to YouTube and then it is reviewed by YouTube staff.'
However, the policy continues: ‘If no violations are found by our review team, no amount of flagging will change that and the video will remain on our site. Flagging videos is anonymous, so other users can’t tell who flagged a video.’
YouTube has yet to release a public statement or comment regarding the video. However, YouTube personality Philip DeFranco obtained an official statement from a contact at the company, which reads: 'Our hearts go out to the family of the person featured in the video.
‘YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner.
‘If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated.’
I just received an official statement from a contact at @Youtube regarding the outrage and controversy around Logan Paul’s (now self-removed) “We found a dead body” top trending Youtube video.
I’ll save my personal comment for later. Just wanted to pass this along. pic.twitter.com/JNTQDMVvT4
— Philip DeFranco (@PhillyD) January 2, 2018
In the past, YouTube created automated software to identify extremist content, and it says it's now aiming to do the same for videos unsuitable for children – i.e. this one.
Metro.co.uk contacted YouTube for comment and clarification on their uploading policies.