YouTube and Blocking Videos Process



8BitNinja
2018-06-10, 02:34 PM
So today I was listening to music on YouTube (why I didn't just turn on Spotify is still beyond me), and decided to listen to the song No Remorse by Metallica. When I clicked on the video, this message popped up.

The following content has been identified by the YouTube community as inappropriate or offensive to some audiences.

Since I just wanted to listen to the song, I clicked "I wish to proceed" and listened to it anyway. Afterwards, since I had never seen this message before, I clicked the "Learn More" link at the bottom, as I was confused about why this had happened. The reason was explained in this paragraph:

Some borderline videos, such as those containing inflammatory religious or supremacist content without a direct call to violence or a primary purpose of inciting hatred, may not cross these lines for removal. Following user reports, if our review teams determine that a video is borderline under our policies, it may have some features disabled.

Setting YouTube's politics and personal beliefs aside, I get why they would want to protect themselves from being associated with certain groups and ideologies they do not believe in. However, I would never expect a song by Metallica to fall under this. Is there a process for how this is done? If so, what is it?

DISCLAIMER: Due to forum rules, I am not looking for personal opinions on the context. All I am asking is whether there is a definite process behind this filtering and, if so, what it is. Also, if possible, I would like an explanation of how this specific video came to be censored.

Here is the link to the video (https://www.youtube.com/watch?v=whUfwGSW1hE)

Brother Oni
2018-06-10, 03:26 PM
I get a 'This content is not available on this country domain' message for that particular video, but I can watch the song just fine on the official Metallica channel.

Whatever the filtering mechanism is, there doesn't seem to be much real thought put behind the consistency of its implementation, suggesting an automated procedure of some sort.

factotum
2018-06-10, 03:28 PM
When it says "Youtube community" it suggests it's actually something the audience are voting for in some way, but I have no idea why they'd have taken agin' that song.

georgie_leech
2018-06-10, 06:33 PM
When it says "Youtube community" it suggests it's actually something the audience are voting for in some way, but I have no idea why they'd have taken agin' that song.

Less "voting for" and more "enough reports came in for automatic action to be taken" is my understanding. I mean, I get why most of the processes are automatic; YouTube would have a hell of a time trying to actively monitor every video, or even every report. Trolls are gonna troll, after all. But let's just say that an automatic flagging/blocking system can be pretty easily exploited.

Blackhawk748
2018-06-10, 07:47 PM
Less "voting for" and more "enough reports came in for automatic action to be taken" is my understanding. I mean, I get why most of the processes are automatic; YouTube would have a hell of a time trying to actively monitor every video, or even every report. Trolls are gonna troll, after all. But let's just say that an automatic flagging/blocking system can be pretty easily exploited.

And believe me, it is. Their AI (cuz it is a rudimentary AI) is fundamentally broken simply because it gets little to no "positive" input, meaning its flags are very rarely overridden, so it just gets ban-happy. You can see this if you look up patriotic songs of all types from the 30s-50s: one of them happens to have a single word or phrase that's "wrong," the bot nails it, and no one corrects it.

It's rather annoying.

Gnoman
2018-06-10, 09:38 PM
From some research, it appears that enough reports from the comment section can generate this flag.
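The mechanism the thread converges on, namely that enough user reports trigger an automatic restriction, and the restriction persists unless a human reviewer overrides it, can be sketched as a simple threshold model. Everything here is an illustrative assumption: the class, the threshold value, and the method names are hypothetical and are not YouTube's actual implementation.

```python
# Hypothetical sketch of a report-threshold flagging system, as described
# in the posts above: enough reports auto-restrict a video, and without
# the rarely-given "positive" human input, the flag simply sticks.

REPORT_THRESHOLD = 100  # assumed cutoff; the real value is not public


class Video:
    def __init__(self, video_id: str):
        self.video_id = video_id
        self.reports = 0
        self.restricted = False

    def report(self) -> None:
        """Register one user report; auto-restrict once the threshold is hit."""
        self.reports += 1
        if self.reports >= REPORT_THRESHOLD:
            self.restricted = True

    def human_override(self) -> None:
        """A reviewer clears the flag -- the 'positive input' that the posts
        above suggest rarely happens, which is why flags tend to persist."""
        self.restricted = False
        self.reports = 0


# A coordinated burst of reports restricts the video with no review step.
video = Video("whUfwGSW1hE")
for _ in range(REPORT_THRESHOLD):
    video.report()
print(video.restricted)  # True
```

The exploit georgie_leech alludes to falls out of the model directly: since only the report count matters, any group willing to file enough reports can restrict a video, and absent a reviewer calling `human_override`, the restriction never lifts.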