YouTube's recommendation algorithm pushes people to watch ever more extreme content

“Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

What we are witnessing is the computational exploitation of a natural human desire…. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”
 
Regardless of topic, YouTube’s recommendation algorithm serves up ever more extreme versions of whatever you just watched.
This makes sense from a selling-eyeballs-to-advertisers perspective: the more emotionally riled up you become, the more receptive you are to advertiser messages (a.k.a. ad propaganda).
Google’s core business is advertising (a.k.a. propaganda), and they know exactly what they are doing: optimizing every possible avenue for making money. The side effects are an increasingly polarized society and a public dumbed down by a steady diet of misinformation delivered through computational advertising.
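To make the incentive concrete, here is a minimal sketch of an engagement-maximizing recommender. Everything in it is hypothetical and invented for illustration (the Video class, the extremity scores, the watch-time model); it is not YouTube's actual system, only a toy showing how ranking purely on predicted engagement can favor more extreme content:

```python
# Toy illustration (NOT YouTube's real code): rank candidate videos purely
# by predicted engagement. If more extreme content reliably earns longer
# watch times, this objective alone steers recommendations toward it.

from dataclasses import dataclass
from typing import List


@dataclass
class Video:
    title: str
    extremity: float    # hypothetical 0..1 measure of how extreme the content is
    base_appeal: float  # hypothetical topical relevance to what was just watched


def predicted_watch_time(video: Video) -> float:
    """Hypothetical engagement model: extremity boosts predicted watch time."""
    return video.base_appeal * (1.0 + 2.0 * video.extremity)


def recommend(candidates: List[Video], k: int = 3) -> List[Video]:
    """Rank solely by predicted engagement; ad revenue scales with watch time."""
    return sorted(candidates, key=predicted_watch_time, reverse=True)[:k]


if __name__ == "__main__":
    candidates = [
        Video("Measured policy explainer", extremity=0.10, base_appeal=0.9),
        Video("Heated partisan rant", extremity=0.70, base_appeal=0.8),
        Video("Conspiracy deep-dive", extremity=0.95, base_appeal=0.7),
    ]
    for v in recommend(candidates):
        # The most extreme candidates rank first, despite lower topical relevance.
        print(v.title)
```

Note that nothing in the ranking objective mentions accuracy or social cost; watch time is the only quantity being maximized, which is exactly the dynamic described above.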
Read the whole article, quoted above.
Recommendation algorithms are a fantastically powerful tool for propagandists.