YouTube’s recommendation algorithm pushes people to watch ever more extreme content

“Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

What we are witnessing is the computational exploitation of a natural human desire…. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”


Regardless of topic, YouTube’s recommendation algorithm suggests videos containing a more extreme version of whatever you just watched.

This makes sense from the perspective of selling eyeballs to advertisers: the more emotionally riled up you become, the more receptive you are to advertiser messages (a.k.a. ad propaganda).
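The escalation dynamic can be sketched as a toy ranking function. Everything here — the “intensity” signal, the escalation bonus, the function names — is a hypothetical illustration of an engagement-maximizing recommender, not YouTube’s actual system.

```python
# Hypothetical sketch: an engagement-maximizing recommender.
# The "intensity" score and escalation bonus are illustrative
# assumptions, not a description of YouTube's real algorithm.

def rank_candidates(candidates, watched_intensity):
    """Rank candidate videos by predicted engagement.

    A recommender optimized purely for watch time can end up favoring
    candidates slightly more intense than what was just watched,
    because those are predicted to hold attention longer.
    """
    def predicted_engagement(video):
        # Reward intensity above the viewer's current level; this
        # escalation term is the illustrative assumption here.
        escalation = video["intensity"] - watched_intensity
        return video["base_watch_time"] + max(0.0, escalation)

    return sorted(candidates, key=predicted_engagement, reverse=True)

candidates = [
    {"title": "calm explainer", "intensity": 0.2, "base_watch_time": 5.0},
    {"title": "heated take",    "intensity": 0.6, "base_watch_time": 5.0},
    {"title": "extreme rant",   "intensity": 0.9, "base_watch_time": 5.0},
]

# A viewer who just watched moderately intense content (0.5) gets
# steered toward the most extreme candidate first.
ranking = rank_candidates(candidates, watched_intensity=0.5)
print([v["title"] for v in ranking])
# → ['extreme rant', 'heated take', 'calm explainer']
```

Note that nothing in the objective mentions extremism; ranking purely on predicted watch time produces the escalation as a side effect, which is the article’s point.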

Google’s core business is advertising (a.k.a. propaganda), and it knows exactly what it is doing: optimizing every possible way to make money. The side effects are the increasing polarization of society and the dumbing down of a public fed a diet of misinformation through computational advertising.

Read the whole article, above.

Recommendation algorithms are a fantastically powerful tool for propagandists.

