Simple Facebook experiment demonstrates how FB amplifies left and right extremist perspectives

The experiment set up two test accounts, Carol and Karen, and Facebook’s algorithms got to work, suggesting what each would be interested in.

Accepting recommendations for sites supportive of Trump led Carol to suggestions for a site called “Donald Trump is Jesus,” and another for QAnon, a wide-ranging extremist ideology that alleges celebrities and top Democrats are engaged in a pedophile ring. Karen was presented with anti-Trump pages, including one that posted an image showing an anus instead of Trump’s mouth.

Source: The story of Carol and Karen: Two experimental Facebook accounts show how the company helped divide America

Similarly, as I have noted previously, FB does not display job-opportunity ads to me because, in its view, I am too old. I have degrees in computer science, software engineering, and business administration, and I have worked in Silicon Valley, at Microsoft, and elsewhere, yet I was over 50. FB has thus been used to facilitate age discrimination in hiring (studies have confirmed this).

FB amplifies extreme viewpoints and enables discrimination on a variety of fronts. As I have written in the past, FB is a frictionless platform for the spread of propaganda: anyone can be a propagandist, not just the “experts.”

FB curates your news feed to maximize the time you spend on FB. A different approach would let you, the user, configure the information you see, but that would not let FB maximize viewing minutes. FB knows that outrage holds attention longer than cat photos, so the system is designed to cultivate a culture of perpetual outrage. FB’s incentives are aligned to make the platform worse for each of us, but better for FB.

A genuine solution would give each user real control over what appears in their news feed. Even a simple chronological timeline of friends’ posts would help! Instead, FB selects which posts we see from our friends and from the groups we belong to.
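The difference between the two approaches can be sketched in a few lines. This is an illustrative toy, not Facebook's actual ranking system: the `predicted_engagement` score is a hypothetical stand-in for whatever signals an engagement-optimized ranker uses, and the post data is invented.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    predicted_engagement: float  # hypothetical estimate of time-on-site a post generates

def engagement_feed(posts):
    """Platform-controlled ordering: highest predicted engagement first."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts):
    """User-controlled ordering: newest posts from friends first."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

now = datetime(2021, 1, 1, 12, 0)
posts = [
    Post("friend_a", "cat photo", now - timedelta(hours=1), 0.2),
    Post("friend_b", "outrage bait", now - timedelta(hours=6), 0.9),
    Post("friend_c", "family news", now - timedelta(minutes=10), 0.1),
]

# Engagement ranking surfaces the six-hour-old outrage post first;
# a chronological timeline would lead with the most recent post.
print([p.text for p in engagement_feed(posts)])
print([p.text for p in chronological_feed(posts)])
```

Same posts, two orderings: one chosen by the platform's objective, one by simple recency under the user's control.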

I run a software programming group page with thousands of members. For any given post, fewer than 1 in 12 members will ever be shown it, rendering the group nearly useless to me. FB chooses what you see; you do not.
