The following analysis points to the high emotional content of posts on Facebook:
These two factors – the way that anger can spread over Facebook’s social networks, and how those networks can make individuals’ political identity more central to who they are – likely explain Facebook users’ inaccurate beliefs more effectively than the so-called filter bubble.
If this is true, then we have a serious challenge ahead of us. Facebook could likely be convinced to change its filtering algorithm to prioritize more accurate information. Google has already undertaken a similar endeavor. And recent reports suggest that Facebook may be taking the problem more seriously than Zuckerberg’s comments suggest.
But this does nothing to address the underlying forces that propagate and reinforce false information: emotions and the people in your social networks.
As noted on our blog, the quick-acting, emotionally driven System 1 “thinking style” takes over. This reflexive response to emotional content compels us to instantly Like and Share, spreading the message on to others.
The author of the passage above (a professor of communications at Ohio State) describes how these emotionally laden messages propagate rapidly and hook their “targets.”
He notes that improved filtering algorithms will not address the fundamental issue: emotional messaging and how people respond to it. Sharing emotional messages is fundamental to how Facebook operates.