News and social media "filters" reinforce established beliefs

Facebook, Google News and other online services automatically filter the information you see, delivering what their algorithms predict you want to see. In practice, this usually means showing you items similar to those you've already looked at. The effect seems to strengthen existing biases rather than challenge them. A simple experimental study tested this idea in practice, and it (so far) confirms that our online world may be leading to less diversity of ideas, rather than more:
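To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of similarity-based filtering described above: new articles are ranked by how closely their topics match what the user has already clicked. The article titles, topics, and scoring rule are invented for illustration and are not the actual algorithm used by any of these services.

```python
# Hypothetical sketch of "filtering": rank candidate articles by overlap
# with the topics of articles the user has already clicked.
from collections import Counter

def topic_profile(clicked_articles):
    """Build a topic-frequency profile from a user's click history."""
    profile = Counter()
    for article in clicked_articles:
        profile.update(article["topics"])
    return profile

def rank_feed(candidates, profile):
    """Score each candidate by topic overlap with the profile, highest first."""
    def score(article):
        return sum(profile[topic] for topic in article["topics"])
    return sorted(candidates, key=score, reverse=True)

# Invented example data: a user who has clicked two like-minded articles.
clicked = [
    {"title": "Tax cuts work", "topics": ["economy", "conservative"]},
    {"title": "Border security first", "topics": ["immigration", "conservative"]},
]
candidates = [
    {"title": "Why the minimum wage should rise", "topics": ["economy", "progressive"]},
    {"title": "Deregulation boosts growth", "topics": ["economy", "conservative"]},
]

for article in rank_feed(candidates, topic_profile(clicked)):
    print(article["title"])
# The "pro-attitudinal" article (the one matching prior clicks) is ranked first,
# which is exactly the feedback loop the study describes.
```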

filtering of either sort led people to click and spend more time on “pro-attitudinal” articles — that is, articles most likely to reflect their own opinions right back at them. In a way, the bottom-right graph is the most interesting. It shows that people in the control group spent more than half their time on the site reading articles that challenged their beliefs. That number plummeted precipitously in the other conditions.

Original source
 
