A post appearing in my news feed right now asks: “You have 30 minutes to spend $10,000 in one store. Where are you going?” Another asks, “What was your first car?” On the surface these are boring questions, yet Facebook indicates the post has drawn 1.8 million comments. Why do these posts appear, and who is behind them? Someone appears to be pouring enormous energy and effort into a data collection project aimed at building personality profiles from social…
The ACLU rewrote the words of U.S. Supreme Court Justice Ginsburg, co-founder of its very own [Person’s] Rights Project, in an unusual way that erases women from her historical quote.
Academic research confirms the media’s obsession with “missing white women” (specifically, young, blonde, blue-eyed women). When someone goes missing, coverage is extreme for young white women, drops significantly for women of color, and is almost non-existent for men, with missing white men receiving the least coverage of all.
Bill Gates blames the turnover of political parties in control of governments for the lack of climate change solutions. His remarks are bizarre, as he appears to be opposing democracy itself.
Should a newspaper’s reporting be funded by an agenda-driven “community-based” “non-profit” organization that advocates for specific outcomes?
Facebook uses “AI” software to interpret posts and grade them for its automatic content moderation. But Facebook’s systems do not support many languages, allowing much content to slip through moderation entirely.
Scientific American reports on how government agencies use “media embargoes” to control reporting.
An earlier claim from NOAA that July 2021 was the hottest month ever no longer appears to be true after more temperature data was collected. The problem is that the initial claim made headlines across the media landscape, while the revision, effectively a correction, has received no publicity. This is unfortunately how media works, and it is why I consider most news stories false until they withstand the test of time.
According to the Wall Street Journal, the algorithms amplify anger and divisiveness, producing a world of angry people.
The researchers find that we evaluate information from unknown sources based on whether it agrees with our existing views. We judge information we agree with (even if factually wrong) as more credible merely because it aligns with our pre-existing thoughts.