A Facebook-funded study recently published in the journal Science analyzed the algorithm Facebook uses to deliver information via its News Feed, asking whether the social network isolates its users from differing political viewpoints.
Researchers wanted to find out whether Facebook could, and does, use its filtering to keep users from seeing news stories and opinions they disagree with, since the site generally delivers ads and content similar to what users already like and to the pages they frequent.
If Facebook’s News Feed automatically gives challenging or disagreeable information lower exposure, placing it where it is less likely to be seen or clicked, then you as a user are arguably not getting what you signed up for.
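To make the concern concrete, here is a minimal, entirely hypothetical sketch of the kind of ranking demotion described above. This is not Facebook's actual code; the function name, fields, and penalty weight are all illustrative assumptions.

```python
# Hypothetical sketch (not Facebook's actual algorithm): how a feed
# ranker could demote "cross-cutting" stories so they appear lower in
# a user's News Feed. All names and weights here are illustrative.

def rank_feed(stories, user_leaning, crosscut_penalty=0.5):
    """Sort stories by a relevance score, penalizing those whose
    political leaning differs from the user's."""
    def score(story):
        base = story["engagement"]       # e.g., a likes/shares signal
        if story["leaning"] != user_leaning:
            base *= crosscut_penalty     # demote opposing viewpoints
        return base
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Agreeable story", "leaning": "left", "engagement": 10},
    {"title": "Opposing story", "leaning": "right", "engagement": 12},
]

# For a left-leaning user, the opposing story (12 * 0.5 = 6) now ranks
# below the agreeable one (10), despite higher raw engagement.
ranked = rank_feed(stories, user_leaning="left")
print([s["title"] for s in ranked])  # ['Agreeable story', 'Opposing story']
```

Even a modest penalty like this pushes disagreeable content down the page, which is exactly the effect the study set out to measure.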
The study found that, on average, 29% of News Feed stories present a user with opposing viewpoints.
“The effects that I wrote about exist…but they’re smaller than I would have guessed,” said Eytan Bakshy, the data scientist who led the study.
Facebook claims that users themselves are more responsible for the filtering than anything the company programs into its software.
However, the study has a major flaw: it analyzed only 10.1 million of the website’s roughly 200 million American users, and it was limited to users who listed their political affiliation, a group making up only about 9% of the user base. In effect, Facebook chose to study roughly 4% of American Facebook users – hardly representative of Facebook’s population at large.
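As a rough sanity check on the sample-size criticism, the figures as reported above can be put side by side (the user counts are taken from the text, not independently verified):

```python
# Back-of-the-envelope check of the sample sizes discussed above.
# Figures are as reported in the article, not independently verified.
american_users = 200_000_000   # approximate U.S. Facebook user base
studied_users = 10_100_000     # users analyzed in the study

share_studied = studied_users / american_users
print(f"Share of American users studied: {share_studied:.2%}")
# Share of American users studied: 5.05%
```

Only about one in twenty American users was analyzed, and restricting the sample to the small minority who self-report a political affiliation narrows it further, which is the basis for the "hardly representative" objection.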