Whether we use Facebook, Google, or any other platform, our choices are in fact guided in secret. In an age when computer programs tell us what to think, an oddly old-fashioned piece of news has surfaced: a handful of individuals, accountable to no one, decide what counts as news.
On Facebook, a small team determines which trending topics users can and cannot see. It was alleged that Facebook suppressed posts with conservative content, a claim the company denied. The technology site Gizmodo reported the allegation last March, noting that Facebook should find it embarrassing for two reasons, regardless of political affiliation.
First, the revelation that people choose which stories are featured dispels the illusion of an impartial process behind the news ranking. Second, there is the perception that these curators do the job purely to numerical targets, without any editorial responsibility.
In fact, the problem is neither the human element nor ideological bias. The real problem is that the world’s most powerful information-sharing platform decides, without any oversight, what people see in their Facebook feeds.
Platforms like Facebook serve us news under appealing labels such as “most shared” or “relevance”. We do not know how that news is filtered. Yet this matters a great deal, because even small changes in the information presented to us change our behavior.
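To see how even a small tweak to an opaque ranking formula reorders what surfaces first, consider a toy sketch. The stories, scores, and weights below are entirely invented for illustration; this is not any platform's actual method.

```python
# Toy illustration: the same stories ranked under two slightly different
# weightings of "shares" versus "recency". All data here is invented.

stories = [
    {"title": "Local election results", "shares": 900,  "hours_old": 2},
    {"title": "Celebrity breakup",      "shares": 1200, "hours_old": 20},
    {"title": "New climate report",     "shares": 800,  "hours_old": 1},
]

def rank(stories, share_weight, recency_weight):
    # Score each story (more shares help, age hurts), then sort highest first.
    def score(s):
        return share_weight * s["shares"] - recency_weight * s["hours_old"]
    return [s["title"] for s in sorted(stories, key=score, reverse=True)]

# Two rankings that differ only in how heavily recency is penalized.
print(rank(stories, share_weight=1.0, recency_weight=10.0))
print(rank(stories, share_weight=1.0, recency_weight=40.0))
```

With the first weighting the heavily shared celebrity story tops the feed; with the second, the fresher local story does. A reader sees a different "top news" in each case, without ever knowing a weight changed.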