Facebook will show you related articles and fact checks before you open a link

Facebook is still waging all-out war on fake news. A month ago we reported that the social network would warn users when they tried to share content of dubious reliability; now, thanks to TechCrunch, we know that the social giant will show users related articles and fact checks before they open a link.

Facebook says all of this is meant to “offer easier access to additional perspectives and information” about a story, so that users can check it for themselves. In short, the goal is to keep the public from being misled by invented stories or partial information.

How does it work? Suppose a little-known blog publishes a story of general interest and someone posts it on Facebook. Below the post, and before you click the link, you will see a series of related articles from trusted sources: national newspapers, specialized publications of recognized standing, and so on.

It is worth noting that the Related Articles feature first appeared in 2013, as a way to surface links the user might find interesting based on the articles they read. After the criticism Facebook received during the 2016 presidential election, however, the company began working with external fact-checkers.

For Mark Zuckerberg, the only way to combat the problem is to broaden the public’s perspective:

A more effective approach is to show a broader range of perspectives, let people see where their own views fall on the spectrum, and come to a conclusion on what they think is right. Over time, our community will identify which sources provide a complete range of perspectives, so that dubious content will be easier for users to spot.

For now Facebook is only testing this feature, and it will not roll it out to all users until it proves useful.

In the end, it seems Facebook is on its way to becoming a kind of “ministry of truth”, something that may be necessary to fight fake news. Even so, and as María González has pointed out, should Facebook be the one responsible for “filtering” what is false and what is not?

That question brings us back to the debate over whether Facebook is a media company or not. Zuckerberg himself does not seem to have a clear answer, and yet this way of verifying and contrasting news is, at bottom, an editorial decision. We will see how the feature evolves and whether it eventually reaches all users.
