Facebook owes little to the media and much to its users, yet it has chosen to turn its back on both

Shortly before US financial markets closed on Friday, Mark Zuckerberg, CEO of Facebook, wrote on his profile that the social network's News Feed will prioritize news from "reliable, informative and local" sources in order to fight sensationalism and misinformation.

But who decides which outlets meet those conditions? Facebook users themselves, through surveys.

There is too much sensationalism, misinformation and polarization in the world today. Social media enables people to spread information faster than ever and, if we do not specifically tackle these problems, we end up amplifying them. That is why it is important that News Feed promotes high-quality news that helps build a sense of common ground.

The hard question we have struggled with is how to decide which news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that is not something we are comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you, the community, and have your feedback determine the ranking.

The hot potato Facebook does not want: openly deciding which media can be trusted

The social network's decision, which sent the shares of major outlets such as The New York Times higher, matters for several reasons, but two details are particularly striking.

First, because it is telling that Facebook keeps insisting it does not want to decide firsthand which outlets are reliable and which are not, nor even leave that judgment to experts or to moderators like the 3,000 it set out to hire to review inappropriate content. As if the company itself were unable to tell an outlet of recognized prestige from a website plainly dedicated to manipulation, disinformation or outright lies, when in the past it has deployed various mechanisms against fake news, such as fact-checking filters or warning messages shown when someone tries to share a story flagged as dubious.

What if Facebook users rate an outlet as trustworthy based on their affinity with it rather than on its objectivity and credibility?

And second, because it leaves it up to users to decide which outlets are good and which are bad, overlooking, or so it seems, that users may vote according to their interests and affinities rather than actual credibility: rewarding the outlets that tell them what they want to hear, in line with their ideology, and punishing those that report against their ideas, even when the news is entirely true and objective.

The executive explains that this change comes on top of the one announced just over a week ago, which prioritizes content from friends and family over that of companies, pages and media. He expects news on Facebook to shrink by roughly 20% as a result, falling from about 5% of all content on the social network today to around 4%.

This clashes head-on with the strategy of courting the media that Facebook has pursued for years, although the free fall in the visibility of pages on the service, and in the number of users arriving at news sites from Facebook, foreshadowed what was coming and what has now arrived. "It's a hard blow, but it will hit some harder than others," Spanish media executives said of this foreseeable nightmare.

More news from outlets rated as trustworthy, to the detriment of the rest

However, Zuckerberg points out, this way of scoring the trustworthiness of news "will not change the amount of news you see on Facebook." The only thing that will change, he assures, is the balance between news from outlets "the community determines are trusted" and those it does not. "My hope is that this update about trusted news and last week's update about meaningful interactions will help make time on Facebook time well spent," he adds, once again stressing the quality of time spent on the platform.

The surveys begin today in the United States, with a rollout to the rest of the world planned for a later date.

The problems of a supposedly meritocratic system in the hands of users

The system Facebook has announced for polling its community about the media leaves too many questions in the air. The CEO's explanation is as follows:

Here is how this will work. As part of our ongoing quality surveys, we will now ask people whether they are familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their own readers or viewers, while others are broadly trusted across society, even by those who do not follow them directly.
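Zuckerberg's description stops at the survey questions and says nothing about how the answers are turned into a ranking. Purely as an illustration of the kind of scoring such a survey could feed, here is a minimal sketch that assumes, hypothetically, that an outlet's score is simply the share of respondents familiar with it who also say they trust it; the function name, the tuple format and the min_familiar threshold are all invented for the example, not taken from Facebook.

```python
# Hypothetical aggregation of "trusted source" survey answers into a score.
# Facebook has not published its formula; the rule below (share of familiar
# respondents who also trust the outlet) is an assumption for illustration only.
from collections import defaultdict

def trust_scores(responses, min_familiar=100):
    """responses: iterable of (outlet, is_familiar, trusts_it) tuples."""
    familiar = defaultdict(int)   # respondents who say they know the outlet
    trusting = defaultdict(int)   # of those, how many say they trust it
    for outlet, is_familiar, trusts_it in responses:
        if is_familiar:
            familiar[outlet] += 1
            if trusts_it:
                trusting[outlet] += 1
    # Only score outlets with enough familiar respondents to mean anything.
    return {outlet: trusting[outlet] / familiar[outlet]
            for outlet in familiar if familiar[outlet] >= min_familiar}
```

Even under this simple assumption, the worry discussed below is visible: if the only respondents "familiar" with a hyper-partisan site are its own followers, it can score as high as a broadly respected outlet.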

In a statement published in the company's press room, the head of News Feed, Adam Mosseri, offers no further detail about the surveys. Who answers them? Is there any check that respondents are genuinely familiar with the outlet they are about to rate? Will every rating count, even when it flatly contradicts any objective, genuinely reliable assessment? The answers could paint a dangerous scenario for users and for the social network itself, which "is not good for democracy" or for its community, according to several of the figures who helped build it.

A supposedly meritocratic system like this, in users' hands, can end up producing ratings driven by each person's interests and affinities, as noted above, with no regard for actual credibility. That is what tends to happen on news aggregators such as Menéame, as Antonio Ortiz put it in a tweet: "we know that people vote out of affinity (it says what I think) and not for quality/credibility".

If users vote out of affinity rather than credibility, the usual confirmation bias risks being amplified even further

The trap is not new: it is confirmation bias, the tendency to seek out whatever confirms what one already thinks. It turns social network timelines into echo chambers of opinions we already endorse and, ultimately, further polarizes debates that are already extreme. "The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives," noted a study on the subject published in the scientific journal PNAS.

The magic of Facebook's initiative would come if combining the votes of users on one side of the political spectrum with those on the other produced a broad set of trustworthy outlets that best represents the breadth of debate, the different points of view on specific topics and the diversity of opinion. But it is hard to believe in that kind of magic, even if one believes that systems governed by users themselves will work properly and for the purpose they were designed for.

Facebook should not shirk its responsibilities: it owes it not to the media, but to its users and their right to information

And that is without counting that the measures against fake news have shown that credibility warnings make many users even more drawn to a story. "Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs," the platform's managers acknowledged in December.

Facebook, without a doubt, wields unprecedented influence over the world of information on a global scale. The social network has become a major source of traffic for media groups around the world and one of the main channels of information for many citizens, a good share of its 2 billion monthly users, both those who have other ways of staying informed and those for whom the platform is their only source. That is why the social network led by Mark Zuckerberg has a responsibility (the matter of Russian interference is still pending), even if it would rather look the other way or leave decision-making in users' hands so it can wash its hands of it.
