Sunday, May 10, 2015

Users turn their wall into an ideological bubble – Informador.com.mx


Facebook generated controversy in 2014 with studies that emotionally manipulated users. AFP


  • By Javier Salas – El País
  • What is shown on these platforms is chosen by a formula that selects whatever best satisfies the user's interests

GUADALAJARA, JALISCO (10/MAY/2015).- Although many people still do not know it, Facebook selects what users see on their wall. A filtering algorithm decides what is shown in order, in principle, to give users only what they most want to see and not saturate them with information they care little about. The question is whether that algorithm knows us so well that it feeds us only what we like, creating a bubble around us that admits nothing that challenges our thinking. To dispel doubts, Facebook's social scientists have published in the journal “Science” the first study to analyze the influence of the formula that manipulates our walls: the ideological bubble exists, but it is more the users' fault than that of Mark Zuckerberg's programming.

After studying more than 10 million users and their interaction with links to political news, the scientists discovered that the social network is an echo chamber for our ideas with few windows to the outside. Of all the links seen by people who consider themselves progressive, only 22% challenge their thinking. Conservatives see on their walls 33% of news that does not correspond to their ideology.

If the algorithm had not intervened, progressives would have seen 24% uncongenial news and conservatives 35%. This means that the formula designed at Facebook does help reduce the ideological diversity of users' walls, but it is not the main culprit. Users are the ones responsible for enclosing themselves in their own ideas: according to the study, if they did not choose their friends as they do, but at random, between 45% (progressives) and 40% (conservatives) of the content they saw would run contrary to their ideas.
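To put those figures in perspective, here is a minimal back-of-the-envelope comparison, ours rather than anything from the study's own code, of how much cross-cutting exposure is lost to friend choice versus to the ranking formula, using only the percentages quoted above:

# Illustrative comparison using the article's figures; variable names are ours,
# not Facebook's. "Cross-cutting" means news opposed to the user's own ideology.
exposure = {
    "progressives": {"random_friends": 0.45, "actual_friends": 0.24, "after_algorithm": 0.22},
    "conservatives": {"random_friends": 0.40, "actual_friends": 0.35, "after_algorithm": 0.33},
}
for group, e in exposure.items():
    friend_effect = e["random_friends"] - e["actual_friends"]      # lost to who you befriend
    algorithm_effect = e["actual_friends"] - e["after_algorithm"]  # lost to the ranking formula
    print(f"{group}: friend choice costs {100 * friend_effect:.0f} points of cross-cutting "
          f"exposure, the algorithm a further {100 * algorithm_effect:.0f}")

Run as-is, this prints roughly 21 versus 2 points for progressives and 5 versus 2 for conservatives, which is the sense in which the study blames the users rather than the formula.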

Of course, people's “offline” environment, the people they relate to physically, is not random either. But it is much more difficult to measure the ideological bubble in the street than on social networks. The vast amounts of information that a company like Facebook can collect about its users (and about those who are not users) allow it to measure their inclination to withdraw into more or less isolated groups of thought.

One of the weaknesses of the study is that it only examines US users who defined their ideological position in a Facebook profile field, which makes it easier to contrast two opposing poles but generates a significant bias and leaves doubts about the behavior of users who have an ideology yet did not record it in their profile. Pablo Barberá, who studies network polarization at New York University, suggests that the users studied probably have a more homogeneous network of contacts on Facebook: “If the study had included all users, we would probably observe higher levels of exposure to a diversity of opinions, and a greater effect of the algorithms.”


The era of algorithms

“It is a study on the defensive,” says Esteban Moro, an expert on social networks at Carlos III University. “Facebook has an image problem because of the algorithms that filter the information we see, and it wanted to show that the algorithmic filter does not have as much influence as the social filter,” says the researcher.

We live in the era of algorithms. What we are shown in Google results, on the Facebook wall or on other platforms is decided by an increasingly complex formula that selects whatever best satisfies the interests of the user and of the company. Yet there are still many who think they see what there is, not what the algorithm thinks they should see. It is not so: based on their interactions with friends and their activity, Facebook defines their interests and shows them whatever will cause more interaction, so that they stay longer on the network and thus generate more revenue for the company.
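As a rough illustration of the engagement-driven filtering described above, and not Facebook's actual formula, whose details are not public, a feed ranker can be sketched as scoring every candidate post by predicted interaction and showing only the top ones; every feature, weight and name below is invented for the example:

# Hypothetical sketch of engagement-based feed ranking; not Facebook's real algorithm.
def engagement_score(post, user):
    # How much the user has interacted with this author before (0..1)
    affinity = user["affinity"].get(post["author"], 0.0)
    # How closely the topic matches what the user already clicks on (0..1)
    topic_match = user["topic_interest"].get(post["topic"], 0.0)
    # Fresher posts score higher
    recency = 1.0 / (1.0 + post["age_hours"])
    return 0.5 * affinity + 0.4 * topic_match + 0.1 * recency

def build_wall(posts, user, limit=10):
    # Keep only the posts most likely to generate interaction.
    return sorted(posts, key=lambda p: engagement_score(p, user), reverse=True)[:limit]

user = {
    "affinity": {"ana": 0.9, "luis": 0.1},
    "topic_interest": {"left_politics": 0.8, "right_politics": 0.1},
}
posts = [
    {"author": "ana", "topic": "left_politics", "age_hours": 2},
    {"author": "luis", "topic": "right_politics", "age_hours": 1},
]
print(build_wall(posts, user, limit=1))  # the cross-cutting post tends to be filtered out

A loop like this is what creates the feedback the next paragraph describes: the more a user clicks on one kind of content, the more of it the ranker shows.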

This feedback loop aroused the interest of the activist Eli Pariser, who published in 2011 a book called “The Filter Bubble”, referring to the effect the algorithm has on our lives: searching for Egypt on Google, some users received information about the riots while others got only travel information about the pyramids, all based on their previous behavior.

In the summer of 2014, Facebook released another of the studies it regularly publishes on behavior on its network, and it generated an unusual controversy because it emotionally manipulated its users, showing more negative or more positive messages from their contacts to check whether a certain contagion occurred in the way they expressed themselves. To a large extent, the controversy arose because the public discovered that Facebook manipulates the walls and, therefore, the behavior of people.

The scientists of the company run by Mark Zuckerberg tend to show that the social contagion or the ideological bubble that forms on their social network is similar to, or milder than, what occurs “offline”. In fact, back in 2012 they had already published a study denying that the bubble was that serious, but this time the important thing was to lift the blame off the algorithm.

Previous research by Barberá and this study itself indicate that social networks could be a mechanism for receiving information different from the usual. “For example, a right-wing voter who only watches Antena 3 and reads La Razón could be exposed for the first time to content with a leftist bias shared by his contacts on Facebook,” Barberá says.

Yet this is one of the weaknesses of this latest study by the Facebook team, as Esteban Moro laments: “The problem is that it does not compare itself with anything. We cannot know whether what happens outside Facebook is worse or better.”


Data from the study

The study was conducted among US users who had defined themselves ideologically.

It studied 10.1 million of the nearly one billion daily active users the network currently has.

Only 13% of the news links studied corresponded to “hard” news, the political news valid for the study.

Of the 903 million news items viewed by the users studied, they only clicked on 59 million.

Only 20% of the news that progressives clicked on was contrary to their ideas, compared with 29% for conservatives.

The researchers explain these differences between liberals and conservatives by the fact that people on the left are more likely to share links to news aligned with their ideology.

