Sunday, May 10, 2015

Your individual choice or the algorithm: what decides what you see in your … – News SIN – National Information Services


INTERNATIONAL EDITOR. A few months ago there was talk of Greg Marra, a 26-year-old computer genius responsible for designing the algorithm Facebook uses to decide which content gets priority in the social network's 'News Feed'. Is he mainly responsible for what millions of Facebook users around the world see? Many think so, but a recent study disputes that claim.

It is the network's users, not the algorithms, who ultimately have the power to filter content on Facebook and avoid exposure to whatever challenges their thoughts and ideas, according to a study published this week in the journal Science.

Can social networks like Facebook create a kind of 'bubble' around their users, so that they see only the type of news and content they want to see? That was the question posed by the study's authors, three researchers from the University of Michigan.

Eytan Bakshy and colleagues analyzed the activity of more than 10 million Facebook users in the US who disclosed their ideological affiliation and political preferences in their profiles, as well as seven million links shared over a six-month period, from July 2014 to January 2015.

The researchers analyzed the news these users shared with their friends and then determined which items reached them through the algorithm Facebook uses to decide which content gets priority in the News Feed.

After that analysis, they concluded that "individual choice" plays a bigger role than the algorithm in limiting users' exposure to content that may challenge their ideology.

How each element influences

The algorithm Facebook uses to sort the information in a user's News Feed produces, on average, a change of one percentage point in the proportion of potentially challenging news a user sees, according to the study.

When individual choice is taken into account, however, the change in that proportion is four percentage points.

The automatic algorithm Facebook uses relies on a large number of variables to give preference to the content it predicts the user wants to see, which has raised fears that Internet users move inside an ideological 'bubble' where they are not exposed to other opinions.
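The mechanism described above can be illustrated with a toy sketch. This is not Facebook's actual algorithm (which is proprietary and uses far more signals); the `rank_feed` function, its weights, and the outlet names below are invented purely to show how engagement-based scoring can end up favoring ideologically aligned content:

```python
# Illustrative sketch only: a toy score-based feed ranker.
# All names and weights here are hypothetical, not Facebook's real system.

def rank_feed(stories, user_affinity):
    """Order stories by a simple weighted score.

    stories: list of dicts with 'id', 'source', 'engagement'
    user_affinity: dict mapping source name -> affinity in [0, 1]
    """
    def score(story):
        # Content from sources the user tends to engage with gets a
        # higher score, so aligned content rises to the top of the feed.
        return user_affinity.get(story["source"], 0.1) * story["engagement"]

    return sorted(stories, key=score, reverse=True)

stories = [
    {"id": 1, "source": "aligned_outlet", "engagement": 0.5},
    {"id": 2, "source": "cross_cutting_outlet", "engagement": 0.9},
]
affinity = {"aligned_outlet": 0.9, "cross_cutting_outlet": 0.2}

ranked = rank_feed(stories, affinity)
# The aligned outlet ranks first despite lower raw engagement
# (0.9 * 0.5 = 0.45 beats 0.2 * 0.9 = 0.18).
```

The point of the sketch is that even a neutral-looking engagement score, multiplied by an affinity signal, systematically demotes cross-cutting content, which is the 'bubble' concern the study set out to measure.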

"Our work suggests that the power to be exposed to perspectives from the other side (conservative to progressive and vice versa) on social networks lies first and foremost with individuals," the researchers say. However, they acknowledge that their analysis has "limitations": although the vast majority of social-network users in the US have a Facebook account, the study focuses only on that platform.

Source : 20minutos.es
