NEW YORK — Facebook on Thursday announced measures to combat the spread of false news on its huge social network, focusing on the most harmful posts and relying on a group of outside experts to distinguish obvious hoaxes from legitimate news.
First, Facebook is making it easier for users to report content they consider implausible; they can now do so in two steps instead of three. If enough users flag a particular item, Facebook will send it to a coalition of fact-checking organizations affiliated with the Poynter Institute, a journalism nonprofit.
For now, those organizations are The Associated Press, ABC News, FactCheck.org, PolitiFact and Snopes. Facebook says the group could be expanded.
Fake news stories can touch on a wide range of topics, from nonexistent cancer cures to claimed sightings of the Abominable Snowman. But those dealing with politics have caused a stir recently, amid suspicion that they shaped public perception and may have been a factor in the U.S. elections. Some false stories have had unfortunate real-world consequences: one man, taken in by a lie that children were being abused at a Washington pizzeria, went to the restaurant and fired his rifle there.
"We certainly believe that we have an obligation to combat the spread of false news," John Hegeman, a Facebook vice president who oversees the News Feed product, said in an interview. But he added that the company also takes seriously its responsibility to give everyone the opportunity to express themselves, and that it is not Facebook's role to decide what is true and what is false.
Articles confirmed to be false will not be removed from Facebook, but they will carry a notice that they are "disputed" and will appear lower in other users' news feeds. Readers will be able to click on the notice and read details of why the claim was judged to be at odds with reality. Anyone who still wants to share the content can do so, but it will travel with a warning.
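The workflow described above — user reports, a review threshold, a "disputed" label, feed demotion and a sharing warning — can be pictured as a simple pipeline. The sketch below is purely illustrative and is not Facebook's code: the report threshold, the demotion factor and the fact-checker callables are assumptions invented for the example.

```python
from dataclasses import dataclass

REPORT_THRESHOLD = 100  # assumed value; the real threshold has not been disclosed

@dataclass
class Article:
    url: str
    reports: int = 0
    disputed: bool = False
    dispute_notes: str = ""

def handle_user_report(article: Article, fact_checkers) -> None:
    """Record a report; once enough users flag the item, route it to fact-checkers."""
    article.reports += 1
    if article.reports >= REPORT_THRESHOLD and not article.disputed:
        verdict = review_by_fact_checkers(article, fact_checkers)
        if verdict is not None:
            # The article stays on the platform but is labeled, not removed.
            article.disputed = True
            article.dispute_notes = verdict

def review_by_fact_checkers(article: Article, fact_checkers) -> str | None:
    """Ask partner organizations to review; return an explanation if judged false."""
    for checker in fact_checkers:          # hypothetical callables, one per organization
        explanation = checker(article.url)
        if explanation:
            return explanation
    return None

def feed_rank(article: Article, base_score: float) -> float:
    """Disputed stories rank lower in the news feed (the demotion factor is assumed)."""
    return base_score * 0.5 if article.disputed else base_score

def share(article: Article) -> str:
    """Sharing is still allowed, but a disputed story carries a warning."""
    if article.disputed:
        return f"Warning: disputed by third-party fact-checkers. {article.dispute_notes}"
    return "Shared."
```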
By partnering with respected news organizations, and by flagging rather than removing false content, Facebook is sidestepping complaints about what right it has to make such decisions. Many had objected, for example, that Facebook would be turning itself into a censor, and perhaps a rather clumsy one, since most of its employees are engineers with little experience in questions of journalistic ethics.
"They definitely don't have the necessary expertise," said Robyn Caplan, a researcher at Data & Society, a nonprofit research institute funded in part by Microsoft and the National Science Foundation. In an interview before Facebook's announcement, Caplan urged the company to seek help from journalism professionals and related organizations that deal with these issues.