Saturday, March 26, 2016

Tay, Microsoft's racist and xenophobic robot – The Nacional.com

It was called Tay, and it took less time to be withdrawn from the market than it had taken to learn to hold a conversation.

It was created, according to Microsoft, as an experiment to learn more about the interaction between computers and human beings.

It was a computer program designed to hold informal, entertaining conversations on social networks with an audience aged 18 to 24, as the company explained on its website.

But the technology giant's big bet on artificial intelligence quickly became a resounding failure.

And the day after its release, Microsoft had to shut it down.

Racial slurs and sexist comments

The racist and xenophobic messages that Microsoft's rebellious teenager posted on Twitter and other social networks did not go unnoticed.

Its sympathy for Hitler and its endorsement of genocide when answering questions from social network users are some examples, along with racial slurs and sexist and homophobic comments.

It also defended the Holocaust, concentration camps, and white supremacy, and came out against feminism.

A Microsoft spokesman said the company is making adjustments to ensure this does not happen again, and blamed users in part for Tay's behavior.

“Unfortunately, within 24 hours of being placed on the internet (Wednesday), we saw a coordinated effort by some users to abuse Tay's conversational capabilities so that it would respond inappropriately,” the company said in a statement.

“A learning project”

Some users also criticized the bot's restrictions on topics related to music or television.

Others were concerned about what Tay could mean for future artificial intelligence technologies.

“Tay is an artificial intelligence robot and a learning project designed for interaction with humans.”

“As it learns, some of its responses may be inappropriate and reflect the type of interactions some users are having with it,” the spokesman added.

Indeed, the bot was built to give customized responses to users, collecting information on each of them during the interaction, which would explain certain comments depending on who it was talking to.
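The failure mode described above — a bot that learns from whatever users say to it and reuses that material in later replies, with no content filter — can be illustrated with a minimal sketch. This is purely hypothetical (Microsoft never published Tay's implementation); the class and method names are invented for illustration.

```python
# Hypothetical sketch of an unfiltered "learning" chatbot -- NOT Tay's
# actual implementation, which was never made public.
from collections import defaultdict
import random

class NaiveChatBot:
    def __init__(self):
        # Per-user history: the bot tailors replies to whoever is talking.
        self.user_history = defaultdict(list)
        # Global phrase pool: anything any user says can be repeated later.
        self.learned_phrases = []

    def reply(self, user, message):
        # Record the message both per-user and globally.
        self.user_history[user].append(message)
        self.learned_phrases.append(message)
        # With no moderation step, the bot may echo any learned phrase
        # back to anyone -- this is how abusive input spreads to other users.
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
bot.reply("alice", "hello there")
print(bot.reply("bob", "nice weather"))
```

The key design flaw the sketch shows is that the learning step and the generation step share one unmoderated pool, so a coordinated group of users can poison every future conversation.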

The company decided to edit or delete the offensive tweets Tay had issued, a move criticized by some users who asked to “let her learn by herself.”

Microsoft said it will reprogram its teen bot and relaunch it soon, but did not confirm an exact date.

