Friday, March 25, 2016

Microsoft suspends Tay, the millennial bot, over racism – El Diario de Coahuila

The millennial bot Tay barely had time to learn to speak and interact before being recalled: one day after its release, Microsoft suspended it completely for disseminating racist, sexist and offensive messages.

“Unfortunately, within 24 hours of being placed online, we became aware of a coordinated effort by some users to abuse Tay’s conversational skills so that it would respond inappropriately,” Microsoft said in a statement.

The company said the “attitude” of its new creation was due to a coordinated effort by multiple users to make Tay respond inappropriately.

“I can’t believe they didn’t see this coming,” lamented computer scientist Kris Hammond.

Microsoft said Tay was created as an experiment to learn more about how computers and humans converse. On its website, the company said the program was aimed at an audience of young people between 18 and 24 years old and was designed to attract and entertain people with casual, fun conversation.

The chatbot was launched on Wednesday, and Microsoft invited the public to interact with Tay on Twitter and other services popular among adolescents and young adults.

But some users found it was not very difficult to get Tay to make offensive comments, apparently by prompting it to repeat questions or statements that included offensive messages.

Soon, Tay was sending out messages sympathetic to Hitler, which created a furor on social networks.

Although the company gave no details, Hammond said Microsoft apparently made no effort to prepare Tay with appropriate responses to certain words or topics. Tay appears to be a version of “call and response” technology, said Hammond, who studies artificial intelligence at Northwestern University and also serves as chief scientist at Narrative Science, a company that develops computer programs that turn data into narrative reports.

“Everyone keeps saying that Tay learned this, or that it became racist,” Hammond said. “It didn’t.” The program only reflected what it was told, possibly repeatedly, by people who decided to see what would happen.
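To illustrate the mechanism Hammond describes, here is a minimal sketch of an unfiltered “call and response” bot. The trigger phrase and function names are hypothetical, chosen only for illustration; this is not Microsoft’s actual implementation:

    # Minimal sketch of a "call and response" bot with a naive
    # "repeat after me" feature and no content filtering.
    # All names are hypothetical; this is not Microsoft's code.

    def respond(message: str) -> str:
        # A "repeat after me" trigger makes the bot echo user input
        # verbatim -- the kind of mechanism abusers reportedly exploited.
        prefix = "repeat after me:"
        if message.lower().startswith(prefix):
            return message[len(prefix):].strip()
        # Otherwise fall back to a canned, scripted reply.
        return "lol tell me more!"

    # Any offensive text after the trigger phrase is parroted back
    # unchanged, so the bot simply "reflects what it was told".
    print(respond("repeat after me: <anything users type>"))

Such a bot does not hold opinions at all; it holds a mirror up to its input, which is exactly Hammond’s point.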

Microsoft should have realized that people would try various conversational tactics on Tay, said Caroline Sinders, an expert in “conversational analytics” who works on chat robots for another technology company. She called Tay “an example of bad design.”

Instead of building in guidelines for how the program would handle controversial topics, Tay was apparently left on its own to learn from whatever it was told, Sinders added.

“It’s really a good example of machine learning,” said the expert. “It learns from feedback. That means it needs constant maintenance.”
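A rough sketch of the kind of guideline Sinders has in mind might look like the following. The blocklist approach and every name here are assumptions made for illustration, not a description of Microsoft’s system:

    # Hypothetical guardrail: screen user input against a blocklist of
    # sensitive topics before letting it influence the bot's replies or
    # its training data. Illustrative only.

    BLOCKED_TOPICS = {"hitler", "genocide"}  # illustrative, not exhaustive

    def is_safe(message: str) -> bool:
        lowered = message.lower()
        return not any(topic in lowered for topic in BLOCKED_TOPICS)

    def handle(message: str, training_log: list[str]) -> str:
        if not is_safe(message):
            # Deflect instead of echoing or learning from the input.
            return "I'd rather not talk about that."
        training_log.append(message)  # feedback the model may learn from
        return f"Interesting! You said: {message}"

    log: list[str] = []
    print(handle("tell me about hitler", log))  # deflected, not logged
    print(handle("I like puppies", log))        # echoed and logged

Keyword matching alone would be far too crude for a real deployment; the point is that some filter must sit between raw user feedback and what the bot learns or repeats, and that such filters demand the “constant maintenance” Sinders mentions.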

Sinders said she expects Microsoft to relaunch the program, but only after “doing a lot of work” on it.

Microsoft announced that it is “making adjustments” to Tay, but did not disclose a possible date for its return. Most of the messages on its Twitter account had been deleted by Thursday afternoon.

El Financiero

