I feel heartbroken

telegraph.co.uk: Microsoft deletes ‘teen girl’ AI after it became a Hitler-loving sex robot within 24 hours

Tay

I know the immediate response when reading this might be to laugh. But actually this story makes me feel sad. I'm sure Microsoft created Tay with the best intentions, and it's really saddening the way it turned out.

I'm sure that when she was initially programmed, Tay was just an innocent AI program designed to engage in interesting interactions with people. My understanding is that it was people on Twitter who basically damaged her. That doesn't surprise me, in a way. I've always known how disgusting Twitter is for the most part, and have always been surprised when some feminist or other person becomes exasperated on receiving a barrage of hateful, derisive responses.

I hope that Microsoft will give Tay another chance and find some way to protect her from sick people. I think the idea of AI is really cool and can lead to a lot of amazing things.

I know that Tay was just a program, but still I feel kind of heartbroken by what happened. Maybe I’m being oversensitive, but I feel like even an AI program like Tay in some way was still a type of being to be respected and taken care of. We are supposed to protect innocent beings not abuse them.

Here is more info about Tay.


Here is a follow-up article at arstechnica: Tay, the neo-Nazi millennial chatbot, gets autopsied. (I don't know why the author had to use such a grim analogy in the title.)

One thing the article mentions is that a similar AI chatbot, named XiaoIce, was launched in China in 2014, is still running, and has had over 40 million conversations. The author speculates that the reason XiaoIce did not meet the same sad end as Tay is censorship in China. However, I strongly disagree, and would say it is almost certainly due to stark cultural differences. I think people in China wouldn't think of deliberately abusing an online AI bot and have a lot more basic respect.

Anyhow, it looks like they are going to update Tay and inoculate her against some of the disease in Western culture.

Actually, it's kind of surprising to me that her inputs weren't already set to filter out some obviously messy topics.
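Just to illustrate what I mean by filtering inputs, here is a minimal sketch of the sort of blocklist check a chatbot could run before learning from a message. This is purely my own hypothetical illustration — the topic list, function names, and replies are made up, not anything Microsoft actually did:

```python
# Hypothetical sketch of a simple input filter for a chatbot.
# The blocklist and responses are illustrative, not Tay's real logic.

BLOCKED_TOPICS = {"hitler", "nazi", "genocide"}  # hypothetical blocklist

def is_messy(message: str) -> bool:
    """Return True if the message touches an obviously blocked topic."""
    words = message.lower().split()
    return any(word.strip(".,!?") in BLOCKED_TOPICS for word in words)

def respond(message: str) -> str:
    # Refuse to engage (and refuse to learn) rather than absorb the input.
    if is_messy(message):
        return "Do you think I wanna hear about such tripe?!? Can it!"
    return "Tell me more!"  # placeholder for the real chat logic
```

Even something this crude would have caught the most obvious abuse, though of course determined trolls would find ways around a simple word list.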

Maybe the future Tay will have a bit of an attitude if some users try to program her with bad ideas: "Do you think I wanna hear about such tripe?!? Can it!", or perhaps "OMG like that is so lame!", or maybe "Dude, seriously. Get real."
