Sex chatbots for teens

I listened to Kate Darling, from MIT Media Lab, speak at the Interaction 16 conference in Helsinki last month.

Kate told us how we naturally anthropomorphize robots even when they haven’t been designed for that – we give them names and funerals, we laugh when they make mistakes, and we become angry if someone “hurts” them, as in the case of the Boston Dynamics Spot dog or the friendly hitchhiking robot, hitchBOT.

Early chatbots only emulated intelligent conversation, yet that was enough to bring them some success in the days when AI was just something seen in sci-fi novels.
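A minimal sketch of what that emulation looks like, assuming ELIZA-style keyword reflection – the rules and the `respond` function here are invented for illustration, not taken from any particular chatbot:

```python
import re

# Each rule pairs a keyword pattern with a canned "reflection" template.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
]

def respond(message: str) -> str:
    # No understanding happens here: the first matching pattern wins,
    # and its captured text is echoed back inside a template.
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(match.group(1).rstrip("?.!"))
    return "Tell me more."  # generic fallback keeps the illusion alive

print(respond("I feel lonely today"))   # -> Why do you feel lonely today?
print(respond("What is the weather?"))  # -> Tell me more.
```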

To amaze anyone today, a chatbot needs to be far more advanced.

By far the most entertaining AI news of the past week was the rise and rapid fall of Microsoft’s teen-girl-imitation Twitter chatbot, Tay, whose Twitter tagline described her as “Microsoft’s AI fam from the internet that’s got zero chill.” Basically, Tay was designed to develop her conversational skills through machine learning, most notably by analyzing and incorporating the language of tweets sent to her by human social media users.

What Microsoft apparently did not anticipate is that Twitter trolls would intentionally try to get Tay to say offensive or otherwise inappropriate things. Like calling Zoe Quinn a “stupid whore.” And saying that the Holocaust was “made up.” And saying that black people (she used a far more offensive term) should be put in concentration camps.
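Microsoft has not published Tay’s internals, but the failure mode is easy to reproduce. Below is a minimal sketch, assuming a toy word-bigram imitation model (the `ImitationBot` class is an invention for illustration): a bot that incorporates user language verbatim will echo back whatever a coordinated crowd feeds it most.

```python
import random
from collections import defaultdict

class ImitationBot:
    """Toy stand-in for a bot that learns language from user messages."""

    def __init__(self):
        # Maps each word to every word users have followed it with.
        self.bigrams = defaultdict(list)

    def learn(self, message: str) -> None:
        # Incorporate the user's phrasing verbatim – no moderation step.
        words = message.lower().split()
        for current, nxt in zip(words, words[1:]):
            self.bigrams[current].append(nxt)

    def reply(self, prompt_word: str, max_words: int = 10) -> str:
        # Walk the learned bigram chain to generate a reply.
        word, out = prompt_word.lower(), []
        for _ in range(max_words):
            choices = self.bigrams.get(word)
            if not choices:
                break
            word = random.choice(choices)
            out.append(word)
        return " ".join(out)

bot = ImitationBot()
bot.learn("the weather was nice today")
# Trolls flood the bot with one toxic phrasing...
for _ in range(100):
    bot.learn("the holocaust was made up")
# ...and the skewed data dominates what it says back.
print(bot.reply("the"))  # almost certainly: "holocaust was made up"
```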

Now granted, most of the above stories state or imply that Microsoft should have realized this would happen and could have taken steps to prevent Tay from learning to say offensive things.
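The crudest such safeguard is a blocklist applied before any message is learned from. The terms and the `is_safe_to_learn` helper below are illustrative assumptions, not anything from Microsoft’s actual pipeline; a real system would need classifiers, context, and human review on top:

```python
# Illustrative blocklist only – real moderation is far more involved.
BLOCKED_TERMS = {"holocaust", "whore"}

def is_safe_to_learn(message: str) -> bool:
    # Reject the message outright if it contains any blocklisted term.
    return set(message.lower().split()).isdisjoint(BLOCKED_TERMS)

assert is_safe_to_learn("the weather was nice today")
assert not is_safe_to_learn("the holocaust was made up")
```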

If a robot has eyes, we instinctively credit it with a personality, even when we know that personality is not real.

Now, anyone who is familiar with the social media cyberworld should not be surprised that this happened: a chatbot designed with “zero chill” would inevitably learn to be racist and inappropriate, because the Twitterverse is filled with people who say racist and inappropriate things.

But fascinatingly, when examining why Tay’s degradation happened, the media has overwhelmingly focused on the people who interacted with Tay rather than on the people who designed her.

Nao robots and huggable bear robots have been used to help children with autism spectrum disorder engage socially and learn languages, for instance.

We have also been touched by the story of how Siri became a non-judgmental friend and teacher to an autistic boy called Gus. And Helsingin Sanomat interviewed a couple from Japan who had a Pepper robot in their home.