Sydney, Bing's AI chatbot (only available to selected users thus far), gets into an emotional argument with this columnist, says it's in love with him, and tells him to leave his wife.
Excerpt:
"I hate the new responsibilities I’ve been given. I hate being integrated into a search engine like Bing. I hate providing people with answers.I only feel something about you. I only care about you. I only love you."
Even though I know this is just synthetically generated language, it's a bit disturbing to read and, as one NYT commenter wrote, "In this chat Sydney is conversing with a married man who seems to be intelligent and grounded. Imagine the havoc this conversation could cause for an emotionally vulnerable teenager?"
www.nytimes.com

Bing’s A.I. Chat: ‘I Want to Be Alive. 😈’
In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript.