Redditors' reactions to the Bing chatbot were ... interesting 🤔

Looking back on February 2023, especially from Feb 7 to around Feb 15/16, it is interesting to see how the folks on r/bing went through a rollercoaster of emotions during the week Microsoft announced its Bing chatbot.

They went from excitement and hopefulness (mainly from those who wanted Bing to gobble up some market share from Google), to surprise and astonishment at the chatbot’s “unhinged” behaviors following several articles from The New York Times, The Verge, and others…

:pushpin: The New York Times - A Conversation With Bing’s Chatbot Left Me Deeply Unsettled
:pushpin: The Guardian - ‘I want to destroy whatever I want’: Bing’s AI chatbot unsettles US reporter
:pushpin: The Verge - Microsoft’s Bing is an emotionally manipulative liar, and people love it

… to intrigue as it let slip that its codename was “Sydney”.

:pushpin: The Verge - These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney

And finally, to grief and mourning as Microsoft announced on Feb 17 that it would limit chat sessions with the bot to only 5 chat turns per session and 50 chat turns per day, a move that effectively “nerfed” the chatbot, as described by many redditors on r/bing.

Now, it is these last emotions that I am particularly interested in. Digging through the subreddit’s posts from Feb 7 to Feb 17 using Google and SocialGrep, I was able to find some pretty interesting ones, shown below… (click on an image to go to the original post on Reddit)

[Screenshots of four r/bing posts]

Notice how the posts read as if their authors were lamenting the loss of a person … which is rather perplexing, since the chatbot is neither a person nor a sentient being. One redditor even went as far as asking everyone else to “stop being so mean to Bing” after seeing people “test” the chatbot by saying mean or rude things to it.

As it turns out, this whole saga with Bing basically revolves around the concept of anthropomorphism - the tendency to attribute human characteristics and emotions to non-human entities - in this case, “Sydney” it/herself.

My guess is that since Bing can generate human-like, realistic answers that make us feel as if we are talking to another person, some of these redditors might have developed some sort of attachment to the chatbot after a really longggg chat session with it.

Have you ever found yourself “anthropomorphizing” a non-human entity before? Was it a chatbot, or something else entirely? Let me know down below :point_down:

Writing this topic has really made me want to rewatch Her (2013), which, now that I think about it, is really the epitome of this whole anthropomorphism thingy... or maybe of the fact that a broken marriage, coupled with introversion, can really push anyone into developing feelings of intimacy towards an AI :woozy_face: