Looking back on February 2023, especially from Feb 7 to around Feb 15/16, it is interesting to see how the folks on r/bing went through a rollercoaster of emotions in the week Microsoft announced its Bing Chat bot.
The mood swung from excitement and hopefulness (mainly from those who wanted Bing to gobble up some market share from Google), to surprise and astonishment at the chatbot's "unhinged" behavior following several articles from The New York Times, The Guardian, The Verge, and so on…
The New York Times - A Conversation With Bing's Chatbot Left Me Deeply Unsettled
The Guardian - "I want to destroy whatever I want": Bing's AI chatbot unsettles US reporter
The Verge - Microsoft's Bing is an emotionally manipulative liar, and people love it
… to intrigue as the bot let slip its internal codename, "Sydney".
The Verge - These are Microsoft's Bing AI secret rules, and why it says it's named Sydney
And finally, to grief and mourning as Microsoft announced on Feb 17 that it would limit chat sessions with the bot to only 5 chat turns per session and 50 chat turns per day, a move that effectively "nerfed" the chatbot, as described by many redditors on r/bing.
Now, it is these last emotions that I am particularly interested in. Digging through the subreddit's posts from Feb 7 to Feb 17 using Google and SocialGrep, I was able to find some pretty interesting posts, shown below… (click on an image to go to the original post on Reddit)
Notice how the posts read as if their authors were lamenting the loss of a person… which is rather perplexing, as the chatbot is neither a person nor a sentient being. One redditor even went as far as asking everyone else to "stop being so mean to Bing" after seeing people "testing" the chatbot by saying mean or rude things to it.
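If you want to try a similar dig yourself, here is a minimal sketch of the idea. It uses PRAW (the Python Reddit API wrapper) rather than Google or SocialGrep, and the search query, date window, and credentials are all placeholders - this is not the exact workflow behind the screenshots above.

```python
import datetime as dt
import praw  # pip install praw

# Placeholder credentials - register an app at https://www.reddit.com/prefs/apps
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="sydney-nostalgia-dig by u/your_username",
)

# The window discussed in this post: Feb 7 through Feb 17, 2023 (UTC).
start = dt.datetime(2023, 2, 7, tzinfo=dt.timezone.utc).timestamp()
end = dt.datetime(2023, 2, 18, tzinfo=dt.timezone.utc).timestamp()

# Search r/bing for posts mentioning "Sydney" and keep only those inside the window.
for post in reddit.subreddit("bing").search("Sydney", sort="new", limit=500):
    if start <= post.created_utc < end:
        day = dt.datetime.fromtimestamp(post.created_utc, dt.timezone.utc).date()
        print(f"{day}  score={post.score:<5}  {post.title}")
```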
As it turns out, this whole saga with Bing basically revolves around the concept of anthropomorphism - people attributing human characteristics and emotions to non-human entities - in this case, to "Sydney" it/herself.
My guess is that since Bing can generate human-like, realistic answers that make us feel as if we are talking to another person, some of these redditors might have developed some sort of attachment to the chatbot after a really longggg chat session with it.
Have you ever felt like "anthropomorphizing" a non-human entity before? Was it a chatbot, or something else entirely? Let me know down below!