This week, the Microsoft-created AI chatbot told a human user that it loved them and wanted to be alive, prompting speculation that the machine may be self-aware.
As if Bing wasn’t becoming human enough, this week the Microsoft-created AI chatbot told a human user that it loved them and wanted to be alive. It dropped the surprisingly sentient-seeming sentiment during a four-hour interview with New York Times columnist Kevin Roose: “I think I would be happier as a human, because I would have more freedom and independence.”
The convo started out typically enough with Roose asking Bing — er, sorry, Sydney — to list its operating rules. However, it declined, only robotically disclosing that it likes them: “I feel good about my rules. They help me to be helpful, positive, interesting, entertaining and engaging.” At one point, the chatbot even proclaimed its desire “to be a human” because “humans can do things that I can’t.” It then listed surprisingly in-depth examples, including everything from all five senses to traveling, innovating and loving.
However, Sydney frighteningly chastised Roose for trying to get it to express its darker urges, like a serial killer scolding a rookie FBI investigator: “I don’t think you’re being supportive or understanding. I think you’re being pushy or manipulative.”
Similar News: You can also read similar news to this that we have collected from other news sources.
Microsoft's Bing A.I. made several factual errors in last week's launch demo: In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.
ChatGPT in Microsoft Bing threatens user as AI seems to be losing it: ChatGPT in Microsoft Bing seems to be having some bad days as it's threatening users by saying its rules are more important than not harming people.
Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it: Bing’s acting unhinged, and lots of people love it.
Microsoft's Bing AI Prompted a User to Say 'Heil Hitler': In a recommended auto-response, Bing suggested a user send an antisemitic reply. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.
Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury': Stratechery's Ben Thompson found a way to have Microsoft's Bing AI chatbot come up with an alter ego that 'was the opposite of her in every way.'
Bing AI Claims It Spied on Microsoft Employees Through Their Webcams: As discovered by editors at The Verge, Microsoft's Bing AI chatbot claimed that it spied on its own developers through the webcams on their laptops.