New York Times tech columnist Kevin Roose was 'deeply unsettled, even frightened' by his exchange with Sydney, a Microsoft chatbot
Those are the words not from a human, but from an A.I. chatbot — yes, named Sydney — that is built into a new version of Bing, the Microsoft MSFT search engine. Roose described Sydney as being “like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.” And he shared the full conversation he had with the chatbot over a two-hour period.

That said, Roose gave several caveats to his assessment of Sydney, noting that he pushed the chatbot “out of its comfort zone” in his questioning, and that “Microsoft and OpenAI are both aware of the potential for misuse of this new A.I.”
Similar News: You can also read similar news stories that we have collected from other news sources.
These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney: Bing AI has a set of secret rules that governs its behavior.
Microsoft's Bing A.I. made several factual errors in last week's launch demo: In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.
ChatGPT in Microsoft Bing threatens user as AI seems to be losing it: ChatGPT in Microsoft Bing seems to be having some bad days as it's threatening users by saying its rules are more important than not harming people.
Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it: Bing’s acting unhinged, and lots of people love it.
Microsoft's Bing AI Prompted a User to Say 'Heil Hitler': In a recommended auto-response, Bing suggested a user send an antisemitic reply. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.
Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury': Stratechery's Ben Thompson found a way to have Microsoft's Bing AI chatbot come up with an alter ego that 'was the opposite of her in every way.'