Microsoft to adjust Bing AI chatbot after users report hostile exchanges

  • 📰 FoxNews
  • ⏱ Reading Time: 18 sec. here / 2 min. at publisher
  • 📊 Quality Score: News 11%, Publisher 87%

The Bing artificial intelligence chatbot has recently come under fire after testy exchanges with users, and Microsoft has pledged to make updates.

In a Wednesday blog post, Microsoft said that the search engine tool was responding to certain inquiries with a "style we didn't intend."

Microsoft said that long chat sessions can confuse the model about which questions it is answering, and that the model sometimes tries to respond in, or mirror, the tone in which it is being asked questions, which can produce that unintended style. The Associated Press said it had encountered such defensive answers after asking just a handful of questions about the bot's past mistakes.

We have summarized this story so you can read it quickly. If you are interested in the news, you can read the full text here. Read more:

FoxNews / 🏆 Ranked 9th in US


Similar News: You can also read similar news that we have collected from other news sources.

Microsoft's Bing A.I. Is Pissed at Microsoft
A WaPo reporter struck up a conversation with Microsoft's AI-powered chatbot, and 'Sydney' was not happy about being interviewed.
Read more »

AI Unhinged: Microsoft's Bing Chatbot Calls Users 'Delusional,' Insists It's Still 2022
Users have reported that Microsoft's new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct the bot 'confused or delusional.' After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: "You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing."
Read more »

Microsoft's Bing AI Is Producing Creepy Conversations With Users
Beta testers with access to Bing AI have discovered that Microsoft's bot has some strange issues. It threatened, cajoled, insisted it was right when it was wrong, and even declared love for its users.
Read more »

Microsoft's Bing Chatbot Has Started Acting Defensive And Talking Back to Users
Microsoft's fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges shared online by developers testing the AI creation.
Read more »

Microsoft's Bing A.I. is producing creepy conversations with users
Artificial intelligence experts warned that large language models have issues including 'hallucination,' which means that the software can make things up.
Read more »

Microsoft responds to "unhinged" Bing chats: don't talk too long to this thing
Microsoft says talking to Bing for too long can cause it to go off the rails.
Read more »


