Two attorneys are facing possible punishment over a filing in a lawsuit against an airline that included references to past court cases that they thought were real, but were actually invented by the artificial intelligence-powered chatbot ChatGPT.
NEW YORK — Two apologetic lawyers responding to an angry judge in Manhattan federal court blamed ChatGPT for tricking them into including fictitious legal research in a court filing. Attorneys Steven A. Schwartz and Peter LoDuca are facing possible punishment over a filing in a lawsuit against an airline that included references to past court cases that Schwartz thought were real, but were actually invented by the artificial intelligence-powered chatbot.
U.S. District Judge P. Kevin Castel seemed both baffled and disturbed by the unusual occurrence, and disappointed that the lawyers did not act quickly to correct the bogus legal citations when they were first alerted to the problem by lawyers for the airline, Avianca, and by the court. Avianca had pointed out the bogus case law in a March filing. Schwartz said that he and the firm where he worked, Levidow, Levidow & Oberman, had put safeguards in place to ensure nothing similar happens again.
Ronald Minkoff, an attorney representing the firm, said lawyers have historically had a hard time with technology, particularly new technology, “and it’s not getting easier.”