Soon after the brief was filed, Avianca informed the court it could not locate a single authority or quotation cited in the brief. Mata's lawyers then filed a compendium of the actual cases referred to in the brief, but these too were found to be bogus. In an affidavit filed with the court, Schwartz said he "deeply regrets" using the platform to prepare his legal research and that he found ChatGPT to be "a source that has revealed itself to be unreliable."
In an order from the court addressing the fictional case brief, Judge P. Kevin Castel wrote that the situation was unprecedented: he had been presented with a legal submission filled with "bogus judicial decisions, with bogus quotes and bogus internal citations." Lucky for us, Judge Castel and Avianca's lawyers conducted a diligent review of the case law being advanced by the plaintiff. If the ChatGPT brief had somehow slipped through the cracks, our entire legal system would have been left in a perilous state. Precedent after precedent could be replaced by bot-generated, fabricated cases that don't accurately represent the legal issues tackled by our courts.
The law, at its core, is people-based. It has been, and forever should be, people representing people. Precedent is built by parties being judged by a jury of their peers or by a judge weighing the evidence. This case tells us that protecting our courts from the unchecked dangers of artificial intelligence is of primary importance.

Have a workplace question? Maybe I can help! Email me at sunira@worklylaw.com and your question may be featured in a future column.