Why you might want to make sure your lawyer isn't using ChatGPT

This lawyer used ChatGPT for legal research — a decision he now "greatly regrets".


A New York lawyer says he "greatly regrets" using ChatGPT to help prepare a brief after it listed "bogus" citations. Credit: AAP

KEY POINTS
  • A lawyer could be sanctioned after using ChatGPT to help prepare a court filing.
  • The chatbot listed "bogus" legal cases that the lawyer included in his brief.
  • The judge said he had been presented with an "unprecedented circumstance".
A New York lawyer could be sanctioned after he used ChatGPT for legal research that turned out to be false.

Steven A. Schwartz, who has practised law in the state for more than three decades, was part of the legal team of Roberto Mata — a man suing airline Avianca over an alleged incident where a serving trolley struck his knee and injured him.

Mr Schwartz, a lawyer at the firm Levidow, Levidow & Oberman, had prepared a brief that was supposed to use precedent to prove why the case should move forward after Avianca's lawyers asked a federal court judge to toss it out.

But the brief raised the eyebrows of the airline's legal team, who wrote to the judge that they could not find several cases that were cited.
The judge has ordered Mr Schwartz and one of his colleagues, Peter Loduca, to explain why they should not be penalised, saying in an order that he had been presented with an "unprecedented circumstance".

"Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations," Judge P. Kevin Castel wrote.

Mr Schwartz wrote that Mr Loduca's name was listed on the documents because Mr Schwartz is not admitted to practise in federal court — where the lawsuit had been transferred after originally being filed in a state court. He said he continued to perform all of the legal work for the case and that Mr Loduca was unaware he had used ChatGPT to conduct it.
In his own affidavit, Mr Loduca said he had "no reason to doubt" the casework that had been cited or Mr Schwartz's research.

Mr Schwartz wrote that he had "no intent to deceive" the court or Avianca and "greatly regrets" having used ChatGPT, which he said he had never used before while conducting legal research.

He wrote ChatGPT had "provided its legal source and assured the reliability of its content", but ultimately had "revealed itself to be unreliable".
Attached to his affidavit are screenshots of what appears to be part of Mr Schwartz's conversation with ChatGPT.

ChatGPT is asked if one of the cases it provided was real. After it said it was, "S" then asked what the source was.

The chatbot replied that "upon double-checking", the case was legitimate and that it could be found on legal research databases. It also said the other cases it had listed were real.

The lawyers have been ordered to explain why they should not be sanctioned at an 8 June hearing.

Published 29 May 2023 12:28pm
By David Aidone
Source: SBS News


