Fake citations in legal brief were generated by Google Bard AI program, says ex-Trump lawyer Michael Cohen
Michael Cohen, who was once a lawyer for former President Donald Trump, exits the courtroom of Trump’s civil business fraud trial at the New York Supreme Court in New York in October. Photo by Yuki Iwamura/The Associated Press.
Former lawyer Michael Cohen has informed a federal judge that the nonexistent cases cited in a legal brief were generated by Google Bard, an artificial intelligence program.
Cohen, who was once a lawyer for former President Donald Trump, was under the impression that Google Bard was a “super-charged search engine,” rather than an AI program, his lawyer, E. Danya Perry of Perry Law, said in a letter to the judge unsealed Friday.
Thinking that the cases he found were real, Cohen provided them to David M. Schwartz, the lawyer seeking an early end to his supervised release, Cohen said in an affidavit submitted with Perry’s letter. Cohen, who is now disbarred, pleaded guilty in 2018 to campaign finance violations and bank and tax fraud.
The New York Times, Reuters and Courthouse News Service are among the publications covering the declaration.
Cohen had used Google Bard “to successfully identify accurate information in other contexts before and did not appreciate its unreliability as a tool for legal research,” Perry said in her letter.
Schwartz included the citations in his brief without checking them because he was under the mistaken impression that Perry, who was representing Cohen in another matter, had provided them. Perry did give Cohen “very cursory notes on an early draft of the motion” that did not, at that time, include the fake citations. But, contrary to what Schwartz came to believe, she did not review later drafts of the motion, she said in her letter to the judge.
Perry had represented Cohen in a separate case against Trump in New York state court. Perry sought to enter the supervised release case on Cohen’s behalf because she thought that government lawyers had mischaracterized Cohen’s testimony in the Trump case. In preparing a reply motion to the government filing, Perry discovered that the citations were fake and disclosed the problem to U.S. District Judge Jesse Furman of the Southern District of New York.
The disclosure led Furman to issue an order to show cause why Schwartz should not be sanctioned. The judge has not yet ruled on the issue.
Perry said she did not intend to imply bad faith on Schwartz’s part. But, she added, “even a quick read of the nonexistent cases at issue here should have raised an eyebrow.”
One citation, for example, had a 2021 docket number, suggesting that the defendant had been indicted in 2021. But the defendant had purportedly served a 120-month sentence and been placed on supervised release, a decision said to be affirmed by a federal appeals court. That would be “a chronological impossibility,” Perry wrote.
AI has ensnared at least a dozen lawyers or litigants who used its case citations in legal filings, Eugene Volokh, a professor at the University of California at Los Angeles School of Law, told the New York Times.
Chief Justice John Roberts focused on AI in his 2023 Year-End Report on the Federal Judiciary, report the National Law Journal and SCOTUSblog.
AI could be used in the future to help increase access to justice by providing answers to basic questions and directing litigants where to find templates and court forms, Roberts said. As the technology evolves, courts will have to consider its proper use in litigation, he wrote.
“Any use of AI requires caution and humility,” Roberts added.
He referred to a previous instance in which an AI “hallucination” produced citations to nonexistent cases, which were then included in a brief.
“Always a bad idea,” he observed.
See also:
“Response to fake-citations query implicates attorney-client privilege, Michael Cohen’s lawyer says”
“Lawyers who ‘doubled down’ and defended ChatGPT’s fake cases must pay $5K, judge says”