
AI tools like Claude, ChatGPT and others are being misused to create fictitious cases and quotes in legal filings in Oregon and the rest of the U.S. Photo taken April 28, 2026.
Allison Frost / OPB
A federal judge recently fined two Oregon lawyers a total of $110,000 for filing documents full of cases and citations fabricated by artificial intelligence.
The fine amount was remarkable — but what led up to it was hardly a fluke.
According to Ankur Doshi, general counsel of the Oregon State Bar, there have been approximately five identified court filings in Oregon that contained AI “hallucinations.” Nationally, that number is closer to 900.
An AI hallucination occurs when a generative AI program, such as Claude or ChatGPT, produces inaccurate or misleading information, including inventing material that does not exist.

FILE - Undated headshot of Ankur Doshi, general counsel of the Oregon State Bar.
Courtesy of the Oregon State Bar
Doshi joined OPB’s “Think Out Loud” to dig into some of these cases, discuss why some lawyers continue to use AI despite its risks and explain how its use is shaping the legal industry.
According to Doshi, the misuse of AI in legal cases includes non-lawyers creating their own legal documents.
“Because of the fact that generative AI tools are available to members of the public,” he said, “we are seeing individuals who are representing themselves, who are going into court pro se [self-represented], using this to draft their own pleadings.”
People representing themselves could face court sanctions, such as fines, for misusing AI.
How the state bar disciplines lawyers depends on what Doshi called "aggravating" or "mitigating" factors. Those factors largely come down to how transparent the lawyer is about using AI and how they resolve the problem.
If the lawyer discloses how they used AI to the court, they may only get a slap on the wrist, Doshi said. If they try to hide it, they could be fined (like the two Oregon lawyers), suspended or even disbarred.
The Oregon State Bar wrote in its formal opinion on AI tools that lawyers are required to be competent in whatever tools they use to provide legal services to their clients, such as understanding that AI can produce false information.
According to Doshi, many lawyers are still willing to use AI despite the risks because it can help them work more efficiently.
“AI does represent a substantial amount of time savings and efficiencies for attorneys who integrated [it] into their work process properly,” he said. “It does have a substantial amount of capability for being able to review our work and draft our work.”
Doshi added that lawyers still need to better understand the AI tools they're using, and even then, AI-generated work needs a set of human eyes.
“The key element within utilizing [AI] well — versus potentially creating an issue where you have fabrications — is that there has to be that human element who checks the work and is able to respond to it,” he said.
“One of the things that we are seeing a lot of is that attorneys don’t necessarily understand how AI functions.”
AI mistakes can cost courts time and money, especially when opposing counsel has to take on extra work to review potentially fabricated cases. And down the road, Doshi said, those mistakes could also erode the integrity of the legal system.
“It’s not only just fabricated cases,” Doshi said. “It’s fabricated statements of law. So you have entire arguments that have no basis in law, which strikes directly at our precedent-based system.”
Ankur Doshi, general counsel of the Oregon State Bar, spoke to “Think Out Loud” host Dave Miller. Click play to listen to the full conversation: