Call it the Cyber-ano de Bergerac Defense. The eerie new capabilities of artificial intelligence are about to show up inside a courtroom — in the form of an AI chatbot lawyer that will soon argue a case in traffic court. That’s according to Joshua Browder, the founder of a consumer-empowerment startup who conceived of the scheme.

Sometime next month, Browder is planning to send a real defendant into a real court armed with a recording device and a set of earbuds. Browder’s company will feed audio of the proceedings into an AI that will in turn spit out legal arguments; the defendant, he says, has agreed to repeat verbatim the outputs of the chatbot to an unwitting judge.

Browder declined to identify the defendant or the jurisdiction for next month’s court date, citing fears that the judge would catch wind of the planned stunt and block it.

In recent months, the public release of increasingly advanced AI tools has raised questions about everything from high school plagiarism to the very essence of what it is to be human. Now the technology is poised to collide with legal systems and public policies that never anticipated a role for non-human actors — especially not in court.

It’s all part of a tech-policy “arms race,” Browder said, in which AI is shaking up the tools and rules that determine the balance of power between individuals, on one hand, and governments and corporations, on the other. He sees AI as a way to give regular citizens a chance to level the playing field in fights over fines and fees, which they often can’t afford to litigate themselves.

AI has already made inroads into the American legal profession, where big firms routinely use it to assist in the task of reviewing troves of documents that can number in the millions during the discovery phase of litigation. But Nicholas Saady, a litigator at Pryor Cashman who advises on the use of AI in business and legal practice, said this latest application might not fly.
He pointed to a host of procedural and practical issues presented by using AI chatbots for real-time legal representation. “Is it the unauthorized practice of law?” he asked, saying the plan risked running afoul of state laws that require professional licensure for lawyers.

Browder said he has considered this and identified two jurisdictions where his plan is “not outright illegal.” In one of them, he said, he had lined up a defendant to use the AI in a traffic court Zoom hearing.

For the Zoom hearing, Browder said that rather than having the defendant speak the chatbot’s output aloud themselves, he was considering taking the simulation one step further by using an AI tool that can mimic a person’s voice after recording their speech. “Although,” he conceded, “that could get us into a lot of trouble.”

This is not Browder’s first foray into high-tech dispute resolution. His company offers all kinds of DIY legal and consumer help, and he has been using AIs to help customers press their claims with governments and corporations since 2021, largely by generating form letters and scripts for online customer service chats.

One problem he has encountered along the way is that AIs will sometimes make things up to push their case, he said. To prevent this, he said, his company has to formulate elaborate instructions that force the bots to stick to factual statements.

For his courtroom foray, Browder said he is using GPT-J, an open-source AI model released last year. For less sensitive applications, Browder said his company uses models from OpenAI, the company behind the highest-profile of the new AI tools known as large language models.

Browder said he remains in close touch with OpenAI to ensure his company, DoNotPay, does not run afoul of its terms of use. In practice, that has meant adding features — like a two-second delay during which a user can reject an AI’s suggested answer — that ensure a human maintains control of the final results, he said.
OpenAI did not respond to requests for comment.

Browder’s courtroom gambit is just one of the curveballs that the recent slew of advanced AI tools is throwing at the legal profession. A non-peer-reviewed preprint paper published a week ago by two law professors predicts that a large language model will soon be able to pass the multiple-choice section of the Multistate Bar Exam.

But even if an AI could pass the bar, and even if courts allowed AI litigators, Saady said litigants would still be better off hiring flesh-and-blood lawyers. Good litigation, he said, relies on intangibles that remain out of reach for bots: He cited the ability to read body language, and to make split-second strategic decisions in the middle of courtroom exchanges. “It doesn’t seem like AI is ready to get on its feet in court,” he said.

Boston attorney Matt Henshon, on the other hand, said the idea of AI-powered legal counsel holds promise. Henshon, who chairs the American Bar Association’s Artificial Intelligence and Robotics Committee, said it could provide legal help in lower-stakes scenarios where a person would otherwise go without representation at all. In that sense, he likened AI legal counsel to class-action lawsuits, which allow large numbers of plaintiffs to obtain representation by bundling together many small claims that would not be worth litigating on their own.

“There are plenty of legal wrongs that don’t get righted because it’s not worth it for a lawyer to get involved,” Henshon said.

Browder — whose father, Bill Browder, has spearheaded the passage of the Global Magnitsky Act and other laws around the world that sanction human rights violators — said he hopes his techno-legal stunt will help make that case to state legislatures and other rule-making bodies so that they will accommodate the rise of AI lawyers.

“There’s all these gatekeepers, there’s laws, there’s governments to work around,” he said. “It’s not really the technology that’s the hardest part. It’s being in compliance.”