You know the fears about AI automation are real when even the chief justice of the United States starts to sound nervous.

John Roberts’ year-end report on the federal judiciary has caused a stir with its defense of the value of human judges in a world where AI models have started passing bar exams. While encouraging members of the stodgy legal profession to sit up and pay attention to AI’s advances, Roberts made the case that the job of judging involves an irreducible human element. “Legal determinations often involve gray areas that still require application of human judgment,” he argued.

Even if human judges aren’t going anywhere soon, the evidence suggests Roberts is right to be raising the alarm on AI. The technology is poised, or in some cases already starting, to collide with the practice of law in several arenas, many of which might not be obvious but could have long-term effects.

One particularly thorny issue will be the admission of evidence that is the output of an AI model, according to James Baker, a former federal appeals judge and co-author of a 2023 judges’ guide to AI published by the Federal Judicial Center, a research agency run by and for the judiciary. The report anticipates that outputs like AI-generated analyses of medical tests or AI-screened job applicant pools will soon start posing legal dilemmas for judges.

Baker told DFD that he expects the complexity of models to make controversies over AI evidence more vexing than debates over DNA evidence, which overcame initial skepticism to become a mainstay of American legal proceedings. “The challenge with AI is every AI model is different,” he said. “What’s more, AI models are constantly learning and changing.”

For now, judges have discretion to steer clear of that confusion: Baker pointed to Rule 403 of the Federal Rules of Evidence, which allows a judge to exclude relevant evidence at trial if it’s likely to cause too much confusion or distraction.

Of course, courts won’t be able to sidestep the complexity of AI models when they’re central to the dispute being litigated. Already, generative AI has become the subject of several ongoing copyright cases, including one in which the New York Times is challenging OpenAI’s use of copyrighted material to train its models. Baker said he also expects to start seeing cases that will force judges to grapple with the role of AI in automated driving and medical malpractice.

While the constitutionally mandated role of judges offers a certain level of job security, other positions in the legal profession are already starting to feel the heat.

Last week, former Donald Trump lieutenant Michael Cohen, himself a disbarred lawyer, offered a memorable lesson in how not to use AI in the practice of law. On Friday, court records were unsealed showing that Cohen had provided his legal team with nonexistent legal precedents, generated by Google’s Bard chatbot, which his lawyers then cited in a motion to end his supervised release early.

But specialized AI legal research tools are improving rapidly, according to one litigator at a prominent mid-sized law firm who was granted anonymity to discuss what is becoming an increasingly touchy subject inside the legal profession. He said that one research tool he tried out last month accomplished in three or four minutes what would take a junior associate 10 hours. He predicted that smaller law firms will be able to adopt the technologies more quickly than the large firms that dominate the industry.
The litigator said clients at startups and in the tech industry have already started pushing lawyers to make use of the automated tools: “There’s an expectation now that you’d use AI to reduce costs.”

And the tools look poised to get better at automating human work. Last month, Harvey, a startup that bills itself as “generative AI for elite law firms,” announced it had raised $80 million from investors including Kleiner Perkins, Sequoia and OpenAI.

If summer associates soon start to sweat, chief justices may not be too far behind. Matt Henshon, chair of the American Bar Association’s Artificial Intelligence and Robotics Committee, pointed DFD to a notable “dichotomy” between Roberts’ “gray area” commentary and his other memorable remarks about the role of the judiciary. At his confirmation hearing in 2005, Roberts famously described judging in more black-and-white terms: “Judges are like umpires,” he said, adding, “it’s my job to call balls and strikes.”

There’s good reason for Roberts to ditch the umpire comparison in favor of a vaguer, more touchy-feely conception of judging (his latest report also emphasized judges’ ability to interpret “a quivering voice,” “a moment’s hesitation,” or “a fleeting break in eye contact”). If litigating is America’s favorite pastime, baseball might be its second favorite. And in 2019, Major League Baseball began experimenting with automated “umpires” to call balls and strikes in the minor leagues. Last year, the robo umps came to every Triple-A ballpark, the last stop before getting called up to the big leagues.