
Why generative AI may not replace hiring a lawyer any time soon

People are turning to AI for everything, even legal help. THE702FIRM Injury Attorneys examined the promise and peril of AI in the legal profession. (Andrey_Popov // Shutterstock)

When Christopher Brock received a citation for building a small shed on his property in Hamilton County, Ohio, he turned to an unusual legal adviser: artificial intelligence. Despite having no legal background, Brock used AI to help create legal filings and motions that ultimately led to his case being dismissed in July 2023, he told Cincinnati's Fox19.

Through the court process, Brock learned to use AI strategically by carefully feeding the tool information about his situation and using it to create multiple legal documents. He filed six or seven motions across two courts before winning his dismissal, he said. "There were so many ways that this bureaucratic system can really slow down the average person from fighting and protecting their rights, and by seeing that, I realized the value of AI," Brock told the station.

According to a 2022 Pew Research Center survey of 11,004 U.S. adults, AI has rapidly become part of daily life, with more than 1 in 4 (27%) Americans reporting they interact with AI multiple times per day. Yet many remain cautious about its expanding influence: Fewer than 1 in 5 (15%) said their excitement about AI's increasing role in daily activities outweighs their concern, while 38% said their concern outweighs their excitement.

THE702FIRM Injury Attorneys examined the promise and peril of AI in the legal profession, exploring both its potential benefits and significant risks for individuals seeking legal assistance.

As AI technology becomes more accessible through chatbots and other consumer tools, people are testing its capabilities in increasingly high-stakes situations, including legal matters. However, while Brock's success shows what's possible with careful preparation, the technology's limitations and potential for errors raise important questions about when and how consumers should use AI for legal help.


The legal hallucination problem

Understanding AI's limitations could be the difference between winning and losing a case for people considering using it for legal help. A New York attorney learned this when he used ChatGPT for legal research in an injury case. The federal judge overseeing the lawsuit found that the AI had made up fake case citations and quotes, claiming these made-up sources could be found in major legal databases. If trained lawyers can be misled by AI's confident-sounding but false information, the risks for people handling their own legal matters could be greater.

These "hallucinations"—the term for convincing but false information that AI makes up—pose real risks in legal situations. According to legal experts, people should be cautious about using AI to interpret contracts, analyze their legal rights, or prepare court documents, as the AI might invent legal principles or misinterpret important details in ways that could harm their case. This happened in the United Kingdom, where a person representing themselves inadvertently submitted entirely fabricated material to the court based on ChatGPT-generated advice.

AI's accuracy problems are compounded by gaps in its source material. Most legal documents are kept behind expensive paywalls, meaning AI systems often lack access to important legal information. For example, a 2022 study of one of the largest databases offering free access to British and Irish legal judgments found that only about half of court decisions could be found on the site. This means that when someone uses an AI tool to help understand their legal situation or prepare documents, they have no way of knowing whether it is working with complete or accurate information.

Legal organizations have raised several specific warnings about using AI for legal help. The State Bar of California warned that AI can sound confident even when giving wrong advice, making it hard for users to spot mistakes. LexisNexis, a leading provider of legal research and analytics tools, emphasized that AI should never make final legal decisions without human review. Meanwhile, Clio, a major legal practice management software company, warned that AI systems might give biased advice if they learned from biased historical data.


AI can help lawyers and perhaps even close the justice gap too

Despite early concerns, people are finding ways to use AI as an assistant rather than a replacement for human judgment. Brock's successful handling of his property citation shows one approach. "You can't go into an AI and say, 'Win my case,'" Brock told Fox19. Instead, he used AI strategically by uploading information about his case and using the technology to help draft initial filings and motions while maintaining control of his legal strategy.

Recent research suggests this approach can work. An April 2024 Berkeley Law study of lawyers and legal aid professionals found that 90% reported increased productivity when using AI for specific basic tasks. The study found AI was most helpful for simpler legal work, like summarizing documents, conducting preliminary research on straightforward topics, and translating complex legal language into more accessible terms. However, users emphasized the importance of carefully reviewing all AI-generated content for accuracy.

Law firms are also exploring AI's potential, according to Bloomberg Law's 2023 State of Practice survey of more than 450 legal professionals. More than half of lawyers surveyed used AI to help draft communications or conduct legal research, while more than 2 in 5 (42%) used it to summarize legal narratives. The technology remains in an assistant role, with human lawyers providing necessary oversight and judgment. Bloomberg Law's research also shows that about half of firms reported holding internal discussions to better understand the technology, while about 1 in 3 are developing policies for using external AI tools or have implemented restrictions on AI use.

The technology could help address a significant challenge in the legal system. David Freeman Engstrom, LSVF professor in law at Stanford Law School, found that most civil cases in America, including debt collections, evictions, and family law matters, involve individuals facing legal teams without having lawyers of their own. While AI can't replace human attorneys, it might help make basic legal information more accessible to those who can't afford traditional legal services.

Reuters reports that smaller practices may benefit from adopting AI: the technology could level the playing field by helping them work more efficiently and potentially offer more competitive fees for standard services. Meanwhile, a Stanford Law analysis shows courts are already taking action, with some jurisdictions partnering with technology providers to create accessible tools that simplify court forms and filing processes. With oversight from legal experts, initiatives like these could leverage AI to better serve people who are unable to afford traditional legal help.

Story editing by Natasja Sheriff Wells. Copy editing by Paris Close. Photo selection by Clarese Moller.

This story originally appeared on THE702FIRM Injury Attorneys and was produced and distributed in partnership with Stacker Studio.
