AI Progress Update: Artificial Intelligence in Legal Practice

Artificial intelligence tools are multiplying rapidly in the legal technology industry, promising dramatic changes in how common legal tasks are accomplished


Artificial intelligence (“AI”) has become, of late, a staple of discussion in legal industry publications.  Each week seems to bring new product announcements, new panel discussions on its future, or new editorials on its potential for disruption.  Despite this ubiquitous hype, a clear definition of what we mean by AI has not emerged, areas of effective application remain narrow, and actual adoption rates are still low.  So, what’s all the fuss about, and is it warranted?

In this article, we will attempt to answer that question by providing an overview of what we mean by AI, some recent examples of its applications in legal practice, one area where it is measurably excelling, the burdens of AI training, and the actual rate of AI adoption.

What We Mean by Artificial Intelligence

Far from being a specific, technical term, artificial intelligence just refers to “intelligence exhibited by machines.”  Unfortunately, this means that our definition of AI fluctuates over time with our perceptions of intelligence.  Last year, the New York Times Magazine captured this paradox perfectly in an article on Google’s use of AI to reinvent its translation tools:

The phrase “artificial intelligence” is invoked as if its meaning were self-evident, but it has always been a source of confusion and controversy.  Imagine if you went back to the 1970s, stopped someone on the street, pulled out a smartphone and showed her Google Maps . . . Google Maps would almost certainly seem to her a persuasive example of “artificial intelligence.”  In a very real sense, it is.  It can do things any map-literate human can manage, like get you from your hotel to the airport – though it can do so much more quickly and reliably. . . . 

Practically nobody today, however, would bestow upon Google Maps the honorific “A.I.,” so sentimental and sparing are we in our use of the word “intelligence.”  Artificial intelligence, we believe, must be something that distinguishes HAL from whatever it is a loom or wheelbarrow can do. . . .  The goal posts for “artificial intelligence” are thus constantly receding.

Thus far, the AI tools being created have more in common with a loom or a wheelbarrow than with our cultural idea of a HAL-style general AI exhibiting human-like cognition.  Everything we’re making now is a weak, narrow AI, good at performing a specific human function, but not in a human way or in a way adaptable to any other tasks.  In the legal technology space, the term is currently being applied to a new breed of research and review tools that can do more on their own with less user instruction and that can get better over time based on user feedback.

How Artificial Intelligence is Being Applied to Legal Practice

One of the first high-profile applications of AI to legal work came in 2015 when IBM’s Jeopardy-winning Watson was leveraged to create ROSS.  That tool has since been licensed by a variety of firms and organizations to perform basic legal research functions:

When asked a question, Ross postulates hypotheses, researches and then generates a response, backing up its conclusions with references and citations.  It narrows down research results by selecting only the most highly relevant answers, and presents the answers in a more casual and understandable way. 

The AI also reportedly learns from its own experience; the more you interact with it, the more it gains speed and knowledge.

In the three years since, AI has started popping up in a wide range of legal technologies and services, primarily as a way to automate common research, review, and drafting tasks.  Here are some examples from just the past few months:

  • Three Startups Are Using AI in Law for Noble Purposes
    • Paladin set out to build a technology platform that helps teams to staff, manage, track and visualize their impact all in one place.
    • Road to Status is an online platform that helps people file immigration applications. It works like TurboTax by asking questions and providing explainer text to help applicants respond, with attorney review or assistance when necessary.
    • Launched in December 2017, HelpSelf Legal provides legal automation for the types of tasks that are disproportionately needed by low- and moderate-income users. The service currently only assists with domestic violence restraining orders in California, but there are plans to add eviction defense, naturalization, expungements and juvenile record-sealing in other geographies.
  • The Role of Artificial Intelligence in Legal Operations
    • Using AI-enabled technology helps you more efficiently manage incoming legal invoices and improve cost management. AI can analyze millions of invoice line items submitted by law firms to corporate legal and claims departments, comparing invoices against outside counsel billing guidelines.  The AI technology processes a huge amount of data very quickly.  Machine learning also allows for improved performance on subsequent invoices.
  • AI in Law Firms to Deliver New Models of Legal Services
    • The Chicago-based company’s Story Engine product uses AI to find relevant information and key documents in e-discovery and investigations. The product enables firms to quickly identify and understand significant players and topics in client data (such as money, dates, people, locations, and organizations), giving them a tactical advantage in preparing and strategizing for court proceedings and helping them get on top of regulatory investigations with data profiles and summarized facts.
  • How Ogletree Deakins Overcame Legal AI Burnout
    • LegalMation’s tool currently drafts responses and initial discovery requests for employment and slip-and-fall suits filed in California. They said the tool can reduce the time those tasks take by 80 percent.  By this fall, the product should cover those cases in all 50 states.
  • Casetext Announces AI Search Integration into Revamped Research Tool
    • Up until recently, the AI-powered CARA had been mainly deployed as a tool attorneys could ‘drag and drop’ their brief into. CARA would then analyze the brief to find potential missing arguments or case law.  But now, ‘we are integrating our legal research engine with CARA,’ said Jake Heller, CEO of Casetext.

An Area Where AI Excels

One area where AI tools have particularly excelled in legal practice is contract review.  One AI contract review tool provider recently organized a formal study, “which was administered by independent attorney Christopher Ray and involved a stable of high-profile legal scholars,” pitting its tool against 20 experienced attorneys.  The study involved the review of “five nondisclosure agreements from the Enron data set” for the identification of 30 provisions.

On average, the attorneys achieved an accuracy rate of 85%, while the AI achieved an accuracy rate of 94%.  And, while the “fastest human attorney completed the task in 51 minutes,” the AI did it in 26 seconds.

Despite being both more accurate and dramatically faster at this task, a tool like this remains narrowly limited in scope:

When you are drafting complicated contracts or putting legal arguments forward, there are good arguments to say the amount of creativity a human can bring to the table is going to significantly outperform a robot or AI.  However, the study we did was on the exact opposite side of the spectrum.  Looking through an NDA and issue-spotting requires little to no creativity.  It’s almost like the legal equivalent of a spell-check. 

The AI was not told to offer solutions.  It was not told to make recommendations when it found an issue.  It was told: Read through this contract and tell us what’s in it.  Tasks like that, yes, robots have proven consistently that they can do better than humans.  If you don’t believe me, write a 10,000-page article in Microsoft Word and give it to a human to spell-check.  We’re talking about the most basic work that lawyers today do.  [emphasis added]
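
To make the “legal spell-check” analogy concrete, the following is a minimal, hypothetical sketch of what provision-spotting in an NDA looks like as a programming task.  The provision names, keyword lists, and matching logic below are illustrative assumptions only; they are not a description of the tool evaluated in the study, which relies on trained models rather than keyword lists.

```python
# Toy illustration of contract "issue-spotting": flag which common NDA
# provisions appear in a document.  Simple keyword matching stands in here
# for the trained models a real AI review tool would use.

# Hypothetical provision names and telltale phrases (assumptions).
PROVISION_KEYWORDS = {
    "Definition of Confidential Information": ["confidential information means"],
    "Term / Duration": ["shall remain in effect", "term of this agreement"],
    "Governing Law": ["governed by the laws of"],
    "Return of Materials": ["return or destroy"],
}


def spot_provisions(contract_text: str) -> dict:
    """Return a provision-name -> found? map for a single contract."""
    text = contract_text.lower()
    return {
        provision: any(phrase in text for phrase in phrases)
        for provision, phrases in PROVISION_KEYWORDS.items()
    }


if __name__ == "__main__":
    sample = "This Agreement shall be governed by the laws of Delaware."
    for provision, found in spot_provisions(sample).items():
        print(f"{provision}: {'found' if found else 'not found'}")
```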

Training Burdens of AI Deployment

Many AI tools also require technical configuration and a period of training to get them up to speed on a particular task or data set.  The more sophisticated or specialized the task or the data set is, the more time and work that can take.  For example:

Annotating, however, is just one step of the process.  To make sure the AI really can apply its knowledge in as broad and malleable a way as possible, developers turn to supervised learning, the primary training method for AI that works as a type of quality control process.  Give AI examples of what you want it to learn to identify, such as grammatical relationships, or specific types of contract clauses, then have it run tests on unannotated texts and correct it when it is wrong.  Run these tests over and over, and after some time, the system will start to understand lexicon on a deep, conceptual level.  [emphasis added]
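
To make that loop more concrete, below is a rough, hypothetical sketch of supervised learning for clause identification: a handful of hand-annotated example sentences train a simple classifier, which is then run against unannotated text that a reviewer would check and correct.  The labels, example sentences, and choice of scikit-learn are assumptions made for illustration; they are not drawn from the article quoted above.

```python
# Minimal sketch of supervised learning for clause identification:
# annotated examples in, predictions on unannotated text out.
# Labels and training sentences are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Step 1: annotated examples (clause text plus the label a human assigned).
train_texts = [
    "This Agreement shall be governed by the laws of the State of New York.",
    "Recipient shall not disclose Confidential Information to any third party.",
    "Either party may terminate this Agreement upon thirty days written notice.",
    "Disputes shall be resolved exclusively in the courts of New York County.",
]
train_labels = ["governing_law", "confidentiality", "termination", "forum_selection"]

# Step 2: fit a simple model on the annotated examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Step 3: run the model on unannotated text.  In the workflow described
# above, a human reviewer would correct any wrong predictions, fold the
# corrections back into the training set, and retrain, over and over.
unlabeled = ["This Agreement is governed by Delaware law."]
print(model.predict(unlabeled))  # likely ['governing_law'] with these toy examples
```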

In practical terms, this can mean investing a lot of time and effort up front when implementing a new AI tool for a project or a process:

In the end, it took Gowling “about two weeks to get [the AI platform] trained up to understand what to look for and how to look,” Kathuria says. Initially the firm started with a team of two staff training the AI, but then opened it up to “about between five to 10 lawyers, not working on it full time, but over a period of a week.”  [emphasis added]

AI Legal Tool Adoption Still Low

Although there is great promise in the current wave of AI legal tools, adoption rates have remained low.  In one recent survey, “only 6 percent of legal department respondents said they currently have AI tools or are piloting them.”  This is in part due to the narrow applicability and the startup burdens discussed above:

Smaller departments may not have enough data or cases to sufficiently train AI tools.  If a company doesn’t yet have a set process to standardize nondisclosure agreements and other contracts, AI won’t be helpful, she said, as it’s best used for tasks that repeat frequently.  If there’s not much repetition in contracts or elsewhere, AI isn’t a good first investment.  [emphasis added]

Another study from October 2017 also “found a strong lack of interest in AI among legal departments because many are not yet prepared for the technology.”

Summary

AI tools that can perform basic legal research and review tasks have definitely arrived.  Given their narrow application and their training burdens, they are currently best suited to replacing high-volume, highly-repetitive activities like reviewing standardized documents or researching in a narrow legal area.  But, as reviewed above, a wide range of new tools is currently in development, including tools geared towards smaller-scale use by solo practitioners, individuals, and more.  Although the much-hyped revolution still appears a long way off, real benefits for some are here now, and the pool of potential beneficiaries is expanding.


About the Author

Matthew Verga, JD
Director, Education and Content Marketing

Matthew Verga is an electronic discovery expert proficient at leveraging his legal experience as an attorney, his technical knowledge as a practitioner, and his skills as a communicator to make complex eDiscovery topics accessible to diverse audiences. An eleven-year industry veteran, Matthew has worked across every phase of the EDRM and at every level from the project trenches to enterprise program design. He leverages this background to produce engaging educational content to empower practitioners at all levels with knowledge they can use to improve their projects, their careers, and their organizations.
