44% of Investment Bankers Think They Can Make Lots of Money Off of Attorney Insecurity (AI)

This is the first in a 3-part blog post. The other 2 posts are co-authored by Toby Brown and Greg Lambert and will follow later this week on 3 Geeks and a Law Blog. Apologies for the length of this post, but I was channeling my inner Casey Flaherty.
The Big Idea: The data Goldman used is insufficient to support the claims its report makes about Generative AI’s effect on legal.
Key Take-Aways
Reporting about this report is sloppy
Reporting within this report is sloppy
The underlying data doesn’t tell us much meaningful
3 Geeks attempts to find meaningful data

On March 26th, 2023 Goldman Sachs sent shockwaves through the legal industry by publishing a report claiming that 44% of “something” in the Legal Industry was going to be replaced by Generative AI. I didn’t question that stat at the time, because it sounded about right to me. I suspect that was true for most people who know the legal industry. As I’ve heard this stat repeated by multiple AI purveyors actively scaring lawyers into buying their products or services, I eventually started to question its validity.
I started by looking into the press coverage of that 44% number and was immediately confused. (All emphasis below added by me.)

Law.com - March 29, 2023
“Goldman Sachs estimated that generative AI could automate 44% of legal tasks in the U.S.”

Observer - March 30, 2023
“The investment bank's economists estimate that 46% of administrative positions, 44% of legal positions, and 37% of engineering jobs could be replaced by artificial intelligence.”

NY Times - April 10, 2023
“Another research report, by economists at Goldman Sachs, estimated that 44 percent of legal work could be automated.”

Okay, so which is it? Generative AI is going to replace 44% of legal tasks, positions, or work?
Because those are 3 very different things, each of which would have extremely different impacts on the industry if they came to pass. Lest you think I cherry-picked three outlying articles, go ahead and Google “AI Replace 44% Legal Goldman Sachs” and see what you get. Those 3 articles are in my top 5 results.
My top result as of this writing is a news article from , writing last Tuesday that Goldman says, “AI could automate 46% of tasks in administrative jobs, 44% of legal jobs, and 37% of architecture and engineering professions.”
We should probably just go back to what the Goldman Sachs report actually said and then we can chalk this up to lazy tech journalism. Well, not so fast. Because while the Goldman researchers clearly say “current work tasks” (see below), even that begins to fall apart once you dig into the underlying data.

What Goldman Sachs actually said in the report

“...we estimate that one-fourth of current work tasks could be automated by AI in the US (Exhibit 5, top panel), with particularly high exposures in administrative (46%) and legal (44%) professions...” (emphasis added)

So, they are estimating that a quarter of all current work tasks in the US could be automated by AI... someday?
Let’s assume they mean sometime in the very near future, although the only reference to a timeframe is in the middle of page 5, “While much uncertainty remains around both the capability and adoption timeline of generative AI, these developments suggest that AI is well-positioned to advance rapidly and grow in scale in the coming years.” (Emphasis mine.)
Further confusing the issue, the Goldman researchers reference a chart (Exhibit 5) showing the “Share of Industry Employment Exposed to Automation by AI” in the US.

[Exhibit 5: Share of Industry Employment Exposed to Automation by AI: US]
The Goldman researchers weren’t specifically targeting the legal industry with these findings. They were looking at the potential for AI automation in ALL industries in the US, and it just so happened that “Legal” was the second most vulnerable industry.
Except... does anyone else find it strange that the most vulnerable industry identified in this report is “Office and Administrative Support”, the fifth is “Business and Financial Operations”, and the seventh is “Management”? I’ve done my fair share of office and administrative support, business operations, and management over the years, but I’ve never considered myself a part of one of those “industries”.
Even Goldman is confusing the terms in their analysis. This chart would probably be better titled “Share of Professions Exposed to Automation by AI: US”. That would make a lot more sense, except of course the highlighted column should then be labeled “All Professions” not “All Industries”.
However, this clarification highlights another problem with this data. Exactly what jobs are included in the “legal profession”, or the legal industry for that matter? And how do we know that 44% of tasks those jobs perform are vulnerable to automation by AI?

What the underlying data tells us

Thankfully the Goldman researchers included a high-level description of their methodology so I could go back to the original data and figure out exactly where they got that 44% number and what it really means.
Unfortunately, their description is verbose while still somehow remaining cryptically vague. You can follow along at the bottom of page 5 in the report.

In particular, we use data from the O*NET database on the task content of over 900 occupations in the US ... to estimate the share of total work exposed to labor-saving automation by AI by occupation and industry.
...we classify 13 work activities (out of 39 in the O*NET database) as exposed to AI automation, and in our base case assume that AI is capable of completing tasks up to a difficulty of 4 on the 7-point O*NET “level” scale (see Appendix for more details). We then take an importance- and complexity-weighted average of essential work tasks for each occupation and estimate the share of each occupation’s total workload that AI has the potential to replace.

Having dug into the underlying data, I can now begin to translate some of that:

O*NET - The Occupational Information Network, a Department of Labor-sponsored database containing voluminous occupational information. This is where all of Goldman’s data comes from.
39 Work Activities - O*NET defines 39 work activities that apply to each job title they track (I count 41, but maybe Goldman dropped a few intentionally?)
13 Work Activities - Goldman identified these 13 of the 39 work activities as potentially vulnerable to AI automation:
Getting Information
Monitoring Processes, Materials, or Surroundings
Identifying Objects, Actions, and Events
Estimating the Quantifiable Characteristics of Products, Events, or Information
Processing Information
Evaluating Information to Determine Compliance with Standards
Analyzing Data or Information
Updating and Using Relevant Knowledge
Scheduling Work and Activities
Organizing, Planning, and Prioritizing Work
Documenting/Recording Information
Interpreting the Meaning of Information for Others
Performing Administrative Activities
“Level” Scale - O*NET gives each work activity a Level from 1 to 7. O*NET simply describes this as a “level”, but extrapolating from the Level Examples (called Anchors by O*NET and reproduced in the report Appendix on pg. 17), Goldman interprets the “Level” metric to mean “Difficulty”.
Base Case - For the purposes of this study, Goldman assumes that Generative AI is capable of performing work activities up to level 4. They suggest that findings may be significantly different, one way or another, depending on how much capability you attribute to AI.
Weighted Average - In addition to a level, each work activity is given an “Importance” score. Goldman combines the importance and complexity (level?) scores of essential tasks to come up with a percentage of each occupation’s workload that AI could potentially replace. They don’t provide that formula, but a rough sketch of what such a calculation might look like follows this list.
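Since Goldman doesn’t publish the formula, here is a purely illustrative sketch of what an importance-weighted calculation with a level-4 capability cutoff might look like. The activity names are real O*NET work activities, but the ratings, the sample occupation, and the weighting scheme are my own guesses, not Goldman’s.

```python
# Hypothetical sketch of an importance-weighted exposure calculation for a
# single occupation. Activity names are real O*NET work activities, but the
# ratings and the weighting scheme are invented for illustration -- Goldman
# does not publish its actual formula.

AI_EXPOSED_ACTIVITIES = {
    "Getting Information",
    "Processing Information",
    "Analyzing Data or Information",
    "Documenting/Recording Information",
    # ...plus the rest of Goldman's 13 exposed activities
}

AI_CAPABILITY_LEVEL = 4  # Goldman's base case: AI handles tasks up to level 4 of 7


def exposure_share(activities):
    """Estimate the share of an occupation's workload exposed to AI automation.

    `activities` is a list of dicts with O*NET-style fields:
    name, importance (1-5), and level (1-7).
    """
    total_weight = sum(a["importance"] for a in activities)
    exposed_weight = sum(
        a["importance"]
        for a in activities
        if a["name"] in AI_EXPOSED_ACTIVITIES and a["level"] <= AI_CAPABILITY_LEVEL
    )
    return exposed_weight / total_weight if total_weight else 0.0


# Invented ratings for a single, hypothetical occupation (not real O*NET data):
sample_occupation = [
    {"name": "Getting Information", "importance": 4.5, "level": 5},
    {"name": "Analyzing Data or Information", "importance": 4.2, "level": 4},
    {"name": "Documenting/Recording Information", "importance": 3.8, "level": 3},
    {"name": "Resolving Conflicts and Negotiating with Others", "importance": 4.0, "level": 6},
]

print(f"Estimated exposed share: {exposure_share(sample_occupation):.0%}")
```

Even this toy version makes the sensitivity obvious: nudge the capability cutoff from level 4 to level 5 and “Getting Information” flips from safe to exposed, and the headline percentage jumps accordingly, which is exactly why the base-case assumption matters so much.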

What Goldman Sachs does not say in the report

O*NET classifies Work Activities by individual Job Titles, not by industry or profession, so what should we consider “legal” in this report?

With a little digging, I discovered that O*NET provides a list of occupations that it considers part of the “Legal” job family:
Administrative Law Judges, Adjudicators, and Hearing Officers
Arbitrators, Mediators, and Conciliators
Judges, Magistrate Judges, and Magistrates
Judicial Law Clerks
Lawyers
Legal Support Workers, All Other
Paralegals and Legal Assistants
Title Examiners, Abstractors, and Searchers
I found it odd that O*NET did not include “Legal Secretaries and Administrative Assistants” in the Legal Job Family, even though that is a title they track. I soon discovered that they were listed in the Office and Administrative Support job family, which sounded very familiar to me. (A rough sketch of how this grouping works follows the list below.)
The O*NET Job Families (presented in a familiar order):
Office and Administrative Support
Legal
Architecture and Engineering
Life, Physical, and Social Science
Business and Financial Operations
Community and Social Service
Management
Sales and Related
Computer and Mathematical
Farming, Fishing, and Forestry
Protective Service
Healthcare Practitioners and Technical
Educational Instruction and Library
Healthcare Support
Arts, Design, Entertainment, Sports, and Media
Personal Care and Service
Food Preparation and Serving Related
Transportation and Material Moving
Production
Construction and Extraction
Installation, Maintenance, and Repair
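As far as I can tell, these job families map onto SOC major groups: the first two digits of an O*NET-SOC code identify the group (23 is Legal, 43 is Office and Administrative Support). Here is a rough sketch of how one might regroup an exported occupation list, assuming a CSV with hypothetical columns onet_soc_code and title (the actual O*NET download uses different file and column names):

```python
# Rough sketch: regroup O*NET occupations into job families via SOC major group.
# Assumes a CSV with hypothetical columns "onet_soc_code" and "title"; the real
# O*NET download uses different file and column names, so adjust accordingly.
import csv
from collections import defaultdict

# First two digits of an O*NET-SOC code identify the SOC major group
JOB_FAMILIES = {
    "23": "Legal",
    "43": "Office and Administrative Support",
    # ...remaining major groups omitted
}


def group_by_job_family(path):
    families = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            major_group = row["onet_soc_code"][:2]  # "23-1011.00" -> "23"
            family = JOB_FAMILIES.get(major_group, "Other")
            families[family].append(row["title"])
    return families


families = group_by_job_family("occupations.csv")  # hypothetical export
for title in sorted(families["Legal"]):
    print(title)
# "Legal Secretaries and Administrative Assistants" (43-6012.00) lands in
# "Office and Administrative Support", not "Legal" -- the oddity noted above.
```

Whatever formula Goldman applied per occupation, the headline legal number is presumably an aggregate over this whole family, so it blends judges, mediators, and title examiners in with lawyers and paralegals.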