In this episode, Ted sits down with Richard Tromans, Founder of Artificial Lawyer, to discuss how AI and automation are transforming the business of law and redefining the role of legal professionals. From the cultural barriers slowing innovation to the economic implications of AI on law firm models, Richard shares his expertise in legal technology, data strategy, and industry transformation. With a focus on how AI is becoming the new means of production in legal services, this conversation challenges law professionals to rethink how they create value in an increasingly automated world.
In this episode, Richard shares insights on how to:
Understand the economic impact of AI on legal service delivery
Identify the barriers that prevent law firms from fully embracing innovation
Recognize how automation and data are reshaping law firm strategy
Explore the lessons learned from early attempts at legal technology adoption
Prepare for the next industrial revolution in legal services driven by AI
Key takeaways:
AI is shifting from a support tool to a core driver of productivity and profitability in law
Law firms must rethink their business models to remain competitive in an AI-driven market
Cultural resistance and risk aversion remain the biggest barriers to innovation
Data leverage and automation will define the next generation of successful law firms
The future of legal work lies in rebalancing human expertise with intelligent systems
About the guest, Richard Tromans
Richard Tromans is the Founder of Artificial Lawyer, a leading platform dedicated to exploring how technology, innovation, and new business models are transforming the legal industry. His work focuses on rethinking the business of law through smarter use of people, processes, and technology. A long-time advocate for meaningful innovation, Richard helps drive conversations that shape the future of how legal services are delivered.
You cannot kill a great idea, but you can delay it for an enormous, enormous amount of time.
1
00:00:00,151 --> 00:00:02,334
Richard Tromans, how are you this afternoon?
2
00:00:02,597 --> 00:00:04,721
ah Good, good, good.
3
00:00:04,721 --> 00:00:12,334
Slightly frazzled around the edges, but I've been turned over and I'm a sunny side up
again now, so I'm pretty good.
4
00:00:12,334 --> 00:00:14,156
uh
5
00:00:14,156 --> 00:00:14,527
good.
6
00:00:14,527 --> 00:00:16,123
That's better than sunny side down.
7
00:00:16,123 --> 00:00:17,593
um
8
00:00:17,593 --> 00:00:18,194
I don't like that.
9
00:00:18,194 --> 00:00:20,225
I mean, yeah, it seems that we'll go scrambled.
10
00:00:20,225 --> 00:00:20,945
There you go.
11
00:00:20,945 --> 00:00:27,589
Yeah, I'm an over-medium kind of guy, whatever floats your boat.
12
00:00:27,589 --> 00:00:35,326
So I know you, I think most of our audience knows you, but for the ones that don't, just
give us a quick introduction about who you are, what you do, and where you do it.
13
00:00:35,326 --> 00:00:38,238
Yeah, well, I'll give you the sensible two minute version.
14
00:00:38,238 --> 00:00:47,874
So I've been working in the legal sector for about 25 years, started off as a proper journalist,
went to journalism school for my sins, ended up at a magazine called Legal Week, which
15
00:00:47,874 --> 00:00:52,837
doesn't exist anymore, which was bought eventually by law.com, focused on the
international world.
16
00:00:52,837 --> 00:00:55,278
Then I became a foreign correspondent.
17
00:00:55,278 --> 00:01:02,122
After a couple of years I decided that I'd much rather focus on the business of law, ended
up in the City.
18
00:01:02,170 --> 00:01:05,450
Became a management consultant, helped to merge law firms together.
19
00:01:05,450 --> 00:01:19,745
It was very much focused on research of the market, big macroeconomic picture, did a lot
of work over the years on profitability, business structure, business culture, and all of
20
00:01:19,745 --> 00:01:20,506
those things.
21
00:01:20,506 --> 00:01:24,869
And really, if I had been happy doing that, you probably would never have met me.
22
00:01:24,869 --> 00:01:28,873
In fact, no one in the legal tech world would ever, ever have met me.
23
00:01:28,873 --> 00:01:30,734
Artificial Lawyer would not exist.
24
00:01:30,790 --> 00:01:38,960
and I would be living in Surrey in a nice little cottage and commuting into London to do
my little bits of consulting work with big law firms.
25
00:01:38,960 --> 00:01:42,043
And I'd still be wearing a suit and all of that kind of stuff.
26
00:01:42,043 --> 00:01:48,479
But, fatefully, in 2015, I went and started getting interested in technology.
27
00:01:48,479 --> 00:01:53,793
And I became fascinated with AI and I've always been interested in science in my free
time.
28
00:01:53,915 --> 00:01:59,247
And in 2016, it bubbled up to the point where I suddenly realized, my God, it's going
to happen.
29
00:01:59,247 --> 00:02:00,768
It's really going to happen.
30
00:02:00,768 --> 00:02:02,169
AI is going to change everything.
31
00:02:02,169 --> 00:02:10,373
And I launched Artificial Lawyer, which combined a bit of my business analysis and macro
picture skills with my good old fashioned journalistic skills.
32
00:02:10,373 --> 00:02:18,776
People kind of think that I went from being a journalist to running Artificial Lawyer, but
there's like an enormous gap in the middle of running around the city in a suit, trying to
33
00:02:18,776 --> 00:02:21,667
advise people on business, which no one ever talks about.
34
00:02:21,667 --> 00:02:23,580
Understandably, it wasn't very exciting.
35
00:02:23,580 --> 00:02:29,905
It was, to begin with. I mean, honestly, when you've been a reporter, going,
tell me about your profitability.
36
00:02:29,905 --> 00:02:38,693
And then you get invited inside the inner sanctum and they show you, you know,
they show you the spreadsheet and you see everything and you're like, wow. You get to talk to
37
00:02:38,693 --> 00:02:41,216
managing partners and practice heads and everything.
38
00:02:41,216 --> 00:02:44,019
You get a completely different sense of what's going on.
39
00:02:44,019 --> 00:02:44,579
And
40
00:02:44,579 --> 00:02:53,259
That, for me, is why those two things mattered: when AI came around, I just thought, yes, this
is going to change everything.
41
00:02:53,259 --> 00:02:59,799
And then of course it didn't, but I hung in there like a fool.
42
00:02:59,799 --> 00:03:01,379
I hung in there and guess what?
43
00:03:01,379 --> 00:03:03,902
In 2022, it became real.
44
00:03:03,902 --> 00:03:04,502
Yeah.
45
00:03:04,502 --> 00:03:09,482
And well, and you were, I've talked about this on a previous episode with you, you were
early.
46
00:03:09,562 --> 00:03:23,351
So really the birth of generative AI as I know it was, you know, the "Attention Is All
You Need" paper from Google, which I think was 2017, right?
47
00:03:23,351 --> 00:03:30,591
Around that time, around that time. We were just having this conversation because I'm just
reading, you know, the book Empire of AI, great book.
48
00:03:31,071 --> 00:03:33,591
Some bits that aren't so great, but broadly.
49
00:03:35,111 --> 00:03:39,271
Let's not get into the whole looking for victims kind of thing.
50
00:03:39,430 --> 00:03:45,051
But it reminds me of that South Park episode about Megan and Harry.
51
00:03:45,871 --> 00:03:49,091
CSI Special Victims Unit, you know.
52
00:03:49,320 --> 00:03:53,130
There's a lot of sour grapes and whining and that, but yeah.
53
00:03:53,130 --> 00:03:53,661
Yeah.
54
00:03:53,661 --> 00:03:54,071
right.
55
00:03:54,071 --> 00:03:54,891
We'll edit that out.
56
00:03:54,891 --> 00:03:56,143
This will be censored.
57
00:03:56,143 --> 00:04:03,291
This will be sent to the George Orwell School of video editing to be sanitized before
publication.
58
00:04:03,291 --> 00:04:09,597
But the interesting thing is that Google were one of the, if not the one, that really
helped to pioneer the transformer model.
59
00:04:09,597 --> 00:04:11,489
And they didn't really take it that far.
60
00:04:11,489 --> 00:04:17,352
And then OpenAI, Altman and the team around him and Musk as well.
61
00:04:17,352 --> 00:04:20,132
He deserves a lot of credit for backing them early on.
62
00:04:20,132 --> 00:04:21,432
They just picked it up and wrapped them up.
63
00:04:21,432 --> 00:04:25,552
This is one of the cool things, right, that I saw in the book, which is really
fascinating to me.
64
00:04:25,552 --> 00:04:27,232
They started off doing everything.
65
00:04:27,232 --> 00:04:28,732
They were doing the video.
66
00:04:28,732 --> 00:04:31,023
They also were doing robotics, right?
67
00:04:31,023 --> 00:04:37,863
They had a robotic hand because they were focused on AGI as they conceptualized it back in
sort of 2016, 2017.
68
00:04:38,203 --> 00:04:42,743
They didn't know, they just had this idea there was going to be this super intelligent AI.
69
00:04:42,903 --> 00:04:43,893
right, human-like AI.
70
00:04:43,893 --> 00:04:46,644
They weren't quite sure how it was going to emerge.
71
00:04:46,644 --> 00:04:47,775
And they were looking at everything.
72
00:04:47,775 --> 00:04:51,386
They were looking at, like, sort of visual, video kind of stuff.
73
00:04:51,386 --> 00:04:53,166
They were looking at the physical realm.
74
00:04:53,166 --> 00:04:55,567
And they were also looking at language models.
75
00:04:55,707 --> 00:04:58,458
But those were all seen as, equally interesting.
76
00:04:58,458 --> 00:05:00,118
They didn't quite know which way it was going to go.
77
00:05:00,118 --> 00:05:05,810
And eventually, of course, once the LLM bit took off, they closed down the robotics bit or
got rid of it.
78
00:05:05,810 --> 00:05:08,952
But now, interestingly, they've restarted it.
79
00:05:08,952 --> 00:05:10,392
I'm going off on a tangent.
80
00:05:10,392 --> 00:05:15,832
I think the point is, I mean, the funny thing is, all of that was going on
unbeknownst to me.
81
00:05:15,832 --> 00:05:24,492
While I was running around trying to talk to people like Kira, you know, Noah Waisberg, and
eBrevia and companies like that, talking to them about natural language processing,
82
00:05:24,492 --> 00:05:25,663
machine learning, la la la.
83
00:05:25,663 --> 00:05:36,634
And in an office somewhere in San Francisco, Palo Alto, were a bunch of people building
the future with very, very little fanfare, very, very little fanfare.
84
00:05:36,634 --> 00:05:37,144
Yeah.
85
00:05:37,144 --> 00:05:37,564
Yeah.
86
00:05:37,564 --> 00:05:46,299
But you came to the game earlier than most, and you know, I paid attention too,
and my listeners have heard me talk about this before.
87
00:05:46,299 --> 00:05:53,424
It was when Deep Blue on Jeopardy was kind of my first introduction, right?
88
00:05:53,491 --> 00:05:54,565
It was IBM.
89
00:05:54,565 --> 00:05:54,946
Yep.
90
00:05:54,946 --> 00:05:59,650
And then AlphaGo, which was, again, roughly the same time period.
91
00:05:59,650 --> 00:06:01,693
2017, 2016.
92
00:06:01,693 --> 00:06:02,025
wasn't it?
93
00:06:02,025 --> 00:06:04,865
Wasn't that, that was out of, yeah.
94
00:06:04,865 --> 00:06:10,539
DeepMind, Demis Hassabis. Google ended up buying DeepMind.
95
00:06:10,539 --> 00:06:16,446
So Google has had ambitions and has made investments in AI for a very long time.
96
00:06:16,446 --> 00:06:20,129
And you know, their first couple of models really sucked.
97
00:06:20,129 --> 00:06:22,731
I mean, Bard was pretty bad.
98
00:06:22,731 --> 00:06:24,993
Um, they've really closed the gap.
99
00:06:24,993 --> 00:06:27,755
Gemini 2.5 is quite good.
100
00:06:27,755 --> 00:06:30,296
And I think they're on the verge of releasing.
101
00:06:30,318 --> 00:06:31,831
whatever that next version is.
102
00:06:31,831 --> 00:06:35,747
Yeah, Google was kind of slowly, then suddenly.
103
00:06:36,779 --> 00:06:37,645
but they've caught up.
104
00:06:37,645 --> 00:06:38,525
thing, isn't it?
105
00:06:38,525 --> 00:06:41,825
You get these like plateau, kind of like watershed moments.
106
00:06:41,825 --> 00:06:49,445
But I mean, the interesting thing here is that the basic, you might say economic theory,
the sort of techno economic theory, whatever you want to call it.
107
00:06:49,725 --> 00:06:59,205
But I sort of like, I would say I developed, but I certainly spent a ridiculously large
amount of time working on it, you know, around, you know, moving away from the, I mean,
108
00:06:59,205 --> 00:07:02,661
all the key concepts were all there, well developed in the market.
109
00:07:02,661 --> 00:07:11,541
You might say that the only thing Artificial Lawyer really did was just kind of tie the
little loose ends together and basically build almost like a worldview.
110
00:07:11,541 --> 00:07:13,521
This is the way it's going to be people, right?
111
00:07:13,521 --> 00:07:14,881
This is the way things are going to evolve.
112
00:07:14,881 --> 00:07:17,332
Well, they have to evolve if AI is going to be anything relevant.
113
00:07:17,332 --> 00:07:21,852
And it all came together like in a few months of 2016, right?
114
00:07:21,852 --> 00:07:23,812
Because it's not rocket science anyway, is it?
115
00:07:24,192 --> 00:07:29,808
And then the technology has moved on, but the basic ideas remain the same.
116
00:07:29,808 --> 00:07:33,611
It's rather like the green movement, you know, batteries have improved.
117
00:07:33,770 --> 00:07:39,037
Now we have self-driving cars, which without humans are both safer and will be more
efficient.
118
00:07:39,037 --> 00:07:40,759
Electric vehicles.
119
00:07:40,759 --> 00:07:51,488
Today they announced, I'm not sure which body, one of the large global energy bodies
announced that this year was the first time ever that, well, actually you're going to have
120
00:07:51,488 --> 00:07:53,730
to edit this bit because I can't remember the facts.
121
00:07:53,730 --> 00:07:55,161
It's the...
122
00:07:55,282 --> 00:08:01,358
I think it was the first time that renewable energy overtook fossil fuels on a
global basis, something like that.
123
00:08:01,358 --> 00:08:01,999
Yeah.
124
00:08:01,999 --> 00:08:10,715
But the point is, that the tech moves forward, but the people who pioneered green energy
had thought all of this through 20 years ago.
125
00:08:10,715 --> 00:08:11,896
Right.
126
00:08:11,896 --> 00:08:13,247
It's like, okay, so we're going to need wind.
127
00:08:13,247 --> 00:08:15,820
And then someone says, yes, but wind is very variable.
128
00:08:15,820 --> 00:08:19,233
So you're to need better battery technology.
129
00:08:19,233 --> 00:08:20,093
People go, yep, yep.
130
00:08:20,093 --> 00:08:20,664
Okay, right.
131
00:08:20,664 --> 00:08:21,784
Let's develop that.
132
00:08:21,784 --> 00:08:22,184
Right?
133
00:08:22,184 --> 00:08:27,264
And we're going to need a mix and we're going to have, you know, we're going to need
better transmission systems and so forth.
134
00:08:27,264 --> 00:08:30,304
And we're going to massively have to improve battery technology.
135
00:08:30,644 --> 00:08:31,244
So on and so on.
136
00:08:31,244 --> 00:08:32,635
And then we're going to have to invest in some.
137
00:08:32,635 --> 00:08:42,495
And it's like the groundwork was laid out 20 years ago, and we see this again and again in
society: the basic facts are pretty obvious for all to see.
138
00:08:42,515 --> 00:08:43,355
Right?
139
00:08:43,355 --> 00:08:46,986
It's just, they don't have the technology or the investment to make it real.
140
00:08:46,986 --> 00:08:49,657
Um, and then eventually the technology comes along.
141
00:08:49,657 --> 00:08:57,090
You know, it goes back to the old anecdotes about
revolutions needing the right conditions, right?
142
00:08:57,490 --> 00:09:02,660
You know, there are always revolutionaries in any society, at any point in history, all the way
back to ancient Greece.
143
00:09:02,660 --> 00:09:07,373
There's always a bunch of revolutionaries running around in their togas, you know, going,
right.
144
00:09:07,373 --> 00:09:09,864
But it only takes off if you get the right conditions.
145
00:09:09,864 --> 00:09:11,121
And it's the same with technology.
146
00:09:11,121 --> 00:09:18,058
I mean, like I could have shouted until I was blue in the face until I keeled over about
getting rid of the billable hour.
147
00:09:18,058 --> 00:09:27,573
by automating work streams, about rethinking the business model of law firms, about
thinking about the end outputs, how that benefits society as a whole, all that kind of
148
00:09:27,573 --> 00:09:28,064
stuff,
149
00:09:28,064 --> 00:09:29,387
And we may never have got there.
150
00:09:29,387 --> 00:09:38,651
NLP would have just fizzled out, plateaued out, and you and I would still be, I mean, I
don't know what I would be doing, but I'd still be doing legal tech.
151
00:09:38,651 --> 00:09:39,393
I'm not sure.
152
00:09:39,393 --> 00:09:41,653
Well, you know, this is it.
153
00:09:41,653 --> 00:09:46,253
I had a similar experience, a similar dynamic, with cloud.
154
00:09:46,253 --> 00:09:57,833
So I was very early to pushing cloud in legal when I really didn't understand just how
much friction towards change there was.
155
00:09:57,833 --> 00:10:00,033
So this is the early 2010s.
156
00:10:00,033 --> 00:10:08,865
And when something called Wave 14 came through for Office 365 and made it usable.
Like, pre-Wave
157
00:10:08,865 --> 00:10:17,205
14, or maybe it was Wave 15, Office 365, now called Microsoft 365, was not great.
158
00:10:17,205 --> 00:10:24,185
Um, but I really thought to myself, wow, you know, I know what Microsoft junkies law
firms are.
159
00:10:24,185 --> 00:10:25,345
This makes perfect sense.
160
00:10:25,345 --> 00:10:32,085
I know how little they like to invest in technology, uh, which has historically been
the case.
161
00:10:32,085 --> 00:10:35,445
Um, moving to the cloud improves economics.
162
00:10:35,445 --> 00:10:38,025
It improves the security posture.
163
00:10:38,025 --> 00:10:38,978
Um,
164
00:10:38,978 --> 00:10:42,618
It certainly can scale up much more efficiently.
165
00:10:43,037 --> 00:10:47,498
And man, that didn't happen for about another eight years.
166
00:10:47,498 --> 00:10:49,138
It wasn't in the US.
167
00:10:49,138 --> 00:10:54,409
So law firms in the US really did not start to move in earnest to the cloud until a little
bit after COVID.
168
00:10:54,409 --> 00:11:00,569
we, again, I beat my head against the wall, kind of like you did talking about AI.
169
00:11:00,769 --> 00:11:05,709
I did 22 road shows in 2014 talking about the cloud.
170
00:11:05,765 --> 00:11:12,824
22 different cities, and everybody was like, yeah, this is great, and then did absolutely nothing
for another eight years.
171
00:11:12,986 --> 00:11:13,486
So.
172
00:11:13,486 --> 00:11:18,639
in some ways the AI situation was even more crazy in that cloud clearly did work.
173
00:11:18,639 --> 00:11:21,581
You could demonstrate it, but they still didn't want it.
174
00:11:21,581 --> 00:11:27,184
Well, I don't know, maybe cloud is actually more crazy, but I suppose to some
degree you could actually not blame the lawyers much because they were looking at the
175
00:11:27,184 --> 00:11:31,716
early NLP tools and going, man, this looks kind of complicated.
176
00:11:31,716 --> 00:11:33,608
I'm not sure I want to use this stuff.
177
00:11:33,608 --> 00:11:39,972
And then you've got all the economic macroeconomic, cultural, professional aspects that
kind of stopped it.
178
00:11:39,972 --> 00:11:42,653
But yeah, I mean, it's
179
00:11:42,745 --> 00:11:43,866
I mean, this is the thing, isn't it?
180
00:11:43,866 --> 00:11:52,333
Great, I mean, you cannot kill a great idea, but you can delay it for an enormous,
enormous amount of time, right?
181
00:11:52,333 --> 00:11:59,659
I mean, and you see this again, you see this in politics, you see this playing out over all
kinds of different trends in our society.
182
00:11:59,659 --> 00:12:01,201
You cannot kill a great idea.
183
00:12:01,201 --> 00:12:04,904
People just instinctively look at it, they compare it to their own experiences.
184
00:12:04,904 --> 00:12:07,626
Maybe it's even just like, you know, it's gut instinct, right?
185
00:12:07,626 --> 00:12:08,277
It's intuition.
186
00:12:08,277 --> 00:12:10,625
They just go, that's right, that is right.
187
00:12:10,625 --> 00:12:12,785
And they're like, okay, let's have it.
188
00:12:13,125 --> 00:12:14,125
And then it doesn't happen.
189
00:12:14,125 --> 00:12:15,545
And then it doesn't happen.
190
00:12:15,545 --> 00:12:17,685
Or someone tries it and it doesn't really work very well.
191
00:12:17,685 --> 00:12:24,725
And then all the critics and the cynics start jumping on and going, ah, ha, ha, you were a
fool to believe there is hope, young Skywalker.
192
00:12:24,785 --> 00:12:28,925
ha, ha, you know, come to the dark side and enjoy the empire.
193
00:12:28,925 --> 00:12:31,185
It's much nicer over here.
194
00:12:31,365 --> 00:12:34,025
We get much better food, et cetera, et cetera.
195
00:12:34,025 --> 00:12:34,876
office is lovely.
196
00:12:34,876 --> 00:12:39,062
And people give up or they just get disbanded and make off and
197
00:12:39,062 --> 00:12:41,875
focus on something that does work, right?
198
00:12:41,875 --> 00:12:45,377
Like, I don't know, Bitcoin.
199
00:12:45,495 --> 00:12:46,566
That's what I was going to say.
200
00:12:46,566 --> 00:12:53,349
Digital currency is another great example of an idea that's been lingering now for almost
20 years.
201
00:12:53,349 --> 00:12:54,029
It was 2008-ish.
202
00:12:54,029 --> 00:12:55,368
But it makes people money.
203
00:12:55,368 --> 00:12:56,020
I mean, that's the thing.
204
00:12:56,020 --> 00:13:01,475
I mean, you know, you can step away from the whole intellectual stuff if you want to
and just say, look, does it make money?
205
00:13:01,816 --> 00:13:03,297
If it does make money?
206
00:13:05,659 --> 00:13:06,579
Does.
207
00:13:06,940 --> 00:13:08,894
Well, this is the weird thing with crypto.
208
00:13:08,894 --> 00:13:17,349
I mean, unless we have a collapse of all fiat currencies and we end up in a sort of Mad Max
world, cryptocurrencies actually have no real functional value.
209
00:13:17,349 --> 00:13:18,737
They are simply
210
00:13:18,737 --> 00:13:26,284
gigantic momentum trades and the momentum is constantly up, of flips around a bit and then
keeps going up again.
211
00:13:26,284 --> 00:13:28,008
But fundamentally, there's nothing there.
212
00:13:28,008 --> 00:13:29,353
There is literally nothing there.
213
00:13:29,353 --> 00:13:30,798
You could argue the same about gold.
214
00:13:30,798 --> 00:13:34,666
Gold's value is driven primarily by scarcity.
215
00:13:34,944 --> 00:13:37,686
True, true. But it has a tangible value.
216
00:13:37,686 --> 00:13:42,800
I mean, you can pull out a lump of gold and people go, gold.
217
00:13:42,881 --> 00:13:45,464
People will always want gold.
218
00:13:45,464 --> 00:13:51,919
The same way that people will always want kebabs or burgers or, you know, it's just one of
those things, right?
219
00:13:51,919 --> 00:13:52,671
People like it.
220
00:13:52,671 --> 00:13:59,848
And so, whereas cryptocurrency really is just a momentum trade, maybe that momentum will
keep going forever and ever.
221
00:14:00,184 --> 00:14:06,597
Right, rather like a YouTube channel that just keeps on accumulating views and followers.
222
00:14:06,757 --> 00:14:07,758
It'll never, never stop.
223
00:14:07,758 --> 00:14:10,199
It just goes up and up and up and up.
224
00:14:10,199 --> 00:14:17,802
But if it ever did flip, like really flip, not just fluctuate, but completely flip and go
into reverse, there's nothing there; you can't cash it in at the end.
225
00:14:18,174 --> 00:14:19,543
You know, what have I got left?
226
00:14:19,543 --> 00:14:20,935
I've got three electrons.
227
00:14:20,935 --> 00:14:22,160
Are they worth anything?
228
00:14:22,160 --> 00:14:24,396
It's like, no, I'm sorry, your electrons are not really worth anything.
229
00:14:24,396 --> 00:14:26,178
you've got a collection of prime numbers.
230
00:14:26,178 --> 00:14:35,747
um But the underlying technology of blockchain provides a tremendous opportunity and
utility.
231
00:14:35,747 --> 00:14:37,468
And there's been a lot of talk in legal.
232
00:14:37,468 --> 00:14:39,490
It really hasn't gotten anywhere.
233
00:14:39,490 --> 00:14:47,014
The asynchronous ledger, the distributed ledger concept, is an interesting one in recording.
234
00:14:47,014 --> 00:14:47,674
is, is.
235
00:14:47,674 --> 00:14:50,085
I mean, you know, David Fisher, you know, Integra Ledger.
236
00:14:50,085 --> 00:14:52,997
He's been pushing that ever since I got started.
237
00:14:52,997 --> 00:15:00,020
And he still has a lot of people who really believe in it, you know, but the problem is,
for it to really be useful...
238
00:15:00,020 --> 00:15:05,483
My personal view is you need a scenario where there's no trust, where trust is completely
broken down.
239
00:15:05,483 --> 00:15:08,835
And most commercial lawyers don't operate in such a system.
240
00:15:08,835 --> 00:15:11,956
Probably the reason is because if trust had completely,
241
00:15:11,956 --> 00:15:19,960
utterly irrevocably broken down and no one can trust each other and nothing could be real,
it'd be quite hard to be a lawyer because you've lost a rules-based system that underpins
242
00:15:19,960 --> 00:15:20,930
all of that.
243
00:15:21,711 --> 00:15:23,191
You're in Mad Max.
244
00:15:23,211 --> 00:15:30,294
And at that point, cryptocurrency and blockchain and everything really makes absolute
sense because you cannot trust anybody to do the right thing.
245
00:15:30,486 --> 00:15:31,366
Yeah.
246
00:15:31,507 --> 00:15:42,519
Yeah, we could go down a rabbit hole with cryptocurrency, but let's talk
about yours. So you've got a couple of events coming up that I wanted to make people aware
247
00:15:42,519 --> 00:15:48,815
of because your theme aligns really well with the theme of this podcast, which is legal
innovation.
248
00:15:48,815 --> 00:15:53,189
So tell us about them. I think you have events coming up in New York and London.
249
00:15:53,189 --> 00:15:54,960
Tell us a little bit about those.
250
00:15:55,078 --> 00:16:00,580
Yeah, so just briefly, so Legal Innovators was a conference that grew out of Artificial
Lawyer.
251
00:16:00,580 --> 00:16:06,044
It's organized by another company called Cosmonauts, but I kind of co-created it with them
and I chair the events.
252
00:16:06,044 --> 00:16:07,982
London, we've been doing for years and years.
253
00:16:07,982 --> 00:16:09,515
It's quite a big conference now.
254
00:16:09,515 --> 00:16:12,868
We're hoping to get around about a thousand people over three days.
255
00:16:12,868 --> 00:16:17,641
Law firm day, in-house day, litigation day, which is gonna be fun, first time we've done
that.
256
00:16:17,641 --> 00:16:21,062
So that's the fourth, fifth and sixth of November in London.
257
00:16:21,184 --> 00:16:23,846
And then we're going to be doing New York for the very first time.
258
00:16:23,846 --> 00:16:28,100
We've been doing California for years and years in San Francisco, which is brilliant fun.
259
00:16:28,100 --> 00:16:28,427
I love it.
260
00:16:28,427 --> 00:16:31,894
It's probably my favorite event, but we're doing New York and it's going to be very
interesting.
261
00:16:31,894 --> 00:16:42,883
Now, the reason why I delayed doing New York for so long is because I didn't feel that the
big firms there who really rule, you know, the kingdom were really, really getting into AI
262
00:16:42,883 --> 00:16:49,290
and automation and rethinking business sufficiently to really make it an exciting event.
263
00:16:49,290 --> 00:16:51,731
I mean, Cleary was an outlier early on.
264
00:16:51,731 --> 00:16:54,833
They created Cleary X, did a lot of interesting work.
265
00:16:54,833 --> 00:16:57,073
But now things are changing.
266
00:16:57,073 --> 00:16:57,544
Definitely.
267
00:16:57,544 --> 00:17:04,067
I mean, when I was in New York for Legal Week earlier this year, I spoke to a lot of
people and got the feeling that things are shifting.
268
00:17:04,067 --> 00:17:07,268
You know, and there was some really good stuff going on at Gunderson as well, for example.
269
00:17:07,268 --> 00:17:12,360
Ah, it is, it is, but in their New York office.
270
00:17:12,360 --> 00:17:13,170
it's true.
271
00:17:13,170 --> 00:17:15,980
But let's face it, Latham & Watkins is a West Coast firm as well.
272
00:17:15,980 --> 00:17:17,604
You know, they're doing some interesting things.
273
00:17:17,604 --> 00:17:21,344
So it's kind of like, I think it's come of age finally.
274
00:17:21,644 --> 00:17:32,144
Kind of, I mean, you expect Wilson Sonsini and Cooley and others to have created startups
and have really huge innovation teams, doing all kinds of fun stuff, working very closely
275
00:17:32,144 --> 00:17:33,484
with AI companies.
276
00:17:33,804 --> 00:17:39,584
New York firms, I think were slightly, they may have brought in a lot of people around
knowledge management.
277
00:17:39,584 --> 00:17:41,044
They certainly did.
278
00:17:41,224 --> 00:17:44,884
But this idea that, hey, AI is actually gonna change the game for us.
279
00:17:44,884 --> 00:17:46,980
You know, just a little.
280
00:17:46,980 --> 00:17:48,000
Just a little.
281
00:17:48,161 --> 00:17:54,997
I think that was, even though people may have talked a good talk, I didn't really feel
that it was actually happening until very recently.
282
00:17:54,997 --> 00:17:58,412
And I think it is actually starting to happen though.
283
00:17:58,412 --> 00:17:58,842
So.
284
00:17:58,842 --> 00:17:59,482
it is.
285
00:17:59,482 --> 00:18:04,415
And I think a leading indicator is the one you alluded to, which is some of the hiring.
286
00:18:04,415 --> 00:18:05,916
I mean, look at Simpson Thacher.
287
00:18:05,916 --> 00:18:09,388
I mean, that's the epitome of a New York white shoe law firm.
288
00:18:09,388 --> 00:18:22,315
You know, they brought in Liz Grennan from McKinsey and they've made some other hires.
Again, from the outside, if they're a client, I have got an inside view, but
289
00:18:22,315 --> 00:18:23,936
for firms who
290
00:18:24,289 --> 00:18:26,071
we don't do business with at InfoDash, I
291
00:18:26,071 --> 00:18:30,694
look at their org chart and say, what moves are they making there?
292
00:18:30,694 --> 00:18:39,762
And you know, somebody like Liz isn't going to come from McKinsey without having a high
level of commitment.
293
00:18:39,762 --> 00:18:46,988
'Cause she's also an attorney and Connor Grennan's wife, and he, you know, is a big thought
leader in the AI space.
294
00:18:46,988 --> 00:18:52,192
I think he's at NYU law school or, I'm sorry, not the law school.
295
00:18:52,676 --> 00:18:54,317
or maybe business school.
296
00:18:54,458 --> 00:19:04,788
So, you know, I see people who I know they're not going to move unless the firm has
demonstrated a commitment and articulated some sort of a plan to move in a certain
297
00:19:04,788 --> 00:19:05,889
direction.
298
00:19:05,889 --> 00:19:10,815
And that tells me again, that's a leading indicator that they're starting to move in that
direction.
299
00:19:10,815 --> 00:19:11,976
uh
300
00:19:11,976 --> 00:19:14,396
yeah, we shouldn't get carried away.
301
00:19:14,396 --> 00:19:23,747
I mean, while organizations like Salesforce out in California literally have their own
team that builds AI products for their own use, right?
302
00:19:23,747 --> 00:19:31,567
You know, and some of the banks and some of the private equity funds are getting
definitely very interested in what AI can do for them and doing stuff internally as well.
303
00:19:31,567 --> 00:19:35,767
And some of the big consultancies too, have big bases in New York as well.
304
00:19:35,767 --> 00:19:36,987
You know,
305
00:19:37,215 --> 00:19:41,158
Are the big New York firms really disrupting themselves?
306
00:19:41,158 --> 00:19:43,769
I don't think yet, and maybe not for a while.
307
00:19:43,769 --> 00:19:49,203
But as you say, they're genuinely thinking about it now and they're exploring and
experimenting.
308
00:19:49,203 --> 00:19:53,546
They're bringing in these gen AI productivity platforms and a whole bunch of other tools.
309
00:19:53,546 --> 00:19:57,588
It's going much, much further than knowledge management.
310
00:19:57,589 --> 00:20:01,431
It's not just like, okay, well make my life a little bit easier.
311
00:20:01,672 --> 00:20:05,455
Help me find the information I need so I can go off and do the deal that I need to do.
312
00:20:05,455 --> 00:20:06,605
It's more...
313
00:20:06,605 --> 00:20:16,459
now, well, not more, but in addition, they're saying, okay, you know what, this may
actually alter our business model, at a very small level to begin with.
314
00:20:16,459 --> 00:20:18,100
But that, it's all relative, isn't it?
315
00:20:18,100 --> 00:20:18,881
It's all relative.
316
00:20:18,881 --> 00:20:24,454
You know, that in itself, the fact that they're even asking that question, I think is a
big deal.
317
00:20:24,454 --> 00:20:24,784
Yeah.
318
00:20:24,784 --> 00:20:32,432
Well, and let me ask you something along those lines when we talk about how law firms move
in that direction.
319
00:20:32,432 --> 00:20:44,122
And by that direction, I mean, you know, making fundamental changes to foundational pieces
of their business: legal service delivery, pricing, internal firm compensation, client
320
00:20:44,122 --> 00:20:45,022
engagement.
321
00:20:45,022 --> 00:20:51,267
These are foundational elements that are very firmly entrenched in law firms.
322
00:20:51,431 --> 00:20:54,733
And they have to change, and they're going to change.
323
00:20:54,733 --> 00:21:00,896
Whether law firms like it or not, it's just at what speed and how.
324
00:21:00,896 --> 00:21:04,477
I think one of the biggest ones is internal firm compensation.
325
00:21:04,477 --> 00:21:10,060
The way lawyers are ruthlessly measured on billable hours today, that is such a deeply
entrenched practice.
326
00:21:10,060 --> 00:21:16,133
I've talked to some leaders of innovation councils at some pretty big firms and they have
cited that
327
00:21:16,255 --> 00:21:28,775
particular cultural element of the firm is one of the hardest to move the dial on.
Because, again, AI really changes internal firm compensation. When pricing
328
00:21:28,775 --> 00:21:31,917
changes, internal firm compensation changes, right?
329
00:21:31,917 --> 00:21:43,555
If you're in an AFA scenario where the firm is taking on risk, revenue in the door doesn't
equate to profitability like it does today.
330
00:21:43,933 --> 00:21:46,891
You have to manage risk and efficiency.
331
00:21:46,984 --> 00:21:48,015
It's an interesting one, isn't it?
332
00:21:48,015 --> 00:21:52,658
'Cause it's like, you know, I don't think there's anything wrong in using time as an
internal metric.
333
00:21:52,658 --> 00:21:57,031
I don't think it's the best metric internally, but if you want to use it, okay, fine.
334
00:21:57,031 --> 00:22:04,577
At least you've got some kind of yardstick. But, you know, there's Bob, he's only
worked 20 hours this week.
335
00:22:04,577 --> 00:22:08,810
Ooh, you know, there's Jane, she's worked 60 hours or a hundred hours this week.
336
00:22:08,810 --> 00:22:09,251
she's great.
337
00:22:09,251 --> 00:22:10,031
Okay.
338
00:22:10,031 --> 00:22:10,922
Whatever.
339
00:22:11,534 --> 00:22:18,280
That kind of thinking is as old as time. But what I think is an issue is judging value by
time.
340
00:22:18,380 --> 00:22:23,073
So, you know, here's Bob and Jane, they spend all weekend going through these documents,
aren't they wonderful?
341
00:22:23,073 --> 00:22:26,366
That'll be, you know, $25,000, please.
342
00:22:26,366 --> 00:22:29,487
Oh, no, that AI system did it in three minutes.
343
00:22:29,848 --> 00:22:30,879
Now what are we gonna do?
344
00:22:30,879 --> 00:22:32,583
It's nuts, I personally think.
345
00:22:32,583 --> 00:22:36,918
So yeah, keep the hours internally, but externally, you're gonna have to find a better
method.
346
00:22:36,918 --> 00:22:38,080
Yeah, indeed.
347
00:22:38,080 --> 00:22:52,148
You know, and what are your thoughts on this: I've had this debate a lot, and I would say,
from the people I've talked to, it's pretty split, maybe 60-40 in favor, and 40% think
348
00:22:52,148 --> 00:22:54,870
that this can't happen, but I'm a believer.
349
00:22:54,870 --> 00:23:04,099
I think that law firms are actually going to have to leverage their data in a meaningful
way as part of their AI strategy and
350
00:23:04,099 --> 00:23:11,239
invest in R&D, which law firms historically have not done well, or haven't done at
all, for a couple of reasons.
351
00:23:11,239 --> 00:23:16,539
One, the partnership model is optimized for profit-taking at the end of the
year.
352
00:23:16,539 --> 00:23:23,039
It's hard to accrue capital; expenditure doesn't fit into that cash-basis partnership
model.
353
00:23:23,039 --> 00:23:27,459
Well, but also, they're not software companies.
354
00:23:27,459 --> 00:23:29,979
They're not used to building software.
355
00:23:30,239 --> 00:23:32,879
And I don't know, what are your thoughts?
356
00:23:32,879 --> 00:23:33,899
Yeah.
357
00:23:34,432 --> 00:23:42,458
I think, you know, if law firms want to build stuff, go for it. You know, the same way
that someone might say to me, well, you don't have to buy that piece of technology to improve
358
00:23:42,458 --> 00:23:43,439
Artificial Lawyer.
359
00:23:43,439 --> 00:23:44,539
Why don't you build it yourself?
360
00:23:44,539 --> 00:23:45,454
It's really easy.
361
00:23:45,454 --> 00:23:48,702
You know, it's very, very, you know, paint by numbers.
362
00:23:48,983 --> 00:23:50,344
I might do it.
363
00:23:50,344 --> 00:23:54,738
I'll be honest, I'm probably more minded just to buy it and let someone else deal with it.
364
00:23:54,738 --> 00:23:57,250
I don't know, that for me is not the issue.
365
00:23:57,250 --> 00:24:03,626
The issue is, can they start to think in a way where AI is the leverage?
366
00:24:03,626 --> 00:24:13,998
And I just want to bring in a law firm, an AI hybrid law firm, which I think is great,
called Covenant, based in New York, created by the ex-general counsel of WeWork
367
00:24:13,998 --> 00:24:15,443
and her partner.
368
00:24:15,443 --> 00:24:17,065
And it's a great model.
369
00:24:17,065 --> 00:24:17,665
It's a great model.
370
00:24:17,665 --> 00:24:21,408
They have a tech company, which is the AI bit.
371
00:24:21,408 --> 00:24:25,452
And they have a law firm, which has got multiple lawyers in it, not a huge number at the
moment, just a small group.
372
00:24:25,452 --> 00:24:28,425
And they do private equity investments.
373
00:24:28,425 --> 00:24:30,887
As I understand it.
374
00:24:30,887 --> 00:24:31,769
And on the buy side.
375
00:24:31,769 --> 00:24:34,142
And there's a license agreement between the two.
376
00:24:34,142 --> 00:24:39,648
So they don't actually have to go away to Arizona, you know, to do this
377
00:24:39,648 --> 00:24:42,519
under special bar rules, and it's just great.
378
00:24:42,519 --> 00:24:51,171
And then we were talking, we did this great video together, Jen and I, and she just came
up with this phrase, it just came out in the conversation, and she said, you know, AI is the leverage.
379
00:24:51,331 --> 00:25:02,154
And I just thought, man, I wish I'd coined that phrase 10 years ago, because for me, it
sums up everything that Artificial Lawyer has been trying to say ever since I got started,
380
00:25:02,154 --> 00:25:05,435
that AI is the means of production.
381
00:25:05,635 --> 00:25:05,885
Right.
382
00:25:05,885 --> 00:25:08,636
Now there are some things which aren't sort of productivity based.
383
00:25:08,636 --> 00:25:16,396
You know, the classic cliché: it's Martin Lipton, you know, sitting on the
sofa thinking about the poison pill, and he has a eureka moment.
384
00:25:16,396 --> 00:25:18,356
He goes, aha, I've got it.
385
00:25:18,356 --> 00:25:18,576
All right.
386
00:25:18,576 --> 00:25:21,816
That's not a productivity play. But a due diligence report?
387
00:25:21,816 --> 00:25:23,296
That's a productivity play.
388
00:25:23,336 --> 00:25:24,336
A lot of legal research.
389
00:25:24,336 --> 00:25:24,656
Okay.
390
00:25:24,656 --> 00:25:27,956
There's a degree of sophistication around that, but a lot of it is bulk.
391
00:25:28,076 --> 00:25:29,856
E-discovery, that's bulk.
392
00:25:29,856 --> 00:25:30,516
Okay.
393
00:25:30,516 --> 00:25:34,476
With other intelligent insights and knowledge, we can make that more sophisticated.
394
00:25:34,476 --> 00:25:34,796
Better.
395
00:25:34,796 --> 00:25:35,656
Yeah, obviously.
396
00:25:35,656 --> 00:25:37,148
But there's a ton of bulk there.
397
00:25:37,148 --> 00:25:39,230
So that's all going down to productivity, right?
398
00:25:39,230 --> 00:25:45,353
Anything that relates to productivity, you throw leverage at, i.e., the associate body, right?
399
00:25:45,353 --> 00:25:49,515
And the way we need to think is AI is the leverage.
400
00:25:49,595 --> 00:25:56,789
And I think for me, that's a tipping point once law firms, the big established law firms
start to look at Covenant and go, you know what?
401
00:25:56,789 --> 00:25:58,900
That's actually the best model we've ever seen.
402
00:25:59,221 --> 00:26:05,886
And the beautiful thing about it is they don't even have to get into complicated bar
rules, right?
403
00:26:05,886 --> 00:26:11,159
They don't even have to do that because if they structure it in the right way, they can do
this anyway.
404
00:26:11,159 --> 00:26:18,702
And besides, even if they don't want to do that fancy structure, they can just
buy in the technology as a license holder.
405
00:26:18,863 --> 00:26:26,206
I mean, again, it goes back to this earlier conversation: the great
idea is already here.
406
00:26:26,206 --> 00:26:27,798
The question is, will they do it?
407
00:26:27,798 --> 00:26:28,329
Right.
408
00:26:28,329 --> 00:26:29,289
Yeah.
409
00:26:29,289 --> 00:26:36,509
Did you hear, I'm assuming you read the story about Burford Capital and the MSO model
there?
410
00:26:36,509 --> 00:26:37,729
It's a private equity firm.
411
00:26:37,729 --> 00:26:39,609
It was in the Financial Times.
412
00:26:39,727 --> 00:26:46,599
I mean, litigation financiers are doing pretty interesting stuff.
413
00:26:46,599 --> 00:26:47,780
Yeah.
414
00:26:47,780 --> 00:26:59,755
They essentially stand up these managed services organizations and deploy private capital
into those, which really kind of take over some business of law functions.
415
00:26:59,755 --> 00:27:01,165
And it's a vehicle.
416
00:27:01,165 --> 00:27:05,758
It's an investment vehicle for private capital to get deployed in the US legal market.
417
00:27:05,758 --> 00:27:10,000
And it feels kind of a little bit like a back door at the moment.
418
00:27:10,000 --> 00:27:12,521
If things were
419
00:27:12,927 --> 00:27:20,743
If we had the ABS rules like we do in Arizona across the country, I'm not so sure that
would be the ideal structure.
420
00:27:20,764 --> 00:27:26,348
But one of the things I wanted to get your take on is I talk about this a lot.
421
00:27:26,588 --> 00:27:32,073
Law firms today, I spent 10 years at Bank of America in mostly risk management roles.
422
00:27:32,073 --> 00:27:41,298
So I was in consumer risk, then I moved into global treasury, then I went to corporate
audit, then I was in anti-money laundering and then compliance.
423
00:27:41,298 --> 00:27:51,203
Really interesting work, but I got a really good sense for just how much rigor there is
around risk management in financial services, for obvious reasons.
424
00:27:51,544 --> 00:27:54,645
And looking at law firms, they have none.
425
00:27:54,965 --> 00:28:00,297
There's literally nothing between the lawyer and his or her client, right?
426
00:28:00,297 --> 00:28:05,750
It's: I make a request, the lawyer delivers work product or advice.
427
00:28:05,750 --> 00:28:06,870
There's no
428
00:28:07,018 --> 00:28:17,229
layers. And when we get into a scenario where we start to talk about automation and, you
know, tech-enabled legal service delivery, we have to start thinking about risk
429
00:28:17,229 --> 00:28:18,230
management.
430
00:28:18,230 --> 00:28:20,293
And what does that look like in a law firm?
431
00:28:20,293 --> 00:28:25,197
It would be lawyers listening to some risk manager saying, you can't do that.
432
00:28:25,298 --> 00:28:27,661
That's not going to go over very well, but I don't know.
433
00:28:27,661 --> 00:28:28,612
How do you think about
434
00:28:28,612 --> 00:28:29,313
two different things.
435
00:28:29,313 --> 00:28:36,418
I think, one, you sort of indirectly touched on an absolutely key point, which is the
lawyers are the means of production, right?
436
00:28:36,418 --> 00:28:38,030
They are the center of the universe, right?
437
00:28:38,030 --> 00:28:43,966
We do not live in the Copernican system, right?
438
00:28:43,966 --> 00:28:46,348
You know, lawyers are the center of the universe.
439
00:28:46,348 --> 00:28:51,372
They do not revolve around process and product and so forth.
440
00:28:51,372 --> 00:28:51,792
Right?
441
00:28:51,792 --> 00:28:53,053
The lawyers,
442
00:28:53,413 --> 00:28:54,514
control everything.
443
00:28:54,514 --> 00:28:56,315
They do the final sign off.
444
00:28:56,315 --> 00:28:58,278
They do the interaction with the clients.
445
00:28:58,278 --> 00:28:59,898
They are the workflow.
446
00:28:59,979 --> 00:29:02,702
They have a mind, they have a hand, they have a whole machine.
447
00:29:02,702 --> 00:29:07,529
And it's only when we start to think about automation in the true sense.
448
00:29:07,529 --> 00:29:09,551
So you don't do that anymore.
449
00:29:09,731 --> 00:29:11,614
The machine does that.
450
00:29:11,614 --> 00:29:13,875
Whether we're talking about agentic flows or anything else.
451
00:29:14,195 --> 00:29:15,375
That's when things change.
452
00:29:15,375 --> 00:29:17,675
And I think that's when your point about risk comes in.
453
00:29:17,675 --> 00:29:25,195
I mean, as it currently stands, it doesn't make much difference, because, as far as I
understand, anything that goes out of the door of a law firm has to be signed off
454
00:29:25,195 --> 00:29:26,546
and is under their umbrella.
455
00:29:26,546 --> 00:29:32,937
What'll be interesting is as we start to build truly automated workflows, from start to
finish.
456
00:29:32,937 --> 00:29:37,917
Now it may be a very, very narrow workflow, but they will grow.
457
00:29:37,917 --> 00:29:38,927
They will grow.
458
00:29:38,927 --> 00:29:41,979
It's going to get more and more powerful.
459
00:29:41,979 --> 00:29:44,220
That's when I think the whole risk and insurance thing comes in.
460
00:29:44,220 --> 00:29:47,762
But even so, you could argue that law firms still have it under their umbrella.
461
00:29:47,762 --> 00:29:59,226
And it will be down to the law firm, or any consultants they bring in, to do pen testing,
effectively, to make sure that it works completely fine.
462
00:29:59,226 --> 00:30:03,647
But yeah, for me, this has always been the battle in the...
463
00:30:03,829 --> 00:30:06,581
And it's totally understandable, because I probably would do the same.
464
00:30:06,581 --> 00:30:13,248
Most lawyers, most professionals, they see technology and they go, great, how can that add
to what I do already?
465
00:30:13,348 --> 00:30:18,292
How can that finesse or take a little bit of a bother out of my life?
466
00:30:18,412 --> 00:30:22,116
All right, they're the center of the universe, right?
467
00:30:22,116 --> 00:30:27,380
If that is all we do with AI now, then nothing's going to change at all.
468
00:30:27,380 --> 00:30:32,564
It goes back to, I don't know if we've covered this point before, the Ikea
catalog situation.
469
00:30:32,588 --> 00:30:38,451
where you get your various shelving units and cushions and rugs and throws and all of
this kind of stuff.
470
00:30:38,451 --> 00:30:43,514
And it's very pretty and it's very nice and it greatly increases the comfort of that
person.
471
00:30:43,514 --> 00:30:44,184
And why not?
472
00:30:44,184 --> 00:30:47,292
People like to be comfortable, but it doesn't change fundamentally anything.
473
00:30:47,292 --> 00:30:50,318
You're not Le Corbusier completely redesigning the building.
474
00:30:50,318 --> 00:30:56,181
You don't change your one bedroom flat into a machine for living, as Le Corbusier said.
475
00:30:56,181 --> 00:31:00,803
We're fundamentally still in the same world with some decorations from Ikea, right?
476
00:31:00,803 --> 00:31:02,804
Bought out of the catalog and then installed.
477
00:31:02,841 --> 00:31:07,102
Things only change once you start to automate whole streams.
478
00:31:07,422 --> 00:31:17,548
And I think this is incredibly difficult for professionals,
particularly lawyers, to get their heads around, because it's just like, yes, you are not
479
00:31:17,548 --> 00:31:20,609
gonna own everything any longer.
480
00:31:21,029 --> 00:31:26,532
You might be able to own the output and make money from it, but you will not own those
workflows, right?
481
00:31:26,532 --> 00:31:28,572
And this is not like an assistant.
482
00:31:28,572 --> 00:31:30,467
I was thinking about how to...
483
00:31:30,467 --> 00:31:33,840
explain this in a very simple diagram for a thing I've got to do next week.
484
00:31:33,840 --> 00:31:38,645
So basically think about a partner, a partner has assistants, right?
485
00:31:38,645 --> 00:31:41,466
As they were kind of originally called rather than associates, right?
486
00:31:41,466 --> 00:31:46,720
The assistant assists the partner in the things that the partner wants to achieve,
right?
487
00:31:46,720 --> 00:31:50,211
So the partner says, right, I need you to go and look at that law book, go.
488
00:31:50,211 --> 00:31:52,612
I need you to proof that contract, go.
489
00:31:52,612 --> 00:31:55,883
You go and pick me up a sandwich, go, right?
490
00:31:55,883 --> 00:31:57,834
They're assistants, right?
491
00:31:57,840 --> 00:32:06,157
They may contribute meaningful aspects to the end product, but fundamentally they're
completely under the command of this one person, right?
492
00:32:06,157 --> 00:32:08,019
They are the center of our universe, right?
493
00:32:08,019 --> 00:32:15,136
But in true automation and sort of an agentic flow, there is simply a person at one end of
it, because it's all pre-built, right?
494
00:32:15,136 --> 00:32:19,168
Even if it can go through iterations and you can have some feedback.
495
00:32:19,168 --> 00:32:24,583
But ultimately, there's one person at the end of that line and there's another person at
the beginning of it.
496
00:32:24,583 --> 00:32:27,043
And that's it, right?
497
00:32:27,263 --> 00:32:31,474
You know, there is no assistant in this thing, right?
498
00:32:31,474 --> 00:32:36,174
It's a machine that makes something that comes out of the other end, to put it really
simply.
499
00:32:36,474 --> 00:32:46,274
And I think this is very, very difficult, because it's the clash, and this is something
that Susskind wrote about 25 million years ago, the battle between the artisanal and
500
00:32:46,274 --> 00:32:47,414
the automation.
501
00:32:47,634 --> 00:32:52,642
When I started doing Artificial Lawyer, I chose, I preferred the term, like,
502
00:32:52,642 --> 00:32:59,002
you know, industrial, because I grew up in the Midlands in Britain, which is where parts
of the industrial revolution started.
503
00:32:59,002 --> 00:33:05,533
And I've always thought about, you know, the industrial revolution and how machines
changed everything.
504
00:33:05,533 --> 00:33:10,193
And for me, it just seemed so obvious, like, this will happen to professional
services.
505
00:33:10,193 --> 00:33:17,293
And I think, you know, going back to the early beginnings of NLP and machine
learning, people could see it making some small inroads, but if you said, like,
506
00:33:17,293 --> 00:33:20,601
this is going to spread across the entire sector.
507
00:33:20,601 --> 00:33:22,641
No way, no way, it's not gonna happen.
508
00:33:22,741 --> 00:33:29,861
Now with large language models and language understanding and agentic flows, which are
getting better like by the minute, right?
509
00:33:30,241 --> 00:33:32,441
It actually is doable now.
510
00:33:32,701 --> 00:33:33,861
I think it really is doable.
511
00:33:33,861 --> 00:33:35,241
Do we have accuracy problems?
512
00:33:35,241 --> 00:33:37,021
Yes, accuracy is still a problem.
513
00:33:37,021 --> 00:33:44,492
If you automate an inaccurate system, you just mass produce bad goods, right?
514
00:33:44,492 --> 00:33:47,800
You know, like if you've got like a lathe in your garage,
515
00:33:47,800 --> 00:33:52,380
and you're turning out widgets, and you've got something wrong with the tool bit.
516
00:33:52,380 --> 00:33:56,454
All you're going to do is turn out a thousand widgets that you can't use.
517
00:33:56,454 --> 00:34:01,887
Yeah, but, I mean, okay, look at Waymo, right?
518
00:34:01,887 --> 00:34:06,049
If I stick a Waymo on the streets and it's got something wrong with it, you're going
to kill someone.
519
00:34:06,049 --> 00:34:08,140
I mean, literally you can kill someone, right?
520
00:34:08,140 --> 00:34:09,871
It's really serious stuff.
521
00:34:09,871 --> 00:34:12,723
They've managed to get on top of it.
522
00:34:12,723 --> 00:34:14,207
And now Waymo,
523
00:34:14,207 --> 00:34:17,232
from the data, is actually safer than human drivers.
524
00:34:17,232 --> 00:34:22,224
I heard that it reduces side-impact collisions by, like, more than 95%.
525
00:34:22,224 --> 00:34:32,630
So, you know, going from a hundred thousand a year to 5,000 a year, because
people don't look both ways before they cross an intersection.
526
00:34:32,630 --> 00:34:34,391
You see a green light, you go.
527
00:34:34,391 --> 00:34:39,564
These vehicles have the ability to, you know, they have adjacent visibility.
528
00:34:39,564 --> 00:34:39,975
So.
529
00:34:39,975 --> 00:34:45,146
I was in one in San Francisco, you know, for Legal Innovators California, and it was
absolutely brilliant.
530
00:34:45,146 --> 00:34:46,506
Really fantastic.
531
00:34:46,506 --> 00:34:49,426
Shout out to Todd Smithline for paying for it.
532
00:34:49,426 --> 00:34:50,357
Thank you for that, Todd.
533
00:34:50,357 --> 00:34:51,277
It was so good.
534
00:34:51,277 --> 00:34:51,857
It was so good.
535
00:34:51,857 --> 00:34:54,828
Honestly, I got out of that and I just went, I'm a believer.
536
00:34:54,828 --> 00:34:58,872
It's funny, I mean, it was funny having this debate with a bunch of people.
537
00:34:58,872 --> 00:35:08,732
At this thing last week, they were talking about, you know, oh my God, it's so
good that you have human pilots on a plane, because they can rescue you
538
00:35:08,732 --> 00:35:09,692
if anything happens.
539
00:35:09,692 --> 00:35:12,563
And I was just like, well, why don't you just have an extra AI system?
540
00:35:12,563 --> 00:35:17,863
Have two, have three AI systems, one after the other, you know, a double fail-safe.
541
00:35:18,283 --> 00:35:25,994
I mean, it's just humans hanging on to,
542
00:35:25,994 --> 00:35:27,045
you know, things they got used to.
543
00:35:27,045 --> 00:35:38,165
I mean, it's like, okay, so every time you go up in an elevator, right, there's a guy who
stands in the elevator with a bunch of cushions in case anything goes wrong, right?
544
00:35:38,165 --> 00:35:41,127
It's just like, you just accept it, you go with it, right?
545
00:35:41,127 --> 00:35:45,854
You know, because, literally, how long ago did they invent elevators?
546
00:35:45,854 --> 00:35:47,232
100 years ago?
547
00:35:48,394 --> 00:35:53,911
Gotta be at least since they started building skyscrapers, so since the 1920s, right?
548
00:35:53,911 --> 00:35:55,213
But they worked out the kinks.
549
00:35:55,213 --> 00:35:56,614
I mean, and they've done it.
550
00:35:56,614 --> 00:36:01,136
Same thing with Waymo: when it started, everyone said, this is ridiculous.
551
00:36:01,136 --> 00:36:02,036
It won't work.
552
00:36:02,036 --> 00:36:03,627
And then it kept on not working.
553
00:36:03,627 --> 00:36:04,991
And then people said, there you go.
554
00:36:04,991 --> 00:36:05,818
You're a bunch of idiots.
555
00:36:05,818 --> 00:36:07,238
You've wasted loads of money.
556
00:36:07,238 --> 00:36:08,379
Ha ha ha.
557
00:36:08,379 --> 00:36:11,440
And Tesla didn't help because they kept on saying, hey, we've got a self-driving car.
558
00:36:11,440 --> 00:36:13,042
And it obviously wasn't a self-driving car.
559
00:36:13,042 --> 00:36:14,744
People were like, Elon, you're not helping.
560
00:36:14,744 --> 00:36:15,584
And...
561
00:36:16,456 --> 00:36:21,860
Now they have, but he could have maybe toned it down a few years ago before he really
actually had one.
562
00:36:21,860 --> 00:36:22,550
But they did it.
563
00:36:22,550 --> 00:36:23,388
They actually did it.
564
00:36:23,388 --> 00:36:27,835
I think that is... I don't understand why everyone isn't parading down every street in
America.
565
00:36:27,835 --> 00:36:32,558
Because that, for me, is as impressive as putting a man on the moon.
566
00:36:32,558 --> 00:36:33,738
It really is.
567
00:36:33,926 --> 00:36:41,926
You had somebody on your podcast recently that threw out a number that I'd love to bounce
off of you.
568
00:36:41,926 --> 00:36:54,166
Speaking of automation, you know, shortly after ChatGPT-3.5 was released, Goldman came
out and said, I think it was 44% of legal tasks were subject to automation by
569
00:36:54,166 --> 00:36:55,306
artificial intelligence.
570
00:36:55,306 --> 00:37:02,646
And you had somebody on who was talking about, by 2027, 80% of legal tasks could be
automated.
571
00:37:02,646 --> 00:37:03,399
And that
572
00:37:03,399 --> 00:37:05,199
that number seems really high.
573
00:37:05,199 --> 00:37:09,499
Actually the 44 % number still seems really high to me.
574
00:37:09,619 --> 00:37:10,991
In 2027 that
575
00:37:10,991 --> 00:37:16,411
Well, I mean, we're going to end up in a sort of big semantic thing around, you know,
what is a task?
576
00:37:16,411 --> 00:37:19,271
When does the task begin and end, and so forth?
577
00:37:19,311 --> 00:37:30,091
I mean, if people want to listen to it, it's Richard Mabey, who's the CEO of Juro,
a legal tech company focused on in-house, and it's on my Law Punks podcast.
578
00:37:30,091 --> 00:37:38,771
If you just type Law Punks, P-U-N-X, not K-S, P-U-N-X, into Artificial Lawyer, you'll find it.
579
00:37:38,791 --> 00:37:39,639
Um,
580
00:37:40,127 --> 00:37:42,807
Yeah, I mean, broadly, I agree.
581
00:37:43,047 --> 00:37:46,627
Will more and more and more tasks be automatable?
582
00:37:46,627 --> 00:37:47,467
Yeah.
583
00:37:47,947 --> 00:37:51,107
Secondly, the issue is will they get automated?
584
00:37:51,107 --> 00:37:57,158
Well, given all the things we've just talked about, probably at the point of an economic
gun, right?
585
00:37:57,218 --> 00:37:58,578
So not anytime soon.
586
00:37:58,578 --> 00:37:59,938
Now for in-house, it is different.
587
00:37:59,938 --> 00:38:01,338
Obviously he's focused on in-house.
588
00:38:01,338 --> 00:38:06,738
So he's very much focused on what he's seeing day to day, and he
589
00:38:06,738 --> 00:38:08,090
really does believe
590
00:38:08,090 --> 00:38:10,010
there can be massive change very rapidly.
591
00:38:10,010 --> 00:38:18,361
And I think in-house probably does have, I'd hope so anyway, less of a barrier, but we've
been saying that for years, right?
592
00:38:18,361 --> 00:38:19,501
We've been saying that for years.
593
00:38:19,501 --> 00:38:27,392
In-house teams, they obviously are gonna defeat the billable hour because why would they
want it?
594
00:38:27,392 --> 00:38:28,252
But they do.
595
00:38:28,252 --> 00:38:35,264
Many law firms say, actually, we suggested to blah blah, general counsel of a large
corporate, we'll stop doing billable hours if you want.
596
00:38:35,264 --> 00:38:37,464
And the general counsel's like, no, no, no, please keep doing it.
597
00:38:37,464 --> 00:38:40,655
Makes it much, much easier for me to, you know, keep track of things.
598
00:38:40,655 --> 00:38:42,534
So I don't know.
599
00:38:42,534 --> 00:38:43,775
I think for me it's the human aspect.
600
00:38:43,775 --> 00:38:45,575
I think technically he is right.
601
00:38:45,575 --> 00:38:47,615
I think technically he's absolutely right.
602
00:38:47,615 --> 00:38:49,035
Could it happen?
603
00:38:49,035 --> 00:38:50,835
Yes, probably.
604
00:38:51,175 --> 00:38:57,775
If we keep going the way we're going, will it happen because of human and economic factors
and cultural factors?
605
00:38:57,775 --> 00:38:59,595
I think, I hate to say it, probably not.
606
00:38:59,595 --> 00:39:02,575
My guess is, I think I might have said before on this program,
607
00:39:02,575 --> 00:39:06,606
my bet for real, true transformation is in about 12 years.
608
00:39:06,606 --> 00:39:07,966
Now you might say, well, that's terrible.
609
00:39:07,966 --> 00:39:10,666
That's a really long way, but I'm talking like real, real transformation.
610
00:39:10,666 --> 00:39:11,646
So is it worth it?
611
00:39:11,646 --> 00:39:17,026
For example, if I said to you, there'll be no oil and gas in 12 years, we'll have a
completely green economy.
612
00:39:17,026 --> 00:39:22,886
There will be no more pollution, at least from, you know, the energy sector in 12 years.
613
00:39:22,886 --> 00:39:24,366
You'd be like, well, I don't think that's going to happen.
614
00:39:24,366 --> 00:39:26,046
But if it did, that would be amazing.
615
00:39:26,046 --> 00:39:27,606
It would completely transform the global economy.
616
00:39:27,606 --> 00:39:30,146
It would have geopolitical consequences.
617
00:39:30,366 --> 00:39:31,022
Right.
618
00:39:31,114 --> 00:39:36,816
Like, bye-bye Saudi Arabia, you know, they're gonna have to move pretty quickly into solar
power.
619
00:39:36,816 --> 00:39:38,497
It would have huge, huge effects.
620
00:39:38,497 --> 00:39:50,883
Now, I honestly do believe that in about 12 years that is gonna happen in professional
services, knowledge services, because if AI is where it is now, just
621
00:39:50,883 --> 00:39:59,458
less than three years after the launch of ChatGPT, when OpenAI was the only company
seriously investing
622
00:39:59,548 --> 00:40:05,428
in large language models, and look now, there are more than you can even poke a
stick at, right?
623
00:40:05,448 --> 00:40:07,199
And there's more every day.
624
00:40:07,199 --> 00:40:10,059
I mean, and look at the way it's building.
625
00:40:10,259 --> 00:40:12,659
So, like, look at Lovable, right?
626
00:40:12,659 --> 00:40:18,210
You just type into, like, a prompt box: build me a job website.
627
00:40:18,210 --> 00:40:20,178
20 minutes later, there it is.
628
00:40:20,178 --> 00:40:21,858
It's incredible.
629
00:40:21,858 --> 00:40:24,598
And then, you know, it's something I remember talking to Zach about ages ago.
630
00:40:24,598 --> 00:40:30,278
He was just saying, you know, like, when the internet began, people just used it for very
basic search.
631
00:40:30,418 --> 00:40:33,318
And then you've got these like layers of an onion.
632
00:40:33,318 --> 00:40:36,318
They accreted, they built up and built up and built up.
633
00:40:36,318 --> 00:40:41,838
And now, you know, the internet has become a source for LLMs.
634
00:40:41,838 --> 00:40:49,272
And then through this bizarre sort of combination of data sources and LLMs and agents
and all this kind of thing, it's all
635
00:40:49,272 --> 00:40:57,747
merging together into this huge kind of ecosystem that surrounds us now in a way that we
couldn't even have imagined when the internet first appeared.
636
00:40:57,788 --> 00:41:04,912
And on top of all of that will be new understandings and new uses for AI, which we haven't
even started to grasp yet.
637
00:41:05,312 --> 00:41:13,469
You know, that's the thing. If you think about it, right, Lovable could not have
existed — not in the way that it does today — or Cursor, for example, could not have existed
638
00:41:13,469 --> 00:41:14,209
the way it does today.
639
00:41:14,209 --> 00:41:18,722
if it hadn't been for the work of people like Sam Altman and his group, right,
640
00:41:18,812 --> 00:41:21,104
back in the late teens.
641
00:41:21,104 --> 00:41:23,025
But look how fast things have moved.
642
00:41:23,465 --> 00:41:24,807
Where are we gonna be in five years?
643
00:41:24,807 --> 00:41:25,687
And then 10.
644
00:41:25,687 --> 00:41:35,411
And then, if you really, really wanna push the boat out — do you think we'll have
some usable quantum computing technology in a decade?
645
00:41:35,572 --> 00:41:37,894
Maybe, not guaranteed.
646
00:41:37,894 --> 00:41:38,534
But you know what I mean?
647
00:41:38,534 --> 00:41:45,157
It's like, even without that — even without that extra super-powerful compute power — look
at the chips they're knocking out now.
648
00:41:45,481 --> 00:41:55,201
Look at the investments, the gigantic investments from Cisco into OpenAI, the
relationship with NVIDIA — now OpenAI is doing a deal with AMD.
649
00:41:55,661 --> 00:41:59,972
And there's just billions upon billions flowing into this.
650
00:41:59,972 --> 00:42:00,543
Right.
651
00:42:00,543 --> 00:42:06,563
So yeah, I mean, I really do believe it's going to be a complete sort of tech
ecosystem transformation.
652
00:42:06,743 --> 00:42:10,476
Very, very similar to what we saw from the late nineties onwards with the internet.
653
00:42:10,476 --> 00:42:12,598
And we're seeing, we're seeing examples of that.
654
00:42:12,598 --> 00:42:23,146
You know, the Garfields in the UK, the Crosbys in New York — you mentioned Covenant —
it's like, these are... this isn't theory anymore.
655
00:42:23,146 --> 00:42:24,879
This is, this is happening.
656
00:42:24,879 --> 00:42:27,501
And, um, and they're delivering real work.
657
00:42:27,501 --> 00:42:30,113
Like I know a little bit about Crosby.
658
00:42:30,113 --> 00:42:35,648
Sequoia is one of their capital partners, and Sequoia has a podcast called
Training Data.
659
00:42:35,648 --> 00:42:38,121
And they had the founders of Crosby on.
660
00:42:38,121 --> 00:42:41,792
And what they're doing essentially is master services agreements.
661
00:42:41,792 --> 00:42:49,157
They have a very narrow niche — master services agreements for big companies, you know,
like Clay and Stripe.
662
00:42:49,157 --> 00:42:54,000
And they are delivering with a very small number of attorneys.
663
00:42:54,020 --> 00:42:56,211
So, I mean, this is, this is happening now.
664
00:42:56,211 --> 00:42:57,811
This isn't theory.
665
00:42:57,811 --> 00:42:58,422
So
666
00:42:58,615 --> 00:43:00,615
But again, it's scale, isn't it?
667
00:43:00,615 --> 00:43:00,995
It's scale.
668
00:43:00,995 --> 00:43:02,215
I mean, this is the thing — look at Lovable.
669
00:43:02,215 --> 00:43:07,655
Lovable grew from like, you know, like $1 to tens of millions of dollars in about 12
months.
670
00:43:07,655 --> 00:43:12,455
It was instantly usable and there was no barrier, right?
671
00:43:12,475 --> 00:43:18,586
Whereas legal tech has always suffered from — and I'm going back to the same point —
structural barriers, right?
672
00:43:18,586 --> 00:43:19,866
I mean, Covenant is a great model.
673
00:43:19,866 --> 00:43:20,726
Crosby is a great model.
674
00:43:20,726 --> 00:43:21,797
Garfield is a great model.
675
00:43:21,797 --> 00:43:23,655
Why aren't there 200 of these?
676
00:43:23,655 --> 00:43:26,048
or why doesn't Crosby
677
00:43:30,644 --> 00:43:33,119
mean, they've only been in business a year, right?
678
00:43:33,119 --> 00:43:35,016
So they're brand new.
679
00:43:35,016 --> 00:43:37,257
I suppose there's a really interesting calculation.
680
00:43:37,257 --> 00:43:45,197
I'm going to have to do it at some point, which is — I mean, if you look at the total revenue of all
the legal tech companies on the planet, right?
681
00:43:45,217 --> 00:43:54,297
And this is not a zero-sum type calculation, which makes it very, very complicated, but
start off with: what is the total revenue of all the legal tech companies in the world,
682
00:43:54,297 --> 00:43:54,917
right?
683
00:43:54,917 --> 00:43:57,277
And break it down into what they do.
684
00:43:57,477 --> 00:44:03,093
So if we look at the more productivity side — so let's put aside all project
management, billing,
685
00:44:03,093 --> 00:44:10,156
all that kind of stuff, time tracking — take that out and just look at the
productivity tools, and look at the revenue that comes into them.
686
00:44:10,156 --> 00:44:10,636
Right.
687
00:44:10,636 --> 00:44:17,709
Now that is a measure, to some degree, of how far the business of law has integrated these
technologies.
688
00:44:17,709 --> 00:44:18,000
Right.
689
00:44:18,000 --> 00:44:22,942
And then once you've done that, and it's very, very complicated to figure that out, maybe
McKinsey can do it.
690
00:44:22,942 --> 00:44:28,854
Then look at what percentage that is of the total legal tech market.
691
00:44:29,296 --> 00:44:33,016
sorry, legal market, sorry, it's very late in the day here.
692
00:44:33,016 --> 00:44:38,656
What percentage it is of the total revenue of the whole legal market, right?
693
00:44:38,656 --> 00:44:40,496
Now, these are not exact fractions, right?
694
00:44:40,496 --> 00:44:51,316
You can't say, you know, if someone spends 20 million on X legal tech company over a year,
or a bunch of different people spend 20 million, doesn't mean that that has been extracted
695
00:44:51,316 --> 00:44:56,696
from the legal market, because it may actually be empowering the legal market to do more.
696
00:44:56,696 --> 00:44:57,646
The same way,
697
00:44:57,646 --> 00:45:04,437
that law firms spent a fortune on Microsoft Word and email, and that enabled them to make
a hell of a lot more money, right?
698
00:45:04,437 --> 00:45:08,299
So I think I'd love to see some financial analysis around that.
699
00:45:08,299 --> 00:45:14,261
So like money that's coming out of a legal market into legal tech that then in turn drives
productivity.
700
00:45:14,261 --> 00:45:22,563
And then how much of a sort of force multiplier effect that has on those lawyers.
701
00:45:22,563 --> 00:45:24,823
which enables the market to grow even larger.
702
00:45:24,823 --> 00:45:28,923
So it becomes like this sort of supporting cycle.
703
00:45:28,923 --> 00:45:30,943
And we just don't really have the data.
704
00:45:30,943 --> 00:45:32,583
We just don't have the data.
705
00:45:32,583 --> 00:45:41,294
And there's a generally very reductive view — you know, the classic reductive view is, if you spend a
dollar on an AI tool, you're taking a dollar out of a lawyer's pocket, right?
706
00:45:41,294 --> 00:45:43,294
Which I don't think really follows at all.
707
00:45:43,294 --> 00:45:44,994
It could, it could in certain streams.
708
00:45:44,994 --> 00:45:49,485
And that's why it gets even more complicated because there are certain streams of work,
which I think will go away.
709
00:45:49,485 --> 00:45:53,209
right, they will get automated out of existence in terms of the lawyers putting their
fingers on it.
710
00:45:53,209 --> 00:45:55,391
It will become commoditized, right?
711
00:45:55,391 --> 00:45:56,432
That will happen.
712
00:45:56,432 --> 00:46:02,457
But there'll be other areas of work where the AI will actually massively increase what
lawyers can do.
713
00:46:02,457 --> 00:46:06,319
And they may end up becoming way, way, way more wealthy than they are today.
714
00:46:06,320 --> 00:46:09,913
And it's quite difficult because both of these pictures are true.
715
00:46:09,913 --> 00:46:14,537
And I think that's one of the things — people like clear pictures, right?
716
00:46:14,537 --> 00:46:16,548
You know, AI will destroy all lawyers.
717
00:46:16,548 --> 00:46:17,533
No, that's not true.
718
00:46:17,533 --> 00:46:19,773
Okay, so AI is irrelevant.
719
00:46:20,053 --> 00:46:21,333
No, that's not true.
720
00:46:21,333 --> 00:46:30,893
So it's kind of like, you know — how can AI actually eat up, probably, what will in 10
years be a significant chunk of what is today the legal market?
721
00:46:31,373 --> 00:46:37,964
And yet at the end of the day, Christmas 2035, is that right?
722
00:46:37,964 --> 00:46:40,144
Yeah, Christmas 35, right?
723
00:46:40,624 --> 00:46:43,364
There'll be a lot of lawyers who are saying, my God, thank God for AI.
724
00:46:43,364 --> 00:46:44,616
I'm so much wealthier now.
725
00:46:44,616 --> 00:46:50,499
And I don't want to jump out of a window because that's really the mindset of a lot of
lawyers today.
726
00:46:50,499 --> 00:46:56,425
It can be a very brutal profession and coming up the ranks is sometimes a painful process.
727
00:46:56,425 --> 00:46:57,699
uh
728
00:46:57,699 --> 00:47:06,279
That's one of the things — I mean, I gave a talk in Oxford the other day, and I don't know
if I've mentioned it in public, at a meeting of something called Gunnercooke, which is a
729
00:47:06,279 --> 00:47:07,859
distributed law firm.
730
00:47:07,859 --> 00:47:08,099
Right.
731
00:47:08,099 --> 00:47:11,470
So they're all individuals, but they work under the umbrella of Gunnercooke.
732
00:47:11,470 --> 00:47:12,210
Right.
733
00:47:12,330 --> 00:47:15,830
And they're using AI systems now, which is fantastic.
734
00:47:15,850 --> 00:47:24,070
And, you know, we were talking about the fact that it's all positive for them, because
they don't have
735
00:47:24,070 --> 00:47:24,900
leverage.
736
00:47:24,900 --> 00:47:28,521
It's mostly made up of experienced lawyers, right?
737
00:47:28,521 --> 00:47:31,553
They don't have this huge leverage associate group beneath them.
738
00:47:31,553 --> 00:47:34,885
And having AI greatly increases what they can do.
739
00:47:34,885 --> 00:47:41,817
And it works like that: quite often, if a client comes to an individual in a group like this,
one of these distributed law firms, they're not coming to them because they know they've
740
00:47:41,817 --> 00:47:44,288
got 25 associates working in the basement.
741
00:47:44,288 --> 00:47:49,820
They come to them because they know that person has real expertise and will really add
value on a very human level.
742
00:47:49,820 --> 00:47:51,440
But of course,
743
00:47:51,766 --> 00:47:55,048
leverage based work will creep in.
744
00:47:55,048 --> 00:47:56,630
But now they can use AI to do that.
745
00:47:56,630 --> 00:47:58,872
So I mean, it's an absolute win-win for them.
746
00:47:58,872 --> 00:48:10,549
And again, for these hybrid AI companies and the ALSPs and all of these — I mean, this
is the thing that is kind of tantalizing — we really are starting to see more and
747
00:48:10,549 --> 00:48:14,040
more examples of organizations using AI.
748
00:48:14,356 --> 00:48:15,946
and it's just all positive.
749
00:48:15,946 --> 00:48:22,786
And simultaneously, we're seeing both some law firms and some in-house legal teams kind of
gritting their teeth and going, oh, I'm not sure.
750
00:48:22,786 --> 00:48:23,826
I'm not sure about this.
751
00:48:23,826 --> 00:48:24,946
I'm not sure about this.
752
00:48:24,946 --> 00:48:26,446
Is this a good idea?
753
00:48:26,686 --> 00:48:28,826
And you're just like...
754
00:48:28,857 --> 00:48:32,459
Yeah, there's still a lot of pearl clutching going on.
755
00:48:32,459 --> 00:48:39,384
Well, we're almost out of time, but this has been a great conversation as it always is
every time we have you on.
756
00:48:39,384 --> 00:48:46,589
Before we hop off, how do people find out more about Legal Innovators and the writing that
you do?
757
00:48:46,936 --> 00:48:50,707
Go to ChatGPT — and I'm not even kidding.
758
00:48:50,707 --> 00:48:56,090
I get a lot of traffic from LLMs now, uh, Perplexity and others, right?
759
00:48:56,090 --> 00:48:59,792
Go to ChatGPT and ask it about Artificial Lawyer.
760
00:48:59,792 --> 00:49:02,592
And hopefully it'll say something nice about me and provide a link.
761
00:49:02,592 --> 00:49:08,495
But if you don't want to do that, just go into Google and type artificiallawyer.com and
you'll find me.
762
00:49:08,495 --> 00:49:10,447
But yeah, on the New York event.
763
00:49:10,447 --> 00:49:20,886
I imagine some of your listeners are in the US. The New York event, Legal Innovators
New York — the website is literally legalinnovatorsnewyork.com — will be on the 19th
764
00:49:20,886 --> 00:49:22,878
and 20th of November.
765
00:49:22,878 --> 00:49:31,330
Day one is law firms, day two is in-house and obviously we'd like to see everybody on both
days but if you want to focus on one or the other then you can take your pick and it will
766
00:49:31,330 --> 00:49:32,630
be in Midtown.
767
00:49:33,041 --> 00:49:33,801
Okay.
768
00:49:34,021 --> 00:49:35,462
That's great to hear.
769
00:49:35,462 --> 00:49:38,545
And I'm a big fan of the Artificial Lawyer site.
770
00:49:38,545 --> 00:49:40,226
I'm very selective.
771
00:49:40,226 --> 00:49:42,316
I get a few newsletters that I read every day.
772
00:49:42,316 --> 00:49:43,207
Yours is one of them.
773
00:49:43,207 --> 00:49:46,188
So um I really appreciate your time.
774
00:49:46,188 --> 00:49:50,291
Best of luck with the conference and I'm sure we'll bump into each other sometime real
soon.
775
00:49:50,291 --> 00:49:51,325
I hope so, definitely.
776
00:49:51,325 --> 00:49:51,872
Thank you.
777
00:49:51,872 --> 00:49:52,737
All right, take care.
778
00:49:52,737 --> 00:49:54,185
Bye bye.
00:00:02,334
Richard Tromans, how are you this afternoon?
2
00:00:02,597 --> 00:00:04,721
ah Good, good, good.
3
00:00:04,721 --> 00:00:12,334
Slightly frazzled around the edges, but I've been turned over and I'm sunny side up
again now, so I'm pretty good.
4
00:00:12,334 --> 00:00:14,156
uh
5
00:00:14,156 --> 00:00:14,527
good.
6
00:00:14,527 --> 00:00:16,123
That's better than sunny side down.
7
00:00:16,123 --> 00:00:17,593
um
8
00:00:17,593 --> 00:00:18,194
I don't like that.
9
00:00:18,194 --> 00:00:20,225
I mean, yeah, it seems then we'll go scrambled.
10
00:00:20,225 --> 00:00:20,945
There you go.
11
00:00:20,945 --> 00:00:27,589
Yeah, I'm an over-medium kind of guy — whatever floats your boat.
12
00:00:27,589 --> 00:00:35,326
So I know you, and I think most of our audience knows you, but for the ones that don't, just
give us a quick introduction about who you are, what you do, and where you do it.
13
00:00:35,326 --> 00:00:38,238
Yeah, well, I'll give you the sensible two minute version.
14
00:00:38,238 --> 00:00:47,874
So I've been working in the legal sector for about 25 years. I started off as a proper journalist,
went to journalism school for my sins, ended up at a magazine called Legal Week, which
15
00:00:47,874 --> 00:00:52,837
doesn't exist anymore, which was bought eventually by law.com, focused on the
international world.
16
00:00:52,837 --> 00:00:55,278
Then I became a foreign correspondent.
17
00:00:55,278 --> 00:01:02,122
After a couple of years decided that I'd much rather focus on the business of law, ended
up in the city.
18
00:01:02,170 --> 00:01:05,450
I became a management consultant and helped to merge law firms together.
19
00:01:05,450 --> 00:01:19,745
It was very much focused on research of the market, the big macroeconomic picture; I did a lot
of work over the years on profitability, business structure, business culture, and all of
20
00:01:19,745 --> 00:01:20,506
those things.
21
00:01:20,506 --> 00:01:24,869
And really, if I had been happy doing that, you probably would never have met me.
22
00:01:24,869 --> 00:01:28,873
In fact, no one in the legal tech world would ever, ever have met me.
23
00:01:28,873 --> 00:01:30,734
Artificial Lawyer would not exist.
24
00:01:30,790 --> 00:01:38,960
and I would be living in Surrey in a nice little cottage and commuting into London to do
my little bits of consulting work with big law firms.
25
00:01:38,960 --> 00:01:42,043
And I'd still be wearing a suit and all of that kind of stuff.
26
00:01:42,043 --> 00:01:48,479
But, fatefully, in 2015, I went and started getting interested in technology.
27
00:01:48,479 --> 00:01:53,793
And I became fascinated with AI and I've always been interested in science in my free
time.
28
00:01:53,915 --> 00:01:59,247
And in 2016, it bubbled up to the point where I suddenly realized, my God, it's going
to happen.
29
00:01:59,247 --> 00:02:00,768
It's really going to happen.
30
00:02:00,768 --> 00:02:02,169
AI is going to change everything.
31
00:02:02,169 --> 00:02:10,373
And I launched Artificial Lawyer, which combined a bit of my business analysis and macro
picture skills with my good old fashioned journalistic skills.
32
00:02:10,373 --> 00:02:18,776
People kind of think that I went from being a journalist to running Artificial Lawyer, but
there's like an enormous gap in the middle of running around the city in a suit, trying to
33
00:02:18,776 --> 00:02:21,667
advise people on business, which no one ever talks about.
34
00:02:21,667 --> 00:02:23,580
understandably, it wasn't very exciting.
35
00:02:23,580 --> 00:02:29,905
It was, to begin with. I mean, honestly, when you've been a reporter going,
tell me about your profitability —
36
00:02:29,905 --> 00:02:38,693
And then you get invited inside the inner sanctum and they show you, you know,
they show you the spreadsheet, and you see everything, and you're like, wow. You get to talk to
37
00:02:38,693 --> 00:02:41,216
managing partners and practice heads and everything.
38
00:02:41,216 --> 00:02:44,019
You get a completely different sense of what's going on.
39
00:02:44,019 --> 00:02:44,579
And
40
00:02:44,579 --> 00:02:53,259
That, for me, is why those two things mattered: when AI came around, I just thought, yes, this
is going to change everything.
41
00:02:53,259 --> 00:02:59,799
And then of course it didn't, but I hung in there like a fool.
42
00:02:59,799 --> 00:03:01,379
I hung in there, and guess what?
43
00:03:01,379 --> 00:03:03,902
In 2022, it became real.
44
00:03:03,902 --> 00:03:04,502
Yeah.
45
00:03:04,502 --> 00:03:09,482
And, well, you were — I've talked about this on a previous episode with you — you were
early.
46
00:03:09,562 --> 00:03:23,351
So really, the birth of generative AI as I know it was, you know, the "Attention Is All
You Need" paper from Google, which I think was 2017, right?
47
00:03:23,351 --> 00:03:30,591
Around that time. We were just talking about this because I'm just reading, you
know, the book Empire of AI. Great book.
48
00:03:31,071 --> 00:03:33,591
Some bits that aren't so great, but broadly.
49
00:03:35,111 --> 00:03:39,271
Let's not get into the whole looking for victims kind of thing.
50
00:03:39,430 --> 00:03:45,051
But it reminds me of that South Park episode about Meghan and Harry.
51
00:03:45,871 --> 00:03:49,091
CSI Special Victims Unit, you know.
52
00:03:49,320 --> 00:03:53,130
There's a lot of sour grapes and whining and that, but yeah.
53
00:03:53,130 --> 00:03:53,661
Yeah.
54
00:03:53,661 --> 00:03:54,071
right.
55
00:03:54,071 --> 00:03:54,891
We'll edit that out.
56
00:03:54,891 --> 00:03:56,143
This will be censored.
57
00:03:56,143 --> 00:04:03,291
This will be sent to the George Orwell School of video editing to be sanitized before
publication.
58
00:04:03,291 --> 00:04:09,597
But the interesting thing is that Google was one of the ones, if not the one, that really
helped to pioneer the transformer model.
59
00:04:09,597 --> 00:04:11,489
And they didn't really take it that far.
60
00:04:11,489 --> 00:04:17,352
And then OpenAI, Altman and the team around him and Musk as well.
61
00:04:17,352 --> 00:04:20,132
deserve a lot of credit for backing them early on.
62
00:04:20,132 --> 00:04:21,432
They just picked it up and ran with it.
63
00:04:21,432 --> 00:04:25,552
This is one of the cool things, right, about them that I saw in the book, which is really
fascinating to me.
64
00:04:25,552 --> 00:04:27,232
They started off doing everything.
65
00:04:27,232 --> 00:04:28,732
They were doing the video.
66
00:04:28,732 --> 00:04:31,023
They also were doing robotics, right?
67
00:04:31,023 --> 00:04:37,863
They had a robotic hand because they were focused on AGI as they conceptualized it back in
sort of 2016, 2017.
68
00:04:38,203 --> 00:04:42,743
They didn't know, they just had this idea there was going to be this super intelligent AI.
69
00:04:42,903 --> 00:04:43,893
right, human-like AI.
70
00:04:43,893 --> 00:04:46,644
They weren't quite sure how it was going to emerge.
71
00:04:46,644 --> 00:04:47,775
And they were looking at everything.
72
00:04:47,775 --> 00:04:51,386
They were looking at, like, sort of visual, video kind of stuff.
73
00:04:51,386 --> 00:04:53,166
They were looking at the physical realm.
74
00:04:53,166 --> 00:04:55,567
And they were also looking at language models.
75
00:04:55,707 --> 00:04:58,458
But those were all seen as equally interesting.
76
00:04:58,458 --> 00:05:00,118
They didn't quite know which way it was going to go.
77
00:05:00,118 --> 00:05:05,810
And eventually, of course, once the LLM bit took off, they closed down the robotics bit or
got rid of it.
78
00:05:05,810 --> 00:05:08,952
But now, interestingly, they've restarted it.
79
00:05:08,952 --> 00:05:10,392
I'm going off on a tangent.
80
00:05:10,392 --> 00:05:15,832
I think the point is — I mean, the funny thing is, all of that was going on
unbeknownst to me.
81
00:05:15,832 --> 00:05:24,492
While I was running around trying to talk to people like Kira — you know, Noah Waisberg —
and eBrevia and companies like that, talking to them about natural language processing,
82
00:05:24,492 --> 00:05:25,663
machine learning, la la la.
83
00:05:25,663 --> 00:05:36,634
And in an office somewhere in San Francisco, Palo Alto, were a bunch of people building
the future with very, very little fanfare, very, very little fanfare.
84
00:05:36,634 --> 00:05:37,144
Yeah.
85
00:05:37,144 --> 00:05:37,564
Yeah.
86
00:05:37,564 --> 00:05:46,299
But you came to the game earlier than most, and you know, I paid attention,
and my listeners have heard me talk about this before.
87
00:05:46,299 --> 00:05:53,424
It was when Watson on Jeopardy was kind of my first introduction, right?
88
00:05:53,491 --> 00:05:54,565
It was IBM.
89
00:05:54,565 --> 00:05:54,946
Yep.
90
00:05:54,946 --> 00:05:59,650
And then AlphaGo, which was, again, roughly the same time period.
91
00:05:59,650 --> 00:06:01,693
2017, 2016.
92
00:06:01,693 --> 00:06:02,025
wasn't it?
93
00:06:02,025 --> 00:06:04,865
Wasn't that, that was out of, yeah.
94
00:06:04,865 --> 00:06:10,539
DeepMind — Demis Hassabis — and Google ended up buying DeepMind.
95
00:06:10,539 --> 00:06:16,446
So Google has had ambitions and has made investments in AI for a very long time.
96
00:06:16,446 --> 00:06:20,129
And you know, their first couple of models really sucked.
97
00:06:20,129 --> 00:06:22,731
I mean, Bard was pretty bad.
98
00:06:22,731 --> 00:06:24,993
Um, they've really closed the gap.
99
00:06:24,993 --> 00:06:27,755
Gemini 2.5 is quite good.
100
00:06:27,755 --> 00:06:30,296
And I think they're on the verge of releasing.
101
00:06:30,318 --> 00:06:31,831
whatever that next version is.
102
00:06:31,831 --> 00:06:35,747
yeah, Google, Google was kind of slowly then suddenly.
103
00:06:36,779 --> 00:06:37,645
but they've caught up.
104
00:06:37,645 --> 00:06:38,525
It's a funny thing, isn't it?
105
00:06:38,525 --> 00:06:41,825
You get these plateaus, kind of like watershed moments.
106
00:06:41,825 --> 00:06:49,445
But I mean, the interesting thing here is that the basic, you might say economic theory,
the sort of techno economic theory, whatever you want to call it.
107
00:06:49,725 --> 00:06:59,205
But I sort of — I would say I developed it, but I certainly spent a ridiculously large
amount of time working on it, you know, around moving away from the — I mean,
108
00:06:59,205 --> 00:07:02,661
all the key concepts were all there, well developed in the market.
109
00:07:02,661 --> 00:07:11,541
You might say that the only thing Artificial Lawyer really did was just kind of tie the
little loose ends together and basically build almost like a worldview.
110
00:07:11,541 --> 00:07:13,521
This is the way it's going to be people, right?
111
00:07:13,521 --> 00:07:14,881
This is the way things are going to evolve.
112
00:07:14,881 --> 00:07:17,332
Well, they have to evolve if AI is going to be relevant at all.
113
00:07:17,332 --> 00:07:21,852
And it all came together like in a few months of 2016, right?
114
00:07:21,852 --> 00:07:23,812
Because it's not rocket science anyway, is it?
115
00:07:24,192 --> 00:07:29,808
And then the technology has moved on, but the basic ideas remain the same.
116
00:07:29,808 --> 00:07:33,611
It's rather like the green movement — you know, batteries have improved.
117
00:07:33,770 --> 00:07:39,037
Now we have self-driving cars, which without humans are both safer and will be more
efficient.
118
00:07:39,037 --> 00:07:40,759
Electric vehicles.
119
00:07:40,759 --> 00:07:51,488
Today they announced, I'm not sure which body, one of the large global energy bodies
announced that this year was the first time ever that, well, actually you're going to have
120
00:07:51,488 --> 00:07:53,730
to edit this bit because I can't remember the facts.
121
00:07:53,730 --> 00:07:55,161
It's the...
122
00:07:55,282 --> 00:08:01,358
I think it was the first time that renewable energy overtook fossil fuels on a global
basis, something like that.
123
00:08:01,358 --> 00:08:01,999
Yeah.
124
00:08:01,999 --> 00:08:10,715
But the point is, that the tech moves forward, but the people who pioneered green energy
had thought all of this through 20 years ago.
125
00:08:10,715 --> 00:08:11,896
Right.
126
00:08:11,896 --> 00:08:13,247
It's like, okay, so we're going to need wind.
127
00:08:13,247 --> 00:08:15,820
And then someone says, yes, but wind is very variable.
128
00:08:15,820 --> 00:08:19,233
So you're going to need better battery technology.
129
00:08:19,233 --> 00:08:20,093
People go, yep, yep.
130
00:08:20,093 --> 00:08:20,664
Okay, right.
131
00:08:20,664 --> 00:08:21,784
Let's develop that.
132
00:08:21,784 --> 00:08:22,184
Right?
133
00:08:22,184 --> 00:08:27,264
And we're going to need a mix, and, you know, we're going to need better transmission
systems and so forth.
134
00:08:27,264 --> 00:08:30,304
And we're going to massively have to improve battery technology.
135
00:08:30,644 --> 00:08:31,244
So on and so on.
136
00:08:31,244 --> 00:08:32,635
And then we're going to have to invest in some.
137
00:08:32,635 --> 00:08:42,495
And it's like the groundwork was laid out 20 years ago, and we see this again and again in
society — the basic facts are pretty obvious for all to see.
138
00:08:42,515 --> 00:08:43,355
Right?
139
00:08:43,355 --> 00:08:46,986
It's just, they don't have the technology or the investment to make it real.
140
00:08:46,986 --> 00:08:49,657
Um, and then eventually the technology comes along.
141
00:08:49,657 --> 00:08:57,090
You know, it goes back to the old anecdotes about revolutions needing the right
conditions, right?
142
00:08:57,490 --> 00:09:02,660
You know, there are always revolutionaries in any society, at any point in history, all the way
back to ancient Greece.
143
00:09:02,660 --> 00:09:07,373
There's always a bunch of revolutionaries running around in their togas, you know, going,
right.
144
00:09:07,373 --> 00:09:09,864
But it only takes off if you get the right conditions.
145
00:09:09,864 --> 00:09:11,121
And it's the same with technology.
146
00:09:11,121 --> 00:09:18,058
I mean, like I could have shouted until I was blue in the face until I keeled over about
getting rid of the billable hour.
147
00:09:18,058 --> 00:09:27,573
by automating work streams, about rethinking the business model of law firms, about
thinking about the end outputs, how that benefits society as a whole, all that kind of
148
00:09:27,573 --> 00:09:28,064
stuff,
149
00:09:28,064 --> 00:09:29,387
And we may never have got there.
150
00:09:29,387 --> 00:09:38,651
NLP would have just fizzled out, plateaued out, and you and I would still be, I mean, I
don't know what I would be doing, but I'd still be doing legal tech.
151
00:09:38,651 --> 00:09:39,393
I'm not sure.
152
00:09:39,393 --> 00:09:41,653
Well, you know, this is it.
153
00:09:41,653 --> 00:09:46,253
I had a similar experience, similar, similar dynamic with cloud.
154
00:09:46,253 --> 00:09:57,833
So I was very early to pushing cloud in legal when I really didn't understand just how
much friction towards change there was.
155
00:09:57,833 --> 00:10:00,033
So this is the early 2010s.
156
00:10:00,033 --> 00:10:08,865
And when something called Wave 14 came through for Office 365 and made it usable —
like, pre-Wave
157
00:10:08,865 --> 00:10:17,205
14, or maybe it was Wave 15, Office 365, now called Microsoft 365, was not great.
158
00:10:17,205 --> 00:10:24,185
Um, but I really thought to myself, wow — you know, I know what Microsoft junkies law
firms are.
159
00:10:24,185 --> 00:10:25,345
This makes perfect sense.
160
00:10:25,345 --> 00:10:32,085
I know how little they like to invest in technology, uh, which has historically been
the case.
161
00:10:32,085 --> 00:10:35,445
Um, moving to the cloud improves economics.
162
00:10:35,445 --> 00:10:38,025
It improves the security posture.
163
00:10:38,025 --> 00:10:38,978
Um,
164
00:10:38,978 --> 00:10:42,618
It certainly can scale up much more efficiently.
165
00:10:43,037 --> 00:10:47,498
And man, that didn't happen for about another eight years.
166
00:10:47,498 --> 00:10:49,138
Not in the US, anyway.
167
00:10:49,138 --> 00:10:54,409
So law firms in the US really did not start to move in earnest to the cloud until a little
bit after COVID.
168
00:10:54,409 --> 00:11:00,569
So, again, I beat my head against the wall, kind of like you did talking about AI.
169
00:11:00,769 --> 00:11:05,709
I did 22 roadshows in 2014 talking about the cloud.
170
00:11:05,765 --> 00:11:12,824
22 different cities and everybody's like, yeah, this is great and did absolutely nothing
for another eight years.
171
00:11:12,986 --> 00:11:13,486
So.
172
00:11:13,486 --> 00:11:18,639
in some ways the AI situation was even more crazy in that cloud clearly did work.
173
00:11:18,639 --> 00:11:21,581
You could demonstrate it, but they still didn't want it.
174
00:11:21,581 --> 00:11:27,184
Well, I don't know about that — maybe cloud is actually more crazy — but I suppose to some
degree you could actually not blame the lawyers much, because they were looking at the
175
00:11:27,184 --> 00:11:31,716
early NLP tools and going, man, this looks kind of complicated.
176
00:11:31,716 --> 00:11:33,608
I'm not sure I want to use this stuff.
177
00:11:33,608 --> 00:11:39,972
And then you've got all the economic, macroeconomic, cultural, professional aspects that
kind of stopped it.
178
00:11:39,972 --> 00:11:42,653
But yeah, I mean, it's
179
00:11:42,745 --> 00:11:43,866
I mean, this is the thing, isn't it?
180
00:11:43,866 --> 00:11:52,333
I mean, you cannot kill a great idea, but you can delay it for an enormous, enormous amount of time, right?
181
00:11:52,333 --> 00:11:59,659
I mean, you see this in politics, you see this playing out across all kinds of different terrains in our society.
182
00:11:59,659 --> 00:12:01,201
You cannot kill a great idea.
183
00:12:01,201 --> 00:12:04,904
People just instinctively look at it, they compare it to their own experiences.
184
00:12:04,904 --> 00:12:07,626
Maybe it's even just like, you know, gut instinct, right?
185
00:12:07,626 --> 00:12:08,277
It's intuition.
186
00:12:08,277 --> 00:12:10,625
They just go, that's right, that is right.
187
00:12:10,625 --> 00:12:12,785
And they're like, okay, let's have it.
188
00:12:13,125 --> 00:12:14,125
And then it doesn't happen.
189
00:12:14,125 --> 00:12:15,545
And then it doesn't happen.
190
00:12:15,545 --> 00:12:17,685
Or someone tries it, and it doesn't really work very well.
191
00:12:17,685 --> 00:12:24,725
And then all the critics and the cynics start jumping on and going, ah, ha, ha, you were a
fool to believe there is hope, young Skywalker.
192
00:12:24,785 --> 00:12:28,925
ha, ha, you know, come to the dark side and enjoy the empire.
193
00:12:28,925 --> 00:12:31,185
It's much nicer over here.
194
00:12:31,365 --> 00:12:34,025
We get much better food, et cetera, et cetera.
195
00:12:34,025 --> 00:12:34,876
The office is lovely.
196
00:12:34,876 --> 00:12:39,062
And people give up, or they just get disbanded and move off and
197
00:12:39,062 --> 00:12:41,875
focus on something that does work, right?
198
00:12:41,875 --> 00:12:45,377
Like, I don't know, Bitcoin.
199
00:12:45,495 --> 00:12:46,566
That's what I was going to say.
200
00:12:46,566 --> 00:12:53,349
Digital currency is another great example of an idea that's been lingering now for almost
20 years.
201
00:12:53,349 --> 00:12:54,029
It was 2008-ish.
202
00:12:54,029 --> 00:12:55,368
But it makes people money.
203
00:12:55,368 --> 00:12:56,020
I mean, that's the thing.
204
00:12:56,020 --> 00:13:01,475
I mean, you know, you can step away from the whole intellectual stuff if you want to and just say, look, does it make money?
205
00:13:01,816 --> 00:13:03,297
If it does make money?
206
00:13:05,659 --> 00:13:06,579
It does.
207
00:13:06,940 --> 00:13:08,894
Well, this is the weird thing with crypto.
208
00:13:08,894 --> 00:13:17,349
I mean, unless we have a collapse of all fiat currencies and we end up in a sort of Mad Max world, cryptocurrencies actually have no real functional value.
209
00:13:17,349 --> 00:13:18,737
They are simply
210
00:13:18,737 --> 00:13:26,284
gigantic momentum trades, and the momentum is constantly up. It flips around a bit and then keeps going up again.
211
00:13:26,284 --> 00:13:28,008
But fundamentally, there's nothing there.
212
00:13:28,008 --> 00:13:29,353
There is literally nothing there.
213
00:13:29,353 --> 00:13:30,798
You could argue the same about gold.
214
00:13:30,798 --> 00:13:34,666
Gold's value is driven primarily by scarcity.
215
00:13:34,944 --> 00:13:37,686
True. But it has a tangible value.
216
00:13:37,686 --> 00:13:42,800
I mean, you can pull out a lump of gold and people go, gold.
217
00:13:42,881 --> 00:13:45,464
People will always want gold.
218
00:13:45,464 --> 00:13:51,919
The same way that people will always want kebabs or burgers or, you know, it's just one of
those things, right?
219
00:13:51,919 --> 00:13:52,671
People like it.
220
00:13:52,671 --> 00:13:59,848
And so, whereas cryptocurrency really is just a momentum trade, maybe that momentum will
keep going forever and ever.
221
00:14:00,184 --> 00:14:06,597
Right, rather like a YouTube channel that just keeps on accumulating views and followers.
222
00:14:06,757 --> 00:14:07,758
It'll never, never stop.
223
00:14:07,758 --> 00:14:10,199
It just goes up and up and up and up.
224
00:14:10,199 --> 00:14:17,802
But if it ever did flip, like really flip, not just fluctuate, but completely flip and go into reverse, there's nothing there. You can't cash it in at the end.
225
00:14:18,174 --> 00:14:19,543
You go, what have I got left?
226
00:14:19,543 --> 00:14:20,935
I've got three electrons.
227
00:14:20,935 --> 00:14:22,160
Are they worth anything?
228
00:14:22,160 --> 00:14:24,396
It's like, no, I'm sorry, your electrons are not really worth anything.
229
00:14:24,396 --> 00:14:26,178
You've got a collection of prime numbers.
230
00:14:26,178 --> 00:14:35,747
But the underlying technology of blockchain provides tremendous opportunity and utility.
231
00:14:35,747 --> 00:14:37,468
And there's been a lot of talk in legal.
232
00:14:37,468 --> 00:14:39,490
It really hasn't gotten anywhere.
233
00:14:39,490 --> 00:14:47,014
The asynchronous ledger, the distributed ledger concept, is an interesting one in recording.
234
00:14:47,014 --> 00:14:47,674
It is, it is.
235
00:14:47,674 --> 00:14:50,085
I mean, you know, David Fisher, you know, Integra Ledger.
236
00:14:50,085 --> 00:14:52,997
He's been pushing that ever since I got started.
237
00:14:52,997 --> 00:15:00,020
And he still has a lot of people who really believe in it, you know, but the problem is, for it to really be useful...
238
00:15:00,020 --> 00:15:05,483
My personal view is you need a scenario where there's no trust, where trust is completely
broken down.
239
00:15:05,483 --> 00:15:08,835
And most commercial lawyers don't operate in such a system.
240
00:15:08,835 --> 00:15:11,956
Probably the reason is because if trust had completely,
241
00:15:11,956 --> 00:15:19,960
utterly, irrevocably broken down and no one could trust each other and nothing could be real, it'd be quite hard to be a lawyer, because you've lost the rules-based system that underpins
242
00:15:19,960 --> 00:15:20,930
all of that.
243
00:15:21,711 --> 00:15:23,191
You're in Mad Max.
244
00:15:23,211 --> 00:15:30,294
And at that point, cryptocurrency and blockchain and everything really makes absolute
sense because you cannot trust anybody to do the right thing.
245
00:15:30,486 --> 00:15:31,366
Yeah.
246
00:15:31,507 --> 00:15:42,519
Yeah, we could go down a rabbit hole with cryptocurrency, but let's talk about yours. So you've got a couple of events coming up that I wanted to make people aware
247
00:15:42,519 --> 00:15:48,815
of because your theme aligns really well with the theme of this podcast, which is legal
innovation.
248
00:15:48,815 --> 00:15:53,189
So, I think you have events coming up in New York and London.
249
00:15:53,189 --> 00:15:54,960
Tell us a little bit about those.
250
00:15:55,078 --> 00:16:00,580
Yeah, so just briefly, so Legal Innovators was a conference that grew out of Artificial
Lawyer.
251
00:16:00,580 --> 00:16:06,044
It's organized by another company called Cosmonauts, but I kind of co-created it with them
and I chair the events.
252
00:16:06,044 --> 00:16:07,982
London, we've been doing for years and years.
253
00:16:07,982 --> 00:16:09,515
It's quite a big conference now.
254
00:16:09,515 --> 00:16:12,868
We're hoping to get around about a thousand people over three days.
255
00:16:12,868 --> 00:16:17,641
Law firm day, in-house day, litigation day, which is gonna be fun, first time we've done
that.
256
00:16:17,641 --> 00:16:21,062
So that's the fourth, fifth and sixth of November in London.
257
00:16:21,184 --> 00:16:23,846
And then we're going to be doing New York for the very first time.
258
00:16:23,846 --> 00:16:28,100
We've been doing California for years and years in San Francisco, which is brilliant fun.
259
00:16:28,100 --> 00:16:28,427
I love it.
260
00:16:28,427 --> 00:16:31,894
It's probably my favorite event, but we're doing New York and it's going to be very interesting.
261
00:16:31,894 --> 00:16:42,883
Now, the reason why I delayed doing New York for so long is because I didn't feel that the big firms there, who really rule the kingdom, you know, were really getting into AI
262
00:16:42,883 --> 00:16:49,290
and automation and rethinking business sufficiently to really make it an exciting event.
263
00:16:49,290 --> 00:16:51,731
I mean, Cleary was an outlier early on.
264
00:16:51,731 --> 00:16:54,833
They created Cleary X, did a lot of interesting work.
265
00:16:54,833 --> 00:16:57,073
But now things are changing.
266
00:16:57,073 --> 00:16:57,544
Definitely.
267
00:16:57,544 --> 00:17:04,067
I mean, when I was in New York for Legal Week earlier this year, I spoke to a lot of people, and I got the feeling that things are shifting.
268
00:17:04,067 --> 00:17:07,268
You know, and there was some really good stuff going on at Gunderson as well, for example.
269
00:17:07,268 --> 00:17:12,360
It is, it is, but in their New York office.
270
00:17:12,360 --> 00:17:13,170
It's true.
271
00:17:13,170 --> 00:17:15,980
But let's face it, Latham & Watkins is a West Coast firm as well.
272
00:17:15,980 --> 00:17:17,604
You know, they're doing some interesting things.
273
00:17:17,604 --> 00:17:21,344
So it's kind of like, I think it's come of age finally.
274
00:17:21,644 --> 00:17:32,144
I mean, you expect Wilson Sonsini and Cooley and others to have created startups and have really huge innovation teams, doing all kinds of fun stuff, working very closely
275
00:17:32,144 --> 00:17:33,484
with AI companies.
276
00:17:33,804 --> 00:17:39,584
New York firms, I think, were slightly behind. They may have brought in a lot of people around knowledge management.
277
00:17:39,584 --> 00:17:41,044
They certainly did.
278
00:17:41,224 --> 00:17:44,884
But this idea that, hey, AI is actually gonna change the game for us.
279
00:17:44,884 --> 00:17:46,980
You know, just a little.
280
00:17:46,980 --> 00:17:48,000
Just a little.
281
00:17:48,161 --> 00:17:54,997
Even though people may have talked a good talk, I didn't really feel that it was actually happening until very recently.
282
00:17:54,997 --> 00:17:58,412
And I think it is actually starting to happen though.
283
00:17:58,412 --> 00:17:58,842
So.
284
00:17:58,842 --> 00:17:59,482
It is.
285
00:17:59,482 --> 00:18:04,415
And I think a leading indicator is the one you alluded to, which is some of the hiring.
286
00:18:04,415 --> 00:18:05,916
I mean, look at Simpson Thacher.
287
00:18:05,916 --> 00:18:09,388
I mean, that's the epitome of a New York white shoe law firm.
288
00:18:09,388 --> 00:18:22,315
You know, they brought in Liz Grennan from McKinsey, and they've made some other hires. Again, that's how, from the outside... if they're a client, I've got an inside view, but
289
00:18:22,315 --> 00:18:23,936
for firms who
290
00:18:24,289 --> 00:18:26,071
we don't do business with at InfoDash,
291
00:18:26,071 --> 00:18:30,694
I look at their org chart and say, what moves are they making there?
292
00:18:30,694 --> 00:18:39,762
And you know, somebody like Liz isn't going to come over from McKinsey without the firm having a high level of commitment.
293
00:18:39,762 --> 00:18:46,988
Because she's also an attorney, and Connor Grennan's wife, who, you know, is a big thought leader in the AI space.
294
00:18:46,988 --> 00:18:52,192
I think he's at NYU law school, or, I'm sorry, not law school,
295
00:18:52,676 --> 00:18:54,317
or maybe business school.
296
00:18:54,458 --> 00:19:04,788
So, you know, I see people who I know are not going to move unless the firm has demonstrated a commitment and articulated some sort of a plan to move in a certain
297
00:19:04,788 --> 00:19:05,889
direction.
298
00:19:05,889 --> 00:19:10,815
And that tells me again, that's a leading indicator that they're starting to move in that
direction.
299
00:19:10,815 --> 00:19:11,976
uh
300
00:19:11,976 --> 00:19:14,396
Yeah, we shouldn't get carried away.
301
00:19:14,396 --> 00:19:23,747
I mean, while organizations like Salesforce out in California literally have their own
team that builds AI products for their own use, right?
302
00:19:23,747 --> 00:19:31,567
You know, and some of the banks and some of the private equity funds are getting
definitely very interested in what AI can do for them and doing stuff internally as well.
303
00:19:31,567 --> 00:19:35,767
And some of the big consultancies too, have big bases in New York as well.
304
00:19:35,767 --> 00:19:36,987
You know,
305
00:19:37,215 --> 00:19:41,158
Are the big New York firms really disrupting themselves?
306
00:19:41,158 --> 00:19:43,769
I don't think yet, and maybe not for a while.
307
00:19:43,769 --> 00:19:49,203
But as you say, they're genuinely thinking about it now and they're exploring and
experimenting.
308
00:19:49,203 --> 00:19:53,546
They're bringing in these gen AI productivity platforms and a whole bunch of other tools.
309
00:19:53,546 --> 00:19:57,588
It's going much, much further than knowledge management.
310
00:19:57,589 --> 00:20:01,431
It's not just like, okay, well make my life a little bit easier.
311
00:20:01,672 --> 00:20:05,455
Help me find the information I need so I can go off and do the deal that I need to do.
312
00:20:05,455 --> 00:20:06,605
It's more...
313
00:20:06,605 --> 00:20:16,459
now... well, not more, but in addition now, they're saying, okay, you know what, this may actually alter our business model, at a very small level to begin with.
314
00:20:16,459 --> 00:20:18,100
But that, it's all relative, isn't it?
315
00:20:18,100 --> 00:20:18,881
It's all relative.
316
00:20:18,881 --> 00:20:24,454
You know, that in itself, the fact that they're even asking that question, I think is a big deal.
317
00:20:24,454 --> 00:20:24,784
Yeah.
318
00:20:24,784 --> 00:20:32,432
Well, and let me ask you something along those lines when we talk about how law firms move
in that direction.
319
00:20:32,432 --> 00:20:44,122
And by that direction, I mean, you know, making fundamental changes to foundational pieces of their business: legal service delivery, pricing, internal firm compensation, client
320
00:20:44,122 --> 00:20:45,022
engagement.
321
00:20:45,022 --> 00:20:51,267
These are foundational elements that are very firmly entrenched in law firms.
322
00:20:51,431 --> 00:20:54,733
And they have to change, and they're going to change.
323
00:20:54,733 --> 00:21:00,896
Whether law firms like it or not, it's just at what speed and how.
324
00:21:00,896 --> 00:21:04,477
I think one of the biggest ones is internal firm compensation.
325
00:21:04,477 --> 00:21:10,060
The way lawyers are ruthlessly measured on billable hours today, that is so deeply entrenched.
326
00:21:10,060 --> 00:21:16,133
I've talked to some leaders of innovation councils at some pretty big firms and they have
cited that
327
00:21:16,255 --> 00:21:28,775
particular cultural element of the firm is one of the hardest to move the dial on. Because, again, AI really changes internal firm compensation: when pricing
328
00:21:28,775 --> 00:21:31,917
changes, internal firm compensation changes, right?
329
00:21:31,917 --> 00:21:43,555
If you're in an AFA scenario, where the firm is taking on risk, revenue in the door doesn't equate to profitability like it does today.
330
00:21:43,933 --> 00:21:46,891
You have to manage risk and efficiency.
331
00:21:46,984 --> 00:21:48,015
It's an interesting one, isn't it?
332
00:21:48,015 --> 00:21:52,658
Because, you know, I don't think there's anything wrong in using time as an internal metric.
333
00:21:52,658 --> 00:21:57,031
I don't think it's the best metric internally, but if you want to use it, okay, fine.
334
00:21:57,031 --> 00:22:04,577
At least you've got some kind of yardstick. You know, there's Bob, he's only worked 20 hours this week.
335
00:22:04,577 --> 00:22:08,810
Ooh, you know, there's Jane, she's worked 60 hours or a hundred hours this week.
336
00:22:08,810 --> 00:22:09,251
She's great.
337
00:22:09,251 --> 00:22:10,031
Okay.
338
00:22:10,031 --> 00:22:10,922
Whatever.
339
00:22:11,534 --> 00:22:18,280
That kind of thinking is as old as time. But what I think is an issue is judging value by time.
340
00:22:18,380 --> 00:22:23,073
So, you know, here's Bob and Jane, they spend all weekend going through these documents,
aren't they wonderful?
341
00:22:23,073 --> 00:22:26,366
That'll be, you know, $25,000, please.
342
00:22:26,366 --> 00:22:29,487
Oh, no, that AI system did it in three minutes.
343
00:22:29,848 --> 00:22:30,879
Now what are we gonna do?
344
00:22:30,879 --> 00:22:32,583
It's nuts, I personally think.
345
00:22:32,583 --> 00:22:36,918
So yeah, keep the hours internally, but externally, you're gonna have to find a better method.
346
00:22:36,918 --> 00:22:38,080
Yeah, indeed.
347
00:22:38,080 --> 00:22:52,148
You know, what are your thoughts on... I've had this debate a lot, and I would say from the people I've talked to, it's pretty split, maybe 60-40 in favor, and 40% think
348
00:22:52,148 --> 00:22:54,870
that this can't happen, but I'm a believer.
349
00:22:54,870 --> 00:23:04,099
I think that law firms are actually going to have to leverage their data in a meaningful way as part of their AI strategy
350
00:23:04,099 --> 00:23:11,239
and invest in R&D, which law firms historically have not done well, or haven't done at all, for a couple of reasons.
351
00:23:11,239 --> 00:23:16,539
One, the partnership model is optimized for profit-taking at the end of the year.
352
00:23:16,539 --> 00:23:23,039
It's hard to accrue capital; capital expenditure doesn't fit into that cash-basis partnership model.
353
00:23:23,039 --> 00:23:27,459
Well, but also, they're not software companies.
354
00:23:27,459 --> 00:23:29,979
They're not used to building software.
355
00:23:30,239 --> 00:23:32,879
And I don't know, what do you think?
356
00:23:32,879 --> 00:23:33,899
Yeah.
357
00:23:34,432 --> 00:23:42,458
I think, you know, if law firms want to build stuff, go for it. You know, the same way that someone might say to me, well, you don't have to buy that piece of technology to improve
358
00:23:42,458 --> 00:23:43,439
Artificial Lawyer.
359
00:23:43,439 --> 00:23:44,539
Why don't you build it yourself?
360
00:23:44,539 --> 00:23:45,454
It's really easy.
361
00:23:45,454 --> 00:23:48,702
You know, it's very, very, you know, paint by numbers.
362
00:23:48,983 --> 00:23:50,344
I might do it.
363
00:23:50,344 --> 00:23:54,738
I'll be honest, I'm probably more minded just to buy it and let someone else deal with it.
364
00:23:54,738 --> 00:23:57,250
I don't know, that for me is not the issue.
365
00:23:57,250 --> 00:24:03,626
The issue is, can they start to think in a way where AI is the leverage?
366
00:24:03,626 --> 00:24:13,998
And I just want to bring in a law firm, an AI hybrid law firm, which I think is great, called Covenant, based in New York, created by the ex-general counsel of WeWork
367
00:24:13,998 --> 00:24:15,443
and her partner.
368
00:24:15,443 --> 00:24:17,065
And it's a great model.
369
00:24:17,065 --> 00:24:17,665
It's a great model.
370
00:24:17,665 --> 00:24:21,408
They have a tech company, which is the AI bit.
371
00:24:21,408 --> 00:24:25,452
And they have a law firm, which has got multiple lawyers in it, not a huge number at the
moment, just a small group.
372
00:24:25,452 --> 00:24:28,425
And they do private equity investments.
373
00:24:28,425 --> 00:24:30,887
As I understand it.
374
00:24:30,887 --> 00:24:31,769
And on the buy side.
375
00:24:31,769 --> 00:24:34,142
And there's a license agreement between the two.
376
00:24:34,142 --> 00:24:39,648
So they don't actually have to go away to Arizona, you know, to do this
377
00:24:39,648 --> 00:24:42,519
under special bar rules, and it's just great.
378
00:24:42,519 --> 00:24:51,171
Then we were talking, we did this great video together, and Jen just came up with this phrase, it just came out in the conversation, and she said, you know, AI is the leverage.
379
00:24:51,331 --> 00:25:02,154
And I just thought, man, I wish I'd coined that phrase 10 years ago, because for me, it sums up everything that Artificial Lawyer has been trying to say ever since I got started,
380
00:25:02,154 --> 00:25:05,435
that AI is the means of production.
381
00:25:05,635 --> 00:25:05,885
Right.
382
00:25:05,885 --> 00:25:08,636
Now there are some things which aren't sort of productivity based.
383
00:25:08,636 --> 00:25:16,396
You know, the classic cliché: it's Martin Lipton, you know, sitting on the sofa thinking about the poison pill, and he has a eureka moment.
384
00:25:16,396 --> 00:25:18,356
He goes, aha, I've got it.
385
00:25:18,356 --> 00:25:18,576
All right.
386
00:25:18,576 --> 00:25:21,816
That's not a productivity play, but a due diligence report.
387
00:25:21,816 --> 00:25:23,296
That's a productivity play.
388
00:25:23,336 --> 00:25:24,336
A lot of legal research.
389
00:25:24,336 --> 00:25:24,656
Okay.
390
00:25:24,656 --> 00:25:27,956
There's a degree of sophistication around that, but a lot of it is bulk.
391
00:25:28,076 --> 00:25:29,856
E-discovery, that's bulk.
392
00:25:29,856 --> 00:25:30,516
Okay.
393
00:25:30,516 --> 00:25:34,476
Other intelligent insights and knowledge, we can make that more sophisticated.
394
00:25:34,476 --> 00:25:34,796
Better.
395
00:25:34,796 --> 00:25:35,656
Yeah, obviously.
396
00:25:35,656 --> 00:25:37,148
But there's a ton of bulk there.
397
00:25:37,148 --> 00:25:39,230
So that all comes down to productivity, right?
398
00:25:39,230 --> 00:25:45,353
Anything that relates to productivity, you throw at the leverage, i.e. the associate body, right?
399
00:25:45,353 --> 00:25:49,515
And the way we need to think is AI is the leverage.
400
00:25:49,595 --> 00:25:56,789
And for me, that's the tipping point: once the big established law firms start to look at Covenant and go, you know what?
401
00:25:56,789 --> 00:25:58,900
That's actually the best model we've ever seen.
402
00:25:59,221 --> 00:26:05,886
And the beautiful thing about it is they don't even have to get into complicated bar
rules, right?
403
00:26:05,886 --> 00:26:11,159
They don't even have to do that because if they structure it in the right way, they can do
this anyway.
404
00:26:11,159 --> 00:26:18,702
And besides, even if they don't want to do that fancy structure, they can just buy in the technology as a license holder.
405
00:26:18,863 --> 00:26:26,206
I mean, again, it goes back to this earlier conversation: the great idea is already here.
406
00:26:26,206 --> 00:26:27,798
The question is, will they do it?
407
00:26:27,798 --> 00:26:28,329
Right.
408
00:26:28,329 --> 00:26:29,289
Yeah.
409
00:26:29,289 --> 00:26:36,509
Did you hear... I'm assuming you read the story about Burford Capital and the MSO model there?
410
00:26:36,509 --> 00:26:37,729
It's a private equity firm.
411
00:26:37,729 --> 00:26:39,609
It was in the Financial Times.
412
00:26:39,727 --> 00:26:46,599
I mean, litigation financiers are doing pretty interesting stuff.
413
00:26:46,599 --> 00:26:47,780
Yeah.
414
00:26:47,780 --> 00:26:59,755
They essentially stand up these managed services organizations and deploy private capital
into those, which really kind of take over some business of law functions.
415
00:26:59,755 --> 00:27:01,165
And it's a vehicle.
416
00:27:01,165 --> 00:27:05,758
It's an investment vehicle for private capital to get deployed in the US legal market.
417
00:27:05,758 --> 00:27:10,000
And it feels kind of a little bit like a back door at the moment.
418
00:27:10,000 --> 00:27:12,521
If things were
419
00:27:12,927 --> 00:27:20,743
If we had the ABS rules like we do in Arizona across the country, I'm not so sure that would be the ideal structure.
420
00:27:20,764 --> 00:27:26,348
But one of the things I wanted to get your take on is I talk about this a lot.
421
00:27:26,588 --> 00:27:32,073
Law firms today... I spent 10 years at Bank of America, in mostly risk management roles.
422
00:27:32,073 --> 00:27:41,298
So I was in consumer risk, then I moved into global treasury, then I went to corporate
audit, then I was in anti-money laundering and then compliance.
423
00:27:41,298 --> 00:27:51,203
Really interesting work, but I got a really good sense for just how much rigor there is around risk management in financial services, for obvious reasons.
424
00:27:51,544 --> 00:27:54,645
And looking at law firms, they have none.
425
00:27:54,965 --> 00:28:00,297
There's literally nothing between the lawyer and his or her client, right?
426
00:28:00,297 --> 00:28:05,750
It's: I make requests, the lawyer delivers work product or advice.
427
00:28:05,750 --> 00:28:06,870
There's no
428
00:28:07,018 --> 00:28:17,229
layers. And when we get into a scenario where we start to talk about automation in legal, you know, tech-enabled legal service delivery, we have to start thinking about risk
429
00:28:17,229 --> 00:28:18,230
management.
430
00:28:18,230 --> 00:28:20,293
And what does that look like in a law firm?
431
00:28:20,293 --> 00:28:25,197
It would be lawyers listening to some risk manager saying, you can't do that.
432
00:28:25,298 --> 00:28:27,661
That's not going to go over very well, but I don't know.
433
00:28:27,661 --> 00:28:28,612
How do you think about
434
00:28:28,612 --> 00:28:29,313
Two different things.
435
00:28:29,313 --> 00:28:36,418
I think one, you sort of indirectly touched on an absolutely key point, which is the lawyers are the means of production, right?
436
00:28:36,418 --> 00:28:38,030
They are the center of the universe, right?
437
00:28:38,030 --> 00:28:43,966
We do not live in the Copernican system, right?
438
00:28:43,966 --> 00:28:46,348
You know, lawyers are the center of the universe.
439
00:28:46,348 --> 00:28:51,372
They do not revolve around process and product and so forth.
440
00:28:51,372 --> 00:28:51,792
Right?
441
00:28:51,792 --> 00:28:53,053
The lawyers,
442
00:28:53,413 --> 00:28:54,514
control everything.
443
00:28:54,514 --> 00:28:56,315
They do the final sign off.
444
00:28:56,315 --> 00:28:58,278
They do the interaction with the clients.
445
00:28:58,278 --> 00:28:59,898
They are the workflow.
446
00:28:59,979 --> 00:29:02,702
They have a mind, they have a hand, they have a whole machine.
447
00:29:02,702 --> 00:29:07,529
And it's only when we start to think about automation in the true sense.
448
00:29:07,529 --> 00:29:09,551
So you don't do that anymore.
449
00:29:09,731 --> 00:29:11,614
The machine does that.
450
00:29:11,614 --> 00:29:13,875
Whether we're talking about agentic flows or anything else.
451
00:29:14,195 --> 00:29:15,375
That's when things change.
452
00:29:15,375 --> 00:29:17,675
And I think that's when it comes to your point about risk.
453
00:29:17,675 --> 00:29:25,195
I mean, as it currently stands, it doesn't make much difference, because, as far as I understand, anything that goes out of the door of a law firm has to be signed off
454
00:29:25,195 --> 00:29:26,546
and is under their umbrella.
455
00:29:26,546 --> 00:29:32,937
What'll be interesting is as we start to build truly automated workflows, from start to
finish.
456
00:29:32,937 --> 00:29:37,917
Now it may be a very, very narrow workflow, but they will grow.
457
00:29:37,917 --> 00:29:38,927
They will grow.
458
00:29:38,927 --> 00:29:41,979
It's going to get more and more powerful.
459
00:29:41,979 --> 00:29:44,220
That's when I think the whole risk and insurance thing comes in.
460
00:29:44,220 --> 00:29:47,762
But even so, you could argue that law firms still have it under their umbrella.
461
00:29:47,762 --> 00:29:59,226
And it will be down to the law firm or any consultants they can bring in to do pen testing
effectively, to make sure that it works completely fine.
462
00:29:59,226 --> 00:30:03,647
But yeah, for me, this has always been the battle in the...
463
00:30:03,829 --> 00:30:06,581
And it's totally understandable, because I probably would do the same.
464
00:30:06,581 --> 00:30:13,248
Most lawyers, most professionals, they see technology and they go, great, how can that add
to what I do already?
465
00:30:13,348 --> 00:30:18,292
How can that finesse or take a little bit of a bother out of my life?
466
00:30:18,412 --> 00:30:22,116
All right, they're the center of the universe, right?
467
00:30:22,116 --> 00:30:27,380
If that is all we do with AI now, then nothing's going to change at all.
468
00:30:27,380 --> 00:30:32,564
It goes back to, I don't know if we've covered this point before, the Ikea catalog situation,
469
00:30:32,588 --> 00:30:38,451
where you get your various shelving units and cushions and rugs and throws and all of this kind of stuff.
470
00:30:38,451 --> 00:30:43,514
And it's very pretty and it's very nice and it greatly increases the comfort of that
person.
471
00:30:43,514 --> 00:30:44,184
And why not?
472
00:30:44,184 --> 00:30:47,292
People like to be comfortable, but it doesn't fundamentally change anything.
473
00:30:47,292 --> 00:30:50,318
You're not Le Corbusier completely redesigning the building.
474
00:30:50,318 --> 00:30:56,181
You don't change your one bedroom flat into a machine for living, as Le Corbusier said.
475
00:30:56,181 --> 00:31:00,803
We're fundamentally still in the same world with some decorations from Ikea, right?
476
00:31:00,803 --> 00:31:02,804
Bought out of the catalog and then installed.
477
00:31:02,841 --> 00:31:07,102
Things only change once you start to automate whole streams.
478
00:31:07,422 --> 00:31:17,548
And I think this is incredibly difficult for professionals, particularly lawyers, to get their heads around, because it's just like, yes, you are not
479
00:31:17,548 --> 00:31:20,609
gonna own everything any longer.
480
00:31:21,029 --> 00:31:26,532
You might be able to own the output and make money from it, but you will not own those
workflows, right?
481
00:31:26,532 --> 00:31:28,572
And this is not like an assistant.
482
00:31:28,572 --> 00:31:30,467
I was thinking about how to...
483
00:31:30,467 --> 00:31:33,840
explain this in a very simple diagram for a thing I've got to do next week.
484
00:31:33,840 --> 00:31:38,645
So basically think about a partner, a partner has assistants, right?
485
00:31:38,645 --> 00:31:41,466
As they were kind of originally called rather than associates, right?
486
00:31:41,466 --> 00:31:46,720
The assistant assists the partner in the things that the partner wants to achieve, right?
487
00:31:46,720 --> 00:31:50,211
So the partner says, right, I need you to go and look at that law book, go.
488
00:31:50,211 --> 00:31:52,612
I need you to proof that contract, go.
489
00:31:52,612 --> 00:31:55,883
You go and pick me up a sandwich, go, right?
490
00:31:55,883 --> 00:31:57,834
They're assistants, right?
491
00:31:57,840 --> 00:32:06,157
They may contribute meaningful aspects to the end product, but fundamentally they're
completely under the command of this one person, right?
492
00:32:06,157 --> 00:32:08,019
That one person is the center of their universe, right?
493
00:32:08,019 --> 00:32:15,136
But in true automation and sort of an agentic flow, there is simply a person at one end of
it, because it's all pre-built, right?
494
00:32:15,136 --> 00:32:19,168
Even if it can go through iterations and you can have some feedback.
495
00:32:19,168 --> 00:32:24,583
But ultimately, there's one person at the end of that line and there's another person at
the beginning of it.
496
00:32:24,583 --> 00:32:27,043
And that's it, right?
497
00:32:27,263 --> 00:32:31,474
You know, there is no assistant in this thing, right?
498
00:32:31,474 --> 00:32:36,174
It's a machine that makes something that comes out of the other end, to put it really
simply.
499
00:32:36,474 --> 00:32:46,274
And I think this is very, very difficult, because it's the clash, and this is something that Susskind wrote about 25 million years ago, the battle between the artisanal and
500
00:32:46,274 --> 00:32:47,414
the automation.
501
00:32:47,634 --> 00:32:52,642
When I started doing Artificial Lawyer, I chose... I prefer the term, like,
502
00:32:52,642 --> 00:32:59,002
you know, industrial, because I grew up in the Midlands in Britain, which is where parts
of the industrial revolution started.
503
00:32:59,002 --> 00:33:05,533
And I've always thought about, you know, the industrial revolution and how machines
changed everything.
504
00:33:05,533 --> 00:33:10,193
And for me, it it just seemed like so obvious, like this, this will happen to professional
services.
505
00:33:10,193 --> 00:33:17,293
And I think, you know, going back to the early beginnings of NLP and machine
learning, people could see it making some small inroads, but if you said, like,
506
00:33:17,293 --> 00:33:20,601
this is going to spread across the entire sector.
507
00:33:20,601 --> 00:33:22,641
No way, no way, it's not gonna happen.
508
00:33:22,741 --> 00:33:29,861
Now with large language models and language understanding and agentic flows, which are
getting better like by the minute, right?
509
00:33:30,241 --> 00:33:32,441
It actually is doable now.
510
00:33:32,701 --> 00:33:33,861
I think it really is doable.
511
00:33:33,861 --> 00:33:35,241
Do we have accuracy problems?
512
00:33:35,241 --> 00:33:37,021
Yes, accuracy is still a problem.
513
00:33:37,021 --> 00:33:44,492
If you automate an inaccurate system, you just mass produce bad goods, right?
514
00:33:44,492 --> 00:33:47,800
You know, like if you've got like a lathe in your garage,
515
00:33:47,800 --> 00:33:52,380
and you're turning out widgets and you've got something wrong with the, with the tool bit.
516
00:33:52,380 --> 00:33:56,454
All you're going to do is turn out a thousand widgets that you can't use.
517
00:33:56,454 --> 00:34:01,887
Yeah, but, I mean, okay, look at Waymo, right?
518
00:34:01,887 --> 00:34:06,049
If I, if I stick a Waymo on the streets and it's got something wrong with it, you're going
to kill someone.
519
00:34:06,049 --> 00:34:08,140
I mean, literally you can kill someone, right?
520
00:34:08,140 --> 00:34:09,871
It's really serious stuff.
521
00:34:09,871 --> 00:34:12,723
They've managed to get on top of it.
522
00:34:12,723 --> 00:34:14,207
And now Waymo,
523
00:34:14,207 --> 00:34:17,232
from the trial data, is actually safer than human drivers.
524
00:34:17,232 --> 00:34:22,224
I heard that it reduces side impact collisions by, like, more than 95%.
525
00:34:22,224 --> 00:34:32,630
So from, you know, going from a hundred thousand a year to 5,000 a year, you know, because
people don't look both ways before they cross an intersection.
526
00:34:32,630 --> 00:34:34,391
You see a green light, you go.
527
00:34:34,391 --> 00:34:39,564
These vehicles have the ability to, you know, they have uh adjacent visibility.
528
00:34:39,564 --> 00:34:39,975
So.
529
00:34:39,975 --> 00:34:45,146
I was in one in San Francisco, you know, for Legal Innovators California, and it was
absolutely brilliant.
530
00:34:45,146 --> 00:34:46,506
Really fantastic.
531
00:34:46,506 --> 00:34:49,426
Shout out to Todd Smithline for paying for it.
532
00:34:49,426 --> 00:34:50,357
Thank you for that, Todd.
533
00:34:50,357 --> 00:34:51,277
It was so good.
534
00:34:51,277 --> 00:34:51,857
It was so good.
535
00:34:51,857 --> 00:34:54,828
Honestly, I got out of that and I just went, I'm a believer.
536
00:34:54,828 --> 00:34:58,872
It's funny, I mean, it was funny having this debate with a bunch of people ahead
537
00:34:58,872 --> 00:35:08,732
of this thing last week, and they were talking about, you know, oh my God, it's so
good that, um, you know, you have human pilots on a plane, um, because they can rescue you
538
00:35:08,732 --> 00:35:09,692
if anything happens.
539
00:35:09,692 --> 00:35:12,563
And I was just like, well, why don't you just have an extra AI system?
540
00:35:12,563 --> 00:35:17,863
Have two, have three AI systems, one after the other, you know, double fail-safe.
541
00:35:18,283 --> 00:35:25,994
I mean, it's just humans hanging on to, um,
542
00:35:25,994 --> 00:35:27,045
you know, things they got used to.
543
00:35:27,045 --> 00:35:38,165
I mean, it's like, okay, imagine every time you go up in an elevator, right, there's a guy
who stands in the elevator with a bunch of cushions in case anything goes wrong, right?
544
00:35:38,165 --> 00:35:41,127
It's just like, you just accept it, you go with it, right?
545
00:35:41,127 --> 00:35:45,854
You know, because after, literally, how long, when did they invent elevators?
546
00:35:45,854 --> 00:35:47,232
100 years ago?
547
00:35:48,394 --> 00:35:53,911
Gotta be a hundred years now, since they started building skyscrapers, so since the 1920s, right?
548
00:35:53,911 --> 00:35:55,213
But they worked out the kinks.
549
00:35:55,213 --> 00:35:56,614
I mean, and they've done it.
550
00:35:56,614 --> 00:36:01,136
First thing with Waymo, when it started, everyone said, this is ridiculous.
551
00:36:01,136 --> 00:36:02,036
It won't work.
552
00:36:02,036 --> 00:36:03,627
And then it kept on not working.
553
00:36:03,627 --> 00:36:04,991
And then people said, there you go.
554
00:36:04,991 --> 00:36:05,818
You're a bunch of idiots.
555
00:36:05,818 --> 00:36:07,238
You've wasted loads of money.
556
00:36:07,238 --> 00:36:08,379
Ha ha ha.
557
00:36:08,379 --> 00:36:11,440
And Tesla didn't help because they kept on saying, hey, we've got a self-driving car.
558
00:36:11,440 --> 00:36:13,042
And it obviously wasn't a self-driving car.
559
00:36:13,042 --> 00:36:14,744
People were like, Elon, you're not helping.
560
00:36:14,744 --> 00:36:15,584
And...
561
00:36:16,456 --> 00:36:21,860
Now they have, but he could have maybe toned it down a few years ago before he really
actually had one.
562
00:36:21,860 --> 00:36:22,550
But they did it.
563
00:36:22,550 --> 00:36:23,388
They actually did it.
564
00:36:23,388 --> 00:36:27,835
I think that is, I don't understand why there aren't, like, parades down every street in
America.
565
00:36:27,835 --> 00:36:32,558
Because that is, that for me is as impressive as putting a man on the moon.
566
00:36:32,558 --> 00:36:33,738
It really is.
567
00:36:33,926 --> 00:36:41,926
You had somebody on your podcast recently that threw out a number that I'd love to bounce
off of you.
568
00:36:41,926 --> 00:36:54,166
Speaking of automation, you know, shortly after ChatGPT-3.5 was released, Goldman came
out and said, I think it was 44% of legal tasks were subject to automation by
569
00:36:54,166 --> 00:36:55,306
artificial intelligence.
570
00:36:55,306 --> 00:37:02,646
And you had somebody on that was talking about, by 2027, 80% of legal tasks could be
automated.
571
00:37:02,646 --> 00:37:03,399
And that
572
00:37:03,399 --> 00:37:05,199
that number seems really high.
573
00:37:05,199 --> 00:37:09,499
Actually, the 44% number still seems really high to me.
574
00:37:09,619 --> 00:37:10,991
In 2027 that
575
00:37:10,991 --> 00:37:16,411
well, I mean, we're going to end up in a sort of like big semantic thing around, you know,
what is a task?
576
00:37:16,411 --> 00:37:19,271
When, when does the task begin and end and so forth.
577
00:37:19,311 --> 00:37:30,091
I mean, if people want to listen to it, it's Richard Mabey, who's the CEO of Juro,
a legal tech company focused on in-house, and it's on my Law Punks podcast.
578
00:37:30,091 --> 00:37:38,771
If you just type Law Punks, P-U-N-X, not K-S, into Artificial Lawyer, you'll find it.
579
00:37:38,791 --> 00:37:39,639
Um,
580
00:37:40,127 --> 00:37:42,807
Yeah, I mean, broadly, I agree.
581
00:37:43,047 --> 00:37:46,627
Will more and more and more tasks be automatable?
582
00:37:46,627 --> 00:37:47,467
Yeah.
583
00:37:47,947 --> 00:37:51,107
Secondly, the issue is will they get automated?
584
00:37:51,107 --> 00:37:57,158
Well, given all the things we've just talked about, probably at the point of an economic
gun, right?
585
00:37:57,218 --> 00:37:58,578
So not anytime soon.
586
00:37:58,578 --> 00:37:59,938
Now for in-house, it is different.
587
00:37:59,938 --> 00:38:01,338
Obviously he's focused on in-house.
588
00:38:01,338 --> 00:38:06,738
So he's very much focused on what he's seeing day to day, and he
589
00:38:06,738 --> 00:38:08,090
really does believe
590
00:38:08,090 --> 00:38:10,010
there can be massive change very rapidly.
591
00:38:10,010 --> 00:38:18,361
And I think in-house probably does have, I'd hope so anyway, less of a barrier, but we've
been saying that for years, right?
592
00:38:18,361 --> 00:38:19,501
We've been saying that for years.
593
00:38:19,501 --> 00:38:27,392
In-house teams, they obviously are gonna defeat the billable hour because why would they
want it?
594
00:38:27,392 --> 00:38:28,252
But they do.
595
00:38:28,252 --> 00:38:35,264
Many law firms say, actually, we suggested to blah-blah general counsel at a large
corporate, we'll stop doing billable hours if you want.
596
00:38:35,264 --> 00:38:37,464
And the general counsel says, no, no, no, please keep doing it.
597
00:38:37,464 --> 00:38:40,655
It makes it much, much easier for me to, you know, keep track of things.
598
00:38:40,655 --> 00:38:42,534
So I don't know.
599
00:38:42,534 --> 00:38:43,775
I think for me it's the human aspect.
600
00:38:43,775 --> 00:38:45,575
I think technically he is right.
601
00:38:45,575 --> 00:38:47,615
I think technically he's absolutely right.
602
00:38:47,615 --> 00:38:49,035
Could it happen?
603
00:38:49,035 --> 00:38:50,835
Yes, probably.
604
00:38:51,175 --> 00:38:57,775
If we keep going the way we're going, will it happen because of human and economic factors
and cultural factors?
605
00:38:57,775 --> 00:38:59,595
I think, I hate to say it, probably not.
606
00:38:59,595 --> 00:39:02,575
My guess is, I think I might have said before on this program,
607
00:39:02,575 --> 00:39:06,606
my bet for real, true transformation is in about 12 years.
608
00:39:06,606 --> 00:39:07,966
Now you might say, well, that's terrible.
609
00:39:07,966 --> 00:39:10,666
That's a really long way off, but I'm talking like real, real transformation.
610
00:39:10,666 --> 00:39:11,646
So is it worth it?
611
00:39:11,646 --> 00:39:17,026
For example, if I said to you, there'll be no oil and gas in 12 years, we'll have a
completely green economy.
612
00:39:17,026 --> 00:39:22,886
There will be no more pollution, at least from, you know, the energy sector in 12 years.
613
00:39:22,886 --> 00:39:24,366
You'd be like, well, I don't think that's going to happen.
614
00:39:24,366 --> 00:39:26,046
But if it did, that would be amazing.
615
00:39:26,046 --> 00:39:27,606
It would completely transform the global economy.
616
00:39:27,606 --> 00:39:30,146
It would have geopolitical consequences.
617
00:39:30,366 --> 00:39:31,022
Right.
618
00:39:31,114 --> 00:39:36,816
Like, bye-bye Saudi Arabia, you know, they're gonna have to move pretty quickly into solar
power.
619
00:39:36,816 --> 00:39:38,497
It would have huge, huge effects.
620
00:39:38,497 --> 00:39:50,883
Now, I honestly do believe that in about 12 years that is gonna happen in professional
services, knowledge services, because if AI is where it is now, just
621
00:39:50,883 --> 00:39:59,458
less than three years after the launch of ChatGPT, when OpenAI was the only company
seriously investing
622
00:39:59,548 --> 00:40:05,428
in large language models. And look now, there are more than you can even poke a
stick at, right?
623
00:40:05,448 --> 00:40:07,199
And there's more every day.
624
00:40:07,199 --> 00:40:10,059
I mean, and look at the way it's building.
625
00:40:10,259 --> 00:40:12,659
So, like, look at Lovable, right?
626
00:40:12,659 --> 00:40:18,210
You just type into like a prompt box, build me a job website.
627
00:40:18,210 --> 00:40:20,178
20 minutes later, there it is.
628
00:40:20,178 --> 00:40:21,858
It's incredible.
629
00:40:21,858 --> 00:40:24,598
And then, you know, it's something I remember talking to Zach about ages ago.
630
00:40:24,598 --> 00:40:30,278
He was just saying, you know, like when the internet began, people just use it for very
basic search.
631
00:40:30,418 --> 00:40:33,318
And then you've got these like layers of an onion.
632
00:40:33,318 --> 00:40:36,318
They accreted, they built up and built up and built up.
633
00:40:36,318 --> 00:40:41,838
And now, you know, well, now the internet has become a source for LLMs.
634
00:40:41,838 --> 00:40:49,272
And then, through this bizarre sort of combination of data sources and LLMs
and agents and all this kind of thing, it's all
635
00:40:49,272 --> 00:40:57,747
merging together into this huge kind of ecosystem that surrounds us now in a way that we
couldn't even have imagined when the internet first appeared.
636
00:40:57,788 --> 00:41:04,912
And on top of all of that will be new understandings and new uses for AI, which we haven't
even started to grasp yet.
637
00:41:05,312 --> 00:41:13,469
You know, that's the thing, if you think about it, right, Lovable could not have
existed, not in the way that it does today, or Cursor, for example, could not have existed
638
00:41:13,469 --> 00:41:14,209
the way it does today.
639
00:41:14,209 --> 00:41:18,722
If it hadn't been for the work of people like Sam Altman and his group, right,
640
00:41:18,812 --> 00:41:21,104
back in the late teens.
641
00:41:21,104 --> 00:41:23,025
But look how fast things have moved.
642
00:41:23,465 --> 00:41:24,807
Where are we gonna be in five years?
643
00:41:24,807 --> 00:41:25,687
And then 10.
644
00:41:25,687 --> 00:41:35,411
And then, if you really, really wanna push the boat out: do you think we'll have
some usable quantum computing technology in a decade?
645
00:41:35,572 --> 00:41:37,894
Maybe, not guaranteed.
646
00:41:37,894 --> 00:41:38,534
But you know what I mean?
647
00:41:38,534 --> 00:41:45,157
It's like, even without that, even without that extra super powerful compute power, look
at the chips they're knocking out now.
648
00:41:45,481 --> 00:41:55,201
Look at the investments, look at the gigantic investments from Cisco into OpenAI, and the
relationship with NVIDIA; now OpenAI is doing a deal with AMD.
649
00:41:55,661 --> 00:41:59,972
And there's a whole bunch of, there's just like billions upon billions flowing into this.
650
00:41:59,972 --> 00:42:00,543
Right.
651
00:42:00,543 --> 00:42:06,563
So yeah, I mean, I really do believe it's going to be a complete sort of tech
ecosystem transformation.
652
00:42:06,743 --> 00:42:10,476
Very, very similar to what we saw from the late nineties onwards with the internet.
653
00:42:10,476 --> 00:42:12,598
And we're seeing, we're seeing examples of that.
654
00:42:12,598 --> 00:42:23,146
You know, the Garfields in the UK, the Crosbys in New York, you mentioned Covenant.
These are, this isn't theory anymore.
655
00:42:23,146 --> 00:42:24,879
This is, this is happening.
656
00:42:24,879 --> 00:42:27,501
And, um, and they're delivering real work.
657
00:42:27,501 --> 00:42:30,113
Like I know a little bit about Crosby.
658
00:42:30,113 --> 00:42:35,648
Sequoia is one of their capital partners, and they have a podcast called
Training Data.
659
00:42:35,648 --> 00:42:38,121
And they had the founders of Crosby on.
660
00:42:38,121 --> 00:42:41,792
And what they're doing essentially is, like, master services agreements.
661
00:42:41,792 --> 00:42:49,157
They have a very narrow niche, master services agreements for big companies, you know,
like Clay and Stripe.
662
00:42:49,157 --> 00:42:54,000
And they are delivering with a very small number of attorneys.
663
00:42:54,020 --> 00:42:56,211
So, I mean, this is, this is happening now.
664
00:42:56,211 --> 00:42:57,811
This isn't theory.
665
00:42:57,811 --> 00:42:58,422
So
666
00:42:58,615 --> 00:43:00,615
But again, it's scale, isn't it?
667
00:43:00,615 --> 00:43:00,995
It's scale.
668
00:43:00,995 --> 00:43:02,215
I mean, this is the thing. Look at Lovable.
669
00:43:02,215 --> 00:43:07,655
Lovable grew from, you know, like $1 to tens of millions of dollars in about 12
months.
670
00:43:07,655 --> 00:43:12,455
It was instantly usable and there was no barrier, right?
671
00:43:12,475 --> 00:43:18,586
Whereas legal tech has always suffered, and I keep going back to the same point, it's got
structural barriers, right?
672
00:43:18,586 --> 00:43:19,866
I mean, Covenant is a great model.
673
00:43:19,866 --> 00:43:20,726
Crosby is a great model.
674
00:43:20,726 --> 00:43:21,797
Garfield is a great model.
675
00:43:21,797 --> 00:43:23,655
Why aren't there 200 of these?
676
00:43:23,655 --> 00:43:26,048
or why doesn't Crosby
677
00:43:30,644 --> 00:43:33,119
I mean, they've only been in business a year, right?
678
00:43:33,119 --> 00:43:35,016
So they're brand new.
679
00:43:35,016 --> 00:43:37,257
I suppose there's a really interesting calculation.
680
00:43:37,257 --> 00:43:45,197
I'm going to have to do it at some point, which is, I mean, if you look at the total
revenue of all the legal tech companies on the planet, right?
681
00:43:45,217 --> 00:43:54,297
And this is not a zero-sum type calculation, which makes it very, very complicated, but
start off with what is the total revenue of all the legal tech companies in the world,
682
00:43:54,297 --> 00:43:54,917
right?
683
00:43:54,917 --> 00:43:57,277
And break it down into what they do.
684
00:43:57,477 --> 00:44:03,093
So if we look at the sort of more productivity side, let's put aside all project
management, billing,
685
00:44:03,093 --> 00:44:10,156
all that kind of stuff, time tracking, take that out, and just look at all the sort of
productivity tools and look at the revenue that comes into them.
686
00:44:10,156 --> 00:44:10,636
Right.
687
00:44:10,636 --> 00:44:17,709
Now that is a measure to some degree of how far business of law has integrated these
technologies.
688
00:44:17,709 --> 00:44:18,000
Right.
689
00:44:18,000 --> 00:44:22,942
And then once you've done that, and it's very, very complicated to figure that out, maybe
McKinsey can do it.
690
00:44:22,942 --> 00:44:28,854
Then look at what percentage that is of the total legal tech market.
691
00:44:29,296 --> 00:44:33,016
sorry, legal market, sorry, it's very late in the day here.
692
00:44:33,016 --> 00:44:38,656
What percentage is of the total revenue of the whole legal market, right?
693
00:44:38,656 --> 00:44:40,496
Now, these are not exact fractions, right?
694
00:44:40,496 --> 00:44:51,316
You can't say, you know, if someone spends 20 million on X legal tech company over a year,
or a bunch of different people spend 20 million, that it has been extracted
695
00:44:51,316 --> 00:44:56,696
from the legal market, because it may actually be empowering the legal market to do more.
696
00:44:56,696 --> 00:44:57,646
The same way,
697
00:44:57,646 --> 00:45:04,437
that law firms spent a fortune on Microsoft Word and email, and that enabled them to make
a hell of a lot more money, right?
698
00:45:04,437 --> 00:45:08,299
So I think I'd love to see some financial analysis around that.
699
00:45:08,299 --> 00:45:14,261
So like money that's coming out of a legal market into legal tech that then in turn drives
productivity.
700
00:45:14,261 --> 00:45:22,563
And then how much of a sort of force multiplier effect that has on those lawyers.
701
00:45:22,563 --> 00:45:24,823
which enables the market to grow even larger.
702
00:45:24,823 --> 00:45:28,923
So it becomes like this sort of supporting cycle.
703
00:45:28,923 --> 00:45:30,943
And we just don't really have the data.
704
00:45:30,943 --> 00:45:32,583
We just don't have the data.
705
00:45:32,583 --> 00:45:41,294
And there's a very classic reductive view, you know: if you spend a dollar on an AI tool,
you're taking a dollar out of a lawyer's pocket, right?
706
00:45:41,294 --> 00:45:43,294
Which I don't think really follows at all.
707
00:45:43,294 --> 00:45:44,994
It could, it could in certain streams.
708
00:45:44,994 --> 00:45:49,485
And that's why it gets even more complicated because there are certain streams of work,
which I think will go away.
709
00:45:49,485 --> 00:45:53,209
right, they will get automated out of existence in terms of the lawyers putting their
fingers on it.
710
00:45:53,209 --> 00:45:55,391
It will become commoditized, right?
711
00:45:55,391 --> 00:45:56,432
That will happen.
712
00:45:56,432 --> 00:46:02,457
But there'll be other areas of work where the AI will actually massively increase what
lawyers can do.
713
00:46:02,457 --> 00:46:06,319
And they may end up becoming way, way, way more wealthy than they are today.
714
00:46:06,320 --> 00:46:09,913
And it's quite difficult because both of these pictures are true.
715
00:46:09,913 --> 00:46:14,537
And I think that's one of the things, because people like clear pictures, right?
716
00:46:14,537 --> 00:46:16,548
You know, AI will destroy all lawyers.
717
00:46:16,548 --> 00:46:17,533
No, that's not true.
718
00:46:17,533 --> 00:46:19,773
Okay, so AI is irrelevant.
719
00:46:20,053 --> 00:46:21,333
No, that's not true.
720
00:46:21,333 --> 00:46:30,893
So it's kind of like, you know, how can AI actually eat up, probably within 10 years,
a significant chunk of what is today's legal market?
721
00:46:31,373 --> 00:46:37,964
And yet at the end of the day, Christmas 2035, is that right?
722
00:46:37,964 --> 00:46:40,144
Yeah, Christmas '35, right?
723
00:46:40,624 --> 00:46:43,364
There'll be a lot of lawyers who are saying, my God, thank God for AI.
724
00:46:43,364 --> 00:46:44,616
I'm so much wealthier now.
725
00:46:44,616 --> 00:46:50,499
And I don't want to jump out of a window because that's really the mindset of a lot of
lawyers today.
726
00:46:50,499 --> 00:46:56,425
It can be a very brutal profession and coming up the ranks is sometimes a painful process.
727
00:46:56,425 --> 00:46:57,699
uh
728
00:46:57,699 --> 00:47:06,279
That's one of the things. I mean, I gave a talk in Oxford the other day, and I don't know
if I've mentioned this in public before, with somebody from Gunnercooke, which is a
729
00:47:06,279 --> 00:47:07,859
distributed law firm.
730
00:47:07,859 --> 00:47:08,099
Right.
731
00:47:08,099 --> 00:47:11,470
So they're all individuals, but they work under the umbrella of Gunnercooke.
732
00:47:11,470 --> 00:47:12,210
Right.
733
00:47:12,330 --> 00:47:15,830
And they're using AI systems now, which is fantastic.
734
00:47:15,850 --> 00:47:24,070
And, you know, we were talking about the fact that it's all positive for them, because
they don't have
735
00:47:24,070 --> 00:47:24,900
leverage.
736
00:47:24,900 --> 00:47:28,521
It's mostly made up of experienced lawyers, right?
737
00:47:28,521 --> 00:47:31,553
They don't have this huge leverage associate group beneath them.
738
00:47:31,553 --> 00:47:34,885
And having AI greatly increases what they can do.
739
00:47:34,885 --> 00:47:41,817
And it does work like that: quite often, if a client comes to an individual in a group
like this, one of these distributed law firms, they're not coming to them because they know they've
740
00:47:41,817 --> 00:47:44,288
got 25 associates working in the basement.
741
00:47:44,288 --> 00:47:49,820
They come to them because they know that person has real expertise and will really add
value on a very human level.
742
00:47:49,820 --> 00:47:51,440
But of course,
743
00:47:51,766 --> 00:47:55,048
leverage based work will creep in.
744
00:47:55,048 --> 00:47:56,630
But now they can use AI to do that.
745
00:47:56,630 --> 00:47:58,872
So I mean, it's an absolute win-win for them.
746
00:47:58,872 --> 00:48:10,549
And again, for these hybrid AI companies and the ALSPs and all of these, I mean, this
is the thing that is kind of tantalizing: we really are starting to see more and
747
00:48:10,549 --> 00:48:14,040
more examples of organizations using AI.
748
00:48:14,356 --> 00:48:15,946
and it's just all positive.
749
00:48:15,946 --> 00:48:22,786
And simultaneously, we're seeing both some law firms and some in-house legal teams kind of
gritting their teeth and going, oh, I'm not sure.
750
00:48:22,786 --> 00:48:23,826
I'm not sure about this.
751
00:48:23,826 --> 00:48:24,946
I'm not sure about this.
752
00:48:24,946 --> 00:48:26,446
Is this a good idea?
753
00:48:26,686 --> 00:48:28,826
And you're just like...
754
00:48:28,857 --> 00:48:32,459
Yeah, there's still a lot of pearl clutching going on.
755
00:48:32,459 --> 00:48:39,384
Well, we're almost out of time, but this has been a great conversation as it always is
every time we have you on.
756
00:48:39,384 --> 00:48:46,589
Before we hop off, how do people find out more about Legal Innovators and the writing that
you do?
757
00:48:46,936 --> 00:48:50,707
Go to ChatGPT, and I'm not even kidding.
758
00:48:50,707 --> 00:48:56,090
I get a lot of traffic from LLMs now, uh, Perplexity and others, right?
759
00:48:56,090 --> 00:48:59,792
Go to ChatGPT and ask it about Artificial Lawyer.
760
00:48:59,792 --> 00:49:02,592
And hopefully it'll say something nice about me and provide a link.
761
00:49:02,592 --> 00:49:08,495
But if you don't want to do that, just go into Google and type artificiallawyer.com and
you'll find me.
762
00:49:08,495 --> 00:49:10,447
But yeah, on the New York event.
763
00:49:10,447 --> 00:49:20,886
I imagine some of your listeners are in the US; the New York event, Legal Innovators
New York, the website is literally legalinnovatorsnewyork.com and it will be on the 19th
764
00:49:20,886 --> 00:49:22,878
and 20th of November.
765
00:49:22,878 --> 00:49:31,330
Day one is law firms, day two is in-house and obviously we'd like to see everybody on both
days but if you want to focus on one or the other then you can take your pick and it will
766
00:49:31,330 --> 00:49:32,630
be in Midtown.
767
00:49:33,041 --> 00:49:33,801
Okay.
768
00:49:34,021 --> 00:49:35,462
That's great to hear.
769
00:49:35,462 --> 00:49:38,545
And I'm a big fan of the artificial lawyer site.
770
00:49:38,545 --> 00:49:40,226
I'm very selective.
771
00:49:40,226 --> 00:49:42,316
I get a few newsletters that I read every day.
772
00:49:42,316 --> 00:49:43,207
Yours is one of them.
773
00:49:43,207 --> 00:49:46,188
So um I really appreciate your time.
774
00:49:46,188 --> 00:49:50,291
Best of luck with the conference and I'm sure we'll bump into each other sometime real
soon.
775
00:49:50,291 --> 00:49:51,325
I hope so, definitely.
776
00:49:51,325 --> 00:49:51,872
Thank you.
777
00:49:51,872 --> 00:49:52,737
All right, take care.
778
00:49:52,737 --> 00:49:54,185
Bye bye.