This recap episode brings together insights from legal innovators, law firm leaders, technologists, educators, and founders to explore how artificial intelligence is reshaping the business of law: not just the tools lawyers use, but how legal services are delivered, staffed, priced, and experienced by clients.
Rather than a single conversation, this episode curates key moments across multiple discussions to surface the patterns emerging across the legal industry: AI as an augmenting force, delivery as a differentiator, and business models under pressure to evolve.
From law firm innovation teams and knowledge management leaders to legal tech founders and academics, this episode captures where the profession is aligning and where it’s still wrestling with change.
Key takeaways:
AI works best when it augments existing workflows, not when it’s treated as a silver bullet
Faster, clearer, and more transparent delivery often matters more than lower cost
Client portals, data services, and self-service tools are redefining availability and value
Change management, not technology, is the hardest part of innovation
Legal education and associate development must evolve beyond “learning by grunt work”
Firms that rethink delivery models now are better positioned for the next decade
About the guests
This recap includes insights from the following leaders across legal, technology, and education:
Zach Posner – Co-Founder & Managing Partner, The LegalTech Fund
David Boland – Chief Knowledge & Innovation Officer, Ogletree Deakins
Annie Datesh – Chief Innovation Officer, Wilson Sonsini
Patrick Dundas – Partner, Knowledge Management, Proskauer Rose LLP
Heidi Brown – Associate Dean for Upper Level Writing, New York Law School
Sarah Thompson – Chief Product Officer, BlueStar
Peter Duffy – CEO, Titans
Abhijat Saraswat – Chief Revenue Officer, Lupl
Haley Altman – Strategic Advisor, Litera
Monica Zent – Founder & CEO, ZentLaw
Rob Saccone – CTO, Lega
Kara Peterson – Co-Founder & CEO, Descrybe
Elisabeth Cappuyns – Director of Knowledge Management, DLA Piper
Sean Harrington – Director of Technology & Innovation, University of Oklahoma College of Law
Hayley Stillwell – Associate Professor, University of Oklahoma College of Law
Subscribe for Updates
Machine-Generated Episode Transcript
1
00:00:00,000 --> 00:00:04,770
On November 30th of last year when
they, when OpenAI released their
2
00:00:04,770 --> 00:00:09,660
demo, it was probably the greatest
demo of all time when it comes
3
00:00:09,660 --> 00:00:11,670
to like technology introductions.
4
00:00:12,420 --> 00:00:15,720
Like you saw that and it didn't
matter what it was saying, right?
5
00:00:15,720 --> 00:00:19,440
Because we know now that half the stuff
it was saying was made up or some high
6
00:00:19,495 --> 00:00:21,420
percentage was hallucinating at the time.
7
00:00:21,960 --> 00:00:22,050
Right.
8
00:00:22,050 --> 00:00:25,560
But it was such a good demo that like
anybody could see it and understand it.
9
00:00:25,560 --> 00:00:27,510
You didn't need to be
technically inclined.
10
00:00:28,290 --> 00:00:31,740
To think about how that could affect
your world, your role, et cetera.
11
00:00:31,800 --> 00:00:35,550
I go back to the quote that Bill Gates
has from years ago, and he basically
12
00:00:35,550 --> 00:00:40,050
says, people overestimate what happens
in one year with new technology, but
13
00:00:40,050 --> 00:00:41,700
they underestimate what happens in 10.
14
00:00:41,880 --> 00:00:45,390
And I think that although it
was a spectacular demo, we're
15
00:00:45,390 --> 00:00:47,640
somewhere on that scale right now.
16
00:00:47,640 --> 00:00:50,820
My guess is, for this stuff to
really start affecting
17
00:00:50,820 --> 00:00:52,020
legal in a meaningful way,
18
00:00:52,020 --> 00:00:55,710
we're probably still three years away,
two years away, but I think that, um.
19
00:00:56,385 --> 00:00:59,864
If you ask me why the demo, why
everybody's paying attention, I think
20
00:00:59,864 --> 00:01:02,894
it's 'cause the demo was so good and it's
great that people are paying attention
21
00:01:02,894 --> 00:01:06,179
to this because it's, it's probably gonna
propel a lot of technology adoption.
22
00:01:06,915 --> 00:01:08,715
You mentioned, uh, Copilot.
23
00:01:09,045 --> 00:01:12,735
That's part of our generative AI strategy,
which, um, you know, we've had the
24
00:01:12,735 --> 00:01:16,935
pleasure, in knowledge management, to
help, uh, define what our generative
25
00:01:16,935 --> 00:01:18,855
AI strategy will be for the firm.
26
00:01:18,945 --> 00:01:23,115
And a big part of that is embracing
Copilot, which is eventually going to just
27
00:01:23,115 --> 00:01:27,045
be table stakes, uh, for many of
the law firms that are out there.
28
00:01:27,045 --> 00:01:31,305
But given our position with Microsoft,
it makes complete sense, almost
29
00:01:31,305 --> 00:01:34,965
self-evident that that's something
that we need to, uh, embrace
30
00:01:35,335 --> 00:01:37,825
and explore and do that
as quickly as we can.
31
00:01:38,155 --> 00:01:43,285
But, you know, we're big Power
BI users for data visualization.
32
00:01:43,285 --> 00:01:45,775
That's both internally as
well as with our clients.
33
00:01:45,805 --> 00:01:51,235
Uh, our clients have found that to be
incredibly helpful in, um, representing
34
00:01:51,235 --> 00:01:53,365
a lot of their content and their data.
35
00:01:53,755 --> 00:01:56,665
Uh, but it's also helpful in
things like matter management.
36
00:01:57,205 --> 00:02:00,804
Uh, in creating dashboards and
making sure that we're running our
37
00:02:00,804 --> 00:02:04,945
matters, especially our portfolio
accounts very profitably, uh,
38
00:02:04,945 --> 00:02:06,640
and keeping a close eye on, um,
39
00:02:07,620 --> 00:02:08,699
on those things.
40
00:02:08,699 --> 00:02:12,240
And then, um, also, again, I mentioned,
you know, with our data analytics
41
00:02:12,240 --> 00:02:15,510
capability, looking at our internal
data and supplementing that with
42
00:02:15,510 --> 00:02:19,260
a lot of the publicly available
content or data that's available, we
43
00:02:19,260 --> 00:02:20,970
still don't have the right answers.
44
00:02:21,060 --> 00:02:24,420
Should it be the big tech? And we're
not talking about our industry;
45
00:02:24,420 --> 00:02:28,200
we're talking about big, big tech
like Meta and Google and saying,
46
00:02:28,380 --> 00:02:31,380
should they be the keepers of the
big foundational models, and just
47
00:02:31,935 --> 00:02:37,575
keep creating these small layers on top
of it? Or should actually many startups be
48
00:02:37,575 --> 00:02:43,755
funded and have a go at creating smaller
foundation models for specific cases?
49
00:02:44,085 --> 00:02:47,954
Not that much different
than previous tech waves.
50
00:02:47,954 --> 00:02:54,089
Ted, when we had mobile apps, when we had
cloud, or when we had SaaS, all of this.
51
00:02:54,675 --> 00:02:59,055
Um, all of these tech waves have
followed very similar patterns, right?
52
00:02:59,325 --> 00:03:01,425
The VC community gets super excited.
53
00:03:01,665 --> 00:03:04,005
The tech community gets super excited.
54
00:03:04,035 --> 00:03:05,955
'cause building tech is very easy now.
55
00:03:06,315 --> 00:03:10,455
Lots of money is funneled into it, and
then there comes a time when things
56
00:03:10,455 --> 00:03:15,495
just settle down and you realize, well,
some of it was smoke and some of it was.
57
00:03:18,270 --> 00:03:23,460
The citation you just made, um, is a little
bit more scary because now we're talking
58
00:03:23,460 --> 00:03:28,320
about a very significant player in our
industry, um, that's being questioned.
59
00:03:28,320 --> 00:03:30,630
Like, okay, how much of this was real?
60
00:03:30,900 --> 00:03:33,840
There's also, not to get super
technical on the call, but,
61
00:03:34,260 --> 00:03:38,220
There's also a difference between
precision and recall, right?
62
00:03:38,220 --> 00:03:43,170
And I think some of the Stanford paper
was getting into the details of, yes,
63
00:03:43,230 --> 00:03:46,200
are you over-engineering on one side?
64
00:03:46,200 --> 00:03:51,180
So that it only gives us, um, you
know, it eliminates, uh, false
65
00:03:51,180 --> 00:03:54,390
positives to the point that we
end up with too many false negatives.
66
00:03:54,390 --> 00:03:56,339
So there's a very interesting,
67
00:03:56,820 --> 00:04:01,620
um, uh, you know, sort of
deep dive into this space.
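(A quick illustration of the precision/recall distinction for readers who want it pinned down. This minimal sketch uses invented case names, not data from the Stanford paper or any real tool.)

```python
# Precision vs. recall on a made-up citation-retrieval result.
retrieved = {"Case A", "Case B", "Case C"}           # what the tool returned
relevant = {"Case A", "Case B", "Case D", "Case E"}  # what it should have returned

hits = retrieved & relevant
precision = len(hits) / len(retrieved)  # share of returned results that are right
recall = len(hits) / len(relevant)      # share of right answers that were returned

print(f"precision = {precision:.2f}")  # 0.67 -- one false positive (Case C)
print(f"recall = {recall:.2f}")        # 0.50 -- two misses (Cases D and E)

# Tuning a system to return fewer, safer results tends to raise precision
# at the cost of recall, and vice versa -- the tradeoff discussed above.
```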
68
00:04:01,830 --> 00:04:05,460
But yeah, for now I would say
we're definitely in the space
69
00:04:05,460 --> 00:04:09,330
of more of a reality check, uh,
which is a good place to be.
70
00:04:09,330 --> 00:04:10,410
I think it is.
71
00:04:10,710 --> 00:04:15,150
You know, it's the place, frankly,
where those of us in my role in
72
00:04:15,360 --> 00:04:19,050
firms have a better place to position
these products to our lawyers.
73
00:04:19,589 --> 00:04:20,279
It's here.
74
00:04:20,430 --> 00:04:23,880
These are the things it can do; here
are the things it doesn't do very well.
75
00:04:24,150 --> 00:04:25,530
Let's use it sensibly.
76
00:04:25,530 --> 00:04:28,469
Let's use it safely,
and so on and so forth.
77
00:04:28,469 --> 00:04:31,860
So it just makes it more
palatable, Ted, also.
78
00:04:31,860 --> 00:04:34,409
Yeah, you don't want
something to be too perfect.
79
00:04:34,409 --> 00:04:37,050
'cause that for sure is
a recipe for disaster.
80
00:04:37,140 --> 00:04:38,730
I think we're sliding into a trough.
81
00:04:38,730 --> 00:04:44,159
I, I hate to be not optimistic, but, you
know, vendors have overpromised, there's
82
00:04:44,159 --> 00:04:46,200
still confusion about what the tech
83
00:04:46,710 --> 00:04:48,600
can and should do.
84
00:04:49,200 --> 00:04:53,460
I think people are sliding into
the classic trust issues that mark
85
00:04:53,460 --> 00:04:56,700
the disillusionment part of the
cycle, and I mean, it makes sense.
86
00:04:56,700 --> 00:05:00,570
You don't have applications
right now that are actually
87
00:05:00,645 --> 00:05:04,815
the right application
of the underlying technology.
88
00:05:04,875 --> 00:05:08,835
I mean, even RAG: if you have an
LLM that's a statistical model of
89
00:05:08,835 --> 00:05:12,525
language, not a knowledge base, and
you're trying to stick a knowledge
90
00:05:12,525 --> 00:05:18,585
base on it, and you have a generalized
retrieval process with chunking that
91
00:05:18,585 --> 00:05:20,865
might just be wrong for any particular
92
00:05:21,465 --> 00:05:22,815
uh, type of query.
93
00:05:22,815 --> 00:05:26,775
And then you have lawyers querying
in lawyer phrases and suddenly
94
00:05:26,775 --> 00:05:29,594
the chunking's not quite right
and it's all not working well.
95
00:05:29,594 --> 00:05:31,365
It's kind of, of course not, right?
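(To make that chunking failure mode concrete, here is a toy sketch. Real RAG pipelines use embedding similarity rather than word overlap, and the clause text and lawyer-phrased query below are invented.)

```python
import re

document = (
    "The Indemnitor shall hold harmless the Indemnitee from all claims. "
    "Notwithstanding the foregoing, liability for gross negligence is excluded. "
    "Fees are payable within thirty days of invoice."
)

def chunks(text, size):
    # Generic fixed-size word windows: the "generalized chunking" above.
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def tokens(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def score(query, passage):
    # Fraction of query words found in the passage (a crude stand-in
    # for embedding similarity).
    q = tokens(query)
    return len(q & tokens(passage)) / len(q)

query = "is the indemnitee protected against gross negligence claims"

for size in (8, 40):
    best = max(chunks(document, size), key=lambda c: score(query, c))
    print(f"chunk size {size}: best match scores {score(query, best):.2f}")

# With 8-word chunks, the hold-harmless clause and its gross-negligence
# carve-out land in different chunks, so no single chunk matches the
# question well -- one fixed chunking policy cannot fit every query.
```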
96
00:05:31,365 --> 00:05:36,135
So I think we're waiting for new model
architectures, changes to RAG, like
97
00:05:36,135 --> 00:05:38,205
using agents that are gonna improve
98
00:05:38,534 --> 00:05:38,745
its
99
00:05:39,385 --> 00:05:44,425
current, uh, drawbacks, and of
course hallucination fixes, which,
100
00:05:44,425 --> 00:05:47,305
who knows how those are gonna
happen, before we're gonna be
101
00:05:47,305 --> 00:05:48,685
climbing out of this trough.
102
00:05:48,685 --> 00:05:52,285
And the use cases that people end
up using for now are gonna be much
103
00:05:52,285 --> 00:05:55,285
more limited, I think, until we
solve a lot of those technical issues.
104
00:05:55,285 --> 00:06:02,425
GPT and other AI like it are
actually pretty good at drafting.
105
00:06:03,265 --> 00:06:08,095
Um, very short legal
provisions, definitions.
106
00:06:08,640 --> 00:06:12,719
I, I don't know that I would trust
it to draft an exculpation provision,
107
00:06:12,780 --> 00:06:18,450
but I might ask it to draft a
definition of X, whatever, whatever
108
00:06:18,450 --> 00:06:19,650
you wanna fill the blank in with.
109
00:06:19,650 --> 00:06:19,920
Right.
110
00:06:20,190 --> 00:06:23,310
Um, and I think there are some products,
I haven't looked at some of these,
111
00:06:23,310 --> 00:06:28,170
um, drafting assistants recently,
but I would expect that they would
112
00:06:28,170 --> 00:06:31,020
be starting to build in that kind
of functionality if they haven't had
113
00:06:31,020 --> 00:06:33,810
it for a while already for practice.
114
00:06:36,780 --> 00:06:40,409
The thing that I think a lot of people are
hoping AI, generative AI, will be able to
115
00:06:40,409 --> 00:06:47,460
do is write that first version of a draft
so that firms don't need to continue to
116
00:06:47,460 --> 00:06:51,150
maintain form banks or precedent banks.
117
00:06:51,210 --> 00:06:53,520
The AI will just figure it out.
118
00:06:54,390 --> 00:06:55,804
I don't think we're there.
119
00:06:56,580 --> 00:06:59,460
In the near term, I, I think
it will really struggle.
120
00:06:59,460 --> 00:07:03,180
I think, and I'm not an expert in AI,
but my understanding is that some of
121
00:07:03,180 --> 00:07:08,160
these have page limitations on the
kinds of documents they can ingest and
122
00:07:08,160 --> 00:07:09,810
the kinds of documents they can create.
123
00:07:10,230 --> 00:07:18,180
Uh, and there's a fair number
of very commonly prepared documents
124
00:07:18,630 --> 00:07:20,520
that run into the hundreds of pages.
125
00:07:20,790 --> 00:07:24,180
Also, there's a lot of,
um, interdependence
126
00:07:24,750 --> 00:07:27,690
among documents, uh, in certain practices.
127
00:07:27,840 --> 00:07:30,780
For example, in the investment
management practice, funds practices,
128
00:07:30,990 --> 00:07:37,440
you'll have fund documents that are
very interdependent and have what
129
00:07:37,440 --> 00:07:40,470
should be nearly identical provisions.
130
00:07:40,560 --> 00:07:45,210
And if there's a hallucination between
the expense section in a disclosure
131
00:07:45,210 --> 00:07:49,170
document, versus an expense section
in an investment management agreement
132
00:07:49,170 --> 00:07:52,590
or a limited partnership agreement, or
you know, the list keeps going, right?
133
00:07:52,740 --> 00:07:53,640
Um, that
134
00:07:54,060 --> 00:07:56,580
is a malpractice claim, right?
135
00:07:56,789 --> 00:08:02,310
So I think there'll be, there'll
be some very narrow use cases
136
00:08:02,310 --> 00:08:03,930
for AI when it comes to drafting,
137
00:08:04,935 --> 00:08:09,135
for now, but who knows what this
landscape looks like in 10 years?
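(One mitigation for the cross-document drift described above is a mechanical consistency check on provisions that are supposed to match. A rough sketch: the clause texts are invented, and a real workflow would pull the actual expense sections from the fund documents.)

```python
from difflib import SequenceMatcher

# "Should-be-identical" expense provisions pulled from related documents.
expense_clauses = {
    "disclosure document": "The Fund shall bear all organizational and offering expenses.",
    "management agreement": "The Fund shall bear all organizational and offering expenses.",
    "partnership agreement": "The Manager shall bear all organizational expenses.",
}

base_name, base_text = next(iter(expense_clauses.items()))
for name, text in expense_clauses.items():
    ratio = SequenceMatcher(None, base_text, text).ratio()
    flag = "" if ratio > 0.95 else "  <-- review: possible drift"
    print(f"{name}: similarity {ratio:.2f} vs {base_name}{flag}")
```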
138
00:08:09,375 --> 00:08:12,705
I started off reading that, that
first case, Mata versus Avianca.
139
00:08:12,705 --> 00:08:15,285
But then, you know, there was
another case a couple months later
140
00:08:15,285 --> 00:08:19,395
and another case, and right now by
my tally and, and I'll explain how
141
00:08:19,395 --> 00:08:20,835
others are finding other cases.
142
00:08:20,835 --> 00:08:24,825
I think I have 14 cases in which
lawyers have gotten in trouble for
143
00:08:24,825 --> 00:08:29,715
using AI without checking and verifying
the cites, and, and the cases call
144
00:08:29,715 --> 00:08:31,875
it hallucinated cases, fictitious.
145
00:08:32,400 --> 00:08:35,880
The most recent case called them
phantom cases, fake cases.
146
00:08:35,880 --> 00:08:38,909
So if anybody out there is, is
trying to research these cases,
147
00:08:39,150 --> 00:08:40,470
use all of those synonyms.
148
00:08:41,340 --> 00:08:46,710
But then what's also shocking is that,
um, or I think surprising and alarming is
149
00:08:46,710 --> 00:08:51,000
that pro se litigants, litigants who are
representing themselves without lawyers,
150
00:08:51,000 --> 00:08:55,439
you know, a lot of people are saying AI
is great for access to justice and, and
151
00:08:55,439 --> 00:08:57,300
people not needing to hire a lawyer.
152
00:08:58,155 --> 00:09:03,765
Pro se litigants, at least 12 by my count,
have also submitted court filings
153
00:09:03,765 --> 00:09:09,465
either complaints or pleadings or briefs,
and that is causing a burden on the
154
00:09:09,465 --> 00:09:12,795
court, uh, personnel and opposing counsel.
155
00:09:13,500 --> 00:09:17,370
to research those cases, spend time
figuring out that the cases don't
156
00:09:17,370 --> 00:09:22,020
exist, pointing them out to the pro
se litigant, and then the judge.
157
00:09:22,050 --> 00:09:26,430
Those cases say that the courts exercise
what they call special solicitude, or
158
00:09:26,430 --> 00:09:30,209
they're a little lenient on litigants
who don't have lawyers, but they
159
00:09:30,209 --> 00:09:33,959
have to remind them: hey, you can't
do this; if you do this again, we're
160
00:09:33,959 --> 00:09:36,209
gonna consider imposing sanctions.
161
00:09:36,209 --> 00:09:39,390
And some of the courts have
imposed pretty significant
162
00:09:39,390 --> 00:09:41,250
sanctions on even pro se litigants.
163
00:09:41,775 --> 00:09:43,755
And then I'll tell you kind
of two other categories.
164
00:09:43,814 --> 00:09:46,365
One law firm just keeps doubling down.
165
00:09:46,365 --> 00:09:50,775
It's a law firm filing
cases in New York against the New York
166
00:09:50,775 --> 00:09:55,425
Department of Education, and they've
won the main case and they're
167
00:09:55,425 --> 00:09:59,444
entitled to their attorney's fees under
the statute, but they keep using Chat
168
00:09:59,444 --> 00:10:04,545
GPT to calculate their fee requests
or to like support their fee requests.
169
00:10:04,814 --> 00:10:06,105
And they've done this eight times.
170
00:10:06,975 --> 00:10:11,324
Eight times the, the judges, different
judges in New York, but different
171
00:10:11,324 --> 00:10:16,845
judges have said, we're not accepting
this, this fee request based on Chat
172
00:10:17,204 --> 00:10:23,714
GPT's calculations, because in ChatGPT's
current state, it's not reliable as
173
00:10:23,714 --> 00:10:25,755
a, as a source for this information.
174
00:10:25,845 --> 00:10:29,145
Just, I just wanted to be devil's
advocate as to why you think they're
175
00:10:29,204 --> 00:10:32,890
not ready, these agents, to kind
of do the things that are high risk.
176
00:10:33,600 --> 00:10:34,290
High risk.
177
00:10:34,680 --> 00:10:39,030
You have to kind of treat
it like a junior associate.
178
00:10:39,030 --> 00:10:40,980
Like this stuff needs eyes on.
179
00:10:40,980 --> 00:10:45,569
And I think in, pretty much, in most
respects, like even if it's not high
180
00:10:45,569 --> 00:10:48,810
risk, like if you're gonna, if you're
gonna be repeating anything that you
181
00:10:48,810 --> 00:10:52,949
get out of ai, you should probably, you
know, make sure that it's actually true.
182
00:10:53,430 --> 00:10:57,840
Even, you know, even like, you
know, facts about the news or
183
00:10:57,840 --> 00:11:01,050
this or that or the other, like,
you know, this is not perfect.
184
00:11:01,050 --> 00:11:02,340
It is getting data
185
00:11:03,660 --> 00:11:07,740
that it's been trained on, and the
training data may not be correct.
186
00:11:07,770 --> 00:11:12,120
The people that are creating the
agents, they, they have bias.
187
00:11:12,329 --> 00:11:15,449
They, you know, you don't have
any transparency into how these
188
00:11:15,449 --> 00:11:17,610
are created or anything like that.
189
00:11:17,610 --> 00:11:22,140
So we always, like, we, we do a lot
of AI solutions and I would never say,
190
00:11:22,170 --> 00:11:23,640
all right, yeah, just send this out.
191
00:11:24,000 --> 00:11:27,209
It's like, you know, when we create
something for our clients, we,
192
00:11:27,270 --> 00:11:30,810
we proof it and then we make sure
that they proof it, you know, like.
193
00:11:31,590 --> 00:11:34,800
This is not a person, this is a machine.
194
00:11:35,220 --> 00:11:38,460
It is that it created this
so you, but it's real.
195
00:11:38,520 --> 00:11:39,780
I mean they're very effective.
196
00:11:39,780 --> 00:11:40,740
They save a lot of time.
197
00:11:40,740 --> 00:11:43,950
Like we do production
request, uh, responses.
198
00:11:44,010 --> 00:11:47,610
We have a tool that does this for
our clients and it writes as the
199
00:11:47,610 --> 00:11:51,150
attorneys write and it has the same
format, it looks exactly like that.
200
00:11:51,480 --> 00:11:54,240
So we'll create a production
request response for
201
00:11:54,300 --> 00:11:55,500
the attorneys to start with.
202
00:11:55,890 --> 00:11:58,830
So it just saves them a lot of time;
just even to create that saves them, like,
203
00:11:59,685 --> 00:12:04,485
days. Provides, like, sample arguments,
but you know, I would never say
204
00:12:04,485 --> 00:12:07,785
just send that out like you get,
you know, it'll take them an hour
205
00:12:07,785 --> 00:12:09,074
instead of two days to do something.
206
00:12:09,074 --> 00:12:09,915
I think that's great.
207
00:12:10,335 --> 00:12:14,025
When it comes to implementation of
ai, think of three different things.
208
00:12:14,025 --> 00:12:17,925
Firstly, starting small and handholding
a particular group that you focus on.
209
00:12:18,314 --> 00:12:21,704
Secondly is getting very specific
on the use cases that you're looking
210
00:12:21,704 --> 00:12:25,185
to solve, not just to push the
AI out there for the sake of it.
211
00:12:25,245 --> 00:12:28,214
And thirdly is setting expectations.
212
00:12:28,245 --> 00:12:31,515
As you said, if you lose that trust
with people, it's hard to regain it.
213
00:12:31,574 --> 00:12:36,885
And when we deploy AI with clients,
that's one of the things we really focus
214
00:12:36,885 --> 00:12:38,625
on is appropriate expectation setting.
215
00:12:39,074 --> 00:12:42,165
And with the introduction of any
tool, it's not just, here are
216
00:12:42,165 --> 00:12:43,995
all the things the tool can do.
217
00:12:44,385 --> 00:12:47,145
It's being super clear on
this is what it cannot do.
218
00:12:47,505 --> 00:12:50,354
If you try and use it for
these use cases, it will fail.
219
00:12:50,385 --> 00:12:52,189
You will get bad results,
you'll get frustrated.
220
00:12:53,055 --> 00:12:54,854
Just being super transparent with people.
221
00:12:55,485 --> 00:12:58,785
You know, touching on the hype
piece, that there's some talk in
222
00:12:58,785 --> 00:13:01,545
the market about AI being magical
and what it can can't do, et cetera.
223
00:13:02,415 --> 00:13:05,685
However, if you go in with that
attitude, you will fail for sure.
224
00:13:05,685 --> 00:13:08,625
It's not at that level for the
vast majority of use cases.
225
00:13:08,625 --> 00:13:13,540
Whereas if you frame it of, look, this is
like having a junior associate or in
226
00:13:13,540 --> 00:13:18,104
certain cases, even a mid-level associate
that could support with the work that you
227
00:13:18,104 --> 00:13:20,295
complete, but that they will make mistakes.
228
00:13:20,295 --> 00:13:20,985
It's not perfect.
229
00:13:20,985 --> 00:13:21,765
It needs your input.
230
00:13:22,395 --> 00:13:27,315
That's actually a far better change
management piece as well, because from the
231
00:13:27,315 --> 00:13:30,105
lawyer's point of view, it's very clear,
look, this is not replacing them, this
232
00:13:30,105 --> 00:13:31,995
is augmenting how they perform the work.
233
00:13:32,145 --> 00:13:34,365
So yeah, expectation
setting is a massive one.
234
00:13:34,425 --> 00:13:37,485
And then, as I mentioned about getting
very, very specific, it needs to be
235
00:13:37,485 --> 00:13:43,995
tied to a very clear use case, that
the benefits are very tangible, that
236
00:13:43,995 --> 00:13:46,605
it's clear what the objectives are
and what you're trying to achieve.
237
00:13:46,605 --> 00:13:49,455
And just having that in a
kind of contained environment.
238
00:13:49,455 --> 00:13:50,565
And by contained, I mean
239
00:13:51,000 --> 00:13:51,900
structures:
240
00:13:51,960 --> 00:13:54,060
This is how we are going to approach it.
241
00:13:54,060 --> 00:13:57,689
Here's how we check, you
know, the feedback as we progress.
242
00:13:57,689 --> 00:13:59,160
Here is how we iterate as we go.
243
00:13:59,580 --> 00:14:03,720
Just overall delivery best practices,
uh, change management, best practices.
244
00:14:03,720 --> 00:14:08,520
You know, start small, expand, learn,
get some proof points, and then,
245
00:14:08,760 --> 00:14:12,115
then go broader. When that approach is
taken, they've seen marvelous results.
246
00:14:13,170 --> 00:14:17,760
However, people need to be mindful that
like all the standard best practices
247
00:14:17,760 --> 00:14:22,230
we would have with any technology
implementation, they still are true.
248
00:14:22,380 --> 00:14:25,245
You still need to do all the
good stuff you would do before.
249
00:14:25,964 --> 00:14:30,135
AI just doesn't, uh, remove the need
for traditional change management
250
00:14:30,135 --> 00:14:32,295
and delivery experience that you
would have with any technology.
251
00:14:32,324 --> 00:14:37,545
Uh, probably the more widely used,
uh, AI component for us, which, uh,
252
00:14:37,755 --> 00:14:41,175
you and I'll discuss for sure, is our
integration with Copilot, which is
253
00:14:41,265 --> 00:14:44,025
live; it exists in the Teams store.
254
00:14:44,295 --> 00:14:49,094
Uh, so you can actually query
Lupl data directly from Copilot
255
00:14:49,094 --> 00:14:51,474
without needing to leave where you're
spending a lot of your time working.
256
00:14:52,530 --> 00:14:55,680
We can talk about that, but I think
even as we think about the prompting,
257
00:14:56,070 --> 00:15:01,050
if you look at that, if I just
give someone an empty box and say,
258
00:15:01,200 --> 00:15:02,730
you can plan and scope your work.
259
00:15:03,030 --> 00:15:03,900
Describe your work.
260
00:15:04,800 --> 00:15:09,120
You, you write a one-, two-, three-sentence
prompt saying, you know, it's
261
00:15:09,120 --> 00:15:15,270
an infringement suit, uh, from X
against Y, um, in these jurisdictions.
262
00:15:15,270 --> 00:15:17,940
The plan that you're
going to get from that
263
00:15:18,675 --> 00:15:19,935
is going to be pretty basic.
264
00:15:19,935 --> 00:15:23,265
We've done a lot of work to try and
sort of interpret what that means in
265
00:15:23,265 --> 00:15:27,375
the backend, but the reality is you
need to provide people training and
266
00:15:27,375 --> 00:15:33,135
guidance on both the level of detail
that's needed and how best to put
267
00:15:33,135 --> 00:15:37,785
that data into the system, and that
then needs to be interpreted against
268
00:15:37,785 --> 00:15:40,905
the large language model that you're
using because if you're using something
269
00:15:40,905 --> 00:15:43,605
like OpenAI on Azure or otherwise,
270
00:15:44,160 --> 00:15:46,229
sure, you can put that in, in that way.
271
00:15:46,439 --> 00:15:50,430
If you're using, um, Claude,
Anthropic's API, actually it
272
00:15:50,430 --> 00:15:52,050
can take a lot more rich text.
273
00:15:52,050 --> 00:15:54,300
You can actually give
it, uh, certain fields.
274
00:15:54,300 --> 00:15:56,910
You can format information
in a specific way.
275
00:15:57,209 --> 00:16:00,209
You have things like prompt caching,
and that's the part that the users
276
00:16:00,209 --> 00:16:03,030
should not really have to think about
that because they may not know what any
277
00:16:03,030 --> 00:16:04,589
of these words mean or care, frankly.
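(A sketch of the alternative to the empty box: collect a few structured fields and assemble the detailed prompt behind the scenes, so users never see that machinery. The field names and template wording here are hypothetical, not Lupl's actual implementation.)

```python
def build_matter_prompt(matter_type: str, parties: str,
                        jurisdictions: str, key_dates: str = "not specified") -> str:
    # Turn a short form into the richer prompt the model actually needs.
    return (
        "You are planning a legal matter.\n"
        f"Matter type: {matter_type}\n"
        f"Parties: {parties}\n"
        f"Jurisdictions: {jurisdictions}\n"
        f"Key dates: {key_dates}\n"
        "Produce a phased work plan with tasks, owners, and rough timelines, "
        "and flag any assumptions you make."
    )

print(build_matter_prompt(
    matter_type="patent infringement suit",
    parties="X (plaintiff) v. Y (defendant)",
    jurisdictions="D. Del.; N.D. Cal.",
))
```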
278
00:16:04,739 --> 00:16:06,510
But you need to provide guidance.
279
00:16:06,510 --> 00:16:11,040
And part of that is don't give people what
I like to call the white screen of death.
280
00:16:11,550 --> 00:16:13,980
So if you think about ChatGPT,
which most people will be familiar
281
00:16:13,980 --> 00:16:17,520
with, you look at the early iteration,
it was just: type anything you
282
00:16:17,520 --> 00:16:19,830
want here. And it's not a good use.
283
00:16:19,830 --> 00:16:21,630
I don't know what I wanna type here.
284
00:16:21,630 --> 00:16:22,830
What, what are the parameters?
285
00:16:22,830 --> 00:16:24,600
What's the guideline? Law firms
286
00:16:25,215 --> 00:16:30,645
saw 70 new gen AI companies enter the
market and have no capacity to evaluate
287
00:16:30,645 --> 00:16:35,595
them in the time or speed that the
investors of those companies would expect.
288
00:16:35,835 --> 00:16:42,255
There's too much in the market to
truly diligence, pilot, and security-assess.
289
00:16:42,495 --> 00:16:45,705
So law firms are under extreme
amounts of pressure to actually even
290
00:16:45,705 --> 00:16:49,935
evaluate technology, and there's so
many new startups and no one knows if
291
00:16:49,935 --> 00:16:51,495
those startups are going to make it
292
00:16:51,810 --> 00:16:56,340
a year, or if they are vaporware,
or you know, are they just
293
00:16:56,340 --> 00:16:58,500
a pretty front end to GPT?
294
00:16:58,620 --> 00:17:02,880
Are they thin wrappers that don't
do much or add much value outside
295
00:17:02,880 --> 00:17:04,260
of the core cost of the model?
296
00:17:04,710 --> 00:17:08,730
So law firms can't evaluate things
at the speed at which it's gonna take
297
00:17:08,730 --> 00:17:11,940
to move the sales cycles forward for
these companies that just got high
298
00:17:11,940 --> 00:17:13,860
valuations, they're burning cash.
299
00:17:13,860 --> 00:17:17,070
'cause they hired a lot of expensive
engineers and data scientists.
300
00:17:17,520 --> 00:17:18,480
So it's gonna be,
301
00:17:18,944 --> 00:17:21,464
I think you're gonna see a set
of down rounds in that area.
302
00:17:21,615 --> 00:17:23,714
I think the Clio one is different.
303
00:17:23,775 --> 00:17:25,785
I think Clio is an established company.
304
00:17:25,785 --> 00:17:27,675
They have a massive customer base.
305
00:17:27,944 --> 00:17:31,695
They moved into the world of payments
when they kind of cut ties with
306
00:17:31,695 --> 00:17:33,985
AffiniPay and LawPay, and they have
307
00:17:34,165 --> 00:17:38,425
an entire new set of products that
they can sell into a very established,
308
00:17:38,725 --> 00:17:40,495
loyal, existing customer base.
309
00:17:40,764 --> 00:17:44,935
So I would imagine the Clio
valuation is, is set differently
310
00:17:45,235 --> 00:17:48,415
because, you know, with
a lot of companies, like a Harvey,
311
00:17:48,415 --> 00:17:51,264
it's based on the promise of what
you can build in entering markets.
312
00:17:51,625 --> 00:17:55,440
Clio has an extremely established customer
base that is very loyal, that has
313
00:17:55,919 --> 00:17:59,760
long, like, lifetime value; they
have a longer lifetime value.
314
00:18:00,090 --> 00:18:04,080
So now they have to prove that
they can sell new products also
315
00:18:04,080 --> 00:18:05,610
into the existing customer base.
316
00:18:05,610 --> 00:18:09,570
We can't open sort of the, you know,
any feed on any social media or newsfeed
317
00:18:09,570 --> 00:18:11,580
and not see AI in the headlines, right?
318
00:18:11,580 --> 00:18:13,710
So it's a huge, huge area right now.
319
00:18:13,710 --> 00:18:14,370
Lots of hype.
320
00:18:14,430 --> 00:18:16,500
Um, it could be getting a
little bit frothy, right?
321
00:18:16,500 --> 00:18:18,870
People are kind of throwing
money at it so fast and maybe
322
00:18:18,870 --> 00:18:21,570
not really looking carefully or
critically at the business model.
323
00:18:21,895 --> 00:18:23,455
Or what problem is this trying to solve?
324
00:18:23,485 --> 00:18:27,235
Or does this technology or this product
actually even do what it claims to do?
325
00:18:27,535 --> 00:18:31,075
So, uh, you know, again, having
been a founder, a technologist, a
326
00:18:31,075 --> 00:18:34,015
product person, I'm always really
interested in what the product does.
327
00:18:34,045 --> 00:18:35,395
Does it actually do these things?
328
00:18:35,395 --> 00:18:36,925
Is it on track to do these things?
329
00:18:37,225 --> 00:18:38,305
Do you have the right team?
330
00:18:38,665 --> 00:18:40,045
Uh, how are they executing?
331
00:18:40,045 --> 00:18:41,395
What's their experience as well?
332
00:18:41,665 --> 00:18:44,360
Um, but definitely, I mean,
huge opportunities in ai.
333
00:18:44,400 --> 00:18:48,565
I am very bullish on sort of AI and
AI in legal tech and AI in reg tech.
334
00:18:48,565 --> 00:18:51,265
And I know I've written and
spoken on those topics too.
335
00:18:51,689 --> 00:18:55,590
I would say that legal, uh, like a lot
of professional sectors, especially
336
00:18:55,590 --> 00:19:00,540
a lot of the kind of uninteresting
or boring back office functions still
337
00:19:00,540 --> 00:19:04,470
lend themselves to a lot of good old
fashioned automation that may or may not
338
00:19:04,470 --> 00:19:08,909
necessarily have to enable the user to
utilize AI in the course of doing that.
339
00:19:08,909 --> 00:19:09,179
Right.
340
00:19:09,179 --> 00:19:12,570
So there's a lot of tasks that we can
think of in legal, a lot of use cases.
341
00:19:13,110 --> 00:19:16,950
That simply haven't been automated yet,
or haven't been automated well, and
342
00:19:16,950 --> 00:19:18,690
so there's still a lot of opportunity.
343
00:19:18,690 --> 00:19:22,620
So I would tend to agree with that vc,
that there are opportunities out there
344
00:19:22,620 --> 00:19:28,140
for technology that maybe isn't AI-heavy
or AI-centric, but yet it's automating a prior
345
00:19:28,140 --> 00:19:32,820
process that was dated or clunky and
needs, needs refinement and efficiency.
346
00:19:32,820 --> 00:19:34,500
And there's still a lot of
opportunities like that.
347
00:19:34,530 --> 00:19:38,040
There are lots of smart people and
innovative people in law firms,
348
00:19:38,400 --> 00:19:40,260
but the sum total of the model,
349
00:19:40,770 --> 00:19:43,500
the legacy comp structure, the way
the money flows, they're all
350
00:19:43,500 --> 00:19:44,700
pass-through entities, right?
351
00:19:44,700 --> 00:19:46,950
Like the way all of that works
just makes it very hard to
352
00:19:46,950 --> 00:19:48,480
actually do anything about it.
353
00:19:48,900 --> 00:19:51,330
But what do you do in a fixed
price scenario where you
354
00:19:51,330 --> 00:19:52,380
can make it up in volume?
355
00:19:52,410 --> 00:19:53,700
It's a portfolio play.
356
00:19:53,910 --> 00:19:56,550
Like if I do a hundred projects
and I win some, I lose some.
357
00:19:56,550 --> 00:19:58,920
If I net okay, then okay.
358
00:19:59,430 --> 00:19:59,670
Right?
359
00:19:59,730 --> 00:20:05,670
Or if it's like one large consulting
project, but the client can't seem to lock
360
00:20:05,670 --> 00:20:08,550
in on scope, then how could you commit
361
00:20:09,000 --> 00:20:10,380
to a do-not-exceed?
362
00:20:10,385 --> 00:20:14,430
And, and this is exactly what
plays out in legal work. Some
363
00:20:14,430 --> 00:20:21,300
practice areas, some matter types
are more easily boxed into a scope.
364
00:20:21,480 --> 00:20:25,655
But if you can get the planets to
align with that stuff, then the
365
00:20:26,190 --> 00:20:30,450
margin opportunity or challenge,
depending on how you look at it,
366
00:20:30,780 --> 00:20:32,070
um, is in the hands of the firm.
367
00:20:32,160 --> 00:20:36,420
And if they can reduce their
cost and charge the same,
368
00:20:37,110 --> 00:20:38,100
they make more money.
369
00:20:38,250 --> 00:20:40,139
Like that's the simple
economic part of it.
370
00:20:40,200 --> 00:20:44,970
It's, like, easier said than done, but
sometimes I think the, um, especially
371
00:20:44,970 --> 00:20:49,379
with AI fueled automation and efficiency
that's being touted right now, we
372
00:20:49,379 --> 00:20:54,179
can't forget that if we value the
input of time, like that's how we get
373
00:20:54,179 --> 00:20:57,210
paid and then we reduce the time, like
quite obviously that's not gonna work.
374
00:20:57,210 --> 00:21:00,540
So you have to look at all
four Ps of product management
375
00:21:00,540 --> 00:21:01,530
when you're dealing with this.
376
00:21:01,530 --> 00:21:05,490
And product management as a discipline
is not something a lot of firms have
377
00:21:06,030 --> 00:21:10,470
very deeply ingrained in their ethos.
But certainly there are a lot of
378
00:21:10,470 --> 00:21:12,510
pricing people who understand this.
379
00:21:12,930 --> 00:21:16,260
But pricing is one of those services
just kind of almost like innovation
380
00:21:16,320 --> 00:21:20,280
that is very difficult to scale across
all the partners, all the clients.
381
00:21:20,280 --> 00:21:24,240
In the same way, if you're engaging
in some kind of legal dispute or legal
382
00:21:24,240 --> 00:21:26,550
situation, you have a pretty serious
383
00:21:26,895 --> 00:21:28,545
thing that you're trying
to work through, right?
384
00:21:28,545 --> 00:21:32,625
So when you talk about the risk
profile of something being wrong,
385
00:21:33,225 --> 00:21:39,255
it's much scarier how it could affect
people's lives in a legal sphere or
386
00:21:39,255 --> 00:21:43,185
like a medical sphere or something
like that, versus apartment hunting or
387
00:21:43,185 --> 00:21:44,775
planning a trip or things like that.
388
00:21:44,775 --> 00:21:48,975
And now this is all coming from someone
who's, like, very AI-positive and very much
389
00:21:49,305 --> 00:21:50,895
like a pro, a pro-AI person.
390
00:21:50,895 --> 00:21:52,725
I mean, obviously I have a show about it.
391
00:21:52,725 --> 00:21:55,635
I have a company about it, so I'm
like super, super excited about
392
00:21:55,635 --> 00:21:58,125
the potential, the same way you
are about how it can help humans
393
00:21:58,125 --> 00:21:59,985
become better at what they're doing.
394
00:22:00,315 --> 00:22:05,835
But I do think the biggest risk I
see about this idea of just pointing
395
00:22:05,835 --> 00:22:09,735
at all this data, you know, having
people who frankly don't have
396
00:22:10,635 --> 00:22:14,085
the depth of knowledge, either in
the legal sphere or in the tech sphere,
397
00:22:14,085 --> 00:22:16,455
to understand what's coming back.
398
00:22:16,455 --> 00:22:17,385
Is it good, is it bad?
399
00:22:17,385 --> 00:22:18,465
I mean, this is really hard.
400
00:22:18,465 --> 00:22:19,695
Benchmarking is really hard.
401
00:22:19,695 --> 00:22:23,415
We can talk about that too, because
we could end up as a community
402
00:22:23,745 --> 00:22:27,315
destroying any possibility we have
of having these tools be helpful
403
00:22:27,315 --> 00:22:28,785
before they even get out of the gate.
404
00:22:28,815 --> 00:22:29,730
And so I'm probably not
405
00:22:30,155 --> 00:22:35,225
surprising anybody listening to
this call that the judiciary is not
406
00:22:35,225 --> 00:22:38,225
necessarily, or the people involved in
the courts and things of that nature
407
00:22:38,225 --> 00:22:41,075
aren't necessarily the most technically
advanced people on earth, you know?
408
00:22:41,105 --> 00:22:41,435
Right.
409
00:22:41,524 --> 00:22:43,504
They just are and it, it's, it's okay.
410
00:22:43,504 --> 00:22:45,305
And that's not necessarily their job.
411
00:22:45,725 --> 00:22:48,785
But if you can see, and we
saw it with hallucinations.
412
00:22:48,815 --> 00:22:52,445
If we create noise and we create
413
00:22:53,085 --> 00:22:56,085
situations where people are causing
themselves, like we said, or the
414
00:22:56,085 --> 00:22:59,880
system, more harm than good, we
could end up getting shut down.
415
00:23:00,780 --> 00:23:03,449
You know, regulated to a,
a point. Where we're at:
416
00:23:03,449 --> 00:23:07,980
we took, uh, quite a bit of time to
test the tools and to roll out in
417
00:23:07,980 --> 00:23:10,169
a way that we felt was appropriate.
418
00:23:10,199 --> 00:23:13,470
And we actually added a lot of
requirements around giving people
419
00:23:13,470 --> 00:23:16,379
access to our first gen AI tool.
420
00:23:16,830 --> 00:23:22,139
Um, we required a CLE, an hour-long CLE,
on the actual technology and the ethical
421
00:23:22,139 --> 00:23:24,060
obligations because this is very new.
422
00:23:24,060 --> 00:23:25,470
I mean, this is now last fall, right?
423
00:23:25,470 --> 00:23:26,669
So it seems like ages ago.
424
00:23:27,130 --> 00:23:28,780
It was still very new for a lot of people.
425
00:23:28,780 --> 00:23:32,440
People hadn't necessarily heard about
prompting and, you know, context
426
00:23:32,440 --> 00:23:34,330
windows and vectorization, et cetera.
427
00:23:34,330 --> 00:23:37,510
So we wanted to make sure that they
understood what this was so they can also
428
00:23:37,510 --> 00:23:39,190
understand what it can and cannot do.
429
00:23:39,580 --> 00:23:41,650
And then the ethical obligations
of course, are you need to
430
00:23:41,650 --> 00:23:42,970
have technical competency.
431
00:23:42,970 --> 00:23:45,670
You need to be able to explain to
your client like what you're
432
00:23:45,670 --> 00:23:46,690
doing and what you're using.
433
00:23:46,690 --> 00:23:49,750
So this is all just part of that
education that we're trying to
434
00:23:50,155 --> 00:23:54,385
make part of our attorneys' lives, and
we also require that they accept our
435
00:23:54,385 --> 00:23:56,665
policy on the acceptable use of gen AI.
436
00:23:56,665 --> 00:24:00,115
And then on top of that, whenever
there's a new gen AI focused
437
00:24:00,115 --> 00:24:01,080
tool, we require a training
438
00:24:01,830 --> 00:24:02,940
on that particular tool.
439
00:24:02,940 --> 00:24:06,450
So we wanted to make sure that the
education was there, that people
440
00:24:06,450 --> 00:24:09,690
are becoming more and more familiar
with what this is and isn't.
441
00:24:10,050 --> 00:24:13,080
And that continues to be our
goal going forward and, and
442
00:24:13,080 --> 00:24:14,400
our requirement going forward.
443
00:24:14,730 --> 00:24:17,550
So that was partly what we discussed
at Skills because I think at that
444
00:24:17,550 --> 00:24:20,730
point, not that many firms had
tried to roll out anything, um,
445
00:24:20,820 --> 00:24:24,780
with that much of a comprehensive
plan and onboarding requirements.
446
00:24:25,080 --> 00:24:27,690
And we thought it seemed like that
was helpful for people to hear.
447
00:24:28,545 --> 00:24:31,605
So the first thing we tried to
do was just let the frontier
448
00:24:31,605 --> 00:24:34,635
models try to create a jury.
449
00:24:34,635 --> 00:24:38,475
So we said, create for us a jury
pool that is similar to what a
450
00:24:38,475 --> 00:24:42,765
federal jury pool would be, and
that's where Michael Scott emerged.
451
00:24:42,825 --> 00:24:44,655
It was, it was really hilarious.
452
00:24:44,685 --> 00:24:48,180
Uh, they would, they would output the
demographics of the jurors, so it was,
453
00:24:48,774 --> 00:24:53,514
a white man in his mid-forties, who is
the manager of a mid-size paper firm in
454
00:24:53,514 --> 00:24:58,465
Scranton, Pennsylvania, which you and I
would obviously know is Michael Scott.
455
00:24:58,524 --> 00:25:01,735
Michael Scott is not a real
person, let alone the real
456
00:25:01,735 --> 00:25:04,014
juror in the federal jury pool.
457
00:25:04,014 --> 00:25:04,225
Right.
458
00:25:04,225 --> 00:25:09,055
We also had a lot of other interesting
combinations of, there was a 90-year-old
459
00:25:09,055 --> 00:25:11,395
woman who was a part-time botanist,
460
00:25:11,395 --> 00:25:12,535
part-time DJ.
461
00:25:12,540 --> 00:25:12,580
DJ.
462
00:25:13,435 --> 00:25:13,764
I love that one.
463
00:25:13,764 --> 00:25:15,595
We had an abolitionist
464
00:25:15,595 --> 00:25:16,615
podcaster.
465
00:25:16,615 --> 00:25:16,675
Yeah.
466
00:25:16,735 --> 00:25:18,295
So it seemed like, when
467
00:25:18,705 --> 00:25:20,805
these platforms were left
to their own devices,
468
00:25:20,805 --> 00:25:25,035
they were generating jurors
that were more for show, kind of
469
00:25:25,035 --> 00:25:27,045
eye-catching types of backgrounds
470
00:25:27,255 --> 00:25:31,485
that really didn't reflect what
we needed for our purposes,
471
00:25:31,485 --> 00:25:36,195
what real people on a jury would
actually look like demographically.
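(One way to keep generated jurors tethered to reality, implied by this experiment: sample the demographics in code from fixed distributions and let the model write only the narrative details. A toy sketch; the categories and weights are made up for illustration, not real census or jury-pool data.)

```python
import random

random.seed(7)  # reproducible demo

def sample_juror():
    # Demographics come from explicit weights, not from the model,
    # so it cannot invent a Michael Scott.
    return {
        "age": random.randint(18, 80),
        "gender": random.choices(["female", "male"], weights=[51, 49])[0],
        "education": random.choices(
            ["high school", "some college", "bachelor's", "graduate"],
            weights=[27, 29, 28, 16])[0],
        "employment": random.choices(
            ["employed", "retired", "unemployed", "student"],
            weights=[60, 22, 10, 8])[0],
    }

for juror in (sample_juror() for _ in range(3)):
    print(juror)
    # The model would then be prompted only to narrate these fixed facts,
    # e.g. "Write a short background for a juror who is {age}, {gender}, ..."
```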
472
00:25:36,960 --> 00:25:40,830
And then you can tell that, you
know, there's a kid in, you know,
473
00:25:41,129 --> 00:25:44,550
Washington using them right now
to study, who's 12 years old and
474
00:25:44,550 --> 00:25:45,900
maybe using it for creative writing.
475
00:25:45,900 --> 00:25:48,420
So, you know, there's a big range
of reasons why people are using these
476
00:25:48,420 --> 00:25:51,780
tools and they, you know, have the dial
on certain types of representation,
477
00:25:51,780 --> 00:25:54,540
which could be very useful obviously
in a creative writing context.
478
00:25:54,540 --> 00:25:56,430
But in ours that was,
you know, catastrophic.
479
00:25:56,875 --> 00:25:58,345
Because it wasn't
representing reality.
480
00:25:58,615 --> 00:26:00,895
Thanks for listening to
Legal Innovation Spotlight.
481
00:26:01,405 --> 00:26:04,915
If you found value in this chat, hit
the subscribe button to be notified
482
00:26:04,915 --> 00:26:06,385
when we release new episodes.
483
00:26:06,895 --> 00:26:09,565
We'd also really appreciate it if
you could take a moment to rate
484
00:26:09,565 --> 00:26:12,235
us and leave us a review wherever
you're listening right now.
485
00:26:12,805 --> 00:26:15,275
Your feedback helps us provide
you with top-notch content.
00:00:04,770
On November 30th of last year when
they, when OpenAI released their
2
00:00:04,770 --> 00:00:09,660
demo, it was probably the greatest
demo of all time when it comes
3
00:00:09,660 --> 00:00:11,670
to like technology introductions.
4
00:00:12,420 --> 00:00:15,720
Like you saw that and it didn't
matter what it was saying, right?
5
00:00:15,720 --> 00:00:19,440
Because we know now that half the stuff
it was saying was made up or some high
6
00:00:19,495 --> 00:00:21,420
percentage was hallucinating at the time.
7
00:00:21,960 --> 00:00:22,050
Right.
8
00:00:22,050 --> 00:00:25,560
But it was such a good demo that like
anybody could see it and understand it.
9
00:00:25,560 --> 00:00:27,510
You didn't need to be
technically inclined.
10
00:00:28,290 --> 00:00:31,740
To think about how that could affect
your world, your role, et cetera.
11
00:00:31,800 --> 00:00:35,550
I go back to the quote that Bill Gates
has from years ago, and he basically
12
00:00:35,550 --> 00:00:40,050
says, people overestimate what happens
in one year with new technology, but
13
00:00:40,050 --> 00:00:41,700
they underestimate what happens in 10.
14
00:00:41,880 --> 00:00:45,390
And I think that although it
was a spectacular demo, we're
15
00:00:45,390 --> 00:00:47,640
somewhere on that scale right now.
16
00:00:47,640 --> 00:00:50,820
My guess is the, the, the stuff
for this really to start thinking
17
00:00:50,820 --> 00:00:52,020
about legal in a meaningful way.
18
00:00:52,020 --> 00:00:55,710
We're probably still three years away,
two years away, but I think that, um.
19
00:00:56,385 --> 00:00:59,864
If you ask me why the demo, why
everybody's paying attention, I think
20
00:00:59,864 --> 00:01:02,894
it's 'cause the demo was so good and it's
great that people are paying attention
21
00:01:02,894 --> 00:01:06,179
to this because it's, it's probably gonna
propel a lot of technology adoption.
22
00:01:06,915 --> 00:01:08,715
You mentioned, uh, copilot.
23
00:01:09,045 --> 00:01:12,735
That's part of our generative AI strategy,
which, um, you know, we've had the
24
00:01:12,735 --> 00:01:16,935
pleasure and knowledge management to
help, uh, define what our generative
25
00:01:16,935 --> 00:01:18,855
AI strategy will be for the firm.
26
00:01:18,945 --> 00:01:23,115
And a big part of that is embracing
copilot, eventually going to just
27
00:01:23,115 --> 00:01:27,045
be table stakes, uh, for many of
the law firms that are out there.
28
00:01:27,045 --> 00:01:31,305
But given our position with Microsoft,
it makes complete sense, almost
29
00:01:31,305 --> 00:01:34,965
self-evident that that's something
that we need to, uh, embrace.
30
00:01:35,335 --> 00:01:37,825
And explore and do that
as quickly as we can.
31
00:01:38,155 --> 00:01:43,285
But, you know, we, we we're big power
BI users for data visualization.
32
00:01:43,285 --> 00:01:45,775
That's both internally as
well as with our clients.
33
00:01:45,805 --> 00:01:51,235
Uh, our clients have found that to be
incredibly helpful in, um, representing
34
00:01:51,235 --> 00:01:53,365
a lot of their content and their data.
35
00:01:53,755 --> 00:01:56,665
Uh, but it's also helpful in
things like matter management.
36
00:01:57,205 --> 00:02:00,804
Uh, in creating dashboards on and
making sure that we're running our
37
00:02:00,804 --> 00:02:04,945
matters, especially our portfolio
accounts very profitably, uh,
38
00:02:04,945 --> 00:02:06,640
and keeping a close eye on, um.
39
00:02:07,620 --> 00:02:08,699
On those things.
40
00:02:08,699 --> 00:02:12,240
And then, um, also, again, I mentioned,
you know, with our data analytics
41
00:02:12,240 --> 00:02:15,510
capability, looking at our internal
data and supplementing that with
42
00:02:15,510 --> 00:02:19,260
a lot of the publicly available
content or data that's available, we
43
00:02:19,260 --> 00:02:20,970
still don't have the right answers.
44
00:02:21,060 --> 00:02:24,420
Should it be the big tech that we're
not talking about art industry, that
45
00:02:24,420 --> 00:02:28,200
we're talking about big, big tech
like Meta and Google and saying,
46
00:02:28,380 --> 00:02:31,380
should they be the keepers of the
big foundational models and just.
47
00:02:31,935 --> 00:02:37,575
Keep creating these small layers on top
of it, or should actually many startups be
48
00:02:37,575 --> 00:02:43,755
funded and have a go at creating smaller
foundation models for specific cases.
49
00:02:44,085 --> 00:02:47,954
Not that much more different
than previous tech ways.
50
00:02:47,954 --> 00:02:54,089
Ted, when we had mobile apps, when we had
cloud, or when we had SaaS, all of this.
51
00:02:54,675 --> 00:02:59,055
Um, all of these tech waves have
followed very similar patterns, right?
52
00:02:59,325 --> 00:03:01,425
The VC community gets super excited.
53
00:03:01,665 --> 00:03:04,005
The tech community gets super excited.
54
00:03:04,035 --> 00:03:05,955
'cause building tech is very easy now.
55
00:03:06,315 --> 00:03:10,455
Lots of money is funneled into it, and
then there comes a time when things
56
00:03:10,455 --> 00:03:15,495
just sell down and you realize, well,
some of it was smoke and some of it was.
57
00:03:18,270 --> 00:03:23,460
Citation you just made, um, is a little
bit more scary because now we're talking
58
00:03:23,460 --> 00:03:28,320
about a very significant player in our
industry, um, that's being questioned.
59
00:03:28,320 --> 00:03:30,630
Like, okay, how much of this was real?
60
00:03:30,900 --> 00:03:33,840
There's also, not to get super
technical on the call, but.
61
00:03:34,260 --> 00:03:38,220
There's also a difference between
precision and recall, right?
62
00:03:38,220 --> 00:03:43,170
And I think some of the Stanford paper
was getting into the details of, yes,
63
00:03:43,230 --> 00:03:46,200
are you over-engineering on one site?
64
00:03:46,200 --> 00:03:51,180
So that it only gives us, um, you
know, it eliminates, uh, false
65
00:03:51,180 --> 00:03:54,390
negatives to the point that we
don't have enough false positives.
66
00:03:54,390 --> 00:03:56,339
So there's a very interesting.
67
00:03:56,820 --> 00:04:01,620
Um, uh, you know, sort of
deep delve into this space.
68
00:04:01,830 --> 00:04:05,460
But yeah, for now I would say
we're definitely in the space
69
00:04:05,460 --> 00:04:09,330
of more of a reality check, uh,
which is a good place to be.
70
00:04:09,330 --> 00:04:10,410
I think it is.
71
00:04:10,710 --> 00:04:15,150
You know, it's the place frankly,
where those of us in my role in in
72
00:04:15,360 --> 00:04:19,050
firms have a better place to position
these products to our lawyers.
73
00:04:19,589 --> 00:04:20,279
It's here.
74
00:04:20,430 --> 00:04:23,880
These are the things that can do, here
are the things it doesn't do very well.
75
00:04:24,150 --> 00:04:25,530
Let's use it sensibly.
76
00:04:25,530 --> 00:04:28,469
Let's use it safely,
and so on and so forth.
77
00:04:28,469 --> 00:04:31,860
So it just makes it more
palatable, Ted, also.
78
00:04:31,860 --> 00:04:34,409
Yeah, you don't want
something to be too perfect.
79
00:04:34,409 --> 00:04:37,050
'cause that for sure is
a recipe for disaster.
80
00:04:37,140 --> 00:04:38,730
I think we're sliding into a trough.
81
00:04:38,730 --> 00:04:44,159
I, I hate to be not optimistic, but, you
know, vendors have over promised, there's
82
00:04:44,159 --> 00:04:46,200
still confusion about what the tech.
83
00:04:46,710 --> 00:04:48,600
Can and should do.
84
00:04:49,200 --> 00:04:53,460
I think people are sliding into
the classic trust issues that mark
85
00:04:53,460 --> 00:04:56,700
the disillusionment part of the
cycle, and I mean, it makes sense.
86
00:04:56,700 --> 00:05:00,570
You don't have applications
right now that are actually.
87
00:05:00,645 --> 00:05:04,815
It's the right application
of the underlying technology.
88
00:05:04,875 --> 00:05:08,835
I mean, even rag, if you have an
LLM that's a statistical model of
89
00:05:08,835 --> 00:05:12,525
language, not a knowledge base, and
you're trying to stick a knowledge
90
00:05:12,525 --> 00:05:18,585
base on it, and you have a generalized
retrieval process with chunking, that
91
00:05:18,585 --> 00:05:20,865
might just be for any particular.
92
00:05:21,465 --> 00:05:22,815
Uh, type of query.
93
00:05:22,815 --> 00:05:26,775
And then you have lawyers querying
in lawyer phrases and suddenly
94
00:05:26,775 --> 00:05:29,594
the chunking's not quite right
and it's all not working well.
95
00:05:29,594 --> 00:05:31,365
It's kind of, of course not, right?
96
00:05:31,365 --> 00:05:36,135
So I think we're waiting for new model
architectures changes to rag, like
97
00:05:36,135 --> 00:05:38,205
using agents that are gonna improve.
98
00:05:38,534 --> 00:05:38,745
Its.
99
00:05:39,385 --> 00:05:44,425
Current, uh, drawbacks and of
course hallucination fixes that.
100
00:05:44,425 --> 00:05:47,305
Who knows how those are gonna
happen before we're gonna be
101
00:05:47,305 --> 00:05:48,685
climbing out of this trough.
102
00:05:48,685 --> 00:05:52,285
And the use cases that people end
up using for now are gonna be much
103
00:05:52,285 --> 00:05:55,285
more limited, I think, until we
solve a lot of those technical issues
104
00:05:55,285 --> 00:06:02,425
that GPT and other ai like it are
actually pretty good at drafting.
105
00:06:03,265 --> 00:06:08,095
Um, very short legal
provisions, definitions.
106
00:06:08,640 --> 00:06:12,719
I, I don't know that I would trust
it to draft an exculpation provision,
107
00:06:12,780 --> 00:06:18,450
but I might ask it to draft a
definition of x whatever, whatever
108
00:06:18,450 --> 00:06:19,650
you wanna fill the blank in with.
109
00:06:19,650 --> 00:06:19,920
Right.
110
00:06:20,190 --> 00:06:23,310
Um, and I think there are some products,
I haven't looked at some of these,
111
00:06:23,310 --> 00:06:28,170
um, drafting assistance recently,
but I would expect that they would
112
00:06:28,170 --> 00:06:31,020
be starting to build in that kind
of functionality if they haven't had
113
00:06:31,020 --> 00:06:33,810
it for while already for practice.
114
00:06:36,780 --> 00:06:40,409
The thing that I think a lot of people are
hoping AI generative AI will be able to
115
00:06:40,409 --> 00:06:47,460
do is write that first version of a draft
so that firms don't need to continue to
116
00:06:47,460 --> 00:06:51,150
maintain form banks or precedent banks.
117
00:06:51,210 --> 00:06:53,520
The AI will just figure it out.
118
00:06:54,390 --> 00:06:55,804
I don't think we're there.
119
00:06:56,580 --> 00:06:59,460
In the term, I, I think
it will really struggle.
120
00:06:59,460 --> 00:07:03,180
I think, and I'm not an expert in ai,
but my understanding is that some of
121
00:07:03,180 --> 00:07:08,160
these have page limitations on the
kinds of documents they can ingest and
122
00:07:08,160 --> 00:07:09,810
the kinds of documents they can create.
123
00:07:10,230 --> 00:07:18,180
Uh, and the, there's a, a fair number
of very commonly prepared documents
124
00:07:18,630 --> 00:07:20,520
that run into the hundreds of pages.
125
00:07:20,790 --> 00:07:24,180
Also, there's a lot of,
um, interdependence.
126
00:07:24,750 --> 00:07:27,690
Among documents, uh, in certain practices.
127
00:07:27,840 --> 00:07:30,780
For example, in the investment
management practice funds practices,
128
00:07:30,990 --> 00:07:37,440
you'll have fund documents that are
very in interdependent and have what
129
00:07:37,440 --> 00:07:40,470
should be nearly identical provisions.
130
00:07:40,560 --> 00:07:45,210
And if there's a hallucination between
the expense section in a disclosure
131
00:07:45,210 --> 00:07:49,170
document versus an expense section
in an investment management agreement
132
00:07:49,170 --> 00:07:52,590
or a limited partnership agreement, or
you know, the list keeps going, right?
133
00:07:52,740 --> 00:07:53,640
That
134
00:07:54,060 --> 00:07:56,580
is a malpractice claim, right?
135
00:07:56,789 --> 00:08:02,310
So I think there'll
be some very narrow use cases
136
00:08:02,310 --> 00:08:03,930
for AI when it comes to drafting.
137
00:08:04,935 --> 00:08:09,135
For now, but who knows what this
landscape looks like in 10 years?
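One mechanical guardrail against that interdependence risk is to diff the provisions that are supposed to be nearly identical across the document set, rather than trusting any generated text. A sketch, assuming the sections have already been extracted; real fund documents would need proper section parsing, not a dict of strings:

```python
import difflib

def compare_provisions(docs: dict[str, str]) -> None:
    """Pairwise-diff the same provision pulled from several documents."""
    names = list(docs)
    base = names[0]
    for other in names[1:]:
        diff = list(difflib.unified_diff(
            docs[base].splitlines(), docs[other].splitlines(),
            fromfile=base, tofile=other, lineterm=""))
        if diff:  # any surviving diff is a drafting (or hallucination) flag
            print(f"MISMATCH between {base} and {other}:")
            print("\n".join(diff))

compare_provisions({
    "disclosure_document": "The Fund bears all organizational expenses.",
    "limited_partnership_agreement":
        "The Fund bears all organizational and offering expenses.",
})
```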
138
00:08:09,375 --> 00:08:12,705
I started off reading that
first case, Mata versus Avianca.
139
00:08:12,705 --> 00:08:15,285
But then, you know, there was
another case a couple months later
140
00:08:15,285 --> 00:08:19,395
and another case, and right now by
my tally, and I'll explain how
141
00:08:19,395 --> 00:08:20,835
others are finding other cases.
142
00:08:20,835 --> 00:08:24,825
I think I have 14 cases in which
lawyers have gotten in trouble for
143
00:08:24,825 --> 00:08:29,715
using AI without checking and verifying
the cites. And the cases call
144
00:08:29,715 --> 00:08:31,875
them hallucinated cases, fictitious cases.
145
00:08:32,400 --> 00:08:35,880
The most recent case called them
phantom cases, fake cases.
146
00:08:35,880 --> 00:08:38,909
So if anybody out there is
trying to research these cases,
147
00:08:39,150 --> 00:08:40,470
use all of those synonyms.
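The verification step those filings skipped is mechanizable in outline. A sketch: pull every reporter citation out of a draft and confirm each one against a real case-law source before filing. The regex is deliberately crude, and lookup_case is a hypothetical stand-in for whatever citator the firm actually uses (Westlaw, Lexis, CourtListener, and so on):

```python
import re

# Matches a few common reporter formats, e.g. "410 U.S. 113", "598 F.3d 1336".
CITATION = re.compile(r"\d+ (?:U\.S\.|S\. ?Ct\.|F\.[23]d|F\. Supp\. [23]d) \d+")

def lookup_case(citation: str) -> bool:
    """Hypothetical: True iff the citation resolves to a real, published case."""
    raise NotImplementedError("wire this to your citator of choice")

def unverified_citations(draft: str) -> list[str]:
    """Return every citation in the draft that could not be confirmed."""
    return [c for c in CITATION.findall(draft) if not lookup_case(c)]

# Anything this returns is a phantom, fictitious, or hallucinated case until
# a human confirms otherwise: exactly the check the sanctioned filings lacked.
```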
148
00:08:41,340 --> 00:08:46,710
But then what's also shocking,
or I think surprising and alarming, is
149
00:08:46,710 --> 00:08:51,000
that pro se litigants, litigants who are
representing themselves without lawyers,
150
00:08:51,000 --> 00:08:55,439
you know, a lot of people are saying AI
is great for access to justice and
151
00:08:55,439 --> 00:08:57,300
people not needing to hire a lawyer.
152
00:08:58,155 --> 00:09:03,765
Pro se litigants, at least 12 by my count,
have also submitted court filings
153
00:09:03,765 --> 00:09:09,465
either complaints or pleadings or briefs,
and that is causing a burden on the
154
00:09:09,465 --> 00:09:12,795
court personnel and opposing counsel,
155
00:09:13,500 --> 00:09:17,370
to research those cases, spend time
figuring out that the cases don't
156
00:09:17,370 --> 00:09:22,020
exist, pointing them out to the pro
se litigant, and then the judge.
157
00:09:22,050 --> 00:09:26,430
Those cases say that the courts exercise
what they call special solicitude, or
158
00:09:26,430 --> 00:09:30,209
they're a little lenient on litigants
who don't have lawyers, but they
159
00:09:30,209 --> 00:09:33,959
have to remind them: Hey, you can't
do this. If you do this again, we're
160
00:09:33,959 --> 00:09:36,209
gonna consider imposing sanctions.
161
00:09:36,209 --> 00:09:39,390
And some of the courts have
imposed pretty significant
162
00:09:39,390 --> 00:09:41,250
sanctions on even pro se litigants.
163
00:09:41,775 --> 00:09:43,755
And then I'll tell you kind
of two other categories.
164
00:09:43,814 --> 00:09:46,365
One law firm just keeps doubling down.
165
00:09:46,365 --> 00:09:50,775
It's a law firm filing
cases in New York against the New York
166
00:09:50,775 --> 00:09:55,425
Department of Education, and they've
won the main case, and they're
167
00:09:55,425 --> 00:09:59,444
entitled to their attorney's fees under
the statute, but they keep using
168
00:09:59,444 --> 00:10:04,545
ChatGPT to calculate their fee requests,
or to support their fee requests.
169
00:10:04,814 --> 00:10:06,105
And they've done this eight times.
170
00:10:06,975 --> 00:10:11,324
Eight times the judges, different
judges in New York, but different
171
00:10:11,324 --> 00:10:16,845
judges have said, we're not accepting
this fee request based on
172
00:10:17,204 --> 00:10:23,714
ChatGPT's calculations, because in ChatGPT's
current state, it's not reliable as
173
00:10:23,714 --> 00:10:25,755
a source for this information.
174
00:10:25,845 --> 00:10:29,145
I just wanted to play devil's
advocate as to why you think
175
00:10:29,204 --> 00:10:32,890
they're not ready, these agents, to
do the things that are high risk.
176
00:10:33,600 --> 00:10:34,290
High risk.
177
00:10:34,680 --> 00:10:39,030
You have to kind of treat
it like a junior associate.
178
00:10:39,030 --> 00:10:40,980
Like this stuff needs eyes on.
179
00:10:40,980 --> 00:10:45,569
And I think, pretty much in most
respects, even if it's not high
180
00:10:45,569 --> 00:10:48,810
risk, if you're
gonna be repeating anything that you
181
00:10:48,810 --> 00:10:52,949
get out of AI, you should probably
make sure that it's actually true.
182
00:10:53,430 --> 00:10:57,840
Even, you know, facts about the news or
183
00:10:57,840 --> 00:11:01,050
this or that or the other, like,
you know, this is not perfect.
184
00:11:01,050 --> 00:11:02,340
It is getting data
185
00:11:03,660 --> 00:11:07,740
that it's been trained on, and the
training data may not be correct.
186
00:11:07,770 --> 00:11:12,120
The people that are creating the
agents, they have bias.
187
00:11:12,329 --> 00:11:15,449
You know, you don't have
any transparency into how these
188
00:11:15,449 --> 00:11:17,610
are created or anything like that.
189
00:11:17,610 --> 00:11:22,140
So, like, we do a lot
of AI solutions and I would never say,
190
00:11:22,170 --> 00:11:23,640
all right, yeah, just send this out.
191
00:11:24,000 --> 00:11:27,209
It's like, you know, when we create
something for our clients,
192
00:11:27,270 --> 00:11:30,810
we proof it and then we make sure
that they proof it, you know.
193
00:11:31,590 --> 00:11:34,800
This is not a person, this is a machine.
194
00:11:35,220 --> 00:11:38,460
It created this,
so you have to check it. But these tools are real.
195
00:11:38,520 --> 00:11:39,780
I mean they're very effective.
196
00:11:39,780 --> 00:11:40,740
They save a lot of time.
197
00:11:40,740 --> 00:11:43,950
Like we do production
request responses.
198
00:11:44,010 --> 00:11:47,610
We have a tool that does this for
our clients and it writes as the
199
00:11:47,610 --> 00:11:51,150
attorneys write, and it has the same
format, looks exactly like theirs.
200
00:11:51,480 --> 00:11:54,240
So we'll create a production
request response for
201
00:11:54,300 --> 00:11:55,500
the attorneys to start with.
202
00:11:55,890 --> 00:11:58,830
So it just saves them a lot of time;
just even creating that saves them, like,
203
00:11:59,685 --> 00:12:04,485
days, and provides sample arguments,
but you know, I would never say
204
00:12:04,485 --> 00:12:07,785
just send that out. Like,
you know, it'll take them an hour
205
00:12:07,785 --> 00:12:09,074
instead of two days to do something.
206
00:12:09,074 --> 00:12:09,915
I think that's great.
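A hedged sketch of the kind of tool being described: draft a response to a production request in the firm's own voice by showing the model prior request/response pairs (few-shot), then hand the draft to an attorney. The client library, model name, and prompt wording are illustrative assumptions, not the speaker's actual implementation:

```python
from openai import OpenAI

client = OpenAI()

def draft_response(request_text: str,
                   prior_examples: list[tuple[str, str]]) -> str:
    """Few-shot drafting: prior (request, response) pairs set style and format."""
    messages = [{
        "role": "system",
        "content": "Draft discovery responses in the exact format and register "
                   "of the examples. Mark the output DRAFT - ATTORNEY REVIEW "
                   "REQUIRED.",
    }]
    for request, response in prior_examples:
        messages.append({"role": "user", "content": request})
        messages.append({"role": "assistant", "content": response})
    messages.append({"role": "user", "content": request_text})
    out = client.chat.completions.create(model="gpt-4o", messages=messages)
    return out.choices[0].message.content

# The hour-instead-of-two-days saving comes from the starting point, not from
# skipping review: the attorney still proofs every response before it goes out.
```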
207
00:12:10,335 --> 00:12:14,025
When it comes to implementation of
AI, think of three different things.
208
00:12:14,025 --> 00:12:17,925
Firstly, starting small and handholding
a particular group that you focus on.
209
00:12:18,314 --> 00:12:21,704
Secondly is getting very specific
on the use cases that you're looking
210
00:12:21,704 --> 00:12:25,185
to solve, not just pushing the
AI out there for the sake of it.
211
00:12:25,245 --> 00:12:28,214
And thirdly is setting expectations.
212
00:12:28,245 --> 00:12:31,515
As you said, if you lose that trust
with people, it's hard to regain it.
213
00:12:31,574 --> 00:12:36,885
And when we deploy AI with clients,
that's one of the things we really focus
214
00:12:36,885 --> 00:12:38,625
on is appropriate expectation setting.
215
00:12:39,074 --> 00:12:42,165
And with the introduction of any
tool, it's not just, here are
216
00:12:42,165 --> 00:12:43,995
all the things the tool can do.
217
00:12:44,385 --> 00:12:47,145
It's being super clear on
this is what it cannot do.
218
00:12:47,505 --> 00:12:50,354
If you try and use it for
these use cases, it will fail.
219
00:12:50,385 --> 00:12:52,189
You will get bad results,
you'll get frustrated.
220
00:12:53,055 --> 00:12:54,854
Just being super transparent with people.
221
00:12:55,485 --> 00:12:58,785
You know, touching on the hype
piece, there's some talk in
222
00:12:58,785 --> 00:13:01,545
the market about AI being magical
and what it can and can't do, et cetera.
223
00:13:02,415 --> 00:13:05,685
However, if you go in with that
attitude, you will fail for sure.
224
00:13:05,685 --> 00:13:08,625
It's not at that level for the
vast majority of use cases.
225
00:13:08,625 --> 00:13:13,540
Whereas if you frame it as, look, this is
like having a junior associate or, in
226
00:13:13,540 --> 00:13:18,104
certain cases, even a mid-level associate
that could support the work that you
227
00:13:18,104 --> 00:13:20,295
complete, but that they will make mistakes.
228
00:13:20,295 --> 00:13:20,985
It's not perfect.
229
00:13:20,985 --> 00:13:21,765
It needs your input.
230
00:13:22,395 --> 00:13:27,315
That's actually a far better change
management piece as well, because from the
231
00:13:27,315 --> 00:13:30,105
lawyer's point of view, it's very clear,
look, this is not replacing them, this
232
00:13:30,105 --> 00:13:31,995
is augmenting how they perform the work.
233
00:13:32,145 --> 00:13:34,365
So yeah, expectation
setting is a massive one.
234
00:13:34,425 --> 00:13:37,485
And then, as I mentioned about getting
very, very specific, it needs to be
235
00:13:37,485 --> 00:13:43,995
tied to a very clear use case where
the benefits are very tangible, where
236
00:13:43,995 --> 00:13:46,605
it's clear what the objectives are
and what you're trying to achieve.
237
00:13:46,605 --> 00:13:49,455
And just having that in a
kind of contained environment.
238
00:13:49,455 --> 00:13:50,565
And by contained, I mean
239
00:13:51,000 --> 00:13:51,900
structure.
240
00:13:51,960 --> 00:13:54,060
This is how we are going to approach it.
241
00:13:54,060 --> 00:13:57,689
Here's how we check, you
know, the feedback as we progress.
242
00:13:57,689 --> 00:13:59,160
Here is how we iterate as we go.
243
00:13:59,580 --> 00:14:03,720
Just overall delivery best practices,
change management best practices.
244
00:14:03,720 --> 00:14:08,520
You know, start small, expand, learn,
get some proof points, and then,
245
00:14:08,760 --> 00:14:12,115
then go broader. When that approach is
taken, they've seen marvelous results.
246
00:14:13,170 --> 00:14:17,760
However, people need to be mindful that
all the standard best practices
247
00:14:17,760 --> 00:14:22,230
we would have with any technology
implementation still hold true.
248
00:14:22,380 --> 00:14:25,245
You still need to do all the
good stuff you would do before.
249
00:14:25,964 --> 00:14:30,135
AI just doesn't remove the need
for traditional change management
250
00:14:30,135 --> 00:14:32,295
and delivery experience that you
would have with any technology.
251
00:14:32,324 --> 00:14:37,545
Probably the more widely used
AI component for us, which
252
00:14:37,755 --> 00:14:41,175
you and I will discuss for sure, is our
integration with Copilot, which is
253
00:14:41,265 --> 00:14:44,025
live; it exists in the Teams store.
254
00:14:44,295 --> 00:14:49,094
So you can actually query
Lupl data directly from Copilot
255
00:14:49,094 --> 00:14:51,474
without needing to leave where you're
spending a lot of your time working.
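Under the hood, that kind of integration usually amounts to exposing the product's data through an API the assistant can call. A deliberately generic sketch (FastAPI, with an invented endpoint and fields), not Lupl's or Microsoft's actual interface:

```python
from fastapi import FastAPI

app = FastAPI()

# Invented in-memory stand-in for the product's matter database.
MATTERS = {"M-1042": {"name": "X v. Y", "status": "Discovery",
                      "next_deadline": "2025-07-01"}}

@app.get("/matters/{matter_id}")
def get_matter(matter_id: str) -> dict:
    """A chat assistant plugin would call this instead of the user switching apps."""
    return MATTERS.get(matter_id, {"error": "not found"})

# Run with: uvicorn sketch:app --reload
```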
256
00:14:52,530 --> 00:14:55,680
We can talk about that, but I think
even as we think about the prompting,
257
00:14:56,070 --> 00:15:01,050
if you look at that, if I just
give someone an empty box and say,
258
00:15:01,200 --> 00:15:02,730
you can plan and scope your work.
259
00:15:03,030 --> 00:15:03,900
Describe your work.
260
00:15:04,800 --> 00:15:09,120
You write a one-, two-, three-sentence
prompt saying, you know, it's
261
00:15:09,120 --> 00:15:15,270
an infringement suit from X
against Y in these jurisdictions.
262
00:15:15,270 --> 00:15:17,940
The plan that you're
going to get from that
263
00:15:18,675 --> 00:15:19,935
is going to be pretty basic.
264
00:15:19,935 --> 00:15:23,265
We've done a lot of work to try and
sort of interpret what that means in
265
00:15:23,265 --> 00:15:27,375
the backend, but the reality is you
need to provide people training and
266
00:15:27,375 --> 00:15:33,135
guidance on both the level of detail
that's needed and how best to put
267
00:15:33,135 --> 00:15:37,785
that data into the system, and that
then needs to be interpreted against
268
00:15:37,785 --> 00:15:40,905
the large language model that you're
using because if you're using something
269
00:15:40,905 --> 00:15:43,605
like OpenAI, on Azure or otherwise,
270
00:15:44,160 --> 00:15:46,229
sure, you can put that in that way.
271
00:15:46,439 --> 00:15:50,430
If you're using Claude
and Anthropic's API, actually, it
272
00:15:50,430 --> 00:15:52,050
can take a lot more rich text.
273
00:15:52,050 --> 00:15:54,300
You can actually give
it certain fields.
274
00:15:54,300 --> 00:15:56,910
You can format information
in a specific way.
275
00:15:57,209 --> 00:16:00,209
You have things like prompt caching.
That's the part that users
276
00:16:00,209 --> 00:16:03,030
should not really have to think about,
because they may not know what any
277
00:16:03,030 --> 00:16:04,589
of these words mean or care, frankly.
278
00:16:04,739 --> 00:16:06,510
But you need to provide guidance.
279
00:16:06,510 --> 00:16:11,040
And part of that is don't give people what
I like to call the white screen of death.
280
00:16:11,550 --> 00:16:13,980
So if you think about ChatGPT,
which most people will be familiar
281
00:16:13,980 --> 00:16:17,520
with, you look at the early iteration,
it was just: type anything you
282
00:16:17,520 --> 00:16:19,830
want here. And it's not a good user experience.
283
00:16:19,830 --> 00:16:21,630
I don't know what I wanna type here.
284
00:16:21,630 --> 00:16:22,830
What are the parameters?
285
00:16:22,830 --> 00:16:24,600
What's the guideline?
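The alternative to that white screen is guided input: collect a few structured fields the lawyer already understands and assemble the detailed prompt in the backend, where provider-specific machinery (formatting for a given API, features like Anthropic's prompt caching) can live without the user ever seeing it. A sketch with illustrative field names:

```python
from dataclasses import dataclass

@dataclass
class MatterIntake:
    matter_type: str              # e.g. "patent infringement suit"
    client_side: str              # e.g. "defendant"
    opponent: str
    jurisdictions: list[str]
    key_dates: str = ""           # optional free text

def build_prompt(intake: MatterIntake) -> str:
    """Turn guided fields into the richer prompt the user would never write."""
    return (
        f"Plan and scope the work for a {intake.matter_type} in which our "
        f"client is the {intake.client_side}, opposing {intake.opponent}, "
        f"in {', '.join(intake.jurisdictions)}. "
        f"Key dates: {intake.key_dates or 'none provided'}. "
        "Produce a phased plan with tasks, owners, and dependencies."
    )

print(build_prompt(MatterIntake(
    "patent infringement suit", "defendant", "X Corp.",
    ["D. Del.", "N.D. Cal."])))
```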
286
00:16:25,215 --> 00:16:30,645
Law firms saw 70 new gen AI companies enter the
market and have no capacity to evaluate
287
00:16:30,645 --> 00:16:35,595
them in the time or speed that the
investors of those companies would expect.
288
00:16:35,835 --> 00:16:42,255
There's too much in the market to
truly diligence, pilot, and security assess.
289
00:16:42,495 --> 00:16:45,705
So law firms are under extreme
amounts of pressure to actually even
290
00:16:45,705 --> 00:16:49,935
evaluate technology, and there's so
many new startups and no one knows if
291
00:16:49,935 --> 00:16:51,495
those startups are going to make it
292
00:16:51,810 --> 00:16:56,340
a year, or if they are vaporware,
or you know, are they just
293
00:16:56,340 --> 00:16:58,500
a pretty front end to GPT?
294
00:16:58,620 --> 00:17:02,880
Are they thin wrappers that don't
do much or add much value outside
295
00:17:02,880 --> 00:17:04,260
of the core cost of the model?
296
00:17:04,710 --> 00:17:08,730
So law firms can't evaluate things
at the speed at which it's gonna take
297
00:17:08,730 --> 00:17:11,940
to move the sales cycles forward for
these companies that just got high
298
00:17:11,940 --> 00:17:13,860
valuations. They're burning cash
299
00:17:13,860 --> 00:17:17,070
'cause they hired a lot of expensive
engineers and data scientists.
300
00:17:17,520 --> 00:17:18,480
So it's gonna be...
301
00:17:18,944 --> 00:17:21,464
I think you're gonna see a set
of down rounds in that area.
302
00:17:21,615 --> 00:17:23,714
I think Clio is different.
303
00:17:23,775 --> 00:17:25,785
I think Clio is an established company.
304
00:17:25,785 --> 00:17:27,675
They have a massive customer base.
305
00:17:27,944 --> 00:17:31,695
They moved into the world of payments
when they kind of cut ties with
306
00:17:31,695 --> 00:17:33,985
AffiniPay and LawPay, and they have
307
00:17:34,165 --> 00:17:38,425
an entire new set of products that
they can sell into a very established,
308
00:17:38,725 --> 00:17:40,495
loyal, existing customer base.
309
00:17:40,764 --> 00:17:44,935
So I would imagine the Clio
valuation is set differently,
310
00:17:45,235 --> 00:17:48,415
because a lot of companies, you know,
like a Harvey, are valued
311
00:17:48,415 --> 00:17:51,264
based on the promise of what
you can build and the markets you enter.
312
00:17:51,625 --> 00:17:55,440
Clio has an extremely established customer
base that is very loyal, that has
313
00:17:55,919 --> 00:17:59,760
long lifetime value; they
have a longer lifetime value.
314
00:18:00,090 --> 00:18:04,080
So now they have to prove that
they can sell new products also
315
00:18:04,080 --> 00:18:05,610
into the existing customer base.
316
00:18:05,610 --> 00:18:09,570
We can't open, you know,
any feed on any social media or newsfeed
317
00:18:09,570 --> 00:18:11,580
and not see AI in the headlines, right?
318
00:18:11,580 --> 00:18:13,710
So it's a huge, huge area right now.
319
00:18:13,710 --> 00:18:14,370
Lots of hype.
320
00:18:14,430 --> 00:18:16,500
Um, it could be getting a
little bit frothy, right?
321
00:18:16,500 --> 00:18:18,870
People are kind of throwing
money at it so fast and maybe
322
00:18:18,870 --> 00:18:21,570
not really looking carefully or
critically at the business model.
323
00:18:21,895 --> 00:18:23,455
Or what problem is this trying to solve?
324
00:18:23,485 --> 00:18:27,235
Or does this technology or this product
actually even do what it claims to do?
325
00:18:27,535 --> 00:18:31,075
So, you know, again, having
been a founder, a technologist, a
326
00:18:31,075 --> 00:18:34,015
product person, I'm always really
interested in what the product does.
327
00:18:34,045 --> 00:18:35,395
Does it actually do these things?
328
00:18:35,395 --> 00:18:36,925
Is it on track to do these things?
329
00:18:37,225 --> 00:18:38,305
Do you have the right team?
330
00:18:38,665 --> 00:18:40,045
Uh, how are they executing?
331
00:18:40,045 --> 00:18:41,395
What's their experience as well?
332
00:18:41,665 --> 00:18:44,360
But definitely, I mean,
huge opportunities in AI.
333
00:18:44,400 --> 00:18:48,565
I am very bullish on AI, and
AI in legal tech, and AI in reg tech.
334
00:18:48,565 --> 00:18:51,265
And I know I've written and
spoken on those topics too.
335
00:18:51,689 --> 00:18:55,590
I would say that legal, like a lot
of professional sectors, especially
336
00:18:55,590 --> 00:19:00,540
a lot of the kind of uninteresting
or boring back office functions still
337
00:19:00,540 --> 00:19:04,470
lend themselves to a lot of good old
fashioned automation that may or may not
338
00:19:04,470 --> 00:19:08,909
necessarily have to enable the user to
utilize AI in the course of doing that.
339
00:19:08,909 --> 00:19:09,179
Right.
340
00:19:09,179 --> 00:19:12,570
So there's a lot of tasks that we can
think of in legal, a lot of use cases
341
00:19:13,110 --> 00:19:16,950
that simply haven't been automated yet,
or haven't been automated well, and
342
00:19:16,950 --> 00:19:18,690
so there's still a lot of opportunity.
343
00:19:18,690 --> 00:19:22,620
So I would tend to agree with that VC,
that there are opportunities out there
344
00:19:22,620 --> 00:19:28,140
for technology that maybe isn't AI-heavy
or AI-centric, but is automating a prior
345
00:19:28,140 --> 00:19:32,820
process that was dated or clunky and
needs refinement and efficiency.
346
00:19:32,820 --> 00:19:34,500
And there's still a lot of
opportunities like that.
347
00:19:34,530 --> 00:19:38,040
There are lots of smart people and
innovative people in law firms,
348
00:19:38,400 --> 00:19:40,260
but the sum total of the model,
349
00:19:40,770 --> 00:19:43,500
legacy comp structure, the way
the money flows, they're all
350
00:19:43,500 --> 00:19:44,700
pass-through entities, right?
351
00:19:44,700 --> 00:19:46,950
Like the way all of that works
just makes it very hard to
352
00:19:46,950 --> 00:19:48,480
actually do anything about it.
353
00:19:48,900 --> 00:19:51,330
But what do you do in a fixed
price scenario where you
354
00:19:51,330 --> 00:19:52,380
can make it up in volume?
355
00:19:52,410 --> 00:19:53,700
It's a portfolio play.
356
00:19:53,910 --> 00:19:56,550
Like if I do a hundred projects
and I win some, I lose some.
357
00:19:56,550 --> 00:19:58,920
If I net okay, then okay.
358
00:19:59,430 --> 00:19:59,670
Right?
359
00:19:59,730 --> 00:20:05,670
Or if it's like one large consulting
project, but the client can't seem to lock
360
00:20:05,670 --> 00:20:08,550
in on scope, then how could you commit
361
00:20:09,000 --> 00:20:10,380
to a do-not-exceed?
362
00:20:10,385 --> 00:20:14,430
And this is exactly what
plays out in legal work: some
363
00:20:14,430 --> 00:20:21,300
practice areas, some matter types
are more easily boxed into a scope.
364
00:20:21,480 --> 00:20:25,655
But if you can get the planets to
align with that stuff, then the
365
00:20:26,190 --> 00:20:30,450
margin opportunity or challenge,
depending on how you look at it,
366
00:20:30,780 --> 00:20:32,070
is in the hands of the firm.
367
00:20:32,160 --> 00:20:36,420
And if they can reduce their
cost and, let's say, charge the same,
368
00:20:37,110 --> 00:20:38,100
they make more money.
369
00:20:38,250 --> 00:20:40,139
Like that's the simple
economic part of it.
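That simple economic part is worth seeing as arithmetic. Under hourly billing, efficiency cuts revenue; under a fixed fee, the same efficiency goes straight to margin. The numbers below are made up purely for illustration:

```python
rate, cost_per_hour = 500, 300         # billing rate vs. loaded delivery cost
hours_before, hours_after = 100, 60    # automation saves 40 hours of effort

# Hourly model: fewer hours means less revenue and less profit.
for h in (hours_before, hours_after):
    print(f"hourly, {h}h: revenue {rate * h:,}, profit {(rate - cost_per_hour) * h:,}")

# Fixed-fee model: the price stays put, so every saved hour widens the margin.
fee = rate * hours_before              # fee anchored to the old level of effort
for h in (hours_before, hours_after):
    print(f"fixed fee, {h}h: revenue {fee:,}, profit {fee - cost_per_hour * h:,}")
```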
370
00:20:40,200 --> 00:20:44,970
It's, like, easier said than done, but
sometimes I think, especially
371
00:20:44,970 --> 00:20:49,379
with AI-fueled automation and efficiency
that's being touted right now, we
372
00:20:49,379 --> 00:20:54,179
can't forget that if we value the
input of time, like that's how we get
373
00:20:54,179 --> 00:20:57,210
paid and then we reduce the time, like
quite obviously that's not gonna work.
374
00:20:57,210 --> 00:21:00,540
So you have to look at all
four Ps of product management
375
00:21:00,540 --> 00:21:01,530
when you're dealing with this.
376
00:21:01,530 --> 00:21:05,490
And product management as a discipline
is not something a lot of firms have
377
00:21:06,030 --> 00:21:10,470
very deeply ingrained in their ethos.
But certainly there are a lot of
378
00:21:10,470 --> 00:21:12,510
pricing people who understand this.
379
00:21:12,930 --> 00:21:16,260
But pricing is one of those services,
kind of almost like innovation,
380
00:21:16,320 --> 00:21:20,280
that is very difficult to scale across
all the partners, all the clients.
381
00:21:20,280 --> 00:21:24,240
In the same way, if you're engaging
in some kind of legal dispute or legal
382
00:21:24,240 --> 00:21:26,550
situation, you have a pretty serious
383
00:21:26,895 --> 00:21:28,545
thing that you're trying
to work through, right?
384
00:21:28,545 --> 00:21:32,625
So when you talk about the risk
profile of something being wrong,
385
00:21:33,225 --> 00:21:39,255
it's much scarier how it could affect
people's lives in a legal sphere or
386
00:21:39,255 --> 00:21:43,185
like a medical sphere or something
like that, versus apartment hunting or
387
00:21:43,185 --> 00:21:44,775
planning a trip or things like that.
388
00:21:44,775 --> 00:21:48,975
And now this is all coming from someone
who's, like, very AI-positive and very much,
389
00:21:49,305 --> 00:21:50,895
like, pro-AI.
390
00:21:50,895 --> 00:21:52,725
I mean, obviously I have a show about it.
391
00:21:52,725 --> 00:21:55,635
I have a company about it, so I'm
like super, super excited about
392
00:21:55,635 --> 00:21:58,125
the potential, the same way you
are about how it can help humans
393
00:21:58,125 --> 00:21:59,985
become better at what they're doing.
394
00:22:00,315 --> 00:22:05,835
But I do think the biggest risk I
see about this idea of just pointing
395
00:22:05,835 --> 00:22:09,735
at all this data, you know, having
people who frankly don't have
396
00:22:10,635 --> 00:22:14,085
the depth of knowledge, either in
the legal sphere or the tech sphere,
397
00:22:14,085 --> 00:22:16,455
to understand what's coming back.
398
00:22:16,455 --> 00:22:17,385
Is it good, is it bad?
399
00:22:17,385 --> 00:22:18,465
I mean, this is really hard.
400
00:22:18,465 --> 00:22:19,695
Benchmarking is really hard.
401
00:22:19,695 --> 00:22:23,415
We can talk about that too, because
we could end up as a community
402
00:22:23,745 --> 00:22:27,315
destroying any possibility we have
of having these tools be helpful
403
00:22:27,315 --> 00:22:28,785
before they even get out of the gate.
404
00:22:28,815 --> 00:22:29,730
And so I'm probably not
405
00:22:30,155 --> 00:22:35,225
surprising anybody listening to
this call that the judiciary is not
406
00:22:35,225 --> 00:22:38,225
necessarily, or the people involved in
the courts and things of that nature
407
00:22:38,225 --> 00:22:41,075
aren't necessarily the most technically
advanced people on earth, you know?
408
00:22:41,105 --> 00:22:41,435
Right.
409
00:22:41,524 --> 00:22:43,504
They just are, and it's okay.
410
00:22:43,504 --> 00:22:45,305
And that's not necessarily their job.
411
00:22:45,725 --> 00:22:48,785
But if you can see, and we
saw it with hallucinations.
412
00:22:48,815 --> 00:22:52,445
If we create noise and we create
413
00:22:53,085 --> 00:22:56,085
situations where people are causing
themselves, like we said, or the
414
00:22:56,085 --> 00:22:59,880
system, more harm than good, we
could end up getting shut down.
415
00:23:00,780 --> 00:23:03,449
You know, regulated to
a point where we're at.
416
00:23:03,449 --> 00:23:07,980
We took quite a bit of time to
test the tools and to roll out in
417
00:23:07,980 --> 00:23:10,169
a way that we felt was appropriate.
418
00:23:10,199 --> 00:23:13,470
And we actually added a lot of
requirements around giving people
419
00:23:13,470 --> 00:23:16,379
access to our first gen AI tool.
420
00:23:16,830 --> 00:23:22,139
We required an hour-long CLE
on the actual technology and the ethical
421
00:23:22,139 --> 00:23:24,060
obligations because this is very new.
422
00:23:24,060 --> 00:23:25,470
I mean, this was last fall, right?
423
00:23:25,470 --> 00:23:26,669
So it seems like ages ago.
424
00:23:27,130 --> 00:23:28,780
It was still very new for a lot of people.
425
00:23:28,780 --> 00:23:32,440
People hadn't necessarily heard about
prompting and, you know, context
426
00:23:32,440 --> 00:23:34,330
windows and vectorization, et cetera.
427
00:23:34,330 --> 00:23:37,510
So we wanted to make sure that they
understood what this was so they can also
428
00:23:37,510 --> 00:23:39,190
understand what it can and cannot do.
429
00:23:39,580 --> 00:23:41,650
And then the ethical obligations
of course are: you need to
430
00:23:41,650 --> 00:23:42,970
have technical competency.
431
00:23:42,970 --> 00:23:45,670
You need to be able to explain to
your client what you're
432
00:23:45,670 --> 00:23:46,690
doing and what you're using.
433
00:23:46,690 --> 00:23:49,750
So this is all just part of that
education that we're trying to
434
00:23:50,155 --> 00:23:54,385
make part of our attorneys' lives. And
we also require that they accept our
435
00:23:54,385 --> 00:23:56,665
policy on the acceptable use of gen AI.
436
00:23:56,665 --> 00:24:00,115
And then on top of that, whenever
there's a new gen AI focused
437
00:24:00,115 --> 00:24:01,080
tool, we require a training
438
00:24:01,830 --> 00:24:02,940
on that particular tool.
439
00:24:02,940 --> 00:24:06,450
So we wanted to make sure that the
education was there, that people
440
00:24:06,450 --> 00:24:09,690
are becoming more and more familiar
with what this is and isn't.
441
00:24:10,050 --> 00:24:13,080
And that continues to be our
goal going forward, and
442
00:24:13,080 --> 00:24:14,400
our requirement going forward.
443
00:24:14,730 --> 00:24:17,550
So that was partly what we discussed
at Skills because I think at that
444
00:24:17,550 --> 00:24:20,730
point, not that many firms had
tried to roll out anything
445
00:24:20,820 --> 00:24:24,780
with that much of a comprehensive
plan and onboarding requirements.
446
00:24:25,080 --> 00:24:27,690
And we thought it seemed like that
was helpful for people to hear.
447
00:24:28,545 --> 00:24:31,605
So the first thing we tried to
do was just let the frontier
448
00:24:31,605 --> 00:24:34,635
models try to create a jury.
449
00:24:34,635 --> 00:24:38,475
So we said, create for us a jury
pool that is similar to what a
450
00:24:38,475 --> 00:24:42,765
federal jury pool would be, and
that's where Michael Scott emerged.
451
00:24:42,825 --> 00:24:44,655
It was really hilarious.
452
00:24:44,685 --> 00:24:48,180
They would output the
demographics of the jurors, so it was:
453
00:24:48,774 --> 00:24:53,514
a white man in his mid-forties who is
the manager of a mid-size paper firm in
454
00:24:53,514 --> 00:24:58,465
Scranton, Pennsylvania, which you and I
would obviously know is Michael Scott.
455
00:24:58,524 --> 00:25:01,735
Michael Scott is not a real
person, let alone a real
456
00:25:01,735 --> 00:25:04,014
juror in the federal jury pool.
457
00:25:04,014 --> 00:25:04,225
Right.
458
00:25:04,225 --> 00:25:09,055
We also had a lot of other interesting
combinations. There was a 90-year-old
459
00:25:09,055 --> 00:25:11,395
woman who was a part-time botanist,
460
00:25:11,395 --> 00:25:12,535
part-time DJ.
462
00:25:13,435 --> 00:25:13,764
I love that one.
463
00:25:13,764 --> 00:25:15,595
We had an abolitionist
464
00:25:15,595 --> 00:25:16,615
podcaster.
465
00:25:16,615 --> 00:25:16,675
Yeah.
466
00:25:16,735 --> 00:25:18,295
So it seemed like, when
467
00:25:18,705 --> 00:25:20,805
these platforms were left
to their own devices,
468
00:25:20,805 --> 00:25:25,035
they were generating jurors
that were more for show, kind of
469
00:25:25,035 --> 00:25:27,045
eye-catching types of backgrounds.
470
00:25:27,255 --> 00:25:31,485
that really didn't reflect what
we needed for our purposes,
471
00:25:31,485 --> 00:25:36,195
what real people on a jury would
actually look like demographically.
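The fix implied here is to take the demographics away from the model: sample them in code from census-style distributions and ask the model only to flesh out each constrained profile. A sketch; the categories and weights below are placeholders, not real census data:

```python
import random

AGE_BANDS = [("18-34", 0.30), ("35-54", 0.35), ("55-74", 0.28), ("75+", 0.07)]
OCCUPATIONS = [("office/admin", 0.25), ("service", 0.20), ("retail", 0.15),
               ("skilled trades", 0.15), ("professional", 0.15), ("retired", 0.10)]

def sample(dist: list[tuple[str, float]]) -> str:
    values, weights = zip(*dist)
    return random.choices(values, weights=weights)[0]

def jury_pool(n: int = 12) -> list[dict[str, str]]:
    """Demographics are fixed by sampling; a model would only add narrative."""
    return [{"age": sample(AGE_BANDS), "occupation": sample(OCCUPATIONS)}
            for _ in range(n)]

for juror in jury_pool():
    print(juror)  # the distribution, not the model, decides who shows up
```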
472
00:25:36,960 --> 00:25:40,830
And then you can tell that, you
know, there's a kid in, you know,
473
00:25:41,129 --> 00:25:44,550
Washington using them right now
to study, who's 12 years old, and
474
00:25:44,550 --> 00:25:45,900
maybe using it for creative writing.
475
00:25:45,900 --> 00:25:48,420
So, you know, there's a big range
of reasons why people are using these
476
00:25:48,420 --> 00:25:51,780
tools and they, you know, have the dial
on certain types of representation,
477
00:25:51,780 --> 00:25:54,540
which could be very useful obviously
in a creative writing context.
478
00:25:54,540 --> 00:25:56,430
But in ours, that was,
you know, catastrophic,
479
00:25:56,875 --> 00:25:58,345
because it wasn't
representing reality.
480
00:25:58,615 --> 00:26:00,895
Thanks for listening to
Legal Innovation Spotlight.
481
00:26:01,405 --> 00:26:04,915
If you found value in this chat, hit
the subscribe button to be notified
482
00:26:04,915 --> 00:26:06,385
when we release new episodes.
483
00:26:06,895 --> 00:26:09,565
We'd also really appreciate it if
you could take a moment to rate
484
00:26:09,565 --> 00:26:12,235
us and leave us a review wherever
you're listening right now.
485
00:26:12,805 --> 00:26:15,275
Your feedback helps us provide
you with top-notch content.