In this episode, Ted sits down with Julien Steel, Director of Product Management at LexisNexis, to discuss the evolution of legal technology and the integration of AI in legal workflows. From the challenges of scaling a legal tech startup to the impact of knowledge management on search relevancy, Julien shares his expertise in product strategy and innovation. With AI transforming how legal professionals access and utilize information, this conversation highlights the growing role of technology in improving efficiency and accuracy in the legal industry.
In this episode, Julien Steel shares insights on how to:
Scale a legal tech company in a competitive market
Leverage AI to enhance legal search and document management
Use knowledge management to optimize workflow efficiency
Navigate the challenges of integrating AI into legal technology
Improve user adoption through targeted implementation strategies
Key takeaways:
AI is reshaping legal workflows by improving search efficiency and accuracy
Knowledge management plays a crucial role in optimizing legal tech solutions
Rapid implementation is key to driving adoption in legal firms
Marketing strategies that combine humor and engagement can drive brand success
The future of AI in legal technology will blend multiple technologies for better outcomes
About the guest, Julien Steel:
Julien Steel is the Director of Product Management at LexisNexis, where he leads the development of AI-driven solutions for the legal industry. With over a decade of experience in Software-as-a-Service, he focuses on optimizing contract drafting and negotiation through innovative technology. His expertise lies in enhancing operational efficiency and user experience to drive smarter legal workflows.
“I don’t believe that we need to ultimately train everybody to become the best prompt engineers. I think we need to be a lot smarter in how we build the best user experience that can really understand all of the intricacies of what the lawyer is doing.”
1
00:00:01,976 --> 00:00:04,035
Julien, how are you this afternoon?
2
00:00:04,035 --> 00:00:04,599
Great!
3
00:00:04,599 --> 00:00:05,642
How are you?
4
00:00:05,932 --> 00:00:07,645
I'm doing excellent.
5
00:00:07,645 --> 00:00:08,897
It's a little cold here in St.
6
00:00:08,897 --> 00:00:13,654
Louis, but you're in Belgium, correct?
7
00:00:13,704 --> 00:00:14,115
Exactly.
8
00:00:14,115 --> 00:00:16,278
I'm currently in Ghent in Belgium.
9
00:00:16,278 --> 00:00:18,191
So it's on the north side of Belgium.
10
00:00:18,774 --> 00:00:20,341
Is it cold there as well?
11
00:00:20,364 --> 00:00:22,126
So it's getting warmer.
12
00:00:22,167 --> 00:00:25,511
We had a couple of weeks where it was very, cold.
13
00:00:25,512 --> 00:00:28,756
So yeah, I hope we'll get the sun very soon.
14
00:00:28,856 --> 00:00:29,987
Gotcha.
15
00:00:30,268 --> 00:00:33,711
Well, I appreciate you spending a few minutes with me today.
16
00:00:33,746 --> 00:00:41,591
And I thought it would be a really interesting conversation around your past with Henchman
and now you guys are part of Lexis.
17
00:00:41,591 --> 00:00:49,160
But before we jump in, why don't you take a couple of minutes, tell us who you are, what
you do and where you do it.
18
00:00:49,909 --> 00:00:50,860
Sure.
19
00:00:50,860 --> 00:00:52,681
So my name is Julien.
20
00:00:52,721 --> 00:00:55,813
I was born and raised in Belgium.
21
00:00:55,813 --> 00:00:58,605
I'm part of the team at Henchman.
22
00:00:58,605 --> 00:01:00,627
So I lead the product team.
23
00:01:00,627 --> 00:01:07,951
That means that I'm responsible for leading the product strategy, vision, and then
executing it together with our team.
24
00:01:08,552 --> 00:01:13,215
I've been part of the team for two years.
25
00:01:13,435 --> 00:01:19,069
Together with the founders, we've been working very hard at building out this company.
26
00:01:19,172 --> 00:01:28,389
The founders are three people, young entrepreneurs from Ghent, really tech entrepreneurs,
and they started the business about four years ago.
27
00:01:29,548 --> 00:01:29,748
Yeah.
28
00:01:29,748 --> 00:01:32,624
And you guys have had quite the trajectory.
29
00:01:32,624 --> 00:01:38,558
I think Henchman was founded in 2021 and then acquired last year.
30
00:01:38,558 --> 00:01:40,008
Is that right?
31
00:01:40,036 --> 00:01:40,576
Exactly.
32
00:01:40,576 --> 00:01:47,238
So in the summer of last year, 2024, yeah, it's been an amazing ride.
33
00:01:47,358 --> 00:01:54,019
So the company was founded, indeed, four years ago, just at the beginning of the COVID crisis.
34
00:01:54,740 --> 00:02:05,252
I mean, the whole idea of the founding was when Arttec, the founders, sat together with
some friends who were lawyers, and they were discussing ideas.
35
00:02:05,252 --> 00:02:08,043
And from there, the idea was sort of born.
36
00:02:08,043 --> 00:02:15,283
to help lawyers find precedent language, so clauses and definitions they had written
before.
37
00:02:16,203 --> 00:02:20,263
And from there, it really was being built out.
38
00:02:20,263 --> 00:02:30,283
We quickly built a prototype and then iterated with customers here in Belgium, but also
elsewhere.
39
00:02:30,343 --> 00:02:35,682
And then, yeah, so we got, I'm sure we're going to talk about it in just a bit.
40
00:02:35,702 --> 00:02:40,429
We started talking with Lexis over a year ago.
41
00:02:40,471 --> 00:02:45,178
And then that led to the recent acquisition of our company.
42
00:02:45,698 --> 00:02:49,410
Yeah, I mean, it's quite amazing and unusual in the legal tech world.
43
00:02:49,410 --> 00:03:02,848
I've been in it for quite some time and you don't see, you know, zero to acquisition happen that quickly, for a number of reasons. But you know, in the law firm world,
44
00:03:02,848 --> 00:03:12,144
especially in big law, it takes a little while usually to build market share because law
firms operate on the concept of precedence.
45
00:03:12,144 --> 00:03:13,314
They want to know
46
00:03:13,538 --> 00:03:17,080
you know, what other law firms their size you do business with.
47
00:03:17,080 --> 00:03:22,803
And if you go into certain markets like New York, they also want to know what New York law firms you work with.
48
00:03:22,803 --> 00:03:30,186
And when you're breaking into the market, sometimes that takes a while, but you guys did it in very short order.
49
00:03:30,836 --> 00:03:31,666
Right.
50
00:03:31,766 --> 00:03:32,456
Yeah, absolutely.
51
00:03:32,456 --> 00:03:35,747
I think it's a combination of different things.
52
00:03:36,788 --> 00:03:44,480
If I look at the team of Henchman, we're all people who have been through scaling technology companies before.
53
00:03:44,480 --> 00:03:48,671
And you could really feel that energy.
54
00:03:48,671 --> 00:04:00,117
We quickly also looked outside of Belgium, quickly went into the US market, were at a lot of events, created a lot of buzz with our messaging.
55
00:04:00,117 --> 00:04:11,330
But I think what really was a big part of our success was our laser focus on a very, very
specific problem that exists throughout the world.
56
00:04:11,330 --> 00:04:24,714
And that problem is specifically with, I mean, legal professionals that negotiate and
draft complex and bespoke contracts.
57
00:04:24,714 --> 00:04:26,344
They had a real pain.
58
00:04:26,410 --> 00:04:31,950
A lot of them are of course relying on their own precedents or the precedents of their colleagues.
59
00:04:32,310 --> 00:04:35,350
And before Henchman existed, that was a real pain.
60
00:04:35,350 --> 00:04:45,370
They had to go through a DMS, open multiple documents, and find that one particular clause that they had once written.
61
00:04:45,370 --> 00:04:48,290
And that's a very painful flow.
62
00:04:48,410 --> 00:04:55,070
I think the founders really found that problem worth solving and then worked very closely
63
00:04:55,242 --> 00:04:55,853
in doing that.
64
00:04:55,853 --> 00:05:07,113
And I think our focus on that particular problem allowed us to really build a very
compelling product that resonated around the world.
65
00:05:07,778 --> 00:05:08,098
Yeah.
66
00:05:08,098 --> 00:05:16,484
And you guys, to your credit, um, from afar I admired your marketing, you know, and your sales efforts.
67
00:05:16,484 --> 00:05:19,826
Uh, I love Steve.
68
00:05:20,046 --> 00:05:21,407
wasn't that his name?
69
00:05:23,088 --> 00:05:23,499
Yeah.
70
00:05:23,499 --> 00:05:31,324
I mean, that was such an interesting character, and you know, it's kind of hard sometimes to pull off, you know, comedy in marketing.
71
00:05:31,324 --> 00:05:35,987
Sometimes it comes off as dumb and cheesy and you guys managed to nail it.
72
00:05:35,987 --> 00:05:37,728
Like was that.
73
00:05:37,826 --> 00:05:41,577
Did the concept for that campaign, I mean, was that internally?
74
00:05:41,577 --> 00:05:42,941
Did you have some external help?
75
00:05:42,941 --> 00:05:44,946
Because it was, it was so entertaining.
76
00:05:45,127 --> 00:05:45,618
Yeah.
77
00:05:45,618 --> 00:05:55,525
I mean, all the credit to our marketing founder, Jorn, and our entire marketing team, who of course built up the idea and then iterated on top of that.
78
00:05:56,257 --> 00:06:00,830
I was also amazed when joining Henchman at the taste.
79
00:06:00,830 --> 00:06:05,854
Actually, it's very tasteful, the marketing material, and it got a lot of appeal.
80
00:06:05,854 --> 00:06:08,236
I mean, everybody's talking about Steve.
81
00:06:08,236 --> 00:06:10,151
I mean, even the announcement video
82
00:06:10,151 --> 00:06:15,444
we did with Lexis had a really good flavor of that.
83
00:06:15,605 --> 00:06:19,427
And I think that really helped spread the message.
84
00:06:19,427 --> 00:06:22,159
It intrigued a lot of people.
85
00:06:22,159 --> 00:06:29,994
And then when we actually came to the product, because that's really the message we carry for ourselves.
86
00:06:29,994 --> 00:06:35,218
We maybe don't take ourselves too seriously, but we take our product super, super seriously.
87
00:06:35,218 --> 00:06:39,160
And you can really see that in everything we do.
88
00:06:39,950 --> 00:06:40,796
What happened to Steve?
89
00:06:40,796 --> 00:06:42,121
Is he still around?
90
00:06:42,493 --> 00:06:44,136
He's definitely still around.
91
00:06:45,140 --> 00:06:48,227
And I mean, he's gone on to do other things.
92
00:06:48,227 --> 00:06:50,852
But yeah, he's definitely still around.
93
00:06:51,224 --> 00:06:55,129
Gotcha, are we gonna see him in any future campaigns or did he?
94
00:06:55,570 --> 00:06:59,046
We might, so no comment.
95
00:07:01,366 --> 00:07:02,056
there you go.
96
00:07:02,056 --> 00:07:02,607
Good.
97
00:07:02,607 --> 00:07:04,047
Keep us guessing.
98
00:07:04,548 --> 00:07:11,701
What was the growth like, I guess in your case locally in the EU versus US?
99
00:07:11,701 --> 00:07:17,633
Did you immediately jump into the US market or did you initially gain some momentum in the
EU?
100
00:07:18,269 --> 00:07:24,093
We initially gained some momentum in the EU, especially in our home market and the markets around it.
101
00:07:24,774 --> 00:07:31,978
I think it's important to mention, of course, our product is helping to surface precedent language.
102
00:07:32,019 --> 00:07:35,381
From the get-go, we wanted to make our product language-agnostic.
103
00:07:35,381 --> 00:07:41,005
So that means that any language, Latin or Cyrillic script, could be supported by our product.
104
00:07:41,005 --> 00:07:45,447
And that allowed, I mean, for Europe in particular, that's super important.
105
00:07:45,862 --> 00:07:50,784
The way we really spread ourselves was focusing on hubs in countries.
106
00:07:50,784 --> 00:08:06,280
So for example, in Sweden, Finland, Germany, and Paris in France, we went in and found innovative firms.
107
00:08:06,621 --> 00:08:11,543
And then they were the early adopters and that helped spread around the area.
108
00:08:11,543 --> 00:08:13,263
What we found as well is that
109
00:08:13,320 --> 00:08:18,002
especially with the rise of ChatGPT, a lot of the firms were looking at each other.
110
00:08:18,002 --> 00:08:19,312
How are you doing things?
111
00:08:19,312 --> 00:08:22,363
And how are you doing things with generative AI?
112
00:08:22,363 --> 00:08:23,544
And there were a lot of meetups.
113
00:08:23,544 --> 00:08:30,466
We organized our own meetups where we were explaining our vision around gen AI.
114
00:08:30,466 --> 00:08:35,048
And that helped to create a lot of buzz here in Europe.
115
00:08:35,288 --> 00:08:42,531
I think we started doing the US, I mean, fairly quickly; not from the get-go, but fairly quickly.
116
00:08:42,952 --> 00:08:45,444
We didn't hire people there.
117
00:08:45,444 --> 00:09:01,046
What we did is we traveled a lot to the US, to a lot of conferences, and then got initial traction; we also signed two of the AmLaw 200 firms in that first year.
118
00:09:01,046 --> 00:09:07,870
So that was a good early step for the growth that we now have.
119
00:09:09,722 --> 00:09:11,775
It's been, I mean, a combination of course.
120
00:09:11,775 --> 00:09:18,924
But I think what resonated really well around the world was that it was the same problem we're solving.
121
00:09:18,924 --> 00:09:27,275
I mean, in Europe, the US, all of these lawyers were looking for precedent language and couldn't find it.
122
00:09:27,502 --> 00:09:28,282
Yeah.
123
00:09:28,282 --> 00:09:35,365
And I did an episode that was released just the other day with Jack Sheppard from iManage.
124
00:09:35,365 --> 00:09:39,467
We were talking about enterprise search.
125
00:09:39,927 --> 00:09:44,929
Enterprise search has had mixed success in the legal market.
126
00:09:44,929 --> 00:09:48,010
It's a really hard project to pull off.
127
00:09:48,090 --> 00:09:49,871
It's very expensive.
128
00:09:49,871 --> 00:09:53,692
There's a huge corpus of data that has to be crawled and indexed.
129
00:09:54,013 --> 00:09:57,374
It sounds like you guys early on,
130
00:09:58,680 --> 00:10:01,694
very intentionally didn't go that direction.
131
00:10:01,694 --> 00:10:04,398
You really wanted to be a niche point solution.
132
00:10:04,398 --> 00:10:05,640
Is that accurate?
133
00:10:05,991 --> 00:10:06,441
Absolutely.
134
00:10:06,441 --> 00:10:14,224
I think the focus was particularly on helping to find clauses and definitions that you
have worked on before.
135
00:10:14,224 --> 00:10:19,036
And I think that focus really allowed us to go very, very specialized.
136
00:10:19,036 --> 00:10:25,979
For example, one of the features that we also included is a version history of clauses.
137
00:10:25,979 --> 00:10:35,513
Now that's one of our more advanced functionalities, but we recognize that contracts get
negotiated back and forth and these are all individual files.
138
00:10:35,513 --> 00:10:44,165
often stored in one folder in the DMS, and maybe not under the version history of the DMS, but we can automatically detect that.
139
00:10:44,165 --> 00:10:52,487
Well, we've built a very specific algorithm that detects that and can then match how clauses evolved over time.
140
00:10:52,487 --> 00:10:59,209
And that's a very, very specific use case we could only do by focusing on that particular
data type.
141
00:10:59,209 --> 00:11:00,769
And that's so useful.
142
00:11:00,769 --> 00:11:03,490
I mean, imagine, I mean, what you can do is...
143
00:11:03,910 --> 00:11:13,190
For example, you want to find a clause, a liability clause you've written before for that same client, the same opposing counsel, many years ago.
144
00:11:13,190 --> 00:11:19,250
With one click, you can find it and can see how it got negotiated back in the day and what
was ultimately accepted.
145
00:11:19,610 --> 00:11:25,970
Those insights are so useful for your drafting preparation for negotiation.
146
00:11:26,090 --> 00:11:27,590
All of that lives in the DMS.
147
00:11:27,590 --> 00:11:30,710
You could just not easily find it, and now you can.
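To make the version-history idea concrete, here is a minimal Python sketch of clause-version matching. Henchman's actual algorithm is proprietary; the similarity measure, threshold, and data shapes below are illustrative assumptions only.

```python
# Hypothetical sketch: linking "the same" clause across negotiated drafts.
# The real detection algorithm is proprietary; this only shows the general
# idea of chaining clause versions by text similarity.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough text similarity in [0, 1] between two clause versions."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link_versions(drafts: list[list[str]], threshold: float = 0.6):
    """Chain clauses across chronologically ordered drafts.

    drafts: one list of clause texts per draft, oldest first.
    Returns chains of (draft_index, clause_text) showing evolution.
    """
    chains = [[(0, c)] for c in drafts[0]]
    for i, clauses in enumerate(drafts[1:], start=1):
        for clause in clauses:
            # Attach to the chain whose latest version is most similar...
            best = max(chains, key=lambda ch: similarity(ch[-1][1], clause),
                       default=None)
            if best and similarity(best[-1][1], clause) >= threshold:
                best.append((i, clause))
            else:
                chains.append([(i, clause)])  # ...or start a new chain.
    return chains

drafts = [
    ["The Sellers shall indemnify the Buyer for all losses."],
    ["The Sellers shall indemnify the Buyer for all direct losses."],
]
for chain in link_versions(drafts):
    print(chain)  # one chain containing both versions of the clause
```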
148
00:11:31,254 --> 00:11:32,136
Interesting.
149
00:11:32,136 --> 00:11:36,854
And how has your integration with Lexis been?
150
00:11:36,854 --> 00:11:38,523
Well, first of all, how long has it been?
151
00:11:38,523 --> 00:11:40,588
Was that mid last year?
152
00:11:41,776 --> 00:11:53,152
So the integration with Lexis and our combined products are actually launching at the end of this month, at the time of this recording.
153
00:11:53,152 --> 00:11:55,393
So that means end of January.
154
00:11:55,973 --> 00:12:03,327
So the journey started many months before that, I think more than a year ago.
155
00:12:03,327 --> 00:12:11,396
And I think we started talking with each other because we obviously have the same clients.
156
00:12:11,396 --> 00:12:22,630
But a lot of our clients were telling us, well, it's good to find our own language, but it would be good to also have language from, for example, Practical Guidance in the
157
00:12:22,630 --> 00:12:24,291
search results as well.
158
00:12:24,291 --> 00:12:28,792
So that led us to start talking to Lexis, for example.
159
00:12:29,033 --> 00:12:33,314
And Lexis actually had very similar requests, but the other way around.
160
00:12:33,314 --> 00:12:41,382
So they have all of this premium content, of course, but their customers were saying, well, actually, I'd prefer my own data
161
00:12:41,382 --> 00:12:45,441
to be surfaced in the search experience or in AI solutions.
162
00:12:45,441 --> 00:12:48,042
So that led us really to come together.
163
00:12:49,842 --> 00:13:00,782
And I mean, I'm very excited that, of course, in the last six months after we announced the acquisition this summer, our teams have been very hard at work to make that a reality.
164
00:13:01,162 --> 00:13:09,002
So that means, I mean, beginning now in 2025, we're launching two combined products. One is in Lexis+ AI:
165
00:13:10,438 --> 00:13:14,138
we are launching the integration to your DMS.
166
00:13:14,138 --> 00:13:26,298
That means that you can ask Lexis+ AI, for example, to draft a particular clause or a particular document, and it will ground its answer on data coming from your DMS, with the
167
00:13:26,298 --> 00:13:28,458
sources mentioned there.
168
00:13:28,518 --> 00:13:38,698
And then we're also launching Create+, which is a Microsoft Word add-in, very similar to what Henchman offered before.
169
00:13:38,758 --> 00:13:50,238
But it includes all of the Henchman capabilities together with all of the Create capabilities that existed before: a lot of proofreading and other smart drafting capabilities.
170
00:13:50,538 --> 00:13:54,518
So this is what we're launching beginning this year.
171
00:13:54,718 --> 00:14:08,431
And it's really been, I mean, a lot of effort these last six months to bring that together, but it's really under this shared vision of bringing premium content from Lexis
172
00:14:08,431 --> 00:14:16,113
together with DMS data and very advanced AI solutions, all together in products that are
close to you.
173
00:14:18,080 --> 00:14:19,031
Interesting.
174
00:14:19,031 --> 00:14:26,687
Tell me a little bit about the architecture of the solution and your document-processing methodology.
175
00:14:26,687 --> 00:14:28,238
How does all that work?
176
00:14:29,295 --> 00:14:33,766
Yeah, so that's a little bit of the secret sauce, I would say.
177
00:14:33,766 --> 00:14:38,407
But I can give you some insights because of course, I believe in transparency.
178
00:14:38,407 --> 00:14:43,328
I mean, that's always the conversation that we have with law firms.
179
00:14:43,689 --> 00:14:44,859
So what happens, right?
180
00:14:44,859 --> 00:14:47,489
So we connect with the DMS of the law firm.
181
00:14:47,489 --> 00:14:50,710
So that's either an on-prem solution or in the cloud.
182
00:14:51,070 --> 00:14:55,792
We are often integrated with just a part of the DMS.
183
00:14:55,792 --> 00:14:58,052
So typically it's...
184
00:14:58,455 --> 00:15:06,289
For example, going live for certain practice groups, or we're only looking at the last five years' worth of data.
185
00:15:07,189 --> 00:15:09,831
We can automatically exclude all the emails, for example.
186
00:15:09,831 --> 00:15:14,173
So what we then do is we process all of the documents.
187
00:15:14,173 --> 00:15:17,614
We recognize what is a contract and what is not.
188
00:15:17,615 --> 00:15:21,756
And then if it's a contract, we take out clauses and definitions.
189
00:15:21,977 --> 00:15:28,567
We then generate a lot of metadata, and also mirror metadata that is on the DMS.
190
00:15:28,567 --> 00:15:31,841
We mirror the whole security framework that is in the DMS.
191
00:15:31,841 --> 00:15:35,766
So you're only allowed to see what you're allowed to see in the DMS.
192
00:15:35,766 --> 00:15:41,682
And then we basically build this index, this index of your previously written clauses and definitions.
193
00:15:41,682 --> 00:15:48,400
And then that's the search experience that you can find in Create+ and in Lexis+ AI.
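The flow Julien describes maps roughly onto a pipeline like the hedged Python sketch below. Every name here (DmsDocument, looks_like_contract, and so on) is hypothetical, and the classifiers are trivial stand-ins for the real ones; it only illustrates the ordering of scoping, clause extraction, permission mirroring, and indexing.

```python
# A minimal sketch (not LexisNexis/Henchman code) of the described ingestion
# flow: scope DMS documents, keep contracts, extract clauses, mirror metadata
# and permissions, and build a permission-aware searchable index.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class DmsDocument:
    doc_id: str
    kind: str                 # e.g. "contract", "email", "memo"
    modified: datetime
    text: str
    acl: set[str] = field(default_factory=set)  # users allowed to see it

def looks_like_contract(doc: DmsDocument) -> bool:
    # Stand-in for the real contract classifier.
    return doc.kind == "contract"

def extract_clauses(text: str) -> list[str]:
    # Stand-in for structure-based clause extraction: split on blank lines.
    return [p.strip() for p in text.split("\n\n") if p.strip()]

def build_index(docs: list[DmsDocument], years: int = 5) -> list[dict]:
    cutoff = datetime.now() - timedelta(days=365 * years)
    index = []
    for doc in docs:
        if doc.kind == "email" or doc.modified < cutoff:
            continue  # scoping: exclude emails and stale documents
        if not looks_like_contract(doc):
            continue
        for clause in extract_clauses(doc.text):
            index.append({"clause": clause, "source": doc.doc_id,
                          "acl": doc.acl})  # mirror DMS permissions
    return index

def search(index: list[dict], query: str, user: str) -> list[dict]:
    # Permission-aware lookup: you only see what the DMS lets you see.
    return [e for e in index
            if user in e["acl"] and query.lower() in e["clause"].lower()]
```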
194
00:15:49,260 --> 00:15:55,713
And is this DMS-agnostic, or are you just iManage or just NetDocuments?
195
00:15:55,960 --> 00:15:57,974
We support all the major DMSs.
196
00:15:57,974 --> 00:16:05,720
So the ones you mentioned, iManage and NetDocuments, but we also support OpenText, Google Drive, and SharePoint.
197
00:16:05,984 --> 00:16:08,075
Interesting.
198
00:16:08,416 --> 00:16:18,024
So it doesn't sound like... you know, one of the challenges: we're not a search company, but we are search-adjacent, I would say.
199
00:16:18,085 --> 00:16:25,471
And I've had a front-row seat to a lot of legal enterprise search projects.
200
00:16:25,471 --> 00:16:33,943
And I've seen firms that recrawl and index their entire DMS corpus, hundreds of millions of documents, and spend an obscene amount of money on
201
00:16:33,943 --> 00:16:34,714
Mmm.
202
00:16:34,714 --> 00:16:39,055
a search framework and just the supporting infrastructure.
203
00:16:39,055 --> 00:16:45,177
We had one client that was spending $20,000 to $30,000 a month in Azure consumption to do this.
204
00:16:45,757 --> 00:16:48,628
Yeah, that didn't include the licensing fee for the framework.
205
00:16:48,628 --> 00:16:51,799
It didn't include the implementation of the project.
206
00:16:52,159 --> 00:16:57,780
It did not include the ongoing internal staff support required to keep it running.
207
00:16:57,780 --> 00:17:03,820
So all in, they were in seven figures for this search project and not really that big of a
firm.
208
00:17:03,820 --> 00:17:04,136
Mmm.
209
00:17:04,136 --> 00:17:09,448
Maybe just inside the AmLaw 50, I think.
210
00:17:09,708 --> 00:17:14,810
So I guess with your solution, you're not taking that approach.
211
00:17:15,310 --> 00:17:26,324
Is the area of the DMS that you tap into curated, or do you just narrow it down by practice and by date?
212
00:17:26,583 --> 00:17:31,447
Yeah, so practice, date, and there's a whole range of other criteria that we could leverage.
213
00:17:31,447 --> 00:17:33,348
It's often, I mean, a collaboration with the firm.
214
00:17:33,348 --> 00:17:34,429
So every firm is unique.
215
00:17:34,429 --> 00:17:37,291
So it really varies.
216
00:17:37,592 --> 00:17:42,256
But we don't necessarily look at only a curated data set.
217
00:17:42,256 --> 00:17:46,098
I can talk about our strategy around that in just a bit as well.
218
00:17:46,880 --> 00:17:55,233
But especially if we're integrated, what we're very focused on is
219
00:17:55,233 --> 00:18:03,886
the particular use case: helping you surface clauses and definitions you've written before, and that's finding the needle in the haystack, if you will.
220
00:18:03,886 --> 00:18:12,109
And the way we measure that and obviously then work together with the firm to optimize
that is the conversion to useful action.
221
00:18:12,109 --> 00:18:24,714
And what this means is that for every search request that happens in Henchman, we measure: did that search experience end in a user doing something useful with it or not?
222
00:18:24,992 --> 00:18:29,844
And that conversion to useful action shows us the relevancy of our search experience.
223
00:18:29,844 --> 00:18:35,896
So our collaboration with the firm is looking at that number and optimizing that over
time.
224
00:18:35,896 --> 00:18:47,350
And we see that it increases with a number of parameters that we can implement, but also just over time as users get familiar with it.
225
00:18:48,211 --> 00:18:54,853
So that's really how we measure and collaborate with the firm to make sure that of course,
226
00:18:55,467 --> 00:18:59,123
we're helping everybody in the firm find what they're looking for.
227
00:18:59,512 --> 00:19:02,688
How do you know if they've done something useful with it?
228
00:19:02,688 --> 00:19:07,046
A set of actions in the add-in.
229
00:19:07,046 --> 00:19:11,973
So it's either taking a copy, doing something with the search results.
230
00:19:11,974 --> 00:19:16,261
It's a set of, let's say actions that the user can do.
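A rough sketch of how a "conversion to useful action" number could be computed from an event log. The event names and log format here are invented for illustration; the real instrumentation is not public.

```python
# Sketch of the metric described: the share of searches that end in the
# user doing something useful (copying a clause, inserting it, etc.).
USEFUL_ACTIONS = {"copy_clause", "insert_clause", "compare_clause"}

def conversion_to_useful_action(events: list[dict]) -> float:
    """events: [{'search_id': ..., 'type': 'search' | action name}, ...]"""
    searches = {e["search_id"] for e in events if e["type"] == "search"}
    converted = {e["search_id"] for e in events
                 if e["type"] in USEFUL_ACTIONS}
    return len(searches & converted) / len(searches) if searches else 0.0

log = [
    {"search_id": 1, "type": "search"},
    {"search_id": 1, "type": "copy_clause"},   # search 1 converted
    {"search_id": 2, "type": "search"},        # search 2 abandoned
]
print(conversion_to_useful_action(log))        # 0.5
```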
231
00:19:16,854 --> 00:19:17,595
I see.
232
00:19:17,595 --> 00:19:23,490
And then how do you handle multi-jurisdictional and language challenges?
233
00:19:23,490 --> 00:19:30,785
I mean, it would seem, especially in the EU, you've got a multitude of them. How is all that managed?
234
00:19:30,785 --> 00:19:44,199
So I think our processing of contracts, what actually determines that this is a clause and this is another clause, is actually a complex set of rules that is looking at the structure of
235
00:19:44,199 --> 00:19:48,390
documents rather than actually what is written inside of it.
236
00:19:48,650 --> 00:19:59,613
Whereas a lot of, let's say, NLP solutions are really just looking at the text, we're doing a combination, but primarily looking at the structure. That allowed us to
237
00:19:59,730 --> 00:20:04,733
be language agnostic from the get-go, from the start.
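For illustration, structure-driven segmentation might look like the toy Python below: clause boundaries come from numbering patterns rather than from the words, which is why the same rules can work across languages. The regex and sample text are assumptions, not the actual rules.

```python
# Toy illustration (the real rules are Henchman's secret sauce) of
# structure-based clause splitting: boundaries are detected from the
# numbering layout, not the language of the text.
import re

HEADING = re.compile(r"^\s*(\d+(\.\d+)*)[.)]?\s+")  # "1.", "2.3", "4)" ...

def split_clauses(contract_text: str) -> list[str]:
    clauses, current = [], []
    for line in contract_text.splitlines():
        if HEADING.match(line) and current:
            clauses.append("\n".join(current))  # close previous clause
            current = []
        current.append(line)
    if current:
        clauses.append("\n".join(current))
    return clauses

doc = "1. Levering\nDe verkoper levert...\n2. Aansprakelijkheid\nDe koper..."
print(split_clauses(doc))  # same rules work for Dutch, French, English...
```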
238
00:20:06,154 --> 00:20:10,337
so that is sort of almost that secret sauce.
239
00:20:10,377 --> 00:20:19,303
But we do have some machine learning models that, for example, recognize the contract
type, jurisdiction of every document.
240
00:20:19,303 --> 00:20:29,153
And that is metadata that is added on top that allows you to filter in the add-in to
exactly the sort of set of clauses or precedents.
241
00:20:29,153 --> 00:20:30,364
that you're looking for.
242
00:20:31,042 --> 00:20:35,115
Yeah, everybody wants to talk about gen AI these days, right?
243
00:20:35,115 --> 00:20:38,669
It's the topic du jour and an interesting one.
244
00:20:38,669 --> 00:20:41,572
We talk about it a lot on the podcast.
245
00:20:41,572 --> 00:20:48,237
I'm an avid user of gen AI, but AI has existed in legal for many, many years.
246
00:20:49,459 --> 00:20:58,306
How much of what you're doing is machine learning versus gen AI versus just kind of
rules-based algorithms?
247
00:20:59,106 --> 00:21:04,668
So I can't tell sort of exact percentages, but it's really a combination of those three.
248
00:21:04,668 --> 00:21:11,131
And so rule-based is really helping us determine, well, does this most likely look like a contract or not.
249
00:21:11,131 --> 00:21:13,092
These are nested clauses.
250
00:21:13,092 --> 00:21:19,574
Definitions are most often a separate section in the contract and are typically defined as such.
251
00:21:19,614 --> 00:21:20,835
We have machine learning models.
252
00:21:20,835 --> 00:21:22,195
I mentioned it already.
253
00:21:22,195 --> 00:21:27,357
They recognize contract type, jurisdiction, and a whole range of other metadata.
254
00:21:27,785 --> 00:21:34,648
And then gen AI does help us with some very specific use cases.
255
00:21:34,648 --> 00:21:36,358
So let me give you some examples.
256
00:21:36,358 --> 00:21:40,950
So one of the things that we have is a smart replace function.
257
00:21:40,950 --> 00:21:48,373
So imagine you're looking at a search result, a very particular precedent, and there's a word mentioned: sellers.
258
00:21:48,373 --> 00:21:49,894
So plural.
259
00:21:49,894 --> 00:21:54,196
In the contract you're working on, every seller is just singular.
260
00:21:54,196 --> 00:21:55,966
With just one click of a button,
261
00:21:56,272 --> 00:21:59,713
it will replace the word, and all the grammar will also be taken into account.
262
00:21:59,713 --> 00:22:09,292
That's a gen AI capability that just allows you to quickly match that search result to the contract you're working on.
263
00:22:09,556 --> 00:22:11,897
That's one example of applying gen AI.
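The general shape of such a smart-replace call, sketched under the assumption that it delegates the grammar-aware rewrite to an LLM; call_llm is a placeholder here, not a real API.

```python
# Hedged sketch of a "smart replace" delegation to an LLM. The product's
# internals aren't public; this only shows the general pattern.
def call_llm(prompt: str) -> str:
    # Placeholder: wire in whatever completion client you actually use.
    raise NotImplementedError("plug in your LLM client here")

def smart_replace(clause: str, old: str, new: str) -> str:
    prompt = (
        "Rewrite the clause below, replacing every reference to "
        f"'{old}' with '{new}'. Adjust verb agreement, pronouns, and "
        "articles so the clause stays grammatical. Change nothing else.\n\n"
        f"Clause:\n{clause}"
    )
    return call_llm(prompt)

# e.g. smart_replace("The Sellers agree that they shall...",
#                    "Sellers", "Seller")
# should yield output along the lines of "The Seller agrees that it shall..."
```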
264
00:22:11,897 --> 00:22:22,179
Another one is when you're, for example, comparing two clauses with each other, the one in your contract and the one from the search results: we leverage gen AI to create a table that
265
00:22:22,179 --> 00:22:24,820
discusses or that lists all the key
266
00:22:24,981 --> 00:22:28,544
topics that are mentioned in both clauses and the differences in between.
267
00:22:28,544 --> 00:22:34,249
So you can quickly benchmark against your standard or precedent within your database.
268
00:22:34,249 --> 00:22:38,633
So we're using a combination of these three.
269
00:22:38,633 --> 00:22:44,459
And for all, all about what works best for that use case, right?
270
00:22:44,459 --> 00:22:46,600
And for what we're trying to achieve.
271
00:22:47,192 --> 00:22:47,542
Yeah.
272
00:22:47,542 --> 00:22:57,075
So in that example that you just gave leveraging gen AI, I don't see a big risk where
hallucinations would come into play.
273
00:22:57,075 --> 00:23:03,402
Is that a risk you have to manage at all with the gen AI aspects of the platform?
274
00:23:03,718 --> 00:23:11,213
Of course, I think that's always sort of a risk area, and we test for that a lot.
275
00:23:11,213 --> 00:23:20,960
What we've done is build evaluation scenarios, so possible inputs that could come into a particular functionality.
276
00:23:20,960 --> 00:23:26,343
And then we let a legal team, or a team of legal experts, evaluate: is everything configured?
277
00:23:26,343 --> 00:23:28,725
Are the prompts correctly configured?
278
00:23:28,725 --> 00:23:32,041
Are the types of inputs correctly maintained
279
00:23:32,041 --> 00:23:39,974
to produce the most accurate or the most hallucination-free results.
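A minimal sketch of the evaluation-scenario approach described: a fixed set of representative inputs plus expert-written checks, re-run whenever prompts or models change. The scenario format and the example check are illustrative assumptions.

```python
# Sketch of an evaluation harness: representative inputs with checks;
# failures get routed to legal experts for review. Names are illustrative.
from typing import Callable

def run_scenarios(generate: Callable[[str], str],
                  scenarios: list[dict]) -> float:
    """scenarios: [{'input': ..., 'check': fn(output) -> bool}, ...]
    Returns the pass rate over all scenarios."""
    passed = 0
    for s in scenarios:
        output = generate(s["input"])
        if s["check"](output):
            passed += 1
        else:
            print("NEEDS EXPERT REVIEW:", s["input"][:60])
    return passed / len(scenarios) if scenarios else 1.0

scenarios = [
    {"input": "Summarize this indemnity clause: ...",
     # Example heuristic: flag outputs that cite material absent from
     # the input (a likely hallucination signal).
     "check": lambda out: "statute" not in out.lower()},
]
```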
280
00:23:40,095 --> 00:23:47,948
We also, I mean, just explain to the user, look, this is AI-generated and please use this
with caution.
281
00:23:47,948 --> 00:23:58,713
I think that's a fairly standard and general practice nowadays, so that everybody is aware of where this comes from.
282
00:23:58,990 --> 00:23:59,951
Gotcha.
283
00:23:59,951 --> 00:24:01,532
And yeah, you mentioned prompting.
284
00:24:01,532 --> 00:24:08,578
Like how much of this is point and click and how much requires actual prompting from your
end users?
285
00:24:08,984 --> 00:24:15,387
All of it is just clicking a button and then everything happens behind the scenes.
286
00:24:15,387 --> 00:24:19,709
We do have some capabilities where you can just say, hey, help me.
287
00:24:19,829 --> 00:24:25,292
There's a free-text field where you can just prompt and make a suggestion.
288
00:24:25,612 --> 00:24:32,325
But most of the capabilities that we have are really just one click away.
289
00:24:33,294 --> 00:24:34,274
I see.
290
00:24:34,274 --> 00:24:34,694
Yeah.
291
00:24:34,694 --> 00:24:46,454
You know, um, so you guys started in 2021, and most people think that gen AI started in November of 2022.
292
00:24:47,394 --> 00:24:57,054
Um, transformers, which is the technology that gen AI is based on; the paper for that, "Attention Is All You Need,"
293
00:24:57,054 --> 00:25:00,574
I think is the paper that's become very famous.
294
00:25:00,574 --> 00:25:01,100
That
295
00:25:01,100 --> 00:25:02,951
was 2017.
296
00:25:02,951 --> 00:25:06,053
So AI existed prior to that.
297
00:25:06,053 --> 00:25:10,896
And machine learning has been around for a very long time.
298
00:25:10,896 --> 00:25:19,121
Where do you see... you kind of have a front-row seat, not only with your work at Henchman, but Lexis is a leader in the space.
299
00:25:19,121 --> 00:25:28,886
How big a role and how quickly do you see Gen AI making real transformation in legal?
300
00:25:30,974 --> 00:25:36,295
I think we're only at the beginning of what AI can do for us.
301
00:25:36,636 --> 00:25:50,359
I think what everybody sort of recognizes is a lot of the low-value tasks: initial draft generation, helping me to understand all different kinds of emails, summarization of long
302
00:25:50,359 --> 00:25:53,720
texts, helping me organize thoughts.
303
00:25:53,720 --> 00:25:57,481
These are just sort of the very beginning of everything.
304
00:25:58,161 --> 00:26:00,242
We believe at Lexis that
305
00:26:00,901 --> 00:26:16,559
we can build, or that every legal professional will have, an AI agent that is really there by their side and knows their work, their style, and their expertise, and can really help
306
00:26:16,580 --> 00:26:18,600
with very specific tasks.
307
00:26:18,981 --> 00:26:29,086
Those today are, let's say, a little more general, but they're increasingly going to become more practice-group oriented, very specific things.
308
00:26:29,086 --> 00:26:46,946
And we'll see a lot more relevance to that, and it will help elevate the legal professional so that they can be a lot closer to their clients and support them much more
309
00:26:46,946 --> 00:26:47,886
strategically.
310
00:26:48,486 --> 00:26:59,086
So I'm very bullish about how it will impact the legal world.
311
00:26:59,791 --> 00:27:03,293
I think we just need to continue to experiment a lot.
312
00:27:03,293 --> 00:27:04,853
I think that's crucial.
313
00:27:05,734 --> 00:27:07,235
And we need to be open to that.
314
00:27:07,235 --> 00:27:16,040
There's going to be a lot of fails, but it's crucial for the innovation to flourish.
315
00:27:16,322 --> 00:27:27,448
And we need to also understand that the interface needs to be as easy as possible.
316
00:27:28,157 --> 00:27:34,237
I don't believe that we need to ultimately sort of train everybody to become the best
prompt engineers.
317
00:27:34,237 --> 00:27:51,137
I think we need to be a lot smarter in how we can build the best user experience, one that is basically either sort of voice-to-text or that can really understand all the
318
00:27:51,137 --> 00:27:57,653
intricacies of what the lawyer is doing and can talk to the
319
00:27:57,669 --> 00:28:01,470
LLMs or other models in the right way.
320
00:28:01,791 --> 00:28:06,392
So to me, those are really the two key things that we have to continue to focus on.
321
00:28:07,053 --> 00:28:18,117
But obviously I'm very hopeful that that will have a huge impact on the legal world.
322
00:28:18,712 --> 00:28:30,858
Well, you mentioned experimentation, and you know, there are surveys out there that don't line up with reality on what adoption is actually taking place, at least
323
00:28:30,858 --> 00:28:32,509
in the practice of law.
324
00:28:32,509 --> 00:28:42,813
I think there have been some business-of-law use cases where AI's made further inroads, but there's not a lot happening today in the practice of law.
325
00:28:42,974 --> 00:28:44,270
And, um,
326
00:28:44,270 --> 00:28:48,590
That's starting to change, but it's mostly been experimentation.
327
00:28:48,910 --> 00:28:58,330
You know, you see the ILTA Tech Survey came back and said that... you know, I actually have it up right here.
328
00:28:59,830 --> 00:29:07,170
74% of law firms with more than 700 lawyers are using AI for business tasks.
329
00:29:07,170 --> 00:29:08,610
Well, what does that mean?
330
00:29:08,610 --> 00:29:08,930
Really?
331
00:29:08,930 --> 00:29:12,750
Does that mean a lawyer Googled ChatGPT?
332
00:29:12,750 --> 00:29:19,195
Um, or does that mean they're using tools like yours in practice?
333
00:29:19,355 --> 00:29:33,678
And what I found interesting: 74%, which seems totally aspirational to me. But then I saw another survey from Thomson Reuters from two weeks ago that said that only 10%
334
00:29:33,678 --> 00:29:38,240
of law firms have a gen AI policy.
335
00:29:38,373 --> 00:29:39,026
Yeah.
336
00:29:39,026 --> 00:29:46,886
It's like, okay, if 74% are using it and 10% have a policy around it, that's a disconnect.
337
00:29:46,906 --> 00:29:51,626
And it feels to me that we're still in that experimentation phase.
338
00:29:51,766 --> 00:29:56,466
And I believe that is going to continue through at least the first half of the year.
339
00:29:56,466 --> 00:30:01,798
I really... this is just gut feel, I don't really have data to support it, but...
340
00:30:01,934 --> 00:30:06,634
You know, we do business with dozens of AmLaw firms and we kind of get a nice cross-sectional view.
341
00:30:06,634 --> 00:30:09,018
I attend 10, 12 conferences a year.
342
00:30:09,018 --> 00:30:17,102
So I keep my ear to the ground on this, and it feels like the first half of the year is going to continue to be experimentation.
343
00:30:17,102 --> 00:30:28,589
Then second half of the year, maybe even Q4, we'll see some more real traction in terms of practice-of-law use cases leveraging AI.
344
00:30:28,589 --> 00:30:30,059
How do you feel about the timeline?
345
00:30:30,059 --> 00:30:31,860
Do you see it the same or differently?
346
00:30:32,915 --> 00:30:36,356
Yeah, sort of the same.
347
00:30:36,817 --> 00:30:53,983
I think what I see a lot is that firms are setting up AI task groups, a group of maybe young lawyers who are tasked with: hey, experiment with it and help us craft the AI policy.
348
00:30:55,036 --> 00:31:05,376
I think those are poised to continue to exist as more advanced models will continue to
come out.
349
00:31:05,376 --> 00:31:21,156
What I often warn firms about is sort of pilot fatigue. So we see that happen a lot, where there's a lot of legal AI or legal tech solutions coming out, and almost everybody wants to
350
00:31:21,156 --> 00:31:24,820
sort of try it out.
351
00:31:24,944 --> 00:31:32,946
What often happens is that it's just thrown to lawyers: hey, try this, try this, try this, without actual preparation of, what are we trying to test here?
352
00:31:32,946 --> 00:31:34,706
What are we trying to succeed at?
353
00:31:34,706 --> 00:31:41,528
So I think, I mean, we warned and collaborated with firms very diligently on, like, okay, well, you're going to test this.
354
00:31:41,528 --> 00:31:43,689
What are you trying to achieve here?
355
00:31:43,689 --> 00:31:46,429
What are our success criteria?
356
00:31:46,429 --> 00:31:52,451
Are we adequately informing all of the participants here to
357
00:31:53,807 --> 00:31:57,288
test this out and then determine the success after that.
358
00:31:57,348 --> 00:32:03,390
So that you don't create pilot fatigue because that is then the worst that could happen.
359
00:32:03,390 --> 00:32:06,011
That of course everybody's just discouraged.
360
00:32:06,011 --> 00:32:13,233
Well, it's not really fitting what I want, because, I mean, that was not really aligned.
361
00:32:13,393 --> 00:32:20,195
And that of course is very detrimental to what we're trying to achieve.
362
00:32:20,974 --> 00:32:22,175
Yeah, I agree.
363
00:32:22,175 --> 00:32:32,279
And I think that's where KM plays a key role: being the filter. And KM is uniquely positioned.
364
00:32:32,279 --> 00:32:34,880
And that's why KM should be leading AI.
365
00:32:34,880 --> 00:32:48,405
If KM exists in a law firm (some firms, smaller firms especially, don't have a KM group), having KM who understands... you know, many are lawyers, and they understand the
366
00:32:48,405 --> 00:32:49,206
practice of law.
367
00:32:49,206 --> 00:32:50,606
They understand
368
00:32:50,798 --> 00:33:02,958
the firm culture and being the filter and being selective about what actually gets in
front of the timekeepers because they have aggressive goals and expectations around the
369
00:33:02,958 --> 00:33:04,238
delivery of work.
370
00:33:04,238 --> 00:33:14,538
And if you're going to distract them, you better have a good reason, because you're going to have a much harder time distracting them the second time
371
00:33:14,538 --> 00:33:18,838
if the first go-round wasn't productive.
372
00:33:18,918 --> 00:33:20,678
So yeah, what do you,
373
00:33:21,051 --> 00:33:30,017
What role do you see KM professionals playing in this era of AI?
374
00:33:30,017 --> 00:33:31,628
I think they've become more important.
375
00:33:31,628 --> 00:33:36,401
I've heard chatter around, okay, what do we need KM for?
376
00:33:36,401 --> 00:33:37,682
We've got AI.
377
00:33:37,743 --> 00:33:41,735
You need KM more than ever is the way I see it.
378
00:33:41,735 --> 00:33:43,146
I don't know if you agree.
379
00:33:43,885 --> 00:33:44,965
I agree.
380
00:33:45,526 --> 00:33:49,607
I think KM plays a crucial role in all of this indeed.
381
00:33:49,888 --> 00:33:55,845
What we've seen is that we cannot just connect with the DMS.
382
00:33:55,845 --> 00:34:06,203
And we've really learned this through deploying Henchman at a number of different very large firms around the world.
383
00:34:07,204 --> 00:34:18,753
It's really, I mean, this idea of: well, we have all of the contracts, all the precedents living in one unstructured way, and we integrate with that and we help organize whatever is
384
00:34:18,753 --> 00:34:19,534
there.
385
00:34:19,534 --> 00:34:25,018
But very often, I mean, you have what you're talking about, just the KM team that created
386
00:34:25,018 --> 00:34:29,958
templates, know-how; everything is sort of curated in a particular folder.
387
00:34:30,238 --> 00:34:34,458
And what we've learned is that we're also going to integrate with that.
388
00:34:34,458 --> 00:34:44,658
But we're going to allow KMs to have influence on how that appears in the search results,
and ultimately also how it is ranked in it.
389
00:34:44,738 --> 00:34:53,378
And this is where sort of the magic really comes in, where, for example, at a firm, we integrate with iManage.
390
00:34:53,378 --> 00:34:54,958
That's really where all of
391
00:34:54,958 --> 00:34:57,380
their deals are and all the contracts.
392
00:34:57,380 --> 00:35:00,563
Okay, all of that is structured in our index.
393
00:35:00,563 --> 00:35:07,008
And then we also integrate with SharePoint, for example, where a lot of the KM content lives.
394
00:35:07,008 --> 00:35:14,755
All of the KM content is specifically labeled, and then gets boosted to the top of the search experience.
395
00:35:14,755 --> 00:35:23,662
That means if you're looking at a particular clause, indemnities or anything else, you first stumble upon what are our
396
00:35:24,115 --> 00:35:27,277
sort of internal standards and then other relevant precedents.
397
00:35:27,277 --> 00:35:32,739
You can really build a nuanced view for your draft.
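The boosting behavior described might be sketched like this; the scores, the is_km flag, and the additive boost are illustrative assumptions rather than the actual ranking function.

```python
# Sketch: curated KM content gets boosted above ordinary precedents.
def rank(results: list[dict], km_boost: float = 10.0) -> list[dict]:
    """results: [{'clause': ..., 'score': float, 'is_km': bool}, ...]"""
    return sorted(
        results,
        key=lambda r: r["score"] + (km_boost if r["is_km"] else 0.0),
        reverse=True,
    )

hits = [
    {"clause": "Indemnity v3 (deal file)", "score": 7.2, "is_km": False},
    {"clause": "Indemnity (firm standard)", "score": 5.1, "is_km": True},
]
for hit in rank(hits):
    print(hit["clause"])  # firm standard first, then other precedents
```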
398
00:35:32,940 --> 00:35:37,082
I mean, that's sort of giving tools to the KM.
399
00:35:37,082 --> 00:35:38,442
We have a lot more ideas.
400
00:35:38,442 --> 00:35:50,489
For example, one of the things we recently launched is giving KM attorneys insights into what is being searched for within the DMS using Henchman.
401
00:35:50,489 --> 00:35:53,658
So what are searches that have zero search results?
402
00:35:53,658 --> 00:35:54,909
So that's really a gap in knowledge.
403
00:35:54,909 --> 00:35:58,722
I mean, people are looking for that, but there's actually nothing appearing.
404
00:35:58,722 --> 00:36:03,326
So, I mean, we might want to prioritize some work here.
405
00:36:03,326 --> 00:36:06,389
What are precedents that everybody's constantly looking back at?
406
00:36:06,389 --> 00:36:10,032
And maybe that's something we want to look at and then curate as well.
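A small sketch of the kind of KM analytics described: zero-result queries as knowledge gaps, and frequently reopened precedents as curation candidates. The log format is hypothetical.

```python
# Sketch of search analytics for KM teams. Field names are invented.
from collections import Counter

def km_insights(search_log: list[dict], top_n: int = 5):
    """search_log: [{'query': str, 'result_count': int,
                     'opened': str | None}, ...]"""
    gaps = Counter(e["query"] for e in search_log
                   if e["result_count"] == 0)
    popular = Counter(e["opened"] for e in search_log if e["opened"])
    return {
        "knowledge_gaps": gaps.most_common(top_n),       # nothing found
        "curation_candidates": popular.most_common(top_n),
    }

log = [
    {"query": "change of control", "result_count": 0, "opened": None},
    {"query": "indemnity cap", "result_count": 12, "opened": "clause_7"},
]
print(km_insights(log))
```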
407
00:36:10,032 --> 00:36:22,724
So these are now insights that we can give that allow KMs to practice what we call dynamic knowledge management, versus
408
00:36:22,724 --> 00:36:27,157
traditional knowledge management that was really a longer process.
409
00:36:27,157 --> 00:36:32,239
You had to bring a lot of people around the table, and it was very hard to maintain.
410
00:36:32,680 --> 00:36:42,525
So for us, KMs play a crucial role in helping to make the search experience a lot more relevant and tapping into the workflow.
411
00:36:42,525 --> 00:36:51,310
So all with the goal of supporting the legal professional with a new draft and in their
negotiations.
412
00:36:52,002 --> 00:37:00,429
Yeah, you mentioned searches with zero results, maybe being a gap in knowledge.
413
00:37:00,429 --> 00:37:07,184
It's usually a gap in the search index or a gap in the search strategy I've found.
414
00:37:07,184 --> 00:37:18,203
Usually the data exists, but it's incorrectly indexed. The information is usually there.
415
00:37:18,203 --> 00:37:21,165
It's just not very findable.
416
00:37:22,062 --> 00:37:30,030
But how do you think about measuring search relevancy and user adoption?
417
00:37:30,030 --> 00:37:35,636
Obviously, there's the extreme cases where you got zero search results.
418
00:37:35,636 --> 00:37:36,987
We know that's a gap.
419
00:37:36,987 --> 00:37:46,686
But in general, how are you tracking relevancy of the output of your platform?
420
00:37:48,358 --> 00:37:50,119
Combination of two things.
421
00:37:50,660 --> 00:38:07,069
One is, we've built our own data sets and built our own tests, sort of test scenarios, and our legal expert teams are evaluating the relevance, recall, and precision of the
422
00:38:07,069 --> 00:38:07,929
search.
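For readers unfamiliar with the terms, a tiny worked example of precision and recall against expert-labeled relevant clauses; the data set is obviously a stand-in.

```python
# Worked example: precision and recall of search results against an
# expert-labeled set of relevant clauses for one test query.
def precision_recall(returned: set[str], relevant: set[str]):
    tp = len(returned & relevant)                 # true positives
    precision = tp / len(returned) if returned else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

returned = {"clause_12", "clause_40", "clause_7"}
relevant = {"clause_12", "clause_7", "clause_99"}
p, r = precision_recall(returned, relevant)
print(f"precision={p:.2f} recall={r:.2f}")        # precision=0.67 recall=0.67
```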
423
00:38:08,097 --> 00:38:18,105
So that's of course using our own data sets and it's evaluating the algorithms that are
constantly evolving.
424
00:38:18,105 --> 00:38:29,554
That, of course, doesn't mimic the real-life scenario, where it's client data that is being used, and that's where conversion to useful action comes in.
425
00:38:29,554 --> 00:38:36,560
So what I was describing before helps us indicate, well, are users doing something with
the search results?
426
00:38:36,948 --> 00:38:38,049
It's not 100 %...
427
00:38:38,049 --> 00:38:40,971
Let's say...
428
00:38:42,953 --> 00:38:52,582
It will sometimes be inaccurate, but it gives us largely a direction of, okay, are we
actually producing relevant search results?
429
00:38:52,582 --> 00:38:55,724
And what we want to see is trends that are going up, of course.
430
00:38:55,885 --> 00:39:02,710
And it's, like I mentioned before, a way in which we are collaborating.
431
00:39:02,710 --> 00:39:04,780
So if we're looking at, for example,
432
00:39:04,780 --> 00:39:12,736
a firm that had a low conversion-to-useful-action number, we know we need to do
some things in the configuration.
433
00:39:12,736 --> 00:39:19,310
Like you're saying, sometimes it's not findable because it's somewhere else or we didn't
index it properly.
434
00:39:19,310 --> 00:39:20,831
Well, that's all.
435
00:39:21,812 --> 00:39:26,455
We have a whole tool set that allows us to discover that and to optimize for that.
436
00:39:26,540 --> 00:39:33,450
I think that's ultimately the secret sauce of Henchman:
437
00:39:33,718 --> 00:39:41,884
those capabilities together with our capability to index these individual clauses.
438
00:39:41,884 --> 00:39:43,676
So that's really how we go about it.
439
00:39:44,224 --> 00:39:44,894
Interesting.
440
00:39:44,894 --> 00:39:56,899
How much work is there between practice areas in implementing? You know, we at InfoDash, we're an intranet and extranet platform.
441
00:39:56,899 --> 00:40:01,861
There is a tremendous amount of services that have to be delivered as part of our product.
442
00:40:01,861 --> 00:40:04,192
It's a blessing and a curse, right?
443
00:40:04,192 --> 00:40:09,174
It doesn't allow us to just... it's not like Dropbox.
444
00:40:09,174 --> 00:40:12,566
You download setup.exe, you run it and you're off to the races.
445
00:40:12,566 --> 00:40:13,358
There's all this
446
00:40:13,358 --> 00:40:17,378
plumbing that has to be built in and then the clients want customizations.
447
00:40:17,478 --> 00:40:26,818
So, you know, we really enjoy that work because we get to see our product get brought to life, but there's a lot of it.
448
00:40:26,818 --> 00:40:31,658
In fact, probably 40% of our revenue is services-related.
449
00:40:31,658 --> 00:40:37,078
Like, how much work is required on your end, across different practices, to implement the platform?
450
00:40:37,352 --> 00:40:40,714
I'm going to give a nuanced answer because actually it's not a lot of work.
451
00:40:40,714 --> 00:40:47,437
So we have a record for putting a customer live, from signing the contract to actually users using it.
452
00:40:47,437 --> 00:40:49,078
And it was 24 hours.
453
00:40:49,078 --> 00:40:55,722
So within 24 hours, the database got connected, we indexed everything, and then they started to use it.
454
00:40:55,722 --> 00:41:03,286
It's a small firm out of New York, small disclaimer, but just to say it can go quite quickly.
455
00:41:03,286 --> 00:41:05,097
Now, if we're talking about a big firm,
456
00:41:05,339 --> 00:41:12,715
with multiple locations, multiple practice groups, a huge database with a lot of security controls, that might take a couple of days.
457
00:41:12,715 --> 00:41:22,994
There are a lot of people involved, and then we're largely dependent on: how can we bring the right people into the meeting?
458
00:41:22,994 --> 00:41:26,036
Do we understand what we're actually solving for?
459
00:41:26,456 --> 00:41:31,621
There's also a whole range of configurations that help tailor the search experience.
460
00:41:31,621 --> 00:41:33,662
We can group people, or
461
00:41:33,662 --> 00:41:44,891
group users in teams per practice group, for example, and then we can help you say the real estate team only needs to look at this set of the data and all of the rest is really
462
00:41:44,891 --> 00:41:45,772
irrelevant.
463
00:41:45,772 --> 00:41:49,555
So all of that is part of the configuration that can be done.
464
00:41:49,555 --> 00:42:03,586
So the nuanced answer is it can be within 24 hours, but it can also be a lot more if we're
really sort of tailoring it for, let's say more complex scenarios.
465
00:42:03,662 --> 00:42:06,405
which is very normal and standard.
466
00:42:06,746 --> 00:42:10,471
But it's important to mention that there's no technical involvement needed.
467
00:42:10,471 --> 00:42:16,808
Everything is all configuration, all sort of tailoring of the integration.
468
00:42:17,582 --> 00:42:18,243
Wow.
469
00:42:18,243 --> 00:42:18,523
Yeah.
470
00:42:18,523 --> 00:42:22,906
I chuckled when you said days because, um, I'm jealous.
471
00:42:22,906 --> 00:42:25,477
Um, I mean it takes months for us.
472
00:42:25,477 --> 00:42:30,611
Like, we can actually install our product in two hours, and wiring up integrations...
473
00:42:30,611 --> 00:42:32,052
We've done it a million times.
474
00:42:32,052 --> 00:42:33,433
That doesn't take long either.
475
00:42:33,433 --> 00:42:39,297
It's really kind of the envisioning process and getting out of users' heads what they want to see.
476
00:42:39,297 --> 00:42:43,880
And there's a big visual component obviously to intranets and extranets.
477
00:42:43,880 --> 00:42:47,752
Um, well, I know we're almost at... sorry, go ahead.
478
00:42:47,783 --> 00:42:52,905
I just wanted to mention, I mean, it's really a core foundational part of our product strategy.
479
00:42:53,066 --> 00:42:55,377
We know that lawyers have almost no time.
480
00:42:55,377 --> 00:42:56,738
They want to have things quickly.
481
00:42:56,738 --> 00:43:02,170
And we really built everything with a very quick time to value.
482
00:43:02,170 --> 00:43:08,293
And we invested a lot in our integrations with iManage, NetDocuments, all the DMSs.
483
00:43:08,293 --> 00:43:13,556
We worked very closely with them to understand what are all of the different
configurations that we need to support.
484
00:43:13,556 --> 00:43:16,007
So all of that is baked in the products.
485
00:43:17,067 --> 00:43:24,290
Yeah, that's incredible that you guys are able to get that short of a runway to get off
the ground.
486
00:43:25,031 --> 00:43:28,883
Well, I know we're almost out of time, but I did want to ask you one final question.
487
00:43:28,883 --> 00:43:40,679
And it's going to be a little open-ended, but what is your future vision for AI-enabled workflows, outside of just the scope of where your
488
00:43:40,679 --> 00:43:42,600
product plays today?
489
00:43:43,921 --> 00:43:45,772
What do you see coming down the
490
00:43:46,328 --> 00:43:50,191
runway with AI and legal workflows?
491
00:43:50,191 --> 00:43:52,312
I know this is supposed to be the year of agents.
492
00:43:52,312 --> 00:43:54,243
I actually think that's going to be next year.
493
00:43:54,243 --> 00:43:56,144
That's just my personal opinion.
494
00:43:57,246 --> 00:44:05,251
I think there's way too much uncertainty today with how AI responds.
495
00:44:05,251 --> 00:44:15,788
I mean, you can take the exact same prompt, put it into the AI platform of your choice, and you're going to get different answers a good percentage of the time, which makes
496
00:44:15,788 --> 00:44:25,556
the decision-tree navigation a little challenging. But, I don't know, what's your vision for the short term and where AI is going to add value in legal?
497
00:44:26,142 --> 00:44:29,944
I think we're going to see a lot of combinations of technologies.
498
00:44:29,944 --> 00:44:44,925
So, gen AI, which has its virtues and its drawbacks; rule-based technologies; and the machine learning models that we talked about before. Bringing these three together and
499
00:44:44,925 --> 00:44:47,879
really making sure that
500
00:44:47,879 --> 00:44:51,732
it's sort of weighted to what is more important.
501
00:44:51,732 --> 00:45:03,819
For example, for billing issues, you might want to look very closely at precise things, versus if you're looking more for inspiration, gen AI. So I think short term,
502
00:45:03,819 --> 00:45:08,142
we're going to see a lot of these things coming together.
503
00:45:08,963 --> 00:45:16,807
Whereas before, we sort of had applications that were pure gen AI or pure rule-based.
504
00:45:18,134 --> 00:45:22,587
And I think the future is really definitely a combination of those two.
505
00:45:23,054 --> 00:45:23,895
Yeah.
506
00:45:23,998 --> 00:45:27,812
Well, how do people find out more? Are you still called Henchman?
507
00:45:27,812 --> 00:45:29,657
Are you Create+ now?
508
00:45:29,832 --> 00:45:38,038
No, so, I mean, we're the DMS capabilities within Lexis+ AI and the DMS capabilities within Create+.
509
00:45:38,179 --> 00:45:51,069
You can definitely go to the LexisNexis website, and that's where you will find a lot about Lexis+ AI, which is really serviced by Protégé, and then Create+.
510
00:45:51,069 --> 00:45:54,432
So those two products are where we're living today.
511
00:45:55,032 --> 00:45:56,062
Gotcha.
512
00:45:56,063 --> 00:45:56,663
Good stuff.
513
00:45:56,663 --> 00:46:04,506
And I see you on LinkedIn from time to time, not super active, but you're there as well in
case people want to connect.
514
00:46:05,707 --> 00:46:06,457
Well, good stuff.
515
00:46:06,457 --> 00:46:12,320
I appreciate you spending a few minutes with me this afternoon and I hope to chat again
with you soon.
516
00:46:12,424 --> 00:46:13,215
Yeah, likewise.
517
00:46:13,215 --> 00:46:14,210
Thank you very much.
518
00:46:14,210 --> 00:46:15,657
All right, take care.