In this episode, Ted sits down with Kara Peterson, Co-founder at Descrybe.ai, to discuss the realities of building effective AI tools in legal tech and the movement to open up access to public legal data. From the challenges of legal data usability to the importance of independent benchmarking, Kara shares her expertise in product strategy, access-to-justice innovation, and startup growth. With a mission-driven yet market-savvy approach, this conversation offers legal professionals a grounded perspective on the future of legal research.
In this episode, Kara shares insights on how to:
Understand the limitations of open-source legal data in practice
Navigate the risks of oversimplifying AI’s role in legal research
Build trustworthy AI tools that expand access to legal information
Identify underserved markets and rethink legal tech strategy
Advocate for independent benchmarking and human-in-the-loop design
Key takeaways:
Access to public legal data is not the same as usability – real tools require real work
Benchmarking and independent validation are essential in legal AI
AI is a powerful complement to, not a replacement for, legal professionals
There is major opportunity in underserved legal markets if you know where to look
Mission-driven, bootstrapped startups can still compete with legal tech giants
About the guest, Kara Peterson
Kara Peterson is the co-founder of Descrybe.ai, a Boston-based legal tech startup using AI to make legal research faster, smarter, and more accessible. With a background in marketing leadership at institutions like Harvard and Suffolk Law, Kara leads Descrybe’s business strategy and public presence, earning national recognition for the platform. She also co-hosts Building AI Boston and was named a 2024 “Woman of Legal Tech” by the American Bar Association.
“Much of the law is done with just the law of the land, statutes, regulations, things of that nature. And it’s really, really hard to get. It’s almost impossible to get, and that’s weird, right? Because it’s all public information.”
1
00:00:02,111 --> 00:00:03,846
Kara, how are you this morning?
2
00:00:04,084 --> 00:00:04,738
I'm good.
3
00:00:04,738 --> 00:00:05,817
How are you doing, Ted?
4
00:00:05,817 --> 00:00:08,879
I'm doing great, doing great.
5
00:00:08,879 --> 00:00:20,828
You and I, I saw a thread on LinkedIn about some open source data, and you had made a
comment in there that opened my eyes to some things that I didn't know.
6
00:00:20,949 --> 00:00:32,787
So you and I jumped on a call, you educated me a little bit, and I think we have a
really interesting conversation to be had, because I can tell, I see that kind of post
7
00:00:33,437 --> 00:00:42,843
a million times, and I think a lot of people don't understand all the details and nuance
behind it. And yeah, it'll be good to kind of get some clarity around that.
8
00:00:42,843 --> 00:00:46,695
um But before we do, let's get you introduced.
9
00:00:46,695 --> 00:00:50,948
So you and I were actually on a podcast together.
10
00:00:50,948 --> 00:00:53,189
Was that a year ago or two years ago?
11
00:00:53,486 --> 00:00:54,786
Yeah, let's see.
12
00:00:54,786 --> 00:00:59,286
We, we launched like two years ago, so it has to have been within the past two year
window.
13
00:00:59,286 --> 00:01:00,786
I just can't remember when.
14
00:01:00,786 --> 00:01:01,521
Yeah.
15
00:01:01,521 --> 00:01:09,838
And I think Horace and Max from Lagora, and maybe the guy from LegalOn.
16
00:01:09,838 --> 00:01:16,404
um But anyway, so you are the co-founder of Descrybe AI.
17
00:01:16,404 --> 00:01:19,966
You have an A2J, access to justice, focus.
18
00:01:19,966 --> 00:01:23,309
um You actually host a podcast yourself.
19
00:01:23,309 --> 00:01:28,133
Why don't you kind of fill in the gaps there and tell us a little bit about
20
00:01:28,137 --> 00:01:30,468
who you are, what you do, and where you do it.
21
00:01:31,160 --> 00:01:31,500
Sure.
22
00:01:31,500 --> 00:01:31,940
Yeah.
23
00:01:31,940 --> 00:01:32,581
Thanks so much.
24
00:01:32,581 --> 00:01:34,211
It's really exciting to be on your show.
25
00:01:34,211 --> 00:01:35,122
That was a fun one.
26
00:01:35,122 --> 00:01:35,622
I remember that.
27
00:01:35,622 --> 00:01:37,232
Horace always does a great show too.
28
00:01:37,232 --> 00:01:38,733
So that's a good one.
29
00:01:38,733 --> 00:01:39,513
Yeah, he does.
30
00:01:39,513 --> 00:01:42,365
um I'm Kara Peterson.
31
00:01:42,365 --> 00:01:53,499
So I'm the co-founder of Descrybe AI, and we are a two-year-old legal research company.
We like to think of ourselves sort of as the mavericks of legal research.
32
00:01:53,499 --> 00:01:58,741
Some have called us the Robin Hood of legal research, which is a fun
one too.
33
00:01:58,741 --> 00:02:00,610
But our, um
34
00:02:00,610 --> 00:02:11,421
whole goal is to really try to create tools that help um change the way people can access
what should be, at least in theory, somewhat public information.
35
00:02:11,421 --> 00:02:14,884
So that's why I think that post was so interesting to me.
36
00:02:14,884 --> 00:02:25,214
And so as I've gotten really interested in AI over the past couple of years and sort of
all the things that it can do to change so many different parts of our society.
37
00:02:25,230 --> 00:02:27,281
That's where the show came in that I started.
38
00:02:27,281 --> 00:02:29,132
called Building AI Boston.
39
00:02:29,132 --> 00:02:34,613
And that's all about how AI is being used by real people to do really interesting things.
40
00:02:34,613 --> 00:02:39,015
So it's beyond legal tech, but legal tech does pop up here and there.
41
00:02:39,015 --> 00:02:40,555
So that's me.
42
00:02:40,676 --> 00:02:41,836
Those are my two hats.
43
00:02:41,836 --> 00:02:46,257
And then my third hat is I have a real job that pays me, but we'll talk about that
tomorrow.
44
00:02:46,623 --> 00:02:47,324
Okay.
45
00:02:47,324 --> 00:02:48,384
Good stuff.
46
00:02:48,384 --> 00:02:55,744
Well, yeah, let's talk about this open source legal data
movement and, you know, what's reality?
47
00:02:55,744 --> 00:02:56,944
What's, what's hype?
48
00:02:56,944 --> 00:03:12,242
I think it was kind of an interesting, headline-grabbing post that talked about,
you know, 99% of case law being open sourced, and
49
00:03:12,242 --> 00:03:21,097
you had chimed in with some interesting comments just around kind of what that
really means.
50
00:03:21,097 --> 00:03:26,560
It sounds like some of this data has actually been around for a long time, but might be
somewhat dated.
51
00:03:26,560 --> 00:03:32,203
Like, kind of separate the fact and fiction around these recent announcements.
52
00:03:33,582 --> 00:03:34,322
Right, sure.
53
00:03:34,322 --> 00:03:39,042
So this was one of those things where I saw it come through my LinkedIn feed.
54
00:03:39,042 --> 00:03:43,502
sometimes you have those moments where you're like, should I jump on this one?
55
00:03:43,502 --> 00:03:44,202
Should I not?
56
00:03:44,202 --> 00:03:45,782
And so I kind of sat on it for a little bit.
57
00:03:45,782 --> 00:03:47,282
And then I decided, you know what?
58
00:03:47,282 --> 00:03:50,662
Yeah, I'm going to put in my two cents here.
59
00:03:50,742 --> 00:03:55,862
So the gist of it was when you think about case law.
60
00:03:55,862 --> 00:03:57,122
So that's what Descrybe has.
61
00:03:57,122 --> 00:03:59,982
have all kinds of case law right from across the country.
62
00:04:00,002 --> 00:04:02,542
And that is public information.
63
00:04:02,542 --> 00:04:02,982
Right?
64
00:04:02,982 --> 00:04:08,782
I mean, so in theory, you should just be able to get whatever you need, access
it easily, and so on and so forth.
65
00:04:08,782 --> 00:04:10,722
But we all know that's not the case.
66
00:04:10,922 --> 00:04:18,182
So what we've done as a company is find ways to get that case law and do things to it
that make it more usable.
67
00:04:18,182 --> 00:04:22,922
But this post was really interesting to me because there's a few threads.
68
00:04:22,922 --> 00:04:29,774
One, that I'm super in favor of, is this open source push, making materials
69
00:04:29,774 --> 00:04:35,254
available for people, although we have followed a slightly different path at Descrybe,
which we can talk about later.
70
00:04:35,254 --> 00:04:41,114
And what should be more available to people than our own laws? Like, obviously.
71
00:04:41,134 --> 00:04:50,954
So this post was interesting because it was sort of presenting in some way that access to
this case law on Hugging Face, which is great that they're doing this, that they're posting
72
00:04:50,954 --> 00:04:53,714
this, was somehow new and different.
73
00:04:53,914 --> 00:04:58,122
But anyone who's been around in this space for a while knows that this
74
00:04:58,122 --> 00:04:59,884
is not actually a new effort.
75
00:04:59,884 --> 00:05:04,788
And so I just wanted to jump in there and sort of talk about what's actually happening in
that space.
76
00:05:04,788 --> 00:05:11,324
And then there was a second part to it, which was the reason I really jumped on it:
the claim that, oh, the case law is there.
77
00:05:11,324 --> 00:05:21,994
So anyone can just throw an AI wrapper on it and make some tools that can compete with all
the established legal research tools, and that's the part where I'm like, no, no, no, we
78
00:05:21,994 --> 00:05:24,206
need to talk about this.
79
00:05:24,220 --> 00:05:33,433
Yeah, I mean, there are multi-billion dollar companies that have built themselves on legal
research, TR, Lexis, Bloomberg, others.
80
00:05:33,534 --> 00:05:37,459
And yeah, if it were just that simple, they'd be out of business.
81
00:05:37,459 --> 00:05:38,581
why is it?
82
00:05:38,581 --> 00:05:40,663
Tell us why it's not that simple.
83
00:05:41,314 --> 00:05:50,770
Right, so anyone could get access, you know. And first I should just say, as a
disclaimer, I am the sort of marketing and business development side of our company.
84
00:05:50,770 --> 00:05:56,444
We have obviously another side that's the tech side, my co-founder Richard DeBona.
85
00:05:56,444 --> 00:06:00,067
So when I talk about this, I'm talking about it from more like the business perspective.
86
00:06:00,067 --> 00:06:04,390
He has much more, you know, deep thinking about how the tech actually works.
87
00:06:04,390 --> 00:06:08,973
But the point is, just because it's there doesn't mean you can make it useful, right?
88
00:06:08,973 --> 00:06:10,786
You still need to be able to
89
00:06:10,786 --> 00:06:20,281
build tools with AI and with other types of technologies to mine the information that's
useful for the user of that data.
90
00:06:20,281 --> 00:06:21,632
So it's very easy.
91
00:06:21,632 --> 00:06:32,878
As we've seen, people who are going directly into things like ChatGPT or whatnot to do
legal research or legal work are coming across
92
00:06:32,878 --> 00:06:38,198
very significant problems that that data set will have for them, including hallucinations
and things like that.
93
00:06:38,198 --> 00:06:41,258
So it was vastly oversimplifying.
94
00:06:41,258 --> 00:06:42,818
And again, it's just someone's post.
95
00:06:42,818 --> 00:06:52,638
It's not like, you know, they were writing a white paper or something. But to vastly
oversimplify, that people can grab that data, take some kind of ChatGPT wrapper, throw it
96
00:06:52,638 --> 00:07:02,298
on there, and you're going to create a tool that's helpful for people and furthering sort
of access to the law or access to justice or the democratization of this information was
97
00:07:02,420 --> 00:07:08,797
so misleading and so many people were jumping on that post and saying, wow, this is game
changing.
98
00:07:08,797 --> 00:07:09,909
It's gonna change everything.
99
00:07:09,909 --> 00:07:12,541
It just rubbed me the wrong way.
100
00:07:12,942 --> 00:07:16,754
Because it'll cause more harm than good if people think that's the result.
101
00:07:16,754 --> 00:07:19,466
Yeah, you know, I see it all the time.
102
00:07:19,466 --> 00:07:21,964
In fact, there's a guy I follow.
103
00:07:21,964 --> 00:07:28,091
I won't say his name, but he's a, he has really great content around AI use cases.
104
00:07:28,091 --> 00:07:35,275
And yesterday he posted that he just replaced his lawyer with this prompt.
105
00:07:35,275 --> 00:07:40,998
And you know, it's a great prompt, but it's just not that
simple.
106
00:07:40,998 --> 00:07:44,100
And I am definitely in the camp of:
107
00:07:44,167 --> 00:07:50,520
AI today, the general models can drastically reduce the amount of legal spend.
108
00:07:50,520 --> 00:07:54,042
Um, but it does not eliminate the need for lawyers.
109
00:07:54,042 --> 00:08:05,808
Like I gave you some examples. We have a 409A program here, which is a
way that we incent certain people on the team with, like, shadow equity in the
110
00:08:05,808 --> 00:08:06,748
company.
111
00:08:06,949 --> 00:08:13,522
And we started this program before AI existed.
112
00:08:13,522 --> 00:08:14,068
Well,
113
00:08:14,068 --> 00:08:22,187
before November of 2022 when, you know, ChatGPT with GPT-3.5 was released, before it was easily
accessible.
114
00:08:22,187 --> 00:08:29,488
So I just, for the hell of it, decided to upload our docs into ChatGPT.
115
00:08:29,488 --> 00:08:31,068
Actually, I used Grok for that one.
116
00:08:31,068 --> 00:08:37,828
I've been playing around with Grok, and I asked, are there any irregularities or risks
associated with how we've implemented this program?
117
00:08:37,828 --> 00:08:42,528
And then I said, give me a risk ranking.
118
00:08:42,704 --> 00:08:46,948
and a probability that one of these risks could potentially materialize.
119
00:08:46,948 --> 00:08:48,730
And it did a fantastic job.
120
00:08:48,730 --> 00:08:52,353
And then I took that to our lawyers, and then we had a conversation about it.
121
00:08:52,353 --> 00:09:00,059
um So it doesn't eliminate the need for lawyers, but it can help today, even though the
general model is not a legal-specific tool.
122
00:09:01,401 --> 00:09:05,605
There are some really valuable use cases, especially for small businesses like ours.
123
00:09:05,605 --> 00:09:09,330
We're like 43 people, and um we don't have
124
00:09:09,330 --> 00:09:10,891
big legal budgets.
125
00:09:10,891 --> 00:09:16,163
it's a good way for us to just do sanity checks, um especially on lower risk stuff.
126
00:09:16,163 --> 00:09:18,503
Like I don't even read NDAs anymore.
127
00:09:18,564 --> 00:09:26,157
You know, if they send me an NDA, I have a custom GPT that I've
trained, and I say, hey, what's the delta?
128
00:09:26,157 --> 00:09:27,888
Are there any issues? Is this one-sided?
129
00:09:27,888 --> 00:09:30,298
I have a little prompt that I use that takes me there.
130
00:09:30,298 --> 00:09:31,609
They're low risk.
131
00:09:31,609 --> 00:09:37,391
So, um, for high-risk stuff, the real nuanced stuff, obviously lawyers still need to be
involved.
132
00:09:37,391 --> 00:09:39,465
Um, but
133
00:09:39,465 --> 00:09:49,693
You know, your comment about, yeah, you can't just point ChatGPT
at these big expansive data sources, for a number of reasons.
134
00:09:49,693 --> 00:10:04,125
I mean, one, the context window issue. Even Gemini, which I think is now at a
million tokens, all sorts of detail gets lost in the middle when you upload a ton
135
00:10:04,125 --> 00:10:08,488
of content and try to do retrieval or
136
00:10:08,698 --> 00:10:12,315
summarize, it misses things. Today's tech does.
137
00:10:12,315 --> 00:10:16,663
What are some other challenges with just pointing it at a big data source?
138
00:10:16,663 --> 00:10:21,972
I don't mean from a technical perspective, but you know, just kind of data wise, what are
the issues?
139
00:10:22,156 --> 00:10:28,139
Well, and I think your point about the nuance is really important and also your point
about risk, right?
140
00:10:28,139 --> 00:10:37,034
So if you think about when you're in a situation, like think about it from the consumer of
legal information or legal services point of view, right?
141
00:10:37,034 --> 00:10:38,195
So the client point of view.
142
00:10:38,195 --> 00:10:47,150
So most likely if you're engaging in some kind of legal dispute or legal situation, you
have a pretty serious thing that you're trying to work through, right?
143
00:10:47,150 --> 00:10:51,682
So for, when you talk about the risk profile of something being wrong,
144
00:10:51,828 --> 00:11:03,351
it's much scarier how it could affect people's lives in a legal sphere or a medical sphere
or something like that versus apartment hunting or planning a trip or things like that.
145
00:11:03,351 --> 00:11:09,343
And now this is all coming from someone who's very AI positive and very much pro-AI.
146
00:11:09,343 --> 00:11:11,250
Obviously, I have a show about it.
147
00:11:11,250 --> 00:11:12,164
I have a company about it.
148
00:11:12,164 --> 00:11:19,346
So I'm super, super uh excited about the potential the same way you are about how it can
help humans become better at what they're doing.
149
00:11:19,346 --> 00:11:20,642
But I do think
150
00:11:20,642 --> 00:11:32,893
the biggest risk I see about this idea of just pointing at all this data is having people
who frankly don't have the depth of knowledge in either the legal sphere or the tech
151
00:11:32,893 --> 00:11:35,555
sphere to understand what's coming back.
152
00:11:35,555 --> 00:11:36,115
Is it good?
153
00:11:36,115 --> 00:11:36,486
it bad?
154
00:11:36,486 --> 00:11:37,577
I mean, this is really hard.
155
00:11:37,577 --> 00:11:38,788
Benchmarking is really hard.
156
00:11:38,788 --> 00:11:39,959
We can talk about that too.
157
00:11:39,959 --> 00:11:44,076
em Because we could end up, as a community,
158
00:11:44,076 --> 00:11:49,099
destroying any possibility we have of having these tools be helpful before they even
get out of the gate.
159
00:11:49,099 --> 00:12:00,324
And so I'm probably not surprising anybody listening to this call that the judiciary, um,
or the people involved in the courts and things of that nature, aren't
160
00:12:00,324 --> 00:12:03,876
necessarily the most technically advanced people on earth, you know, right?
161
00:12:04,157 --> 00:12:06,238
They just are and it's okay.
162
00:12:06,238 --> 00:12:09,140
And um that's not necessarily their job.
163
00:12:09,140 --> 00:12:14,122
But if you can see, and we saw it with hallucinations, if we create
164
00:12:14,270 --> 00:12:26,319
noise and we create situations where people are causing themselves, like we said, or the
system more harm than good, we could end up getting shut down, you know, regulated to a
165
00:12:26,319 --> 00:12:31,802
point where we're actually hurting the long-term goals of what AI could do.
166
00:12:32,050 --> 00:12:42,870
And now I'm going to sound a little bit, a little bit uh soapboxy here, but some of
this comes down to everyone trying to make their dollar
167
00:12:43,060 --> 00:12:43,612
on these tools.
168
00:12:43,612 --> 00:12:49,596
And of course I am too, so I'm not criticizing, but you know, it's a very different space
when you're in the legal space.
169
00:12:49,596 --> 00:12:50,117
Yeah.
170
00:12:50,117 --> 00:12:52,039
Well, you mentioned benchmarking.
171
00:12:52,039 --> 00:12:55,143
What are your thoughts around that?
172
00:12:55,842 --> 00:12:57,583
Yeah, so benchmarking.
173
00:12:57,583 --> 00:13:00,744
So some of this is, I come at it as a non-attorney.
174
00:13:00,744 --> 00:13:03,095
uh I don't have a legal background.
175
00:13:03,095 --> 00:13:07,607
I have, like I said, a marketing, public health, sort of social justice background, right?
176
00:13:07,607 --> 00:13:11,109
um But what do I know about these things, you know?
177
00:13:11,109 --> 00:13:22,828
And so when you think about legal tech as a field, as an ecosystem, you know, we haven't
done a very good job of helping people assess what tools are good.
178
00:13:22,828 --> 00:13:29,324
what tools aren't good, what tools might work for them, which tools have the Good
Housekeeping seal of approval, you know.
179
00:13:29,384 --> 00:13:37,992
And I know there's a lot of discussion going on around this, particularly in the academic
circles and with the legal librarians who are the people who probably will solve this for
180
00:13:37,992 --> 00:13:40,074
us eventually, which is great.
181
00:13:40,535 --> 00:13:45,329
But we're not giving consumers a very easy way to understand what's good and what's not.
182
00:13:45,329 --> 00:13:46,911
And again, I'm part of this.
183
00:13:46,911 --> 00:13:48,482
I'm not, you know,
184
00:13:48,482 --> 00:13:52,964
blaming others and saying we're perfect, but we tend to say, my product is good, trust me.
185
00:13:52,964 --> 00:13:57,066
And I'm patting my own back, for the people who are just listening and not watching.
186
00:13:57,066 --> 00:14:06,531
And that's not going to be enough for something like this, because again, the
consequences are too dire for people getting incorrect information.
187
00:14:06,531 --> 00:14:12,334
So that's where the whole, excuse me, the whole human-in-the-loop thing becomes very, very,
very important.
188
00:14:12,348 --> 00:14:15,382
Yeah, and you know, I have seen some initiatives.
189
00:14:15,382 --> 00:14:20,353
It's been a while, probably, gosh, maybe close to a year ago,
190
00:14:20,353 --> 00:14:24,448
I heard about a consortium of
191
00:14:24,624 --> 00:14:27,145
sorts that had kind of gotten together.
192
00:14:27,145 --> 00:14:31,547
Harvey was on that list as a participant, which I thought was interesting.
193
00:14:31,547 --> 00:14:38,810
I'm not sure how much vendors should play a role, maybe an advisory role, but I don't, you
know, it's kind of like the fox guarding the henhouse.
194
00:14:38,810 --> 00:14:43,412
You're creating your own test, you're grading your own tests.
195
00:14:43,412 --> 00:14:54,546
Um, we've seen gaming. I mean, Meta got slapped around pretty hard not long ago with
Llama 4, and the
196
00:14:54,546 --> 00:15:06,030
model that was released had materially different benchmark scores from what they
presented during those benchmarking tests.
197
00:15:06,030 --> 00:15:14,800
So yeah, I think vendors should play a role, but it feels like, I don't know, somebody else
should be leading that effort, with us supporting them.
198
00:15:15,327 --> 00:15:15,717
Agree.
199
00:15:15,717 --> 00:15:17,088
And it needs to be independent.
200
00:15:17,088 --> 00:15:22,211
And this is why I'm really in favor of it coming out of educational institutions.
201
00:15:22,211 --> 00:15:25,012
uh I worked in higher education for a long time.
202
00:15:25,012 --> 00:15:31,096
So I know that that's probably the best that we have for an independent lens.
203
00:15:31,096 --> 00:15:36,138
Excuse me, because um these firms that are marketing firms for legal tech are great.
204
00:15:36,139 --> 00:15:42,094
And, of course, they're going to do thorough, you know, sort of
205
00:15:42,094 --> 00:15:52,334
looking at these tools, but if it's a client or if it's a tool that is using their
services, it's still hard for people to feel that it's completely independent.
206
00:15:52,334 --> 00:15:57,354
And I don't think in the current environment, the government is going to be doing anything
about this.
207
00:15:57,374 --> 00:16:01,594
We couldn't really ask OpenAI or the other models to test it either.
208
00:16:01,594 --> 00:16:02,794
So it's a tough one.
209
00:16:02,794 --> 00:16:03,674
I don't know.
210
00:16:03,674 --> 00:16:09,834
And it's not regulated the way medical information or things like that is in some ways.
211
00:16:10,794 --> 00:16:22,622
I would say if there's anyone out there uh who wants to do this, I think you have a big
wide open field and I think you should make the legal tech companies pay you for it and
212
00:16:22,622 --> 00:16:23,388
support it.
213
00:16:23,388 --> 00:16:24,989
Yeah, I agree.
214
00:16:24,989 --> 00:16:28,991
Now back to the kind of legal data movement.
215
00:16:28,991 --> 00:16:43,329
um When you and I talked last, you had shared some interesting information on
just kind of the history of the legal data liberation efforts, you know, the Harvard
216
00:16:43,329 --> 00:16:46,561
Case Law Access Project and the Free Law Project.
217
00:16:46,561 --> 00:16:53,256
Give our listeners a little bit of perspective on where we've been and kind of
218
00:16:53,256 --> 00:16:54,607
where we are now.
219
00:16:55,438 --> 00:16:56,438
Right.
220
00:16:56,458 --> 00:17:05,098
So, and I'm sure there's things I don't know about, so I'll tell you the things I know,
and then folks can chime in in the comments about things they know too.
221
00:17:05,178 --> 00:17:10,978
So the first place we came across, we were looking for case law ourselves a couple
of years ago.
222
00:17:10,978 --> 00:17:13,738
The first place we found, you know, good access.
223
00:17:13,738 --> 00:17:22,478
And then there's also the distinction between access for users who want to do research and
access for people who want to use that technology to create a product, right?
224
00:17:22,478 --> 00:17:25,658
Or that, excuse me, that information to create a product, which are not the same.
225
00:17:26,638 --> 00:17:34,938
So the Case Law Access Project, which was out of the Harvard Law School library, was a
very interesting project that we came across.
226
00:17:34,938 --> 00:17:36,958
And they had all the case law.
227
00:17:36,958 --> 00:17:38,498
They have a wonderful story.
228
00:17:38,498 --> 00:17:42,118
Adam Ziegler could come on and talk about it if you want, I'm sure.
229
00:17:42,118 --> 00:17:43,658
Although you probably have to ask him.
230
00:17:43,658 --> 00:17:44,918
I'm not his rep.
231
00:17:45,038 --> 00:17:49,438
Where they went into the library and they quite literally scanned all their books.
232
00:17:49,438 --> 00:17:55,466
And there's just a fascinating story about how they had to break the bindings and the
librarians were like, ah.
233
00:17:55,466 --> 00:17:56,306
some of these old books.
234
00:17:56,306 --> 00:18:06,651
anyway, so they scanned everything they had and they collaborated with a legal tech
company that I'm forgetting the name right now, but um they uh worked with them.
235
00:18:06,651 --> 00:18:09,793
And then I think that company was sold to one of the big ones.
236
00:18:09,793 --> 00:18:13,934
I think it was Lexis, but you may want to fact-check me on that.
237
00:18:13,934 --> 00:18:21,318
So their data we noticed stopped around 2017 because that's when the project ended.
238
00:18:21,318 --> 00:18:21,954
So
239
00:18:21,954 --> 00:18:29,130
we had to look for another resource, but that was kind of like the original free-the-law
group, from what I've been able to see.
240
00:18:29,130 --> 00:18:30,701
And they're still doing a lot of really cool stuff.
241
00:18:30,701 --> 00:18:33,223
think they're ramping up again.
242
00:18:33,684 --> 00:18:44,613
So then we came across a group out in California that is powering a ton of
innovation in this ecosystem and should be getting a lot of credit.
243
00:18:44,613 --> 00:18:51,052
And that was one of the other reasons I jumped on that post, because I'm like, hey, Free
Law has been doing this for years.
244
00:18:51,052 --> 00:19:01,908
So the Free Law Project, I think they've been around about 10 years, and their sole
purpose is to make sure that um things like case law and other things are actually
245
00:19:01,908 --> 00:19:03,919
accessible by regular people.
246
00:19:03,919 --> 00:19:10,463
And then they also have, you can create a commercial agreement with them where you ingest
their data, which is how we get our data.
247
00:19:10,463 --> 00:19:11,304
So it's great.
248
00:19:11,304 --> 00:19:13,015
And that's a commercial license.
249
00:19:13,015 --> 00:19:14,235
So they're doing that.
250
00:19:14,235 --> 00:19:19,018
um And then a very interesting space that's new and
251
00:19:19,520 --> 00:19:26,913
I think there's a lot of cool things happening here, including at Descrybe, has to do with
sort of the other parts of the law.
252
00:19:26,913 --> 00:19:29,274
Case law is one part of the law.
253
00:19:29,394 --> 00:19:34,766
But a lot of people think of statutes, regulations, and all that other stuff as the real
law.
254
00:19:34,766 --> 00:19:39,282
So happy to talk about that, because that's a big missing piece of the puzzle right now.
255
00:19:39,282 --> 00:19:42,668
Yeah, well, let's hear your thoughts on that.
256
00:19:43,374 --> 00:19:54,914
Yeah, so again, coming from a non-lawyer, so forgive me if I'm not explaining this the way
a professor would in law school, but certainly case law is part of what lawyers, attorneys
257
00:19:54,914 --> 00:19:58,534
have to look at as they're building a case or writing a brief or doing things like that.
258
00:19:58,534 --> 00:20:05,254
But of course, much of the law is done with just the law of the land, which is statutes,
regulations, things of that nature.
259
00:20:05,254 --> 00:20:08,914
And it's really, really hard to get.
260
00:20:09,014 --> 00:20:13,100
It's almost impossible to get, and that's weird.
261
00:20:13,100 --> 00:20:15,802
right, when you think about it, because it's all public information.
262
00:20:15,802 --> 00:20:24,258
And this also, I think, links back to the fact that the judiciary in many states, state
governments, and things like that don't necessarily have very sophisticated technology.
263
00:20:24,258 --> 00:20:27,320
They can't, you know, it's not something they've invested in.
264
00:20:27,320 --> 00:20:31,103
There are sort of workarounds that other companies have provided to them to make it easier
for them.
265
00:20:31,103 --> 00:20:33,044
And then that data is sort of blocked away.
266
00:20:33,044 --> 00:20:40,649
So some companies or some places have started to look at pulling that data directly.
267
00:20:40,649 --> 00:20:42,110
And that is
268
00:20:42,110 --> 00:20:42,600
what we're doing.
269
00:20:42,600 --> 00:20:49,672
So we had looked into it, because we really need to add those resources to our database
for it to be as broad as possible.
270
00:20:49,672 --> 00:20:55,374
em And we started to look at where we could get that data without having to go get it
ourselves.
271
00:20:55,374 --> 00:21:05,167
And I think we were quoted by one group who shall remain anonymous, $100,000 just for some
small piece of this data.
272
00:21:05,167 --> 00:21:06,717
And so we thought this is absurd.
273
00:21:06,717 --> 00:21:09,476
You know, so
274
00:21:09,656 --> 00:21:10,726
We're getting it ourselves.
275
00:21:10,726 --> 00:21:18,528
So we're going to have every single piece of, let me tell you the list, because I want to
make sure I get this right.
276
00:21:18,528 --> 00:21:25,590
Statutes, regulations, state constitutions, court rules, session laws, and attorney
general opinions from all 50 states.
277
00:21:25,669 --> 00:21:28,051
And it will be part of our database.
278
00:21:28,051 --> 00:21:30,182
We will be demoing this at ILTACON.
279
00:21:30,182 --> 00:21:31,532
So come by and see it.
280
00:21:31,532 --> 00:21:37,294
But we think we'll have the most thorough corpus of anyone, especially at the prices we
charge.
281
00:21:37,294 --> 00:21:39,233
And everything will be fully searchable.
282
00:21:39,233 --> 00:21:40,283
by natural language.
283
00:21:40,584 --> 00:21:41,034
Wow.
284
00:21:41,034 --> 00:21:42,675
Yeah, that's big news.
285
00:21:42,675 --> 00:21:44,568
Yeah, that's big news.
286
00:21:44,568 --> 00:21:54,198
We are recording on Monday the 4th, but this episode will be released on Wednesday the
6th, which is going to be in time for ILTACON, so folks can come by.
287
00:21:54,198 --> 00:21:55,719
Are you guys exhibiting?
288
00:21:56,155 --> 00:21:58,476
Yeah, we're in the startup hub, which is cool.
289
00:21:58,476 --> 00:22:00,597
We haven't been to ILTACON before.
290
00:22:00,597 --> 00:22:05,460
The only conference we've done before is Tech Show, which was so much fun.
291
00:22:05,460 --> 00:22:06,300
Oh my gosh.
292
00:22:06,300 --> 00:22:09,412
We love that one, with Bob Ambrogi and the Startup Alley and stuff.
293
00:22:09,412 --> 00:22:11,833
So anyone out there who's got a startup apply for that.
294
00:22:11,833 --> 00:22:13,964
It's so much fun if you can get it.
295
00:22:14,204 --> 00:22:17,236
So yeah, so we're in booth one, two, three in the Startup Alley.
296
00:22:17,236 --> 00:22:18,532
So come see us.
297
00:22:18,532 --> 00:22:19,002
Awesome.
298
00:22:19,002 --> 00:22:30,259
Yeah. When we were first finding our way in legal, back in our predecessor company, we
were called Acrowire. I'm a former Microsoft guy and just kind of started consulting
299
00:22:30,259 --> 00:22:32,900
on the side as a lot of tech people do.
300
00:22:32,900 --> 00:22:36,212
And then the side gig started making more than the day job.
301
00:22:36,212 --> 00:22:39,284
So I quit, and, terrible timing.
302
00:22:39,284 --> 00:22:47,108
This was like right after the financial crisis, but I had enough client work to keep me
busy and my bills paid.
303
00:22:47,260 --> 00:23:00,065
So I jumped in, and we fell down the legal rabbit hole early, and we didn't really know
where our skills mapped on the market spectrum in terms of size.
304
00:23:00,065 --> 00:23:13,411
Our first show was ABA Tech Show, and it's great for solo and small firms, but for big
firms, really ILTA and Legalweek, those are better fits
305
00:23:13,411 --> 00:23:15,852
for our audience, which is large law.
307
00:23:17,270 --> 00:23:21,922
our first experience in legal tech was just a few months after launching.
308
00:23:21,922 --> 00:23:28,715
We were a finalist for one of the Legalweek awards.
309
00:23:28,955 --> 00:23:30,836
Wow, that was amazing.
310
00:23:30,836 --> 00:23:33,087
Cause we had never been to anything like that.
311
00:23:33,087 --> 00:23:41,360
And then to walk into that venue down there, at the Hilton in New York, I was like, holy
smokes.
312
00:23:41,360 --> 00:23:43,932
Like this is, this is big time.
313
00:23:43,932 --> 00:23:45,117
It was pretty cool.
314
00:23:45,117 --> 00:23:45,709
Yeah.
315
00:23:45,709 --> 00:23:47,663
That was back when Stephanie Wilkins was running it.
316
00:23:47,663 --> 00:23:48,253
It was awesome.
317
00:23:48,253 --> 00:23:49,193
Yeah.
318
00:23:49,193 --> 00:23:49,654
Yeah.
319
00:23:49,654 --> 00:23:52,094
I mean, Legalweek is an interesting one for us.
320
00:23:52,094 --> 00:24:03,359
We attend every year, but it, you know, historically has been e-discovery focused, and I
know they've tried to move beyond that in recent years, but we go, we
321
00:24:03,359 --> 00:24:04,699
don't exhibit.
322
00:24:04,699 --> 00:24:09,741
Um, the exhibit hall is very disjointed.
323
00:24:09,741 --> 00:24:17,584
Like, you know, there's all sorts of little nooks and crannies, but we always make an
appearance at Legalweek.
325
00:24:19,367 --> 00:24:20,097
It's a good one.
326
00:24:20,097 --> 00:24:30,606
And then the one that I really liked a lot was very different, completely at the other end of
the scale, which is the LSC ITC, the Legal Services Corporation's Innovations in Technology Conference, right?
327
00:24:30,606 --> 00:24:34,089
So that was all about access to justice stuff.
328
00:24:34,089 --> 00:24:35,220
So that was really cool.
329
00:24:35,220 --> 00:24:40,154
It was kind of... Bob Ambrogi wrote a really interesting piece last year about it, about the
difference.
330
00:24:40,154 --> 00:24:48,340
Like, he went from one to the other and he was like, wow, that just shows the gulf in
resources between the two ends of the...
331
00:24:48,366 --> 00:24:49,296
cool, so to speak.
332
00:24:49,296 --> 00:24:50,097
Interesting.
333
00:24:50,097 --> 00:24:51,958
Well, I'm curious.
334
00:24:51,958 --> 00:24:54,619
You had mentioned the group out of Oakland.
335
00:24:54,619 --> 00:24:57,300
Are they?
336
00:24:57,300 --> 00:24:57,551
Yeah.
337
00:24:57,551 --> 00:24:57,851
Yeah.
338
00:24:57,851 --> 00:25:01,703
So is that an academic institution?
339
00:25:01,703 --> 00:25:03,444
Like who's behind that?
340
00:25:03,798 --> 00:25:05,679
Yeah, so they're nonprofit.
341
00:25:05,679 --> 00:25:09,822
They're run by a guy named Michael Lissner, who's great.
342
00:25:09,822 --> 00:25:11,262
So hi, Michael, if you're listening.
343
00:25:11,262 --> 00:25:15,785
And, like I say, they're a nonprofit.
344
00:25:15,785 --> 00:25:18,577
They are very mission-driven.
345
00:25:18,577 --> 00:25:24,851
I'm sure that they've got all kinds of academics involved with them and sort of on their
board and things like that.
346
00:25:24,851 --> 00:25:29,593
But they're a straight up nonprofit, which means that they are...
348
00:25:31,478 --> 00:25:35,183
sort of at the source powering a lot of the legal tech innovation.
349
00:25:35,183 --> 00:25:39,729
And of course, they're not going to say publicly who their clients are unless they're like
us.
350
00:25:39,729 --> 00:25:42,142
And we're always willing to say we're their client.
351
00:25:42,142 --> 00:25:50,479
If you scratched the surface on a lot of these startups, you would see that's where
their data is coming from, for sure.
352
00:25:50,479 --> 00:26:04,724
So with these types of organizations out there working hard to kind of open-source legal
information, does that open the door to challenge, you know, what's largely a duopoly,
353
00:26:04,724 --> 00:26:10,859
it feels like, with Westlaw and Lexis? Does that open the door?
354
00:26:12,798 --> 00:26:13,279
Yes.
355
00:26:13,279 --> 00:26:13,629
Yes.
356
00:26:13,629 --> 00:26:16,060
So yes and no, depending on how you look at it.
357
00:26:16,060 --> 00:26:18,842
So broadly speaking, absolutely.
358
00:26:18,842 --> 00:26:20,943
Like for sure.
359
00:26:20,943 --> 00:26:29,568
Like, if anything is going to challenge that duopoly and those extraordinarily powerful
companies, it's AI.
360
00:26:29,568 --> 00:26:29,988
Right?
361
00:26:29,988 --> 00:26:38,182
I mean, the moment is here, and they know that, and that's why they're building
so fast and acquiring so quickly and, you know, all this stuff.
362
00:26:38,182 --> 00:26:39,573
there's no question.
363
00:26:39,573 --> 00:26:39,873
Right.
364
00:26:39,873 --> 00:26:41,742
And we've seen some really interesting
365
00:26:41,742 --> 00:26:46,822
things happening, like with vLex and Clio and, you know, Harvey and Lexis.
366
00:26:46,822 --> 00:26:50,562
So everyone's paying attention to this, as you certainly know, right, Ted?
367
00:26:50,562 --> 00:26:53,542
Because you do, you're a podcaster, you talk about this a lot.
368
00:26:54,442 --> 00:26:57,662
But here's where it also opens the door.
369
00:26:57,682 --> 00:26:59,342
So that's like one end of the market, right?
370
00:26:59,342 --> 00:27:08,322
And a lot of people, if not almost all people, are aiming for the exact same part of the
market, right?
371
00:27:08,322 --> 00:27:10,782
So the well-served...
372
00:27:11,180 --> 00:27:13,072
places, and this is fine.
373
00:27:13,072 --> 00:27:14,122
They need their tools.
374
00:27:14,122 --> 00:27:14,563
This is great.
375
00:27:14,563 --> 00:27:16,274
I have nothing, no digs or anything.
376
00:27:16,274 --> 00:27:21,709
But if everybody's always aiming for the same part of the market, there's not going to be
a lot of innovation, right?
377
00:27:21,709 --> 00:27:25,442
That's not where things are going to get exciting, at least from my point of view.
378
00:27:25,442 --> 00:27:32,968
My point of view, and the way we focus at Descrybe, is: how about that gigantic, unserved
market, right?
379
00:27:32,968 --> 00:27:37,590
And I think it was, I don't want to misquote him, but I'm pretty sure it was Jack Newton.
380
00:27:37,590 --> 00:27:40,984
It was Jack Newton from Clio who said, um
381
00:27:40,984 --> 00:27:47,218
that there was maybe, I think it was like a trillion dollars of unmet need in the legal
services industry.
382
00:27:47,218 --> 00:27:58,166
Now, we'd better fact-check that, or, you know, I might get a letter from Uncle Leo saying
that was wrong. But regardless, there's an enormous amount of unmet need out there for
383
00:27:58,166 --> 00:28:00,148
legal services, legal information.
384
00:28:00,148 --> 00:28:06,512
So when we were first starting out, we thought to ourselves, we could be a nonprofit,
right?
385
00:28:06,512 --> 00:28:09,806
We could probably be a very successful nonprofit.
386
00:28:09,806 --> 00:28:13,386
if we wanted to be, we could get grants, we could do all sorts of things.
387
00:28:13,386 --> 00:28:16,266
But we always said, nope, we're a business.
388
00:28:16,906 --> 00:28:17,626
We're a business.
389
00:28:17,626 --> 00:28:29,346
And while we have a mission-driven focus, we are trying to create a very successful and
sustainable and profitable business because AI allows us to offer very high quality tools
390
00:28:29,346 --> 00:28:35,526
that you could not have even imagined, just a few years ago, could possibly exist without
human intervention.
391
00:28:37,870 --> 00:28:45,087
We put it out there at a price that is so absurdly low compared to the competitors that we're
just gonna gobble up so much market share.
392
00:28:45,087 --> 00:28:46,748
that's the goal.
393
00:28:46,748 --> 00:28:54,666
So while we do like to present ourselves as, and we are, mission-driven, we have a very serious
market-driven approach to what we're doing.
394
00:28:54,666 --> 00:29:00,231
And it's really starting to come to fruition now that we have a paid tool.
395
00:29:00,231 --> 00:29:03,946
So I think there's huge opportunity, but I think...
396
00:29:03,946 --> 00:29:15,923
I would caution other people who are looking at this space: don't automatically assume you
have to compete, you know, with Harvey, with Lexis directly, with vLex. You know, that's
397
00:29:15,923 --> 00:29:16,954
not easy, right?
398
00:29:16,954 --> 00:29:18,715
Those are very well-funded firms.
399
00:29:18,715 --> 00:29:24,148
So look at other parts of the market, where other people aren't looking, and see what
you can do there.
400
00:29:24,424 --> 00:29:34,090
Yeah, you know, candidly, I've talked about this publicly before, and I don't know, it
may ruffle some feathers. The... what?
401
00:29:34,090 --> 00:29:34,380
Yeah.
402
00:29:34,380 --> 00:29:36,001
I mean, I'm not for everybody.
403
00:29:36,001 --> 00:29:38,252
I know that my opinions don't resonate with everybody.
404
00:29:38,792 --> 00:29:39,672
Yeah.
405
00:29:40,133 --> 00:29:40,563
Yeah.
406
00:29:40,563 --> 00:29:41,244
Feather ruffle it.
407
00:29:41,244 --> 00:29:41,444
Yeah.
408
00:29:41,444 --> 00:29:52,275
I've ruffled a few feathers in my day, not intentionally, but I feel like I'm
just being candid, and if people get offended, then they do, but
409
00:29:52,275 --> 00:29:56,788
You know, we have really struggled on the smaller end of the market within the
law firm world.
410
00:29:56,788 --> 00:30:01,812
A lot of people call mid-law like 50 to 150 lawyers.
411
00:30:01,812 --> 00:30:04,573
That really feels like small law for us.
412
00:30:04,573 --> 00:30:08,096
So our market is really a hundred attorneys and up.
413
00:30:08,096 --> 00:30:13,129
And there are about 400 of those law firms in North America, which is our focus.
414
00:30:13,129 --> 00:30:20,530
We're doing some business in the UK now too, but by my estimates, and there's no good
numbers, there's only about 600 law firms on planet Earth
415
00:30:20,530 --> 00:30:22,381
who have more than a hundred attorneys.
416
00:30:22,381 --> 00:30:35,304
There are tens of thousands, if not hundreds of thousands, of solo and smaller firms, so
it's a very kind of, I guess, steeply sloping pyramid.
417
00:30:35,304 --> 00:30:38,625
Um, yeah, that's a good way to describe it.
418
00:30:38,625 --> 00:30:41,465
Um, but it's really hard to do business there.
419
00:30:41,566 --> 00:30:50,578
And the reason it's hard to do business there as a tech company is because law firms
historically have not seen tech as strategic.
420
00:30:50,828 --> 00:30:56,110
And, I mean, that's just a fact; people can say what they want.
421
00:30:56,110 --> 00:30:58,351
I've been in this space 20 years.
422
00:30:58,351 --> 00:31:04,874
I know it to be true that that has been the historical perspective in the law firm world.
That is changing.
423
00:31:04,874 --> 00:31:07,945
And there are, and there are outliers where that is not true.
424
00:31:07,945 --> 00:31:17,649
And I get that, but in the smaller end of the market, the things
that we've struggled with are a lack of sophistication from a technical perspective.
425
00:31:17,649 --> 00:31:20,500
You know, they could be rock stars on the
426
00:31:20,500 --> 00:31:32,643
practice side, but a lot of times in that hundred-attorney-and-under world, you'll have an
IT director who was their former network administrator, who did a good job.
427
00:31:32,643 --> 00:31:35,064
So they got promoted, and now they're, you know what I mean?
428
00:31:35,064 --> 00:31:39,685
And there's nothing wrong with that, but it's, it's...
429
00:31:40,266 --> 00:31:40,776
Right.
430
00:31:40,776 --> 00:31:41,986
Yeah, exactly.
431
00:31:41,986 --> 00:31:48,711
And so there's just been a lack of technical sophistication, and, you know, there's also...
I've got
432
00:31:48,711 --> 00:31:50,592
a really good friend of mine here in St.
433
00:31:50,592 --> 00:31:54,855
Louis; he owns a 50-attorney law firm and we play golf together.
434
00:31:54,855 --> 00:32:06,493
You know, he asked me for somebody to go speak at his retreat, and I teed a couple
of people up, and he was complaining about the price.
435
00:32:06,493 --> 00:32:12,197
I'm like, dude, you are lucky that this person is even giving you their attention, right?
436
00:32:12,197 --> 00:32:16,460
They normally speak in front of hundreds of thousands of people.
437
00:32:16,476 --> 00:32:27,089
Like, that's a bargain. But, you know, in all small businesses, including us,
every dollar matters; you really have to manage your spend.
438
00:32:27,089 --> 00:32:36,870
But when you have limited budget combined with an audience who doesn't see tech as
strategic, it's a really difficult market to sell into as a tech company.
439
00:32:37,218 --> 00:32:42,502
completely, but those people are also in very serious peril.
440
00:32:42,923 --> 00:32:43,323
Right?
441
00:32:43,323 --> 00:32:45,665
Like, and we're talking like a long arc here.
442
00:32:45,665 --> 00:32:56,694
So you're right, for individual companies: can potential customers adapt
fast enough to keep us sustainable and keep us going? For sure, on an individual
443
00:32:56,735 --> 00:33:03,340
or even at a current-market level, you could argue that the change
is going to be too slow and all that.
444
00:33:03,340 --> 00:33:05,942
But if you look at like a broader arc,
445
00:33:06,286 --> 00:33:08,466
this is not going to stay the same.
446
00:33:08,466 --> 00:33:13,766
Like think about sort of how we get our medical care now, right?
447
00:33:13,766 --> 00:33:15,846
And how much that has changed.
448
00:33:15,846 --> 00:33:32,786
And if you had a doctor's office, say, who refused to do e-charts or refused to meet with
you online or refused to take your prescriptions through the portal versus calling.
449
00:33:32,786 --> 00:33:35,724
It's just... and medical is another weird one.
450
00:33:35,724 --> 00:33:41,108
But, like, they couldn't survive, right?
451
00:33:41,108 --> 00:33:42,079
You couldn't survive.
452
00:33:42,079 --> 00:33:52,388
So law probably has the wrong impression; there are some people who think
that they don't have to adapt in the same way.
453
00:33:52,388 --> 00:33:53,609
And there's a lot of resistance.
454
00:33:53,609 --> 00:33:58,563
And, for sure, one of the dumb jokes we tell, and we have many, is that when we're like,
what should we build?
455
00:33:58,563 --> 00:34:05,198
Let's find the part of the economy where people are most, you know,
456
00:34:05,198 --> 00:34:08,518
open to risk and they love technology and they just are ready to share.
457
00:34:08,518 --> 00:34:10,178
We're like, wow, perfect.
458
00:34:10,178 --> 00:34:12,298
You know, talk about... what could we have picked that was harder?
459
00:34:12,298 --> 00:34:13,918
I can't even imagine.
460
00:34:13,978 --> 00:34:19,658
But I do think we're at a point where no matter what it has to change.
461
00:34:19,658 --> 00:34:29,658
You can't have the triangulation we have right now with the advent of AI and just how
fundamentally that's going to change everything we do, soup to nuts.
462
00:34:29,658 --> 00:34:30,018
Right.
463
00:34:30,018 --> 00:34:35,318
And then you have a vastly underserved part of the market.
464
00:34:35,648 --> 00:34:42,594
and then extraordinarily overpriced legacy providers; like, that's gonna implode, right?
465
00:34:42,594 --> 00:34:55,155
And so the smart attorneys out there, and they're all smart, no offense, the tech-forward
and forward-thinking attorneys, are gonna have to figure out how to adapt to this, or
466
00:34:55,155 --> 00:35:00,039
they're gonna see their share of the market, you know, start to crater.
467
00:35:00,039 --> 00:35:03,372
And the new students who are going through law school and learning this stuff,
468
00:35:03,372 --> 00:35:05,267
are gonna come out with different expectations too.
469
00:35:05,267 --> 00:35:09,234
So there'll be a fight for talent as well.
470
00:35:09,234 --> 00:35:10,315
Yeah, no, for sure.
471
00:35:10,315 --> 00:35:16,989
Yeah, that kind of legacy mindset is incompatible with where we're going.
472
00:35:16,989 --> 00:35:25,075
It's just... I know so many firms that are going to struggle to adapt, and not necessarily
just small firms.
473
00:35:25,075 --> 00:35:30,838
I know of an AmLaw firm where I'm really good friends with the chief knowledge officer
there.
475
00:35:33,402 --> 00:35:39,658
I was working on a project with her for a presentation and she was completely dark in
January.
476
00:35:39,658 --> 00:35:41,066
And I was like, where have you been?
477
00:35:41,066 --> 00:35:42,851
She's like, I've been working on a DMS upgrade.
478
00:35:42,851 --> 00:35:44,103
I'm like, DMS?
479
00:35:44,103 --> 00:35:44,733
You've been working?
480
00:35:44,733 --> 00:35:46,035
I was like, that's IT.
481
00:35:46,035 --> 00:35:46,895
What's the chief knowledge?
482
00:35:46,895 --> 00:35:48,387
She's like, I am IT.
483
00:35:48,387 --> 00:35:49,407
And I am KM.
484
00:35:49,407 --> 00:35:56,875
And it's like, OK, your firm has set you up for failure if you're going to be promoting
knowledge and innovation.
485
00:35:57,176 --> 00:35:57,474
Yeah.
486
00:35:57,474 --> 00:35:59,266
That's a strategic error on their part.
487
00:35:59,266 --> 00:36:05,501
And you know, maybe this dovetails back to the conversation we were having about
benchmarking too, right?
488
00:36:05,501 --> 00:36:13,102
Because the risk for firms, for any size firm, is huge if they pick
the wrong tool.
489
00:36:13,237 --> 00:36:23,305
It's huge if they rely on something that's spotty, and that's why the
momentum is hard to change with the big trusted companies. Because I get it.
490
00:36:23,305 --> 00:36:24,470
I mean, if I'm...
491
00:36:24,470 --> 00:36:29,214
you know, working at some firm and I'm in charge of their IT and making sure they have the
right tools.
492
00:36:29,214 --> 00:36:32,617
I'm not really interested in some, you know, brand new startup.
493
00:36:32,617 --> 00:36:34,769
Like that's, that's scary, right?
494
00:36:34,769 --> 00:36:36,400
That's, that's super scary.
495
00:36:36,400 --> 00:36:41,665
So again, it comes back to: where can the industry provide a little more
reassurance?
496
00:36:41,665 --> 00:36:43,516
I think it's useful for all of us.
497
00:36:43,516 --> 00:36:46,863
Yeah, well you've had some validation.
498
00:36:46,863 --> 00:36:52,694
Talk a little bit about Casetext and the law school curricula and what you guys are doing
there.
499
00:36:52,909 --> 00:36:54,610
Yeah, so this is really fun.
500
00:36:54,610 --> 00:36:58,430
This actually happened at the Tech Show.
501
00:36:58,970 --> 00:37:00,370
It's always good to...
502
00:37:00,370 --> 00:37:04,550
It's always good to show up at these things when you're new, because you never
know who you're going to meet.
503
00:37:05,190 --> 00:37:08,530
So we, as folks who have been listening know,
504
00:37:08,530 --> 00:37:09,890
have case law research.
505
00:37:09,890 --> 00:37:16,530
We have free access and we have a paid tool, $10 a month or $20 a month, depending
on the level.
506
00:37:16,530 --> 00:37:18,810
So paid, but almost free.
507
00:37:18,810 --> 00:37:19,510
Right.
508
00:37:19,510 --> 00:37:20,410
So
509
00:37:21,142 --> 00:37:29,889
The legal tech curriculum is from, hang on, I'm just gonna get the exact right
terminology here so I'm not saying the wrong thing.
510
00:37:29,889 --> 00:37:43,008
So we will be rolling out into over 350 law schools internationally this fall as
part of a legal curriculum from the National Society for Legal Technology, and we will
511
00:37:43,008 --> 00:37:45,379
be part of their legal research curriculum.
512
00:37:45,379 --> 00:37:49,442
And what's really fun about this is that we are replacing Casetext
513
00:37:49,502 --> 00:38:00,140
in their curriculum, because Casetext, now that they've been acquired for $650
million, good for them, no longer has free access.
514
00:38:00,140 --> 00:38:02,192
I think that was turned off in March.
515
00:38:02,192 --> 00:38:06,405
So we will be replacing our friends at Casetext in the curriculum.
516
00:38:06,405 --> 00:38:14,171
And so we will be included with Lexis Plus, Westlaw, Bloomberg Law, Fastcase, and
HeinOnline, and us, Descrybe.
517
00:38:14,171 --> 00:38:16,192
So it does show that
518
00:38:16,415 --> 00:38:18,846
I guess, validation of the tool.
519
00:38:18,846 --> 00:38:31,164
Doug Lusk, the CEO of the NSLT, said, you know, we were his favorite, and he
was very impressed with what we were building.
520
00:38:31,164 --> 00:38:34,536
So that was some big validation for us.
521
00:38:34,536 --> 00:38:45,932
Another nice piece of validation we had recently was when Bob Ambrogi covered our AI citator
release and said we were now poised to meaningfully
522
00:38:45,998 --> 00:38:50,741
be thought of as a competitor to the big legal research tools.
523
00:38:50,741 --> 00:38:52,112
So these are pretty exciting things.
524
00:38:52,112 --> 00:38:56,966
And again, good luck having done this without AI.
525
00:38:56,966 --> 00:38:59,608
It quite literally would not have been possible.
526
00:38:59,608 --> 00:39:09,554
So we're kind of the poster child for how nimble and lean and sort of creative AI
companies can be, which is pretty cool.
527
00:39:10,595 --> 00:39:11,536
Thank you.
528
00:39:11,536 --> 00:39:12,957
We're very excited.
529
00:39:14,098 --> 00:39:14,750
Yeah.
530
00:39:14,750 --> 00:39:17,180
So an interesting thing happened this week.
531
00:39:17,180 --> 00:39:19,671
So we are a TLTF portfolio company.
532
00:39:19,671 --> 00:39:28,753
One of the directors over there tagged me and a few others in a post, Tom Baldwin
from Entegrata.
533
00:39:28,954 --> 00:39:37,296
And a VC had written a future-of-law kind of
534
00:39:38,037 --> 00:39:40,157
manifesto, and it's so good.
535
00:39:40,157 --> 00:39:41,617
Like I, I agree.
536
00:39:41,617 --> 00:39:42,977
They're called Catalyst.
537
00:39:42,977 --> 00:39:45,637
I'd really, I'd never heard of them prior to that.
538
00:39:45,637 --> 00:39:53,457
Maybe we can put a link in the show notes, but they mentioned InfoDash as a
company they're excited about.
539
00:39:53,477 --> 00:40:00,893
And, you know, I get, I'm not exaggerating, 15 to 20 emails from VCs a week.
540
00:40:00,893 --> 00:40:04,694
LinkedIn messages, they call, I don't know how they have my cell phone number, but they
do.
541
00:40:04,694 --> 00:40:10,455
I'm flattered, but I could fill up my calendar just talking to them, and we
don't need money.
don't need money.
542
00:40:10,455 --> 00:40:20,958
We're bootstrapped, and we took a little bit of funding from TLTF,
not really for the funding, but just because working with them is amazing.
543
00:40:20,958 --> 00:40:23,208
Um, they are.
544
00:40:23,208 --> 00:40:24,399
Yeah.
545
00:40:24,399 --> 00:40:25,759
They, they can open doors.
546
00:40:25,759 --> 00:40:26,859
They know the market.
547
00:40:26,859 --> 00:40:29,122
It's, it's a, it's a really good relationship.
548
00:40:29,122 --> 00:40:33,589
Plus you get to go to their fun summit, which is always in really cool places.
549
00:40:34,592 --> 00:40:35,592
Yeah.
550
00:40:36,255 --> 00:40:38,094
Oh, Austin's the best, yeah.
551
00:40:38,094 --> 00:40:42,755
like the Ritz-Carlton in Fort Lauderdale, and then last year in Key Biscayne.
552
00:40:42,795 --> 00:40:43,706
So it's really good.
553
00:40:43,706 --> 00:40:56,419
But, you know, I think what is also interesting now, in terms of challengers to
these big established players, is that you have funds like TLTF that are completely zeroed
554
00:40:56,419 --> 00:40:59,124
in. And, you know, reading that
555
00:40:59,124 --> 00:41:09,927
Catalyst article about the future of law made me realize something. I thought the
VCs were hitting us up because we've had really strong growth, and it's very visible on
556
00:41:09,927 --> 00:41:13,458
LinkedIn just by headcount. So I was like, that's why they're hitting us up.
557
00:41:13,458 --> 00:41:26,992
But after I read that, I was like, oh, there is a broader investment thesis that what we're
doing aligns to, and I never realized how smart these funds are in
558
00:41:26,992 --> 00:41:29,062
understanding like
559
00:41:29,929 --> 00:41:33,822
These are some of the smartest people to ever go into these spaces.
560
00:41:33,822 --> 00:41:39,398
And VCs, boy, their whole value prop is upside down right now.
561
00:41:39,398 --> 00:41:42,371
So they're trying to figure out how to survive too, right?
562
00:41:42,371 --> 00:41:44,053
Because AI is flipping everything.
563
00:41:44,053 --> 00:41:47,126
You don't need those kinds of big investments anymore for engineering.
564
00:41:47,287 --> 00:41:50,802
So they're nervous.
565
00:41:50,802 --> 00:41:55,323
There's a lot of money on the sidelines that needs to find a home, and it's kind of their
job to do it.
566
00:41:55,323 --> 00:42:00,344
This was written like I wrote it, and I've been in the space 20 years.
567
00:42:00,344 --> 00:42:02,305
I host a podcast on it.
568
00:42:02,305 --> 00:42:03,655
I speak at conferences.
569
00:42:03,655 --> 00:42:11,317
I attend conferences and this VC wrote what I thought was a fantastic outlook for where
legal is going.
570
00:42:11,317 --> 00:42:12,968
So anyway, we were...
571
00:42:13,328 --> 00:42:13,862
yeah.
572
00:42:13,862 --> 00:42:14,774
You guys send me a link.
573
00:42:14,774 --> 00:42:15,554
really good.
574
00:42:15,554 --> 00:42:16,805
It's really good.
575
00:42:16,945 --> 00:42:26,330
But so how does a bootstrapped company, you know, a two-person bootstrapped company, go
about competing with these?
576
00:42:26,330 --> 00:42:27,690
I mean, you've told your story.
577
00:42:27,690 --> 00:42:29,671
Is your story repeatable?
578
00:42:29,671 --> 00:42:31,872
is that a playbook people can use?
579
00:42:32,174 --> 00:42:33,734
I mean, yes and no.
580
00:42:33,734 --> 00:42:36,694
here's the other kind of thing that's cool.
581
00:42:36,694 --> 00:42:44,034
Like, everybody always thinks that twenty-whatever-year-old startup people are magic,
and they are.
582
00:42:44,154 --> 00:42:51,914
But when you're a startup founder in a different part of your career, later in your
career, you can kind of do things a little differently too.
583
00:42:51,914 --> 00:42:58,154
Like hopefully you have a little bit of your own money you can invest or you have a little
more, you know, ability to kind of do it the way you want.
584
00:42:58,614 --> 00:43:00,800
I would say that, you know,
585
00:43:00,800 --> 00:43:02,151
depending on where you are in your life.
586
00:43:02,151 --> 00:43:06,134
um Because, you know, we did invest in it, right?
587
00:43:06,134 --> 00:43:10,918
We invested time, and, you know, it's not like OpenAI was giving us free credits, you know
what I mean?
588
00:43:10,918 --> 00:43:16,453
Like, so there was some significant investment, but, we invested it ourselves, right?
589
00:43:16,453 --> 00:43:26,061
And so since we didn't have to pay for engineering, which is really expensive, and we
didn't have to pay for marketing, which is also really expensive, like we saved a ton of
590
00:43:26,061 --> 00:43:26,251
money.
591
00:43:26,251 --> 00:43:30,794
So I say yes, it's repeatable for the right
592
00:43:31,288 --> 00:43:32,688
people, if that makes sense.
593
00:43:32,688 --> 00:43:42,341
You kind of have to have some of your own resources, and you have to be tenacious as hell.
594
00:43:42,341 --> 00:43:43,079
You know what I mean?
595
00:43:43,079 --> 00:43:44,942
It's like, it is hard.
596
00:43:45,022 --> 00:43:47,262
It is hard, but it is fun.
597
00:43:47,342 --> 00:43:52,784
And I do... I used to not have gray hair.
598
00:43:53,064 --> 00:43:54,744
So there you go.
599
00:43:54,744 --> 00:44:00,638
But I also think, and this is like a bit esoteric, so forgive me, but I do also think AI,
600
00:44:00,706 --> 00:44:12,029
We talk about this on the show a lot with VCs and stuff. Like, I think AI is going to also
help us think about different ways we fund things, because right now we have a very
601
00:44:12,029 --> 00:44:13,729
broken model, I think.
602
00:44:13,850 --> 00:44:21,672
And this is where state governments, if they're smart, are going to start to step in
and things like that, which, you know, smart government can be maybe a big ask.
603
00:44:21,672 --> 00:44:29,814
But, you know, if you're a nonprofit and you want to go the full
altruistic route, there are resources for you to build.
604
00:44:29,814 --> 00:44:30,456
Right.
605
00:44:30,456 --> 00:44:33,888
There's grants, there's foundations, there's all kinds of stuff.
606
00:44:33,888 --> 00:44:39,198
If you want to go the full, I'm going to charge as much as I possibly can for my product
and make as much money as possible.
607
00:44:39,198 --> 00:44:43,693
So someone wants to buy me and I have an exit, we have a model for that, right?
608
00:44:43,693 --> 00:44:51,728
A model like ours where we're like, I could be charging 30 times more for what we have,
but I'm not because I'm trying to do something that's in between.
609
00:44:51,728 --> 00:44:53,779
It's like a social entrepreneurship, whatever.
610
00:44:53,779 --> 00:44:59,394
There's very limited money, structures, support, even thinking around
611
00:44:59,394 --> 00:45:04,358
how to build there and that's sad because that is where there's so much opportunity.
612
00:45:04,358 --> 00:45:15,005
So the challenge, and the call I will put out to any politicos or
people in any space thinking about this: there's a real opportunity there to rethink
613
00:45:15,005 --> 00:45:21,570
how we even fund sort of disruptive technologies right now.
614
00:45:21,692 --> 00:45:23,183
Yeah, no, that's a great point.
615
00:45:23,183 --> 00:45:33,559
I mean, you know, VCs, PE, growth equity, all of these funding models have to generate a
return on their investment.
616
00:45:33,559 --> 00:45:41,234
And if you're leaving money on the table, which you are intentionally, I don't
know that that's the right fit for them.
617
00:45:41,494 --> 00:45:44,035
No, they don't understand.
618
00:45:44,035 --> 00:45:47,577
It's truly bizarre to them.
619
00:45:49,198 --> 00:45:57,411
But they're going to have to adapt because the model for what people are building is going
to change and they're going to be left out because people aren't going to take their money
620
00:45:57,411 --> 00:45:59,353
and then jack their prices up.
621
00:45:59,353 --> 00:46:02,645
And the younger generations want to make meaningful change.
622
00:46:02,645 --> 00:46:04,206
They don't want to just be rich.
623
00:46:04,206 --> 00:46:05,146
Some do.
624
00:46:05,192 --> 00:46:05,843
it's true.
625
00:46:05,843 --> 00:46:06,143
Yeah.
626
00:46:06,143 --> 00:46:10,299
I mean, rich can be, you know, a byproduct of just doing good things.
627
00:46:10,299 --> 00:46:13,063
Um, that's the hope, right?
628
00:46:13,063 --> 00:46:13,762
And
629
00:46:13,762 --> 00:46:23,994
Yeah, in Massachusetts, our former Secretary of Economic Development,
Yvonne Hao, used to call it "do well by doing good."
630
00:46:24,295 --> 00:46:25,535
I love that.
631
00:46:25,576 --> 00:46:27,149
There's no reason you can't have both.
632
00:46:27,149 --> 00:46:28,000
It makes sense.
633
00:46:28,000 --> 00:46:34,752
I mean, you know, I could go take some funding and probably grow a heck of a lot
faster and probably end up at a bigger payday.
634
00:46:34,752 --> 00:46:35,263
But you know what?
635
00:46:35,263 --> 00:46:36,453
I love what I do.
636
00:46:36,453 --> 00:46:37,901
I'm growing incrementally.
637
00:46:37,901 --> 00:46:40,425
I mean, we're having over a hundred percent year over year growth.
638
00:46:40,425 --> 00:46:43,736
It's not like it's slow, but I'm enjoying it.
639
00:46:43,736 --> 00:46:53,724
I don't need some VC breathing down my neck telling me I've got to push harder or I have to,
you know, fire people that aren't meeting these criteria.
640
00:46:54,114 --> 00:46:55,576
Or you get bumped out, right?
641
00:46:55,576 --> 00:46:56,892
They're like, oh, nevermind.
642
00:46:56,892 --> 00:46:57,806
Thanks for building this.
643
00:46:57,806 --> 00:46:58,278
Bye bye.
644
00:46:58,278 --> 00:46:59,299
Right, exactly.
645
00:46:59,299 --> 00:47:03,371
And that's just not, I'm at a stage in my life where I don't need to do that.
646
00:47:03,371 --> 00:47:05,752
And I'm just having a lot of fun.
647
00:47:05,752 --> 00:47:07,373
My reputation matters to me.
648
00:47:07,373 --> 00:47:13,163
I want to deliver great work to people and hold my head high when I walk around a legal
tech conference, you know.
649
00:47:13,163 --> 00:47:15,386
Maybe we need our own organization, Ted.
650
00:47:15,386 --> 00:47:19,730
Maybe like these stubborn Gen X bootstrappers.
651
00:47:19,730 --> 00:47:23,496
We need our own little conference.
652
00:47:23,496 --> 00:47:34,894
You know what I have honestly thought about in the past? Some sort of model
with which vendors could kind of organize and collaborate.
653
00:47:34,894 --> 00:47:37,456
And I haven't figured out how to do that yet.
654
00:47:37,456 --> 00:47:41,969
And I'm so busy with my day job, but, um, I started a franchisee association.
655
00:47:41,969 --> 00:47:43,970
My wife and I own five gyms here in St.
656
00:47:43,970 --> 00:47:51,295
Louis, and that brand needed some help on the franchisee side that we weren't getting from
HQ.
657
00:47:51,295 --> 00:47:52,606
They're great, but.
658
00:47:52,606 --> 00:47:54,917
They were growing so fast, they weren't giving support.
659
00:47:54,917 --> 00:47:59,036
So I started a nonprofit and it was wildly successful.
660
00:47:59,036 --> 00:48:00,430
I handed over the keys.
661
00:48:00,430 --> 00:48:03,502
It's still doing great, but man, I got so much out of it.
662
00:48:03,502 --> 00:48:05,222
We created mentoring programs.
663
00:48:05,222 --> 00:48:08,743
We did a summit where we brought everybody together.
664
00:48:08,743 --> 00:48:12,806
We brought in speakers and it was great.
665
00:48:12,806 --> 00:48:13,867
That's not a bad idea.
666
00:48:13,867 --> 00:48:15,882
Maybe you and I should connect at ILTA.
667
00:48:15,882 --> 00:48:16,722
talk, yeah.
668
00:48:16,722 --> 00:48:26,305
And then, from my branding perspective, we should encourage the big
players to help fund this, because we're going after markets they don't want anyway, or at
669
00:48:26,305 --> 00:48:27,186
least I am.
670
00:48:27,186 --> 00:48:35,848
So it can be like: fine, help the innovation economy, and don't worry, it's not people who
want your customers. Then they can look like the good guys in the room who are helping
671
00:48:35,848 --> 00:48:38,139
uh increase access to the law.
672
00:48:38,139 --> 00:48:39,688
There you go, boom, done.
673
00:48:39,688 --> 00:48:40,219
I like it.
674
00:48:40,219 --> 00:48:43,418
They won't like me, though, because I am taking their market share.
675
00:48:43,418 --> 00:48:44,163
competing with them.
676
00:48:44,163 --> 00:48:45,724
Okay, but we'll keep you.
677
00:48:45,724 --> 00:48:46,864
You can be a silent partner.
678
00:48:46,864 --> 00:48:48,608
Exactly, exactly.
679
00:48:48,608 --> 00:48:50,792
Well, this has been a great conversation.
680
00:48:50,792 --> 00:48:58,948
Before we wrap up, just tell people how they can find out more about what you do, your
podcasts, stuff like that.
681
00:48:59,278 --> 00:49:00,318
Yeah, great.
682
00:49:00,318 --> 00:49:04,378
So first of all, if you happen to be at ILTACON, come by the Startup Hub.
683
00:49:04,378 --> 00:49:07,338
We'll be at Booth 123, hanging out, and would love to meet you.
684
00:49:07,338 --> 00:49:09,098
You can always check out our website.
685
00:49:09,098 --> 00:49:14,618
It's Descrybe with a Y because, you know, startups have to have weirdly spelled names.
686
00:49:14,878 --> 00:49:17,098
So descrybe.ai, with a Y.
687
00:49:17,098 --> 00:49:21,938
We're most active on LinkedIn, which is kind of like where the Legal Tech people hang out.
688
00:49:21,938 --> 00:49:23,698
So find me on there.
689
00:49:23,698 --> 00:49:25,398
Happy to connect.
690
00:49:25,738 --> 00:49:27,578
Ping us, stop by.
691
00:49:27,804 --> 00:49:32,132
the booth, and yeah, I just love to talk to people in the community.
692
00:49:32,132 --> 00:49:33,402
It's the best part of the job.
693
00:49:33,402 --> 00:49:33,882
Awesome.
694
00:49:33,882 --> 00:49:36,664
And InfoDash will be in booth 308 as well.
695
00:49:36,664 --> 00:49:39,836
So stop by there after you stop by Descrybe's.
696
00:49:39,836 --> 00:49:42,417
um Well, this has been a great conversation.
697
00:49:42,417 --> 00:49:44,408
I look forward to seeing you next week.
698
00:49:44,408 --> 00:49:46,799
And let's keep the conversation going.
699
00:49:47,042 --> 00:49:48,500
Great, thanks Ted, this was really fun.
700
00:49:48,500 --> 00:49:50,478
I'm so happy to have joined you, thank you.
701
00:49:50,478 --> 00:49:50,891
Awesome.
702
00:49:50,891 --> 00:49:51,877
You're very welcome.
703
00:49:51,877 --> 00:49:52,869
Take care.
704
00:49:53,420 --> 00:49:54,332
Bye bye.
00:00:03,846
Kara, how are you this morning?
2
00:00:04,084 --> 00:00:04,738
I'm good.
3
00:00:04,738 --> 00:00:05,817
How are you doing, Ted?
4
00:00:05,817 --> 00:00:08,879
I'm doing great, doing great.
5
00:00:08,879 --> 00:00:20,828
I saw a thread on LinkedIn about some open-source data, and you had made a
comment in there that opened my eyes to some things that I didn't know.
6
00:00:20,949 --> 00:00:32,787
So you and I jumped on a call, you educated me a little bit, and I think we have a
really interesting conversation to be had, because I see that post
7
00:00:33,437 --> 00:00:42,843
a million times, and I think a lot of people don't understand all the details and nuance
behind it. Yeah, it'll be good to get some clarity around that.
8
00:00:42,843 --> 00:00:46,695
um But before we do, let's get you introduced.
9
00:00:46,695 --> 00:00:50,948
So you and I were actually on a podcast together.
10
00:00:50,948 --> 00:00:53,189
Was that a year ago or two years ago?
11
00:00:53,486 --> 00:00:54,786
Yeah, let's see.
12
00:00:54,786 --> 00:00:59,286
We launched like two years ago, so it has to have been within the past two-year
window.
13
00:00:59,286 --> 00:01:00,786
I just can't remember when.
14
00:01:00,786 --> 00:01:01,521
Yeah.
15
00:01:01,521 --> 00:01:09,838
and I think Horace and Max from Lagora and um maybe the guy from LegalOn.
16
00:01:09,838 --> 00:01:16,404
But anyway, so you are the co-founder of Descrybe.ai.
17
00:01:16,404 --> 00:01:19,966
You have an A2J focus.
18
00:01:19,966 --> 00:01:23,309
um You actually host a podcast yourself.
19
00:01:23,309 --> 00:01:28,133
Why don't you kind of fill in the gaps there and tell us a little bit about
20
00:01:28,137 --> 00:01:30,468
who you are, what you do, and where you do it.
21
00:01:31,160 --> 00:01:31,500
Sure.
22
00:01:31,500 --> 00:01:31,940
Yeah.
23
00:01:31,940 --> 00:01:32,581
Thanks so much.
24
00:01:32,581 --> 00:01:34,211
It's really exciting to be on your show.
25
00:01:34,211 --> 00:01:35,122
That was a fun one.
26
00:01:35,122 --> 00:01:35,622
I remember that.
27
00:01:35,622 --> 00:01:37,232
Horace always does a great show too.
28
00:01:37,232 --> 00:01:38,733
So that's a good one.
29
00:01:38,733 --> 00:01:39,513
Yeah, he does.
30
00:01:39,513 --> 00:01:42,365
I'm Kara Peterson.
31
00:01:42,365 --> 00:01:53,499
So I'm the co-founder of Descrybe.ai, and we are a two-year-old legal research company.
We like to think of ourselves sort of as the mavericks of legal research.
32
00:01:53,499 --> 00:01:58,741
Some have called us the Robin Hood of legal research, which is a fun
one too.
33
00:01:58,741 --> 00:02:00,610
But our, um
34
00:02:00,610 --> 00:02:11,421
whole goal is to really try to create tools that help change the way people can access
what should be, at least in theory, public information.
35
00:02:11,421 --> 00:02:14,884
So that's why I think that post was so interesting to me.
36
00:02:14,884 --> 00:02:25,214
And so as I've gotten really interested in AI over the past couple of years and sort of
all the things that it can do to change so many different parts of our society.
37
00:02:25,230 --> 00:02:27,281
That's where the show that I started came in.
38
00:02:27,281 --> 00:02:29,132
It's called Building AI Boston.
39
00:02:29,132 --> 00:02:34,613
And that's all about how AI is being used by real people to do really interesting things.
40
00:02:34,613 --> 00:02:39,015
So it's beyond legal tech, but legal tech does pop up here and there.
41
00:02:39,015 --> 00:02:40,555
So that's me.
42
00:02:40,676 --> 00:02:41,836
Those are my two hats.
43
00:02:41,836 --> 00:02:46,257
And then my third hat is I have a real job that pays me, but we'll talk about that
tomorrow.
44
00:02:46,623 --> 00:02:47,324
Okay.
45
00:02:47,324 --> 00:02:48,384
Good stuff.
46
00:02:48,384 --> 00:02:55,744
Well, yeah, let's talk about this open-source legal data
movement and, you know, what's reality?
47
00:02:55,744 --> 00:02:56,944
What's hype?
48
00:02:56,944 --> 00:03:12,242
I think it was kind of an interesting, headline-grabbing post that talked about,
you know, 99% of case law being open-sourced, and
49
00:03:12,242 --> 00:03:21,097
you had chimed in with some interesting comments just around kind of what that
really means.
50
00:03:21,097 --> 00:03:26,560
It sounds like some of this data has actually been around for a long time, but might be
somewhat dated.
51
00:03:26,560 --> 00:03:32,203
Like, help us separate the fact and fiction around these recent announcements.
52
00:03:33,582 --> 00:03:34,322
Right, sure.
53
00:03:34,322 --> 00:03:39,042
So this was one of those things where I saw it come through my LinkedIn feed.
54
00:03:39,042 --> 00:03:43,502
sometimes you have those moments where you're like, should I jump on this one?
55
00:03:43,502 --> 00:03:44,202
Should I not?
56
00:03:44,202 --> 00:03:45,782
And so I kind of sat on it for a little bit.
57
00:03:45,782 --> 00:03:47,282
And then I decided, you know what?
58
00:03:47,282 --> 00:03:50,662
Yeah, I'm going to put in my two cents here.
59
00:03:50,742 --> 00:03:55,862
So the gist of it was when you think about case law.
60
00:03:55,862 --> 00:03:57,122
So that's what Descrybe has.
61
00:03:57,122 --> 00:03:59,982
We have all kinds of case law from across the country.
62
00:04:00,002 --> 00:04:02,542
And that is public information.
63
00:04:02,542 --> 00:04:02,982
Right?
64
00:04:02,982 --> 00:04:08,782
I mean, in theory, you should just be able to get whatever you need and get your access
to that easily, and so on and so forth.
65
00:04:08,782 --> 00:04:10,722
But we all know that's not the case.
66
00:04:10,922 --> 00:04:18,182
what we've done as a company is like find ways to get that case law and do things to it,
which makes it more usable.
67
00:04:18,182 --> 00:04:22,922
But this post was really interesting to me because there's a few threads.
68
00:04:22,922 --> 00:04:29,774
One, which I'm super in favor of, is this open-source push, making materials
69
00:04:29,774 --> 00:04:35,254
available for people, although we have followed a slightly different path at Descrybe,
which we can talk about later.
70
00:04:35,254 --> 00:04:41,114
And what should be more available to people than our own laws, obviously?
71
00:04:41,134 --> 00:04:50,954
So this post was interesting because it was sort of presenting, in some way, that access to
this case law on Hugging Face, which is great that they're doing this, that they're posting
72
00:04:50,954 --> 00:04:53,714
this, was somehow new and different.
73
00:04:53,914 --> 00:04:58,122
But anyone who's been around in this space for a while knows that this
74
00:04:58,122 --> 00:04:59,884
is not actually a new effort.
75
00:04:59,884 --> 00:05:04,788
And so I just wanted to jump in there and sort of talk about what's actually happening in
that space.
76
00:05:04,788 --> 00:05:11,324
And then there was a second part, which was the reason I really jumped on it:
the idea that, oh, the case law is there.
77
00:05:11,324 --> 00:05:21,994
So anyone can just throw an AI wrapper on it and make some tools that can compete with all
the established legal research tools. That's the part where I'm like, no, no, no, we
78
00:05:21,994 --> 00:05:24,206
need to talk about this.
79
00:05:24,220 --> 00:05:33,433
Yeah, I mean, there are multi-billion dollar companies that have built themselves on legal
research, TR, Lexis, Bloomberg, others.
80
00:05:33,534 --> 00:05:37,459
And yeah, if it were just that simple, they'd be out of business.
81
00:05:37,459 --> 00:05:38,581
why is it?
82
00:05:38,581 --> 00:05:40,663
Tell us why it's not that simple.
83
00:05:41,314 --> 00:05:50,770
Right. So anyone could get access, you know. First I should just say, as a
disclaimer, I am the marketing and business development side of our company.
84
00:05:50,770 --> 00:05:56,444
We have obviously another side that's the tech side, my co-founder Richard DeBona.
85
00:05:56,444 --> 00:06:00,067
So when I talk about this, I'm talking about it from more like the business perspective.
86
00:06:00,067 --> 00:06:04,390
He has much more, you know, deep thinking about how the tech actually works.
87
00:06:04,390 --> 00:06:08,973
But the point is, just because it's there doesn't mean you can make it useful, right?
88
00:06:08,973 --> 00:06:10,786
You still need to be able to
89
00:06:10,786 --> 00:06:20,281
build tools with AI and with other types of technologies to mine the information that's
useful for the user of that data.
90
00:06:20,281 --> 00:06:21,632
So it's very easy.
91
00:06:21,632 --> 00:06:32,878
As we've seen, people who are going directly into things like ChatGPT or whatnot to do
legal research or legal work are coming across
92
00:06:32,878 --> 00:06:38,198
very significant problems that that data set will have for them, including hallucinations
and things like that.
93
00:06:38,198 --> 00:06:41,258
So it was vastly oversimplifying.
94
00:06:41,258 --> 00:06:42,818
And again, it's just someone's post.
95
00:06:42,818 --> 00:06:52,638
It's not like, you know, they were writing a white paper or something. But to vastly
oversimplify and say that people can grab that data, take some kind of ChatGPT wrapper, throw it
96
00:06:52,638 --> 00:07:02,298
on there, and you're going to create a tool that's helpful for people and furthering sort
of access to the law or access to justice or the democratization of this information was
97
00:07:02,420 --> 00:07:08,797
so misleading and so many people were jumping on that post and saying, wow, this is game
changing.
98
00:07:08,797 --> 00:07:09,909
It's gonna change everything.
99
00:07:09,909 --> 00:07:12,541
It just rubbed me the wrong way.
100
00:07:12,942 --> 00:07:16,754
Because it'll cause more harm than good if people think that's the reality.
101
00:07:16,754 --> 00:07:19,466
Yeah, you know, I see it all the time.
102
00:07:19,466 --> 00:07:21,964
In fact, there's a guy I follow.
103
00:07:21,964 --> 00:07:28,091
I won't say his name, but he has really great content around AI use cases.
104
00:07:28,091 --> 00:07:35,275
And yesterday he posted that he just replaced his lawyer with this prompt.
105
00:07:35,275 --> 00:07:40,998
And you know, it's a great prompt, but it's just not that
simple.
106
00:07:40,998 --> 00:07:44,100
And I am definitely in the camp of.
107
00:07:44,167 --> 00:07:50,520
AI today, the general models can drastically reduce the amount of legal spend.
108
00:07:50,520 --> 00:07:54,042
Um, but it does not eliminate the need for lawyers.
109
00:07:54,042 --> 00:08:05,808
Like I gave you some examples. We have a 409A program here, which
is a way that we incentivize certain people on the team with shadow equity in the
110
00:08:05,808 --> 00:08:06,748
company.
111
00:08:06,949 --> 00:08:13,522
And we started this program before AI existed.
112
00:08:13,522 --> 00:08:14,068
Well,
113
00:08:14,068 --> 00:08:22,187
before November of 2022 when, you know, ChatGPT with GPT-3.5 was released, before it was easily
accessible.
114
00:08:22,187 --> 00:08:29,488
So, just for the hell of it, I decided to upload our docs into ChatGPT.
115
00:08:29,488 --> 00:08:31,068
Actually, I used Grok for that one.
116
00:08:31,068 --> 00:08:37,828
I've been playing around with Grok, and I asked: are there any irregularities or risks
associated with how we've implemented this program?
117
00:08:37,828 --> 00:08:42,528
And then I said, give me a risk ranking.
118
00:08:42,704 --> 00:08:46,948
and a probability that one of these risks could potentially materialize.
119
00:08:46,948 --> 00:08:48,730
And it did a fantastic job.
120
00:08:48,730 --> 00:08:52,353
And then I took that to our lawyers, and then we had a conversation about it.
121
00:08:52,353 --> 00:09:00,059
So it doesn't eliminate the need for lawyers, but it can help today, even though the general model is
not a legal-specific tool.
122
00:09:01,401 --> 00:09:05,605
There are some really valuable use cases, especially for small businesses like ours.
123
00:09:05,605 --> 00:09:09,330
We're like 43 people, and we don't have
124
00:09:09,330 --> 00:09:10,891
big legal budgets.
125
00:09:10,891 --> 00:09:16,163
It's a good way for us to just do sanity checks, especially on lower-risk stuff.
126
00:09:16,163 --> 00:09:18,503
Like I don't even read NDAs anymore.
127
00:09:18,564 --> 00:09:26,157
You know, if they send me an NDA, I have a custom GPT that I've
trained, and I say, hey, what's the delta?
128
00:09:26,157 --> 00:09:27,888
Is this one-sided?
129
00:09:27,888 --> 00:09:30,298
I have a little prompt that I use that takes me there.
130
00:09:30,298 --> 00:09:31,609
They're low risk.
131
00:09:31,609 --> 00:09:37,391
So for high-risk stuff, obviously lawyers still need to be involved in the really
nuanced work.
132
00:09:37,391 --> 00:09:39,465
Um, but
133
00:09:39,465 --> 00:09:49,693
You know, your comment about how you can't just point ChatGPT
at these big, expansive data sources holds for a number of reasons.
134
00:09:49,693 --> 00:10:04,125
I mean, one, the context window issue. Even Gemini, which I think is now at a
million tokens: all sorts of detail gets lost in the middle when you upload a ton
135
00:10:04,125 --> 00:10:08,488
of content and try to do retrieval or
136
00:10:08,698 --> 00:10:12,315
summarize. It misses things; today's tech does.
137
00:10:12,315 --> 00:10:16,663
What are some other challenges with just pointing it at a big data source?
138
00:10:16,663 --> 00:10:21,972
I don't mean from a technical perspective, but you know, just kind of data wise, what are
the issues?
139
00:10:22,156 --> 00:10:28,139
Well, and I think your point about the nuance is really important and also your point
about risk, right?
140
00:10:28,139 --> 00:10:37,034
So if you think about when you're in a situation, like think about it from the consumer of
legal information or legal services point of view, right?
141
00:10:37,034 --> 00:10:38,195
So the client point of view.
142
00:10:38,195 --> 00:10:47,150
So most likely if you're engaging in some kind of legal dispute or legal situation, you
have a pretty serious thing that you're trying to work through, right?
143
00:10:47,150 --> 00:10:51,682
So when you talk about the risk profile of something being wrong,
144
00:10:51,828 --> 00:11:03,351
it's much scarier how it could affect people's lives in a legal sphere or a medical sphere
or something like that versus apartment hunting or planning a trip or things like that.
145
00:11:03,351 --> 00:11:09,343
And now, this is all coming from someone who's very AI-positive, very much pro-AI.
146
00:11:09,343 --> 00:11:11,250
Obviously, I have a show about it.
147
00:11:11,250 --> 00:11:12,164
I have a company about it.
148
00:11:12,164 --> 00:11:19,346
So I'm super, super excited about the potential, the same way you are, about how it can
help humans become better at what they're doing.
149
00:11:19,346 --> 00:11:20,642
But I do think
150
00:11:20,642 --> 00:11:32,893
the biggest risk I see about this idea of just pointing at all this data is having people
who frankly don't have the depth of knowledge in either the legal sphere or the tech
151
00:11:32,893 --> 00:11:35,555
sphere to understand what's coming back.
152
00:11:35,555 --> 00:11:36,115
Is it good?
153
00:11:36,115 --> 00:11:36,486
Is it bad?
154
00:11:36,486 --> 00:11:37,577
I mean, this is really hard.
155
00:11:37,577 --> 00:11:38,788
Benchmarking is really hard.
156
00:11:38,788 --> 00:11:39,959
We can talk about that too.
157
00:11:39,959 --> 00:11:44,076
Because we could end up, as a community,
158
00:11:44,076 --> 00:11:49,099
destroying any possibility we have of these tools being helpful before they even
get out of the gate.
159
00:11:49,099 --> 00:12:00,324
And I'm probably not surprising anybody listening to this that the judiciary, or the
people involved in the courts and things of that nature, aren't
160
00:12:00,324 --> 00:12:03,876
necessarily the most technically advanced people on earth, you know, right?
161
00:12:04,157 --> 00:12:06,238
They just are and it's okay.
162
00:12:06,238 --> 00:12:09,140
And um that's not necessarily their job.
163
00:12:09,140 --> 00:12:14,122
But if you can see, and we saw it with hallucinations, if we create
164
00:12:14,270 --> 00:12:26,319
noise and we create situations where people are causing themselves, like we said, or the
system more harm than good, we could end up getting shut down, you know, regulated to a
165
00:12:26,319 --> 00:12:31,802
point where we're actually hurting the long-term goals of what AI could do.
166
00:12:32,050 --> 00:12:42,870
And now I'm going to sound a little bit soapboxy here, but some of
this comes down to the point that everyone's trying to make their dollar
167
00:12:43,060 --> 00:12:43,612
on these tools.
168
00:12:43,612 --> 00:12:49,596
And of course I am too, so I'm not criticizing, but you know, it's a very different space
when you're in the legal space.
169
00:12:49,596 --> 00:12:50,117
Yeah.
170
00:12:50,117 --> 00:12:52,039
Well, you mentioned benchmarking.
171
00:12:52,039 --> 00:12:55,143
What are your thoughts around that?
172
00:12:55,842 --> 00:12:57,583
Yeah, so benchmarking.
173
00:12:57,583 --> 00:13:00,744
Some of this is how I come at it: I'm not an attorney.
174
00:13:00,744 --> 00:13:03,095
uh I don't have a legal background.
175
00:13:03,095 --> 00:13:07,607
I have, like I said, a marketing, public health, sort of social justice background, right?
176
00:13:07,607 --> 00:13:11,109
um But what do I know about these things, you know?
177
00:13:11,109 --> 00:13:22,828
And so when you think about legal tech as a field, as an ecosystem, you know, we haven't
done a very good job of helping people assess what tools are good.
178
00:13:22,828 --> 00:13:29,324
what tools aren't good, what tools might work for them, which tools have the Good
Housekeeping seal of approval, you know.
179
00:13:29,384 --> 00:13:37,992
And I know there's a lot of discussion going on around this, particularly in the academic
circles and with the legal librarians who are the people who probably will solve this for
180
00:13:37,992 --> 00:13:40,074
us eventually, which is great.
181
00:13:40,535 --> 00:13:45,329
But we're not giving consumers a very easy way to understand what's good and what's not.
182
00:13:45,329 --> 00:13:46,911
And again, I'm part of this.
183
00:13:46,911 --> 00:13:48,482
I'm not, you know,
184
00:13:48,482 --> 00:13:52,964
blaming others and saying we're perfect, but we tend to say, my product is good, trust me.
185
00:13:52,964 --> 00:13:57,066
And I'm patting my own back, for the people who are just listening and not watching.
186
00:13:57,066 --> 00:14:06,531
And that's not going to be enough for something like this, because again, the
consequences are too dire for people getting incorrect information.
187
00:14:06,531 --> 00:14:12,334
That's where the whole human-in-the-loop thing becomes very, very,
very important.
188
00:14:12,348 --> 00:14:15,382
Yeah, and you know, I have seen some initiatives.
189
00:14:15,382 --> 00:14:20,353
It's been a while, probably, gosh, maybe close to a year ago.
190
00:14:20,353 --> 00:14:24,448
I heard about a consortium of
191
00:14:24,624 --> 00:14:27,145
sorts that had kind of gotten together.
192
00:14:27,145 --> 00:14:31,547
Harvey was on that list as participants, which I thought was interesting.
193
00:14:31,547 --> 00:14:38,810
I'm not sure how much of a role vendors should play, maybe an advisory role, but, you
know, it's kind of like the fox guarding the henhouse.
194
00:14:38,810 --> 00:14:43,412
If you're creating your own test, you're grading your own test.
195
00:14:43,412 --> 00:14:54,546
We've seen gaming. I mean, Meta got slapped around pretty hard not long ago with
Llama 4, and the
196
00:14:54,546 --> 00:15:06,030
model that was released had material differences in benchmark scores from what they
presented during those benchmark tests.
197
00:15:06,030 --> 00:15:14,800
So yeah, I think vendors should play a role, but it feels like, I don't know, somebody else
should be leading that effort, with us supporting them.
198
00:15:15,327 --> 00:15:15,717
Agree.
199
00:15:15,717 --> 00:15:17,088
And it needs to be independent.
200
00:15:17,088 --> 00:15:22,211
And this is why I'm really in favor of it coming out of educational institutions.
201
00:15:22,211 --> 00:15:25,012
uh I worked in higher education for a long time.
202
00:15:25,012 --> 00:15:31,096
So I know that that's probably the best that we have for an independent lens.
203
00:15:31,096 --> 00:15:36,138
Because these marketing firms for legal tech are great.
204
00:15:36,139 --> 00:15:42,094
And, of course, they're going to do a thorough, you know, sort of
205
00:15:42,094 --> 00:15:52,334
look at these tools, but if it's a client, or if it's a tool that is using their
services, it's still hard for people to feel that it's completely independent.
206
00:15:52,334 --> 00:15:57,354
And I don't think in the current environment, the government is going to be doing anything
about this.
207
00:15:57,374 --> 00:16:01,594
We couldn't really ask OpenAI or the other models to test it either.
208
00:16:01,594 --> 00:16:02,794
So it's a tough one.
209
00:16:02,794 --> 00:16:03,674
I don't know.
210
00:16:03,674 --> 00:16:09,834
And it's not regulated the way medical information or things like that is in some ways.
211
00:16:10,794 --> 00:16:22,622
I would say, if there's anyone out there who wants to do this, I think you have a big,
wide-open field, and I think you should make the legal tech companies pay you for it and
212
00:16:22,622 --> 00:16:23,388
support it.
213
00:16:23,388 --> 00:16:24,989
Yeah, I agree.
214
00:16:24,989 --> 00:16:28,991
Now back to the kind of legal data movement.
215
00:16:28,991 --> 00:16:43,329
When you and I talked last, you shared some interesting information on
just kind of the history of the legal data liberation efforts, you know, the Harvard
216
00:16:43,329 --> 00:16:46,561
caselaw project and the Free Law Project.
217
00:16:46,561 --> 00:16:53,256
Give our listeners a little bit of perspective on where we've been and kind of
218
00:16:53,256 --> 00:16:54,607
where we are now.
219
00:16:55,438 --> 00:16:56,438
Right.
220
00:16:56,458 --> 00:17:05,098
So, I'm sure there are things I don't know about; I'll tell you the things I know,
and then folks can chime in in the comments about things they know too.
221
00:17:05,178 --> 00:17:10,978
So the first place that we came across is we were looking for case law ourselves a couple
of years ago.
222
00:17:10,978 --> 00:17:13,738
The first place we found, you know, good access.
223
00:17:13,738 --> 00:17:22,478
And then there's also the idea of access for users who want to do research, versus access for people who
224
00:17:22,478 --> 00:17:25,658
want to use that information to create a product, which are not the same.
225
00:17:26,638 --> 00:17:34,938
So the Caselaw Access Project, which was out of the Harvard Law School library, was a
very interesting project that we came across.
226
00:17:34,938 --> 00:17:36,958
And they had all the case law.
227
00:17:36,958 --> 00:17:38,498
They have a wonderful story.
228
00:17:38,498 --> 00:17:42,118
Adam Ziegler could come on and talk about it if you want, I'm sure.
229
00:17:42,118 --> 00:17:43,658
Although you probably have to ask him.
230
00:17:43,658 --> 00:17:44,918
I'm not his rep.
231
00:17:45,038 --> 00:17:49,438
They went into the library and quite literally scanned all their books.
232
00:17:49,438 --> 00:17:55,466
And there's just a fascinating story about how they had to break the bindings and the
librarians were like, ah.
233
00:17:55,466 --> 00:17:56,306
some of these old books.
234
00:17:56,306 --> 00:18:06,651
Anyway, so they scanned everything they had and they collaborated with a legal tech
company whose name I'm forgetting right now, but um they uh worked with them.
235
00:18:06,651 --> 00:18:09,793
And then I think that company was sold to one of the big ones.
236
00:18:09,793 --> 00:18:13,934
I think it was Lexis, but you may want to fact-check me on that.
237
00:18:13,934 --> 00:18:21,318
So their data, we noticed, stopped around 2017 because that's when the project ended.
238
00:18:21,318 --> 00:18:21,954
So
239
00:18:21,954 --> 00:18:29,130
we had to look for another resource, but that was kind of like the original free the law
group from what I've been able to see.
240
00:18:29,130 --> 00:18:30,701
And they're still doing a lot of really cool stuff.
241
00:18:30,701 --> 00:18:33,223
I think they're ramping up again.
242
00:18:33,684 --> 00:18:44,613
So then we came across a uh group out in Oakland, California that is powering a ton of
innovation in this ecosystem and should be getting a lot of credit.
243
00:18:44,613 --> 00:18:51,052
And that was one of the other reasons I jumped on that post, because I'm like, hey, Free
Law has been doing this for years.
244
00:18:51,052 --> 00:19:01,908
So the Free Law Project is, I think they've been around about 10 years, and their sole
purpose is to make sure that um things like case law and other things are actually
245
00:19:01,908 --> 00:19:03,919
accessible by regular people.
246
00:19:03,919 --> 00:19:10,463
And then they also have, you can create a commercial agreement with them where you ingest
their data, which is how we get our data.
247
00:19:10,463 --> 00:19:11,304
So it's great.
248
00:19:11,304 --> 00:19:13,015
And that's a commercial license.
249
00:19:13,015 --> 00:19:14,235
So they're doing that.
250
00:19:14,235 --> 00:19:19,018
um And then a very interesting space that's new and
251
00:19:19,520 --> 00:19:26,913
I think there's a lot of cool things happening here, including at Descrybe, has to do with
sort of the other parts of the law.
252
00:19:26,913 --> 00:19:29,274
Case law is one part of the law.
253
00:19:29,394 --> 00:19:34,766
But a lot of people think of statutes, regulations, and all that other stuff as the real
law.
254
00:19:34,766 --> 00:19:39,282
So happy to talk about that, because that's a big missing piece of the puzzle right now.
255
00:19:39,282 --> 00:19:42,668
Yeah, well, let's uh share your thoughts on that.
256
00:19:43,374 --> 00:19:54,914
Yeah, so again, coming from a non-lawyer, so forgive me if I'm not explaining this the way
a professor would in law school, but certainly case law is part of what lawyers, attorneys
257
00:19:54,914 --> 00:19:58,534
have to look at as they're building a case or writing a brief or doing things like that.
258
00:19:58,534 --> 00:20:05,254
But of course, much of the law is done with just the law of the land, which is statutes,
regulations, things of that nature.
259
00:20:05,254 --> 00:20:08,914
And it's really, really hard to get.
260
00:20:09,014 --> 00:20:13,100
It's almost impossible to get, and that's weird.
261
00:20:13,100 --> 00:20:15,802
right, when you think about it, because it's all public information.
262
00:20:15,802 --> 00:20:24,258
And this also, I think, links back to the fact that the judiciary in many states, state governments, and
things like that don't necessarily have very sophisticated technology.
263
00:20:24,258 --> 00:20:27,320
They can't, you know, it's not something they've invested in.
264
00:20:27,320 --> 00:20:31,103
There are sort of workarounds that other companies have provided to them to make it easier
for them.
265
00:20:31,103 --> 00:20:33,044
And then that data is sort of locked away.
266
00:20:33,044 --> 00:20:40,649
So uh some companies or some places have started to look at pulling that data directly.
267
00:20:40,649 --> 00:20:42,110
And that is
268
00:20:42,110 --> 00:20:42,600
what we're doing.
269
00:20:42,600 --> 00:20:49,672
So we had looked in because we really need to add those resources to our database for it
to be as broad as possible.
270
00:20:49,672 --> 00:20:55,374
And we started to look at where we could get that data without having to go get it
ourselves.
271
00:20:55,374 --> 00:21:05,167
And I think we were quoted by one group who shall remain anonymous, $100,000 just for some
small piece of this data.
272
00:21:05,167 --> 00:21:06,717
And so we thought this is absurd.
273
00:21:06,717 --> 00:21:09,476
You know, so
274
00:21:09,656 --> 00:21:10,726
We're getting it ourselves.
275
00:21:10,726 --> 00:21:18,528
So we're going to have every single piece of, let me tell you the list because I want to make
sure I get this right.
276
00:21:18,528 --> 00:21:25,590
Statutes, regulations, state constitutions, court rules, session laws, and attorney
general opinions from all 50 states.
277
00:21:25,669 --> 00:21:28,051
And it will be part of our database.
278
00:21:28,051 --> 00:21:30,182
We will be demoing this at ILTACON.
279
00:21:30,182 --> 00:21:31,532
So come by and see it.
280
00:21:31,532 --> 00:21:37,294
But we think we'll have the most thorough corpus of anyone, especially at the prices we
charge.
281
00:21:37,294 --> 00:21:39,233
And everything will be fully searchable.
282
00:21:39,233 --> 00:21:40,283
by natural language.
283
00:21:40,584 --> 00:21:41,034
Wow.
284
00:21:41,034 --> 00:21:42,675
Yeah, that's big news.
285
00:21:42,675 --> 00:21:44,568
Yeah, that's big news.
286
00:21:44,568 --> 00:21:54,198
We are recording on Monday the 4th, but this episode will be released on Wednesday the
6th, which is going to be in time for ILTACON, so folks can come by.
287
00:21:54,198 --> 00:21:55,719
Are you guys exhibiting?
288
00:21:56,155 --> 00:21:58,476
Yeah, we're in the startup hub, which is cool.
289
00:21:58,476 --> 00:22:00,597
We haven't been to ILTACON before.
290
00:22:00,597 --> 00:22:05,460
The only conference we've done before is Tech Show, which was so much fun.
291
00:22:05,460 --> 00:22:06,300
my gosh.
292
00:22:06,300 --> 00:22:09,412
We love that one with Bob Ambrogi and the Startup Alley and stuff.
293
00:22:09,412 --> 00:22:11,833
So anyone out there who's got a startup, apply for that.
294
00:22:11,833 --> 00:22:13,964
It's so much fun if you can get it.
295
00:22:14,204 --> 00:22:17,236
So yeah, so we're in booth 123 in the Startup Alley.
296
00:22:17,236 --> 00:22:18,532
So come see us.
297
00:22:18,532 --> 00:22:19,002
Awesome.
298
00:22:19,002 --> 00:22:30,259
Yeah. When we were first finding our way in legal, back in our predecessor company, we
were called Acrowire, and I'm a former Microsoft guy and just kind of started consulting
299
00:22:30,259 --> 00:22:32,900
on the side as a lot of tech people do.
300
00:22:32,900 --> 00:22:36,212
And then the side gig started making more than the day job.
301
00:22:36,212 --> 00:22:39,284
So I quit and uh terrible timing.
302
00:22:39,284 --> 00:22:47,108
This was like right after the financial crisis, but I had enough client work to keep me
busy and my bills paid.
303
00:22:47,260 --> 00:23:00,065
So I jumped in and we fell down the legal rabbit hole early, and we didn't really know
where our skills mapped on kind of the market spectrum in terms of size.
304
00:23:00,065 --> 00:23:13,411
Our first show was like ABA Tech Show, and it's great for kind of solo and small
firms, but for big firms, really ILTA, Legal Week, those are better fits
305
00:23:13,411 --> 00:23:15,852
for our audience, which is large law.
306
00:23:15,852 --> 00:23:17,270
Um,
307
00:23:17,270 --> 00:23:21,922
our first experience in legal tech was just a few months after launching.
308
00:23:21,922 --> 00:23:28,715
We were uh a finalist for one of the legal week awards.
309
00:23:28,955 --> 00:23:30,836
Wow, that was amazing.
310
00:23:30,836 --> 00:23:33,087
Cause we had never been to anything like that.
311
00:23:33,087 --> 00:23:41,360
And then to walk into that venue down there at the Hilton, the Hilton in New York, I
was like, holy smokes.
312
00:23:41,360 --> 00:23:43,932
Like this is, this is big time.
313
00:23:43,932 --> 00:23:45,117
It was pretty cool.
314
00:23:45,117 --> 00:23:45,709
Yeah.
315
00:23:45,709 --> 00:23:47,663
that was back when Stephanie Wilkins was running it.
316
00:23:47,663 --> 00:23:48,253
It was awesome.
317
00:23:48,253 --> 00:23:49,193
Yeah.
318
00:23:49,193 --> 00:23:49,654
Yeah.
319
00:23:49,654 --> 00:23:52,094
I mean, Legal Week is an interesting one for us.
320
00:23:52,094 --> 00:24:03,359
We attend every year, but, you know, it historically has been e-discovery focused, and
I know they've tried to move beyond that in recent years, but we go, we
321
00:24:03,359 --> 00:24:04,699
don't, we don't exhibit.
322
00:24:04,699 --> 00:24:09,741
Um, the exhibit hall is very disjointed.
323
00:24:09,741 --> 00:24:17,584
Um, like, you know, there's all sorts of little nooks and crannies, um, but we
always make an appearance at Legal Week.
324
00:24:18,385 --> 00:24:19,204
for
325
00:24:19,367 --> 00:24:20,097
it's a good one.
326
00:24:20,097 --> 00:24:30,606
And then the one that I really liked a lot was very different, completely at the other end of
the scale, which is the LSC ITC, which is the Legal Services Corporation's Innovations in Technology Conference, right?
327
00:24:30,606 --> 00:24:34,089
So that was all about access to justice stuff.
328
00:24:34,089 --> 00:24:35,220
So that was really cool.
329
00:24:35,220 --> 00:24:40,154
It was kind of, Bob Ambrogi wrote a really interesting piece last year about it, about the
difference.
330
00:24:40,154 --> 00:24:48,340
Like he went from one to the other and he was like, wow, that just shows the gulf in
resources between the two ends of the...
331
00:24:48,366 --> 00:24:49,296
pool, so to speak.
332
00:24:49,296 --> 00:24:50,097
Interesting.
333
00:24:50,097 --> 00:24:51,958
Well, I'm curious.
334
00:24:51,958 --> 00:24:54,619
You had mentioned the group out of Oakland.
335
00:24:54,619 --> 00:24:57,300
Are they?
336
00:24:57,300 --> 00:24:57,551
Yeah.
337
00:24:57,551 --> 00:24:57,851
Yeah.
338
00:24:57,851 --> 00:25:01,703
So is that an academic institution?
339
00:25:01,703 --> 00:25:03,444
Like who's behind that?
340
00:25:03,798 --> 00:25:05,679
Yeah, so they're nonprofit.
341
00:25:05,679 --> 00:25:09,822
They're run by a guy named Michael Lissner, who's great.
342
00:25:09,822 --> 00:25:11,262
So hi, Michael, if you're listening.
343
00:25:11,262 --> 00:25:15,785
um And they are, um like I say, they're a nonprofit.
344
00:25:15,785 --> 00:25:18,577
they are very mission driven.
345
00:25:18,577 --> 00:25:24,851
I'm sure that they've got all kinds of academics involved with them and sort of on their
board and things like that.
346
00:25:24,851 --> 00:25:29,593
But they're a straight up nonprofit, which means that they are...
347
00:25:29,593 --> 00:25:31,174
ah
348
00:25:31,478 --> 00:25:35,183
sort of at the source powering a lot of the legal tech innovation.
349
00:25:35,183 --> 00:25:39,729
And of course, they're not going to say publicly who their clients are unless they're like
us.
350
00:25:39,729 --> 00:25:42,142
And we're always willing to say we're their client.
351
00:25:42,142 --> 00:25:50,479
If you scratch the surface on a lot of these startups, you would see that's where
their data is coming from, for sure.
352
00:25:50,479 --> 00:26:04,724
With these types of organizations out there working hard to kind of open source uh legal
information, does that open the door to challenge, you know, what's largely a duopoly,
353
00:26:04,724 --> 00:26:10,859
it feels like, with Westlaw and Lexis and their solutions? Does that open the door for that?
354
00:26:12,798 --> 00:26:13,279
Yes.
355
00:26:13,279 --> 00:26:13,629
Yes.
356
00:26:13,629 --> 00:26:16,060
So yes and no, depending on how you look at it.
357
00:26:16,060 --> 00:26:18,842
So broadly speaking, absolutely.
358
00:26:18,842 --> 00:26:20,943
Like for sure.
359
00:26:20,943 --> 00:26:29,568
Like if anything is going to challenge those duopolies and those extraordinarily powerful
companies, it's AI.
360
00:26:29,568 --> 00:26:29,988
Right?
361
00:26:29,988 --> 00:26:38,182
I mean, the moment is here, and they know that, and that's why they're building
so fast and acquiring so quickly and, you know, all this stuff.
362
00:26:38,182 --> 00:26:39,573
there's no question.
363
00:26:39,573 --> 00:26:39,873
Right.
364
00:26:39,873 --> 00:26:41,742
And we've seen some really interesting
365
00:26:41,742 --> 00:26:46,822
things happening, like with vLex and Clio and, you know, Harvey and Lexis.
366
00:26:46,822 --> 00:26:50,562
So everyone's paying attention to this, as you certainly know, right, Ted?
367
00:26:50,562 --> 00:26:53,542
Because you do, you have a podcast, you talk about this a lot.
368
00:26:54,442 --> 00:26:57,662
But where it also opens the door.
369
00:26:57,682 --> 00:26:59,342
So that's like one end of the market, right?
370
00:26:59,342 --> 00:27:08,322
So, a lot of people, if not almost all people, are aiming for the exact same part of the
market, right?
371
00:27:08,322 --> 00:27:10,782
So the well-served...
372
00:27:11,180 --> 00:27:13,072
the places and this is fine.
373
00:27:13,072 --> 00:27:14,122
They need their tools.
374
00:27:14,122 --> 00:27:14,563
This is great.
375
00:27:14,563 --> 00:27:16,274
I have nothing, no digs or anything.
376
00:27:16,274 --> 00:27:21,709
But if everybody's always aiming for the same part of the market, there's not going to be
a lot of innovation, right?
377
00:27:21,709 --> 00:27:25,442
That's not where things are going to get exciting, at least from my point of view.
378
00:27:25,442 --> 00:27:32,968
My point of view, and the way we focus at Descrybe, is: how about that gigantic, underserved
market, right?
379
00:27:32,968 --> 00:27:37,590
And I think it was, I don't want to misquote him, but I'm pretty sure it was Jack Newton.
380
00:27:37,590 --> 00:27:40,984
It was Jack Newton from Clio who said, um
381
00:27:40,984 --> 00:27:47,218
that there was maybe, I think it was like a trillion dollars of unmet need in the legal
services industry.
382
00:27:47,218 --> 00:27:58,166
Now we better fact check that or, you know, I might get a letter from Uncle Leo saying
that was wrong, but regardless, there's an enormous amount of unmet need out there for
383
00:27:58,166 --> 00:28:00,148
legal services, legal information.
384
00:28:00,148 --> 00:28:06,512
So when we were first starting out, uh we thought to ourselves, we could be a nonprofit,
right?
385
00:28:06,512 --> 00:28:09,806
We could probably be a very successful nonprofit.
386
00:28:09,806 --> 00:28:13,386
if we wanted to be, we could get grants, we could do all sorts of things.
387
00:28:13,386 --> 00:28:16,266
But we always said, nope, we're a business.
388
00:28:16,906 --> 00:28:17,626
We're a business.
389
00:28:17,626 --> 00:28:29,346
And while we have a mission-driven focus, we are trying to create a very successful and
sustainable and profitable business because AI allows us to offer very high quality tools
390
00:28:29,346 --> 00:28:35,526
that you could not even have imagined just a few years ago could possibly exist without
human intervention.
391
00:28:37,870 --> 00:28:45,087
And put it out there at a price that is so absurdly low compared to the competitors that we're
just gonna gobble up market share.
392
00:28:45,087 --> 00:28:46,748
that's the goal.
393
00:28:46,748 --> 00:28:54,666
So while we do present ourselves as mission driven, and we are, we have a very serious
market-driven approach to what we're doing.
394
00:28:54,666 --> 00:29:00,231
um And it's really starting to come to fruition now that we have a paid tool.
395
00:29:00,231 --> 00:29:03,946
So I think there's huge opportunity, but I think...
396
00:29:03,946 --> 00:29:15,923
I would caution other people who are looking at this space: don't automatically assume you
have to compete, you know, with Harvey, with Lexis directly, with vLex, you know, that's
397
00:29:15,923 --> 00:29:16,954
not easy, right?
398
00:29:16,954 --> 00:29:18,715
And those are very well-funded firms.
399
00:29:18,715 --> 00:29:24,148
So look at other parts of the market where other people aren't looking and see what
you can do there.
400
00:29:24,424 --> 00:29:34,090
Yeah, you know, candidly, I've talked about this publicly before and, I don't know, it
may ruffle some feathers... what, what?
401
00:29:34,090 --> 00:29:34,380
Yeah.
402
00:29:34,380 --> 00:29:36,001
I mean, I'm not for everybody.
403
00:29:36,001 --> 00:29:38,252
I know that my opinions don't resonate with everybody.
404
00:29:38,792 --> 00:29:39,672
Yeah.
405
00:29:40,133 --> 00:29:40,563
Yeah.
406
00:29:40,563 --> 00:29:41,244
Feather ruffle it.
407
00:29:41,244 --> 00:29:41,444
Yeah.
408
00:29:41,444 --> 00:29:52,275
I've ruffled a few feathers, um, in my day and, um, not intentionally, but I feel like
just being candid and if, you know, if people get offended, then they do, but
409
00:29:52,275 --> 00:29:56,788
You know, we have really struggled on the smaller end of the market within the
law firm world.
410
00:29:56,788 --> 00:30:01,812
Um, a lot of people call mid-law like 50 to 150 lawyers.
411
00:30:01,812 --> 00:30:04,573
That really feels like small law for us.
412
00:30:04,573 --> 00:30:08,096
So, um, our market is really a hundred attorneys and up.
413
00:30:08,096 --> 00:30:13,129
And there are about 400 of those law firms in North America, which is our focus.
414
00:30:13,129 --> 00:30:20,530
We're doing some business in the UK now too, but by my estimates and there's no good
numbers, there's only about 600 law firms on planet earth.
415
00:30:20,530 --> 00:30:22,381
who have more than a hundred attorneys.
416
00:30:22,381 --> 00:30:35,304
There are tens of thousands, if not hundreds of thousands, of solo and smaller firms, so
it's a very kind of, um, I guess steeply sloping pyramid.
417
00:30:35,304 --> 00:30:38,625
Um, yeah, that's a good way to describe it.
418
00:30:38,625 --> 00:30:41,465
Um, but it's really hard to do business there.
419
00:30:41,566 --> 00:30:50,578
And the reason it's hard to do business there as a tech company is because law firms
historically have not valued, not seen tech as strategic.
420
00:30:50,828 --> 00:30:56,110
Um, and, I mean, that's just the fact, like people can say what they want.
421
00:30:56,110 --> 00:30:58,351
I've been in this space 20 years.
422
00:30:58,351 --> 00:31:04,874
I know it to be true that that has been the historical perspective in the law firm world.
That is changing.
423
00:31:04,874 --> 00:31:07,945
And there are, and there are outliers where that is not true.
424
00:31:07,945 --> 00:31:17,649
And I, and I get that, but in the smaller, in the smaller end of the market, the things
that we've struggled with are, um, a lack of sophistication from a technical perspective,
425
00:31:17,649 --> 00:31:20,500
you know, they could be rock stars on the
426
00:31:20,500 --> 00:31:32,643
practice side, but a lot of times in that hundred-attorney-and-under world, you'll have an
IT director who, um, was their former network administrator and did a good job.
427
00:31:32,643 --> 00:31:35,064
So they got promoted and now they're, you know what I mean?
428
00:31:35,064 --> 00:31:39,685
And there's nothing wrong with that, but it's, it's...
429
00:31:40,266 --> 00:31:40,776
Right.
430
00:31:40,776 --> 00:31:41,986
Yeah, exactly.
431
00:31:41,986 --> 00:31:48,711
And so there's just been a lack of technical sophistication, and you know, there's also a
lot of... I've got
432
00:31:48,711 --> 00:31:50,592
a really good friend of mine here in St.
433
00:31:50,592 --> 00:31:54,855
Louis, he owns a 50 attorney law firm and we play golf together.
434
00:31:54,855 --> 00:32:06,493
You know, he asked me for somebody to go speak at his retreat and I teed a couple
of people up, and he was complaining about the price.
435
00:32:06,493 --> 00:32:12,197
I'm like, dude, you are lucky that this person is even giving you their attention, right?
436
00:32:12,197 --> 00:32:16,460
They normally speak in front of hundreds or thousands of people.
437
00:32:16,476 --> 00:32:27,089
Like that's, that's a bargain. But you know, in all small businesses, including us,
every dollar matters; you really have to manage your spend.
438
00:32:27,089 --> 00:32:36,870
But when you have a limited budget combined with an audience who doesn't see tech as
strategic, it's a really difficult market to sell into as a tech company.
439
00:32:37,218 --> 00:32:42,502
Completely, but those people are also in very serious peril.
440
00:32:42,923 --> 00:32:43,323
Right?
441
00:32:43,323 --> 00:32:45,665
Like, and we're talking like a long arc here.
442
00:32:45,665 --> 00:32:56,694
So you're right for like individual companies, like can, can potential customers adapt
fast enough to keep us sustainable and keep us going like for sure, like on an individual
443
00:32:56,735 --> 00:33:03,340
or even like a market, like a current market level, like you could argue that the change
is going to be too slow and all that.
444
00:33:03,340 --> 00:33:05,942
But if you look at like a broader arc,
445
00:33:06,286 --> 00:33:08,466
this is not going to stay the same.
446
00:33:08,466 --> 00:33:13,766
Like think about sort of how we get our medical care now, right?
447
00:33:13,766 --> 00:33:15,846
And how much that has changed.
448
00:33:15,846 --> 00:33:32,786
And if you had a doctor's office, say, who refused to do e-charts or refused to meet with
you online or refused to take your prescriptions through the portal versus calling.
449
00:33:32,786 --> 00:33:35,724
And medical is another weird one.
450
00:33:35,724 --> 00:33:41,108
but like, people couldn't survive, right?
451
00:33:41,108 --> 00:33:42,079
You couldn't survive.
452
00:33:42,079 --> 00:33:52,388
And so law probably has the wrong impression; there are some people who think
that they don't have to adapt in the same way.
453
00:33:52,388 --> 00:33:53,609
And there's a lot of resistance.
454
00:33:53,609 --> 00:33:58,563
And, for sure, one of the dumb jokes we tell, and we have many, is that when we're like,
what should we build?
455
00:33:58,563 --> 00:34:05,198
Let's find the uh part of the economy where people are most, you know,
456
00:34:05,198 --> 00:34:08,518
open to risk and they love technology and they just are ready to share.
457
00:34:08,518 --> 00:34:10,178
We're like, wow, perfect.
458
00:34:10,178 --> 00:34:12,298
You know, talk about... what could we have picked that was harder?
459
00:34:12,298 --> 00:34:13,918
I can't even imagine.
460
00:34:13,978 --> 00:34:19,658
But I do think we're at a point where no matter what it has to change.
461
00:34:19,658 --> 00:34:29,658
You can't have the triangulation we have right now with the advent of AI and just how
fundamentally that's going to change everything we do, soup to nuts.
462
00:34:29,658 --> 00:34:30,018
Right.
463
00:34:30,018 --> 00:34:35,318
And then you have a vastly underserved part of the market.
464
00:34:35,648 --> 00:34:42,594
and then these extraordinarily overpriced legacy providers, like that's gonna implode, right?
465
00:34:42,594 --> 00:34:55,155
And so the smart attorneys out there, and they're all smart, no offense, the tech forward
and the forward thinking attorneys are gonna have to figure out how to adapt to this or
466
00:34:55,155 --> 00:35:00,039
they're gonna see their share of the market, you know, start to crater.
467
00:35:00,039 --> 00:35:03,372
And the new students who are going through law school and learning this stuff,
468
00:35:03,372 --> 00:35:05,267
are gonna come out with different expectations too.
469
00:35:05,267 --> 00:35:09,234
So there'll be a fight for talent as well.
470
00:35:09,234 --> 00:35:10,315
Yeah, no, for sure.
471
00:35:10,315 --> 00:35:16,989
Yeah, that kind of legacy mindset is incompatible with where we're going.
472
00:35:16,989 --> 00:35:25,075
It's just, I know so many firms that are going to struggle to adapt, and not necessarily
small firms.
473
00:35:25,075 --> 00:35:30,838
I know of an AmLaw firm where I'm really good friends with the chief knowledge officer
there.
474
00:35:31,524 --> 00:35:32,940
And, um
475
00:35:33,402 --> 00:35:39,658
I was working on a project with her for a presentation and she was completely dark in
January.
476
00:35:39,658 --> 00:35:41,066
And I was like, where have you been?
477
00:35:41,066 --> 00:35:42,851
She's like, I've been working on a DMS upgrade.
478
00:35:42,851 --> 00:35:44,103
I'm like, DMS?
479
00:35:44,103 --> 00:35:44,733
You've been working?
480
00:35:44,733 --> 00:35:46,035
I was like, that's IT.
481
00:35:46,035 --> 00:35:46,895
What's the chief knowledge?
482
00:35:46,895 --> 00:35:48,387
She's like, I am IT.
483
00:35:48,387 --> 00:35:49,407
And I am KM.
484
00:35:49,407 --> 00:35:56,875
And it's like, OK, your firm has set you up for failure if you're going to be promoting
knowledge and innovation.
485
00:35:57,176 --> 00:35:57,474
Yeah.
486
00:35:57,474 --> 00:35:59,266
Strategic error on their part.
487
00:35:59,266 --> 00:36:05,501
And you know, maybe this dovetails back to the conversation we were having about
benchmarking too, right?
488
00:36:05,501 --> 00:36:13,102
Because the risk for firms, for, you know, uh any size firm, is huge if they pick
the wrong tool.
489
00:36:13,237 --> 00:36:23,305
It's huge if they rely on something that's spotty, you know. And that's why the
momentum is hard to change with the big trusted companies, because I get it.
490
00:36:23,305 --> 00:36:24,470
I mean, if I'm...
491
00:36:24,470 --> 00:36:29,214
you know, working at some firm and I'm in charge of their IT and making sure they have the
right tools.
492
00:36:29,214 --> 00:36:32,617
I'm not really interested in some, you know, brand new startup.
493
00:36:32,617 --> 00:36:34,769
Like that's, that's scary, right?
494
00:36:34,769 --> 00:36:36,400
That's, that's super scary.
495
00:36:36,400 --> 00:36:41,665
So again, it comes back to: where can the industry provide a little more
reassurance?
496
00:36:41,665 --> 00:36:43,516
I think it's useful for all of us.
497
00:36:43,516 --> 00:36:46,863
Yeah, well you've had some validation.
498
00:36:46,863 --> 00:36:52,694
Talk a little bit about Casetext and the law school curricula and what you guys are doing
there.
499
00:36:52,909 --> 00:36:54,610
Yeah, so this is really fun.
500
00:36:54,610 --> 00:36:58,430
This actually happened at Tech Show.
501
00:36:58,970 --> 00:37:00,370
It's always good to...
502
00:37:00,370 --> 00:37:04,550
It's always good to show up at these things when you're new, because you never
know who you're going to meet.
503
00:37:05,190 --> 00:37:08,530
So, as folks who have been listening know, we
504
00:37:08,530 --> 00:37:09,890
have case law research.
505
00:37:09,890 --> 00:37:16,530
We have free access and we have a paid tool, $10 a month and $20 a month, depending
on the level.
506
00:37:16,530 --> 00:37:18,810
So paid, but almost free.
507
00:37:18,810 --> 00:37:19,510
Right.
508
00:37:19,510 --> 00:37:20,410
So
509
00:37:21,142 --> 00:37:29,889
The uh legal tech curriculum is from, hang on, I'm just gonna get the exact right
terminology here so I'm not saying the wrong thing.
510
00:37:29,889 --> 00:37:43,008
So we will be rolling out to uh over 350 law schools uh internationally this fall as
part of a legal curriculum that is from the National Society for Legal Technology and will
511
00:37:43,008 --> 00:37:45,379
be part of their legal research curriculum.
512
00:37:45,379 --> 00:37:49,442
And what's really fun about this is that we are replacing Casetext
513
00:37:49,502 --> 00:38:00,140
in their um curriculum, because Casetext, now that they've been acquired for $650
million, good for them, um they don't have free access any longer.
514
00:38:00,140 --> 00:38:02,192
I think that was turned off in March.
515
00:38:02,192 --> 00:38:06,405
So we will be replacing uh our friends at Casetext in the curriculum.
516
00:38:06,405 --> 00:38:14,171
And so we will be included with Lexis Plus, Westlaw, Bloomberg Law, Fastcase, and
HeinOnline, and us, Descrybe.
517
00:38:14,171 --> 00:38:16,192
So it does show that
518
00:38:16,415 --> 00:38:18,846
I guess, uh validation of the tool.
519
00:38:18,846 --> 00:38:31,164
um Doug Lusk, the CEO of uh the NSLT, said, you know, we were his favorite and he
was very impressed with what we were building.
520
00:38:31,164 --> 00:38:34,536
So that was some big validation for us.
521
00:38:34,536 --> 00:38:45,932
um Another nice piece of validation we had recently was Bob Ambrogi covered our AI citator
release and said we were now poised to meaningfully
522
00:38:45,998 --> 00:38:50,741
uh be thought of as a competitor for the big legal research tools.
523
00:38:50,741 --> 00:38:52,112
So these are pretty exciting things.
524
00:38:52,112 --> 00:38:56,966
And again, good luck having done this without AI.
525
00:38:56,966 --> 00:38:59,608
It quite literally would not have been possible.
526
00:38:59,608 --> 00:39:09,554
So we're kind of the poster child for how nimble and lean and sort of creative AI
companies can be, which is pretty cool.
527
00:39:10,595 --> 00:39:11,536
Thank you.
528
00:39:11,536 --> 00:39:12,957
We're very excited.
529
00:39:14,098 --> 00:39:14,750
Yeah.
530
00:39:14,750 --> 00:39:17,180
So an interesting thing happened this week.
531
00:39:17,180 --> 00:39:19,671
So we are a TLTF portfolio company.
532
00:39:19,671 --> 00:39:28,753
um one of the directors over there tagged uh me and a few others in a post, Tom Baldwin
from Ennegrata.
533
00:39:28,954 --> 00:39:37,296
And uh a VC had written a future of law um kind of
534
00:39:38,037 --> 00:39:40,157
manifesto, and it's so good.
535
00:39:40,157 --> 00:39:41,617
Like I, I agree.
536
00:39:41,617 --> 00:39:42,977
They're called Catalyst.
537
00:39:42,977 --> 00:39:45,637
I'd really, I'd never heard of them prior to that.
538
00:39:45,637 --> 00:39:53,457
Maybe we can create a link in the show notes, but they mentioned InfoDash as, like, a
company they're excited about.
539
00:39:53,477 --> 00:40:00,893
Um, and, you know, I get like 15, I'm not exaggerating, 15 to 20 emails from VCs a week.
540
00:40:00,893 --> 00:40:04,694
LinkedIn messages, they call, I don't know how they have my cell phone number, but they
do.
541
00:40:04,694 --> 00:40:10,455
And it's, I'm flattered, but I could fill up my calendar with just talking to them and we
don't need money.
542
00:40:10,455 --> 00:40:20,958
Like we're, um, we're bootstrapped, and we took a little bit of funding from TLTF,
not really for the funding, but just because working with them is amazing.
543
00:40:20,958 --> 00:40:23,208
Um, they are.
544
00:40:23,208 --> 00:40:24,399
Yeah.
545
00:40:24,399 --> 00:40:25,759
They, they can open doors.
546
00:40:25,759 --> 00:40:26,859
They know the market.
547
00:40:26,859 --> 00:40:29,122
It's, it's a, it's a really good relationship.
548
00:40:29,122 --> 00:40:33,589
Plus you get to go to their fun summit, which is always in really cool places.
549
00:40:34,592 --> 00:40:35,592
Yeah.
550
00:40:36,255 --> 00:40:38,094
Oh, Austin's the best, yeah.
551
00:40:38,094 --> 00:40:42,755
at the Ritz-Carlton in Fort Lauderdale and then last year in Key Biscayne.
552
00:40:42,795 --> 00:40:43,706
So it's really good.
553
00:40:43,706 --> 00:40:56,419
But, you know, I think what is also interesting now, in terms of challengers to
these big established players, is now you have funds like TLTF that are completely zeroed
554
00:40:56,419 --> 00:40:59,124
in, and reading that
555
00:40:59,124 --> 00:41:09,927
Catalyst article about the future of law made me realize something. I thought the VCs were hitting us up because we've had really strong growth, and it's very visible on
556
00:41:09,927 --> 00:41:13,458
LinkedIn just by headcount. So I was like, that's why they're hitting us up.
557
00:41:13,458 --> 00:41:26,992
But after I read that, I was like, oh, there is a broader investment thesis that what we're doing aligns to, and I never realized how smart these funds are in
558
00:41:26,992 --> 00:41:29,062
understanding like
559
00:41:29,929 --> 00:41:33,822
These are some of the smartest people to ever go into these spaces.
560
00:41:33,822 --> 00:41:39,398
And VCs, boy, their whole value prop is upside down right now.
561
00:41:39,398 --> 00:41:42,371
So they're trying to figure out how to survive too, right?
562
00:41:42,371 --> 00:41:44,053
Because AI is flipping everything.
563
00:41:44,053 --> 00:41:47,126
You don't need those kinds of big investments anymore for engineering.
564
00:41:47,287 --> 00:41:50,802
So they're nervous.
565
00:41:50,802 --> 00:41:55,323
There's a lot of money on the sidelines that needs to find a home, and it's kind of their job to do it.
566
00:41:55,323 --> 00:42:00,344
This was written like I wrote it, and I've been in the space 20 years.
567
00:42:00,344 --> 00:42:02,305
I host a podcast on it.
568
00:42:02,305 --> 00:42:03,655
I speak at conferences.
569
00:42:03,655 --> 00:42:11,317
I attend conferences and this VC wrote what I thought was a fantastic outlook for where
legal is going.
570
00:42:11,317 --> 00:42:12,968
So anyway, we were...
571
00:42:13,328 --> 00:42:13,862
Yeah.
572
00:42:13,862 --> 00:42:14,774
Guys, send me a link.
573
00:42:14,774 --> 00:42:15,554
Really good.
574
00:42:15,554 --> 00:42:16,805
It's really good.
575
00:42:16,945 --> 00:42:26,330
But how does a bootstrapped company, you know, a two-person bootstrapped company, go about competing with these?
576
00:42:26,330 --> 00:42:27,690
I mean, you've told your story.
577
00:42:27,690 --> 00:42:29,671
Is your story repeatable?
578
00:42:29,671 --> 00:42:31,872
Is that a playbook people can use?
579
00:42:32,174 --> 00:42:33,734
I mean, yes and no.
580
00:42:33,734 --> 00:42:36,694
Here's the other kind of thing that's cool.
581
00:42:36,694 --> 00:42:44,034
Everybody always thinks that twenty-whatever-year-old startup people are magic, and they are.
582
00:42:44,154 --> 00:42:51,914
But when you're a startup founder in a different part of your career, later in your
career, you can kind of do things a little differently too.
583
00:42:51,914 --> 00:42:58,154
Hopefully you have a little bit of your own money you can invest, or you have a little more ability to do it the way you want.
584
00:42:58,614 --> 00:43:00,800
I would say that, you know,
585
00:43:00,800 --> 00:43:02,151
depending on where you are in your life.
586
00:43:02,151 --> 00:43:06,134
Because, you know, we did invest in it, right?
587
00:43:06,134 --> 00:43:10,918
We invested time, and it's not like OpenAI was giving us free credits, you know what I mean?
588
00:43:10,918 --> 00:43:16,453
So there was some significant investment, but we invested it ourselves, right?
589
00:43:16,453 --> 00:43:26,061
And so since we didn't have to pay for engineering, which is really expensive, and we didn't have to pay for marketing, which is also really expensive, we saved a ton of
590
00:43:26,061 --> 00:43:26,251
money.
591
00:43:26,251 --> 00:43:30,794
So I say, yes, it's repeatable for the right
592
00:43:31,288 --> 00:43:32,688
people, if that makes sense.
593
00:43:32,688 --> 00:43:42,341
You kind of have to have some of your own resources, and you have to be tenacious as hell.
594
00:43:42,341 --> 00:43:43,079
You know what I mean?
595
00:43:43,079 --> 00:43:44,942
It is hard.
596
00:43:45,022 --> 00:43:47,262
It is hard, but it is fun.
597
00:43:47,342 --> 00:43:52,784
And I used to not have gray hair.
598
00:43:53,064 --> 00:43:54,744
So there you go.
599
00:43:54,744 --> 00:44:00,638
But I also think, and this is a bit esoteric, so forgive me,
600
00:44:00,706 --> 00:44:12,029
we talk about this on the show a lot with VCs. I think AI is going to also help us think about different ways we fund things, because right now we have a very
601
00:44:12,029 --> 00:44:13,729
broken model, I think.
602
00:44:13,850 --> 00:44:21,672
And this is where state governments, if they're smart, are going to start to step in, though, you know, smart government can be a big ask.
603
00:44:21,672 --> 00:44:29,814
But, you know, if you're a nonprofit and you want to go the full altruistic route, there are resources for you to build.
604
00:44:29,814 --> 00:44:30,456
Right.
605
00:44:30,456 --> 00:44:33,888
There's grants, there's foundations, there's all kinds of stuff.
606
00:44:33,888 --> 00:44:39,198
If you want to go the full "I'm going to charge as much as I possibly can for my product and make as much money as possible" route.
607
00:44:39,198 --> 00:44:43,693
So someone wants to buy me and I have an exit, we have a model for that, right?
608
00:44:43,693 --> 00:44:51,728
A model like ours, where I could be charging 30 times more for what we have, but I'm not, because I'm trying to do something that's in between.
609
00:44:51,728 --> 00:44:53,779
It's like a social entrepreneurship, whatever.
610
00:44:53,779 --> 00:44:59,394
There are very limited money structures, support, even thinking around
611
00:44:59,394 --> 00:45:04,358
how to build there, and that's sad, because that is where there's so much opportunity.
612
00:45:04,358 --> 00:45:15,005
So the challenge and the call I will put out, if there are any politicos or people in any space to think about this: there's a real opportunity there to rethink
613
00:45:15,005 --> 00:45:21,570
how we even fund disruptive technologies right now.
614
00:45:21,692 --> 00:45:23,183
Yeah, no, that's a great point.
615
00:45:23,183 --> 00:45:33,559
I mean, you know, VCs, PE, growth equity, all of these funding models have to generate a return on their investment.
616
00:45:33,559 --> 00:45:41,234
And if you're leaving money on the table, which you are intentionally, I don't know that that's the right fit for them.
617
00:45:41,494 --> 00:45:44,035
No, I think they don't understand.
618
00:45:44,035 --> 00:45:47,577
It's truly bizarre to them.
619
00:45:49,198 --> 00:45:57,411
But they're going to have to adapt, because the model for what people are building is going to change, and they're going to be left out, because people aren't going to take their money
620
00:45:57,411 --> 00:45:59,353
and then jack their prices up.
621
00:45:59,353 --> 00:46:02,645
And the younger generations want to make meaningful change.
622
00:46:02,645 --> 00:46:04,206
They don't want to just be rich.
623
00:46:04,206 --> 00:46:05,146
Some do.
624
00:46:05,192 --> 00:46:05,843
It's true.
625
00:46:05,843 --> 00:46:06,143
Yeah.
626
00:46:06,143 --> 00:46:10,299
I mean, rich can be, you know, a byproduct of just doing good things.
627
00:46:10,299 --> 00:46:13,063
That's the hope, right?
628
00:46:13,063 --> 00:46:13,762
And
629
00:46:13,762 --> 00:46:23,994
Yeah, in Massachusetts, our former Secretary of Economic Development, Yvonne Hao, used to call it "do well by doing good."
630
00:46:24,295 --> 00:46:25,535
I love that.
631
00:46:25,576 --> 00:46:27,149
There's no reason you can't have both.
632
00:46:27,149 --> 00:46:28,000
It makes sense.
633
00:46:28,000 --> 00:46:34,752
I mean, you know, I could go take some funding and probably grow a heck of a lot faster and probably end up at a bigger payday.
634
00:46:34,752 --> 00:46:35,263
But you know what?
635
00:46:35,263 --> 00:46:36,453
I love what I do.
636
00:46:36,453 --> 00:46:37,901
I'm growing incrementally.
637
00:46:37,901 --> 00:46:40,425
I mean, we're having over a hundred percent year over year growth.
638
00:46:40,425 --> 00:46:43,736
It's not like it's slow, but I'm enjoying it.
639
00:46:43,736 --> 00:46:53,724
I don't like having some VC breathing down my neck, telling me I've got to push harder, or I have to fire people that aren't meeting these criteria.
640
00:46:54,114 --> 00:46:55,576
Or you get bumped out, right?
641
00:46:55,576 --> 00:46:56,892
They're like, oh, never mind.
642
00:46:56,892 --> 00:46:57,806
Thanks for building this.
643
00:46:57,806 --> 00:46:58,278
Bye bye.
644
00:46:58,278 --> 00:46:59,299
Right, exactly.
645
00:46:59,299 --> 00:47:03,371
And I'm just at a stage in my life where I don't need to do that.
646
00:47:03,371 --> 00:47:05,752
And I'm just having a lot of fun.
647
00:47:05,752 --> 00:47:07,373
My reputation matters to me.
648
00:47:07,373 --> 00:47:13,163
I want to deliver great work to people and hold my head high when I walk around a legal
tech conference, you know.
649
00:47:13,163 --> 00:47:15,386
Maybe we need our own organization, Ted.
650
00:47:15,386 --> 00:47:19,730
Maybe for these stubborn Gen X bootstrappers.
651
00:47:19,730 --> 00:47:23,496
We need our own little conference.
652
00:47:23,496 --> 00:47:34,894
You know what I have honestly thought about in the past? Some sort of model with which vendors could organize and collaborate.
653
00:47:34,894 --> 00:47:37,456
And I haven't figured out how to do that yet.
654
00:47:37,456 --> 00:47:41,969
And I'm so busy with my day job, but I started a franchisee association.
655
00:47:41,969 --> 00:47:43,970
My wife and I own five gyms here in St.
656
00:47:43,970 --> 00:47:51,295
Louis, and that brand needed some help on the franchisee side that we weren't getting from HQ.
657
00:47:51,295 --> 00:47:52,606
They're great, but.
658
00:47:52,606 --> 00:47:54,917
They were growing so fast, they weren't giving support.
659
00:47:54,917 --> 00:47:59,036
So I started a nonprofit and it was wildly successful.
660
00:47:59,036 --> 00:48:00,430
I handed over the keys.
661
00:48:00,430 --> 00:48:03,502
It's still doing great, but man, I got so much out of it.
662
00:48:03,502 --> 00:48:05,222
We created mentoring programs.
663
00:48:05,222 --> 00:48:08,743
We did a summit where we brought everybody together.
664
00:48:08,743 --> 00:48:12,806
We brought in speakers, and it was great.
665
00:48:12,806 --> 00:48:13,867
That's not a bad idea.
666
00:48:13,867 --> 00:48:15,882
Maybe you and I should connect at ILTA.
667
00:48:15,882 --> 00:48:16,722
And talk, yeah.
668
00:48:16,722 --> 00:48:26,305
And then, from my branding perspective, we should encourage the big players to help fund this, because we're going after markets they don't want anyway, or at
669
00:48:26,305 --> 00:48:27,186
least I am.
670
00:48:27,186 --> 00:48:35,848
So it can be like, fine, help the innovation economy, but don't worry, it's not people who want your customers. Then they can look like the good guys in the room who are helping
671
00:48:35,848 --> 00:48:38,139
increase access to the law.
672
00:48:38,139 --> 00:48:39,688
There you go, boom, done.
673
00:48:39,688 --> 00:48:40,219
I like it.
674
00:48:40,219 --> 00:48:43,418
They won't like me, though, because I am taking their market share.
675
00:48:43,418 --> 00:48:44,163
I'm competing with them.
676
00:48:44,163 --> 00:48:45,724
Okay, but we'll keep you.
677
00:48:45,724 --> 00:48:46,864
You can be a silent partner.
678
00:48:46,864 --> 00:48:48,608
Exactly, exactly.
679
00:48:48,608 --> 00:48:50,792
Well, this has been a great conversation.
680
00:48:50,792 --> 00:48:58,948
Before we wrap up, just tell people how they can find out more about what you do, your podcasts, stuff like that.
681
00:48:59,278 --> 00:49:00,318
Yeah, great.
682
00:49:00,318 --> 00:49:04,378
So first of all, if you happen to be at ILTACON, come by the Startup Hub.
683
00:49:04,378 --> 00:49:07,338
We'll be at Booth 123, hanging out, and would love to meet you.
684
00:49:07,338 --> 00:49:09,098
You can always check out our website.
685
00:49:09,098 --> 00:49:14,618
It's "describe" with a Y because, you know, startups have to have weirdly spelled names.
686
00:49:14,878 --> 00:49:17,098
So descrybe.com, with a Y.
687
00:49:17,098 --> 00:49:21,938
We're most active on LinkedIn, which is kind of where the legal tech people hang out.
688
00:49:21,938 --> 00:49:23,698
So find me on there.
689
00:49:23,698 --> 00:49:25,398
Happy to connect.
690
00:49:25,738 --> 00:49:27,578
Ping us, stop by.
691
00:49:27,804 --> 00:49:32,132
the booth, and yeah, we'd just love to talk to people in the community.
692
00:49:32,132 --> 00:49:33,402
It's the best part of the job.
693
00:49:33,402 --> 00:49:33,882
Awesome.
694
00:49:33,882 --> 00:49:36,664
And InfoDash will be in booth 308 as well.
695
00:49:36,664 --> 00:49:39,836
So stop by there after you stop by Descrybe's.
696
00:49:39,836 --> 00:49:42,417
Well, this has been a great conversation.
697
00:49:42,417 --> 00:49:44,408
I look forward to seeing you next week.
698
00:49:44,408 --> 00:49:46,799
And let's keep the conversation going.
699
00:49:47,042 --> 00:49:48,500
Great, thanks Ted, this was really fun.
700
00:49:48,500 --> 00:49:50,478
I'm so happy to have joined you, thank you.
701
00:49:50,478 --> 00:49:50,891
Awesome.
702
00:49:50,891 --> 00:49:51,877
You're very welcome.
703
00:49:51,877 --> 00:49:52,869
Take care.
704
00:49:53,420 --> 00:49:54,332
Bye bye.