In this episode, Ted sits down with Jamie Tso, Senior Associate at Clifford Chance, to discuss vibe coding and the growing role of AI in lawyer-led innovation. From rapid prototyping to production-level experimentation, Jamie shares his expertise in legal AI, software development, and emerging innovation workflows. As AI blurs the line between technical and non-technical roles, this conversation explores what it means for lawyers who want to build, test, and deploy their own tools.
In this episode, Jamie shares insights on how to:
Use AI as a co-pilot to prototype legal tools quickly
Apply vibe coding to solve specific, practice-level problems
Navigate the risks and limitations of AI-generated code
Rethink the role of innovation teams as lawyers build their own solutions
Balance speed, experimentation, and skepticism when working with AI
Key takeaways:
Vibe coding enables lawyers to build functional tools without deep engineering backgrounds
AI dramatically accelerates prototyping and experimentation in legal tech
The distinction between legal and technical roles is increasingly blurred
Lawyer-built tools can be highly tailored but require thoughtful oversight
Healthy skepticism is essential when relying on AI-generated outputs
About the guest, Jamie Tso
Jamie Tso is a Senior Associate in the Hong Kong office of Clifford Chance and a member of the firm’s APAC Private Funds Group. He advises asset managers on the structuring, formation, and ongoing operation of hedge, private equity, real estate, and other alternative investment funds. Jamie also works closely with institutional investors on buyout, venture capital, co-investment, and secondary transactions across the region.
I didn’t know a lot of fundamentals, because I didn’t learn it from a textbook, but I learned coding along the way as I was solving my own problems.
00:00:00.000 --> 00:00:05.700
Jamie, thanks for joining this evening
or really late evening your time.
00:00:05.700 --> 00:00:08.280
I appreciate you, uh, spending
a little time with us today.
00:00:09.870 --> 00:00:12.000
Thank you so much for
inviting me to the podcast.
00:00:12.150 --> 00:00:14.400
Really nice, uh, to be talking with you.
00:00:15.210 --> 00:00:15.690
Yeah.
00:00:16.079 --> 00:00:20.370
Well, you've been making some waves on
LinkedIn with some of the vibe coding
00:00:20.640 --> 00:00:25.740
that you've been doing, which I have
found really fascinating and it has.
00:00:26.670 --> 00:00:33.300
It has created a lot of, um, questions
about, you know, the current state of
00:00:33.300 --> 00:00:38.550
legal AI tools, whether or not there's,
you know, how, how much incremental
00:00:38.550 --> 00:00:42.780
value there is there over kind of
the, the standard frontier models,
00:00:43.379 --> 00:00:44.910
um, which we'll talk about.
00:00:44.970 --> 00:00:48.660
Before we do, why don't we get you,
why don't we, we get you introduced.
00:00:48.660 --> 00:00:52.590
So maybe tell us, you know, who you are,
what you do, and, and where you do it.
00:00:55.214 --> 00:00:58.934
I am a funds lawyer,
uh, based in Hong Kong.
00:00:59.415 --> 00:01:01.699
Um, so I, and my day to day.
00:01:02.265 --> 00:01:05.295
I work, I, uh, advise
clients on fund formation.
00:01:05.385 --> 00:01:09.585
Um, so that's my, uh, you know,
my, the legal side of things.
00:01:09.645 --> 00:01:15.255
Um, so on top of that, I like to use AI
to build stuff that, uh, people want.
00:01:15.315 --> 00:01:18.855
Uh, I like to solve, uh, both my
own pain points and, you know,
00:01:19.275 --> 00:01:21.015
uh, other people's pain points.
00:01:21.375 --> 00:01:25.515
Uh, you know, when I was a trainee,
I, I started off, uh, um, building
00:01:25.515 --> 00:01:28.815
some small language models when TensorFlow
was still around, you know, when,
00:01:28.845 --> 00:01:30.855
when, uh, TensorFlow became popular.
00:01:31.650 --> 00:01:33.030
Uh, to do some entity mapping.
00:01:33.480 --> 00:01:37.920
Um, uh, as and when GPT came out,
then I started experimenting with,
00:01:38.130 --> 00:01:42.780
uh, building AI enabled workflows,
uh, to automate some of the, uh,
00:01:42.810 --> 00:01:44.610
legal workflows that I encounter.
00:01:45.030 --> 00:01:49.860
Uh, which led, uh, me, you know,
uh, um, to actually learn how to code.
00:01:50.100 --> 00:01:56.070
Um, so I, I probably didn't, uh, uh, know
a lot of fundamentals, you know, because,
00:01:56.100 --> 00:01:58.710
uh, I, I didn't learn it from a textbook, um.
00:01:59.490 --> 00:02:03.900
I kind of learned, uh, coding along the
way, uh, as I was solving my own problems.
00:02:03.960 --> 00:02:08.460
Uh, so, uh, I would say I'm still a
non-technical person, uh, but still,
00:02:08.789 --> 00:02:13.050
uh, I've been experimenting with,
uh, building apps, um, uh, these
00:02:13.050 --> 00:02:17.100
days using a lot of what we call vibe
coding platforms like, uh, Lovable
00:02:18.270 --> 00:02:20.250
and, uh, of course Google AI Studio.
00:02:21.840 --> 00:02:22.050
Yeah.
00:02:22.050 --> 00:02:26.910
Where would you rank yourself
on the spectrum from zero to 10
00:02:26.970 --> 00:02:29.790
in terms of coding capabilities?
00:02:31.080 --> 00:02:34.620
Uh, I, I, I think it's,
uh, all very relative.
00:02:34.830 --> 00:02:41.010
I think, uh, compared to other lawyers,
I would say, I'm confident to say nine.
00:02:41.190 --> 00:02:47.220
Um, uh, compared to other, uh, you know,
uh, actual, uh, software engineers.
00:02:47.894 --> 00:02:52.650
On a scale one to 10, I probably on
the, you know, one, so, you know, it's
00:02:52.650 --> 00:02:56.505
a, to be honest, yeah, I, I, I don't
claim that I'm an expert
00:02:56.505 --> 00:03:01.394
here, so, uh, but, but then, uh, I have
the help of AI, which I think is
00:03:01.635 --> 00:03:03.135
really, really proficient these days.
00:03:03.195 --> 00:03:07.305
Um, and a lot of the software
engineers actually, they rely on AI
00:03:07.305 --> 00:03:09.015
to generate a lot of code anyway.
00:03:09.465 --> 00:03:12.584
So I would say it's actually a
really, really good equalizer.
00:03:14.280 --> 00:03:20.970
Yeah, I've got a team of about 35 of them
who, who use, uh, we're a Cursor shop.
00:03:20.970 --> 00:03:25.290
We use Cursor quite a bit,
and it's been amazing.
00:03:25.680 --> 00:03:27.750
Some of the efficiencies
that we've gained.
00:03:27.750 --> 00:03:31.020
We're not quote unquote vibe coding,
it's more of kind of a co-pilot
00:03:31.470 --> 00:03:37.770
paradigm, um, you know, that are
helping us, helping us generate code.
00:03:37.890 --> 00:03:40.770
And there's still a very high touch.
00:03:41.130 --> 00:03:46.020
On, you know, every, we have a pretty
structured process around code review.
00:03:46.740 --> 00:03:54.330
So before there's a, a pull request
into the, um, Git repository,
00:03:54.390 --> 00:03:56.070
somebody has to evaluate it.
00:03:56.070 --> 00:04:02.280
And, but with that additional
discipline, you really can create
00:04:02.340 --> 00:04:07.140
production level code and things
that are ready for prime time.
00:04:07.170 --> 00:04:08.280
I would say, without it, you can't.
00:04:09.105 --> 00:04:11.805
That's my observation. I'm a
former software engineer.
00:04:11.955 --> 00:04:14.085
I say former because
I'm, I'm a little rusty.
00:04:14.175 --> 00:04:19.095
Uh, I was a database guy, so I was on
the SQL team at Microsoft 25 years ago.
00:04:19.575 --> 00:04:24.435
And I'm still proficient in SQL
because, um, SQL hasn't changed much,
00:04:24.975 --> 00:04:29.385
uh, you know, all of the new front
end, you know, JavaScript libraries,
00:04:29.385 --> 00:04:35.055
and I, I'm not up to speed at all
on that, but you know, I, I know.
00:04:35.715 --> 00:04:41.474
I understand the, uh, principles of
software design, and I see some of the
00:04:41.474 --> 00:04:46.005
code that AI writes and it's a little all
over the board, inconsistent patterns.
00:04:46.575 --> 00:04:54.375
Um, documentation is, um, I, again,
I would say inconsistent.
00:04:54.375 --> 00:04:58.635
But let's maybe jump
in and start off with kind of vibe
00:04:58.635 --> 00:05:01.605
coding, just definitionally.
00:05:01.605 --> 00:05:03.344
What does, what does vibe coding mean?
00:05:04.440 --> 00:05:09.390
To you and then what are its
applications for legal professionals?
00:05:10.170 --> 00:05:10.560
Yeah.
00:05:10.590 --> 00:05:14.490
So vibe coding is a, a term
coined by one of the co-founders
00:05:14.520 --> 00:05:16.770
of, uh, OpenAI, uh, Andrej Karpathy.
00:05:17.460 --> 00:05:22.560
Um, to quote exactly what he said is,
uh, he, he said that vibe quoting is,
00:05:22.620 --> 00:05:29.250
um, a practice where you just give
into the vibes, um, uh, embrace the
00:05:29.250 --> 00:05:31.920
exponential and never look at the code.
00:05:32.520 --> 00:05:39.630
So basically, uh, it means to code
in natural language without looking
00:05:39.630 --> 00:05:46.650
at the code and just, um, saying the
features that you want to become real.
00:05:47.159 --> 00:05:55.830
So, uh, I I, and in the in lawyers
context, um, I think, uh, we can
00:05:55.830 --> 00:05:59.820
think of it, um, in, in a sense,
you know, in a context of workflows.
00:06:00.240 --> 00:06:00.870
So.
00:06:01.695 --> 00:06:09.705
We see that, uh, you know, workflow
builders, um, are not a, a, a new thing.
00:06:09.795 --> 00:06:16.695
So, um, applications like Bryter, um, uh,
um, more consumer focused applications
00:06:16.695 --> 00:06:22.665
like Zapier and n8n, uh, enterprise applications
like, uh, Microsoft, uh, uh, Power
00:06:22.665 --> 00:06:25.035
Automate, these are all no-code platforms.
00:06:25.455 --> 00:06:27.885
Uh, but the problem with these no-code
platforms, they're, they're, they're
00:06:27.885 --> 00:06:29.145
really good for non-technical people.
00:06:30.180 --> 00:06:34.080
who wanna automate their workflows,
but then you kind of have, the
00:06:34.080 --> 00:06:35.520
learning curve is still pretty steep.
00:06:35.970 --> 00:06:39.360
You have to learn, you know,
what does this node do?
00:06:39.960 --> 00:06:41.705
Uh, what are the inputs,
what are the outputs?
00:06:43.125 --> 00:06:45.885
And then, uh, there are also
other limitations because
00:06:45.885 --> 00:06:47.085
it's just not flexible enough.
00:06:47.835 --> 00:06:53.685
Uh, I think now that, uh, AI's really,
really good at coding, um, there's
00:06:53.685 --> 00:06:57.435
no need for these kind of visual
builders anymore, in my opinion.
00:06:58.005 --> 00:07:01.635
Uh, because you can have AI just
generate code, um, that is
00:07:01.635 --> 00:07:06.974
flexible enough to capture any
scenario, uh, which is why, uh.
00:07:08.325 --> 00:07:14.655
When, when, uh, when people say they
use vibe coding to build apps, uh,
00:07:14.744 --> 00:07:18.344
they're actually describing, uh, all
the features that they want to see in an
00:07:18.344 --> 00:07:25.695
app or, uh, their, uh, their workflows
they want automated and then just giving it,
00:07:25.965 --> 00:07:28.635
um, to AI to help them figure out
00:07:28.724 --> 00:07:33.914
what's the best way, without
committing, um, to a certain platform.
00:07:34.425 --> 00:07:36.555
Or the architecture of that platform.
00:07:37.185 --> 00:07:42.855
Um, so, uh, it's basically like,
um, a blank canvas where AI can
00:07:42.855 --> 00:07:48.945
basically do whatever as long as
it, uh, fits, um, uh, your request.
00:07:49.275 --> 00:07:53.565
So, uh, the product becomes something
that's really personalized, uh,
00:07:54.405 --> 00:07:55.965
really tailored to your own needs.
00:07:56.895 --> 00:08:01.125
So this is, you know how I see this
different from, you know, the previous
00:08:01.125 --> 00:08:02.865
generation of visual builders.
00:08:04.680 --> 00:08:09.420
So, um, here's, uh, this is my take on
the state of vibe coding and, uh, I'm,
00:08:09.420 --> 00:08:12.840
I'm curious if you, you agree, you've,
you've, you've done more of it than I
00:08:12.840 --> 00:08:19.860
have, but where I think vibe coding
falls short is in the ilities, right?
00:08:19.860 --> 00:08:24.990
The maintainability, the scalability,
um, usability, supportability,
00:08:25.830 --> 00:08:31.200
um, you know,
those, those matters require.
00:08:31.965 --> 00:08:39.164
A significant amount of engineering
thought and intentionality to
00:08:39.164 --> 00:08:46.485
make all those ilities happen, um,
in a way where you can deploy an
00:08:46.485 --> 00:08:54.465
application across an enterprise
and have risk appropriately managed.
00:08:54.915 --> 00:08:59.685
Um, you know, and I would
say the ilities that really.
00:09:00.375 --> 00:09:05.655
I think jump out as being gaps today
are maintainability and scalability.
00:09:06.375 --> 00:09:15.314
Um, the, the vibe coded apps
that I've seen when cracked open
00:09:15.704 --> 00:09:20.204
and looked under the hood again,
there's not a lot of consistency.
00:09:20.265 --> 00:09:21.885
Um, security seems to be.
00:09:23.115 --> 00:09:24.465
A bit ad hoc.
00:09:25.095 --> 00:09:26.025
Um, I don't know.
00:09:26.025 --> 00:09:27.555
What is your, what is your take on that?
00:09:27.555 --> 00:09:29.925
You've done more of it than me, so
I'm curious to get your thoughts.
00:09:31.215 --> 00:09:34.005
So I would say yes, for sure.
00:09:34.005 --> 00:09:37.275
There are a lot of, um, shortcomings
when it comes to vibe coding.
00:09:37.935 --> 00:09:41.025
Um, I would say, uh,
00:09:41.025 --> 00:09:43.064
Scalability, you know, it's, um.
00:09:43.920 --> 00:09:47.939
If you end up vibe coding something
that a lot of people like and you
00:09:47.939 --> 00:09:51.570
wanna share it with your colleagues,
then the question becomes, you know,
00:09:51.570 --> 00:09:52.830
how do you maintain a database?
00:09:53.280 --> 00:09:58.620
Um, how would you, uh,
um, uh, monitor usage?
00:09:59.189 --> 00:10:01.950
How do you, uh, manage
login credentials?
00:10:02.550 --> 00:10:06.689
Uh, how do you, um, you know, uh,
structure the backend so that, you know,
00:10:06.689 --> 00:10:08.850
you can basically, uh, have memory.
00:10:08.970 --> 00:10:12.210
So these kind of things,
it's, it's currently hard.
00:10:12.300 --> 00:10:12.630
Um.
00:10:13.949 --> 00:10:14.459
To achieve.
00:10:15.089 --> 00:10:18.720
But then we're getting there because
we see that Replit, Lovable, and a
00:10:18.720 --> 00:10:22.410
lot of these vibe coding platforms,
they have kind of these blueprints.
00:10:22.829 --> 00:10:28.020
Um, so blueprints, meaning that,
um, uh, if you want to add in
00:10:28.140 --> 00:10:32.220
like login credentials,
then it's just one click away.
00:10:32.310 --> 00:10:35.490
They have some templates that
you can click and then that agent
00:10:35.520 --> 00:10:37.110
will go in and build that for you.
00:10:38.130 --> 00:10:41.520
Um, so that helps, uh, to, you
know, to mitigate, uh, some of
00:10:41.520 --> 00:10:43.320
that, uh, scalability issues.
00:10:43.860 --> 00:10:50.040
Uh, but then, you know, um, I would
say, uh, when people vibe code apps,
00:10:50.520 --> 00:10:54.000
uh, some, you know, they, their
intention might not be to build
00:10:54.000 --> 00:10:55.890
something that a lot of people want.
00:10:56.430 --> 00:11:00.060
Maybe they just wanna build something
that they can use themselves
00:11:00.240 --> 00:11:02.555
locally on, uh, on their own laptop.
00:11:03.735 --> 00:11:09.194
So in that sense, uh, probably they don't
need to think about scalability that much.
00:11:09.795 --> 00:11:14.715
Uh, if they're happy to just,
just, um, run it locally, um, uh,
00:11:14.745 --> 00:11:19.185
and, uh, they could even iterate
on it, um, to make it even more
00:11:19.215 --> 00:11:22.005
specialized, uh, uh, more personalized.
00:11:22.215 --> 00:11:27.165
Um, so in, in that sense, you know,
um, uh, scalability becomes less of
00:11:27.165 --> 00:11:29.925
an issue. Uh, in terms of maintenance.
00:11:30.045 --> 00:11:30.525
Um.
00:11:31.469 --> 00:11:33.089
Yes, you need to host it somewhere, right?
00:11:33.089 --> 00:11:36.839
Um, if, if you want, of course to
share it, uh, with someone else.
00:11:37.199 --> 00:11:40.469
Uh, if you don't want to have to
initiate it every time, uh, then
00:11:40.469 --> 00:11:43.079
you have to host it somewhere,
uh, hosting becomes an issue.
00:11:43.199 --> 00:11:44.880
Do you host it in your own private cloud?
00:11:45.390 --> 00:11:46.005
Do you host it locally?
00:11:46.555 --> 00:11:49.074
Do you, uh, host it in a public cloud?
00:11:49.314 --> 00:11:53.635
You know, these also becomes a,
a, a, another, uh, another issue.
00:11:53.875 --> 00:11:58.464
Uh, so deployment is also something that
a lot of the vibe coding platforms, they're
00:11:58.464 --> 00:12:04.015
trying to build on that capability, you
know, to deploy in a, in an environment
00:12:04.464 --> 00:12:08.484
that in your, in your, in, in your
own environment or in your favorite
00:12:08.484 --> 00:12:12.594
environment, whether it's, um, a public
cloud, whether it's private cloud.
00:12:13.320 --> 00:12:16.050
You know, I think that some, some of
these platforms allow you to choose.
00:12:16.740 --> 00:12:21.840
Um, so, uh, I would say for sure,
uh, uh, uh, these are the issues.
00:12:22.110 --> 00:12:28.110
But I see that the trend is that
these problems will get solved by,
00:12:28.170 --> 00:12:30.420
you know, uh, by, by these companies.
00:12:30.960 --> 00:12:33.450
Um, because the demand is here.
00:12:33.960 --> 00:12:37.980
Um, so because a lot of people
want to build apps this way.
00:12:39.030 --> 00:12:43.320
I think, uh, the sensible business
decision is actually to solve it at the
00:12:43.320 --> 00:12:48.270
platform level, um, so that you know,
these people, they don't have to go
00:12:48.270 --> 00:12:49.770
on to solve these problems themselves.
00:12:50.220 --> 00:12:55.110
So six months ago we would say that
vibe coded apps just don't work.
00:12:55.230 --> 00:12:56.010
Bugs are everywhere.
00:12:56.520 --> 00:13:00.689
As the models become better, as the vibe
coding platforms have more integrations,
00:13:01.110 --> 00:13:04.950
then we are seeing more and more of these
vibe coded apps being used in production.
00:13:05.820 --> 00:13:08.760
And six months from now, we
can't really, can't tell.
00:13:08.820 --> 00:13:13.290
You know, we, we can't really, uh, uh,
uh, tell if it's, um, uh, it would be
00:13:13.290 --> 00:13:15.060
used in, you know, production en masse.
00:13:15.690 --> 00:13:19.965
Another thing is that, um, there's also
the transient nature of vibe coded apps.
00:13:21.030 --> 00:13:23.520
I'm, uh, I wanna share, uh, examples.
00:13:23.520 --> 00:13:29.430
So, um, uh, a month ago, there's, there
was a tragic, tragic fire in Hong Kong,
00:13:29.490 --> 00:13:31.890
uh, that claimed, uh, a lot of lives.
00:13:32.250 --> 00:13:38.400
And, uh, to aid the rescue, uh, a student
at the Hong Kong University of Science and Technology.
00:13:38.940 --> 00:13:43.590
Uh, he vibe coded an app, uh, that
allows, uh, uh, different people to
00:13:43.590 --> 00:13:49.205
actually flag reports, you know, so,
um, for example, uh, uh, users can.
00:13:49.824 --> 00:13:56.275
Uh, uh, can, can flag, you know,
tower six, uh, um, uh, flat at 29.
00:13:56.755 --> 00:13:59.755
Uh, uh, there, there's some people
there that needs to be rescued.
00:14:00.084 --> 00:14:03.775
And that became something that, you
know, uh, uh, that, that became,
00:14:04.285 --> 00:14:10.375
uh, you know, the, you know, the app
for rescue that day, but then the
00:14:10.375 --> 00:14:12.084
next day, will it still be used?
00:14:12.685 --> 00:14:13.165
No.
00:14:13.165 --> 00:14:19.344
So it's, uh, it's only, uh, an app that
solves a particular demand just in time.
00:14:20.205 --> 00:14:23.895
So in that sense, would
maintenance become an issue?
00:14:24.105 --> 00:14:30.165
Not so much so, um, of course that
student made a heroic, uh, you know, uh,
00:14:30.615 --> 00:14:32.505
decision to make, uh, the vibe coded app.
00:14:33.015 --> 00:14:37.665
Um, so, so, uh, you know, uh, you know,
I think he deserves a lot of respect.
00:14:37.725 --> 00:14:41.235
Uh, and, and I, and, and, uh, on
that, I think on that note, I would
00:14:41.235 --> 00:14:45.525
say that, you know, maintenance
becomes a lesser of issue if the
00:14:45.525 --> 00:14:46.905
app is just transient in nature.
00:14:47.595 --> 00:14:51.105
It's just to meet a very
urgent demand just in time.
00:14:52.155 --> 00:14:52.545
Yeah.
00:14:52.845 --> 00:14:57.675
And I think that, and, and I'm definitely
not throwing shade at vibe coding.
00:14:57.705 --> 00:15:04.695
I think that it, it, where I think it
is absolutely game changing is in the
00:15:04.695 --> 00:15:07.095
prototyping phase of software development.
00:15:07.305 --> 00:15:10.755
Like if you think about a traditional
software development lifecycle,
00:15:11.265 --> 00:15:13.215
you know, especially something as
00:15:13.949 --> 00:15:18.089
rigid and inflexible as, like,
a waterfall methodology.
00:15:18.689 --> 00:15:26.280
You know, where you gather requirements
that are built into, um, you know,
00:15:26.310 --> 00:15:28.860
God, we haven't done waterfall in
so many years now, but like high
00:15:28.860 --> 00:15:34.620
and low level designs and a software
specification, um, requirements, and
00:15:34.620 --> 00:15:37.350
then an architecture diagram and then.
00:15:38.055 --> 00:15:43.245
Wireframes, and it's just like
you can compress all of that into
00:15:43.335 --> 00:15:47.565
and where the business users, because
in that traditional kind of software
00:15:47.565 --> 00:15:52.845
methodology, the end users communicate
to business analysts what it is that
00:15:52.845 --> 00:15:57.795
they want, usually verbally, and,
and they kind of transcribe it in
00:15:57.795 --> 00:16:02.985
words and there's so much ambiguity in
those words that has to be interpreted
00:16:03.435 --> 00:16:04.875
and like context really matters.
00:16:04.875 --> 00:16:10.815
If you can give an end user the
ability to vibe code what it is that
00:16:10.815 --> 00:16:14.805
they want, and then hand that off to
an engineering team to to, to build
00:16:15.255 --> 00:16:21.195
properly and integrate it into the bigger
environment and have hooks into existing
00:16:21.195 --> 00:16:28.545
infrastructure and adopt patterns and
practices that are, you know, blessed by
00:16:28.545 --> 00:16:31.965
the powers that be within that enterprise.
00:16:31.965 --> 00:16:34.995
I mean, that is absolutely game
changing because that's where.
00:16:36.015 --> 00:16:38.325
That's where most of software
development goes wrong.
00:16:38.715 --> 00:16:42.315
It's in the, from the time
that the business communicates
00:16:42.315 --> 00:16:43.455
what it is that they want.
00:16:44.085 --> 00:16:47.145
And you know, again, in a waterfall
methodology, they may not see
00:16:47.145 --> 00:16:49.605
something for six months to react to.
00:16:49.605 --> 00:16:54.075
Agile has sped that up significantly,
but if you go look in the enterprise,
00:16:54.465 --> 00:16:58.425
I'll tell you right now, most
especially regulated industries,
00:16:58.425 --> 00:17:00.165
most big companies are still doing.
00:17:00.615 --> 00:17:01.995
I call it water scrum.
00:17:02.445 --> 00:17:08.115
Um, it's like waterfall and, uh,
scrum hybrid, but there's still a
00:17:08.115 --> 00:17:16.395
very large gap between the end user's ask
and when they finally see something.
00:17:16.395 --> 00:17:19.185
So I, I think it holds
a great promise there.
00:17:20.085 --> 00:17:21.645
Mm, for sure, for sure.
00:17:22.185 --> 00:17:26.595
And I think that, um, to your
point, I think, uh, these
00:17:26.595 --> 00:17:27.704
kind of design principles.
00:17:28.230 --> 00:17:32.910
They're still relevant, you
know, uh, in all, uh, in all
00:17:32.910 --> 00:17:35.010
disciplines of engineering.
00:17:35.280 --> 00:17:40.980
Um, and, and I think that is also why,
uh, a lot of these vibe coding platforms,
00:17:40.980 --> 00:17:44.970
they have a plan mode because they
know that a lot of stuff can go wrong.
00:17:45.390 --> 00:17:49.620
So when, uh, that, uh, the plan
mode means that if a user, uh,
00:17:49.680 --> 00:17:51.990
activates it and, uh, uh, maybe.
00:17:52.875 --> 00:17:57.885
Put in a feature request, then the
agent will go into the entire code base.
00:17:58.725 --> 00:18:03.885
Think first, you know, uh, which,
um, file do we need to amend?
00:18:04.365 --> 00:18:07.305
Um, what, uh, libraries
do we need to import?
00:18:07.905 --> 00:18:14.415
Um, so I think this, and this is an
example of AI adopting, you know,
00:18:14.415 --> 00:18:16.545
best principles of engineering.
00:18:17.400 --> 00:18:19.950
To make vibe coding more
viable in that regard.
00:18:20.520 --> 00:18:23.250
Um, and I think that it's, uh,
it's, and, and the planning
00:18:23.250 --> 00:18:25.770
feature is actually pretty decent.
00:18:26.195 --> 00:18:31.020
Uh, uh, uh, you know, uh, for Claude
Code, um, you know, uh, if you turn
00:18:31.020 --> 00:18:35.340
on planning mode on, on Claude Code,
um, it searches everything and then
00:18:35.340 --> 00:18:36.900
comes back with a to-do list, right?
00:18:37.800 --> 00:18:42.810
And then on the to-do list, uh, uh,
there might be like six items on that list.
00:18:43.230 --> 00:18:45.000
And, uh, and the agent will.
00:18:45.690 --> 00:18:49.770
When, when they execute the plan, um,
then they would cross out, you know,
00:18:49.860 --> 00:18:54.450
item one is finished, item three is
finished, and then go back to verify
00:18:54.810 --> 00:18:56.280
if the plan is actually executed.
00:18:56.700 --> 00:19:00.540
So I think these are kind of, uh,
the embodiment of some of the best
00:19:00.540 --> 00:19:05.280
practices that, uh, we have been
adopting for the past few decades.
00:19:05.280 --> 00:19:05.490
Right.
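For readers who want to see the shape of the plan-then-execute loop described here, below is a minimal sketch in Python. The call_llm helper, the prompts, and the to-do strings are hypothetical placeholders for illustration, not the actual mechanics of Claude Code or any other platform.

```python
# Minimal sketch of a plan-then-execute agent loop, assuming a hypothetical
# call_llm(prompt) -> str helper; real coding agents also wire this into the
# codebase, file edits, and test runs.
from typing import Callable

def plan_and_execute(feature_request: str, call_llm: Callable[[str], str]) -> list[str]:
    # 1. Plan: ask the model to break the request into a short to-do list.
    plan = call_llm(f"Break this feature request into a numbered to-do list:\n{feature_request}")
    todo_items = [line for line in plan.splitlines() if line.strip()]

    # 2. Execute: work through the list one item at a time, crossing items off.
    results = []
    for item in todo_items:
        results.append(call_llm(f"Complete this step and report what you changed:\n{item}"))

    # 3. Verify: check the finished work back against the original plan.
    verification = call_llm(
        "Given this plan and these results, list any steps that are not actually done:\n"
        f"PLAN:\n{plan}\nRESULTS:\n" + "\n".join(results)
    )
    results.append(verification)
    return results
```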
00:19:06.240 --> 00:19:08.520
Um, and I think that, you know, uh.
00:19:08.700 --> 00:19:11.280
I think experience also matters.
00:19:11.430 --> 00:19:14.970
So I, I don't claim to be an
experienced vibe coder, but I
00:19:14.970 --> 00:19:16.440
think by codes is a new breed.
00:19:16.440 --> 00:19:21.300
So maybe a, a few more, you know, some I'm
more experienced than, uh, other people.
00:19:21.600 --> 00:19:25.080
Uh, so I would say, uh, you
also learn from your mistakes.
00:19:25.380 --> 00:19:30.660
You also learn that you can't really
go headfirst and that then expect the
00:19:30.660 --> 00:19:32.580
agent to do everything in one go.
00:19:33.060 --> 00:19:37.140
You kind of know the
limits, um, of the agent.
00:19:39.044 --> 00:19:40.455
Try to do things in phases.
00:19:40.875 --> 00:19:45.585
So I would say, uh, the, you know,
these things eventually, uh, you know,
00:19:45.915 --> 00:19:52.365
converge, you know, uh, vibe coders learn
how to, um, uh, build apps the right way.
00:19:52.905 --> 00:19:55.905
And then engineers, they learn
how to embrace the vibes.
00:19:56.415 --> 00:20:00.645
So I, I like to see that, you
know, uh, um, back to your point
00:20:00.645 --> 00:20:02.264
about, you know, whether, um.
00:20:03.345 --> 00:20:10.095
Uh, uh, you know, uh, I think, I think
there the, the, the, uh, distinction
00:20:10.575 --> 00:20:14.475
between technical and non-technical
people, I think it's getting a little bit
00:20:14.475 --> 00:20:16.305
more blurry, that, that line between them.
00:20:17.145 --> 00:20:19.965
I see that product managers
become more technical.
00:20:20.385 --> 00:20:23.775
I see that engineers become
more interested in design.
00:20:24.435 --> 00:20:28.605
So, uh, I, uh, so I, I would say that
line becomes a little bit more blurry.
00:20:29.939 --> 00:20:30.270
Yeah.
00:20:30.270 --> 00:20:34.379
And you know, speaking of product
management, uh, so InfoDash, we're
00:20:34.379 --> 00:20:39.600
a legal intranet and extranet platform,
and a part of our architecture
00:20:39.600 --> 00:20:41.490
is a layer called the Integration Hub.
00:20:42.060 --> 00:20:46.800
And the Integration Hub reaches
into all the back office systems in
00:20:46.800 --> 00:20:51.689
a law firm like Practice Management,
document management, CRM, HRIS,
00:20:51.689 --> 00:20:53.310
experience management, and then.
00:20:54.270 --> 00:20:57.899
Presents a unified API that's
security trimmed and that
00:20:57.899 --> 00:20:59.370
respects ethical wall boundaries.
00:20:59.879 --> 00:21:05.909
And we built that to hydrate our web
parts that users build experiences with.
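As a rough illustration of the unified, security-trimmed API idea described above (and not InfoDash's actual implementation), a hub over several back-office adapters might be sketched like this; the adapter protocol and the ethical-wall lookup are hypothetical stand-ins.

```python
# Hypothetical sketch of a unified, security-trimmed API over back-office systems:
# each source adapter returns raw records, and the hub filters them against the
# caller's ethical-wall restrictions before returning a single combined result.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Record:
    source: str       # e.g. "dms", "practice_management"
    matter_id: str
    title: str

class SourceAdapter(Protocol):
    def search(self, query: str) -> list[Record]: ...

class IntegrationHub:
    def __init__(self, adapters: list[SourceAdapter], walled_matters: dict[str, set[str]]):
        self.adapters = adapters
        # walled_matters maps a user id to the matter ids that user must NOT see.
        self.walled_matters = walled_matters

    def search(self, user_id: str, query: str) -> list[Record]:
        blocked = self.walled_matters.get(user_id, set())
        results: list[Record] = []
        for adapter in self.adapters:
            # Security trimming: drop anything behind an ethical wall for this user.
            results.extend(r for r in adapter.search(query) if r.matter_id not in blocked)
        return results
```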
00:21:06.360 --> 00:21:09.780
And this was all kind
of pre-AI, pre-ChatGPT.
00:21:10.260 --> 00:21:16.200
But after, you know, after ChatGPT
and Azure OpenAI and Azure AI Search
00:21:16.830 --> 00:21:21.240
as a. Before we had a formal product
manager in place, which we do now.
00:21:21.270 --> 00:21:25.770
It was kind of me, like I was, I was
brainstorming new ways, new paths
00:21:25.770 --> 00:21:27.060
that we could take with a product.
00:21:27.510 --> 00:21:35.010
So I used AI to, I described in
great detail all of the pieces of
00:21:35.010 --> 00:21:40.260
infrastructure integrations and then
all of the tools that Microsoft enables.
00:21:40.260 --> 00:21:42.660
'cause we deploy our solution
in the client's tenant.
00:21:42.780 --> 00:21:46.590
And I had it brainstorm
ideas like what use cases.
00:21:47.595 --> 00:21:50.895
You know, within a law firm
could this architecture enable?
00:21:51.195 --> 00:21:55.815
And oh my God, it gave
us four amazing ones.
00:21:55.815 --> 00:22:01.065
One of 'em, I'm, we're meeting on Monday,
so today is January 2nd, January 5th
00:22:01.065 --> 00:22:02.955
with a law firm to actually build it out.
00:22:03.765 --> 00:22:09.615
So it's like, you know, the ability to
have like a thought partner who can,
00:22:09.705 --> 00:22:13.725
you can describe, okay, look, this is
all the, this is the infrastructure.
00:22:14.399 --> 00:22:15.930
What can we do with it?
00:22:16.170 --> 00:22:16.379
Right?
00:22:16.379 --> 00:22:18.960
Give me ideas and, um, wow.
00:22:18.960 --> 00:22:20.550
Man, it knocked the cover off the ball.
00:22:20.550 --> 00:22:21.690
It did a really good job.
00:22:22.649 --> 00:22:23.520
I can imagine.
00:22:23.610 --> 00:22:24.389
I can imagine.
00:22:24.659 --> 00:22:31.409
Um, I, I think that, uh, sometimes I
like to, you know, so I think that that
00:22:31.409 --> 00:22:36.510
also goes to the planning part, as in,
um, you're gonna have to, I think, uh,
00:22:36.600 --> 00:22:40.139
from idea, uh, ideation to production.
00:22:40.590 --> 00:22:43.590
I think right now, because AI has, uh, shortened
00:22:45.030 --> 00:22:51.000
the time from idea to execution, it means
that we can spend more time, uh, thinking
00:22:51.000 --> 00:22:57.570
about ideas and, uh, and, and, and
getting, you know, a, a more, uh, uh, you
00:22:57.570 --> 00:23:02.550
know, spending more time actually thinking
about the idea, thinking about, you know,
00:23:02.550 --> 00:23:05.100
the actual features that, that you want.
00:23:05.639 --> 00:23:10.230
And in that process, I think AI
is also a really, really good, um.
00:23:10.605 --> 00:23:15.015
tool to help you discover, you know,
things that you weren't even aware before.
00:23:15.165 --> 00:23:20.235
Do a lot of deep research, for example,
uh, to think about, you know, the, the
00:23:20.235 --> 00:23:24.945
tech stack, you know, what kind of, uh,
um, dependencies would you be using?
00:23:25.245 --> 00:23:26.055
What limits?
00:23:26.475 --> 00:23:32.505
I think, you know, uh, I think having,
you know, uh, said thought partner
00:23:33.315 --> 00:23:36.495
really, uh, helps you, you know.
00:23:38.775 --> 00:23:40.395
Think better ideas, so to speak.
00:23:40.455 --> 00:23:45.675
Um, so 'cause, because, you know, uh,
although, you know, it also goes to the,
00:23:46.125 --> 00:23:52.755
um, uh, sycophancy of these AI tools, you know,
you might have a, you know, bad idea.
00:23:52.755 --> 00:23:56.265
But then after discussing with AI,
it still says things like, good idea.
00:23:56.265 --> 00:23:57.315
You, you know, way to go.
00:23:57.315 --> 00:24:00.375
But then when you execute it, it's
just not what you know, people want.
00:24:00.585 --> 00:24:03.435
So we need to be careful of
that, uh, uh, about that as well.
00:24:04.470 --> 00:24:07.200
I think a healthy dose of
skepticism is, is in order.
00:24:07.590 --> 00:24:10.530
I, uh, I did a neat little
experiment last night.
00:24:10.890 --> 00:24:13.170
I took my first grade report card.
00:24:13.410 --> 00:24:15.240
My, I mean, I'm 53 years old.
00:24:15.240 --> 00:24:16.410
This was, yeah.
00:24:16.740 --> 00:24:20.790
And I uploaded it in AI and
said, what would be the career?
00:24:20.790 --> 00:24:24.065
I, I blurred, I redacted my
name and said what would be.
00:24:24.895 --> 00:24:26.754
A likely career path for this person.
00:24:27.445 --> 00:24:30.115
And it, I'd used it in all four models.
00:24:30.145 --> 00:24:33.669
Claude, ChatGPT, Grok, and Gemini.
00:24:33.970 --> 00:24:34.389
Mm-hmm.
00:24:34.475 --> 00:24:41.274
And Claude and ChatGPT kind of
cheated, um, and used memory,
00:24:42.115 --> 00:24:44.514
because they knew what I do.
00:24:44.965 --> 00:24:45.054
Mm-hmm.
00:24:45.294 --> 00:24:52.345
But then I put them in incognito mode and
they still did really, really well, but I.
00:24:53.250 --> 00:24:57.180
I approached that with a, a
healthy dose of skepticism.
00:24:57.180 --> 00:24:59.820
When I saw how accurate the results
were, I was like, wait a second.
00:25:00.510 --> 00:25:02.430
This is, this is too accurate.
00:25:02.430 --> 00:25:03.030
You're cheating.
00:25:03.030 --> 00:25:07.710
And so I asked, I was like, how
did memory influence your output?
00:25:07.770 --> 00:25:10.470
And Claude was like, you got me.
00:25:14.135 --> 00:25:17.790
Uh, which is, so I think that anytime
you're interacting with these models,
00:25:17.790 --> 00:25:19.590
having a healthy dose of skepticism.
00:25:20.340 --> 00:25:24.750
Again, because of the, uh,
sycophantic tendencies.
00:25:24.870 --> 00:25:27.090
Um, and they're, they're deceptive.
00:25:27.750 --> 00:25:30.750
They are, these models can be deceptive.
00:25:30.750 --> 00:25:34.470
I think they're working, especially
Anthropic has been very transparent
00:25:34.470 --> 00:25:39.690
about how incredibly deceptive they can be, to
the point where, I don't know if you
00:25:39.690 --> 00:25:45.510
remember, they tested a model where
they were gonna shut it down and
00:25:45.570 --> 00:25:49.170
this fake CEO was having an affair.
00:25:50.220 --> 00:25:58.530
With a subordinate, and Claude threatened
to expose him if they shut the model down.
00:25:58.860 --> 00:26:01.560
And it's just like, wow,
that's just mind blowing.
00:26:01.560 --> 00:26:03.000
So I, you gotta be careful, man.
00:26:03.000 --> 00:26:06.450
These, these, these models are
crafty just like people are.
00:26:07.980 --> 00:26:08.460
Exactly.
00:26:08.730 --> 00:26:14.640
Um, I, you know, especially when they're
very, very confident when they're wrong.
00:26:15.240 --> 00:26:20.850
Uh, it, it's, I think that, um, uh,
someone said, you know, it's, um,
00:26:21.900 --> 00:26:29.070
uh, before AI you can kind of tell
pretty quickly if something, you know,
00:26:29.310 --> 00:26:32.820
you're reading something and something
seems off, you know, immediately tell
00:26:32.820 --> 00:26:36.090
that, you know, that, that that's,
that's, that's just, um, inaccurate.
00:26:36.660 --> 00:26:43.770
Uh, but now with AI it's really
hard, uh, because it can seem very.
00:26:44.295 --> 00:26:44.955
Consistent.
00:26:45.070 --> 00:26:45.740
Consistent.
00:26:45.745 --> 00:26:47.025
So it's very logical.
00:26:47.535 --> 00:26:54.045
Um, I think it goes to how, you know,
the underlying architecture of LLMs,
00:26:54.465 --> 00:26:56.265
uh, being pattern recognition models.
00:26:56.835 --> 00:27:00.075
I mean, you know, some, someone
will, you know, will tell me that
00:27:00.075 --> 00:27:01.245
it's not pattern recognition.
00:27:01.275 --> 00:27:05.895
But then I think precisely because
it's really, really good at writing.
00:27:07.245 --> 00:27:11.925
It's, uh, they, they can put
words together in a way that is
00:27:11.925 --> 00:27:13.425
really good at convincing people.
00:27:13.425 --> 00:27:17.835
I think there is a study where, you
know, um, people, uh, that the, the
00:27:17.835 --> 00:27:25.065
researchers compared, uh, um, uh, LLMs and
humans on, uh, the task of convincing,
00:27:25.305 --> 00:27:29.955
you know, convincing, you know, so,
uh, uh, uh, so I think the experiments
00:27:29.955 --> 00:27:33.645
is that, um, some humans actually
taking multiple choice questions.
00:27:34.034 --> 00:27:39.705
And then, uh, uh, one group would be the
LLM that, you know, convinced the humans
00:27:39.705 --> 00:27:45.465
to take another answer, and the other
group would be another human, um, asking
00:27:45.554 --> 00:27:49.574
the human to reconsider and convince
them to, you know, choose another answer.
00:27:50.264 --> 00:27:54.855
The LLMs are really good at both
convincing those humans to pick,
00:27:55.304 --> 00:28:00.524
um, the right answer and the wrong
answer, so it can go bo, go both ways.
00:28:01.095 --> 00:28:03.675
So, whereas humans, they're
just, they're just not, they're
00:28:03.675 --> 00:28:05.565
just bad at convincing overall.
00:28:06.045 --> 00:28:11.205
So, um, I think that, you know, that's,
uh, definitely, uh, says something.
00:28:11.925 --> 00:28:12.315
Yeah.
00:28:12.795 --> 00:28:17.625
Well, let's talk a little bit about,
uh, Spellpage and your legal AI OS.
00:28:18.105 --> 00:28:22.515
Um, tell us a little bit about, I, I guess
are the, are those two separate apps?
00:28:23.475 --> 00:28:23.835
Hmm.
00:28:23.985 --> 00:28:24.435
Uh, okay.
00:28:24.435 --> 00:28:28.965
Maybe, uh, just to give, uh, the
listeners a, a little bit of background.
00:28:28.965 --> 00:28:30.555
So, uh.
00:28:31.395 --> 00:28:35.775
The reason, you know why I think I'm
on this podcast is because I've been
00:28:35.865 --> 00:28:41.445
vibe coding, uh, uh, a lot of apps
that, um, are lightweight versions
00:28:41.625 --> 00:28:46.305
of some of the, uh, most popular, uh,
legal AI, uh, products out there.
00:28:46.514 --> 00:28:46.905
Um.
00:28:47.760 --> 00:28:50.700
Uh, one of them, uh, being
Spellbook, which is like a,
00:28:51.270 --> 00:28:54.120
um, an AI-powered Word editor.
00:28:54.420 --> 00:28:58.110
So, um, I think they position
themselves as Cursor for
00:28:58.110 --> 00:28:59.940
Word, uh, Cursor for lawyers.
00:29:00.300 --> 00:29:04.830
So instead of, uh, an AI agent that
lives within the developer's IDE, it's
00:29:04.830 --> 00:29:09.810
basically an agent that lives within,
uh, Microsoft Word that helps, uh, you
00:29:09.810 --> 00:29:11.760
know, helps with drafting, for example.
00:29:12.780 --> 00:29:17.790
Um, the reason why I decided to vibe
code, uh, these, you know, lightweight
00:29:18.030 --> 00:29:22.980
clones is because, uh, Gemini 3 came
out, and then I started, you know,
00:29:23.460 --> 00:29:28.590
uh, seeing a lot of people sharing,
uh, their, the mini apps that they've
00:29:28.590 --> 00:29:30.690
built on the, on social media.
00:29:31.230 --> 00:29:36.030
Um, and as you know, with vibe coding,
you know, uh, uh, I think this is
00:29:36.030 --> 00:29:37.980
also very common practice actually.
00:29:37.980 --> 00:29:38.310
Uh.
00:29:39.090 --> 00:29:44.400
For people to test out these models
by recreating, you know, these uh, apps.
00:29:44.460 --> 00:29:47.430
You know, I think someone
tried to clone Windows 97.
00:29:48.180 --> 00:29:49.710
I think someone tried to clone.
00:29:50.129 --> 00:29:55.590
Uh, Lovable, tried to make a, a, a clone
from scratch, you know, just to test,
00:29:55.590 --> 00:29:56.760
you know, how good these models are.
00:29:56.760 --> 00:30:01.565
So I, I, I, you know, I, I, being
a lawyer, um, I, I, I, I, I,
00:30:01.565 --> 00:30:03.389
I try to do something similar.
00:30:03.450 --> 00:30:08.820
So I tried to see, well, well, how well
would it, you know, um, clone
00:30:08.850 --> 00:30:10.560
some of these legal AI tools?
00:30:10.919 --> 00:30:14.940
Um, actually at first I
didn't, I, I, I wasn't.
00:30:16.125 --> 00:30:21.225
Uh, thinking about cloning Spellbook.
I, my, you know, my initial
00:30:21.225 --> 00:30:26.055
thought was that, uh, I wanted to,
you know, to, to see, you know, I, I
00:30:26.055 --> 00:30:32.055
came across this, uh, novel writing
app, Sudowrite, and I'm fascinated
00:30:32.055 --> 00:30:37.725
about, you know, how, how they are, uh,
uh, allowing users to basically, uh.
00:30:38.294 --> 00:30:42.405
write the next paragraph and then offer,
you know, um, uh, you know, to
00:30:42.794 --> 00:30:44.600
engineer a plot twist here and so on.
00:30:44.679 --> 00:30:49.635
I, I imagine, you know, how, you know,
would that work for, uh, lawyers?
00:30:49.995 --> 00:30:54.584
Would that work for, you know, people who,
you know, need a, just a simple contract?
00:30:54.854 --> 00:31:01.304
So I had this idea that I put this into,
uh, Gemini 3 on Google AI Studio.
00:31:01.574 --> 00:31:06.735
Then it, it came up with something
that is, you know, that's, I would say.
00:31:07.785 --> 00:31:10.995
I wouldn't say that immediately
usable, but it's impressive.
00:31:11.175 --> 00:31:16.305
You know, while I, you can basically,
uh, come up with a, a decent services
00:31:16.305 --> 00:31:20.895
agreement and I can continue iterating
on it by, uh, giving it instructions,
00:31:20.925 --> 00:31:22.005
giving the AI instructions.
00:31:22.515 --> 00:31:27.750
So I think at, at some point
I, uh, just asked, uh, Gemini,
00:31:28.230 --> 00:31:30.300
can we go full-on agent mode?
00:31:30.810 --> 00:31:34.470
And by agent mode I mean that, you know,
can you, every time I ask you to do
00:31:34.470 --> 00:31:38.639
something, can you spin up a to-do list
and then iterate through that todo, uh,
00:31:38.700 --> 00:31:42.540
uh, iterate through that to-do list and
then complete the whole task that way.
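A hedged sketch of what that "agent mode" request might look like as a reusable system prompt; generate_reply is a hypothetical stand-in for whatever model call the app actually makes, not a specific SDK.

```python
# Hypothetical sketch of the "agent mode" instruction described above: a system
# prompt that tells the model to turn every request into a to-do list and work
# through it. generate_reply(system, user) -> str stands in for the real LLM call.
AGENT_MODE_PROMPT = """
Every time the user asks you to do something:
1. Spin up a numbered to-do list covering the whole task.
2. Work through the list item by item, marking each item done as you finish it.
3. Only stop when every item is checked off, then summarize what you did.
"""

def run_agent_turn(user_request: str, generate_reply) -> str:
    # generate_reply is a placeholder for the actual model call used by the app.
    return generate_reply(AGENT_MODE_PROMPT, user_request)
```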
00:31:43.560 --> 00:31:45.990
So, um, it, it, it did that.
00:31:45.990 --> 00:31:49.710
So, you know, I kind of shared my, um.
00:31:50.295 --> 00:31:52.935
a demo of that mini app, uh, on LinkedIn.
00:31:52.965 --> 00:31:55.305
I think that, that, you know,
that picked up a lot of interest.
00:31:55.845 --> 00:31:58.905
Um, so, so that's Spellpage.
00:31:59.115 --> 00:31:59.355
Yeah.
00:31:59.355 --> 00:32:03.945
And, and, uh, of course I also
did a, another app that kind
00:32:03.945 --> 00:32:08.775
of, uh, um, I would say kind of
mimics the tabular review feature.
00:32:09.585 --> 00:32:11.445
Uh, that's offered by Harvey and Legora.
00:32:11.955 --> 00:32:17.355
Uh, so, um, that was also
built on Google AI Studio, um,
00:32:17.685 --> 00:32:20.115
surprisingly in just an afternoon.
00:32:20.475 --> 00:32:23.265
So it's, uh, it's really interesting.
00:32:23.325 --> 00:32:28.545
Um, so I think these kind of two,
two experiments, um, made me think,
00:32:28.965 --> 00:32:34.665
you know, um, first, you know,
how amazing these frontier models
00:32:34.665 --> 00:32:38.445
are, and the second thing is whether
00:32:39.540 --> 00:32:45.540
Um, lawyers can build their
own, uh, legal AI tools.
00:32:46.140 --> 00:32:53.825
Um, the reason for that is because, um, I
see that, uh, the industry, for the past
00:32:55.245 --> 00:33:00.465
few decades, they have been relying,
uh, a lot on vendors, for sure.
00:33:01.155 --> 00:33:06.135
Uh, I think, you know, for the past
year, I think there has been more
00:33:06.135 --> 00:33:11.295
and more, um, talks on whether law
firms or lawyers should build their
00:33:11.295 --> 00:33:13.125
own, build out their own tech stack.
00:33:14.280 --> 00:33:21.675
And the reason for that is because I
think, uh, if AI becomes more and more.
00:33:22.890 --> 00:33:25.230
You know, AI becomes a utility, let's say.
00:33:25.830 --> 00:33:31.440
Then, uh, you need to differentiate
yourself from other law firms,
00:33:32.190 --> 00:33:36.300
and it would probably be hard to
differentiate yourself from other law
00:33:36.300 --> 00:33:38.010
firms if you're using the same product.
00:33:38.790 --> 00:33:43.080
So maybe the edge comes from you
actually building your own stuff,
00:33:43.590 --> 00:33:47.190
uh, and, you know, tailoring it
to your own use case and workflow.
00:33:47.400 --> 00:33:48.960
So, um.
00:33:49.290 --> 00:33:52.590
You know, this idea came to my mind
and then I, I, you know, I kind of
00:33:52.620 --> 00:33:56.639
thought, you know, if a thousand
lawyers started to, uh, build their
00:33:56.639 --> 00:34:00.690
own tools, then there should be some
commonalities among these apps, right?
00:34:01.770 --> 00:34:10.139
So, you know, and why wouldn't there
be, uh, some open source project
00:34:10.375 --> 00:34:17.670
or infras at the infrastructure
level that kind of, um, uh, um.
00:34:18.375 --> 00:34:22.065
Identify, you know, these kind of
dependencies, I think, and, and
00:34:22.065 --> 00:34:26.505
kind of, um, uh, build it in a way
that allow lawyers to, uh, you know,
00:34:26.505 --> 00:34:29.804
build their own tools on top of
these, uh, this, this layer, right?
00:34:29.924 --> 00:34:36.255
Um, and, and I would say, uh, the reason
why I think that makes sense is because,
00:34:37.304 --> 00:34:42.685
uh, lawyers are actually good at, you
know, uh, coming together and building.
00:34:43.350 --> 00:34:43.950
Standards.
00:34:44.250 --> 00:34:49.440
So, um, if we look at the NVCA documents,
which is like the, uh, National Venture Capital
00:34:49.440 --> 00:34:53.520
Association, uh, template documents
for fundraising, you know, that is,
00:34:53.580 --> 00:34:57.540
you know, a collaborative effort
from different lawyers, uh, same
00:34:57.540 --> 00:34:59.490
for ISDA agreements and so on.
00:34:59.940 --> 00:35:05.190
I think, uh, that happens if something
becomes more and more commoditized and
00:35:05.310 --> 00:35:07.590
so, so standards are built that way.
00:35:08.175 --> 00:35:15.105
If we take the view that, um, a lot
of the, um, a lot of our know-how and
00:35:15.105 --> 00:35:22.665
templates would be converted into AI
workflows or AI apps, then wouldn't we
00:35:22.815 --> 00:35:29.085
be able to, you know, also make certain
AI features or AI experiences standard?
00:35:29.115 --> 00:35:32.265
So, you know, which is why
it, it kind of got me thinking
00:35:32.265 --> 00:35:35.805
whether, um, there is, uh, a.
00:35:36.765 --> 00:35:41.205
Need for an open source project
at the infrastructure level.
00:35:42.674 --> 00:35:46.365
So, well, you know what's interesting
is in InfoDash we use our Integration
00:35:46.365 --> 00:35:51.525
Hub and allow, allow, um, vibe coding,
more to come on that we haven't, um,
00:35:51.795 --> 00:35:56.055
we have something coming out soon
that we're gonna demo, but we're
00:35:56.055 --> 00:35:57.525
thinking along those lines as well.
00:35:57.555 --> 00:36:03.555
'cause the hardest part in all of that is
getting the data that you need securely.
00:36:04.485 --> 00:36:06.585
And consistently, right?
00:36:06.585 --> 00:36:09.855
If you have to go and touch all
these different API endpoints
00:36:10.245 --> 00:36:12.525
that potentially change, right?
00:36:12.525 --> 00:36:20.445
When iManage releases a new version
of their API, or NetDocuments or Aderant
00:36:20.505 --> 00:36:26.445
or Elite, that's where things have a,
things become fragile in that regard.
00:36:26.625 --> 00:36:29.685
That that piece has to
be actively managed.
00:36:30.435 --> 00:36:30.915
Um, but I'm.
00:36:31.920 --> 00:36:37.799
I'm curious what your thoughts are about,
since you have vibe coded, like tabular
00:36:37.799 --> 00:36:43.860
review is a really fundamental feature it
seems, of the, of the Harveys and Legoras of
00:36:43.860 --> 00:36:51.240
the world, and you've been able to vibe
code a pretty good iteration of that.
00:36:51.245 --> 00:37:00.060
In your spare time, do these legal
AI tools truly have a moat, um, or.
00:37:00.735 --> 00:37:03.345
'cause we've talked about thin
wrappers and everything for quite
00:37:03.345 --> 00:37:06.915
some time, and there's more to it
than just the UI layer, but the UI
00:37:06.915 --> 00:37:09.165
layer is a lot of their value props.
00:37:09.165 --> 00:37:12.015
I don't know, what do you, what are,
what is your thinking now that you've
00:37:12.015 --> 00:37:15.825
been able to replicate some of these
features that a lot of people have been
00:37:15.825 --> 00:37:20.625
thinking that was the moat and you,
you're challenging that I feel like,
00:37:21.885 --> 00:37:27.250
yeah, I, I feel that, uh, so first of
all, uh, the tabular review feature.
00:37:28.484 --> 00:37:30.285
It's actually, I've open sourced it.
00:37:30.825 --> 00:37:35.174
So I would encourage people to, you know,
lawyers who have experience, you know,
00:37:35.205 --> 00:37:39.525
using Harvey and Legora, to actually test it to
see, you know, what are the actual gaps.
00:37:40.035 --> 00:37:46.335
Um, I think that, you know, it's hard
to just look at the demos and see, you
00:37:46.335 --> 00:37:47.690
know, because, you know, this is, uh.
00:37:48.305 --> 00:37:50.915
Looks the same, uh, and
then it must be the same.
00:37:51.335 --> 00:37:56.435
So, uh, I, I, I, I think, I suspect,
you know, although I don't know,
00:37:56.975 --> 00:38:00.065
um, the underlying architecture
might be a little bit different.
00:38:00.095 --> 00:38:01.925
Um, I can cite a few examples.
00:38:02.615 --> 00:38:07.205
Um, so for the tabular review
tool that I've created, um, it
00:38:07.205 --> 00:38:08.525
doesn't have an embedding model.
00:38:09.065 --> 00:38:09.575
Um.
00:38:10.335 --> 00:38:12.675
For those who don't know,
embedding models are usually used.
00:38:12.675 --> 00:38:21.195
If, um, you, uh, uh, you, you, uh,
you upload a, a long document, uh,
00:38:21.225 --> 00:38:27.405
which exceeds the context window, uh,
of the LLM, then you'll have to kind
00:38:27.405 --> 00:38:32.085
of convert, uh, you know, chunk the
documents into different paragraphs
00:38:32.655 --> 00:38:37.035
and then only feed the most relevant,
relevant paragraphs to the AI.
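To make that chunk-and-retrieve idea concrete, here is a minimal sketch; the toy embed function below is a bag-of-words stand-in, whereas a real pipeline would call a dedicated embedding model and a vector store.

```python
# Minimal sketch of retrieval for a document that exceeds the context window:
# split into paragraph chunks, embed each chunk, and keep only the chunks most
# similar to the question. embed() is a toy stand-in for a real embedding model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: lowercase word counts (stand-in for a real embedding model).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def top_chunks(document: str, question: str, k: int = 3) -> list[str]:
    chunks = [p.strip() for p in document.split("\n\n") if p.strip()]  # paragraph chunks
    q_vec = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(embed(c), q_vec), reverse=True)
    return ranked[:k]  # only these chunks get sent to the LLM with the question
```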
00:38:37.634 --> 00:38:38.085
Um.
00:38:38.940 --> 00:38:41.460
The, the thing I, I built, it
doesn't have the embedding model.
00:38:42.060 --> 00:38:46.200
Uh, and I, I know, you know, from
several, I think interviews that I've
00:38:46.379 --> 00:38:51.540
seen, you know, uh, um, are given by,
uh, you know, Harvey and Legora engineers,
00:38:51.540 --> 00:38:54.299
I think that they, they, they have
pretty sophisticated embedding models.
00:38:54.299 --> 00:38:56.460
I think that's a gap.
00:38:56.970 --> 00:39:01.980
Uh, I think whether that embedding
model actually gives more
00:39:01.980 --> 00:39:06.750
accuracy, uh, and precision to
those answers require evaluation.
00:39:08.085 --> 00:39:09.135
And that's another thing, right?
00:39:09.525 --> 00:39:14.355
How do we evaluate, uh, you know,
these outputs from different AI
00:39:14.355 --> 00:39:19.965
tools, and is there an open source,
uh, evaluation dataset for that?
00:39:20.715 --> 00:39:26.415
Um, uh, how, you know, uh, how do
we even know, you know, whether,
00:39:26.445 --> 00:39:30.615
you know, this actually outperforms,
uh, uh, a standard off-the-shelf
00:39:30.645 --> 00:39:32.205
LLM. We don't know.
00:39:32.205 --> 00:39:36.345
So I think some of these
companies have, you know, um.
00:39:37.305 --> 00:39:39.375
internal evaluation datasets.
00:39:39.915 --> 00:39:43.815
Uh, there are also external
benchmarks like, um, Vals AI,
00:39:44.055 --> 00:39:47.625
um, which compare different
legal AI tools and so on.
00:39:48.195 --> 00:39:52.935
I think to rigorously, you know,
test, you know, these outputs, uh,
00:39:53.265 --> 00:39:57.465
out, I think, uh, request, you know,
a serious benchmarking exercise.
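The benchmarking exercise described here boils down to something like the sketch below: a shared gold-answer set and a scoring rule applied to each system. The gold questions and the substring grader are illustrative only; real legal evaluations use much richer rubrics.

```python
# Toy sketch of a benchmarking harness: run each system on the same gold
# question set and compare scores. grade() is a naive substring check; real
# evaluations (internal datasets, benchmarks like Vals AI) use richer rubrics.
from typing import Callable

GOLD_SET = [
    {"question": "What is the notice period in clause 4?", "answer": "30 days"},
    {"question": "What is the governing law?", "answer": "Hong Kong"},
]

def grade(predicted: str, expected: str) -> bool:
    return expected.lower() in predicted.lower()

def evaluate(system: Callable[[str], str]) -> float:
    correct = sum(grade(system(item["question"]), item["answer"]) for item in GOLD_SET)
    return correct / len(GOLD_SET)

# Usage: compare evaluate(my_tabular_review_tool) against
# evaluate(off_the_shelf_llm) on the same gold set.
```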
00:39:58.125 --> 00:40:02.805
But my, my thought is actually,
you know, that, you know, um, um.
00:40:03.405 --> 00:40:06.165
I think, um, there are two
ways to think about it.
00:40:06.320 --> 00:40:11.205
The, the first is that,
you know, uh, I think, um,
00:40:13.335 --> 00:40:19.545
so, uh, at the, you know, for,
for the actual accuracy of the
00:40:19.635 --> 00:40:26.865
AI, uh, tool, uh, you know, it's,
um, it depends on the LLMs as well.
00:40:27.405 --> 00:40:32.175
So I, I don't think there is a
serious benchmarking exercise that's
00:40:32.175 --> 00:40:37.875
ever been done between, um, off-
the-shelf LLMs and legal AI tools.
00:40:39.195 --> 00:40:41.265
Um, I think there is
some efforts to do it.
00:40:41.775 --> 00:40:46.485
Um, uh, but then, uh, uh, there's,
I don't think there is a, you know,
00:40:46.635 --> 00:40:48.585
a lot of research into this area.
00:40:49.095 --> 00:40:53.325
Um, uh, the second thing is basically
the UI/UX experience, right?
00:40:53.385 --> 00:40:54.525
So whether the UI
00:40:57.090 --> 00:40:59.280
is domain specific, for example.
00:41:00.330 --> 00:41:05.010
Um, uh, to be honest, I, I,
I, I, I, I'm not really sure.
00:41:05.490 --> 00:41:11.070
Um, one may say that, you know,
citation is a, uh, must-have feature
00:41:11.070 --> 00:41:15.060
for lawyers, but I would say it's
actually a pretty generic thing.
00:41:15.165 --> 00:41:17.340
'cause Perplexity does citations as well.
00:41:18.180 --> 00:41:18.420
Right.
00:41:18.870 --> 00:41:23.730
Um, and I think that for all business
organizations, if you're searching
00:41:24.000 --> 00:41:25.500
for your own internal database.
00:41:26.355 --> 00:41:29.444
Um, that it requires some
sort of citations to ensure
00:41:29.444 --> 00:41:30.615
that there's no hallucination.
00:41:31.544 --> 00:41:35.685
So if we look at a broader market,
we see that kind of, these features
00:41:35.685 --> 00:41:39.464
are actually offered by other AI
tools as well in other industries.
00:41:40.154 --> 00:41:44.984
So I would say there's also
some homogeneity, um, that feature is also
00:41:44.984 --> 00:41:46.814
quite homogenous, I would say.
00:41:47.294 --> 00:41:52.395
Uh, there might be some, some, uh,
thing about, you know, uh, um, you
00:41:52.395 --> 00:41:53.895
know, these AI tools being connected.
00:41:54.585 --> 00:41:58.305
To data sources that are unique to
lawyers, for example, LexisNexis.
00:41:58.305 --> 00:41:59.265
I think that's a fair point.
00:41:59.895 --> 00:42:05.205
I think these kind of integrations
would help these tools stand out as
00:42:05.205 --> 00:42:07.545
a legal specific, you know, AI tool.
00:42:08.145 --> 00:42:09.715
Um, uh, but.
00:42:10.455 --> 00:42:14.384
You know, for example, for Word editing,
I think Word editing is basically
00:42:14.384 --> 00:42:18.404
something that the entire world,
you know, is facing as a problem.
00:42:18.825 --> 00:42:22.575
You know, having AI edit your
Word documents, it's not a lawyer
00:42:22.575 --> 00:42:24.555
problem, it's like a global problem.
00:42:24.705 --> 00:42:26.714
And Microsoft is also solving that.
00:42:26.714 --> 00:42:30.584
I, I suppose, you know, with
Copilot, I know they will eventually
00:42:30.584 --> 00:42:32.564
get there, I think there's still
00:42:36.360 --> 00:42:37.290
A lot of work to do.
00:42:37.830 --> 00:42:38.700
Yeah, for sure, for sure.
00:42:39.930 --> 00:42:44.400
Um, but then if we look at
Anthropic, they have a docx skill,
00:42:45.030 --> 00:42:48.960
uh, quite recently, you know, uh,
uh, that, that you can import into
00:42:48.960 --> 00:42:51.450
Claude Code and import into Claude.
00:42:51.750 --> 00:42:57.000
So, um, it's not like the industry is not
solving these kind of problems and it kind
00:42:57.000 --> 00:43:02.580
of worries me that, you know, whether,
um, we are solving the kind of problems
00:43:02.580 --> 00:43:04.260
that are unique to the industry or.
00:43:04.680 --> 00:43:08.550
whether we are solving the problems
that, you know, the foundation models.
00:43:09.105 --> 00:43:12.825
Uh, the model companies are solving
or the, you know, other industries
00:43:12.825 --> 00:43:17.714
are solving, and if they solve,
uh, this problem faster, then the
00:43:17.714 --> 00:43:19.455
question becomes, you know, yeah.
00:43:19.484 --> 00:43:22.154
You, you then, then your, you
know, your concern becomes valid.
00:43:22.154 --> 00:43:23.025
You know, it's very remote.
00:43:23.535 --> 00:43:26.955
Um, but either way, I mean,
for lawyers it's good news.
00:43:27.495 --> 00:43:31.875
So because, you know, uh, it means that we
have better tools, uh, at a cheaper price.
00:43:32.325 --> 00:43:34.395
Um, I think that, I think, you know.
00:43:35.355 --> 00:43:38.234
I, I always welcome some
healthy competition.
00:43:38.265 --> 00:43:42.555
You know, uh, you know, I think that
it's important to understand the
00:43:42.555 --> 00:43:46.544
gaps as well, I think, um, which
is also what I'm doing as well.
00:43:47.714 --> 00:43:53.535
Um, I think that I, I, I love, uh,
to, you know, um, share my kind of
00:43:53.535 --> 00:43:57.555
learning, uh, with other lawyers, you
know, how capable the frontier models
00:43:57.555 --> 00:43:59.145
are and what we can do with them.
00:43:59.535 --> 00:44:03.915
Because I've, you know, discussed with
different lawyers, you know, who have
00:44:03.915 --> 00:44:09.285
been thinking about, you know, how do we
actually, you know, uh, uh, use AI better.
00:44:09.645 --> 00:44:12.285
Uh, how do we, uh, you know,
um, make sure that we have an
00:44:12.285 --> 00:44:13.694
edge, you know, as a lawyer.
00:44:14.879 --> 00:44:17.910
You know, one thing that came up is,
you know, maybe it's adoption, right?
00:44:18.899 --> 00:44:25.859
Uh, between picking a better tool
and, uh, and making sure 70%,
00:44:26.895 --> 00:44:31.875
90%, uh, of your organization
is actually using AI daily.
00:44:32.415 --> 00:44:36.375
I think maybe the latter will have
more impact, make you more competitive.
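As a back-of-the-envelope illustration of tracking that kind of adoption target, a small script over a usage log might look like this; the CSV columns and the 70% threshold are assumptions drawn from the conversation, not a standard report.

```python
# Minimal sketch: estimate what share of the organisation used AI each day.
import csv
from collections import defaultdict

def daily_adoption(log_path: str, headcount: int) -> dict[str, float]:
    users_per_day = defaultdict(set)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):  # assumed columns: date, user_id
            users_per_day[row["date"]].add(row["user_id"])
    return {day: len(users) / headcount for day, users in sorted(users_per_day.items())}

if __name__ == "__main__":
    for day, rate in daily_adoption("ai_usage_log.csv", headcount=500).items():
        status = "on target" if rate >= 0.7 else "below target"
        print(f"{day}: {rate:.0%} ({status})")
```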
00:44:37.155 --> 00:44:41.175
So maybe it's also a culture
issue, uh, like a culture issue.
00:44:41.745 --> 00:44:45.645
It's also a business model issue as in,
you know, how do you charge, you know,
00:44:45.645 --> 00:44:47.715
legal service as a whole and so on.
00:44:47.715 --> 00:44:52.100
So, I, I think there's a lot
more to the equation, so to speak.
00:44:53.115 --> 00:44:57.075
Yeah, no, this, that was, uh,
a really good, um, overview.
00:44:57.105 --> 00:44:59.775
So we're almost outta time,
but I did wanna bounce one
00:44:59.775 --> 00:45:00.945
other question off of you.
00:45:00.945 --> 00:45:02.775
So this is Legal Innovation Spotlight.
00:45:03.375 --> 00:45:08.055
You know, a big chunk of
our audience are like legal
00:45:08.415 --> 00:45:10.065
innovation professionals, right?
00:45:10.065 --> 00:45:16.725
So CINOs, um, you know, legal
or, um, innovation attorneys,
00:45:17.235 --> 00:45:19.005
uh, a lot of KM folks.
00:45:19.515 --> 00:45:20.475
How does.
00:45:21.569 --> 00:45:27.210
Like vibe coding, like historically, that
has been the function within the law firm
00:45:27.330 --> 00:45:33.779
that's been responsible for, I guess kind
of what you're doing is like, you know,
00:45:33.779 --> 00:45:41.700
evaluating alternatives, um, exploring
different paths to technical solutions.
00:45:42.509 --> 00:45:50.370
And if a lawyer, you know, does this, does
vibe coding change the relationship?
00:45:50.640 --> 00:45:57.300
Between the lawyers in a law firm and
their innovation teams, because they're
00:45:57.300 --> 00:46:01.650
now able to do some prototyping that
they might have engaged their technology
00:46:01.650 --> 00:46:05.400
or, or innovation partners for in the
past. Do you see that relationship
00:46:05.400 --> 00:46:07.080
changing as a result of vibe coding?
00:46:11.835 --> 00:46:14.625
Yeah, I, I think that
changes a lot actually.
00:46:14.685 --> 00:46:19.815
Um, so, because if we think in the past,
you know, if lawyers have questions,
00:46:19.845 --> 00:46:23.025
uh, if they experience pain points,
you know, what do, what do they do?
00:46:23.055 --> 00:46:24.675
They usually reach out to.
00:46:25.375 --> 00:46:28.075
Their innovation team, uh, or the IT team.
00:46:28.194 --> 00:46:32.484
Uh, and then the IT team will, uh,
reach out to different vendors and
00:46:32.484 --> 00:46:37.734
maybe do IT research to see if there's
any vendor that can solve that problem.
00:46:38.544 --> 00:46:41.484
Uh, I suppose that also goes
through a, they, they have a
00:46:41.484 --> 00:46:42.895
list of selection criteria.
00:46:43.254 --> 00:46:49.015
Uh, they have, um, some filtering exercise
to do, um, security scans and so on.
00:46:49.524 --> 00:46:52.854
And then after the filtering
exercise, then, uh, there's a
00:46:52.854 --> 00:46:53.899
trial where the lawyers will.
00:46:54.525 --> 00:46:55.515
Try out different products.
00:46:55.845 --> 00:47:00.105
I think by that time, usually the lawyer
doesn't experience the pain point
00:47:00.105 --> 00:47:04.485
anymore because maybe the transaction
has already completed, or, you
00:47:04.485 --> 00:47:08.895
know, they, um, uh, they, they, they,
they, they, they find out that, you
00:47:08.895 --> 00:47:12.975
know, after the filtering, you know,
well, this only solves 20% of my problem.
00:47:13.575 --> 00:47:19.125
Uh, you know, uh, so, uh, uh, and
then, and, and then historically
00:47:19.125 --> 00:47:23.145
I think that, you know, innovation
team, uh, they also, um.
00:47:24.240 --> 00:47:27.359
You know, they, they, they, they,
they, they're basically the experts
00:47:27.930 --> 00:47:30.810
in using these tools because, you
know, historically, you know, uh,
00:47:30.870 --> 00:47:33.660
these SaaS tools have a lot of
buttons, a lot of features, right?
00:47:34.200 --> 00:47:37.620
And, uh, lawyers, they don't know
how to use them, use all of them.
00:47:37.620 --> 00:47:39.689
It's like a printer, you know,
they're a hundred buttons.
00:47:40.169 --> 00:47:42.450
Uh, you know, lawyers don't
know how to use all of them.
00:47:42.870 --> 00:47:47.819
So I think, uh, for knowledge management,
uh, people, you know, um, uh, they
00:47:47.819 --> 00:47:49.420
became the expert in using these tools.
00:47:50.895 --> 00:47:55.154
Um, you know, they need to teach the
lawyers how to use each and every
00:47:55.365 --> 00:47:57.345
feature, uh, to solve their, uh, use case.
00:47:58.274 --> 00:47:58.690
Now, this, this.
00:47:59.370 --> 00:48:03.630
Might shift fundamentally if lawyers
are allowed to actually build their
00:48:03.630 --> 00:48:08.880
own tools, because, uh, they could be,
you could build something that's really
00:48:08.880 --> 00:48:13.890
personalized with a few, a lot less
buttons, for example, because that
00:48:13.950 --> 00:48:19.170
is highly tailored to the use case, and
traditional SaaS, you know, it, it requires
00:48:19.170 --> 00:48:21.330
people to actually work around them.
00:48:21.600 --> 00:48:21.780
Right.
00:48:22.290 --> 00:48:27.630
You know, uh, they, they require people
to actually, um, learn how to use it.
00:48:27.944 --> 00:48:33.465
And then, you know, um, adjust their
workflow, you know, uh, so, so, so to
00:48:33.465 --> 00:48:36.105
fit, you know, the SaaS tools design.
00:48:36.645 --> 00:48:40.245
Uh, but if in the future, if it's
actually the other way around where
00:48:40.245 --> 00:48:45.674
the SaaS tool actually adapts to
the lawyer's habits, then this also, you
00:48:45.674 --> 00:48:51.015
know, um, it's that, you know, this
also becomes, um, much more direct.
00:48:51.134 --> 00:48:55.305
So I think what I, um.
00:48:55.950 --> 00:49:04.050
Uh, would love to see, uh, would be, uh,
I would say if law firms are, you know, in
00:49:04.050 --> 00:49:10.020
the business of building stuff themselves,
I think, uh, the knowledge management people
00:49:10.080 --> 00:49:15.420
and also innovation team, they would be
in the best position to actually be that
00:49:15.420 --> 00:49:17.250
person who actually, uh, codes this stuff.
00:49:17.790 --> 00:49:19.500
Uh, because they know the requirements.
00:49:19.860 --> 00:49:21.060
They're closest to the lawyers.
00:49:21.915 --> 00:49:23.325
The lawyers, they have billable hours.
00:49:23.835 --> 00:49:28.695
Uh, so they won't take, uh, a time out of
their, uh, schedule to build these tools.
00:49:29.355 --> 00:49:33.555
Uh, uh, maybe the maintenance,
you know, uh, of these tools would
00:49:33.555 --> 00:49:36.075
be, um, the innovation team's job.
00:49:36.404 --> 00:49:39.645
Uh, also they might, uh, also need
to do a lot of security scans,
00:49:40.065 --> 00:49:42.375
uh, you know, uh, uh, for example.
00:49:43.035 --> 00:49:47.265
When we lawyers build these tools, how
do we make sure that, you know, it's not
00:49:47.265 --> 00:49:49.095
using libraries that have vulnerabilities.
00:49:49.485 --> 00:49:52.455
I think these are the kind
of questions, you know, um,
00:49:52.575 --> 00:49:54.465
challenges that need to be solved.
00:49:54.915 --> 00:49:57.075
And I, I would say these
are new challenges.
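One small, concrete piece of that puzzle might look like the sketch below, assuming a Python project that lists its dependencies in requirements.txt and the open-source pip-audit tool; a JavaScript project could use npm audit in the same spirit.

```python
# Minimal sketch: check a vibe-coded app's Python dependencies for known
# vulnerabilities before it goes anywhere near client data.
import subprocess
import sys

def audit_dependencies(requirements: str = "requirements.txt") -> bool:
    result = subprocess.run(
        ["pip-audit", "-r", requirements],
        capture_output=True,
        text=True,
    )
    print(result.stdout or result.stderr)
    # pip-audit exits non-zero when it finds known vulnerabilities.
    return result.returncode == 0

if __name__ == "__main__":
    sys.exit(0 if audit_dependencies() else 1)
```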
00:49:57.525 --> 00:50:02.955
Uh, I think that, you know, the job
of innovation team would quite, you
00:50:02.955 --> 00:50:04.605
know, change quite fundamentally from.
00:50:05.445 --> 00:50:09.885
Sourcing and learning how to use
tools to building tools themselves.
00:50:10.455 --> 00:50:15.645
I think, um, I think that is, uh, I
would say that is actually a healthy,
00:50:16.035 --> 00:50:20.985
uh, change, uh, because uh, I think
that kind of solves a lot of the
00:50:20.985 --> 00:50:26.625
frustration that, um, I think, uh,
both sides experience before, you know,
00:50:26.625 --> 00:50:29.085
um, uh, I would say, you know, uh.
00:50:29.850 --> 00:50:33.450
Because of the, the questions I, I,
I, you know, the, the problems that I,
00:50:33.720 --> 00:50:37.590
you know, raised, you know, because it
doesn't, the cycle is just too long,
00:50:37.740 --> 00:50:41.430
you know, the procurement cycle and
then some, some, some kind of demands
00:50:41.430 --> 00:50:44.880
are pretty immediate and, you know,
these might get solved eventually.
00:50:44.880 --> 00:50:48.120
So, you know, really, you know,
looking forward to that change.
00:50:48.885 --> 00:50:49.335
Yeah.
00:50:49.365 --> 00:50:51.105
No, that is, that's really good insight.
00:50:51.585 --> 00:50:54.555
Um, yeah, this has been a,
a fantastic conversation.
00:50:54.555 --> 00:50:55.695
I really appreciate you.
00:50:55.695 --> 00:50:59.865
I know it's super late, uh, on
your side of the globe, so I, I
00:50:59.865 --> 00:51:01.695
really appreciate you making time.
00:51:02.205 --> 00:51:06.555
Um, how do people find out more about,
you know, what, what you're doing?
00:51:06.555 --> 00:51:09.675
Do you have your own GitHub repo?
00:51:09.675 --> 00:51:11.025
Is it best to LinkedIn?
00:51:11.025 --> 00:51:12.375
What's, how do they find out more?
00:51:13.049 --> 00:51:15.870
So I would say, uh,
follow me on LinkedIn.
00:51:15.960 --> 00:51:19.859
Uh, I, I post regularly on, um,
on LinkedIn about, you know,
00:51:19.859 --> 00:51:21.629
everything related to AI.
00:51:21.810 --> 00:51:27.930
I, I like to share the products that I've
built, um, uh, on LinkedIn, on GitHub.
00:51:28.259 --> 00:51:33.029
Um, and also I've recently also vibe
coded my own collection of, uh,
00:51:33.029 --> 00:51:35.490
apps, so you can go in to check it out.
00:51:35.730 --> 00:51:36.450
I check them out.
00:51:36.450 --> 00:51:37.740
Uh, it's free to use.
00:51:37.799 --> 00:51:41.100
Um, so, um, I, because
I, I believe that is.
00:51:41.359 --> 00:51:47.839
Important not only to show, uh, the apps
that I I coded, but also let people just
00:51:47.839 --> 00:51:52.190
follow along and experience and actually
use them so that they can see, you know.
00:51:52.935 --> 00:51:56.145
How good or bad, you know,
these apps that are vibed are.
00:51:56.505 --> 00:52:01.545
So, uh, I think that's, um, that would
be the complete experience of, uh, vibe
00:52:01.695 --> 00:52:05.475
coding, whether you're like a, uh, someone
who's getting into it, someone just
00:52:05.565 --> 00:52:07.845
likes, you know, watching people vibe.
00:52:08.235 --> 00:52:12.315
Like, uh, you're like, you're on Twitch
seeing some, uh, watching people game.
00:52:12.915 --> 00:52:15.585
I think that I wanna provide the
experience so at least, you know,
00:52:15.585 --> 00:52:19.305
be on LinkedIn, uh, GitHub, and
also, yeah, check out my website.
00:52:20.250 --> 00:52:20.820
Awesome.
00:52:21.270 --> 00:52:21.990
Well, good stuff.
00:52:22.050 --> 00:52:23.070
Uh, thanks again, Jamie.
00:52:23.070 --> 00:52:26.940
Keep doing what you're doing because I
think it's, it's raising lots of good,
00:52:27.450 --> 00:52:31.530
it's initiating lots of good dialogue
and conversations like this where we ask
00:52:31.950 --> 00:52:34.410
hard questions like, what is the role?
00:52:34.410 --> 00:52:38.970
How is the role, the innovation
function, how is it evolving?
00:52:39.090 --> 00:52:41.130
Um, build versus buy?
00:52:41.130 --> 00:52:45.855
How does that dynamic
change, um, you know.
00:52:47.009 --> 00:52:53.160
What does the future of,
um, legal tech look like?
00:52:53.160 --> 00:52:57.480
I mean, this is really the, the
things we're doing now is really
00:52:57.660 --> 00:53:00.990
we're asking fun, these fundamental
questions, which are, uh, which
00:53:00.990 --> 00:53:02.520
I think the, the timing is right.
00:53:02.940 --> 00:53:05.250
So, um, thanks so much for joining.
00:53:05.250 --> 00:53:09.149
Keep doing what you're doing and, uh,
hopefully everybody who listens here
00:53:09.149 --> 00:53:13.259
will follow you, follow you on LinkedIn
and, um, and get to kick the tires
00:53:13.259 --> 00:53:14.065
on some of the tools you're building.
00:53:15.090 --> 00:53:15.810
Thank you so much.
00:53:15.930 --> 00:53:16.860
Thank you for inviting me.
00:53:17.760 --> 00:53:18.420
Absolutely.
00:53:18.480 --> 00:53:19.410
Alright, have a good night.
00:53:20.190 --> 00:53:20.490
You too.
00:53:21.060 --> 00:53:21.780
Alright, take care.
00:53:22.200 --> 00:53:24.480
Thanks for listening to
Legal Innovation Spotlight.
00:53:25.020 --> 00:53:28.530
If you found value in this chat, hit
the subscribe button to be notified
00:53:28.530 --> 00:53:30.000
when we release new episodes.
00:53:30.510 --> 00:53:33.180
We'd also really appreciate it if
you could take a moment to rate
00:53:33.180 --> 00:53:35.820
us and leave us a review wherever
you're listening right now.
00:53:36.390 --> 00:53:39.120
Your feedback helps us provide
you with top-notch content.
uh, other people's pain points.
00:01:21.375 --> 00:01:25.515
Uh, you know, when I was a trainee,
I, I set it off, uh, um, building
00:01:25.515 --> 00:01:28.815
some small language models when tens
flow was still around, you know, when,
00:01:28.845 --> 00:01:30.855
when, uh, TensorFlow became popular.
00:01:31.650 --> 00:01:33.030
Uh, to do some entity mapping.
00:01:33.480 --> 00:01:37.920
Um, uh, as and when GBT came out,
then I started experimenting with,
00:01:38.130 --> 00:01:42.780
uh, building AI enabled workflows,
uh, to automate some of the, uh,
00:01:42.810 --> 00:01:44.610
legal workflows that I encounter.
00:01:45.030 --> 00:01:49.860
Uh, we should let, uh, me to, you know,
uh, um, to actually learn how to code.
00:01:50.100 --> 00:01:56.070
Um, so I, I probably didn't, uh, uh, know
a lot of fundamentals, you know, because,
00:01:56.100 --> 00:01:58.710
uh, I, I, to letter from a textbook, um.
00:01:59.490 --> 00:02:03.900
I kind of learned, uh, coding along the
way, uh, as I was solving my own problems.
00:02:03.960 --> 00:02:08.460
Uh, so, uh, I would say I'm still a
non-technical person, uh, but still,
00:02:08.789 --> 00:02:13.050
uh, I've been experimenting with,
uh, building apps, um, uh, these
00:02:13.050 --> 00:02:17.100
days using a lot of what we call via
coding platforms like, uh, a loveable
00:02:18.270 --> 00:02:20.250
and, uh, of course Google AI Studio.
00:02:21.840 --> 00:02:22.050
Yeah.
00:02:22.050 --> 00:02:26.910
Where would you rank yourself
on the spectrum from zero to 10
00:02:26.970 --> 00:02:29.790
in terms of coding capabilities?
00:02:31.080 --> 00:02:34.620
Uh, I, I, I think it's,
uh, all very relative.
00:02:34.830 --> 00:02:41.010
I think, uh, compared to other lawyers,
I would say, I'm confident to say nine.
00:02:41.190 --> 00:02:47.220
Um, uh, compared to other, uh, you know,
uh, actual, uh, software engineers.
00:02:47.894 --> 00:02:52.650
On a scale one to 10, I probably on
the, you know, one, so, you know, it's
00:02:52.650 --> 00:02:56.505
a, to be honest, yeah, I, I, I don't
claim, I claim that I'm an expert
00:02:56.505 --> 00:03:01.394
here, so, uh, but, but then, uh, I have
the help of AI at which I think it's
00:03:01.635 --> 00:03:03.135
really, really proficient these days.
00:03:03.195 --> 00:03:07.305
Um, and a lot of the software
engineers actually, they rely on AI
00:03:07.305 --> 00:03:09.015
to generate a lot of codes anyway.
00:03:09.465 --> 00:03:12.584
So I would say it's actually a
really, really good equalizer.
00:03:14.280 --> 00:03:20.970
Yeah, I've got a team of about 35 of them
who, who use, uh, we're a cursor shop.
00:03:20.970 --> 00:03:25.290
We use cursor quite a bit,
and it's been amazing.
00:03:25.680 --> 00:03:27.750
Some of the efficiencies
that we've gained.
00:03:27.750 --> 00:03:31.020
We're not quote unquote vibe coding,
it's more of kind of a co-pilot
00:03:31.470 --> 00:03:37.770
paradigm, um, you know, that are
helping us, helping us generate code.
00:03:37.890 --> 00:03:40.770
And there's still a very high touch.
00:03:41.130 --> 00:03:46.020
On, you know, every, we have a pretty
structured process around code review.
00:03:46.740 --> 00:03:54.330
So before there's a, a pull request
into the, um, get repository,
00:03:54.390 --> 00:03:56.070
somebody has to evaluate.
00:03:56.070 --> 00:04:02.280
And, but with that additional
discipline, you really can create
00:04:02.340 --> 00:04:07.140
production level code and things
that are ready for prime time.
00:04:07.170 --> 00:04:08.280
I would say without it.
00:04:09.105 --> 00:04:11.805
That my observation, I'm a
former software engineer.
00:04:11.955 --> 00:04:14.085
I say former because
I'm, I'm a little rusty.
00:04:14.175 --> 00:04:19.095
Uh, I was a database guy, so I was on
the SQL team at Microsoft 25 years ago.
00:04:19.575 --> 00:04:24.435
And I'm still proficient in SQL
because, um, SQL hasn't changed much,
00:04:24.975 --> 00:04:29.385
uh, you know, all of the new front
end, you know, JavaScript libraries,
00:04:29.385 --> 00:04:35.055
and I, I'm not up to speed at all
on that, but you know, I, I know.
00:04:35.715 --> 00:04:41.474
I understand the, uh, principles of
software design, and I see some of the
00:04:41.474 --> 00:04:46.005
code that AI writes and it's a little all
over the board, inconsistent patterns.
00:04:46.575 --> 00:04:54.375
Um, documentation is, um, I, again,
I would say inconsistent, but it's
00:04:54.375 --> 00:04:58.635
how do you let, let's maybe jump
in and start off with kind of vibe
00:04:58.635 --> 00:05:01.605
coding as just definitionally.
00:05:01.605 --> 00:05:03.344
What does, what does vibe coding mean?
00:05:04.440 --> 00:05:09.390
To you and then what are its
applications for legal professionals?
00:05:10.170 --> 00:05:10.560
Yeah.
00:05:10.590 --> 00:05:14.490
So Vibe Quoing is a, a term
coined by one of the co-founders
00:05:14.520 --> 00:05:16.770
of, uh, OpenAI, uh, Andre Kauff.
00:05:17.460 --> 00:05:22.560
Um, to quote exactly what he said is,
uh, he, he said that vibe quoting is,
00:05:22.620 --> 00:05:29.250
um, a practice where you just give
into the vibes, um, uh, embrace the
00:05:29.250 --> 00:05:31.920
exponential and never look at a code.
00:05:32.520 --> 00:05:39.630
So basically, uh, it means to code
in natural language without looking
00:05:39.630 --> 00:05:46.650
at the codes and just, um, say the
features that you want to become real.
00:05:47.159 --> 00:05:55.830
So, uh, I I, and in the in lawyers
context, um, I think, uh, we can
00:05:55.830 --> 00:05:59.820
think of it, um, in, in a sense,
you know, in a context of workflows.
00:06:00.240 --> 00:06:00.870
So.
00:06:01.695 --> 00:06:09.705
We see that, uh, you know, workflow
builders, um, is not a, a, a new thing.
00:06:09.795 --> 00:06:16.695
So, um, applications like Bright, um, uh,
um, more consumer focused applications
00:06:16.695 --> 00:06:22.665
like Za NN uh, enterprise applications
like, uh, Microsoft, uh, uh, power
00:06:22.665 --> 00:06:25.035
Automate, these are all NOCO platforms.
00:06:25.455 --> 00:06:27.885
Uh, but the problem with these NOCO
platforms, they're, they're, they're
00:06:27.885 --> 00:06:29.145
really good for non-technical people.
00:06:30.180 --> 00:06:34.080
Wanna automate their workflows,
but then you kind of have, the
00:06:34.080 --> 00:06:35.520
learning curve is still pretty steep.
00:06:35.970 --> 00:06:39.360
You have to learn, you know,
what does this note do?
00:06:39.960 --> 00:06:41.705
Uh, what are the inputs,
what are the outputs?
00:06:43.125 --> 00:06:45.885
And then, uh, there are also
other limitations because
00:06:45.885 --> 00:06:47.085
it's just not flexible enough.
00:06:47.835 --> 00:06:53.685
Uh, I think now that, uh, AI's really,
really good at coding, um, there's
00:06:53.685 --> 00:06:57.435
no need for these kind of visual
builders anymore, in my opinion.
00:06:58.005 --> 00:07:01.635
Uh, because you can have AI just
generate codes, um, that are
00:07:01.635 --> 00:07:06.974
flexible enough to capture any
scenario, uh, which is why, uh.
00:07:08.325 --> 00:07:14.655
When, when, uh, when people say they
use vibe coding to build apps, uh,
00:07:14.744 --> 00:07:18.344
they're actually describing, uh, all
the features that they want to see in an
00:07:18.344 --> 00:07:25.695
app or, uh, their, uh, their workflows
they want automated and then just given,
00:07:25.965 --> 00:07:28.635
um, to AI to help them figure it out.
00:07:28.724 --> 00:07:33.914
What's the best way without
committing, um, to a certain platform.
00:07:34.425 --> 00:07:36.555
Or the architecture of that platform.
00:07:37.185 --> 00:07:42.855
Um, so, uh, it's basically like,
um, a blank canvas where AI can
00:07:42.855 --> 00:07:48.945
basically do whatever as long as
it, uh, fits, um, uh, your request.
00:07:49.275 --> 00:07:53.565
So, uh, the product becomes something
that's really personalized, uh,
00:07:54.405 --> 00:07:55.965
really tailored to your own needs.
00:07:56.895 --> 00:08:01.125
So this is, you know how I see this
different from, you know, the previous
00:08:01.125 --> 00:08:02.865
generation of visual builders.
00:08:04.680 --> 00:08:09.420
So, um, here's, uh, this is my take on
the state of vibe coating and, uh, I'm,
00:08:09.420 --> 00:08:12.840
I'm curious if you, you agree, you've,
you've, you've done more of it than I
00:08:12.840 --> 00:08:19.860
have, but where I think vibe coating
falls short is in the illities, right?
00:08:19.860 --> 00:08:24.990
The maintainability, the scalability,
um, usability, supportability,
00:08:25.830 --> 00:08:31.200
uh, scalability, um, you know,
those, those matters require.
00:08:31.965 --> 00:08:39.164
A significant amount of engineering
thought and intentionality to
00:08:39.164 --> 00:08:46.485
make all those ilities happen, um,
in a way where you can deploy an
00:08:46.485 --> 00:08:54.465
application across an enterprise
and have risk appropriately managed.
00:08:54.915 --> 00:08:59.685
Um, you know, and I would
say the ilities that really.
00:09:00.375 --> 00:09:05.655
I think jump out as being gaps today
are maintainability and scalability.
00:09:06.375 --> 00:09:15.314
Um, the, the vibe coded apps
that I've seen when cracked open
00:09:15.704 --> 00:09:20.204
and looked under the hood again,
there's a little bit of consistency.
00:09:20.265 --> 00:09:21.885
Um, security seems to be.
00:09:23.115 --> 00:09:24.465
A bit ad hoc.
00:09:25.095 --> 00:09:26.025
Um, I don't know.
00:09:26.025 --> 00:09:27.555
What is your, what is your take on that?
00:09:27.555 --> 00:09:29.925
You've done more of it than me, so
I'm curious to get your thoughts.
00:09:31.215 --> 00:09:34.005
So I would say yes, for sure.
00:09:34.005 --> 00:09:37.275
There are a lot of, um, shortcomings
when it comes to by coding.
00:09:37.935 --> 00:09:41.025
Um, I would say, uh, six.
00:09:41.025 --> 00:09:43.064
Scalability, you know, it's, um.
00:09:43.920 --> 00:09:47.939
If you end up five coding something
that a lot of people like and you
00:09:47.939 --> 00:09:51.570
wanna share it with your colleagues,
then the question becomes, you know,
00:09:51.570 --> 00:09:52.830
how do you maintain a database?
00:09:53.280 --> 00:09:58.620
Um, how would you, uh,
um, uh, monitor usage?
00:09:59.189 --> 00:10:01.950
How do you, uh, manage
lock-in credentials?
00:10:02.550 --> 00:10:06.689
Uh, how do you, um, you know, uh,
structure the backend so that, you know,
00:10:06.689 --> 00:10:08.850
you can basically, uh, have memory.
00:10:08.970 --> 00:10:12.210
So these kind of things,
it's, it's currently hard.
00:10:12.300 --> 00:10:12.630
Um.
00:10:13.949 --> 00:10:14.459
To achieve.
00:10:15.089 --> 00:10:18.720
But then we're getting there because
we see that Replic lovable, and a
00:10:18.720 --> 00:10:22.410
lot of these five coding platforms,
they have kind of these blueprints.
00:10:22.829 --> 00:10:28.020
Um, so blueprints, meaning that,
um, uh, if you want to add in
00:10:28.140 --> 00:10:32.220
like a lock in credentials,
then it's just one click away.
00:10:32.310 --> 00:10:35.490
They have some templates that
you can click and then that agent
00:10:35.520 --> 00:10:37.110
will go in and build that for you.
00:10:38.130 --> 00:10:41.520
Um, so that helps, uh, to, you
know, to mitigate, uh, some of
00:10:41.520 --> 00:10:43.320
that, uh, scalability issues.
00:10:43.860 --> 00:10:50.040
Uh, but then, you know, um, I would
say, uh, when people five code apps,
00:10:50.520 --> 00:10:54.000
uh, some, you know, they, their
intention might not be to build
00:10:54.000 --> 00:10:55.890
something that a lot of people want.
00:10:56.430 --> 00:11:00.060
Maybe they just wanna build something
that they can use themselves
00:11:00.240 --> 00:11:02.555
locally on, uh, on their own laptop.
00:11:03.735 --> 00:11:09.194
So in that sense, uh, probably they don't
need to think about scalability that much.
00:11:09.795 --> 00:11:14.715
Uh, if they're happy to just,
just, um, run it locally, um, uh,
00:11:14.745 --> 00:11:19.185
and, uh, they could even iterate
on it, um, to make it even more
00:11:19.215 --> 00:11:22.005
specialized, uh, uh, more personalized.
00:11:22.215 --> 00:11:27.165
Um, so in, in that sense, you know,
um, uh, scalability becomes test of
00:11:27.165 --> 00:11:29.925
NSG, uh, in terms of maintenance.
00:11:30.045 --> 00:11:30.525
Um.
00:11:31.469 --> 00:11:33.089
Yes, you need to host it somewhere, right?
00:11:33.089 --> 00:11:36.839
Um, if, if you want, of course to
share it, uh, with someone else.
00:11:37.199 --> 00:11:40.469
Uh, if you don't want to have to
initiate it every time, uh, then
00:11:40.469 --> 00:11:43.079
you have to host it somewhere,
uh, hosting becomes an issue.
00:11:43.199 --> 00:11:44.880
Do you host it in your own private cloud?
00:11:45.390 --> 00:11:46.005
Do you host it locally?
00:11:46.555 --> 00:11:49.074
To, uh, host it in public cloud.
00:11:49.314 --> 00:11:53.635
You know, these also becomes a,
a, a, another, uh, another issue.
00:11:53.875 --> 00:11:58.464
Uh, so deployment is also something that
a lot of the five point platforms, they're
00:11:58.464 --> 00:12:04.015
trying to build on that capability, you
know, to deploy in a, in an environment
00:12:04.464 --> 00:12:08.484
that in your, in your, in, in your
own environment or in your favorite
00:12:08.484 --> 00:12:12.594
environment, whether it's, um, a public
cloud, whether it's private cloud.
00:12:13.320 --> 00:12:16.050
You know, I think that some, some of
these platforms allow you to choose.
00:12:16.740 --> 00:12:21.840
Um, so, uh, I would say for sure,
uh, uh, uh, these are the issues.
00:12:22.110 --> 00:12:28.110
But I see that the trend is that
these problems will get solved by,
00:12:28.170 --> 00:12:30.420
you know, uh, by, by these companies.
00:12:30.960 --> 00:12:33.450
Um, because the demand is here.
00:12:33.960 --> 00:12:37.980
Um, so because a lot of people
want to build apps this way.
00:12:39.030 --> 00:12:43.320
I think, uh, the sensible business
decision is actually to solve it at the
00:12:43.320 --> 00:12:48.270
platform level, um, so that you know,
these people, they don't have to go
00:12:48.270 --> 00:12:49.770
on to solve these problems themselves.
00:12:50.220 --> 00:12:55.110
So six months ago we would say that
Vibe, coding apps just don't work.
00:12:55.230 --> 00:12:56.010
Bug are anywhere.
00:12:56.520 --> 00:13:00.689
As the models become better, as the five
point platforms have more integration,
00:13:01.110 --> 00:13:04.950
then we are seeing more and more of these
five point apps being used in production.
00:13:05.820 --> 00:13:08.760
And six months from now, we
can't really, can't tell.
00:13:08.820 --> 00:13:13.290
You know, we, we can't really, uh, uh,
uh, tell if it's, um, uh, it would be
00:13:13.290 --> 00:13:15.060
used in, you know, production on mass.
00:13:15.690 --> 00:13:19.965
Another thing is that, um, there's also
the transient nature of Whiteboarded apps.
00:13:21.030 --> 00:13:23.520
I'm, uh, I wanna share, uh, examples.
00:13:23.520 --> 00:13:29.430
So, um, uh, a month ago, there's, there
was a tragic, tragic fire in Hong Kong,
00:13:29.490 --> 00:13:31.890
uh, that claimed, uh, a lot of lives.
00:13:32.250 --> 00:13:38.400
And, uh, to aid the rescue, uh, a student
in the Hong Kong University of Ology.
00:13:38.940 --> 00:13:43.590
Uh, he, I quoted an app, uh, that
allows, uh, uh, different people to
00:13:43.590 --> 00:13:49.205
actually flag reports, you know, so,
um, for example, uh, uh, users can.
00:13:49.824 --> 00:13:56.275
Uh, uh, can, can flag, you know,
tower six, uh, um, uh, flat at 29.
00:13:56.755 --> 00:13:59.755
Uh, uh, there, there's some people
there that needs to be rescued.
00:14:00.084 --> 00:14:03.775
And that became something that, you
know, uh, uh, that, that became,
00:14:04.285 --> 00:14:10.375
uh, you know, the, you know, the app
for rescue that day, but then the
00:14:10.375 --> 00:14:12.084
other day, will it still be used?
00:14:12.685 --> 00:14:13.165
No.
00:14:13.165 --> 00:14:19.344
So it's, uh, it's only, uh, an app that
solves a particular demand just in time.
00:14:20.205 --> 00:14:23.895
So in that sense, would
maintenance become an issue?
00:14:24.105 --> 00:14:30.165
Not so much so, um, of course that
student made a heroic, uh, you know, uh,
00:14:30.615 --> 00:14:32.505
decision to make, uh, the Vibe Code app.
00:14:33.015 --> 00:14:37.665
Um, so, so, uh, you know, uh, you know,
I think he deserves a lot of respect.
00:14:37.725 --> 00:14:41.235
Uh, and, and I, and, and, uh, on
that, I think on that note, I would
00:14:41.235 --> 00:14:45.525
say that, you know, maintenance
becomes a lesser of issue if the
00:14:45.525 --> 00:14:46.905
app is just transient in nature.
00:14:47.595 --> 00:14:51.105
It's just to meet a very
urgent demand just in time.
00:14:52.155 --> 00:14:52.545
Yeah.
00:14:52.845 --> 00:14:57.675
And I think that, and, and I'm definitely
not throwing shade to vibe coding.
00:14:57.705 --> 00:15:04.695
I think that it, it, where I think it
is absolutely game changing is in the
00:15:04.695 --> 00:15:07.095
prototyping phase of software development.
00:15:07.305 --> 00:15:10.755
Like if you think about a traditional
software development lifecycle,
00:15:11.265 --> 00:15:13.215
you know, especially something is.
00:15:13.949 --> 00:15:18.089
Rigid and inflexible is like
a waterfall methodology.
00:15:18.689 --> 00:15:26.280
You know, where you gather requirements
that are built into, um, you know,
00:15:26.310 --> 00:15:28.860
God, we haven't done waterfall in
so many years now, but like high
00:15:28.860 --> 00:15:34.620
and low level designs and a software
specification, um, requirements, and
00:15:34.620 --> 00:15:37.350
then an architecture diagram and then.
00:15:38.055 --> 00:15:43.245
Wire frames and that it's just like
you can compress all of that into
00:15:43.335 --> 00:15:47.565
and where the business users, because
in that traditional kind of software
00:15:47.565 --> 00:15:52.845
methodology, the end users communicate
to business analysts what it is that
00:15:52.845 --> 00:15:57.795
they want, U usually verbally and,
and they kind of transcribe it in
00:15:57.795 --> 00:16:02.985
words and there's so much ambiguity in
those words that has to be interpreted
00:16:03.435 --> 00:16:04.875
and like context really matters.
00:16:04.875 --> 00:16:10.815
If you can give an end user a. The
ability to vibe, code, what it is that
00:16:10.815 --> 00:16:14.805
they want, and then hand that off to
an engineering team to to, to build
00:16:15.255 --> 00:16:21.195
properly and integrate it into the bigger
environment and have hooks into existing
00:16:21.195 --> 00:16:28.545
infrastructure and adopt patterns and
practices that are, you know, blessed by
00:16:28.545 --> 00:16:31.965
the powers that be within that enterprise.
00:16:31.965 --> 00:16:34.995
I mean, that is absolutely game
changing because that's where.
00:16:36.015 --> 00:16:38.325
That's where most of software
development goes wrong.
00:16:38.715 --> 00:16:42.315
It's in the, from the time
that the business communicates
00:16:42.315 --> 00:16:43.455
what it is that they want.
00:16:44.085 --> 00:16:47.145
And you know, again, in a waterfall
methodology, they may not see
00:16:47.145 --> 00:16:49.605
something for six months to react to.
00:16:49.605 --> 00:16:54.075
Agile has sped that up significantly,
but if you go look in the enterprise,
00:16:54.465 --> 00:16:58.425
I'll tell you right now, most
especially regulated industries,
00:16:58.425 --> 00:17:00.165
most big companies are still doing.
00:17:00.615 --> 00:17:01.995
I call it water scrum.
00:17:02.445 --> 00:17:08.115
Um, it's like waterfall and, uh,
scrum hybrid, but there's still a
00:17:08.115 --> 00:17:16.395
very large gap between user and user
and when they finally see something.
00:17:16.395 --> 00:17:19.185
So I, I think it holds
a great promise there.
00:17:20.085 --> 00:17:21.645
Mm, for sure, for sure.
00:17:22.185 --> 00:17:26.595
And I think that, um, to your
point, I think, uh, these
00:17:26.595 --> 00:17:27.704
kind of design principles.
00:17:28.230 --> 00:17:32.910
They're still relevant, you
know, uh, in all, uh, in all
00:17:32.910 --> 00:17:35.010
disciplines of engineering.
00:17:35.280 --> 00:17:40.980
Um, and, and I think that is also why,
uh, a lot of these five putting platforms,
00:17:40.980 --> 00:17:44.970
they have a plan mode because they
know that a lot of stuff can go wrong.
00:17:45.390 --> 00:17:49.620
So when, uh, that, uh, the plan
mode means that if a user, uh,
00:17:49.680 --> 00:17:51.990
activates it and, uh, uh, maybe.
00:17:52.875 --> 00:17:57.885
Put in a feature request, then the
agent will go into the entire code base.
00:17:58.725 --> 00:18:03.885
Think first, you know, uh, which,
um, file do we need to amend?
00:18:04.365 --> 00:18:07.305
Um, what, uh, libraries
do we need to import?
00:18:07.905 --> 00:18:14.415
Um, so I think this, and this is an
example of AI adopting, you know,
00:18:14.415 --> 00:18:16.545
best principles of engineering.
00:18:17.400 --> 00:18:19.950
To make vibe coding more
viable in that regard.
00:18:20.520 --> 00:18:23.250
Um, and I think that it's, uh,
it's, and, and the planning
00:18:23.250 --> 00:18:25.770
feature is actually pretty decent.
00:18:26.195 --> 00:18:31.020
Uh, uh, uh, you know, uh, for clock
code, um, you know, uh, if you turn
00:18:31.020 --> 00:18:35.340
on planning mode on, on clock code,
um, it's such as everything and then
00:18:35.340 --> 00:18:36.900
come back with a to-do list, right?
00:18:37.800 --> 00:18:42.810
And then the to-do list, uh, uh, be,
there are like six items on that list.
00:18:43.230 --> 00:18:45.000
And, uh, and the agent will.
00:18:45.690 --> 00:18:49.770
When, when they execute the plan, um,
then they would cross out, you know,
00:18:49.860 --> 00:18:54.450
item one is finished, item three is
finished, and then go back to verify
00:18:54.810 --> 00:18:56.280
if the plan is actually executed.
00:18:56.700 --> 00:19:00.540
So I think these are gonna, uh,
the embodiment of some of the best
00:19:00.540 --> 00:19:05.280
practices that, uh, we have been
adopting for the past few decades.
00:19:05.280 --> 00:19:05.490
Right.
00:19:06.240 --> 00:19:08.520
Um, and I think that, you know, uh.
00:19:08.700 --> 00:19:11.280
I think experience also matters.
00:19:11.430 --> 00:19:14.970
So I, I don't claim to be an
experienced by codes, but I
00:19:14.970 --> 00:19:16.440
think by codes is a new breed.
00:19:16.440 --> 00:19:21.300
So maybe a, a few more, you know, some I'm
more experienced than, uh, other people.
00:19:21.600 --> 00:19:25.080
Uh, so I would say, uh, you
also learn from your mistakes.
00:19:25.380 --> 00:19:30.660
You also learn that you can't really
go headfirst and that then expect the
00:19:30.660 --> 00:19:32.580
agent to do everything in one goal.
00:19:33.060 --> 00:19:37.140
You kind of know the
limits, um, of the agent.
00:19:39.044 --> 00:19:40.455
Try to do things in faces.
00:19:40.875 --> 00:19:45.585
So I would say, uh, the, you know,
these things eventually, uh, you know,
00:19:45.915 --> 00:19:52.365
converge, you know, uh, five is learn
how to, um, uh, build apps the right way.
00:19:52.905 --> 00:19:55.905
And then engineers, they learn
how to embrace the fight.
00:19:56.415 --> 00:20:00.645
So I, I like to see that, you
know, uh, um, back to your point
00:20:00.645 --> 00:20:02.264
about, you know, whether, um.
00:20:03.345 --> 00:20:10.095
Uh, uh, you know, uh, I think, I think
there the, the, the, uh, distinction
00:20:10.575 --> 00:20:14.475
between technical and non-technical
people, I think it's getting a little bit
00:20:14.475 --> 00:20:16.305
more blurry that, that lie between them.
00:20:17.145 --> 00:20:19.965
I see that product managers
become more technical.
00:20:20.385 --> 00:20:23.775
I see that engineers become
more interested in design.
00:20:24.435 --> 00:20:28.605
So, uh, I, uh, so I, I would say that
line becomes a little bit more blurry.
00:20:29.939 --> 00:20:30.270
Yeah.
00:20:30.270 --> 00:20:34.379
And you know, speaking of product
management, we use, so Info Dash, we're
00:20:34.379 --> 00:20:39.600
a legal intranet and extranet platform,
and we have a part of our architecture
00:20:39.600 --> 00:20:41.490
is a layer called the Integration Hub.
00:20:42.060 --> 00:20:46.800
And in the integration hub reaches
into all the back office systems in
00:20:46.800 --> 00:20:51.689
a law firm like Practice Management,
document Management, C-R-M-H-R-I-S,
00:20:51.689 --> 00:20:53.310
experience management, and then.
00:20:54.270 --> 00:20:57.899
Presents a unified API that's
security trimmed and that
00:20:57.899 --> 00:20:59.370
respects ethical wall boundaries.
00:20:59.879 --> 00:21:05.909
And we built that to hydrate our web
parts that users build experiences with.
00:21:06.360 --> 00:21:09.780
And this was all kind
of pre ai, pre-chat GPT.
00:21:10.260 --> 00:21:16.200
But after, you know, after chat, GPT
and Azure Open AI and Azure AI search
00:21:16.830 --> 00:21:21.240
as a. Before we had a formal product
manager in place, which we do now.
00:21:21.270 --> 00:21:25.770
It was kind of me, like I was, I was
brainstorming new ways, new paths
00:21:25.770 --> 00:21:27.060
that we could take with a product.
00:21:27.510 --> 00:21:35.010
So I used AI to, I described in
great detail all of the pieces of
00:21:35.010 --> 00:21:40.260
infrastructure integrations and then
all of the tools that Microsoft enables.
00:21:40.260 --> 00:21:42.660
'cause we deploy our solution
in the client's tenant.
00:21:42.780 --> 00:21:46.590
And I had it brainstorm
ideas like what use cases.
00:21:47.595 --> 00:21:50.895
You know, within a law firm
could this architecture enable?
00:21:51.195 --> 00:21:55.815
And oh my God, it gave
us four amazing ones.
00:21:55.815 --> 00:22:01.065
One of 'em, I'm, we're meeting on Monday,
so today is January 2nd, January 5th
00:22:01.065 --> 00:22:02.955
with a law firm to actually build it out.
00:22:03.765 --> 00:22:09.615
So it's like, you know, the ability to
have like a thought partner who can,
00:22:09.705 --> 00:22:13.725
you can describe, okay, look, this is
all the, this is the infrastructure.
00:22:14.399 --> 00:22:15.930
What can we do with it?
00:22:16.170 --> 00:22:16.379
Right?
00:22:16.379 --> 00:22:18.960
Give me ideas and, um, wow.
00:22:18.960 --> 00:22:20.550
Man, it knocked the cover off the ball.
00:22:20.550 --> 00:22:21.690
It did a really good job.
00:22:22.649 --> 00:22:23.520
I can imagine.
00:22:23.610 --> 00:22:24.389
I can imagine.
00:22:24.659 --> 00:22:31.409
Um, I, I think that, uh, sometimes I
like to, you know, so I think that that
00:22:31.409 --> 00:22:36.510
also goes to the planning part, as in,
um, you're gonna have to, I think, uh,
00:22:36.600 --> 00:22:40.139
from idea, uh, ideation to production.
00:22:40.590 --> 00:22:43.590
I think right now, because AI has, uh.
00:22:45.030 --> 00:22:51.000
The time from idea to execution, it means
that we can spend more time, uh, thinking
00:22:51.000 --> 00:22:57.570
about ideas and, uh, and, and, and
getting, you know, a, a more, uh, uh, you
00:22:57.570 --> 00:23:02.550
know, spending more time actually thinking
about the idea, thinking about, you know,
00:23:02.550 --> 00:23:05.100
the actual features that, that you want.
00:23:05.639 --> 00:23:10.230
And in that process, I think AI
is also a really, really good, um.
00:23:10.605 --> 00:23:15.015
Tools to help you discover, you know,
things that you weren't even aware before.
00:23:15.165 --> 00:23:20.235
Do a lot of deep research, for example,
uh, to think about, you know, the, the
00:23:20.235 --> 00:23:24.945
tech stack, you know, what kind of, uh,
um, dependencies would you be using?
00:23:25.245 --> 00:23:26.055
What limits?
00:23:26.475 --> 00:23:32.505
I think, you know, uh, I think having,
you know, uh, said thought partner
00:23:33.315 --> 00:23:36.495
really, uh, helps you, you know.
00:23:38.775 --> 00:23:40.395
Think better ideas, so to speak.
00:23:40.455 --> 00:23:45.675
Um, so 'cause, because, you know, uh,
although, you know, it also goes to the,
00:23:46.125 --> 00:23:52.755
um, uh, ancy of these AI tools, you know,
you might have a, you know, bad idea.
00:23:52.755 --> 00:23:56.265
But then after discussing with ai,
I still, things like you, good idea.
00:23:56.265 --> 00:23:57.315
You, you know, way to go.
00:23:57.315 --> 00:24:00.375
But then when you execute it, it's
just not what you know, people want.
00:24:00.585 --> 00:24:03.435
So we need to be careful of
that, uh, uh, about that as well.
00:24:04.470 --> 00:24:07.200
I think a healthy dose of
skepticism is, is in order.
00:24:07.590 --> 00:24:10.530
I, uh, I did a neat little
experiment last night.
00:24:10.890 --> 00:24:13.170
I took my first grade report card.
00:24:13.410 --> 00:24:15.240
My, I mean, I'm 53 years old.
00:24:15.240 --> 00:24:16.410
This was, yeah.
00:24:16.740 --> 00:24:20.790
And I uploaded it in AI and
said, what would be the career?
00:24:20.790 --> 00:24:24.065
I, I blurred, I redacted my
name and said what would be.
00:24:24.895 --> 00:24:26.754
A likely career path for this person.
00:24:27.445 --> 00:24:30.115
And it, I'd used it in all four models.
00:24:30.145 --> 00:24:33.669
Claude, GPT, chat, GPT, rock and Gemini.
00:24:33.970 --> 00:24:34.389
Mm-hmm.
00:24:34.475 --> 00:24:41.274
And Claude and Chat, GPT kind of
cheated, um, and used memory and
00:24:42.115 --> 00:24:44.514
because they knew that what I do.
00:24:44.965 --> 00:24:45.054
Mm-hmm.
00:24:45.294 --> 00:24:52.345
But then I put him in incognito mode and
they still did really, really well, but I.
00:24:53.250 --> 00:24:57.180
I approached that with a, a
healthy dose of skepticism.
00:24:57.180 --> 00:24:59.820
When I saw how accurate the results
were, I was like, wait a second.
00:25:00.510 --> 00:25:02.430
This is, this is too accurate.
00:25:02.430 --> 00:25:03.030
You're cheating.
00:25:03.030 --> 00:25:07.710
And so I asked, I was like, how
did memory influence your output?
00:25:07.770 --> 00:25:10.470
And Claude was like, you got me.
00:25:14.135 --> 00:25:17.790
Uh, which is, so I think that anytime
you're interacting with these models,
00:25:17.790 --> 00:25:19.590
having a healthy dose of skepticism.
00:25:20.340 --> 00:25:24.750
Again, because of the, uh,
sycophantic tendencies.
00:25:24.870 --> 00:25:27.090
Um, and they're, they're deceptive.
00:25:27.750 --> 00:25:30.750
They are, these models can be deceptive.
00:25:30.750 --> 00:25:34.470
I think they're working, especially
Anthropic has been very transparent
00:25:34.470 --> 00:25:39.690
about how, incredibly deceptive to
the point where, I don't know if you
00:25:39.690 --> 00:25:45.510
remember, they tested a model where
they were gonna shut it down and
00:25:45.570 --> 00:25:49.170
this fake CEO was having an affair.
00:25:50.220 --> 00:25:58.530
With a subordinate, and Claude threatened
to expose him if they shut the model down.
00:25:58.860 --> 00:26:01.560
And it's just like, wow,
that's just mind blowing.
00:26:01.560 --> 00:26:03.000
So I, you gotta be careful, man.
00:26:03.000 --> 00:26:06.450
These, these, these models are
crafty just like people are.
00:26:07.980 --> 00:26:08.460
Exactly.
00:26:08.730 --> 00:26:14.640
Um, I, you know, especially when they're
very, very confident when they're wrong.
00:26:15.240 --> 00:26:20.850
Uh, it, it's, I think that, um, uh,
someone said, you know, it's, um,
00:26:21.900 --> 00:26:29.070
uh, before AI you can kind of tell
pretty quickly if something, you know,
00:26:29.310 --> 00:26:32.820
you're reading something and something
seems off, you know, immediately tell
00:26:32.820 --> 00:26:36.090
that, you know, that, that that's,
that's, that's just, um, inaccurate.
00:26:36.660 --> 00:26:43.770
Uh, but now with ai it's really
hard, uh, because it can seem very.
00:26:44.295 --> 00:26:44.955
Consistent.
00:26:45.070 --> 00:26:45.740
Consistent.
00:26:45.745 --> 00:26:47.025
So it's very logical.
00:26:47.535 --> 00:26:54.045
Um, I think it goes to how, you know,
the underlying architecture of lms
00:26:54.465 --> 00:26:56.265
uh, being pattern recognition models.
00:26:56.835 --> 00:27:00.075
I mean, you know, some, someone
will, you know, will tell me that
00:27:00.075 --> 00:27:01.245
it's not pattern recognition.
00:27:01.275 --> 00:27:05.895
But then I think precisely because
it's really, really good at writing.
00:27:07.245 --> 00:27:11.925
It's, uh, they, they can put
words together in a way that is
00:27:11.925 --> 00:27:13.425
really good at convincing people.
00:27:13.425 --> 00:27:17.835
I think there is a study where, you
know, um, people, uh, that the, the
00:27:17.835 --> 00:27:25.065
researchers compare, uh, um, uh, LMS and
humans on, uh, the task of convincing,
00:27:25.305 --> 00:27:29.955
you know, convincing, you know, so,
uh, uh, uh, so I think the experiments
00:27:29.955 --> 00:27:33.645
is that, um, some humans actually
taking multiple choice questions.
00:27:34.034 --> 00:27:39.705
And then, uh, uh, one group would be the
LM that, you know, convinced the humans
00:27:39.705 --> 00:27:45.465
to take another answer, and the other
group would be another human, um, asking
00:27:45.554 --> 00:27:49.574
the human to reconsider and convince
them to, you know, choose another answer.
00:27:50.264 --> 00:27:54.855
The LMS are really good at both
convincing those humans to pick,
00:27:55.304 --> 00:28:00.524
um, the right answer and the wrong
answer, so it can go bo, go both ways.
00:28:01.095 --> 00:28:03.675
So, whereas for humans, it's
just there to just not, that's
00:28:03.675 --> 00:28:05.565
just bad at convincing overall.
00:28:06.045 --> 00:28:11.205
So, um, I think that, you know, that's,
uh, definitely, uh, says something.
00:28:11.925 --> 00:28:12.315
Yeah.
00:28:12.795 --> 00:28:17.625
Well, let's talk a little bit about,
uh, spell page and your legal AI os.
00:28:18.105 --> 00:28:22.515
Um, tell us a little bit about, I, I guess
are the, are those two separate apps?
00:28:23.475 --> 00:28:23.835
Hmm.
00:28:23.985 --> 00:28:24.435
Uh, okay.
00:28:24.435 --> 00:28:28.965
Maybe, uh, just to give, uh, the
listeners a, a little bit of background.
00:28:28.965 --> 00:28:30.555
So, uh.
00:28:31.395 --> 00:28:35.775
The reason, you know why I think I'm
on this podcast is because I've been
00:28:35.865 --> 00:28:41.445
five coding, uh, uh, a lot of apps
that, um, are lightweight versions
00:28:41.625 --> 00:28:46.305
of some of the, uh, most popular, uh,
legal, ai, uh, products out there.
00:28:46.514 --> 00:28:46.905
Um.
00:28:47.760 --> 00:28:50.700
Uh, one of them, uh, being
spell book, which is like a,
00:28:51.270 --> 00:28:54.120
um, an AI powered word editor.
00:28:54.420 --> 00:28:58.110
So, um, I think they position
themselves as cursor for
00:28:58.110 --> 00:28:59.940
word, uh, cursor for lawyers.
00:29:00.300 --> 00:29:04.830
So instead of, uh, an AI agent that
lives within the developer's, IDE, it's
00:29:04.830 --> 00:29:09.810
basically an agent that lives within,
uh, Microsoft Word that helps, uh, you
00:29:09.810 --> 00:29:11.760
know, helps with drafting, for example.
00:29:12.780 --> 00:29:17.790
Um, the reason why I decided to vibe
code, uh, these, you know, lightweight
00:29:18.030 --> 00:29:22.980
clones is because I, Gemini Free came
out, and then I started, you know,
00:29:23.460 --> 00:29:28.590
uh, seeing a lot of people sharing,
uh, their, the mini apps that they've
00:29:28.590 --> 00:29:30.690
built on the, on social media.
00:29:31.230 --> 00:29:36.030
Um, and as you know, via coding,
you know, uh, uh, I think this is
00:29:36.030 --> 00:29:37.980
also very common practice actually.
00:29:37.980 --> 00:29:38.310
Uh.
00:29:39.090 --> 00:29:44.400
For people to test out the mo these models
by recreating, you know, these uh, apps.
00:29:44.460 --> 00:29:47.430
You know, I think someone
tried to clone Windows 97.
00:29:48.180 --> 00:29:49.710
I think someone tried to clone.
00:29:50.129 --> 00:29:55.590
Uh, lovable, try to clone a, a, a clo
from scratch, you know, just to test,
00:29:55.590 --> 00:29:56.760
you know, how good these models are.
00:29:56.760 --> 00:30:01.565
So I, I, I, you know, I, I, being
a lawyer, um, I, I, I, I, I,
00:30:01.565 --> 00:30:03.389
I try to do something similar.
00:30:03.450 --> 00:30:08.820
So I try to say, well, well, how well
does, would it, you know, um, clone
00:30:08.850 --> 00:30:10.560
some of these AI legal AI tools?
00:30:10.919 --> 00:30:14.940
Um, actually at first I
didn't, I, I, I wasn't.
00:30:16.125 --> 00:30:21.225
Uh, thinking about cloning spell
book, I, my, you know, my initial
00:30:21.225 --> 00:30:26.055
thought was that, uh, I wanted to,
you know, to, to see, you know, I, I
00:30:26.055 --> 00:30:32.055
came across this, uh, novel writing
app, pseudo Write, and I'm fascinated
00:30:32.055 --> 00:30:37.725
about, you know, how, how they are, uh,
uh, allowing users to basically, uh.
00:30:38.294 --> 00:30:42.405
Write the next paragraph and then
awful, you know, um, that, uh, you know,
00:30:42.794 --> 00:30:44.600
engineer plot twist here and so on.
00:30:44.679 --> 00:30:49.635
I, I imagine, you know, how, you know,
would that work for, uh, lawyers?
00:30:49.995 --> 00:30:54.584
Would that work for, you know, people who,
you know, need a, just a simple contract?
00:30:54.854 --> 00:31:01.304
So I had this idea that I put this into,
uh, Gemini free on Google AI Studio.
00:31:01.574 --> 00:31:06.735
Then it, it came up with something
that is, you know, that's, I would say.
00:31:07.785 --> 00:31:10.995
I wouldn't say that immediately
usable, but it's impressive.
00:31:11.175 --> 00:31:16.305
You know, while I, you can basically,
uh, come up with a, a decent surface
00:31:16.305 --> 00:31:20.895
agreement and I can continue iterating
on it by, uh, giving it instructions,
00:31:20.925 --> 00:31:22.005
giving the air instructions.
00:31:22.515 --> 00:31:27.750
So I think at, at some point
I. Uh, just ask, uh, Gemini,
00:31:28.230 --> 00:31:30.300
can we go full on Asian mode?
00:31:30.810 --> 00:31:34.470
And by Asian mode I mean that, you know,
can you, every time I ask you to do
00:31:34.470 --> 00:31:38.639
something, can you spin up a to-do list
and then iterate through that todo, uh,
00:31:38.700 --> 00:31:42.540
uh, iterate through that to-do list and
then complete the whole task that way.
00:31:43.560 --> 00:31:45.990
So, um, it, it, it did that.
00:31:45.990 --> 00:31:49.710
So, you know, I kind of shared my, um.
00:31:50.295 --> 00:31:52.935
A demo of that many app, uh, on LinkedIn.
00:31:52.965 --> 00:31:55.305
I think that, that, you know,
that picked up a lot of interest.
00:31:55.845 --> 00:31:58.905
Um, so, so that's spell page.
00:31:59.115 --> 00:31:59.355
Yeah.
00:31:59.355 --> 00:32:03.945
And, and, uh, of course I also
did a, another app that kind
00:32:03.945 --> 00:32:08.775
of, uh, um, I would say kind of
mimics the tablet review feature.
00:32:09.585 --> 00:32:11.445
Uh, that's offered by Harvey and Agora.
00:32:11.955 --> 00:32:17.355
Uh, so, um, that was also
built on Google AI Studio, um,
00:32:17.685 --> 00:32:20.115
surprisingly in just an afternoon.
00:32:20.475 --> 00:32:23.265
So it's, uh, it's really interesting.
00:32:23.325 --> 00:32:28.545
Um, so I think these kind of two,
two experiments, um, made me think,
00:32:28.965 --> 00:32:34.665
you know, um, Chris, first, you know,
how amazing these frontier models
00:32:34.665 --> 00:32:38.445
are, and a second thing is weather.
00:32:39.540 --> 00:32:45.540
Um, lawyers can build their
own, uh, legal AI tools.
00:32:46.140 --> 00:32:53.825
Um, the reason for that is because, um, I
see that, uh, the industry for the past.
00:32:55.245 --> 00:33:00.465
In a few decades, they have been relying
on, uh, a lot on vendors for sure.
00:33:01.155 --> 00:33:06.135
Uh, I think, you know, for the past
year, I think there has been more
00:33:06.135 --> 00:33:11.295
and more, um, talks on whether law
firms or lawyers should build their
00:33:11.295 --> 00:33:13.125
own, build out their own tech stack.
00:33:14.280 --> 00:33:21.675
And the reason for that is because I
think, uh, if AI becomes more and more.
00:33:22.890 --> 00:33:25.230
You know, AI becomes a utility, let's say.
00:33:25.830 --> 00:33:31.440
Then, uh, you need to differentiate
yourself from other law firms,
00:33:32.190 --> 00:33:36.300
and you probably would be hard to
differentiate yourself from other law
00:33:36.300 --> 00:33:38.010
firms if you're using the same product.
00:33:38.790 --> 00:33:43.080
So maybe the edge comes from you
actually building your own stuff,
00:33:43.590 --> 00:33:47.190
uh, and, you know, tailoring it
to your own use case and workflow.
00:33:47.400 --> 00:33:48.960
So, um.
00:33:49.290 --> 00:33:52.590
You know, this idea came to my mind
and then I, I, you know, I kind of
00:33:52.620 --> 00:33:56.639
thought, you know, if a thousand
lawyers started to, uh, build their
00:33:56.639 --> 00:34:00.690
own tools, then there should be some
commonalities among these apps, right?
00:34:01.770 --> 00:34:10.139
So, you know, and why wouldn't there
be, uh, some open source project
00:34:10.375 --> 00:34:17.670
or infras at the infrastructure
level that kind of, um, uh, um.
00:34:18.375 --> 00:34:22.065
Identify, you know, these kind of
dependencies, I think, and, and
00:34:22.065 --> 00:34:26.505
kind of, um, uh, build it in a way
that allows lawyers to, uh, you know,
00:34:26.505 --> 00:34:29.804
build their own tools on top of
these, uh, this, this layer, right?
00:34:29.924 --> 00:34:36.255
Um, and, and I would say, uh, the reason
why I think that makes sense is because,
00:34:37.304 --> 00:34:42.685
uh, lawyers are actually good at, you
know, uh, coming together and building.
00:34:43.350 --> 00:34:43.950
Standards.
00:34:44.250 --> 00:34:49.440
So, um, if we look at the NVCA documents,
which is like the, uh, venture Capital
00:34:49.440 --> 00:34:53.520
Association, uh, template documents
for fundraising, you know, that is,
00:34:53.580 --> 00:34:57.540
you know, a collaborative effort
from different lawyers, uh, same
00:34:57.540 --> 00:34:59.490
for ISDA agreements and so on.
00:34:59.940 --> 00:35:05.190
I think, uh, that happens if something
becomes more and more commoditized and
00:35:05.310 --> 00:35:07.590
so, so standards are built that way.
00:35:08.175 --> 00:35:15.105
If we take the view that, um, a lot
of the, um, a lot of our know-how and
00:35:15.105 --> 00:35:22.665
templates would be converted into AI
workflows or AI apps, then wouldn't we
00:35:22.815 --> 00:35:29.085
be able to, you know, also make certain
AI features or AI experiences standard?
00:35:29.115 --> 00:35:32.265
So, you know, which is why
it, it kind of got me thinking
00:35:32.265 --> 00:35:35.805
whether, um, there is, uh, a.
00:35:36.765 --> 00:35:41.205
Need for an open source project
at the infrastructure level.
00:35:42.674 --> 00:35:46.365
So, well, you know what's interesting
is in InfoDash we use our integration
00:35:46.365 --> 00:35:51.525
hub and allow, um, vibe coding,
more to come on that, um,
00:35:51.795 --> 00:35:56.055
we have something coming out soon
that we're gonna demo, but we're
00:35:56.055 --> 00:35:57.525
thinking along those lines as well.
00:35:57.555 --> 00:36:03.555
'cause the hardest part in all of that is
getting the data that you need securely.
00:36:04.485 --> 00:36:06.585
And consistently, right?
00:36:06.585 --> 00:36:09.855
If you have to go and touch all
these different API endpoints
00:36:10.245 --> 00:36:12.525
that potentially change, right?
00:36:12.525 --> 00:36:20.445
When iManage releases a new version
of their API or NetDocuments or Aderant
00:36:20.505 --> 00:36:26.445
or Elite, that's where things have a,
things become fragile in that regard.
00:36:26.625 --> 00:36:29.685
That that piece has to
be actively managed.
00:36:30.435 --> 00:36:30.915
Um, but I'm.
00:36:31.920 --> 00:36:37.799
I'm curious what your thoughts are about,
since you have vibe coded, like tabular
00:36:37.799 --> 00:36:43.860
review is a really fundamental feature it
seems of the, of the Harveys and Legoras of
00:36:43.860 --> 00:36:51.240
the world, and you've been able to vibe
code a pretty good iteration of that.
00:36:51.245 --> 00:37:00.060
In your spare time, do these legal
AI tools truly have a moat, um, or.
00:37:00.735 --> 00:37:03.345
'cause we've talked about thin
wrappers and everything for quite
00:37:03.345 --> 00:37:06.915
some time, and there's more to it
than just the UI layer, but the UI
00:37:06.915 --> 00:37:09.165
layer is a lot of their value props.
00:37:09.165 --> 00:37:12.015
I don't know, what do you, what are,
what is your thinking now that you've
00:37:12.015 --> 00:37:15.825
been able to replicate some of these
features that a lot of people have been
00:37:15.825 --> 00:37:20.625
thinking that was the moat and you,
you're challenging that I feel like,
00:37:21.885 --> 00:37:27.250
yeah, I, I feel that, uh, so first of
all, uh, the tabular review feature.
00:37:28.484 --> 00:37:30.285
It's actually, I've open sourced it.
00:37:30.825 --> 00:37:35.174
So I would encourage people to, you know,
lawyers who have experience, you know,
00:37:35.205 --> 00:37:39.525
using Harvey and Legora, to actually test it to
see, you know, what are the actual gaps.
00:37:40.035 --> 00:37:46.335
Um, I think that, you know, it's hard
to just look at two demos and say, you
00:37:46.335 --> 00:37:47.690
know, because, you know, this, uh,
00:37:48.305 --> 00:37:50.915
Looks the same, uh, and
then it must be the same.
00:37:51.335 --> 00:37:56.435
So, uh, I, I, I, I think, I suspect,
you know, although I don't know,
00:37:56.975 --> 00:38:00.065
um, the underlying architecture
might be a little bit different.
00:38:00.095 --> 00:38:01.925
Um, I can cite a few examples.
00:38:02.615 --> 00:38:07.205
Um, so for the tabular review
tool that I've created, um, it
00:38:07.205 --> 00:38:08.525
doesn't have an embedding model.
00:38:09.065 --> 00:38:09.575
Um.
00:38:10.335 --> 00:38:12.675
For those who don't know,
embedding models are usually used.
00:38:12.675 --> 00:38:21.195
If, um, you, uh,
you upload a, a long document, uh,
00:38:21.225 --> 00:38:27.405
which exceeds the context window, uh,
of the LLM, then you'll have to kind
00:38:27.405 --> 00:38:32.085
of convert, uh, you know, chunk the
documents into different paragraphs
00:38:32.655 --> 00:38:37.035
and then only feed the most relevant,
relevant paragraphs to the AI.
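A bare-bones version of the chunk-and-retrieve step described here might look like the following, with `embed` standing in for any embedding-model API; the paragraph splitting and top-k selection are deliberately naive assumptions, not how any particular product does it.

```python
# Split a long document into paragraphs, embed them, and keep only the
# chunks most similar to the question, so the model never sees the whole file.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder for an embedding-model call returning a fixed-size vector."""
    raise NotImplementedError

def top_k_chunks(document: str, question: str, k: int = 5) -> list[str]:
    chunks = [p for p in document.split("\n\n") if p.strip()]  # naive paragraph split
    chunk_vecs = np.array([embed(c) for c in chunks])
    q_vec = embed(question)
    # cosine similarity between the question and every chunk
    sims = chunk_vecs @ q_vec / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec))
    top = np.argsort(sims)[::-1][:k]
    return [chunks[i] for i in top]  # feed only these chunks to the LLM
```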
00:38:37.634 --> 00:38:38.085
Um.
00:38:38.940 --> 00:38:41.460
The, the thing I, I built, it
doesn't have the embedding model.
00:38:42.060 --> 00:38:46.200
Uh, and I, I know, you know, from
several, I think interviews that I've
00:38:46.379 --> 00:38:51.540
seen, you know, uh, um, given by,
uh, you know, Harvey and Legora engineers,
00:38:51.540 --> 00:38:54.299
I think that they, they, they have
pretty sophisticated embedding models.
00:38:54.299 --> 00:38:56.460
I think that's a gap.
00:38:56.970 --> 00:39:01.980
Uh, I think whether that embedding
model actually gives more
00:39:01.980 --> 00:39:06.750
accuracy, uh, and precision to
those answers require evaluation.
00:39:08.085 --> 00:39:09.135
And that's another thing, right?
00:39:09.525 --> 00:39:14.355
How do we evaluate, uh, you know,
these outputs from different AI
00:39:14.355 --> 00:39:19.965
tools, and is there an open source,
uh, evaluation dataset for that?
00:39:20.715 --> 00:39:26.415
Um, uh, how, you know, uh, how do
we even know, you know, whether,
00:39:26.445 --> 00:39:30.615
you know, this actually outperforms,
uh, uh, a standard off-the-shelf
00:39:30.645 --> 00:39:32.205
LLM? We don't know.
00:39:32.205 --> 00:39:36.345
So I think some of these
companies have, you know, um.
00:39:37.305 --> 00:39:39.375
internal evaluation datasets.
00:39:39.915 --> 00:39:43.815
Uh, there are also external
benchmarks like, um, Vals AI,
00:39:44.055 --> 00:39:47.625
um, so which compares different
legal AI tools and so on.
00:39:48.195 --> 00:39:52.935
I think to rigorously, you know,
test, you know, these outputs, uh,
00:39:53.265 --> 00:39:57.465
I think, uh, requires, you know,
a serious benchmarking exercise.
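To make the benchmarking point concrete, the basic shape of such an exercise is simple even if doing it rigorously is not. A toy harness, with a purely hypothetical dataset format, might look like this:

```python
# Toy evaluation harness: run the same question set through any system
# (an off-the-shelf LLM, a legal AI tool) and score it against reference answers.
from typing import Callable

def evaluate(system: Callable[[str], str], dataset: list[dict]) -> float:
    """dataset items look like {"question": ..., "reference": ...} (illustrative format)."""
    correct = 0
    for item in dataset:
        answer = system(item["question"])
        # Crude scoring: substring match. Real benchmarks use human or
        # LLM-based grading and far more carefully constructed answer keys.
        if item["reference"].lower() in answer.lower():
            correct += 1
    return correct / len(dataset)

# Usage idea: compare evaluate(ask_frontier_model, dataset)
# against evaluate(ask_legal_tool, dataset) on the same question set.
```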
00:39:58.125 --> 00:40:02.805
But my, my thought is actually,
you know, that, you know, um, um.
00:40:03.405 --> 00:40:06.165
I think, um, there are two
ways to think about it.
00:40:06.320 --> 00:40:11.205
The, the first is that,
you know, uh, I think, um,
00:40:13.335 --> 00:40:19.545
so, uh, at the, you know, for,
for the actual accuracy of the
00:40:19.635 --> 00:40:26.865
AI, uh, tool, uh, you know, it's,
um, it depends on the LLMs as well.
00:40:27.405 --> 00:40:32.175
So I, I don't think there is a
serious benchmarking exercise that's
00:40:32.175 --> 00:40:37.875
ever been done between, um, off
the shelf LLMs and legal AI tools.
00:40:39.195 --> 00:40:41.265
Um, I think there is
some efforts to do it.
00:40:41.775 --> 00:40:46.485
Um, uh, but then, uh, uh, there's,
I don't think there is a, you know,
00:40:46.635 --> 00:40:48.585
a lot of research into this area.
00:40:49.095 --> 00:40:53.325
Um, uh, the second thing is basically
the UI/UX experience, right?
00:40:53.385 --> 00:40:54.525
So what of the UI
00:40:57.090 --> 00:40:59.280
is domain-specific, for example.
00:41:00.330 --> 00:41:05.010
Um, uh, to be honest, I, I,
I, I, I, I'm not really sure.
00:41:05.490 --> 00:41:11.070
Um, one may say that, you know,
citation is a, uh, wanted feature
00:41:11.070 --> 00:41:15.060
for lawyers, but I would say it's
actually a pretty generic thing.
00:41:15.165 --> 00:41:17.340
'cause Perplexity does citations as well.
00:41:18.180 --> 00:41:18.420
Right.
00:41:18.870 --> 00:41:23.730
Um, and I think that for all business
organizations, if you're searching
00:41:24.000 --> 00:41:25.500
for your own internal database.
00:41:26.355 --> 00:41:29.444
Um, it requires some
sort of citations to ensure
00:41:29.444 --> 00:41:30.615
that there's no hallucination.
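One generic way to get that behavior, in any industry, is to number the retrieved passages, ask the model to cite them, and then check that the citations actually point at retrieved text. A hypothetical sketch, with the same placeholder `ask_llm` call as the earlier examples:

```python
# Tag each retrieved chunk with an ID, ask the model to cite the IDs it used,
# and verify the cited IDs exist so answers can be traced back to sources.
import re

def ask_llm(prompt: str) -> str:
    """Placeholder for a real model call."""
    raise NotImplementedError

def answer_with_citations(question: str, chunks: list[str]) -> tuple[str, list[int]]:
    numbered = "\n".join(f"[{i}] {c}" for i, c in enumerate(chunks))
    answer = ask_llm(
        f"Answer using only the sources below and cite them like [0], [1].\n"
        f"Sources:\n{numbered}\n\nQuestion: {question}"
    )
    # Keep only citations that refer to a chunk we actually retrieved.
    cited = sorted({int(m) for m in re.findall(r"\[(\d+)\]", answer) if int(m) < len(chunks)})
    return answer, cited  # surface the cited chunks so a reader can verify them
```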
00:41:31.544 --> 00:41:35.685
So if we look at a broader market,
we see that kind of, these features
00:41:35.685 --> 00:41:39.464
are actually offered by other AI
tools as well in other industries.
00:41:40.154 --> 00:41:44.984
So I would say there's also
some homogeneity, um, that feature is also
00:41:44.984 --> 00:41:46.814
quite homogenous, I would say.
00:41:47.294 --> 00:41:52.395
Uh, there might be some, some, uh,
thing about, you know, uh, um, you
00:41:52.395 --> 00:41:53.895
know, these areas being connected.
00:41:54.585 --> 00:41:58.305
To data sources that are unique to
lawyers, for example, LexisNexis.
00:41:58.305 --> 00:41:59.265
I think that's a fair point.
00:41:59.895 --> 00:42:05.205
I think these kind of integrations
would help these tools, uh, stand out as
00:42:05.205 --> 00:42:07.545
a legal specific, you know, AI tool.
00:42:08.145 --> 00:42:09.715
Um, uh, but.
00:42:10.455 --> 00:42:14.384
You know, for example, for word editing,
I think word editing is basically
00:42:14.384 --> 00:42:18.404
something that the entire world,
you know, is facing as a problem.
00:42:18.825 --> 00:42:22.575
You know, having AI edit your
Word documents, it's not a lawyer
00:42:22.575 --> 00:42:24.555
problem, it's like a global problem.
00:42:24.705 --> 00:42:26.714
And Microsoft is also solving that.
00:42:26.714 --> 00:42:30.584
I, I suppose, you know, with
Copilot, I know they will eventually
00:42:30.584 --> 00:42:32.564
get there, I think there's still, you know,
00:42:36.360 --> 00:42:37.290
A lot of work to do.
00:42:37.830 --> 00:42:38.700
Yeah, for sure, for sure.
00:42:39.930 --> 00:42:44.400
Um, but then if we look at
Anthropic, they have a docx skill,
00:42:45.030 --> 00:42:48.960
uh, quite recently, you know, uh,
uh, that, that you can import into
00:42:48.960 --> 00:42:51.450
Claude Code and import into Claude.
00:42:51.750 --> 00:42:57.000
So, um, it's not like the industry is not
solving these kind of problems and it kind
00:42:57.000 --> 00:43:02.580
of worries me that, you know, whether,
um, we are solving the kind of problems
00:43:02.580 --> 00:43:04.260
that are unique to the industry, or
00:43:04.680 --> 00:43:08.550
whether we are solving the problems
that, you know, the foundation models.
00:43:09.105 --> 00:43:12.825
Uh, the model companies are solving
or the, you know, other industries
00:43:12.825 --> 00:43:17.714
are solving, and if they solve,
uh, this problem faster, then the
00:43:17.714 --> 00:43:19.455
question becomes, you know, yeah.
00:43:19.484 --> 00:43:22.154
You, you then, then your, you
know, your concern becomes valid.
00:43:22.154 --> 00:43:23.025
You know, it's very remote.
00:43:23.535 --> 00:43:26.955
Um, but either way, I mean,
for lawyers it's good news.
00:43:27.495 --> 00:43:31.875
So because, you know, uh, it means that we
have better tools, uh, at a cheaper price.
00:43:32.325 --> 00:43:34.395
Um, I think that, I think, you know.
00:43:35.355 --> 00:43:38.234
I, I always welcome some
healthy competition.
00:43:38.265 --> 00:43:42.555
You know, uh, you know, I think that
it's important to understand the
00:43:42.555 --> 00:43:46.544
gaps as well, I think, um, which
is also what I'm doing as well.
00:43:47.714 --> 00:43:53.535
Um, I think that I, I, I love, uh,
to, you know, um, share my kind of
00:43:53.535 --> 00:43:57.555
learning, uh, with other lawyers, you
know, how capable the frontier models
00:43:57.555 --> 00:43:59.145
are and what we can do with them.
00:43:59.535 --> 00:44:03.915
Because I've, you know, discussed with
different lawyers, you know, who have
00:44:03.915 --> 00:44:09.285
been thinking about, you know, how do we
actually, you know, uh, uh, use AI better.
00:44:09.645 --> 00:44:12.285
Uh, how do we, uh, you know,
um, make sure that we have an
00:44:12.285 --> 00:44:13.694
edge, you know, as a lawyer.
00:44:14.879 --> 00:44:17.910
You know, one thing that came up is,
you know, maybe it's adoption, right?
00:44:18.899 --> 00:44:25.859
Uh, between picking a better tool
and, uh, making sure 70%,
00:44:26.895 --> 00:44:31.875
90%, uh, of your organization
is actually using AI daily.
00:44:32.415 --> 00:44:36.375
I think maybe the latter will have
more impact, make you more competitive.
00:44:37.155 --> 00:44:41.175
So maybe it's also a culture
issue, uh, like a culture issue.
00:44:41.745 --> 00:44:45.645
It's also a business model issue as in,
you know, how do you charge, you know,
00:44:45.645 --> 00:44:47.715
legal service as a whole and so on.
00:44:47.715 --> 00:44:52.100
So, I, I think there's a lot
more to the equation, so to speak.
00:44:53.115 --> 00:44:57.075
Yeah, no, this, that was, uh,
a really good, um, overview.
00:44:57.105 --> 00:44:59.775
So we're almost outta time,
but I did wanna bounce one
00:44:59.775 --> 00:45:00.945
other question off of you.
00:45:00.945 --> 00:45:02.775
So this is legal innovation spotlight.
00:45:03.375 --> 00:45:08.055
You know, a big chunk of
our audience are like legal
00:45:08.415 --> 00:45:10.065
innovation professionals, right?
00:45:10.065 --> 00:45:16.725
So CINOs, um, you know, legal
or, um, innovation attorneys,
00:45:17.235 --> 00:45:19.005
uh, a lot of KM folks.
00:45:19.515 --> 00:45:20.475
How does.
00:45:21.569 --> 00:45:27.210
Like vibe coding, like historically, that
has been the function within the law firm
00:45:27.330 --> 00:45:33.779
that's been responsible for, I guess kind
of what you're doing is like, you know,
00:45:33.779 --> 00:45:41.700
evaluating alternatives, um, exploring
different paths to technical solutions.
00:45:42.509 --> 00:45:50.370
And if lawyers, you know, do this, does
vibe coding change the relationship?
00:45:50.640 --> 00:45:57.300
Between the lawyers in a law firm and
their innovation teams, because they're
00:45:57.300 --> 00:46:01.650
now able to do some prototyping that
they might have engaged their technology
00:46:01.650 --> 00:46:05.400
or innovation partners for in the
past, do you see that relationship
00:46:05.400 --> 00:46:07.080
changing as a result of vibe coding?
00:46:11.835 --> 00:46:14.625
Yeah, I, I think that
changes a lot actually.
00:46:14.685 --> 00:46:19.815
Um, so, because if we think in the past,
you know, if lawyers have questions,
00:46:19.845 --> 00:46:23.025
uh, if they experience pain points,
you know, what do, what do they do?
00:46:23.055 --> 00:46:24.675
They usually reach out to.
00:46:25.375 --> 00:46:28.075
Their innovation team, uh, or the IT team.
00:46:28.194 --> 00:46:32.484
Uh, and then the IT team will, uh,
reach out to different vendors and
00:46:32.484 --> 00:46:37.734
maybe do IT research to see if there's
any vendor that can solve that problem.
00:46:38.544 --> 00:46:41.484
Uh, I suppose that also goes
through a, they, they have a
00:46:41.484 --> 00:46:42.895
list of selection criteria.
00:46:43.254 --> 00:46:49.015
Uh, they have, um, some filtering exercise
to do, um, security scans and so on.
00:46:49.524 --> 00:46:52.854
And then after the filtering
exercise, then, uh, there's a
00:46:52.854 --> 00:46:53.899
trial where the lawyers will.
00:46:54.525 --> 00:46:55.515
Try out different products.
00:46:55.845 --> 00:47:00.105
I think by that time, usually the lawyer
doesn't experience the pain point
00:47:00.105 --> 00:47:04.485
anymore because maybe the transaction
has already completed, or, you
00:47:04.485 --> 00:47:08.895
know, they, um, uh, they find out that, you
00:47:08.895 --> 00:47:12.975
know, after the filtering, you know,
well, this only solves 20% of my problem.
00:47:13.575 --> 00:47:19.125
Uh, you know, uh, so, uh, uh, and
then, and, and then historically
00:47:19.125 --> 00:47:23.145
I think that, you know, innovation
team, uh, they also, um.
00:47:24.240 --> 00:47:27.359
You know, they, they, they, they,
they, they're basically the experts
00:47:27.930 --> 00:47:30.810
in using these tools because, you
know, historically, you know, uh,
00:47:30.870 --> 00:47:33.660
these SaaS tools have a lot of
buttons, a lot of features, right?
00:47:34.200 --> 00:47:37.620
And, uh, lawyers, they don't know
how to use them, use all of them.
00:47:37.620 --> 00:47:39.689
It's like a printer, you know,
there are a hundred buttons.
00:47:40.169 --> 00:47:42.450
Uh, you know, lawyers don't
know how to use all of them.
00:47:42.870 --> 00:47:47.819
So I think, uh, for knowledge management,
uh, people, you know, um, uh, they
00:47:47.819 --> 00:47:49.420
became the experts in using these tools.
00:47:50.895 --> 00:47:55.154
Um, you know, they need to teach the
lawyers how to use each and every
00:47:55.365 --> 00:47:57.345
feature, uh, to solve their, uh, use case.
00:47:58.274 --> 00:47:58.690
Now, this, this.
00:47:59.370 --> 00:48:03.630
Might shift fundamentally if lawyers
are allowed to actually build their
00:48:03.630 --> 00:48:08.880
own tools, because, uh, they could be,
you could build something that's really
00:48:08.880 --> 00:48:13.890
personalized with a few, a lot less
buttons, for example, because it
00:48:13.950 --> 00:48:19.170
is highly tailored to the use case and
traditional SaaS, you know, requires
00:48:19.170 --> 00:48:21.330
people to actually work around them.
00:48:21.600 --> 00:48:21.780
Right.
00:48:22.290 --> 00:48:27.630
You know, uh, they, they require people
to actually, um, learn how to use it.
00:48:27.944 --> 00:48:33.465
And then, you know, um, adjust their
workflow, you know, uh, so, so, so to
00:48:33.465 --> 00:48:36.105
fit, you know, the SaaS tools design.
00:48:36.645 --> 00:48:40.245
Uh, but if in the future, if it's
actually the other way around where
00:48:40.245 --> 00:48:45.674
the SaaS tool actually adapts to
the lawyer's habits, then this also, you
00:48:45.674 --> 00:48:51.015
know, um, it's that, you know, this
also becomes, um, much more direct.
00:48:51.134 --> 00:48:55.305
So I think what I, um.
00:48:55.950 --> 00:49:04.050
Uh, would love to see, uh, would be, uh,
I would say if law firms are, you know, in
00:49:04.050 --> 00:49:10.020
the business of building stuff themselves,
I think, uh, the knowledge management people
00:49:10.080 --> 00:49:15.420
and also innovation team, they would be
in the best position to actually be that
00:49:15.420 --> 00:49:17.250
person who actually vibe codes this stuff.
00:49:17.790 --> 00:49:19.500
Uh, because they know the requirements.
00:49:19.860 --> 00:49:21.060
They're closest to the lawyers.
00:49:21.915 --> 00:49:23.325
The lawyers, they have billable hours.
00:49:23.835 --> 00:49:28.695
Uh, so they won't take, uh, time out of
their, uh, schedule to build these tools.
00:49:29.355 --> 00:49:33.555
Uh, uh, maybe the maintenance,
you know, uh, of these tools would
00:49:33.555 --> 00:49:36.075
be, um, the innovation team's job.
00:49:36.404 --> 00:49:39.645
Uh, also they might, uh, also need
to do a lot of security scans,
00:49:40.065 --> 00:49:42.375
uh, you know, uh, uh, for example.
00:49:43.035 --> 00:49:47.265
When we lawyers build these tools, how
do we make sure that, you know, it's not
00:49:47.265 --> 00:49:49.095
using libraries that have vulnerabilities.
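As one concrete example of that kind of check, for Python projects an innovation team could run pip-audit (a real tool that checks dependencies against known vulnerability databases) over each lawyer-built app. The wrapper below is only a sketch; the project layout and file name are assumptions.

```python
# Run pip-audit over a project's requirements file and report the result.
import subprocess
import sys

def audit_project(requirements_file: str = "requirements.txt") -> bool:
    result = subprocess.run(
        [sys.executable, "-m", "pip_audit", "-r", requirements_file],
        capture_output=True, text=True,
    )
    print(result.stdout or result.stderr)
    return result.returncode == 0  # pip-audit exits non-zero when it finds issues
```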
00:49:49.485 --> 00:49:52.455
I think these are the kind
of questions, you know, um,
00:49:52.575 --> 00:49:54.465
challenges that need to be solved.
00:49:54.915 --> 00:49:57.075
And I, I would say these
are new challenges.
00:49:57.525 --> 00:50:02.955
Uh, I think that, you know, the job
of innovation team would quite, you
00:50:02.955 --> 00:50:04.605
know, change quite fundamentally from.
00:50:05.445 --> 00:50:09.885
Sourcing and learning how to use
tools to building tools themselves.
00:50:10.455 --> 00:50:15.645
I think, um, I think that is, uh, I
would say that is actually a healthy,
00:50:16.035 --> 00:50:20.985
uh, change, uh, because uh, I think
that kind of solves a lot of the
00:50:20.985 --> 00:50:26.625
frustration that, um, I think, uh,
both sides experience before, you know,
00:50:26.625 --> 00:50:29.085
um, uh, I would say, you know, uh.
00:50:29.850 --> 00:50:33.450
Because of the, the questions I, I,
I, you know, the, the problems that I,
00:50:33.720 --> 00:50:37.590
you know, raised, you know, because
the cycle is just too long,
00:50:37.740 --> 00:50:41.430
you know, the procurement cycle and
then some, some, some kind of demands
00:50:41.430 --> 00:50:44.880
are pretty immediate and, you know,
these might get solved eventually.
00:50:44.880 --> 00:50:48.120
So, you know, really, you know,
looking forward to that change.
00:50:48.885 --> 00:50:49.335
Yeah.
00:50:49.365 --> 00:50:51.105
No, that is, that's really good insight.
00:50:51.585 --> 00:50:54.555
Um, yeah, this has been a,
a fantastic conversation.
00:50:54.555 --> 00:50:55.695
I really appreciate you.
00:50:55.695 --> 00:50:59.865
I know it's super late, uh, on
your side of the globe, so I, I
00:50:59.865 --> 00:51:01.695
really appreciate you making time.
00:51:02.205 --> 00:51:06.555
Um, how do people find out more about,
you know, what, what you're doing?
00:51:06.555 --> 00:51:09.675
Do you have your own GitHub repo?
00:51:09.675 --> 00:51:11.025
Is it best to LinkedIn?
00:51:11.025 --> 00:51:12.375
What's, how do they find out more?
00:51:13.049 --> 00:51:15.870
So I would say, uh,
follow me on LinkedIn.
00:51:15.960 --> 00:51:19.859
Uh, I, I post regularly on, um,
on LinkedIn about, you know,
00:51:19.859 --> 00:51:21.629
everything related to AI.
00:51:21.810 --> 00:51:27.930
I, I like to share the products that I've
built, um, uh, on LinkedIn, on GitHub.
00:51:28.259 --> 00:51:33.029
Um, and also I've recently also vibe
coded my own collection of, uh,
00:51:33.029 --> 00:51:35.490
apps, so you can go in to check it out.
00:51:35.730 --> 00:51:36.450
Do check them out.
00:51:36.450 --> 00:51:37.740
Uh, it's free to use.
00:51:37.799 --> 00:51:41.100
Um, so, um, I, because
I, I believe that is.
00:51:41.359 --> 00:51:47.839
Important not only to show, uh, the apps
that I I coded, but also let people just
00:51:47.839 --> 00:51:52.190
follow along and experience and actually
use them so that they can see, you know.
00:51:52.935 --> 00:51:56.145
How good or bad, you know,
these apps that are vibed are.
00:51:56.505 --> 00:52:01.545
So, uh, I think that's, um, that would
be the complete experience of, uh, vibe
00:52:01.695 --> 00:52:05.475
coding, whether you're like a, uh, someone
who's getting into it, someone just
00:52:05.565 --> 00:52:07.845
likes, you know, watching people vibe.
00:52:08.235 --> 00:52:12.315
Like, uh, you're like, you're on Twitch
seeing some, uh, watching people game.
00:52:12.915 --> 00:52:15.585
I think that I wanna provide the
experience. So, at least, uh, I'll
00:52:15.585 --> 00:52:19.305
be on LinkedIn, uh, GitHub, and
also, yeah, check out my website.
00:52:20.250 --> 00:52:20.820
Awesome.
00:52:21.270 --> 00:52:21.990
Well, good stuff.
00:52:22.050 --> 00:52:23.070
Uh, thanks again, Jamie.
00:52:23.070 --> 00:52:26.940
Keep doing what you're doing because I
think it's, it's raising lots of good,
00:52:27.450 --> 00:52:31.530
it's initiating lots of good dialogue
and conversations like this where we ask
00:52:31.950 --> 00:52:34.410
hard questions like, what is the role?
00:52:34.410 --> 00:52:38.970
How is the role of the innovation
function evolving?
00:52:39.090 --> 00:52:41.130
Um, build versus buy?
00:52:41.130 --> 00:52:45.855
How does that dynamic
change, um, you know.
00:52:47.009 --> 00:52:53.160
What does the future of,
um, legal tech look like?
00:52:53.160 --> 00:52:57.480
I mean, this is really the, the
things we're doing now is really
00:52:57.660 --> 00:53:00.990
we're asking fun, these fundamental
questions, which are, uh, which
00:53:00.990 --> 00:53:02.520
I think the, the timing is right.
00:53:02.940 --> 00:53:05.250
So, um, thanks so much for joining.
00:53:05.250 --> 00:53:09.149
Keep doing what you're doing and, uh,
hopefully everybody who listens here
00:53:09.149 --> 00:53:13.259
will follow you, follow you on LinkedIn
and, um, and get to kick the tires
00:53:13.259 --> 00:53:14.065
on some of the tools you're building.
00:53:15.090 --> 00:53:15.810
Thank you so much.
00:53:15.930 --> 00:53:16.860
Thank you for inviting me.
00:53:17.760 --> 00:53:18.420
Absolutely.
00:53:18.480 --> 00:53:19.410
Alright, have a good night.
00:53:20.190 --> 00:53:20.490
You too.
00:53:21.060 --> 00:53:21.780
Alright, take care.
00:53:22.200 --> 00:53:24.480
Thanks for listening to
Legal Innovation Spotlight.
00:53:25.020 --> 00:53:28.530
If you found value in this chat, hit
the subscribe button to be notified
00:53:28.530 --> 00:53:30.000
when we release new episodes.
00:53:30.510 --> 00:53:33.180
We'd also really appreciate it if
you could take a moment to rate
00:53:33.180 --> 00:53:35.820
us and leave us a review wherever
you're listening right now.
00:53:36.390 --> 00:53:39.120
Your feedback helps us provide
you with top-notch content.