In this episode, Ted sits down with Ray Sun, Tech Lawyer at Herbert Smith Freehills Kramer, to discuss the global landscape of AI regulation and what it means for the future of law. From building the Global AI Regulation Tracker to comparing U.S. export controls with China’s open-source push, Ray shares his expertise in technology law and policy. With insights on how AI regulation is becoming a geopolitical battleground, this conversation helps law professionals understand the forces shaping innovation and legal practice worldwide.
In this episode, Ray shares insights on how to:
Track and interpret global AI regulations across jurisdictions
Understand the different approaches of China and the U.S. to AI policy
Recognize the role of export controls and national security in shaping AI
See how open-source expectations influence technology adoption in China
Anticipate the impact of AI regulation on the future of legal work
Key takeaways:
AI regulation is no longer just a legal issue—it’s a geopolitical one
China’s AI strategy emphasizes self-sufficiency and open-source development
The U.S. is focusing on export controls and national security in its AI policy
Standardization of AI tasks will determine how the technology reshapes law
The future of AI in legal practice is likely to free lawyers for more strategic work
About the guest, Ray Sun
Ray Sun is a technology lawyer and developer known for creating innovative AI-driven tools, including the Global AI Regulation Tracker and SyncTrainer, an AI-enabled dance analysis app. Recognized as a LinkedIn Top Voice on AI regulation and Australia’s 2023 Technology Lawyer of the Year, he combines legal expertise with hands-on development to bridge the gap between law, technology, and innovation. Ray also shares insights on AI through his brand techie_ray, building a global audience across YouTube, TikTok, and beyond.
“It’s important not to see AI policy in isolation, but how it interconnects with every other domestic policy of a country.”
1
00:00:01,240 --> 00:00:04,629
Ray, how are you this afternoon or morning your time?
2
00:00:04,629 --> 00:00:05,952
How are you this morning?
3
00:00:05,984 --> 00:00:09,155
Yeah, it's morning time and yeah, hey Ted, feeling really good.
4
00:00:09,155 --> 00:00:09,997
Yeah.
5
00:00:10,094 --> 00:00:10,844
Good, man.
6
00:00:10,844 --> 00:00:11,775
Good.
7
00:00:11,775 --> 00:00:13,755
It's good to have you on the show.
8
00:00:14,033 --> 00:00:17,417
I enjoy reading your content on LinkedIn.
9
00:00:17,974 --> 00:00:35,735
I was looking at your global AI regulation tracker and knew we had to have a conversation
because, and how timely, uh we are in the midst of all sorts of regulatory movement, both
10
00:00:35,735 --> 00:00:36,839
US and China.
11
00:00:36,839 --> 00:00:38,056
So we'll get into that.
12
00:00:38,056 --> 00:00:39,086
But before
13
00:00:39,086 --> 00:00:43,691
Before we do, let's get you introduced.
14
00:00:43,691 --> 00:00:49,167
you're a lawyer and a tech lawyer and a developer, and you don't see that a lot.
15
00:00:49,167 --> 00:00:57,386
Why don't you tell us a little bit about your background and how you ended up being a tech
lawyer and application developer.
16
00:00:57,624 --> 00:00:58,755
Yeah, yeah.
17
00:00:58,755 --> 00:00:59,835
So...
18
00:01:00,156 --> 00:01:05,067
My story's gonna probably be one of those that starts off really random.
19
00:01:05,067 --> 00:01:09,093
You don't know where it's going, but eventually it all falls into place and makes sense.
20
00:01:09,093 --> 00:01:10,044
So...
21
00:01:10,244 --> 00:01:14,386
It really started when I was 5 or 6 years old.
22
00:01:14,494 --> 00:01:21,613
I was really into shows like Thomas the Tank Engine, Astro Boy, Iron Man.
23
00:01:22,154 --> 00:01:27,328
Anything that's got to do with machines, you know, coming alive and doing cool stuff.
24
00:01:27,470 --> 00:01:30,852
That has been my fascination even till today.
25
00:01:31,112 --> 00:01:36,996
And so, you know, throughout primary school and high school, I've always liked building
things on the side.
26
00:01:36,996 --> 00:01:40,058
So in primary school, I would like to build my own toys.
27
00:01:40,058 --> 00:01:47,322
And in high school, you know, I started playing computer games and then started to pick up
coding to learn how to build my own computer games.
28
00:01:47,322 --> 00:01:50,704
And what really helped was that my friends were also into the same thing.
29
00:01:50,704 --> 00:01:54,676
So we're all motivating each other and just doing cool things together.
30
00:01:55,390 --> 00:02:07,096
I think it was then, during high school, that I started reading, going to the library, and
I was also interested in books around, you know, all of the detective genres, and also
31
00:02:07,096 --> 00:02:10,137
legal thrillers, like the John Grisham series.
32
00:02:10,518 --> 00:02:19,902
And that's where I got into, I really enjoy stories that talk about uh evidence and,
33
00:02:20,398 --> 00:02:22,790
trying to connect pieces together to uncover the truth.
34
00:02:22,790 --> 00:02:25,941
And anything along those lines, I find really fascinating.
35
00:02:26,042 --> 00:02:41,292
And so when it got to the end of high school and I was deciding what I wanted to do for
further study, I was really looking at either law or computer science, given my ongoing interests.
36
00:02:41,393 --> 00:02:49,678
And it came down to a really simple conclusion, which is that I can make coding a hobby,
but I can't make law a hobby.
37
00:02:49,824 --> 00:02:52,145
So why not make law my career?
38
00:02:52,145 --> 00:02:56,097
And I was, you know, continuing my coding on the side.
39
00:02:56,398 --> 00:02:59,409
And so all my friends did computer science.
40
00:02:59,409 --> 00:03:09,065
I was the only one who went on to do law, but I still, you know, continued building games
and websites just to, you know, as a way to still connect to my friends.
41
00:03:09,385 --> 00:03:14,888
And yeah, throughout uni, I was doing my legal studies and building random apps.
42
00:03:15,089 --> 00:03:17,902
I think, what was like a
43
00:03:17,902 --> 00:03:34,262
A big turning point was I was looking for work as a paralegal and I wasn't able to find
one until I started to just look into internships in non-traditional law firms.
44
00:03:34,502 --> 00:03:39,922
So this was around 2017-18.
45
00:03:39,922 --> 00:03:44,582
I applied for an internship at a legal tech startup called Lawpath.
46
00:03:44,926 --> 00:03:47,638
and I was there as a media intern.
47
00:03:47,638 --> 00:03:52,521
So I was writing blog articles around, I think back then it was blockchain and smart
contracts.
48
00:03:52,521 --> 00:03:54,453
That was the craze back then.
49
00:03:54,453 --> 00:03:56,674
So I was writing a lot of articles on that.
50
00:03:56,674 --> 00:04:02,538
And then around that time, I also did a hackathon, one of the world's first legal tech
hackathon.
51
00:04:02,898 --> 00:04:05,620
And I was representing my university.
52
00:04:05,740 --> 00:04:07,732
I was the developer within my team.
53
00:04:07,732 --> 00:04:14,446
So I was building a prototype app, which basically helps streamline payments between the
client
54
00:04:14,510 --> 00:04:18,930
and the barrister, which is a very, you know, UK and Australia specific thing.
55
00:04:19,130 --> 00:04:25,450
We don't have the barrister and solicitor merged into one attorney role like in the US, but
basically it's just an app.
56
00:04:25,450 --> 00:04:27,010
It's like a payment app, right?
57
00:04:27,010 --> 00:04:36,470
And then when I pitched the app, it turns out that one of the judges was the boss of that
legal tech startup that I was interning at.
58
00:04:36,470 --> 00:04:38,478
And so when I came back, the boss said,
59
00:04:38,478 --> 00:04:40,978
Hey Ray, I didn't know you could code like that.
60
00:04:40,978 --> 00:04:46,438
Why don't you move into our engineering department rather than the comms media department?
61
00:04:46,438 --> 00:04:48,018
And I was like, yeah, sure.
62
00:04:48,038 --> 00:04:54,298
And then that's how I got my actual job as like a legal engineer at that startup.
63
00:04:54,298 --> 00:04:57,598
And I was helping build out the document automation platform.
64
00:04:57,598 --> 00:05:06,758
And so that was like the first serious, I guess, gig that made me think, oh, wow, okay,
tech and law can be combined in some way into some actual.
65
00:05:06,818 --> 00:05:17,943
meaningful career and since then when I was applying for graduate roles and you know
Becoming a lawyer it has always been my path to sort of go down that tech lawyer sort of
66
00:05:17,943 --> 00:05:28,188
route. And that's where I am right now. I'm currently a practicing tech lawyer at Herbert
Smith Freehills Kramer, which is a global law firm, and I do a lot of stuff within
67
00:05:28,188 --> 00:05:33,830
the AI uh legal space but at the same time, you know uh
68
00:05:33,922 --> 00:05:36,673
So that's that's like how my career sort of evolved.
69
00:05:36,673 --> 00:05:49,057
But on the side, as I said, I'm still building things, and the AI regulation tracker is
just one of those sort of uh projects that really developed as I got really
70
00:05:49,057 --> 00:05:54,519
interested in AI ethics and robots, but I'm sure we can talk more about that one.
71
00:05:54,519 --> 00:05:57,410
But that's basically the rundown of my career today.
72
00:05:57,472 --> 00:05:58,342
Interesting.
73
00:05:58,342 --> 00:06:06,485
And yeah, so let's talk about the global AI regulation tracker that kind of started off as
a personal project for you.
74
00:06:06,485 --> 00:06:13,148
And then it kind of got traction and now is an industry resource.
75
00:06:13,248 --> 00:06:16,179
like take us back to the origin story around that.
76
00:06:16,179 --> 00:06:27,243
I think the last time you and I spoke, you were procrastinating and reading about trolley
problems and machine ethics, so tell us about how this thing came about.
77
00:06:27,342 --> 00:06:31,662
Yeah, basically it started with procrastination.
78
00:06:31,662 --> 00:06:36,422
I think during uni there were some subjects which were a bit, you know, a bit of a bludge.
79
00:06:36,502 --> 00:06:41,422
It was just, I'm a person who likes history and geography.
80
00:06:41,422 --> 00:06:46,742
So I like reading, you know, random articles on geopolitics and history.
81
00:06:46,742 --> 00:06:56,134
And I came across this YouTube documentary and this was around a time where self-driving
vehicles were being trialed, experimented.
82
00:06:56,778 --> 00:06:59,899
and, you know, unfortunate accidents were happening.
83
00:06:59,899 --> 00:07:03,781
And then there videos just talking about the ethics in self-driving vehicles.
84
00:07:03,781 --> 00:07:15,085
So if a car can't stop and can't change lanes, and it's approaching either an elderly
person or a baby, which one should the car hit to minimize, like, damage?
85
00:07:15,085 --> 00:07:25,960
And that's such a tough question that it just opened up the whole door of AI ethics, which
I never thought was a theme. But you know, the more I read into it, you know, reading
86
00:07:25,960 --> 00:07:26,562
about
87
00:07:26,562 --> 00:07:30,845
the Three Laws of Robotics and then the trolley problem.
88
00:07:30,845 --> 00:07:39,531
And it was around that time where the EU government was also thinking about the EU AI Act,
which today is an actual thing in effect.
89
00:07:39,531 --> 00:07:40,692
But like, this is 2019.
90
00:07:40,692 --> 00:07:44,754
This is just an idea just being floated around in government.
91
00:07:44,754 --> 00:07:49,978
And the whole idea of regulating AI back then was so alien.
92
00:07:49,978 --> 00:07:54,841
It was such a foreign concept that, wow, I never thought this could be an actual thing.
93
00:07:55,262 --> 00:07:56,140
And so,
94
00:07:56,140 --> 00:08:00,432
And as I was reading, I like to sort of write notes in a notebook.
95
00:08:00,633 --> 00:08:08,216
I don't know, like I didn't really have any particular reason why I just thought, you
know, if there's anything interesting, I'll just write it down in the diary.
96
00:08:08,397 --> 00:08:11,598
And it got to a point where my diary started filling up.
97
00:08:11,959 --> 00:08:18,002
And then I was having the conversation with friends and also classmates and eventually
colleagues.
98
00:08:18,002 --> 00:08:25,484
And I realized that, well, I actually have sort of quite a lot of ideas and insight over
the past, you know,
99
00:08:25,484 --> 00:08:36,410
two or three years of reading randomly, and I thought, you know, maybe I could share this
on LinkedIn. So, um, when I first became a lawyer, on the side, you know, it was also during COVID
100
00:08:36,410 --> 00:08:47,406
so I started my career during the COVID lockdown, so there was quite a lot of spare time
to just play around. So I was just writing stuff on LinkedIn. Initially, like,
101
00:08:47,406 --> 00:08:54,650
it only hit a very niche audience. I'd write updates around AI regulation and ideas.
102
00:08:54,894 --> 00:08:57,675
um I did that for like a year or so.
103
00:08:58,275 --> 00:09:02,076
Modest engagement, but I had a lot of fun writing.
104
00:09:02,076 --> 00:09:12,579
And then when ChatGPT came out, that's where everything changed, because all of a sudden
the whole topic of AI and AI regulation became really popular and my posts started
105
00:09:12,579 --> 00:09:14,019
getting bit more attention.
106
00:09:14,019 --> 00:09:15,920
So that encouraged me to write more.
107
00:09:15,920 --> 00:09:18,781
I was also writing about countries all around the world,
108
00:09:18,781 --> 00:09:22,322
not just one particular country, but as many countries as possible.
109
00:09:22,322 --> 00:09:23,726
It got to a point where
110
00:09:23,726 --> 00:09:30,706
my LinkedIn had all these posts on different countries and I thought, you know, let's try to
organize them into one hub.
111
00:09:30,866 --> 00:09:32,566
And so I already had a website myself.
112
00:09:32,566 --> 00:09:38,726
So I just thought, you know, why not just add a new page to my website that categorizes
all my LinkedIn posts per country.
113
00:09:38,766 --> 00:09:42,906
So initially it was like a simple table, but then I thought that's so boring.
114
00:09:42,906 --> 00:09:44,626
Let's just take this a step further.
115
00:09:44,626 --> 00:09:45,786
I updated the site.
116
00:09:45,786 --> 00:09:50,526
I put, like, a map on it, and you click on a country to show my LinkedIn posts.
117
00:09:50,526 --> 00:09:51,594
Then I thought,
118
00:09:51,594 --> 00:09:53,645
Why stop on my own LinkedIn page?
119
00:09:53,645 --> 00:09:58,979
Let's actually prepare an, uh, encyclopedia-style summary for each country.
120
00:09:59,239 --> 00:10:04,333
And yeah, that's where I started summarizing each country's AI policies and regulations.
121
00:10:04,333 --> 00:10:09,987
I initially started with the G20 countries, but then I've sort of expanded.
122
00:10:09,987 --> 00:10:14,070
I've been running this project now for three years, all by myself.
123
00:10:14,070 --> 00:10:21,024
And now it took me two and a half years to now cover every country uh and territory in the
world.
124
00:10:21,110 --> 00:10:22,131
So over like 200.
125
00:10:22,131 --> 00:10:31,135
yeah, so I think it's been really great to be, I guess, one of the early ones building
this sort of tool.
126
00:10:31,135 --> 00:10:37,298
And then a lot of people were really supportive and yeah, just a lot of encouragement from
across the global industry.
127
00:10:37,298 --> 00:10:44,352
And that really helps me complete the map and also add new features to it to make it more
user friendly and developer friendly.
128
00:10:44,352 --> 00:10:45,302
yeah.
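To make the "one hub" idea above concrete, here is a toy sketch of posts grouped per country and looked up when a country is selected. The data and function names are illustrative only; this is not the tracker's actual code.

```python
# Illustrative only: grouping posts per country and looking them up on "click".
posts_by_country = {
    "Australia": ["Post: voluntary AI safety standard", "Post: privacy reform and AI"],
    "China": ["Post: interim measures for generative AI"],
}

def on_country_click(country: str) -> list[str]:
    # In a real site this would render the relevant LinkedIn posts or country summary.
    return posts_by_country.get(country, ["No updates tracked yet"])

print(on_country_click("China"))
```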
129
00:10:46,103 --> 00:10:51,673
Yeah, so does your firm leverage the research that you've compiled?
130
00:10:53,471 --> 00:10:55,052
not, not directly.
131
00:10:55,052 --> 00:10:58,363
I think this is something that I do like in my own personal time.
132
00:10:58,363 --> 00:11:09,760
Um, and it's for everyone around the industry, but it's mostly targeted at, let's say, you
know, small businesses, academics, researchers, and developers, especially because there's now a new
133
00:11:09,760 --> 00:11:16,133
API that developers can now build their apps on top of to sort of run their own monitoring
tools, whatever.
134
00:11:16,133 --> 00:11:20,606
So it's more targeted at that sort of grassroots smaller end.
135
00:11:20,606 --> 00:11:21,386
Yeah.
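Since Ray mentions an API that developers can build monitoring tools on top of, here is a hedged sketch of what such a client might look like. The base URL, endpoint, parameters, and response fields are all assumptions for illustration; they are not the tracker's documented API.

```python
# Hypothetical client sketch - the endpoint, params and fields below are assumed, not documented.
import requests

BASE_URL = "https://example.com/ai-regulation-tracker/api"  # placeholder URL

def latest_updates(country_code: str) -> list[dict]:
    # Assumed endpoint: GET /updates?country=<ISO code>
    resp = requests.get(f"{BASE_URL}/updates", params={"country": country_code}, timeout=10)
    resp.raise_for_status()
    return resp.json()

# A small monitoring tool might poll this daily and alert on new entries.
for item in latest_updates("AU"):
    print(item.get("date"), item.get("title"))
```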
136
00:11:21,985 --> 00:11:24,890
Do you have any plans for this?
137
00:11:24,890 --> 00:11:31,601
to get funding or, you know, either through a grant or private funding to help get some
help with it?
138
00:11:31,601 --> 00:11:32,663
This sounds like a lot of work.
139
00:11:32,663 --> 00:11:34,096
uh
140
00:11:34,096 --> 00:11:40,189
yeah, it's um, it's it sounds like work, but it's actually like, it's not that much work.
141
00:11:40,189 --> 00:11:48,042
Because I progressively update it every day, it only takes like five minutes of my
time each day just to, like, monitor updates.
142
00:11:48,042 --> 00:11:52,024
I've got a lot of tools in the background to help curate news items for me.
143
00:11:52,024 --> 00:11:54,395
So it's not a lot of work per day.
144
00:11:54,395 --> 00:11:55,865
But when you put it all together,
145
00:11:55,865 --> 00:11:57,816
it sounds like quite a lot.
146
00:11:58,036 --> 00:11:59,637
in terms of your other question.
147
00:11:59,637 --> 00:12:00,207
Yeah, sure.
148
00:12:00,207 --> 00:12:02,274
Like I'm always on the, you know,
149
00:12:02,274 --> 00:12:06,139
on the lookout for opportunities, I'm also keeping myself open-minded.
150
00:12:06,139 --> 00:12:08,842
I'm also not desperate for it.
151
00:12:08,842 --> 00:12:15,029
It's something that's nice to have at end of the day for me, just to learn about the
world.
152
00:12:15,029 --> 00:12:16,230
eh
153
00:12:16,811 --> 00:12:17,431
Interesting.
154
00:12:17,431 --> 00:12:22,493
Yeah, and your LinkedIn posts have gotten a lot of traction.
155
00:12:22,493 --> 00:12:24,103
I mean, you're on the other side of the world.
156
00:12:24,103 --> 00:12:25,814
um And I've seen your stuff.
157
00:12:25,814 --> 00:12:31,505
You're a Top Voice, which, uh, you know, that's a hard designation to get.
158
00:12:31,505 --> 00:12:36,367
um You really have to put in some effort and some work around that.
159
00:12:36,547 --> 00:12:40,118
Well, you've had, so we're recording this in the beginning of August.
160
00:12:40,118 --> 00:12:43,831
This will probably come out towards the later part of the month.
161
00:12:43,831 --> 00:12:46,762
But you've had a busy week or so, right?
162
00:12:46,762 --> 00:12:57,545
Because we've had major developments with the Trump administration announcing their AI
action plan and then a very quick turnaround on a China response.
163
00:12:57,545 --> 00:13:01,086
um what is your, yeah.
164
00:13:01,086 --> 00:13:04,766
And you had some great posts that I thought were interesting.
165
00:13:05,427 --> 00:13:12,969
Specifically on the China side. There's a lot on the U.S. side too, but let's start
with China first.
166
00:13:12,969 --> 00:13:13,889
So.
167
00:13:13,985 --> 00:13:20,831
You kind of zeroed in on some nuance around language um with the China plan.
168
00:13:20,831 --> 00:13:26,785
Like, and if I read your post correctly, it was like 95 % of this is not new.
169
00:13:26,785 --> 00:13:30,128
Um, but the 5 % that is, is interesting.
170
00:13:30,128 --> 00:13:36,546
Um, tell us what your kind of take is on the China piece first.
171
00:13:36,546 --> 00:13:38,967
Yeah, yeah, yeah, of course.
172
00:13:39,347 --> 00:13:46,190
I guess I'll just first lay out sort of the macro context behind China's AI policy
thinking.
173
00:13:46,190 --> 00:13:57,074
And this is based on, you know, both research and also being in the country talking to,
like, the big tech companies that are driving this sort of change.
174
00:13:57,655 --> 00:14:05,378
So I think really it comes down to one or two things, which is China has a very strong
push for
175
00:14:05,496 --> 00:14:08,248
what they call like, know, sovereignty in AI.
176
00:14:08,248 --> 00:14:11,450
So being self-sufficient in the full stack.
177
00:14:11,811 --> 00:14:23,639
And part of this has been driven because of the pressure from US export controls, limiting
access to the necessary chips that are required to build really advanced AI systems.
178
00:14:23,639 --> 00:14:33,696
So this whole central theme around self-sufficiency, having control of the full stack,
that is like the major theme and that sort of...
179
00:14:33,814 --> 00:14:44,180
manifests itself in other smaller sub-themes around, you know, which industries will
require investment, and trade policies and all that.
180
00:14:44,380 --> 00:14:53,085
And the second thing is like China is also trying to uh lead in standards, especially for
the global South.
181
00:14:53,325 --> 00:15:01,934
It's all part of, like, the whole BRICS initiative, all part of the Belt and Road
project, which has been ongoing for a decade already.
182
00:15:01,934 --> 00:15:14,214
So there's also that mindset to really set the standard, because standards are
important. If you think about the internet, the internet's built on US-led
183
00:15:14,214 --> 00:15:21,534
standards and that has given the US a lot of leverage over how the internet ecosystem
should operate.
184
00:15:21,594 --> 00:15:29,094
And it's one of those things where it's a hugely contested front and actually it has a
huge role in geopolitics.
185
00:15:29,094 --> 00:15:30,766
So when it comes to the new
186
00:15:30,766 --> 00:15:40,326
breakthrough technology like AI being like the next, the current general purpose
technology that is as big or even bigger than the internet, yeah, obviously that's where
187
00:15:40,326 --> 00:15:44,046
countries start thinking about, okay, let's be the ones to set the standard.
188
00:15:44,046 --> 00:15:46,446
So that's the macro context in mind.
189
00:15:46,446 --> 00:16:00,014
And so when the AI action plan came out from China, and to be accurate, when we translate
it into English, it's called, like, the AI action plan, but in the actual Chinese
190
00:16:00,014 --> 00:16:04,294
text, it's actually called the global AI governance sort of action plan.
191
00:16:04,294 --> 00:16:10,574
So it's like a, it has a global sort of mindset embedded into it.
192
00:16:10,574 --> 00:16:24,014
And the 95%, which I said was not new, that's basically the sort of stuff that we've seen
in previous papers and also in government representative speeches around, know, as I said,
193
00:16:24,014 --> 00:16:28,654
securing the full stack, investing in green and sustainable ways of
194
00:16:28,654 --> 00:16:31,415
powering models, all that sort of stuff.
195
00:16:31,735 --> 00:16:39,858
The 5% which I thought was interesting and highlighted was around open source.
196
00:16:40,259 --> 00:16:49,863
So I think when I say open source in China, people often think of DeepSeek, which that is
really the big um milestone that we saw.
197
00:16:49,863 --> 00:16:58,712
um So, but before DeepSeek, it has always been like this sort of uh strategy of
198
00:16:58,712 --> 00:17:01,704
tech giants to release open source products.
199
00:17:01,945 --> 00:17:12,424
And there are a lot of reasons why open source is such a huge theme in China, but I think
fundamentally it's because the internal domestic competition is so
200
00:17:12,424 --> 00:17:13,434
fierce.
201
00:17:13,695 --> 00:17:21,061
People often talk about the US and China competition as, like, the first layer of
competition, but actually, in China,
202
00:17:21,441 --> 00:17:26,958
companies care more about the competition with their next-door neighbor, which is, like, the
domestic
203
00:17:26,958 --> 00:17:36,220
competition. And it's so fierce, there is sort of a race to the bottom in terms of who can
produce the highest quality model for the lowest price.
204
00:17:36,501 --> 00:17:45,863
And initially there was a race towards, like, who can provide the lowest API options, until
some of the big companies were like, actually, why not just open source it?
205
00:17:45,863 --> 00:17:49,284
That's that's technically zero, zero dollars for free.
206
00:17:49,284 --> 00:17:54,866
So you basically beat everyone on the on the price front and just provide a very powerful
model.
207
00:17:54,866 --> 00:17:56,846
And so already for like
208
00:17:56,846 --> 00:18:07,046
For a year or two, all the big tech companies in China were releasing their own open source
AI models. What made DeepSeek quite special is that it found, like, new ways to make the
209
00:18:07,046 --> 00:18:15,906
training process even cheaper. And that caused a lot of headlines in the West, and a lot of
attention has been brought to DeepSeek, even though it's only a
210
00:18:15,906 --> 00:18:25,336
small part of the bigger open source picture. But even though open source has been a thing
for a long time, what was quite different about this plan was
211
00:18:25,336 --> 00:18:30,059
that its choice of language was uh very selective.
212
00:18:30,059 --> 00:18:38,054
And when it comes to Chinese policies, like the language itself probably says more about
the story than the actual message.
213
00:18:38,375 --> 00:18:44,318
Certain words are selected to convey a certain sentiment.
214
00:18:44,419 --> 00:18:54,115
And one of the things to look out for is which phrases are being repeated, like which
mantras and which combination of words are repeated throughout the paper.
215
00:18:54,115 --> 00:18:55,118
That's often the
216
00:18:55,118 --> 00:18:58,899
indicative of government thinking.
217
00:18:59,259 --> 00:19:05,731
like for like I say, for the past year, let me just bring out my notes.
218
00:19:05,731 --> 00:19:20,145
For the past year, there was, like, a, um, particular phrasing that the government would
use, and it was, to translate directly into English, you know, uh, safe, reliable and
219
00:19:20,145 --> 00:19:20,925
controllable.
220
00:19:20,925 --> 00:19:25,280
So these are the, that's a typical trio of words that we see
221
00:19:25,280 --> 00:19:28,351
in speeches and it's a very systems focused view.
222
00:19:28,351 --> 00:19:35,134
So it's all about trying to make any particular use case safe, reliable, controllable.
223
00:19:35,134 --> 00:19:48,820
But since then we've started to see the repetitive slogan expanding into broader terms, to
what we now call, again in direct translation, an inclusive, open, sustainable, fair, secure
224
00:19:48,820 --> 00:19:52,221
and reliable, digital and intelligent future for all.
225
00:19:52,221 --> 00:19:55,212
So it's a mouthful when I say in English, but it's only like eight.
226
00:19:55,212 --> 00:19:56,583
eight characters in Chinese, right?
227
00:19:56,583 --> 00:20:04,948
So, that stuff repeats a lot throughout the policy, and there's a much more global rather
than system-specific focus.
228
00:20:05,329 --> 00:20:08,201
And how that relates to open source?
229
00:20:08,201 --> 00:20:23,471
Well, in the actual paragraph that mentions open source: in English, we call it open
source, but in Chinese, it's technically open sharing of resources. And nowhere in the
230
00:20:23,471 --> 00:20:24,922
actual text
231
00:20:25,228 --> 00:20:31,380
have I seen the words open source code, open source software, or open source models?
232
00:20:31,641 --> 00:20:40,164
Now, again, in English, when we say open source, we tend to mean that one thing, which is,
you know, putting your stuff on GitHub: everyone can see the code and you can download it.
233
00:20:40,164 --> 00:20:46,347
But in Chinese, open source has so many different ways of expressing that one concept.
234
00:20:46,347 --> 00:20:52,830
And if you wanna talk about open source models, open source code, there's an actual
literal direct way of saying that.
235
00:20:52,830 --> 00:20:54,094
So it's not a drafting oversight.
236
00:20:54,094 --> 00:20:59,414
I don't think it's a drafting oversight, because there are so many different ways of
expressing that one thing.
237
00:20:59,414 --> 00:21:07,574
There's got to be some conscious effort behind why it's only open sharing of resources
compared to, let's say, open source models or code.
238
00:21:07,574 --> 00:21:11,914
And as I said, when it comes to Chinese policies, you have to read into the language.
239
00:21:11,914 --> 00:21:15,234
it's not, I'm not, I don't think I'm reading too deeply into it.
240
00:21:15,234 --> 00:21:17,774
I think it's meant to be read in that way.
241
00:21:17,994 --> 00:21:22,402
And taking that interpretation, if we're only talking about sharing,
242
00:21:22,402 --> 00:21:29,448
tech documentation, manuals, like the surface layer documentation instead of the actual
code.
243
00:21:29,549 --> 00:21:42,019
What I'm thinking is that this is such a clever policy balance by China to sort of
influence global standards, but also keeping the secret sauce back at home, which is a
244
00:21:42,340 --> 00:21:44,412
very subtle and clever sort of balance.
245
00:21:44,412 --> 00:21:46,734
So that's what I noticed in this policy.
246
00:21:46,734 --> 00:21:49,046
And again, it will take another few...
247
00:21:49,058 --> 00:21:53,220
policies or papers in the future to see if that message is being reinforced.
248
00:21:53,220 --> 00:22:03,316
But until then, this is like the first one that I think might be that slight pivot towards
that selective open sharing technique.
249
00:22:03,351 --> 00:22:11,103
You know, what's interesting is, for us in the West, like, open China is like an oxymoron.
250
00:22:11,103 --> 00:22:14,224
We don't think of China and open anything.
251
00:22:14,224 --> 00:22:21,226
We think of very, you know, closed, controlled, um, not open.
252
00:22:21,466 --> 00:22:31,409
And, this seems like a, again, from a Westerner standpoint, it seems like a departure from
what we would expect.
253
00:22:31,427 --> 00:22:47,289
from China, with, again, the Great Firewall of China, and just how, um, there have also
been, uh, issues around intellectual property rights within China.
254
00:22:47,289 --> 00:22:57,937
um I guess, again, from a Westerner standpoint, it seems surprising that China wants to
have an open policy.
255
00:22:58,079 --> 00:22:58,891
around this.
256
00:22:58,891 --> 00:23:01,518
oh What about on your side of the globe?
257
00:23:01,518 --> 00:23:07,311
Is this surprising, or does this line up exactly with the direction you thought they'd head?
258
00:23:09,002 --> 00:23:20,820
For me, it's not surprising, because I think the commercial drivers sort of explain the
story of why there's a drive towards, um, open source in the, say, Western definition
259
00:23:20,820 --> 00:23:21,331
standpoint.
260
00:23:21,331 --> 00:23:28,566
As I said, the domestic competition is already so fierce that really the only real way to
stand out is to be open source.
261
00:23:28,566 --> 00:23:37,688
And actually, if you talk to local developers in China, if an AI company doesn't have an
open source version of their product, they're not going to be considered
262
00:23:37,688 --> 00:23:45,975
by the developers in the tech stack because even if you don't use the open source tool,
it's sort of like a fashion statement, right?
263
00:23:45,975 --> 00:23:53,232
Saying that, okay, like we're doing open source so that we know that we are within the top
band of the market.
264
00:23:53,232 --> 00:23:56,295
If you don't do open source, it's like, what are you hiding, right?
265
00:23:56,295 --> 00:23:59,468
That's sort of the suspicion that you get from the local developer base.
266
00:23:59,468 --> 00:24:03,411
So I think in China, open source is the market expectation.
267
00:24:03,411 --> 00:24:04,742
It's the market standard.
268
00:24:04,974 --> 00:24:08,715
um Unlike in the West, I think there is a slight pivot today.
269
00:24:08,715 --> 00:24:14,137
I think especially OpenAI doing open weights now, but I think in China is a different
story.
270
00:24:14,137 --> 00:24:22,674
yeah, it's just a matter of like, so open source is always gonna be the direction.
271
00:24:22,674 --> 00:24:26,380
I think the question is to what extent is it open?
272
00:24:26,380 --> 00:24:34,478
And that's where you get really specific with is it open source documents or code or
weights or models or the whole thing.
273
00:24:34,478 --> 00:24:40,698
I think that's the question that China over there is still figuring out from a policy
perspective.
274
00:24:41,017 --> 00:24:41,877
Interesting.
275
00:24:41,877 --> 00:24:53,197
And I don't know if this is different in China, but here in the U.S., the lawmakers
don't have a clue, um, about the tech.
276
00:24:53,197 --> 00:25:04,437
I mean, um, until recently, Donald Trump didn't know who, uh, Jensen Huang was. And,
yeah, a $4 trillion company.
277
00:25:04,437 --> 00:25:10,577
And, uh, our president didn't know who the CEO was, um, by his own admission.
278
00:25:10,701 --> 00:25:24,731
And apparently there's now a lot of dialogue going on, and he's surrounded himself with
advisors like David Sacks and people who really understand the technology. And if we
279
00:25:24,731 --> 00:25:30,705
can pivot to the U S for a minute, um, the commentary I've heard, and I haven't read the
entire thing.
280
00:25:30,705 --> 00:25:39,120
I've read excerpts, but on the U S side, it sounds like it's written from an informed
perspective.
281
00:25:39,121 --> 00:25:40,041
So.
282
00:25:40,178 --> 00:25:56,952
Whether or not you agree with it, 'cause there are some, uh, controversial words in the
U.S. policy around, you know, control over kind of the tone and
283
00:25:56,972 --> 00:26:05,739
wokeness and DEI and all those sorts of things that are very politically charged, um,
topics of conversation here in the U S.
284
00:26:05,739 --> 00:26:08,241
But what was your take on
285
00:26:08,267 --> 00:26:15,319
And if I recall correctly, the U.S. action plan came out and China's came out
within 48 hours.
286
00:26:15,319 --> 00:26:16,632
was boom, boom.
287
00:26:16,632 --> 00:26:22,241
Um, but what is your take on kind of the U.S.'s action plan on AI?
288
00:26:24,428 --> 00:26:29,402
Yeah, I wasn't surprised by the action points in the plan.
289
00:26:29,402 --> 00:26:36,158
I can give credit in the sense that they've been consistent in what they're going to do.
290
00:26:36,158 --> 00:26:44,615
just a quick recap, the action plan is a lot, but it's been manifested through three key
executive orders.
291
00:26:44,615 --> 00:26:49,659
So the first order is around energy infrastructure, promoting that.
292
00:26:49,659 --> 00:26:53,622
Second order is around the export controls layer.
293
00:26:54,100 --> 00:27:05,306
and the third order is really around what they call trying to regulate, well, not trying to
regulate, but ensuring that language models do not generate, quote unquote, woke or biased
294
00:27:05,306 --> 00:27:06,017
material.
295
00:27:06,017 --> 00:27:11,079
So these are the three uh key themes of the action plan.
296
00:27:11,079 --> 00:27:15,882
And it's not a complete surprise that these were covered.
297
00:27:15,882 --> 00:27:23,626
I think it's been a consistent policy of the government since the change of government in
298
00:27:23,626 --> 00:27:24,927
start of the year.
299
00:27:25,888 --> 00:27:35,636
For me, my focus because I'm more interested in the global dynamics, the second one was
the most interesting for me, which is around the export controls.
300
00:27:36,077 --> 00:27:45,464
And so ever since the Biden administration, the US has been tightening export controls
around chips.
301
00:27:45,505 --> 00:27:47,520
And to get even more specific,
302
00:27:47,520 --> 00:27:52,643
Initially, there were only restrictions around just the actual final chips themselves.
303
00:27:52,643 --> 00:27:58,007
So the final product before it's being shipped to select countries.
304
00:27:58,007 --> 00:28:08,103
The export controls really target, like, China, Russia, Iran, and other sort of, like,
what the US sees as the competitor nations.
305
00:28:08,925 --> 00:28:17,390
But what I found really interesting in the action plan, and it's only a small
paragraph within the actual plan, is that it mentions
306
00:28:17,390 --> 00:28:32,190
the government requiring the DOC to really look into targeted export controls around
semiconductor manufacturing components.
307
00:28:33,070 --> 00:28:36,670
So if you think about the full stack, there is the actual chip itself.
308
00:28:36,670 --> 00:28:42,270
take your Nvidia chip or AMD chip, the final full package.
309
00:28:42,270 --> 00:28:44,470
Within that, there's a lot of different components.
310
00:28:45,050 --> 00:28:54,277
So existing controls target both the final one and also recently the components within
that chip because there's been a growing sort of recognition within government that,
311
00:28:54,277 --> 00:29:04,003
actually maybe the chip is designed and exported from US, but all the different parts are
being imported across the world, right?
312
00:29:04,003 --> 00:29:08,166
And so you've got to mostly make sure you're accountable for these different sub
components.
313
00:29:08,666 --> 00:29:14,600
But now they're looking further up the value chain as well, which is the actual manufacturing
314
00:29:15,018 --> 00:29:15,629
of the chips.
315
00:29:15,629 --> 00:29:19,622
So there are already existing controls on manufacturing equipment.
316
00:29:19,622 --> 00:29:23,025
So we're talking about these big lithographic machines.
317
00:29:23,025 --> 00:29:35,276
So just a quick, uh quick explainer, like how these chips are built, like they're very
small, like it's, basically a very intricate design on a silicon wafer, right?
318
00:29:35,417 --> 00:29:36,708
But it's really small.
319
00:29:36,708 --> 00:29:40,846
It's so small, like, it's measured in nanometers, right?
320
00:29:40,846 --> 00:29:42,146
How the heck do people do that?
321
00:29:42,146 --> 00:29:52,006
Well, it's all done because there's, like, this big machine whose sole purpose is to fire a
very specific beam of light through a bunch of mirrors.
322
00:29:52,006 --> 00:29:59,026
And that's what carves out these very small designs on a very small piece of silicon at a
nanometer sort of level.
323
00:29:59,026 --> 00:30:00,786
That's how these things are created.
324
00:30:00,786 --> 00:30:10,694
Now, these big machines, like these lithographic machines, can only be built by one
company, which is ASML, a company based in the Netherlands, right?
325
00:30:11,042 --> 00:30:15,124
and that stuff's been bought by companies across the world to build these chips.
326
00:30:15,124 --> 00:30:24,932
So we're talking about the final big machine, but that machine itself, I think reports
have said it's built from 700,000 components.
327
00:30:24,932 --> 00:30:31,777
They have components from Germany, from Spain, from France, even from China, from like
Southeast Asia, everywhere.
328
00:30:31,777 --> 00:30:36,160
We're talking about the most complex supply chain in history.
329
00:30:36,300 --> 00:30:38,894
And I reckon it builds the...
330
00:30:38,894 --> 00:30:42,254
I reckon it's in the Guinness World Records or something for that complexity.
331
00:30:42,254 --> 00:30:52,374
But anyways, the action plan is considering not just targeting semiconductor manufacturing
equipment, but also components within that manufacturing equipment.
332
00:30:52,374 --> 00:31:03,894
Now it might just be two words on the paper, but those two words could have massive
complications because if we're gonna export control components within that big machine,
333
00:31:04,074 --> 00:31:06,654
that could potentially cover the whole world.
334
00:31:06,654 --> 00:31:08,258
Like these export controls could
335
00:31:08,258 --> 00:31:14,861
basically target every single supplier that pitches in into that one big machine.
336
00:31:14,962 --> 00:31:24,057
And it's really hard to evaluate like what's the actual specific impact, but all I know is
that it's gonna make supply chains really complicated.
337
00:31:24,057 --> 00:31:33,882
And a lot of the cost, the supply-side inflation that's happening in the world right now,
is partly due to oil prices, but there's also a lot due to just chip prices generally
338
00:31:33,882 --> 00:31:37,502
making the cost of anything tech, any
339
00:31:37,502 --> 00:31:51,548
any good basically that's digitized. They're already quite expensive on their own just from
the current, um, export controls. But with further controls around
340
00:31:51,548 --> 00:32:02,793
the manufacturing components, then yeah, we're going to have to brace ourselves for
that. I'm sure, like, again, the action plan is only indicating that this is an area that the
341
00:32:02,793 --> 00:32:06,092
relevant departments have to look at. So just to be clear, it
342
00:32:06,092 --> 00:32:10,036
is not a direct action, it's just telling the agencies to look into that question.
343
00:32:10,036 --> 00:32:15,463
So I'm sure there'll be a lot of expert analysis into that, but that's one thing just to
get your heads up for.
344
00:32:15,463 --> 00:32:16,693
Yeah.
345
00:32:17,725 --> 00:32:24,909
And speaking of export controls, so I banged on DeepSeek a little bit and lately Kimi.
346
00:32:25,129 --> 00:32:34,974
And uh I'm really impressed with, I was really impressed with DeepSeek uh R1 when it came
out and I'm really impressed with Kimi.
347
00:32:35,895 --> 00:32:47,437
Are these export controls maybe having an unintended consequence of forcing these uh
countries that have constraints?
348
00:32:47,437 --> 00:32:57,887
to become more efficient and creative and engineer more uh interesting solutions to these
problems?
349
00:32:57,887 --> 00:33:00,311
Is it having that unintended consequence?
350
00:33:00,856 --> 00:33:09,814
Yeah, it's kind of like the whole, you know, the famous quote, like, you know,
necessity is the mother of invention, that sort of thinking.
351
00:33:09,814 --> 00:33:13,617
And actually I'll show you something in Chinese internet meme culture.
352
00:33:13,617 --> 00:33:23,742
Like, there's a pretty popular meme among Chinese netizens that Trump
and Biden are the founders of China, or
353
00:33:23,742 --> 00:33:28,015
that they're called the nation-building fathers of China.
354
00:33:28,015 --> 00:33:29,666
That's like the joke, right?
355
00:33:29,666 --> 00:33:40,012
Because the reason is that their export controls have pressured Chinese industries, have
limited their resources so much in a way that they just have to find new ways to build
356
00:33:40,012 --> 00:33:41,134
models.
357
00:33:41,134 --> 00:33:44,056
as you mentioned, Kimi, but also notably DeepSeek, right?
358
00:33:44,056 --> 00:33:47,978
So, just a quick rundown, like.
359
00:33:49,106 --> 00:33:57,950
The traditional thinking is that in order to build a powerful AI model, you need a lot of
labelled data and a lot of processing power to train on the labelled data.
360
00:33:57,950 --> 00:34:09,335
But DeepSeek, based on their paper, says that you can actually build a powerful model with
less labelled data, but a lot more from reinforcement learning.
361
00:34:09,335 --> 00:34:14,517
And reinforcement learning, the advantage is that you don't need labelled data to do
reinforcement learning.
362
00:34:14,517 --> 00:34:17,037
You have some to get it up to like a
363
00:34:17,037 --> 00:34:19,037
particular sort of like head start.
364
00:34:19,037 --> 00:34:28,279
But from there on, that's like the first 10%, but the rest, the 90%, the model just plays
by itself and just learns from its own mistakes and then reapplies them.
365
00:34:28,279 --> 00:34:32,260
then that's the beauty of reinforcement learning.
366
00:34:32,560 --> 00:34:38,601
And then apparently they were able to do it on like older like chips, like H20 chips.
367
00:34:38,601 --> 00:34:43,622
But I think that claim is still being tested by independent experts.
368
00:34:44,002 --> 00:34:55,434
That's an example of really stretching the boundaries of existing legacy tech and finding
new software layer, new algorithms to make the most out of your hardware components.
369
00:34:55,434 --> 00:34:59,878
So yeah, I'm sure there is that effect.
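A schematic sketch of the two-phase recipe described above: a small supervised head start, then reinforcement learning where the model answers prompts on its own, gets an automatic reward, and is updated from its own mistakes. The stub functions below are placeholders for illustration, not DeepSeek's actual code or any real library API.

```python
# Schematic only: the stubs below stand in for real training code.
import random

def supervised_finetune(model, labelled_set):
    return model  # stub: imagine gradient updates on a small labelled set

def sample_answer(model, prompt):
    return f"answer to {prompt}"  # stub: the model generates its own attempt

def reward_fn(prompt, answer):
    return 1.0 if "42" in answer else 0.0  # stub: an automatic correctness check

def policy_update(model, prompt, answer, reward):
    return model  # stub: reinforcement-learning update from the reward signal

def train_reasoning_model(base_model, small_labelled_set, prompts, steps=3):
    # Phase 1: a small supervised "head start" (roughly the first 10% described above)
    model = supervised_finetune(base_model, small_labelled_set)
    # Phase 2: reinforcement learning - no labels needed; the model "plays by itself"
    # and learns from the reward on its own answers.
    for _ in range(steps):
        prompt = random.choice(prompts)
        answer = sample_answer(model, prompt)
        reward = reward_fn(prompt, answer)
        model = policy_update(model, prompt, answer, reward)
    return model

train_reasoning_model("base-model", [("question", "answer")], ["What is 6 x 7?"])
```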
370
00:35:00,233 --> 00:35:15,821
Yeah, and, like, speaking of other policy effects, you know, didn't Meta recently
refuse to sign the EU's, uh, you know, code around their policies? Am I
371
00:35:15,821 --> 00:35:17,151
correct on that?
372
00:35:17,666 --> 00:35:22,049
Yeah, I think you're referring to the general purpose AI code.
373
00:35:22,049 --> 00:35:24,290
Yeah, it's been uh a...
374
00:35:24,891 --> 00:35:35,638
It's been one of those hotly contentious policy documents in the industry and even caused
some sort of division among the big tech companies themselves.
375
00:35:35,638 --> 00:35:39,391
I think it's all been finalized since last week.
376
00:35:39,391 --> 00:35:41,744
So I think there is a list of signatories.
377
00:35:41,744 --> 00:35:43,344
You can see who signed and who's not.
378
00:35:43,344 --> 00:35:47,316
But yeah, I think in the weeks lead up to it, yeah, certain...
379
00:35:47,448 --> 00:35:50,254
Companies have said they'll sign on, some say they won't.
380
00:35:50,254 --> 00:35:51,345
Yeah.
381
00:35:51,928 --> 00:35:52,598
Yeah.
382
00:35:52,598 --> 00:36:04,093
And you know, I think another tone from the U S action plan is there's going to be very
little regard for, um, environmental concerns.
383
00:36:04,093 --> 00:36:06,444
It's, you know, build baby build.
384
00:36:06,544 --> 00:36:17,158
And, you know, again, this seems a little bit like a, uh, flipping of
the script. You know, um, historically the picture has been painted that China has had
385
00:36:17,158 --> 00:36:18,849
less regard for,
386
00:36:18,881 --> 00:36:26,005
you know, green initiatives, and really the West has been putting that more in focus.
387
00:36:26,005 --> 00:36:38,672
And it seems to me the tone of, you know, this administration's policy is that that is
way in the backseat, maybe in the trunk.
388
00:36:38,672 --> 00:36:46,557
uh First and foremost is about establishing global dominance around AI and maintaining the
lead.
389
00:36:46,650 --> 00:36:48,657
Is that the way you read it as well?
390
00:36:50,082 --> 00:36:51,543
Yeah, interesting.
391
00:36:52,044 --> 00:37:05,595
It depends on how you define lead, because I hear a lot of commentary around the whole AI
race and who's leading what, but I personally find it's a very simplified view
392
00:37:05,595 --> 00:37:09,358
of how this whole ecosystem works.
393
00:37:09,358 --> 00:37:12,600
First of all, you have to divide it within like layers, right?
394
00:37:12,841 --> 00:37:15,543
And it depends on which layer you're looking at.
395
00:37:15,543 --> 00:37:18,744
So at the app layer, I'd say like,
396
00:37:18,744 --> 00:37:27,339
both China and US have equally diverse and widely used apps at the application layer.
397
00:37:27,739 --> 00:37:38,956
And as you go deeper within the stack, so I think, maybe just to really simplify, at the
first layer it's really a question around diversity and who has the most diverse and
398
00:37:38,956 --> 00:37:40,867
used ecosystem.
399
00:37:40,867 --> 00:37:42,508
Then we get to the model layer.
400
00:37:42,508 --> 00:37:45,720
That's okay, that's where I can see the race concept
401
00:37:45,720 --> 00:37:51,354
being true because it's really a race to who can build the smallest and cheapest model.
402
00:37:51,354 --> 00:37:54,916
That's really what I see the race as, and there are different approaches, right?
403
00:37:54,916 --> 00:38:06,643
So from a US perspective, it's really driven by the private sector, and also the
internal competition, trying to produce the cheapest sort of APIs for powerful
404
00:38:06,643 --> 00:38:07,250
models.
405
00:38:07,250 --> 00:38:12,086
Whereas in China, it's also that domestic competition, at an open source level.
406
00:38:13,076 --> 00:38:19,846
At the infrastructure hardware chips layer, I didn't really see it as a race.
407
00:38:19,846 --> 00:38:21,218
It's more like...
408
00:38:23,062 --> 00:38:27,024
you choose your own adventure to building self-sufficiency.
409
00:38:27,024 --> 00:38:28,485
That's how I see it.
410
00:38:28,845 --> 00:38:35,509
And you can either do it from a constructive or deconstructive uh approach.
411
00:38:35,509 --> 00:38:44,133
Constructive being, so when I say that, constructive means, for example, subsidies,
government investments, promoting trade.
412
00:38:44,174 --> 00:38:46,715
So stuff that kind of helps grow.
413
00:38:46,775 --> 00:38:52,608
And deconstructive, is export controls, tariffs, or other um
414
00:38:52,608 --> 00:38:57,741
anti-free trade policies that try to stifle what your competitors are doing.
415
00:38:57,741 --> 00:39:02,164
And it's not that you can only be constructive and can't be deconstructive.
416
00:39:02,164 --> 00:39:03,915
You're gonna have a balance between those two, right?
417
00:39:03,915 --> 00:39:05,576
That's how policy works, right?
418
00:39:05,576 --> 00:39:16,732
So I think that layer, it's really, again, choose your own adventure, but your policy mix
depends on your current country circumstances.
419
00:39:16,932 --> 00:39:18,033
So that's how I see it.
420
00:39:18,033 --> 00:39:20,134
um
421
00:39:21,230 --> 00:39:34,701
In terms of more broadly, I think it is true since 2022 and the whole generative AI wave.
I think before that, the whole AI policy debate was really around just the
422
00:39:34,701 --> 00:39:38,705
typical safety and reliability, all that stuff.
423
00:39:38,705 --> 00:39:43,708
Since 2022, AI has become a more geopolitical topic.
424
00:39:43,869 --> 00:39:50,254
And so the idea of like, so as I said, the idea of leading is more around
425
00:39:50,444 --> 00:39:57,056
establishing, I think like just who has more influence on standards.
426
00:39:57,056 --> 00:40:03,288
I think that's one specific angle that I can see where there's that strong competition, as
I said before.
427
00:40:03,288 --> 00:40:07,719
Once you set the standards, your whole ecosystem becomes sticky.
428
00:40:07,719 --> 00:40:12,750
And when your system becomes sticky, people have to use it, revenue comes in, your GDP
booms.
429
00:40:12,750 --> 00:40:16,802
That's like, that's how, I think that's sort of the more long meta strategy.
430
00:40:16,802 --> 00:40:18,442
So I see that.
431
00:40:18,994 --> 00:40:24,198
And you also mentioned the whole green and uh energy thing.
432
00:40:24,499 --> 00:40:28,302
That's also a big part of it because AI consumes a lot of power.
433
00:40:28,582 --> 00:40:38,431
I think this is also where it's important not to see AI policy in isolation, but how it
interconnects with every other domestic policy of a country.
434
00:40:38,431 --> 00:40:47,638
So AI crosses over into energy policy, also crosses over into land policy, because the
amount of land you have to dedicate to data centers.
435
00:40:47,680 --> 00:40:49,622
it crosses over into like tax.
436
00:40:49,622 --> 00:40:50,933
That's a huge thing, right?
437
00:40:50,933 --> 00:40:54,316
Tax incentive and all that to incentivize development.
438
00:40:54,316 --> 00:40:56,670
You have to see all this in one big picture.
439
00:40:56,670 --> 00:41:07,647
I think where it's true, at least from the objective stats, is that China does have a huge
head start in this space because they have a lot of capacity, like in terms of electric
440
00:41:07,647 --> 00:41:14,172
generation, there's a lot of land that's still underdeveloped that can be turned
into power plants.
441
00:41:14,172 --> 00:41:16,802
There is a stronger central
442
00:41:16,802 --> 00:41:19,783
government push in energy.
443
00:41:20,083 --> 00:41:23,985
That's been quite a, it's been consistent for many, many years.
444
00:41:24,345 --> 00:41:32,608
The green tech industry over there has a lot of state support and also a lot of private
sector activity.
445
00:41:32,729 --> 00:41:34,780
It's also a very popular STEM subject.
446
00:41:34,780 --> 00:41:40,552
We talked to Chinese developers, a lot of them want to go into energy tech as their
engineering field.
447
00:41:40,552 --> 00:41:45,454
So there's that sort of capacity that's all there, that they're just making use of that.
448
00:41:45,454 --> 00:41:47,634
and part of it's going towards AI.
449
00:41:47,634 --> 00:41:51,114
And the US is also now focusing the same efforts now.
450
00:41:51,114 --> 00:41:56,034
I think it's just a good, I think it's a consistent challenge with the West around energy.
451
00:41:56,774 --> 00:42:05,434
There's a lot of debate around which sources you have to use, and each
particular energy source has its own big policy debate.
452
00:42:05,434 --> 00:42:09,514
But I think in China, they sort of have, they sort of just have that one set.
453
00:42:09,514 --> 00:42:15,318
Like they just go with that one source and then go, so not one source, they go with a
certain mix of sources.
454
00:42:15,318 --> 00:42:16,210
And just run with that.
455
00:42:16,210 --> 00:42:21,970
They kind of skip the whole policy debate in the beginning, just go straight to
implementation.
456
00:42:22,541 --> 00:42:22,831
Yeah.
457
00:42:22,831 --> 00:42:27,993
And then, we're almost out of time, and we could talk about this
stuff all day.
458
00:42:27,993 --> 00:42:33,806
Um, I geek out on, you know, AI in general, but bringing it back to legal.
459
00:42:34,026 --> 00:42:48,332
So what do you envision, and, you know, on what sort of timeline do you see legal
work and the resources, the inputs, which are mostly human capital today?
460
00:42:48,432 --> 00:42:51,157
When do you see that being disrupted?
461
00:42:51,157 --> 00:43:01,384
Where will we see material impact to law firm revenue, law firm headcount, in-house,
uh, counsel processes?
462
00:43:01,384 --> 00:43:06,147
Like today, there's a lot of experimentation and I think there is some impact.
463
00:43:06,147 --> 00:43:10,349
But when do you see real disruption taking place in the legal space?
464
00:43:11,086 --> 00:43:24,766
Yeah, I could draw a graph here, but I feel like it's just a function of the more
standardized and the lower risk value of the work, the more prone it is to automation.
465
00:43:25,006 --> 00:43:37,726
And not just AI automation, but just any form of automation, like even your traditional
boring, algorithmic sort of if-else statements, that stuff can also act as automation.
466
00:43:38,574 --> 00:43:40,794
As long as it's standardized and low risk.
467
00:43:40,794 --> 00:43:53,474
the reason why I say that is because obviously standardized is a consistent process that's
really easy to encode into code and low risk being that if something goes wrong, the loss,
468
00:43:53,474 --> 00:43:55,954
the chance of harm is still going to be quite low.
469
00:43:55,954 --> 00:44:04,374
And there's, and it's also one of those like if something goes wrong, it's still easy or
still practical for someone to just jump in and fix things.
470
00:44:04,814 --> 00:44:05,966
like, yeah.
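To make the point above concrete, that standardized, low-risk work can be automated even with plain if-else rules, here is a toy triage sketch. The task, thresholds, and jurisdictions are invented for illustration only; this is not anyone's real workflow or legal advice.

```python
# Toy illustration of traditional if-else automation for a standardized, low-risk task.
def triage_nda(term_years: int, liability_cap: float, governing_law: str) -> str:
    # Standardized inputs make the rules easy to encode; anything unusual goes to a
    # human, which is what keeps the automation low risk.
    if governing_law not in {"NSW", "VIC", "England and Wales"}:
        return "escalate to lawyer"
    if term_years <= 3 and liability_cap <= 1_000_000:
        return "auto-approve on standard template"
    return "escalate to lawyer"

print(triage_nda(2, 500_000, "NSW"))  # -> auto-approve on standard template
```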
471
00:44:05,966 --> 00:44:16,786
Usual suspects that people talk about as like, you know, low value real estate
transactions, like mortgages, conveyancing, that's to the extent that's still done by
472
00:44:16,786 --> 00:44:17,866
lawyers.
473
00:44:18,606 --> 00:44:24,326
Some aspects of like loan, like loan contracts, equity, that's like all stock standard
terms.
474
00:44:24,786 --> 00:44:30,086
Certain tech contracts, software contracts, again, that's anything that's got to do stock
standard terms.
475
00:44:30,086 --> 00:44:31,086
Yeah, definitely.
476
00:44:31,086 --> 00:44:34,866
I mean, that's like the primary area a lot of these legal tech startups are targeting.
477
00:44:36,300 --> 00:44:41,353
I'd say that's something that's already been in process for the past four, five
years.
478
00:44:41,413 --> 00:44:49,038
The next two, three years or so, we'll start to target the more, still relatively
standardized and still relatively low risk.
479
00:44:49,038 --> 00:44:55,762
But I'd say this, I think maybe 80 % standardized and an extra 10 % in risk.
480
00:44:55,762 --> 00:44:57,403
That's like the medium-stakes level.
481
00:44:57,403 --> 00:45:01,646
um That's where we start to see stronger.
482
00:45:01,772 --> 00:45:07,444
reasoning capabilities of these models to be able to tackle these sorts of semi-standardized problems.
483
00:45:07,444 --> 00:45:17,319
So they've still got some consistency in the problem, but there's also a level of customization or nuanced thinking that these models have to sort of recognize, but
484
00:45:17,319 --> 00:45:21,290
not so nuanced that it sort of confuses the model.
485
00:45:21,791 --> 00:45:29,954
So that's probably where we're getting to, again, the same areas I just identified, but a bit more complex problems.
486
00:45:29,986 --> 00:45:39,444
We also start to see areas like, I'd say, crime, like, I uh don't know what's the right word to use, but petty crime.
487
00:45:39,444 --> 00:45:42,436
I think that's where you can start using it, for petty crime.
488
00:45:42,596 --> 00:45:49,181
Also, um yeah, a lot more areas of commercial law, so commercial contracting.
489
00:45:50,403 --> 00:45:59,510
What everyone's really excited about is in the next, I'd say, eight or 10 years, where we really start tackling highly nuanced legal problems.
490
00:45:59,968 --> 00:46:04,560
And actually, this is where I honestly don't really know what will be the end outcome.
491
00:46:04,560 --> 00:46:11,663
As a practicing lawyer myself, when I say highly nuanced problems, I do mean they're
highly nuanced.
492
00:46:11,663 --> 00:46:21,027
I think the common misconception that people have is that like contracts or reading laws,
it's all based on what's written on the paper.
493
00:46:21,027 --> 00:46:24,198
As long as you know what's on paper, you can interpret that.
494
00:46:24,198 --> 00:46:26,909
You basically have the full answer.
495
00:46:26,909 --> 00:46:29,614
Actually, no. The text,
496
00:46:29,614 --> 00:46:33,254
let's say, probably only addresses 30% of your problem.
497
00:46:33,274 --> 00:46:40,874
The other 60% is actually understanding your client's needs, the problem at hand, and also the market.
498
00:46:41,054 --> 00:46:45,274
And the question is, how do you encode all of that into numbers?
499
00:46:45,274 --> 00:46:48,894
That's ultimately what developers have to do.
500
00:46:49,034 --> 00:46:53,814
Encode the legal problem into numbers that can be read by a machine.
501
00:46:53,814 --> 00:46:55,594
That's what you have to do at the end of the day.
502
00:46:55,594 --> 00:46:58,272
How do you encode client interests
503
00:46:58,272 --> 00:47:03,794
market standards, market practices, to the extent that they're not written down in words or standards.
504
00:47:03,794 --> 00:47:07,606
We're just talking about conversation dialogues and all that.
505
00:47:07,606 --> 00:47:15,669
How do you encode that in a consistent manner that a model can reliably reference for XYZ
problems?
506
00:47:15,669 --> 00:47:17,490
I've tried doing that myself.
507
00:47:17,490 --> 00:47:19,591
It's really hard, right?
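One common way developers turn that kind of unwritten context into numbers is with text embeddings. A minimal sketch follows, assuming the open-source sentence-transformers library and the all-MiniLM-L6-v2 model purely as stand-ins; the client notes and the query are invented for illustration.

```python
# A sketch of "encoding the legal problem into numbers": embed free-text notes
# (client preferences, market practice observations) as vectors, then retrieve
# the most relevant note for a question via cosine similarity.
# The notes, the query, and the model choice are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

notes = [
    "Client prefers short exclusivity periods in supply deals.",
    "Market practice here: liability caps of roughly 12 months of fees.",
    "Counterparty has historically resisted broad indemnities.",
]
query = "What liability cap should we propose?"

model = SentenceTransformer("all-MiniLM-L6-v2")
note_vecs = model.encode(notes)        # numpy array, one 384-dim vector per note
query_vec = model.encode([query])[0]

# Cosine similarity: which note is closest to the question?
scores = note_vecs @ query_vec / (
    np.linalg.norm(note_vecs, axis=1) * np.linalg.norm(query_vec)
)
print(notes[int(np.argmax(scores))])
```

The hard part Ray is pointing at sits upstream of this sketch: deciding which conversations and unwritten practices get captured as text at all, and keeping that capture consistent.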
508
00:47:19,851 --> 00:47:20,601
But who knows?
509
00:47:20,601 --> 00:47:24,223
Maybe at that point, the whole architecture will change.
510
00:47:24,223 --> 00:47:27,192
We're currently still on a transformer architecture.
511
00:47:27,192 --> 00:47:32,482
which is very much a predict the next word, predict next token.
512
00:47:32,482 --> 00:47:34,376
Obviously there's a lot of layers around that.
513
00:47:34,376 --> 00:47:37,808
It's not just that, but fundamentally that's still what happens.
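For readers who want to see that "predict the next token" core step in code, here is a minimal sketch using GPT-2 via the Hugging Face transformers library purely as a small stand-in; modern models add many layers on top, but this greedy loop is the fundamental operation being described. The prompt is invented.

```python
# Greedy next-token prediction with a small causal language model (GPT-2 as a
# stand-in). Each step: run the model, take the most likely next token, append
# it to the sequence, and repeat.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The governing law of this agreement is", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(10):                      # generate 10 tokens, one at a time
        logits = model(ids).logits           # shape: (1, sequence_length, vocab_size)
        next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)   # greedy pick
        ids = torch.cat([ids, next_id], dim=-1)

print(tokenizer.decode(ids[0]))
```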
514
00:47:37,808 --> 00:47:45,612
Who knows, there might be a new mainstream model, like state space models, that might allow us to do way more nuanced reasoning.
515
00:47:46,113 --> 00:47:50,335
Right now, all of the reasoning models are just limited to like chain of thought.
516
00:47:51,135 --> 00:47:53,417
But I think chain of thought is just level one.
517
00:47:53,417 --> 00:47:56,158
There's like way more levels down the line.
518
00:47:56,526 --> 00:48:01,186
which I don't know yet because I'm not within the research centers themselves.
519
00:48:01,186 --> 00:48:08,786
So yeah, really, I'm probably gonna be one of those people who are really optimistic around disruption in law.
520
00:48:08,786 --> 00:48:14,846
It's weird for a lawyer to say that, but I feel like it's gonna be amazing because it'll
free up a lot of our time.
521
00:48:14,846 --> 00:48:17,286
I think lawyers would just be happier in general.
522
00:48:17,286 --> 00:48:19,606
We don't wanna be bogged down with boring work.
523
00:48:19,606 --> 00:48:23,246
We wanna do cool, more strategic work and there'll be new types of
524
00:48:23,246 --> 00:48:27,805
industries and work coming out of that as well that we can't conceive of today.
525
00:48:28,026 --> 00:48:40,946
If you think about the idea of a corporation, like when the Dutch Empire wanted to expand,
that's when they created this idea of a corporation as a vehicle to collect private funds
526
00:48:40,946 --> 00:48:42,926
to fund expansion.
527
00:48:43,046 --> 00:48:52,526
that's when you have the idea of a corporation, which created the idea of shares, which then created the whole stock market, which then created the whole of securities law,
528
00:48:53,058 --> 00:48:58,261
commercial law, corporate law, all of that just came from one new abstract idea.
529
00:48:58,261 --> 00:49:08,908
Who knows one day there'll be a new abstract idea that we can't conceive of today, but it
will be there in the future and that will create a whole new area of law that's way above
530
00:49:08,908 --> 00:49:13,931
the pay grade of AI models and we humans have to navigate through that.
531
00:49:14,312 --> 00:49:16,193
So I'm very optimistic.
532
00:49:16,193 --> 00:49:18,034
Yeah, I'm actually so keen for it.
533
00:49:18,143 --> 00:49:26,475
Yeah, I mean, as a legal tech CEO, I am really enjoying myself.
534
00:49:26,475 --> 00:49:30,166
It doesn't come without uh heartburn.
535
00:49:30,166 --> 00:49:33,877
You know, we are solely dependent on law firms for our business.
536
00:49:33,877 --> 00:49:38,519
And, you know, I see a lot of complacency.
537
00:49:38,519 --> 00:49:44,750
um And I also see firms that are being aggressive and going out and hiring talent and
making investments.
538
00:49:44,750 --> 00:49:47,211
So I see kind of all ends of the spectrum.
539
00:49:47,245 --> 00:49:52,968
But I worry that things, I don't know how it is in Australia, but here in the US, it's a very fragmented legal market.
540
00:49:52,968 --> 00:50:02,412
The AmLaw 200, I mean, 200 law firms, the AmLaw 100, if you add up all the revenue, they'd be like Fortune 150.
541
00:50:02,412 --> 00:50:08,936
So it um does concern me, but I'm optimistic as well.
542
00:50:08,936 --> 00:50:11,877
And yeah, this has been a fantastic conversation.
543
00:50:12,010 --> 00:50:20,402
Before we wrap up, how do people find out more about the work that you're doing with your
regulation tracker or any other projects that you're working on?
544
00:50:20,960 --> 00:50:26,012
Yeah, so simply I have a website that links everything.
545
00:50:26,072 --> 00:50:35,816
So it's like www.techcareer.com, or you can also just search for me on LinkedIn, Raymond Sun, and yeah, that has all the links in there.
546
00:50:35,816 --> 00:50:40,193
But yeah, just start with these two and yeah, hopefully you find my content fun.
547
00:50:40,193 --> 00:50:45,036
Yeah, we'll include links in the show notes um so people can get to you.
548
00:50:45,036 --> 00:50:50,660
Well, Ray, I really appreciate you taking a little bit of time out of your morning to have
a conversation with me.
549
00:50:50,660 --> 00:50:52,320
This has been a lot of fun.
550
00:50:52,821 --> 00:50:55,302
So, keep doing the work that you're doing, man.
551
00:50:55,302 --> 00:50:59,266
uh We've all benefited from it, so we appreciate it.
552
00:50:59,266 --> 00:51:00,169
Yeah, likewise, Ted.
553
00:51:00,169 --> 00:51:04,133
Thank you very much for bringing me on board, and I always love chatting with you,
especially on these topics.
554
00:51:04,133 --> 00:51:05,536
Yeah, thank you.
555
00:51:05,536 --> 00:51:06,017
All right.
556
00:51:06,017 --> 00:51:07,418
Have a good afternoon.
557
00:51:07,821 --> 00:51:08,781
Thanks.
00:00:04,629
Ray, how are you this afternoon or morning your time?
2
00:00:04,629 --> 00:00:05,952
How are you this morning?
3
00:00:05,984 --> 00:00:09,155
Yeah, it's morning time and yeah, hey Ted, feeling really good?
4
00:00:09,155 --> 00:00:09,997
Yeah.
5
00:00:10,094 --> 00:00:10,844
Good, man.
6
00:00:10,844 --> 00:00:11,775
Good.
7
00:00:11,775 --> 00:00:13,755
It's good to have you on the show.
8
00:00:14,033 --> 00:00:17,417
I enjoy reading your content on LinkedIn.
9
00:00:17,974 --> 00:00:35,735
I was looking at your global AI regulation tracker and knew we had to have a conversation
because, and how timely, uh we are in the midst of all sorts of regulatory movement, both
10
00:00:35,735 --> 00:00:36,839
US and China.
11
00:00:36,839 --> 00:00:38,056
So we'll get into that.
12
00:00:38,056 --> 00:00:39,086
But before
13
00:00:39,086 --> 00:00:43,691
Before we do, let's get you introduced.
14
00:00:43,691 --> 00:00:49,167
you're a lawyer and a tech lawyer and a developer, and you don't see that a lot.
15
00:00:49,167 --> 00:00:57,386
Why don't you tell us a little bit about your background and how you ended up being a tech
lawyer and application developer.
16
00:00:57,624 --> 00:00:58,755
Yeah, yeah.
17
00:00:58,755 --> 00:00:59,835
So...
18
00:01:00,156 --> 00:01:05,067
My story's gonna probably be one of those that starts off really random.
19
00:01:05,067 --> 00:01:09,093
You don't know where it's going, or eventually fall into place, makes sense.
20
00:01:09,093 --> 00:01:10,044
So...
21
00:01:10,244 --> 00:01:14,386
It really started when I was 5 or 6 years old.
22
00:01:14,494 --> 00:01:21,613
I was really into shows like Thomas the Tank Engine, Astro Boy, Iron Man.
23
00:01:22,154 --> 00:01:27,328
Anything that's got to do with machines, you know, coming alive and doing cool stuff.
24
00:01:27,470 --> 00:01:30,852
That has been my fascination even till today.
25
00:01:31,112 --> 00:01:36,996
And so, you know, throughout primary school and high school, I've always liked building
things on the side.
26
00:01:36,996 --> 00:01:40,058
So in primary school, I would like to build my own toys.
27
00:01:40,058 --> 00:01:47,322
And in high school, you know, I started playing computer games and then started to pick up
coding to learn how to build my own computer games.
28
00:01:47,322 --> 00:01:50,704
And what really helped was that my friends were also into the same thing.
29
00:01:50,704 --> 00:01:54,676
So we're all motivating each other and just doing cool things together.
30
00:01:55,390 --> 00:02:07,096
I think it was then during high school that I started reading, going to the library, and
then I was also interested in books around, know, all of the detective genres, and also
31
00:02:07,096 --> 00:02:10,137
legal thrillers, like the John Grisham series.
32
00:02:10,518 --> 00:02:19,902
And that's where I got into, I really enjoy stories that talk about uh evidence and,
33
00:02:20,398 --> 00:02:22,790
trying to connect pieces together to uncover the truth.
34
00:02:22,790 --> 00:02:25,941
And even though along those lines, I find really fascinating.
35
00:02:26,042 --> 00:02:41,292
And so when it got to the end of high school, what I want to do for further study, I was
really looking at either law or computer science, given my ongoing interests.
36
00:02:41,393 --> 00:02:49,678
And it came down to a really simple conclusion, which is that I can make coding a hobby,
but I can't make law a hobby.
37
00:02:49,824 --> 00:02:52,145
So why not make law my career?
38
00:02:52,145 --> 00:02:56,097
And I was, you know, continuing my coding on the side.
39
00:02:56,398 --> 00:02:59,409
And so all my friends did computer science.
40
00:02:59,409 --> 00:03:09,065
I was the only one who went on to do law, but I still, you know, continued building games
and websites just to, you know, as a way to still connect to my friends.
41
00:03:09,385 --> 00:03:14,888
And yeah, throughout uni, I was doing my legal studies and building random maps.
42
00:03:15,089 --> 00:03:17,902
I think, what was like a
43
00:03:17,902 --> 00:03:34,262
A big turning point was I was looking for work as a paralegal and I wasn't able to find
one until I started to just look into internships in non-traditional law firms.
44
00:03:34,502 --> 00:03:39,922
So this was around 2017-18.
45
00:03:39,922 --> 00:03:44,582
I applied for an internship at a legal tech startup called Law Path.
46
00:03:44,926 --> 00:03:47,638
and I was doing as a media intern.
47
00:03:47,638 --> 00:03:52,521
So I was writing blog articles around, I think back then it was blockchain and smart
contracts.
48
00:03:52,521 --> 00:03:54,453
That was the craze back then.
49
00:03:54,453 --> 00:03:56,674
So I was writing a lot of articles on that.
50
00:03:56,674 --> 00:04:02,538
And then around that time, I also did a hackathon, one of the world's first legal tech
hackathon.
51
00:04:02,898 --> 00:04:05,620
And I was representing my university.
52
00:04:05,740 --> 00:04:07,732
I was the developer within my team.
53
00:04:07,732 --> 00:04:14,446
So I was building a prototype app, which basically helps streamline payments between the
client
54
00:04:14,510 --> 00:04:18,930
and the Barrister, which is a very, you know, UK, Australia unique thing.
55
00:04:19,130 --> 00:04:25,450
We don't have the Barrister and sister that merge as an attorney like in the US, but
basically it's just an app.
56
00:04:25,450 --> 00:04:27,010
It's like a payment app, right?
57
00:04:27,010 --> 00:04:36,470
And then when I pitched the app, it turns out that one of the judges was the boss of that
legal tech startup that I was interning at.
58
00:04:36,470 --> 00:04:38,478
And so when I came back, the boss said,
59
00:04:38,478 --> 00:04:40,978
Hey Ray, I didn't know you could code like that.
60
00:04:40,978 --> 00:04:46,438
Why don't you move into our engineering department rather than the comms media department?
61
00:04:46,438 --> 00:04:48,018
And I was like, yeah, sure.
62
00:04:48,038 --> 00:04:54,298
And then that's how I got my actual job as like a legal engineer at that startup.
63
00:04:54,298 --> 00:04:57,598
And I was helping build out the document automation platform.
64
00:04:57,598 --> 00:05:06,758
And so that was like the first serious, I guess, gig that made me think, oh, wow, okay,
tech and law can be combined in some way into some actual.
65
00:05:06,818 --> 00:05:17,943
meaningful career and since then when I was applying for graduate roles and you know
Becoming a lawyer it has always been my path to sort of go down that tech lawyer sort of
66
00:05:17,943 --> 00:05:28,188
route and that's where I am right now I'm currently a practicing tech lawyer at Herbert
Smith Freehills Kramer So and uh which is a global law firm and I do a lot of stuff within
67
00:05:28,188 --> 00:05:33,830
the AI uh legal space but at the same time, you know uh
68
00:05:33,922 --> 00:05:36,673
So that's that's like how my career sort of evolved.
69
00:05:36,673 --> 00:05:49,057
But on the side, as I said, I'm still building things and the AR regulation tracker is
just one of those sort of uh projects that really developed as I was like, got really
70
00:05:49,057 --> 00:05:54,519
interested in AR ethics and robots, but I'm sure we can talk more about that one.
71
00:05:54,519 --> 00:05:57,410
But that's basically the rundown of my career today.
72
00:05:57,472 --> 00:05:58,342
Interesting.
73
00:05:58,342 --> 00:06:06,485
And yeah, so let's talk about the global AI regulation tracker that kind of started off as
a personal project for you.
74
00:06:06,485 --> 00:06:13,148
And then it's kind of got in traction and now is an industry resource.
75
00:06:13,248 --> 00:06:16,179
like take us back to the origin story around that.
76
00:06:16,179 --> 00:06:27,243
think the last time you and I spoke, was you were procrastinating a reading about trolley
problems and machine ethics and tell us about how this thing came about.
77
00:06:27,342 --> 00:06:31,662
Yeah, basically it started with procrastination.
78
00:06:31,662 --> 00:06:36,422
I think during uni there were some subjects which were a bit, you know, a bit of a bludge.
79
00:06:36,502 --> 00:06:41,422
was just, I'm a person who likes history and geography.
80
00:06:41,422 --> 00:06:46,742
So I like reading, you know, random articles on geopolitics and history.
81
00:06:46,742 --> 00:06:56,134
And I came across this YouTube documentary and this was around a time where self-driving
vehicles were being trialed, experimented.
82
00:06:56,778 --> 00:06:59,899
and know, accidents, unfortunate accidents were happening.
83
00:06:59,899 --> 00:07:03,781
And then there videos just talking about the ethics in self-driving vehicles.
84
00:07:03,781 --> 00:07:15,085
So if a car were to, if it can't stop and can't change tracks, you, and it was approaching
either an elderly person or a baby, which one should the car hit to minimize like damage?
85
00:07:15,085 --> 00:07:25,960
And that's such a tough question that it just opened up the whole door of AR ethics, which
I never thought was a theme, but you know, the more I read into it, you know, reading
86
00:07:25,960 --> 00:07:26,562
about
87
00:07:26,562 --> 00:07:30,845
the three laws of robots and then the Trotty problem.
88
00:07:30,845 --> 00:07:39,531
And it was around that time where the EU government was also thinking about the EU AI Act,
which today is an actual thing in effect.
89
00:07:39,531 --> 00:07:40,692
But like, this is 2019.
90
00:07:40,692 --> 00:07:44,754
This is just an idea just being floated around in government.
91
00:07:44,754 --> 00:07:49,978
And the whole idea of regulating AI back then was so alien.
92
00:07:49,978 --> 00:07:54,841
It was such a foreign concept that, wow, I never thought this could be an actual thing.
93
00:07:55,262 --> 00:07:56,140
And so,
94
00:07:56,140 --> 00:08:00,432
And as I was reading, I like to sort of write notes in a notebook.
95
00:08:00,633 --> 00:08:08,216
I don't know, like I didn't really have any particular reason why I just thought, you
know, if there's anything interesting, I'll just write it down in the diary.
96
00:08:08,397 --> 00:08:11,598
And it got to a point where my diary started filling out.
97
00:08:11,959 --> 00:08:18,002
And then I was having the conversation with friends and also classmates and eventually
colleagues.
98
00:08:18,002 --> 00:08:25,484
And I realized that, well, I actually have sort of quite a lot of ideas and insight over
the past, you know,
99
00:08:25,484 --> 00:08:36,410
or threes of reading randomly and I thought you know maybe I could share this on LinkedIn
so um when I first started becoming a lawyer on the side you know it was also during COVID
100
00:08:36,410 --> 00:08:47,406
so I started my career during the COVID lockdown so there's a bit of a lot of quite a lot
of spare time to just play around so I was just writing stuff on LinkedIn initially like
101
00:08:47,406 --> 00:08:54,650
it only hit a very niche audience so I'll write updates around AR regulation and ideas
102
00:08:54,894 --> 00:08:57,675
um I did that for like a year or so.
103
00:08:58,275 --> 00:09:02,076
Modest engagement, but I had a lot of fun writing.
104
00:09:02,076 --> 00:09:12,579
And then when ChaiGPD came out, that's where everything changed because all of sudden,
total proof of AI and AI regulation became like really popular and my posts started
105
00:09:12,579 --> 00:09:14,019
getting bit more attention.
106
00:09:14,019 --> 00:09:15,920
So that encouraged me to write more.
107
00:09:15,920 --> 00:09:18,781
I was also writing all around the world.
108
00:09:18,781 --> 00:09:22,322
not just one particular country, but as many countries as possible.
109
00:09:22,322 --> 00:09:23,726
It got to a point where
110
00:09:23,726 --> 00:09:30,706
my LinkedIn had all these posts in different countries and I thought, no, let's try to
organize into one hub.
111
00:09:30,866 --> 00:09:32,566
And so I already had a website myself.
112
00:09:32,566 --> 00:09:38,726
So I just thought, you know, why not just add a new page to our website that's categorizes
all my LinkedIn posts per country.
113
00:09:38,766 --> 00:09:42,906
So initially it was like a simple table, but then I thought that's so boring.
114
00:09:42,906 --> 00:09:44,626
Let's just take this a step further.
115
00:09:44,626 --> 00:09:45,786
I create upside.
116
00:09:45,786 --> 00:09:50,526
I put like a map on it and click on the country to show my LinkedIn posts.
117
00:09:50,526 --> 00:09:51,594
Then I thought,
118
00:09:51,594 --> 00:09:53,645
Why stop on my own LinkedIn page?
119
00:09:53,645 --> 00:09:58,979
This is actually prepare an uh encyclopedia summary to each country.
120
00:09:59,239 --> 00:10:04,333
And yeah, that's where I started summarizing each country's AI policies and regulations.
121
00:10:04,333 --> 00:10:09,987
I initially started with the G20 countries, but then I've sort of expanded.
122
00:10:09,987 --> 00:10:14,070
I've been running this project now for three months all by myself.
123
00:10:14,070 --> 00:10:21,024
And now it took me two and a half years to now cover every country uh and territory in the
world.
124
00:10:21,110 --> 00:10:22,131
So over like 200.
125
00:10:22,131 --> 00:10:31,135
yeah, so I think it's been really great to be, I guess, one of the early ones building
this sort of tool.
126
00:10:31,135 --> 00:10:37,298
And then a lot of people were really supportive and yeah, just a lot of encouragement from
out the global industry.
127
00:10:37,298 --> 00:10:44,352
And that really helps me complete the map and also add new features to it to make it more
user friendly and developer friendly.
128
00:10:44,352 --> 00:10:45,302
yeah.
129
00:10:46,103 --> 00:10:51,673
Yeah, so is your does your firm leverage the research that you've compiled?
130
00:10:53,471 --> 00:10:55,052
not, not directly.
131
00:10:55,052 --> 00:10:58,363
I think this is something that I do like in my own personal time.
132
00:10:58,363 --> 00:11:09,760
Um, and just everyone around the industry is mostly targeted to less say, you know, small
businesses, academics, researchers, and developers, especially because there's now a new
133
00:11:09,760 --> 00:11:16,133
API that developers can now link the apps on top of it to sort of run the own monitoring
tools, whatever.
134
00:11:16,133 --> 00:11:20,606
So it's more targeted at that sort of grassroots smaller end.
135
00:11:20,606 --> 00:11:21,386
Yeah.
136
00:11:21,985 --> 00:11:24,890
Do you have any plans for this?
137
00:11:24,890 --> 00:11:31,601
to get funding or, you know, either through a grant or private funding to help get some
help with it?
138
00:11:31,601 --> 00:11:32,663
This sounds like a lot of work.
139
00:11:32,663 --> 00:11:34,096
uh
140
00:11:34,096 --> 00:11:40,189
yeah, it's um, it's it sounds like work, but it's actually like, it's not that much work.
141
00:11:40,189 --> 00:11:48,042
Because I because I progressively updated every day only takes like five minutes of my
time each day just to like monitor updates.
142
00:11:48,042 --> 00:11:52,024
I've got a lot of tools in the background to help curate news items for me.
143
00:11:52,024 --> 00:11:54,395
So it's not a lot of work per day.
144
00:11:54,395 --> 00:11:55,865
But we put it all together.
145
00:11:55,865 --> 00:11:57,816
Sounds like it's quite a lot.
146
00:11:58,036 --> 00:11:59,637
in terms of your other question.
147
00:11:59,637 --> 00:12:00,207
Yeah, sure.
148
00:12:00,207 --> 00:12:02,274
Like I'm always on the, you know,
149
00:12:02,274 --> 00:12:06,139
on the lookout for opportunities, I'm also keeping myself open-minded.
150
00:12:06,139 --> 00:12:08,842
I'm also not desperate for it.
151
00:12:08,842 --> 00:12:15,029
It's something that's nice to have at end of the day for me, just to learn about the
world.
152
00:12:15,029 --> 00:12:16,230
eh
153
00:12:16,811 --> 00:12:17,431
Interesting.
154
00:12:17,431 --> 00:12:22,493
yeah, and you're, you're, LinkedIn posts have, have gotten a lot of traction.
155
00:12:22,493 --> 00:12:24,103
mean, you're on the other side of the world.
156
00:12:24,103 --> 00:12:25,814
um And I've seen your stuff.
157
00:12:25,814 --> 00:12:31,505
You're a top voice, which, uh know, that's a, that's a hard designation to get.
158
00:12:31,505 --> 00:12:36,367
um You really have to put in some effort and some work around that.
159
00:12:36,547 --> 00:12:40,118
Well, you've had, so we're recording this in the beginning of August.
160
00:12:40,118 --> 00:12:43,831
This will probably come out towards the later part of the month.
161
00:12:43,831 --> 00:12:46,762
But you've had a busy week or so, right?
162
00:12:46,762 --> 00:12:57,545
Because we've had major developments with the Trump administration announcing their AI
action plan and then a very quick turnaround on a China response.
163
00:12:57,545 --> 00:13:01,086
um what is your, yeah.
164
00:13:01,086 --> 00:13:04,766
And you had some great posts that I thought were interesting.
165
00:13:05,427 --> 00:13:12,969
specifically on the China side, there's a lot on the U S side too, but you, let's start
with, with China first.
166
00:13:12,969 --> 00:13:13,889
So.
167
00:13:13,985 --> 00:13:20,831
You kind of zeroed in on some nuance around language um with the China plan.
168
00:13:20,831 --> 00:13:26,785
Like, and if I read your post correctly, it was like 95 % of this is not new.
169
00:13:26,785 --> 00:13:30,128
Um, but the 5 % that is, is interesting.
170
00:13:30,128 --> 00:13:36,546
um tell us what your kind of take is on the, the, the China piece first.
171
00:13:36,546 --> 00:13:38,967
Yeah, yeah, yeah, of course.
172
00:13:39,347 --> 00:13:46,190
I guess I'll just first lay out sort of the macro context behind China's AI policy
thinking.
173
00:13:46,190 --> 00:13:57,074
And this is based on, you know, both research and also being in the country talking to
like the big tech, like companies are driving this sort of change.
174
00:13:57,655 --> 00:14:05,378
So I think really it comes down to one or two things, which is China has a very strong
push for
175
00:14:05,496 --> 00:14:08,248
what they call like, know, sovereignty in AI.
176
00:14:08,248 --> 00:14:11,450
So being self-sufficient in the full stack.
177
00:14:11,811 --> 00:14:23,639
And part of this has been driven because of the pressure from US export controls, limiting
access to the necessary chips that are required to build really advanced AI systems.
178
00:14:23,639 --> 00:14:33,696
So this whole central theme around self-sufficiency, having control of the full stack,
that is like the major theme and that sort of...
179
00:14:33,814 --> 00:14:44,180
Manifesting itself in other smaller sub themes around, know, where with which certain
industries will require investment and trade policies and all that.
180
00:14:44,380 --> 00:14:53,085
And the second thing is like China is also trying to uh lead in standards, especially for
the global South.
181
00:14:53,325 --> 00:15:01,934
There's all part of like the whole BRICS initiative, all part of the Belt and Road
project, which has been ongoing for a decade already.
182
00:15:01,934 --> 00:15:14,214
So there's that also that mindset just really to set the standard because standards are
important because if you think about the internet, the internet's built on US led
183
00:15:14,214 --> 00:15:21,534
standards and that has given the US a lot of leverage over how the internet ecosystem
should operate.
184
00:15:21,594 --> 00:15:29,094
And it's one of those things where it's a hugely contested front and actually it has a
huge role in geopolitics.
185
00:15:29,094 --> 00:15:30,766
So when it comes to the new
186
00:15:30,766 --> 00:15:40,326
breakthrough technology like AI being like the next, the current general purpose
technology that is as big or even bigger than the internet, yeah, obviously that's where
187
00:15:40,326 --> 00:15:44,046
countries start thinking about, okay, let's be the ones to set the standard.
188
00:15:44,046 --> 00:15:46,446
So that's the macro context in mind.
189
00:15:46,446 --> 00:16:00,014
And so when the AI action plan came out from China, and to be accurate, when we translate
in English, it's called like the AI action plan, but the actual Chinese like,
190
00:16:00,014 --> 00:16:04,294
text, it's actually called the global AI governance sort of action plan.
191
00:16:04,294 --> 00:16:10,574
So it's like a, it has a global sort of mindset embedded into it.
192
00:16:10,574 --> 00:16:24,014
And the 95%, which I said was not new, that's basically the sort of stuff that we've seen
in previous papers and also in government representative speeches around, know, as I said,
193
00:16:24,014 --> 00:16:28,654
securing the full stack, investing in green and sustainable ways of
194
00:16:28,654 --> 00:16:31,415
powering models, all that sort of stuff.
195
00:16:31,735 --> 00:16:39,858
The 5 % which I thought was sort of was interesting and highlighted was around open
source.
196
00:16:40,259 --> 00:16:49,863
So I think when I say open source in China, people often think of DeepSeek, which that is
really the big um milestone that we saw.
197
00:16:49,863 --> 00:16:58,712
um So, but before DeepSeek, it has always been like this sort of uh strategy of
198
00:16:58,712 --> 00:17:01,704
tech clients to release open source products.
199
00:17:01,945 --> 00:17:12,424
And there's a lot of reason why open source is such a huge theme in China is, but there's
all of like, I think fundamentally it's because the internal domestic competition is so
200
00:17:12,424 --> 00:17:13,434
fierce.
201
00:17:13,695 --> 00:17:21,061
People often talk about the US and China competition as like the first layer of
competition, who are actually in China.
202
00:17:21,441 --> 00:17:26,958
Companies care more about the competition with their next door neighbor, which is like the
domestic.
203
00:17:26,958 --> 00:17:36,220
competition and so fierce, there is sort of a race to the bottom in terms of who can
produce the highest quality model for the lowest price.
204
00:17:36,501 --> 00:17:45,863
And initially there was a race towards like who can provide the lowest API options until
some of the big companies were like, actually, like, not just open source it?
205
00:17:45,863 --> 00:17:49,284
That's that's technically zero, zero dollars for free.
206
00:17:49,284 --> 00:17:54,866
So you basically beat everyone on the on the price front and just provide a very powerful
model.
207
00:17:54,866 --> 00:17:56,846
And so already for like
208
00:17:56,846 --> 00:18:07,046
For a year or two all the big tech companies in China were releasing their own open source
AR models What made DeepSea quite special is that it found like a new ways to make the
209
00:18:07,046 --> 00:18:15,906
training process even more cheaper and That caused a lot of headlines in the West as well
as well of attention has been brought to DeepSea even though it's part of a it's only a
210
00:18:15,906 --> 00:18:25,336
small part of the bigger open source picture But what this plan So even though open source
has been a long thing what this plan was quite different was that
211
00:18:25,336 --> 00:18:30,059
that its choice of language was uh very selective.
212
00:18:30,059 --> 00:18:38,054
And when it comes to Chinese policies, like the language itself probably says more about
the story than the actual message.
213
00:18:38,375 --> 00:18:44,318
Certain words are selected to convey a certain sentiment.
214
00:18:44,419 --> 00:18:54,115
And one of the things to look out for is which phrases are being repeated, like which
mantras and which combination of words are repeated throughout the paper.
215
00:18:54,115 --> 00:18:55,118
That's often the
216
00:18:55,118 --> 00:18:58,899
indicative of government thinking.
217
00:18:59,259 --> 00:19:05,731
like for like I say, for the past year, let me just bring out my notes.
218
00:19:05,731 --> 00:19:20,145
For the past year, there was like a um particular phrasing that government will use and it
was called, to translate directly into English, is, know, uh safe, reliable and
219
00:19:20,145 --> 00:19:20,925
controllable.
220
00:19:20,925 --> 00:19:25,280
So these are the, that's a typical trio of words that we see
221
00:19:25,280 --> 00:19:28,351
in speeches and it's a very systems focused view.
222
00:19:28,351 --> 00:19:35,134
So it's all about trying to make any particular use case safe, reliable, controllable.
223
00:19:35,134 --> 00:19:48,820
But since then we start to see the repetitive slogan expanding more into broader terms to
now what we call, again, direct translation, inclusive, open, sustainable, fair, secure
224
00:19:48,820 --> 00:19:52,221
and reliable, digital and intelligent future for all.
225
00:19:52,221 --> 00:19:55,212
So it's a mouthful when I say in English, but it's only like eight.
226
00:19:55,212 --> 00:19:56,583
eight characters in Chinese, right?
227
00:19:56,583 --> 00:20:04,948
So, that stuff repeats a lot throughout the policy and a much more global rather than
system specific focus.
228
00:20:05,329 --> 00:20:08,201
And how that relates to open source?
229
00:20:08,201 --> 00:20:23,471
Well, in the actual paragraph that mentions open source, we in English, we call it open
source, but in Chinese, it's technically open sharing of resources and nowhere in the
230
00:20:23,471 --> 00:20:24,922
actual text
231
00:20:25,228 --> 00:20:31,380
have I seen the words open source code, open source software, or open source models?
232
00:20:31,641 --> 00:20:40,164
Now, again, in English, when we say open source, we tend to mean that one thing, is, know,
putting your stuff on GitHub, everyone can see the code and you can download it.
233
00:20:40,164 --> 00:20:46,347
But in Chinese, open source has so many different ways of expressing that one concept.
234
00:20:46,347 --> 00:20:52,830
And if you wanna talk about open source models, open source code, there's an actual
literal direct way of saying that.
235
00:20:52,830 --> 00:20:54,094
So it's not a draft.
236
00:20:54,094 --> 00:20:59,414
I don't think it's a draft in oversight because there's so many different ways of
expressing that one thing.
237
00:20:59,414 --> 00:21:07,574
There's got to be some conscious effort behind why it's only open sharing of resources
compared to, let's say, open source models or code.
238
00:21:07,574 --> 00:21:11,914
And as I said, when it comes to Chinese policies, you have to read into the language.
239
00:21:11,914 --> 00:21:15,234
it's not, I'm not, I don't think I'm reading too deeply into it.
240
00:21:15,234 --> 00:21:17,774
I think it's meant to be read in that way.
241
00:21:17,994 --> 00:21:22,402
And taking that interpretation, if we're only talking about sharing,
242
00:21:22,402 --> 00:21:29,448
tech documentation, manuals, like the surface layer documentation instead of the actual
code.
243
00:21:29,549 --> 00:21:42,019
What I'm thinking is that this is such a clever policy balance by China to sort of
influence global standards, but also keeping the secret source back at home, which is a
244
00:21:42,340 --> 00:21:44,412
very subtle and clever sort of balance.
245
00:21:44,412 --> 00:21:46,734
So that's what I noticed in this policy.
246
00:21:46,734 --> 00:21:49,046
And again, it will take another few...
247
00:21:49,058 --> 00:21:53,220
policies or papers in the future to see if that message is being reinforced.
248
00:21:53,220 --> 00:22:03,316
But until then, this is like the first one that I think might be that slight pivot towards
that selective open sharing technique.
249
00:22:03,351 --> 00:22:11,103
You know, what's interesting is us in the West, like open China is like an oxymoron.
250
00:22:11,103 --> 00:22:14,224
We don't think of China and open anything.
251
00:22:14,224 --> 00:22:21,226
We think of very, you know, closed, controlled, um, not open.
252
00:22:21,466 --> 00:22:31,409
And, this seems like a, again, from a Westerner standpoint, it seems like a departure from
what we would expect.
253
00:22:31,427 --> 00:22:47,289
from China, which again, the great Chinese firewall and just how um there's also been uh
issues around intellectual property rights within China.
254
00:22:47,289 --> 00:22:57,937
um I guess, again, from a Westerner standpoint, it seems surprising that China wants to
have an open policy.
255
00:22:58,079 --> 00:22:58,891
around this.
256
00:22:58,891 --> 00:23:01,518
oh What about on your side of the globe?
257
00:23:01,518 --> 00:23:07,311
Is this surprising or does this line up exactly the direction you thought they'd head?
258
00:23:09,002 --> 00:23:20,820
For me, it's not surprising because I think the commercial drivers sort of explain the
story why there's a drive towards um open source in the say from the Western definition
259
00:23:20,820 --> 00:23:21,331
standpoint.
260
00:23:21,331 --> 00:23:28,566
As I said, the domestic competition is already so fierce that really the only real way to
stand out is to be open source.
261
00:23:28,566 --> 00:23:37,688
And actually, if you talk to local developers in China, if an AI company doesn't have an
open source version of their product, they're not going to be considered
262
00:23:37,688 --> 00:23:45,975
by the developers in the tech stack because even if you don't use the open source tool,
it's sort of like a fashion statement, right?
263
00:23:45,975 --> 00:23:53,232
Saying that, okay, like we're doing open source so that we know that we are within the top
band of the market.
264
00:23:53,232 --> 00:23:56,295
If we don't do open source, it means that why you're hiding, right?
265
00:23:56,295 --> 00:23:59,468
That's sort of the suspicion that you get from the local developer base.
266
00:23:59,468 --> 00:24:03,411
So I think in China, open source is the market expectation.
267
00:24:03,411 --> 00:24:04,742
It's the market standard.
268
00:24:04,974 --> 00:24:08,715
um Unlike in the West, I think there is a slight pivot today.
269
00:24:08,715 --> 00:24:14,137
I think especially OpenAI doing open weights now, but I think in China is a different
story.
270
00:24:14,137 --> 00:24:22,674
yeah, it's just a matter of like, so open source is always gonna be the direction.
271
00:24:22,674 --> 00:24:26,380
I think the question is what extent is open?
272
00:24:26,380 --> 00:24:34,478
And that's where you get really specific with is it open source documents or code or
weights or models or the whole thing.
273
00:24:34,478 --> 00:24:40,698
I think that's the question that China over there is still figuring out from a policy
perspective.
274
00:24:41,017 --> 00:24:41,877
Interesting.
275
00:24:41,877 --> 00:24:53,197
And, I don't know how, if this is different in China, but here in the U S the lawmakers
don't have a clue, um, about the tech.
276
00:24:53,197 --> 00:25:04,437
mean, I, um, until recently, Donald Trump didn't know, know who, uh, Jensen Wang was and,
yeah, a $4 trillion company.
277
00:25:04,437 --> 00:25:10,577
And, uh, our president doesn't didn't know who the CEO was, um, by his own admission.
278
00:25:10,701 --> 00:25:24,731
And apparently there's now a lot of dialogue going on and he's surrounded himself with
advisors like David Sachs and people who really understand the technology and the, if we
279
00:25:24,731 --> 00:25:30,705
can pivot to the U S for a minute, um, the commentary I've heard, and I haven't read the
entire thing.
280
00:25:30,705 --> 00:25:39,120
I've read excerpts, but on the U S side, it sounds like it's written from an informed
perspective.
281
00:25:39,121 --> 00:25:40,041
So.
282
00:25:40,178 --> 00:25:56,952
it, whether or not you agree with, cause there's some, there are some, there are some, uh,
controversial words in the U S policy around, you know, control over kind of the tone and
283
00:25:56,972 --> 00:26:05,739
wokeness and DEI and all those sorts of things that are very politically charged, um,
topics of conversation here in the U S.
284
00:26:05,739 --> 00:26:08,241
But what was your take on
285
00:26:08,267 --> 00:26:15,319
And if I, if I recall correctly, the, U S, action plan came out and China's came out
within 48 hours.
286
00:26:15,319 --> 00:26:16,632
was boom, boom.
287
00:26:16,632 --> 00:26:22,241
Um, but what, what, what is your take on kind of the U S's action plan on AI?
288
00:26:24,428 --> 00:26:29,402
Yeah, I wasn't surprised by the action points in the plan.
289
00:26:29,402 --> 00:26:36,158
I can give credit in the sense that they've been consistent in what they're going to do.
290
00:26:36,158 --> 00:26:44,615
just a quick recap, the action plan is a lot, but it's been manifested through three key
executive orders.
291
00:26:44,615 --> 00:26:49,659
So the first order is around energy infrastructure, promoting that.
292
00:26:49,659 --> 00:26:53,622
Second order is around the export controls layer.
293
00:26:54,100 --> 00:27:05,306
and the third order is really around what they call trying to regulate, well not trying to
regulate, but ensuring that language models do not generate quote unquote work or biased
294
00:27:05,306 --> 00:27:06,017
material.
295
00:27:06,017 --> 00:27:11,079
So these are the three uh key themes of the action plan.
296
00:27:11,079 --> 00:27:15,882
And it's not the first, it's not complete surprise that these were covered.
297
00:27:15,882 --> 00:27:23,626
I think it's been a consistent policy of the government since the change of government in
298
00:27:23,626 --> 00:27:24,927
start of the year.
299
00:27:25,888 --> 00:27:35,636
For me, my focus because I'm more interested in the global dynamics, the second one was
the most interesting for me, which is around the export controls.
300
00:27:36,077 --> 00:27:45,464
And so ever since the Biden administration, the US has been tightening export controls
around chips.
301
00:27:45,505 --> 00:27:47,520
And to get even more specific,
302
00:27:47,520 --> 00:27:52,643
Initially, was only restrictions around just the actual final chips themselves.
303
00:27:52,643 --> 00:27:58,007
So the final product before it's being shipped to select countries.
304
00:27:58,007 --> 00:28:08,103
the export controls really target like China, Russia, Iran, and there's other sort of like
what the USC's as the competitor nations.
305
00:28:08,925 --> 00:28:17,390
But what I found really interesting in the action plan is that there is, it's only a small
paragraph within the actual plan, but it mentions of
306
00:28:17,390 --> 00:28:32,190
the government or requiring the DOC to really look into targeted export controls around
semiconductor manufacturing components.
307
00:28:33,070 --> 00:28:36,670
So if you think about the full stack, there is the actual chip itself.
308
00:28:36,670 --> 00:28:42,270
take your Nvidia chip or AMD chip, the final full package.
309
00:28:42,270 --> 00:28:44,470
Within that, there's a lot of different components.
310
00:28:45,050 --> 00:28:54,277
So existing controls target both the final one and also recently the components within
that chip because there's been a growing sort of recognition within government that,
311
00:28:54,277 --> 00:29:04,003
actually maybe the chip is designed and exported from US, but all the different parts are
being imported across the world, right?
312
00:29:04,003 --> 00:29:08,166
And so you've got to mostly make sure you're accountable for these different sub
components.
313
00:29:08,666 --> 00:29:14,600
But now they're looking above the value chain as well, which is the actual manufacturing.
314
00:29:15,018 --> 00:29:15,629
of the chips.
315
00:29:15,629 --> 00:29:19,622
So there are already existing controls on manufacturing equipment.
316
00:29:19,622 --> 00:29:23,025
So we're talking about these big lithographic machines.
317
00:29:23,025 --> 00:29:35,276
So just a quick, uh quick explainer, like how these chips are built, like they're very
small, like it's, basically a very intricate design on a silicon wafer, right?
318
00:29:35,417 --> 00:29:36,708
But it's really small.
319
00:29:36,708 --> 00:29:40,846
It's as small as like some like adamants is measured in nanometers, right?
320
00:29:40,846 --> 00:29:42,146
How the heck do people do that?
321
00:29:42,146 --> 00:29:52,006
Well, it's all done because there's like this big machine that's sole purpose is to fire a
very specific beam of light through a bunch of mirrors.
322
00:29:52,006 --> 00:29:59,026
And that's what carves out these very small designs on a very small piece of silicon at a
nanometer sort of level.
323
00:29:59,026 --> 00:30:00,786
That's how these things are created.
324
00:30:00,786 --> 00:30:10,694
Now, these big machines, like these lithographic machines, can only be built by one
company, which is ASML, which is a base, a company based in the Netherlands, right?
325
00:30:11,042 --> 00:30:15,124
and that stuff's been bought by companies across the world to build these chips.
326
00:30:15,124 --> 00:30:24,932
So we're talking about the final big machine, but that machine itself, I think reports
have said it's built from 700,000 components.
327
00:30:24,932 --> 00:30:31,777
They have components from Germany, from Spain, from France, even from China, from like
Southeast Asia, everywhere.
328
00:30:31,777 --> 00:30:36,160
We're talking about the most complex supply chain in history.
329
00:30:36,300 --> 00:30:38,894
And I reckon it builds the...
330
00:30:38,894 --> 00:30:42,254
I reckon it's in the Guinness World Records or something for that complexity.
331
00:30:42,254 --> 00:30:52,374
But anyways, the action plan is considering not just targeting semiconductor manufacturing
equipment, but also components within that manufacturing equipment.
332
00:30:52,374 --> 00:31:03,894
Now it might just be two words on the paper, but those two words could have massive
complications because if we're gonna export control components within that big machine,
333
00:31:04,074 --> 00:31:06,654
that could potentially cover the whole world.
334
00:31:06,654 --> 00:31:08,258
Like these export controls could
335
00:31:08,258 --> 00:31:14,861
basically target every single supplier that pitches in into that one big machine.
336
00:31:14,962 --> 00:31:24,057
And it's really hard to evaluate like what's the actual specific impact, but all I know is
that it's gonna make supply chains really complicated.
337
00:31:24,057 --> 00:31:33,882
And a lot of the costs, the supply side inflation that's happening in the world right now,
partly due to oil price, but there's also a lot due to just cheap prices in generally
338
00:31:33,882 --> 00:31:37,502
making the cost of anything tech, any
339
00:31:37,502 --> 00:31:51,548
any good basically that's that's digitized that's they already quite expensive on
themselves just from the current um export controls but with with further controls around
340
00:31:51,548 --> 00:32:02,793
the manufacturing components then yeah that yeah i'm just we're gonna brace ourselves for
that i'm sure like again the action plan is only indicating that this is an area that the
341
00:32:02,793 --> 00:32:06,092
relevant departments have to look at so just to be clear it's not
342
00:32:06,092 --> 00:32:10,036
is not a direct action, it's just telling the agencies to look into that question.
343
00:32:10,036 --> 00:32:15,463
So I'm sure there'll be a lot of expert analysis into that, but that's one thing just to
get your heads up for.
344
00:32:15,463 --> 00:32:16,693
Yeah.
345
00:32:17,725 --> 00:32:24,909
And speaking of export controls, so I banged on DeepSeek a little bit and lately Kimi.
346
00:32:25,129 --> 00:32:34,974
And uh I'm really impressed with, I was really impressed with DeepSeek uh R1 when it came
out and I'm really impressed with Kimi.
347
00:32:35,895 --> 00:32:47,437
Are these export controls maybe having an unintended consequence of forcing these uh
countries that have constraints?
348
00:32:47,437 --> 00:32:57,887
to become more efficient and creative and engineer more uh interesting solutions to these
problems?
349
00:32:57,887 --> 00:33:00,311
Is it having that unintended consequence?
350
00:33:00,856 --> 00:33:09,814
Yeah, it's like kind of like the whole, you know, the famous quotes like, you know,
necessity is the model of invention or that sort of thinking.
351
00:33:09,814 --> 00:33:13,617
And actually I'll show you something in Chinese internet meme culture.
352
00:33:13,617 --> 00:33:23,742
Like there's a lot of, it's a pretty popular meme that among Chinese netizens that Trump
and Biden are the founders of China or the.
353
00:33:23,742 --> 00:33:28,015
or that it's called nation building fathers of China.
354
00:33:28,015 --> 00:33:29,666
That's like the joke, right?
355
00:33:29,666 --> 00:33:40,012
Because the reason is that their export controls have pressured Chinese industries, have
limited their resources so much in a way that they just have to find new ways to build
356
00:33:40,012 --> 00:33:41,134
models.
357
00:33:41,134 --> 00:33:44,056
as you mentioned, Kimmy, but also notably DeepSeek, right?
358
00:33:44,056 --> 00:33:47,978
The fact that, so it's just a quick rundown, like.
359
00:33:49,106 --> 00:33:57,950
The traditional thinking is that in order to build a powerful AR model, you need a lot of
labelled data and a lot of processing power to train on the labelled data.
360
00:33:57,950 --> 00:34:09,335
But DeepSeek, based on their paper, says that you can actually build a powerful model with
less labelled data, but a lot more from reinforcement learning.
361
00:34:09,335 --> 00:34:14,517
And reinforcement learning, the advantage is that you don't need labelled data to do
reinforcement learning.
362
00:34:14,517 --> 00:34:17,037
You have some to get it up to like a
363
00:34:17,037 --> 00:34:19,037
particular sort of like head start.
364
00:34:19,037 --> 00:34:28,279
But from there on, that's like the first 10%, but the rest, the 90%, the model just plays
by itself and just learns from its own mistakes and then reapplies them.
365
00:34:28,279 --> 00:34:32,260
then that's the beauty of reinforcement learning.
366
00:34:32,560 --> 00:34:38,601
And then apparently they were able to do it on like older like chips, like H20 chips.
367
00:34:38,601 --> 00:34:43,622
But I think that claim is still being tested by independent experts.
368
00:34:44,002 --> 00:34:55,434
That's an example of really stretching the boundaries of existing legacy tech and finding
new software layer, new algorithms to make the most out of your hardware components.
369
00:34:55,434 --> 00:34:59,878
So yeah, I'm sure there is that effect.
370
00:35:00,233 --> 00:35:15,821
Yeah, and like speaking of other policy effects like you know the didn't meta recently
refused to sign the EU's uh You know acknowledgement around around their policies am I
371
00:35:15,821 --> 00:35:17,151
correct on that?
372
00:35:17,666 --> 00:35:22,049
Yeah, I think you're referring to the general purpose AI code.
373
00:35:22,049 --> 00:35:24,290
Yeah, it's been uh a...
374
00:35:24,891 --> 00:35:35,638
It's been one of those hotly contentious policy documents in the industry and even caused
some sort of division among the big tech companies themselves.
375
00:35:35,638 --> 00:35:39,391
I think it's all been finalized since last week.
376
00:35:39,391 --> 00:35:41,744
So I think there is a list of signatories.
377
00:35:41,744 --> 00:35:43,344
You can see who signed and who's not.
378
00:35:43,344 --> 00:35:47,316
But yeah, I think in the weeks lead up to it, yeah, certain...
379
00:35:47,448 --> 00:35:50,254
Companies have said they'll sign on, some say they won't.
380
00:35:50,254 --> 00:35:51,345
Yeah.
381
00:35:51,928 --> 00:35:52,598
Yeah.
382
00:35:52,598 --> 00:36:04,093
And you know, I think another tone from the U S action plan is there's going to be very
little regard for, um, environmental concerns.
383
00:36:04,093 --> 00:36:06,444
It's, you know, build baby build.
384
00:36:06,544 --> 00:36:17,158
And, know, again, this seems like the, this seems a little bit like a, a, a uh flipping of
the script, you know, um, historically the picture has been painted that China has had
385
00:36:17,158 --> 00:36:18,849
less regard for.
386
00:36:18,881 --> 00:36:26,005
know, green initiatives and really the West has been putting that more in focus.
387
00:36:26,005 --> 00:36:38,672
And it seems to me the tone of the, you know, this administration's policy is that is in
the way in the backseat, maybe in the trunk.
388
00:36:38,672 --> 00:36:46,557
uh First and foremost is about establishing global dominance around AI and maintaining the
lead.
389
00:36:46,650 --> 00:36:48,657
Is that the way you read it as well?
390
00:36:50,082 --> 00:36:51,543
Yeah, interesting.
391
00:36:52,044 --> 00:37:05,595
It depends on how you define lead, because I hear lot of commentary around the whole AI
race and who's leading what, but it's a very, I personally find it's very simplified view
392
00:37:05,595 --> 00:37:09,358
of how this whole ecosystem works.
393
00:37:09,358 --> 00:37:12,600
First of all, you have to divide it within like layers, right?
394
00:37:12,841 --> 00:37:15,543
And it depends on which layer you're looking at.
395
00:37:15,543 --> 00:37:18,744
So at the app layer, I'd say like,
396
00:37:18,744 --> 00:37:27,339
both China and US have equally diverse and widely used apps at the application layer.
397
00:37:27,739 --> 00:37:38,956
And as you go deeper within the stack, so I think maybe just to really simplify, at the
first layer, it's really a question around diversity and USichu has the most diverse and
398
00:37:38,956 --> 00:37:40,867
used ecosystem.
399
00:37:40,867 --> 00:37:42,508
Then we get to the model layer.
400
00:37:42,508 --> 00:37:45,720
That's okay, that's why I can see the race concept being.
401
00:37:45,720 --> 00:37:51,354
being true because it's really a race to who can build the smallest and cheapest model.
402
00:37:51,354 --> 00:37:54,916
That's really what I see the races and it's different approaches, right?
403
00:37:54,916 --> 00:38:06,643
So from a US perspective, it's really driven by the private sector, private sector and
also the inter competition trying to produce the cheapest sort of APIs for powerful
404
00:38:06,643 --> 00:38:07,250
models.
405
00:38:07,250 --> 00:38:12,086
Whereas in China is also that domestic competition from an open source level.
406
00:38:13,076 --> 00:38:19,846
At the infrastructure hardware chips layer, I didn't really see it as a race.
407
00:38:19,846 --> 00:38:21,218
It's more like...
408
00:38:23,062 --> 00:38:27,024
you find your own adventure to building self-sufficiency.
409
00:38:27,024 --> 00:38:28,485
That's how I see it.
410
00:38:28,845 --> 00:38:35,509
And you can either do it from a constructive or deconstructive uh approach.
411
00:38:35,509 --> 00:38:44,133
Constructive being, so when I say that, constructive means, for example, subsidies,
government investments, promoting trade.
412
00:38:44,174 --> 00:38:46,715
So stuff that kind of helps grow.
413
00:38:46,775 --> 00:38:52,608
And deconstructive, is export controls, tariffs, or other um
414
00:38:52,608 --> 00:38:57,741
anti-free trade policies that try to stifle what your competitors are doing.
415
00:38:57,741 --> 00:39:02,164
And it's not that you can only be constructive, it can't be deconstructive.
416
00:39:02,164 --> 00:39:03,915
You're gonna have a balance between those two, right?
417
00:39:03,915 --> 00:39:05,576
That's how policy works, right?
418
00:39:05,576 --> 00:39:16,732
So I think that layer, it's really, again, choose your own adventure, but your policy mix
depends on your current country circumstances.
419
00:39:16,932 --> 00:39:18,033
So that's how I see it.
420
00:39:18,033 --> 00:39:20,134
um
421
00:39:21,230 --> 00:39:34,701
In terms of more broadly, I think it is true that both, I think since 2022 and whole
January AI, I think before that, the whole AI policy debate was really around just the
422
00:39:34,701 --> 00:39:38,705
typical safety and reliability, all that stuff.
423
00:39:38,705 --> 00:39:43,708
Since 2022, AI is gonna become a more geopolitical topic.
424
00:39:43,869 --> 00:39:50,254
And so the idea of like, so as I said, the idea of leading is more around
425
00:39:50,444 --> 00:39:57,056
establishing, I think like just who has more influence on standards.
426
00:39:57,056 --> 00:40:03,288
I think that's one specific angle that I can see where there's that strong competition, as
I said before.
427
00:40:03,288 --> 00:40:07,719
Once you set the standards, your whole ecosystem becomes sticky.
428
00:40:07,719 --> 00:40:12,750
And when your system becomes sticky, people have to use it, revenue comes in, your GDP
booms.
429
00:40:12,750 --> 00:40:16,802
I think that's sort of the longer-term meta strategy.
430
00:40:16,802 --> 00:40:18,442
So I see that.
431
00:40:18,994 --> 00:40:24,198
And you also mentioned the whole green and uh energy thing.
432
00:40:24,499 --> 00:40:28,302
That's also a big part of it because AI consumes a lot of power.
433
00:40:28,582 --> 00:40:38,431
I think this is also where it's important not to see AI policy in isolation, but how it
interconnects with every other domestic policy of a country.
434
00:40:38,431 --> 00:40:47,638
So AI crosses over into energy policy, and it also crosses over into land policy, because of the
amount of land you have to dedicate to data centers.
435
00:40:47,680 --> 00:40:49,622
it crosses over into like tax.
436
00:40:49,622 --> 00:40:50,933
That's a huge thing, right?
437
00:40:50,933 --> 00:40:54,316
Tax incentives and all that to incentivize development.
438
00:40:54,316 --> 00:40:56,670
You have to see all this in one big picture.
439
00:40:56,670 --> 00:41:07,647
I think where it's true, at least from the objective stats, is that China does have a huge
head start in this space because they have a lot of capacity, like in terms of electricity
440
00:41:07,647 --> 00:41:14,172
generation, there's a lot of land that's still underdeveloped that can be turned
into power plants.
441
00:41:14,172 --> 00:41:16,802
There is a stronger central
442
00:41:16,802 --> 00:41:19,783
government push in energy.
443
00:41:20,083 --> 00:41:23,985
That's been quite consistent for many, many years.
444
00:41:24,345 --> 00:41:32,608
The green tech industry over there has a lot of state support and also a lot of private sector
activity.
445
00:41:32,729 --> 00:41:34,780
It's also a very popular STEM subject.
446
00:41:34,780 --> 00:41:40,552
When we talk to Chinese developers, a lot of them want to go into energy tech as their
engineering field.
447
00:41:40,552 --> 00:41:45,454
So there's that sort of capacity already there, and they're just making use of it,
448
00:41:45,454 --> 00:41:47,634
and part of it's going towards AI.
449
00:41:47,634 --> 00:41:51,114
And the US is also focusing the same efforts now.
450
00:41:51,114 --> 00:41:56,034
I think it's just a consistent challenge with the West around energy.
451
00:41:56,774 --> 00:42:05,434
There's a lot of debate around which sources you have to use, and each
particular energy source has its own big policy debate.
452
00:42:05,434 --> 00:42:09,514
But I think in China, they sort of just have that one set.
453
00:42:09,514 --> 00:42:15,318
Like they just go with that, so not one source, they go with a
certain mix of sources.
454
00:42:15,318 --> 00:42:16,210
And just run with that.
455
00:42:16,210 --> 00:42:21,970
They kind of skip the whole policy debate in the beginning, just go straight to
implementation.
456
00:42:22,541 --> 00:42:22,831
Yeah.
457
00:42:22,831 --> 00:42:27,993
And then, we're almost out of time, and we could talk about this
stuff all day.
458
00:42:27,993 --> 00:42:33,806
Um, I geek out on, you know, AI in general, but bringing it back to legal.
459
00:42:34,026 --> 00:42:48,332
So what do you envision, and, you know, on what sort of timeline do you see legal
work and the resources, the inputs, which are mostly human capital today?
460
00:42:48,432 --> 00:42:51,157
When do you see that being disrupted?
461
00:42:51,157 --> 00:43:01,384
where we will see material impact to law firm revenue, law firm headcount, inside uh
counsel processes.
462
00:43:01,384 --> 00:43:06,147
Like today, there's a lot of experimentation and I think there is some impact.
463
00:43:06,147 --> 00:43:10,349
But when do you see real disruption taking place in the legal space?
464
00:43:11,086 --> 00:43:24,766
Yeah, I could draw a graph here, but I feel like it's just a function of this: the more
standardized and the lower the risk value of the work, the more prone it is to automation.
465
00:43:25,006 --> 00:43:37,726
And not just AI automation, but just any form of automation, like even your traditional
boring, algorithmic sort of if-else statements; that stuff can also count as automation.
466
00:43:38,574 --> 00:43:40,794
So, standardized and low risk,
467
00:43:40,794 --> 00:43:53,474
the reason why I say that is because obviously standardized means a consistent process that's
really easy to encode into code, and low risk being that if something goes wrong, the loss or
468
00:43:53,474 --> 00:43:55,954
the chance of harm is still going to be quite low.
469
00:43:55,954 --> 00:44:04,374
And it's also one of those where, if something goes wrong, it's still easy or
still practical for someone to just jump in and fix things.
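As a minimal sketch of that "standardized and low risk" triage, here is what the boring if-else version might look like in Python; the scores, thresholds, and task names are hypothetical illustrations, not anything from the episode:

```python
# Hypothetical illustration of automation-proneness as a function of
# standardization and risk; the scores, thresholds and task names are made up.
def automation_prone(standardization: float, risk: float) -> bool:
    # standardization: 0.0 (fully bespoke) to 1.0 (fully templated)
    # risk: 0.0 (trivial stakes) to 1.0 (bet-the-company)
    return standardization >= 0.8 and risk <= 0.2

tasks = {
    "residential conveyancing": (0.9, 0.1),
    "stock-standard loan terms": (0.85, 0.15),
    "bespoke M&A negotiation": (0.2, 0.9),
}

for name, (std, risk) in tasks.items():
    verdict = "prone to automation" if automation_prone(std, risk) else "stays with lawyers for now"
    print(f"{name}: {verdict}")
```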
470
00:44:04,814 --> 00:44:05,966
like, yeah.
471
00:44:05,966 --> 00:44:16,786
The usual suspects that people talk about are, you know, low value real estate
transactions, like mortgages, conveyancing, to the extent that's still done by
472
00:44:16,786 --> 00:44:17,866
lawyers.
473
00:44:18,606 --> 00:44:24,326
Some aspects of like loan contracts, equity, that's like all stock standard terms.
terms.
474
00:44:24,786 --> 00:44:30,086
Certain tech contracts, software contracts, again, anything that's got to do with stock
standard terms.
475
00:44:30,086 --> 00:44:31,086
Yeah, definitely.
476
00:44:31,086 --> 00:44:34,866
I mean, that's like the primary area a lot of these legal tech startups are targeting.
477
00:44:36,300 --> 00:44:41,353
I'd say that's something that's already been in the process of automation for the past four, five
years.
478
00:44:41,413 --> 00:44:49,038
In the next two, three years or so, we'll start to target work that's still relatively
standardized and still relatively low risk.
479
00:44:49,038 --> 00:44:55,762
But I'd say this: I think maybe 80% standardized and an extra 10% in risk.
480
00:44:55,762 --> 00:44:57,403
That's like the medium-stakes level.
481
00:44:57,403 --> 00:45:01,646
um That's where we start to see stronger
482
00:45:01,772 --> 00:45:07,444
reasoning capabilities of these models being able to tackle these sorts of semi-standardized
problems.
483
00:45:07,444 --> 00:45:17,319
So they've still got some consistency in the problem, but there's also a level of
customization or nuanced thinking that these models have to recognize, but
484
00:45:17,319 --> 00:45:21,290
not too nuanced that it sort of confuses the model.
485
00:45:21,791 --> 00:45:29,954
So that's probably where we're getting to: again, the same areas I just identified, but
a bit more complex problems.
486
00:45:29,986 --> 00:45:39,444
We also start to see areas like, I'd say, crime, like certain, I uh don't know what's the
right word to use, but petty crime.
487
00:45:39,444 --> 00:45:42,436
I think that's where you can start using it for petty crime.
488
00:45:42,596 --> 00:45:49,181
Also, um yeah, a lot more areas of commercial law, so commercial contracting.
489
00:45:50,403 --> 00:45:59,510
What everyone's really excited about is in the next, I'd say, eight or 10 years, where we
really start tackling highly nuanced legal problems.
490
00:45:59,968 --> 00:46:04,560
And actually, this is where I honestly don't really know what will be the end outcome.
491
00:46:04,560 --> 00:46:11,663
As a practicing lawyer myself, when I say highly nuanced problems, I do mean they're
highly nuanced.
492
00:46:11,663 --> 00:46:21,027
I think the common misconception that people have is that like contracts or reading laws,
it's all based on what's written on the paper.
493
00:46:21,027 --> 00:46:24,198
As long as you know what's on paper, you can interpret that.
494
00:46:24,198 --> 00:46:26,909
You basically have the full answer.
495
00:46:26,909 --> 00:46:29,614
Actually, no. Like, the text,
496
00:46:29,614 --> 00:46:33,254
let's say probably only addresses 30 % of your problem.
497
00:46:33,274 --> 00:46:40,874
The other 60% is actually understanding your client's needs, the problem at hand, and also the
market.
498
00:46:41,054 --> 00:46:45,274
And the question is, how do you encode all of that into numbers?
499
00:46:45,274 --> 00:46:48,894
That's ultimately what developers have to do.
500
00:46:49,034 --> 00:46:53,814
Encode the legal problem into numbers that can be read by a machine.
501
00:46:53,814 --> 00:46:55,594
That's what you have to do at the end of the day.
502
00:46:55,594 --> 00:46:58,272
How do you encode client interests,
503
00:46:58,272 --> 00:47:03,794
market standards, market practices, to the extent that they're not written down in words or
standards?
504
00:47:03,794 --> 00:47:07,606
We're just talking about conversation dialogues and all that.
505
00:47:07,606 --> 00:47:15,669
How do you encode that in a consistent manner that a model can reliably reference for XYZ
problems?
506
00:47:15,669 --> 00:47:17,490
I've tried doing that myself.
507
00:47:17,490 --> 00:47:19,591
It's really hard, right?
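To make that concrete, here is one rough sketch, in Python, of what "encoding context into numbers" often looks like in practice today: embedding unwritten context so a model can retrieve it. The notes, the toy embed_text function, and the question are all hypothetical, and, as Ray says, this only scratches the surface of the problem:

```python
# One common way to encode context into numbers: turn unwritten context
# (call notes, client preferences, market practice) into vectors and retrieve
# the most relevant note for a question. embed_text is a toy stand-in for a
# real embedding model; everything here is a hypothetical illustration.
import math

def embed_text(text: str) -> list[float]:
    # Toy embedding: hash character trigrams into a small normalized vector.
    vec = [0.0] * 64
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3].lower()) % 64] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already normalized, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

context_notes = [
    "Client prefers broad indemnities but will trade on liability caps",
    "In this market, 30-day payment terms are standard for SaaS deals",
    "Counterparty historically pushes back hard on audit rights",
]
index = [(note, embed_text(note)) for note in context_notes]

question = "What liability cap position should we open with?"
q_vec = embed_text(question)
best_note, _ = max(index, key=lambda pair: cosine(q_vec, pair[1]))
print("Most relevant context:", best_note)
```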
508
00:47:19,851 --> 00:47:20,601
But who knows?
509
00:47:20,601 --> 00:47:24,223
Maybe at that point, the whole architecture will change.
510
00:47:24,223 --> 00:47:27,192
We're currently still on a transformer architecture.
511
00:47:27,192 --> 00:47:32,482
which is very much predict the next word, predict the next token.
512
00:47:32,482 --> 00:47:34,376
Obviously there's a lot of layers around that.
513
00:47:34,376 --> 00:47:37,808
It's not just that, but fundamentally that's still what happens.
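For anyone who hasn't seen that loop spelled out, the next-token prediction Ray is describing looks roughly like this; a toy sketch in Python with a fake model standing in for the transformer:

```python
# A bare-bones picture of the predict-the-next-token loop; model_logits is a
# stand-in for a real transformer's forward pass, so the output is nonsense,
# but the decoding loop itself is the part being illustrated.
import random

VOCAB = ["the", "party", "shall", "indemnify", "other", "losses", "."]

def model_logits(tokens: list[str]) -> list[float]:
    # Placeholder for a trained model: one score per vocabulary entry.
    random.seed(len(tokens))  # deterministic toy behaviour
    return [random.random() for _ in VOCAB]

tokens = ["the", "party"]
for _ in range(5):
    scores = model_logits(tokens)
    next_token = VOCAB[scores.index(max(scores))]  # greedy: take the argmax
    tokens.append(next_token)

print(" ".join(tokens))
```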
514
00:47:37,808 --> 00:47:45,612
Who knows, it might be a new mainstream model, like a state space model, that might allow
us to do way more nuanced reasoning.
515
00:47:46,113 --> 00:47:50,335
Right now, all of the reasoning models are just limited to like chain of thought.
516
00:47:51,135 --> 00:47:53,417
But I think chain of thought is just level one.
517
00:47:53,417 --> 00:47:56,158
There's like way more levels down the line.
518
00:47:56,526 --> 00:48:01,186
which I don't know yet because I'm not within the research centers themselves.
519
00:48:01,186 --> 00:48:08,786
So yeah, I'm probably gonna be one of those people who are really optimistic
around disruption in law.
520
00:48:08,786 --> 00:48:14,846
It's weird for a lawyer to say that, but I feel like it's gonna be amazing because it'll
free up a lot of our time.
521
00:48:14,846 --> 00:48:17,286
I think lawyers would just be happier in general.
522
00:48:17,286 --> 00:48:19,606
We don't wanna be bogged down with boring work.
523
00:48:19,606 --> 00:48:23,246
We wanna do cool, more strategic work, and there'll be new types of
524
00:48:23,246 --> 00:48:27,805
industries and work coming out of that as well that we can't conceive of today.
525
00:48:28,026 --> 00:48:40,946
If you think about the idea of a corporation, like when the Dutch Empire wanted to expand,
that's when they created this idea of a corporation as a vehicle to collect private funds
526
00:48:40,946 --> 00:48:42,926
to fund expansion.
527
00:48:43,046 --> 00:48:52,526
So that's when you have the idea of a corporation, which created the idea of shares, which then
created the whole stock market, which then created the whole of securities law,
528
00:48:53,058 --> 00:48:58,261
commercial law, corporate law, all of that just came from one new abstract idea.
529
00:48:58,261 --> 00:49:08,908
Who knows one day there'll be a new abstract idea that we can't conceive of today, but it
will be there in the future and that will create a whole new area of law that's way above
530
00:49:08,908 --> 00:49:13,931
the pay grade of AI models and we humans have to navigate through that.
531
00:49:14,312 --> 00:49:16,193
So I'm very optimistic.
532
00:49:16,193 --> 00:49:18,034
Yeah, I'm actually so keen for it.
533
00:49:18,143 --> 00:49:26,475
Yeah, I mean, as a legal tech CEO, I am really enjoying myself.
534
00:49:26,475 --> 00:49:30,166
It doesn't come without uh heartburn.
535
00:49:30,166 --> 00:49:33,877
You know, we are solely dependent on law firms for our business.
536
00:49:33,877 --> 00:49:38,519
And, you know, I see a lot of complacency.
537
00:49:38,519 --> 00:49:44,750
um And I also see firms that are being aggressive and going out and hiring talent and
making investments.
538
00:49:44,750 --> 00:49:47,211
So I see kind of all ends of the spectrum.
539
00:49:47,245 --> 00:49:52,968
But I worry that things, I don't know how it is in Australia, but here in the US, it's a
very fragmented legal market.
540
00:49:52,968 --> 00:50:02,412
The AmLaw 200, I mean, 200 law firms, the AmLaw 100 is, if you add up all the revenue, they'd be
like Fortune 150.
541
00:50:02,412 --> 00:50:08,936
So it um does concern me, but I'm optimistic as well.
542
00:50:08,936 --> 00:50:11,877
And yeah, this has been a fantastic conversation.
543
00:50:12,010 --> 00:50:20,402
Before we wrap up, how do people find out more about the work that you're doing with your
regulation tracker or any other projects that you're working on?
544
00:50:20,960 --> 00:50:26,012
Yeah, so simply I have a website that links everything.
545
00:50:26,072 --> 00:50:35,816
So it's like www.techcareer.com, or you can also just search for me on LinkedIn, Raymond Sun,
and yeah, it has all the links in there.
546
00:50:35,816 --> 00:50:40,193
But yeah, just start with these two and yeah, hopefully you find my content fun.
547
00:50:40,193 --> 00:50:45,036
Yeah, we'll include links in the show notes um so people can get to you.
548
00:50:45,036 --> 00:50:50,660
Well, Ray, I really appreciate you taking a little bit of time out of your morning to have
a conversation with me.
549
00:50:50,660 --> 00:50:52,320
This has been a lot of fun.
550
00:50:52,821 --> 00:50:55,302
Let's keep doing the work that you're doing, man.
551
00:50:55,302 --> 00:50:59,266
uh We all benefit from it, so we appreciate it.
552
00:50:59,266 --> 00:51:00,169
Yeah, likewise, Ted.
553
00:51:00,169 --> 00:51:04,133
Thank you very much for bringing me on board, and I always love chatting with you,
especially on these topics.
554
00:51:04,133 --> 00:51:05,536
Yeah, thank you.
555
00:51:05,536 --> 00:51:06,017
All right.
556
00:51:06,017 --> 00:51:07,418
Have a good afternoon.
557
00:51:07,821 --> 00:51:08,781
Thanks.