Sarah Thompson

In this episode, Ted sits down with Sarah Thompson, Chief Product Officer at BlueStar, to discuss how AI and M365 are reshaping legal investigations and eDiscovery. From tackling the modern attachment problem to navigating complex data integrations, Sarah shares her expertise in building custom AI solutions that deliver results. With insights on accelerating case resolution and preparing for AI’s impact on the job market, this conversation gives legal professionals a clear view of the tools and strategies needed to stay competitive.

In this episode, Sarah shares insights on how to:

  • Address the unique challenges of modern attachments in eDiscovery
  • Leverage M365 effectively in legal investigations
  • Build custom AI solutions tailored to complex legal data
  • Integrate subject matter expertise into AI adoption strategies
  • Prepare for the shifts AI will create in legal job markets

Key takeaways:

  • Modern attachments require new approaches and custom tools in eDiscovery
  • AI can significantly shorten investigation timelines, sometimes to under 48 hours
  • Subject matter expertise is essential for maximizing AI effectiveness
  • Off-the-shelf solutions often fall short for complex legal use cases
  • The legal job market will evolve as AI becomes more integrated into daily workflows

About the guest, Sarah Thompson

Sarah Thompson is a leading expert in legal AI, Microsoft 365 eDiscovery, and digital evidence strategy with over two decades of experience driving legal technology innovation. As Chief Product Officer at BlueStar Case Solutions and Founder of Siemly, LLC, she develops AI-powered solutions for challenges like cloud-native data, versioning, and real-time investigations. Known for turning complex AI and compliance issues into practical workflows, she also trains legal professionals on emerging risks such as deepfakes and data integrity.

“There’s going to be jobs created that we can’t even fathom.”

Connect with Sarah:

Subscribe for Updates

Newsletter


Machine Generated Episode Transcript

1 00:00:00,151 --> 00:00:02,024 Sarah, how are you this afternoon? 2 00:00:02,167 --> 00:00:02,832 I'm awesome. 3 00:00:02,832 --> 00:00:03,746 How are you? 4 00:00:03,925 --> 00:00:04,795 I'm doing good. 5 00:00:04,795 --> 00:00:10,265 I'm, I'm a little worn down from five days of ILTACON but, uh, I'm here. 6 00:00:10,265 --> 00:00:12,148 I'm, I'm vertical. 7 00:00:12,209 --> 00:00:14,229 So all is well. 8 00:00:14,950 --> 00:00:15,610 I'm winning. 9 00:00:15,610 --> 00:00:26,565 Um, well, you and I got connected through a, uh, one of your colleagues had reached out and I took a look at some of the stuff that you like to write and talk about. 10 00:00:26,565 --> 00:00:30,987 And there was pretty good alignment with what we'd like to talk about on the podcast. 11 00:00:31,223 --> 00:00:37,404 Stuff like M365 and AI and data and you've got a long history in legal tech. 12 00:00:37,404 --> 00:00:41,611 um You've been 20 plus years. 13 00:00:41,611 --> 00:00:45,447 Why don't you tell everybody kind of who you are, what you do and where you do it. 14 00:00:46,075 --> 00:00:46,845 Sure. 15 00:00:46,845 --> 00:00:48,396 My name is Sarah Thompson. 16 00:00:48,396 --> 00:00:50,976 I'm the Chief Product Officer for Blue Star. 17 00:00:50,976 --> 00:00:59,148 We're a litigation support shop in Chicago, but we operate worldwide, um but focus mainly on the United States. 18 00:00:59,289 --> 00:01:13,623 And uh what I do is really, uh we build legal tech solutions to kind of help out law firms and in-house counsel win their cases. 19 00:01:13,623 --> 00:01:15,437 I mean, that's the bottom line. 20 00:01:15,437 --> 00:01:23,044 and we do that both with, we have an investigations platform called Siemly, which kind of is pretty cool. 21 00:01:23,044 --> 00:01:34,894 It makes it so you don't have to collect data in Microsoft or at least start from a place of knowledge when you are doing collections, performing collections. 22 00:01:34,894 --> 00:01:40,319 And then we have, we build custom AI solutions to help, that are kind of matter-based. 23 00:01:40,319 --> 00:01:41,870 So we're doing a lot of really cool stuff. 24 00:01:41,870 --> 00:01:43,561 We've been doing it for, uh 25 00:01:43,857 --> 00:01:46,297 You know, like you said, I've been in the industry 20 years. 26 00:01:46,297 --> 00:01:49,637 Blue Star has actually been a litigation support shop for 20 years. 27 00:01:49,637 --> 00:01:53,737 So we kind of, we've seen it all, done a lot. 28 00:01:54,477 --> 00:01:55,577 Don't know everything yet. 29 00:01:55,577 --> 00:01:57,237 Probably never will. 30 00:01:57,637 --> 00:01:58,941 So that's basically it. 31 00:01:58,941 --> 00:02:00,927 As you learn everything, it changes. 32 00:02:00,927 --> 00:02:03,331 So I don't think it's possible. 33 00:02:03,394 --> 00:02:06,096 I think if you learn it, it would be pretty boring. 34 00:02:06,217 --> 00:02:16,476 What I love about my job is that every day, you know, we're learning new stuff, especially with AI, it's like pretty mind blowing um how the pace that we're moving and the things 35 00:02:16,476 --> 00:02:21,491 that we can do today that we couldn't do even three months ago or yesterday, you know? 36 00:02:21,491 --> 00:02:23,250 So yeah, it's pretty cool. 37 00:02:23,250 --> 00:02:25,141 Yeah, I'm an enthusiast. 38 00:02:25,141 --> 00:02:27,893 I don't call myself an AI expert. 39 00:02:27,893 --> 00:02:29,634 I'm more of an enthusiast. 40 00:02:29,634 --> 00:02:32,355 I don't even know what an AI expert really means. 
41 00:02:32,355 --> 00:02:35,697 um Unless, you know, if you were... 42 00:02:35,697 --> 00:02:45,472 if you're working in the labs and engineering, okay, yeah, but you know, for the rest of us, I think just getting in and spending time with the tools and learning their 43 00:02:45,472 --> 00:02:50,456 capabilities and learning how to make them useful ah is best spent... 44 00:02:50,456 --> 00:02:52,408 actually doing both. 45 00:02:52,408 --> 00:02:53,579 We're into the code. 46 00:02:53,579 --> 00:02:55,220 into the rebuild agents, that kind of thing. 47 00:02:55,220 --> 00:02:57,732 That's, that's the really cool stuff. 48 00:02:57,732 --> 00:03:04,127 We definitely, yeah, we're looking at the products and we know what they do and it's really quite cool. 49 00:03:04,127 --> 00:03:09,122 But I think that the future we're going to see is going to be beyond that. 50 00:03:09,122 --> 00:03:11,083 I don't think it's going to be tool-based. 51 00:03:11,083 --> 00:03:18,719 think they had, there's a very small window where people are going to be, you know, buying these cool AI tools and you know, very shortly they're going to be building them. 52 00:03:19,423 --> 00:03:20,913 I, we agree with you. 53 00:03:20,913 --> 00:03:27,045 So, you know, we're gonna, we're gonna talk about that in, in, in just a minute, but we, we, we agree with you. 54 00:03:27,045 --> 00:03:37,588 I think it's, terms of differentiation and how firms are going to go about that process, it's going to require, you're not going to be buying off the shelf tools to differentiate 55 00:03:37,588 --> 00:03:38,009 yourself. 56 00:03:38,009 --> 00:03:41,440 If your competitor can buy them, it's not differentiating. 57 00:03:41,440 --> 00:03:46,380 So, you know, you're going to need to use your data. 58 00:03:46,380 --> 00:03:52,331 So, well, let's talk a little bit about M365, because that's one area of overlap with you guys and us. 59 00:03:52,331 --> 00:03:59,753 So, InfoDash is an intranet extranet platform that's built on SharePoint Online, Azure and Teams. 60 00:03:59,993 --> 00:04:06,255 And we've been doing SharePoint legal probably longer than anyone out there. 61 00:04:06,255 --> 00:04:14,087 Started in 08 and we were a services company and built custom bespoke solutions. 62 00:04:14,087 --> 00:04:15,757 And then we productized 63 00:04:15,917 --> 00:04:31,517 We started the process in 2018 and then Microsoft completely nuked their development model in SharePoint in 2019 ish and released the SharePoint framework and know, switched from 64 00:04:31,517 --> 00:04:32,877 Angular to React JS. 65 00:04:32,877 --> 00:04:44,237 And so we kind of back to the drawing board, but we finally got across the goal line in and released in January of 2022 and things have been going great ever since. 66 00:04:44,237 --> 00:04:44,609 But 67 00:04:44,609 --> 00:04:53,799 How uh did your alignment with M365, by you, mean, Blue Star, how did that alignment with M365, how did that happen? 68 00:04:54,545 --> 00:05:04,508 Right, gosh, around 2012, we started creating a platform called eCloud Collect. 69 00:05:04,508 --> 00:05:08,819 And this was uh kind of a concept before its time, I want to say. 70 00:05:08,819 --> 00:05:10,409 It was like a cloud collection tool. 71 00:05:10,409 --> 00:05:18,852 So it would collect from Microsoft 365, Google, AWS, for discovery purposes. 72 00:05:18,852 --> 00:05:21,372 But it would also do remote computer collections. 73 00:05:21,393 --> 00:05:23,085 And it was quite big. 
74 00:05:23,085 --> 00:05:25,166 before its time, you know? 75 00:05:25,207 --> 00:05:33,914 And then we ended up selling um this product over to a company called Zapproved, or some people call it Z-approved, it's Zapproved. 76 00:05:33,914 --> 00:05:35,255 It's actually now Exterro. 77 00:05:35,255 --> 00:05:44,543 uh And so they started like integrating our product into their, you know, kind of collection tool, because they're building kind of an eDiscovery stack, right? 78 00:05:44,543 --> 00:05:47,415 So uh we were collecting from Microsoft already. 79 00:05:47,415 --> 00:05:49,806 And, you know, we learned a lot there. 80 00:05:49,806 --> 00:05:52,509 We had to do OneDrive and SharePoint. 81 00:05:52,592 --> 00:05:54,583 We saw the challenges there. 82 00:05:54,583 --> 00:05:57,985 And then we started moving. 83 00:05:58,005 --> 00:06:07,990 As it so happened, the product manager that headed up Microsoft eDiscovery over at Microsoft was a friend of mine from in the industry. 84 00:06:07,990 --> 00:06:09,671 His name was Rocky Messing. 85 00:06:09,671 --> 00:06:19,577 And we were chatting one day and he says, you know, it would be so great if you guys would build an integration with your legal hold tool into our Microsoft 365 eDiscovery 86 00:06:19,577 --> 00:06:20,907 preservation tool. 87 00:06:21,219 --> 00:06:29,366 And so that when people put a hold on in Legal Hold Pro, it would actually go on in Microsoft, and if they turned it off, you know, it would turn off 88 00:06:29,366 --> 00:06:30,658 anyways, the integration. 89 00:06:30,658 --> 00:06:32,600 So um we did that. 90 00:06:32,600 --> 00:06:40,406 And so we were the first, we built the first tool that integrated with the Microsoft preservations, you know, in the legal tech space. 91 00:06:41,047 --> 00:06:42,789 And we learned a lot. 92 00:06:42,789 --> 00:06:43,710 It was not easy. 93 00:06:43,710 --> 00:06:47,665 um As you probably, as you know, you, you alluded to. 94 00:06:47,665 --> 00:06:53,965 There's a lot of ins and outs, so we ended up having to use some weird technology. 95 00:06:54,385 --> 00:06:58,805 We had to use some old stuff, integrated PowerShell throughout. 96 00:06:58,805 --> 00:07:01,145 So a big learning curve. 97 00:07:02,085 --> 00:07:11,285 And so after that acquisition was complete, about three years later, I had gone over to Zapproved for the acquisition. 98 00:07:11,285 --> 00:07:14,825 I came back to Blue Star, and we went back to the drawing board. 99 00:07:14,993 --> 00:07:17,553 we're kind of thinking about, you know, what am I going to do next? 100 00:07:17,553 --> 00:07:29,413 And what I wanted to do was get, you know, really in on actual Microsoft investigations, you know, because, you know, from a product perspective, you have like assumptions and, oh, 101 00:07:29,413 --> 00:07:37,953 this is how it works, you know, but it's nothing, there's nothing like actually doing it, like the user, you know, living that user's experience. 102 00:07:38,213 --> 00:07:41,426 So my team, of course, we do collect, like they do. 103 00:07:41,426 --> 00:07:46,486 and have been doing for years like forensic collections from Microsoft, all these employee investigations. 104 00:07:46,486 --> 00:07:50,506 So I got involved in these investigations. 105 00:07:51,166 --> 00:07:55,706 And what we kind of were thinking was, it's interesting. 
106 00:07:56,246 --> 00:08:07,426 Having come from the forensic collection space just a little bit earlier, we were getting in these cases and they're all like, oh, let's get all the email, let's get all the 107 00:08:07,426 --> 00:08:10,577 SharePoint, let's get all the OneDrive files. 108 00:08:10,577 --> 00:08:13,197 for a relevant date range, which is a lot. 109 00:08:13,197 --> 00:08:15,817 And then let's get the computer as well. 110 00:08:15,817 --> 00:08:20,217 And then you start thinking to yourself, what are they actually looking for? 111 00:08:20,217 --> 00:08:22,797 So that question wasn't being asked, right? 112 00:08:22,797 --> 00:08:27,877 And so when we look at a computer forensic investigation, what are they looking at? 113 00:08:28,657 --> 00:08:29,877 What do the user do? 114 00:08:29,877 --> 00:08:31,197 What's the activity? 115 00:08:31,197 --> 00:08:35,997 Because generally these are like, they could be employee theft, IP litigation, that kind of thing. 116 00:08:35,997 --> 00:08:37,557 What happened, right? 117 00:08:37,557 --> 00:08:38,651 So we... 118 00:08:38,651 --> 00:08:44,832 thought it was weird that nobody was looking at the activity in Microsoft, you know, because that is readily available. 119 00:08:44,832 --> 00:08:48,605 You don't need to collect a computer for 20 grand or 10 grand or whatever it is. 120 00:08:48,825 --> 00:08:51,746 it's all of it is there and you can just have a look. 121 00:08:51,746 --> 00:09:02,091 So we started looking into the activity logs from an investigatory perspective and it was just a goldmine, you know, who did what, when, who were they talking to? 122 00:09:02,091 --> 00:09:03,111 Who did they email? 123 00:09:03,111 --> 00:09:04,592 What files did they download? 124 00:09:04,592 --> 00:09:06,272 Like, what were they looking at? 125 00:09:06,272 --> 00:09:08,304 And we found that like, 126 00:09:08,304 --> 00:09:16,444 I mean, this sounds like crazy, but we found like we could solve cases, some, if the data was there, within like 24 to 48 hours without collection. 127 00:09:16,444 --> 00:09:19,624 So we could see exactly what the employees stole. 128 00:09:19,664 --> 00:09:25,584 We knew exactly the data collected email that showed the proof of that, what happened. 129 00:09:25,584 --> 00:09:27,384 And so we built a platform. 130 00:09:27,384 --> 00:09:36,744 That's the seamless platform that I was talking about that just takes the audit logs from Microsoft and kind of denoises it so that you can see what happened. 131 00:09:37,413 --> 00:09:42,433 And if you've ever looked at an audit log, it's really terrible to look at. 132 00:09:42,897 --> 00:09:48,410 It's verbose, there's a lot of noise, and we wanted to see what happened. 133 00:09:48,410 --> 00:09:53,153 so we built it from the perspective of investigations, like legal investigations. 134 00:09:53,153 --> 00:09:56,605 And so, yeah, that's my experience with Microsoft. 135 00:09:56,766 --> 00:10:05,976 And in building that platform, we, again, have a very deep um understanding of the kind of 136 00:10:05,976 --> 00:10:10,020 landscape in Microsoft because we had to delve into each source. 137 00:10:10,020 --> 00:10:16,846 If you want to, in the audit log, yeah, it says they looked at a document in SharePoint, but I won't tell you the name. 138 00:10:16,846 --> 00:10:22,182 So we take the SharePoint ID, we can dive into SharePoint, get the name of the file, all that stuff. 
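For readers who want a concrete feel for the audit-log triage Sarah describes, here is a minimal Python sketch. It is not the Siemly platform's actual code — it assumes you have already exported unified audit log records to JSON (for example with the Search-UnifiedAuditLog PowerShell cmdlet or a Purview audit search), and the field and operation names shown are the typical schema for SharePoint and Exchange audit records, so verify them against your own export.

```python
# Rough illustration of "denoising" Microsoft 365 unified audit log records
# into a per-user activity timeline. Assumes records were already exported
# to audit.json; field/operation names are the commonly documented ones.
import json
from collections import defaultdict

# Operations that tend to matter in an employee investigation; adjust per matter.
INTERESTING_OPS = {
    "FileAccessed", "FileDownloaded", "FileSyncDownloadedFull",
    "FileCopied", "FileDeleted", "AnonymousLinkCreated",
    "SharingInvitationCreated", "MailItemsAccessed", "Send",
}

def build_timeline(records, custodian):
    """Group a custodian's noteworthy actions by day, dropping the noise."""
    timeline = defaultdict(list)
    for rec in records:
        if rec.get("UserId", "").lower() != custodian.lower():
            continue
        if rec.get("Operation") not in INTERESTING_OPS:
            continue  # background syncs, config reads, and other noise
        day = rec.get("CreationTime", "")[:10]           # e.g. "2025-03-14"
        target = rec.get("SourceFileName") or rec.get("ObjectId", "")
        timeline[day].append(f"{rec['Operation']}: {target}")
    return timeline

if __name__ == "__main__":
    with open("audit.json", encoding="utf-8") as f:
        records = json.load(f)                            # a list of audit records
    for day, actions in sorted(build_timeline(records, "jdoe@contoso.com").items()):
        print(day)
        for action in actions:
            print("   ", action)
```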
139 00:10:22,182 --> 00:10:27,136 So we got a really good understanding what's happening and it is complex. 140 00:10:27,877 --> 00:10:30,954 It's not as easy as you'd think it would be, right? 141 00:10:30,954 --> 00:10:32,865 Well, yeah, I've had to rip apart. 142 00:10:32,865 --> 00:10:39,204 was, I used to be on the SQL team at Microsoft years ago, 20, 26 years ago. 143 00:10:39,204 --> 00:10:40,112 It was crazy. 144 00:10:40,112 --> 00:10:49,941 um But so I've, I've ripped apart the SQL logs and, um IIS logs and Windows server logs. 145 00:10:49,941 --> 00:10:52,383 And yeah, it is, it is extremely painful. 146 00:10:52,383 --> 00:10:57,748 um You, we talked a little bit about the concept of like, 147 00:10:57,820 --> 00:11:00,863 modern attachments in Microsoft 365. 148 00:11:00,863 --> 00:11:15,386 you know, for the listeners out there who have been super annoyed that when you try to attach a file to an email in Outlook, by default, it wants to, it wants it to be a modern 149 00:11:15,386 --> 00:11:15,948 attachment. 150 00:11:15,948 --> 00:11:20,621 It wants to basically host it in OneDrive and provide a link. 151 00:11:20,621 --> 00:11:24,754 And, you know, sometimes that works, but a lot of times it creates friction. 152 00:11:24,778 --> 00:11:34,455 I've had that where I'm trying to share with external parties and, and they don't have edit access or I lose track of where the file actually is. 153 00:11:34,455 --> 00:11:41,660 So sometimes it just, and for me more often than not, I revert back to a traditional attachment. 154 00:11:42,341 --> 00:11:47,745 just because I've had friction on and it's like, I don't have time to really like understand it. 155 00:11:47,745 --> 00:11:50,907 And I think a lot of people fall into that camp. 156 00:11:50,907 --> 00:11:52,299 So, um, 157 00:11:52,299 --> 00:11:58,202 Like what are modern attachments and how do they differ from traditional attachments? 158 00:11:58,202 --> 00:12:00,584 So modern attachments are just links. 159 00:12:00,584 --> 00:12:02,336 They are those links, the OneDrive links. 160 00:12:02,336 --> 00:12:04,898 They're links to files. 161 00:12:04,898 --> 00:12:13,756 uh Generally refer to files that are hosted uh in the uh productivity tool that you're using. 162 00:12:13,756 --> 00:12:17,270 So whether it's Microsoft 365 or Google Workspaces. 163 00:12:17,270 --> 00:12:20,432 So they're basically shared links to files. 164 00:12:20,432 --> 00:12:24,155 And so they help with collaboration, I think, in this day and age. 165 00:12:24,948 --> 00:12:30,252 much more prevalent than ever before after 2020, especially with the remote workers. 166 00:12:30,332 --> 00:12:33,344 So the problem, they've been around forever though, right? 167 00:12:33,344 --> 00:12:44,543 And the problem with traditionally discovery and how it's been is that you collect this email and there's a link there, but where's that file? 168 00:12:44,543 --> 00:12:52,272 So if you had an attachment, that attachment would be collected with the uh parent email and would be considered, you know, uh 169 00:12:52,272 --> 00:12:55,392 relevant to that email and that would be part of the discovery. 170 00:12:55,812 --> 00:12:58,832 Not with a modern attachment. 171 00:12:59,712 --> 00:13:10,872 Back in the day when we created eCloud Collect and we were collecting from Microsoft 365, we go to Microsoft and we're like, need that file. 172 00:13:11,052 --> 00:13:13,132 We need that file that's linked here. 173 00:13:13,132 --> 00:13:14,692 How do we get that? 
174 00:13:14,692 --> 00:13:16,532 It's like, oh, you can't get that. 175 00:13:16,532 --> 00:13:19,872 This is like 13 years ago, granted. 176 00:13:20,824 --> 00:13:27,778 It's been a problem for a while and it's like you have to say to yourself, man, if the attachment is relevant, why is this not relevant? 177 00:13:27,778 --> 00:13:30,349 Like, why aren't we trying to get that file? 178 00:13:30,390 --> 00:13:38,049 So, um you know, like a lot of things in legal, it's like, well, what are you going to do? 179 00:13:38,049 --> 00:13:38,995 You can't get it, you know? 180 00:13:38,995 --> 00:13:41,296 And it's like, well, I think there's a better answer. 181 00:13:41,296 --> 00:13:48,463 So we offer solutions where we can go and like kind of grab that file because, you know, you can see that file. 182 00:13:48,463 --> 00:13:49,534 what that file is. 183 00:13:49,534 --> 00:13:57,068 Like if you, you know, understand programmatically, you know what that link is and you'll get the ID of the file, you could actually go get it. 184 00:13:57,068 --> 00:14:01,340 Now the problem is what file do you get, right? 185 00:14:01,581 --> 00:14:08,464 So you have a file like there was this like the term like, are you gonna get the contemporaneous copy or the live version? 186 00:14:08,464 --> 00:14:15,088 And so really what we're saying is what is the relevant file because a file 187 00:14:15,310 --> 00:14:27,641 as an attachment has a static state, it is one thing, but a file as a modern attachment, you know, is it, you know, on Monday it'll be this file, on Tuesday it'll be another file, 188 00:14:27,641 --> 00:14:31,674 people are modifying, getting it constantly, so what file are you getting? 189 00:14:31,695 --> 00:14:40,422 So for, you know, Microsoft came out with some features and they're like, okay, now you can get the file, but it was only the live version. 190 00:14:41,964 --> 00:14:42,864 Well. 191 00:14:42,884 --> 00:14:43,474 Hey, guess what? 192 00:14:43,474 --> 00:14:45,865 Now you can get all versions of the file. 193 00:14:45,865 --> 00:14:47,224 It's like, yeah, great. 194 00:14:47,224 --> 00:14:48,826 5,000 versions of one file. 195 00:14:48,826 --> 00:14:51,027 I mean, there can be a ton of versions, right? 196 00:14:51,027 --> 00:14:52,807 So is that what you want? 197 00:14:53,088 --> 00:15:04,391 So what they've now come out with is, and this is super new, this is like in preview currently in Microsoft, is they're now saying that you can get the contemporaneous 198 00:15:04,391 --> 00:15:04,872 version. 199 00:15:04,872 --> 00:15:06,123 ah 200 00:15:06,123 --> 00:15:15,443 And that if you do, if you turn on certain switches in Microsoft, which is like creating a retention, like a label that is for eDiscovery. 201 00:15:15,603 --> 00:15:18,583 So they're trying to solve this problem right now. 202 00:15:18,583 --> 00:15:25,503 Like we've been solving this problem for some time, but there's a few things that still exist. 203 00:15:25,503 --> 00:15:25,963 Okay. 204 00:15:25,963 --> 00:15:29,003 So Microsoft, of course they have an eDiscovery platform, right? 205 00:15:29,003 --> 00:15:33,083 Nobody uses it, they, you know, I mean, no offense to Microsoft, but they don't, right? 206 00:15:33,083 --> 00:15:34,803 Like people use Relativity. 207 00:15:35,222 --> 00:15:37,893 DISCO, they use other things, they use Everlaw. 208 00:15:38,194 --> 00:15:48,371 So if you get data from Microsoft, yeah, you can get the cloud attachments, but they're not automatically linked to the parent email. 
209 00:15:48,371 --> 00:15:52,002 So there has to be a secondary process that occurs with your vendor. 210 00:15:52,584 --> 00:15:58,147 That's something that we do where you have to kind of like marry the files with the parent attachments, and they make it very difficult. 211 00:15:58,147 --> 00:16:04,179 uh On purpose, I mean, sure, because they want people to use their platform, which I totally understand. 212 00:16:04,179 --> 00:16:06,189 So that's a lot of challenges there. 213 00:16:06,189 --> 00:16:08,338 um People can't ignore them. 214 00:16:08,475 --> 00:16:23,705 Yeah, you know, I've heard some interesting perspectives on even the definition of what a file is, is really, it's not really a discrete object, binary object like it used to be. 215 00:16:23,705 --> 00:16:27,077 It is now a collaboration space, right? 216 00:16:27,077 --> 00:16:29,808 And it has multiple versions. 217 00:16:29,808 --> 00:16:34,691 It has uh conflict management. 218 00:16:34,851 --> 00:16:35,721 part of it, right? 219 00:16:35,721 --> 00:16:41,653 If I'm trying to edit the same piece of text that you're trying to edit, that somehow has to be resolved. 220 00:16:41,653 --> 00:16:44,504 um There are permissions associated. 221 00:16:44,504 --> 00:16:47,835 It's almost like, you know, files have now become... 222 00:16:47,835 --> 00:16:52,126 and I'm gonna use just Word docs because everybody knows what that is. 223 00:16:52,126 --> 00:16:59,938 It's almost like they're online wikis, um you know, rather than a traditional binary, discrete document. 224 00:16:59,938 --> 00:17:03,420 So, I would imagine that makes the whole e-discovery process much more challenging. 225 00:17:03,420 --> 00:17:11,620 Yeah, I really want people to start, you know, attorneys and their lit support to start talking about what they're looking for, you know? 226 00:17:11,760 --> 00:17:19,000 So that, you know, while we try to have these conversations with our clients all the time, it's like, oh yeah, get me this, get me that, get me the other. 227 00:17:19,000 --> 00:17:21,140 And we could just say, do you want your cloud attachments? 228 00:17:21,140 --> 00:17:21,680 Sure. 229 00:17:21,680 --> 00:17:23,800 It's like, okay, you know, great. 230 00:17:23,800 --> 00:17:29,300 Now you're, instead of hosting 500 documents, you're hosting, you know, 17,000, you know? 231 00:17:29,300 --> 00:17:30,660 Like, that's awesome for us. 232 00:17:30,660 --> 00:17:33,380 We're charging you hosting, but is that really? 233 00:17:33,757 --> 00:17:35,458 You know, that kind of sucks, right? 234 00:17:35,458 --> 00:17:41,803 um we really, our conversations are now like, okay, exactly what are you looking for? 235 00:17:41,803 --> 00:17:45,745 You know, those files that you can identify as relevant. 236 00:17:45,745 --> 00:17:47,727 uh What is the date range? 237 00:17:47,727 --> 00:17:53,491 Are you looking for something that John did on a certain, what is the relevant date range that he did? 238 00:17:53,491 --> 00:18:01,656 Because you can say, well, instead of looking for all the files or all the emails, uh I want to know anything that John, you know, 239 00:18:01,656 --> 00:18:08,418 edited or modified or viewed during a certain period because just sending him the link doesn't mean he even opened it, right? 240 00:18:08,498 --> 00:18:12,079 So that's the level that we should be looking at it. 241 00:18:12,959 --> 00:18:18,681 you know, if that email is relevant, then we need to go dive into that, that attachment. 
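To make that "secondary process" a bit more concrete: once you have the sharing URL from a collected email, the linked file can usually be resolved programmatically. The sketch below uses Microsoft Graph's documented /shares endpoint to turn a link into the underlying drive item and list its versions, so a contemporaneous version can be picked by date. It assumes an access token with the appropriate Files/Sites read permissions and is an illustration of the pattern, not BlueStar's workflow.

```python
# Minimal sketch: resolve a "modern attachment" sharing link to the underlying
# file and list its version history via Microsoft Graph. Token acquisition
# (app registration, Files.Read.All / Sites.Read.All) is out of scope here.
import base64
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def encode_sharing_url(url: str) -> str:
    """Graph's documented share-ID encoding: unpadded base64url, prefixed with 'u!'."""
    b64 = base64.urlsafe_b64encode(url.encode("utf-8")).decode("utf-8").rstrip("=")
    return "u!" + b64

def resolve_modern_attachment(sharing_url: str, token: str) -> dict:
    headers = {"Authorization": f"Bearer {token}"}

    # The driveItem behind the link -- i.e. the live copy of the file.
    item = requests.get(
        f"{GRAPH}/shares/{encode_sharing_url(sharing_url)}/driveItem",
        headers=headers,
    ).json()

    # Its version history, so a contemporaneous version can be chosen by date
    # rather than whatever the document looks like today.
    drive_id = item["parentReference"]["driveId"]
    versions = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item['id']}/versions",
        headers=headers,
    ).json().get("value", [])

    return {
        "name": item.get("name"),
        "lastModified": item.get("lastModifiedDateTime"),
        "versions": [(v.get("id"), v.get("lastModifiedDateTime")) for v in versions],
    }
```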
242 00:18:18,681 --> 00:18:22,172 And I don't, I think that could even be a secondary process, you know? 243 00:18:22,172 --> 00:18:24,483 So put everything on hold. 244 00:18:24,483 --> 00:18:25,423 Everything's preserved. 245 00:18:25,423 --> 00:18:26,463 It's there when you need it. 246 00:18:26,463 --> 00:18:27,104 Cool. 247 00:18:27,104 --> 00:18:28,484 Make sure of that. 248 00:18:28,764 --> 00:18:31,687 that you preserved it properly and there's retention labels and all this good stuff. 249 00:18:31,687 --> 00:18:44,742 And then identify the emails where these relevant links you think might make sense, like you want to look at and decide what do I want to look at and when and why, you know, and 250 00:18:44,742 --> 00:18:45,803 get that. 251 00:18:46,104 --> 00:18:48,286 That will really reduce your review, you know. 252 00:18:48,286 --> 00:18:50,068 uh 253 00:18:50,466 --> 00:18:52,777 Yeah, that makes sense. 254 00:18:52,777 --> 00:19:01,322 you and I talked a little bit about bringing your data to AI and bringing your AI to data. 255 00:19:01,822 --> 00:19:12,298 I think we're aligned in that as the future of law, Big Law 2.0, kind of unfolds in front of us that 256 00:19:12,436 --> 00:19:21,198 it's going to be difficult, as I mentioned kind of in the intro, for firms to differentiate themselves by buying Harvey or Legora or Paxton or these off the shelf tools. 257 00:19:21,198 --> 00:19:22,639 They're a good starting point. 258 00:19:22,639 --> 00:19:25,800 Those tools have value, but they're not differentiating. 259 00:19:25,800 --> 00:19:36,062 It's the collective knowledge and wisdom and work product um that has led to successful outcomes for the firm's clients that is differentiating. 260 00:19:36,283 --> 00:19:40,156 like for us, for example, we're not really an AI company. 261 00:19:40,156 --> 00:19:42,134 We're an intranet extranet platform. 262 00:19:42,285 --> 00:19:54,700 But as part of our uh install process, we stand up an Azure, I'll call it a virtual appliance, essentially that taps into all the back office systems that a law firm uses and 263 00:19:54,700 --> 00:20:01,112 then presents a unified security trimmed API that respects ethical wall boundaries. 264 00:20:01,233 --> 00:20:09,600 like we did that, the reason that we built that is because our web parts need to surface information from iManage, from Elite, from Aderant, from 265 00:20:09,600 --> 00:20:12,101 InterAction, from Foundation, from all these different places. 266 00:20:12,101 --> 00:20:14,042 So it made our job easier. 267 00:20:14,042 --> 00:20:26,327 But then when AI hit the scene and we, this thing has existed for many years, but then when AI hit the scene and Azure released Azure OpenAI, Azure AI Search, formerly Azure 268 00:20:26,327 --> 00:20:33,830 Cognitive Search, it created, we now have a utility that enables the law firms to go, Hey, you know what? 269 00:20:33,830 --> 00:20:38,890 I want to crawl and index a SQL repository or this file share and 270 00:20:38,890 --> 00:20:41,461 leverage Azure AI Search to crawl and index it. 271 00:20:41,461 --> 00:20:46,283 Then I want to use Azure OpenAI and run some AI operations on it. 272 00:20:46,283 --> 00:20:52,265 So it allows firms to bring AI to their data rather than, again, Harvey, Legora. 
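As an aside for readers who want to see the shape of what Ted describes here, below is a stripped-down sketch of the retrieve-then-ground pattern using the Azure AI Search and Azure OpenAI SDKs. The index name, deployment name, and the "content" field are placeholders, and this is not InfoDash's appliance — just the general pattern of bringing a model to an index the firm already controls.

```python
# Stripped-down "bring AI to your data" sketch: retrieve from an Azure AI Search
# index you have already crawled, then ground an Azure OpenAI answer on those
# passages. Index/deployment names and the "content" field are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="employment-agreements",          # hypothetical index
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-06-01",
)

def ask(question: str) -> str:
    # 1. Retrieve the most relevant chunks from the firm's own indexed data.
    hits = search.search(question, top=5)
    context = "\n\n".join(doc["content"] for doc in hits)   # assumes a 'content' field

    # 2. Answer strictly from that retrieved context.
    response = llm.chat.completions.create(
        model="gpt-4o",                                      # your deployment name
        messages=[
            {"role": "system",
             "content": "Answer only from the provided context; say so if it is not there."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(ask("Which employee handbooks are missing the updated overtime language?"))
```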
273 00:20:52,265 --> 00:21:04,500 Yeah, you can in Harvey, you can go navigate your way to a matter workspace and I manage and bring in a set of documents for rag purposes and perform actions, but not what we're, 274 00:21:04,500 --> 00:21:07,592 what we're enabling is like wholesale AI operations. 275 00:21:07,592 --> 00:21:08,748 So you could take 276 00:21:08,748 --> 00:21:19,896 you know, a SQL database full of regulatory updates for labor and employment and all your clients employment agreements, employee handbooks, crawl and index it, Azure AI search, 277 00:21:19,896 --> 00:21:22,958 and then use Azure AI and flag exceptions. 278 00:21:22,958 --> 00:21:25,579 doing those whole, yeah. 279 00:21:25,579 --> 00:21:32,385 But I don't know, what is your perspective on bringing AI to the firm's data? 280 00:21:32,385 --> 00:21:36,388 like firm meaning law firm or a type of corporation. 281 00:21:37,569 --> 00:21:49,518 Yeah, so law firms, it's interesting because a lot of these solutions that these new solutions like that are outside of kind of the traditional discovery workflow. 282 00:21:49,518 --> 00:21:55,763 They don't really work for law firms um from a business perspective, right? 283 00:21:55,763 --> 00:21:57,364 uh 284 00:21:57,592 --> 00:22:01,745 less data review, less hours, less build, et cetera, et cetera. 285 00:22:01,745 --> 00:22:04,497 And there are whole business models generally on hours. 286 00:22:04,497 --> 00:22:18,437 And I think that it's going to have to change because corporations are going to become more more uh agile and able to perform, get a look into their data more quickly and 287 00:22:18,437 --> 00:22:19,107 easily. 288 00:22:19,107 --> 00:22:25,976 so I think that, so right now I'm hearing a lot of outside counsel not really interested in that kind of like, 289 00:22:25,976 --> 00:22:29,328 agents in place, but the corporations are super interested. 290 00:22:29,328 --> 00:22:31,069 They're like, I need to know what we have. 291 00:22:31,069 --> 00:22:32,059 I want to know. 292 00:22:32,059 --> 00:22:34,210 And you see all this stuff in the productivity tools. 293 00:22:34,210 --> 00:22:39,623 Like you see like communication compliance, when things are like so that you can be litigation ready. 294 00:22:39,623 --> 00:22:43,115 ooh, oh that's a little, you know, sexual harassment there. 295 00:22:43,115 --> 00:22:44,366 Flag me, let me know. 296 00:22:44,366 --> 00:22:51,190 Like HR is like getting alerted when people are behaving in a way that may be, you know, cause future litigation. 297 00:22:51,190 --> 00:22:55,868 There's, um now you can actually, uh 298 00:22:55,868 --> 00:22:57,388 perform investigations. 299 00:22:57,388 --> 00:23:00,888 And this is again a preview feature in Microsoft utilizing AI. 300 00:23:00,888 --> 00:23:12,208 So you can kind of search and they're saying this is for, you know, kind of a breach purposes, but you could see it, it's an investigation. 301 00:23:12,208 --> 00:23:13,828 I mean, you know, I haven't used it yet. 302 00:23:13,828 --> 00:23:18,196 I haven't really kind of dove in enough to give a good kind of. 303 00:23:19,320 --> 00:23:21,782 perspective on how well it works. 304 00:23:21,782 --> 00:23:30,908 But I mean, really, you can see in the future that what you want to be doing in a corporation or even as a law firm is going into your clients and querying the data that's 305 00:23:30,908 --> 00:23:31,789 there. 306 00:23:31,789 --> 00:23:33,891 What kind of data? 
307 00:23:33,891 --> 00:23:36,312 Where are the documents that discuss this? 308 00:23:36,312 --> 00:23:40,135 Is there anything in the SharePoint site that is relevant to XYZ? 309 00:23:40,135 --> 00:23:43,758 And that is 100 % possible. 310 00:23:43,758 --> 00:23:47,079 I mean, like you said, you have the Azure OpenAI. 311 00:23:48,317 --> 00:23:50,238 API that's completely available. 312 00:23:50,238 --> 00:23:53,460 You have Copilot, people are going to, you know, it's gonna get better and better. 313 00:23:53,460 --> 00:24:02,505 So yeah, that's the future is like people are just going to be like using the whether it's an eDiscovery tool or just an agent which is saying like hey, what's in here and they're 314 00:24:02,505 --> 00:24:04,366 starting that now. 315 00:24:04,366 --> 00:24:05,686 Whether or not it's great. 316 00:24:05,686 --> 00:24:06,607 It's not that great. 317 00:24:06,607 --> 00:24:09,709 um People are building tools to do that. 318 00:24:09,709 --> 00:24:13,020 Like so we have some some friends that are building in the industry. 319 00:24:13,020 --> 00:24:16,198 They're building some tools where it's like, hey, we're gonna spin up 320 00:24:16,198 --> 00:24:22,312 Just like you're doing, you're spinning up these virtual environments within a company's Azure environment. 321 00:24:22,332 --> 00:24:23,483 that's the way to go, right? 322 00:24:23,483 --> 00:24:29,157 So you want privacy, you want protection of your data, then don't let it leave. 323 00:24:29,157 --> 00:24:37,163 That model is kind of outdated where you're uploading data into a Relativity or into some other tool, like an iManage, to look at it. 324 00:24:37,163 --> 00:24:37,593 Why? 325 00:24:37,593 --> 00:24:43,487 I don't see that model lasting for too much longer. 326 00:24:43,667 --> 00:24:45,020 It's expensive. 327 00:24:45,020 --> 00:24:46,880 It's risky. 328 00:24:47,480 --> 00:24:52,320 Your data does not, the retention policies that apply to your data no longer apply. 329 00:24:52,700 --> 00:24:58,880 That data that resides wherever it is, is still subject to discovery. 330 00:24:59,460 --> 00:25:05,500 So you don't want to, I think that that's where we're going to see it more and more. 331 00:25:05,500 --> 00:25:14,440 I'm just interested to understand how like these software companies plan on monetizing their applications. 332 00:25:15,398 --> 00:25:23,116 with such powerful AI being able to kind of create these agents so easily internally. 333 00:25:23,116 --> 00:25:31,396 you know, and agents and interoperability, that was a big theme at ILTACON this year. 334 00:25:32,236 --> 00:25:45,656 so yeah, we heard, you know, I believe it was iManage and their adoption of model context protocol, MCP, which allows, you know, agent to agent communication. 335 00:25:46,196 --> 00:25:47,376 I 336 00:25:47,417 --> 00:26:00,137 I'm very bullish on the future of agents, but I'm a little bit bearish on their ability to be deployed in any high risk scenario now. 337 00:26:00,277 --> 00:26:05,097 So for a customer service agent, no problem. 338 00:26:05,097 --> 00:26:07,837 A sales and marketing agent, no problem. 339 00:26:08,517 --> 00:26:14,797 A new matter intake agent, you're gonna need a human in the loop on that. 340 00:26:14,997 --> 00:26:15,906 Yeah, well. 341 00:26:15,906 --> 00:26:20,980 Well, um because right now, LLMs are not deterministic. 
342 00:26:20,980 --> 00:26:28,616 In other words, you can take the same prompt, ah run it, copy and paste it, run it again, you're going to get back different results, right? 343 00:26:28,616 --> 00:26:30,067 So, they're not deterministic. 344 00:26:30,067 --> 00:26:32,279 uh There's still hallucinations. 345 00:26:32,279 --> 00:26:35,552 They've gotten better, but there still are hallucinations. 346 00:26:35,552 --> 00:26:37,473 They're not hallucination free. 347 00:26:37,534 --> 00:26:42,598 So, anything of a high-risk nature, I would bucket that in. 348 00:26:42,598 --> 00:26:45,462 That's a little bit further down the road. 349 00:26:45,462 --> 00:26:57,575 Let's, let's check off the, let's check off the lower risk use cases first where, you know, if something doesn't get classified properly, it has big dollar economic implications. 350 00:26:57,575 --> 00:26:58,276 I just don't. 351 00:26:58,276 --> 00:27:12,550 um and, everybody may have different tolerances on this, but, um, given where, and things are moving so quickly, like honestly, uh, ChatGPT-5, I like less than I did, uh, prior 352 00:27:12,550 --> 00:27:14,380 where I could pick my model. 353 00:27:14,597 --> 00:27:19,916 And you know, yeah, because they backpedaled. 354 00:27:19,916 --> 00:27:29,196 They reacted pretty quickly, but if you think about software, you know, it's generally, you know, a SaaS solution, right? 355 00:27:29,196 --> 00:27:33,656 You don't have like six versions of Salesforce you're using the current, right? 356 00:27:33,676 --> 00:27:37,907 But I mean, people have relationships with these models, which is a little bit different. 357 00:27:37,907 --> 00:27:43,827 They'll probably, you know, we're learning so much with like AI so new. 358 00:27:43,912 --> 00:27:47,971 that they're applying traditional kind of software models and releases. 359 00:27:47,971 --> 00:27:53,372 And they may just, they're gonna learn, they're gonna pivot, they're gonna do something that makes a little bit more sense to its users. 360 00:27:53,372 --> 00:27:56,692 But to be honest, I don't like ChatGPT as much either. 361 00:27:57,592 --> 00:27:59,892 But some people really do. 362 00:27:59,892 --> 00:28:00,832 They're like, oh, it's super fast. 363 00:28:00,832 --> 00:28:03,463 I find it very slow, quite frankly. 364 00:28:03,463 --> 00:28:04,763 But I kind of agree with you. 365 00:28:04,763 --> 00:28:07,483 Just to come back to your point about... 366 00:28:08,234 --> 00:28:17,005 I just wanted to play devil's advocate as to why you think they're not ready, these agents, to kind of do the things that are high risk. 367 00:28:17,005 --> 00:28:21,585 You have to kind of treat it like a junior associate. 368 00:28:21,585 --> 00:28:23,565 Like this stuff needs eyes on. 369 00:28:23,565 --> 00:28:34,605 And I think in pretty much in most respects, even if it's not high risk, if you're going to be repeating anything that you get out of AI, you should probably make sure that it's 370 00:28:34,605 --> 00:28:35,897 actually true. 371 00:28:35,897 --> 00:28:45,610 um Even, you know, even like, you know, facts about the news or this or that or the other, like, you know, this is not perfect. 372 00:28:45,610 --> 00:28:52,181 It is getting data that it's been trained on and the training data may not be correct. 373 00:28:52,181 --> 00:28:57,723 um The people that are creating the agents, they have bias. 374 00:28:57,723 --> 00:29:03,035 They, you know, you don't have any transparency into how these are created or anything like that. 
375 00:29:03,035 --> 00:29:03,785 So. 376 00:29:03,965 --> 00:29:09,268 We always like we do a lot of AI solutions and I would never say, all right, yeah, just send this out. 377 00:29:09,268 --> 00:29:18,653 It's like, you know, when we create something for our clients, we, we proof it and then we make sure that they proof it, you know, this is not a person. 378 00:29:18,653 --> 00:29:20,634 This is a machine. 379 00:29:20,634 --> 00:29:22,836 It is that it created this. 380 00:29:22,836 --> 00:29:24,046 So you, but it's real. 381 00:29:24,046 --> 00:29:25,167 I mean, they're very effective. 382 00:29:25,167 --> 00:29:26,107 They save a lot of time. 383 00:29:26,107 --> 00:29:30,260 Like we do production request responses. 384 00:29:30,260 --> 00:29:32,878 We have a tool that does this for our clients. 385 00:29:32,878 --> 00:29:37,699 And it writes as the attorneys write and it has the same format and looks exactly like that. 386 00:29:37,699 --> 00:29:43,080 So we'll create a production request response um for the attorneys to start with. 387 00:29:43,080 --> 00:29:54,343 So it just saves them a lot of time just to even create that saves them like, you know, days, provides like sample arguments, you know, that can, you know, cool stuff like that. 388 00:29:54,343 --> 00:29:56,884 But you know, I would never say just send that out. 389 00:29:56,884 --> 00:30:00,651 Like you get, you know, it'll take them an hour instead of two days to do something. 390 00:30:00,651 --> 00:30:02,045 I think that's great. 391 00:30:02,185 --> 00:30:02,797 You know, 392 00:30:02,797 --> 00:30:03,068 Yeah. 393 00:30:03,068 --> 00:30:07,711 Like I have an agent that, it's still not perfected. 394 00:30:07,711 --> 00:30:12,684 I'm still kicking it around in n8n, using Copilot and trying to figure out the right path. 395 00:30:12,684 --> 00:30:19,488 But like, I have, uh, I have started, this is a great example of something you can use an agent for today. 396 00:30:19,488 --> 00:30:20,238 That's low risk. 397 00:30:20,238 --> 00:30:25,631 And if it fails, it's not a big deal, but I get emails from, 398 00:30:27,891 --> 00:30:33,263 like think tanks, like, you know, I, I subscribe to Jeff Brandt's PinHawk newsletter. 399 00:30:33,263 --> 00:30:33,904 It's great. 400 00:30:33,904 --> 00:30:45,719 Uh, Artificial Lawyer, Bob Ambrogi, and I have them routed all to a folder and I have an agent that goes through and finds out the stuff that's compelling for thought leadership 401 00:30:45,719 --> 00:30:46,249 activities. 402 00:30:46,249 --> 00:30:52,752 Like gives me a bulleted list of, Hey, these are the things that happened yesterday in legal tech or, or AI. 403 00:30:52,752 --> 00:30:54,718 Um, that's a s 404 00:30:54,718 --> 00:30:55,374 cool. 405 00:30:55,374 --> 00:30:56,045 Yeah. 406 00:30:56,045 --> 00:30:57,926 super easy and it's low risk. 407 00:30:57,926 --> 00:31:03,629 You know, I, uh, haven't, I have not orchestrated an agent to do this yet. 408 00:31:03,629 --> 00:31:06,070 It's still manual, but it's going to be an agent. 409 00:31:06,070 --> 00:31:10,692 When I, when I do a planning call for an agenda, I record it. 410 00:31:10,692 --> 00:31:11,992 I download the transcript. 411 00:31:11,992 --> 00:31:18,245 I load it into a custom Claude project and it outputs, uh I've trained it in its training materials. 412 00:31:18,245 --> 00:31:22,957 I have several handwritten agendas from back when I used to have to do it myself. 413 00:31:22,957 --> 00:31:24,042 Eventually. 
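For what it's worth, the low-risk newsletter digest Ted describes above is simple to sketch: pull the last day of messages from the dedicated Outlook folder with Microsoft Graph, then ask a model for the bulleted list. The folder ID, model name, and token handling below are placeholders — an illustration of the pattern, not the actual agent.

```python
# Rough sketch of a "newsletter digest" agent: fetch yesterday's items from a
# dedicated Outlook folder via Microsoft Graph, then summarize them with a model.
# Folder ID and model name are placeholders; Graph token acquisition is omitted.
import datetime as dt
import os
import requests
from openai import OpenAI   # or AzureOpenAI, depending on where your model lives

GRAPH = "https://graph.microsoft.com/v1.0"

def fetch_newsletters(token: str, folder_id: str) -> list[dict]:
    """Yesterday's messages from the folder the newsletters are routed to."""
    since = (dt.datetime.now(dt.timezone.utc) - dt.timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
    url = (f"{GRAPH}/me/mailFolders/{folder_id}/messages"
           f"?$filter=receivedDateTime ge {since}"
           f"&$select=subject,bodyPreview&$top=50")
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json().get("value", [])

def digest(messages: list[dict]) -> str:
    """Turn the raw items into a short bulleted briefing."""
    items = "\n".join(f"- {m['subject']}: {m['bodyPreview']}" for m in messages)
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    chat = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        messages=[
            {"role": "system",
             "content": "Summarize these legal tech newsletter items as a short bulleted "
                        "list of things worth writing or talking about. Skip ads and event promos."},
            {"role": "user", "content": items},
        ],
    )
    return chat.choices[0].message.content
```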
414 00:31:24,042 --> 00:31:35,136 You know, I'll wire up a Zapier integration or with N8n go through and have it just as soon as I end the call, I'll have a naming convention in the meeting planner in my Outlook 415 00:31:35,136 --> 00:31:41,269 calendar that it will know to go grab it, run it through the project, and then send me the result. 416 00:31:41,269 --> 00:31:44,780 You know, those are all really productive and they're real time savers. 417 00:31:44,780 --> 00:31:53,215 Like it's, it's, it's, it's, um, I'm getting real time savings from it, but would I have it go through and 418 00:31:53,215 --> 00:31:54,550 Pay my bills? 419 00:31:54,796 --> 00:31:55,481 No. 420 00:31:55,481 --> 00:31:57,661 Nope, no, of course not. 421 00:31:58,061 --> 00:32:00,281 No, 100%, 100%. 422 00:32:00,281 --> 00:32:02,081 It's a huge time saver. 423 00:32:02,081 --> 00:32:06,001 People should be, you know, I'm just focusing on illegal. 424 00:32:06,001 --> 00:32:11,921 They should be using it for more than just thinking about, oh, I can do auto review for this. 425 00:32:12,041 --> 00:32:15,461 No, there's so, so many things that can be done. 426 00:32:15,781 --> 00:32:18,781 We integrated into all of our processes. 427 00:32:18,781 --> 00:32:25,561 We understand how to do that in a secure and, know, know, privacy first kind of manner, but. 428 00:32:26,019 --> 00:32:28,661 Like I want to go out and like teach corporations this stuff. 429 00:32:28,661 --> 00:32:35,505 Like I'm watching them uh buy these products for like six figures subs, you know, on manual basis. 430 00:32:35,505 --> 00:32:41,700 And, know, I'm thinking to myself, oh, she's just like, she's hired dude to carry these. 431 00:32:41,700 --> 00:32:51,186 Like you could create these internally, uh you know, that are probably more effective and more customized than then and more secure than you are currently, you know, paying these 432 00:32:51,186 --> 00:32:52,407 six figure subs. 433 00:32:52,407 --> 00:32:53,388 And what are they doing? 434 00:32:53,388 --> 00:32:55,481 You're not, they're using these like, 435 00:32:55,481 --> 00:32:57,061 products that are coming out. 436 00:32:57,061 --> 00:32:58,001 Oh, don't worry. 437 00:32:58,001 --> 00:33:01,501 We're going to use your, we'll use your API. 438 00:33:01,501 --> 00:33:02,681 Oh, great. 439 00:33:02,781 --> 00:33:05,381 So I pay for, you pay for the AI, right? 440 00:33:05,381 --> 00:33:07,821 And we're going to use your environment in Azure. 441 00:33:07,821 --> 00:33:08,801 Awesome. 442 00:33:08,921 --> 00:33:10,801 You're paying for the infrastructure. 443 00:33:10,801 --> 00:33:20,741 And so when you get these six figure, you know, I've been in this, the legal tech industry a long time with six figure subscriptions is normal for a large corporation. 444 00:33:21,321 --> 00:33:24,142 And these software companies, they're 445 00:33:24,142 --> 00:33:29,344 continuing to do so, but providing less and less value, right? 446 00:33:29,534 --> 00:33:30,504 you're providing the code. 447 00:33:30,504 --> 00:33:39,909 Well, the know, the news today on NPR was, you know, that people with a computer science degree, you know, used to be able to go to school and come out and get a six figure 448 00:33:39,909 --> 00:33:40,390 salary. 449 00:33:40,390 --> 00:33:49,023 Now they're one of the most, you know, unemployed, uh newly graduated comp sci students are the most unemployed sector of our economy. 450 00:33:49,023 --> 00:33:50,663 I have a computer science degree. 
451 00:33:50,663 --> 00:33:52,576 I got it so I could get a job. 452 00:33:52,576 --> 00:33:55,659 Back in the day, I was into social science. 453 00:33:55,659 --> 00:33:59,442 I switched from international development to this. 454 00:34:00,384 --> 00:34:03,047 to think about that, I mean, that's a huge shift. 455 00:34:03,047 --> 00:34:06,940 So I think software companies, you're also going to see a shift. 456 00:34:06,940 --> 00:34:15,039 So these AIs like affecting us in so many different ways that I don't think we can predict, but it will change everything. 457 00:34:15,039 --> 00:34:16,505 ah 458 00:34:16,505 --> 00:34:28,433 you know, and there's like, hear a lot of banter to along those lines, like, you know, Satya Nadella, who's the CEO of Microsoft talked about kind of the death of SaaS. 459 00:34:28,433 --> 00:34:33,937 And I'll tell you right now, SaaS isn't touching our business anytime soon. 460 00:34:33,937 --> 00:34:45,168 I'm sure there are scenarios like maybe a CRM where you can vibe code, you know, a basic contact database and the ability to have a tickler and you know, 461 00:34:45,168 --> 00:34:46,549 Okay, yeah, maybe. 462 00:34:46,549 --> 00:34:58,241 like when, what we do, like all the integrations and all the management that comes associated with like when the API changes, like we have to go change our, our code, like 463 00:34:58,241 --> 00:35:00,053 for AI to go do that. 464 00:35:00,053 --> 00:35:05,858 And then the deployment, like we deploy in the client's tenant, like there's all these environmental variables. 465 00:35:05,858 --> 00:35:08,331 Like if they have, it was really interesting. 466 00:35:08,331 --> 00:35:09,852 We had a client. 467 00:35:09,852 --> 00:35:15,236 We just migrated off of uh HighQ, which is kind of the incumbent in the extranet space. 468 00:35:15,497 --> 00:35:30,659 And, um, the law firm had two customers where we couldn't send invitations, where we would, we initiated the invitation, the external sharing 469 00:35:30,659 --> 00:35:31,311 invitation. 470 00:35:31,311 --> 00:35:33,071 We kind of bypassed the GUI. 471 00:35:33,071 --> 00:35:38,005 We do it through the API and we had two, two other customers out of thousands. 472 00:35:38,005 --> 00:35:39,613 I think there were like 4,000. 473 00:35:39,613 --> 00:35:42,545 where it wouldn't work and we're like, what the hell is going on here? 474 00:35:42,545 --> 00:35:43,836 But it worked in HighQ. 475 00:35:43,836 --> 00:35:57,485 So we had to pick apart, there's two little check boxes and they, both of these companies uh had headquarters in China and um way buried deep in the bowels of the Azure, the Azure 476 00:35:57,485 --> 00:35:59,607 uh admin center. 477 00:35:59,607 --> 00:36:06,791 There's a couple of check boxes that where there's settings that enable or disable your ability to share with. 478 00:36:06,791 --> 00:36:08,134 um 479 00:36:08,134 --> 00:36:13,686 I think they use some separate network in China for M365 customers. 480 00:36:13,686 --> 00:36:16,107 It's above my pay grade. 481 00:36:16,107 --> 00:36:17,997 anyway, lots of little nuances like that. 482 00:36:17,997 --> 00:36:23,269 Like, yeah, you're never going to just create an AI agent to go, here, just go deploy this. 483 00:36:23,269 --> 00:36:24,169 I say never. 484 00:36:24,169 --> 00:36:26,510 Maybe never is the right time soon. 485 00:36:26,510 --> 00:36:31,552 Are you going to have a button where you just go deploy something as complex as what we have? 
486 00:36:31,552 --> 00:36:32,664 um 487 00:36:32,664 --> 00:36:33,755 let's talk about that a little bit. 488 00:36:33,755 --> 00:36:36,046 Can they write the code to do it? 489 00:36:36,626 --> 00:36:43,149 If they're given the proper instructions, if you give the AI the proper instructions on writing that code, then yes. 490 00:36:43,549 --> 00:36:52,253 But um the subject matter experts that understand the, know, all AI is doing is it's learning from what's out there. 491 00:36:52,253 --> 00:36:57,314 So if you're doing something truly innovative, it does not know how to do that. 492 00:36:57,875 --> 00:37:02,457 And if there's no documentation or examples of how to do that, you're not going to have that. 493 00:37:02,457 --> 00:37:03,417 It'll just... 494 00:37:03,577 --> 00:37:12,711 like even coding, hallucinates quite, it's very, it kinda like, I love that it's coding, it's great, it saves some time, but you cannot trust that code. 495 00:37:12,711 --> 00:37:21,835 Like people think you can, oh, yeah, I can just, you know, no, like you need these subject matter experts that understand the true workflow and process and that make sure that 496 00:37:21,835 --> 00:37:23,856 things are not overlooked and et cetera, et cetera. 497 00:37:23,856 --> 00:37:31,334 Like what you're talking about, some tasks by, you know, more kind of like rote tasks and testing and. 498 00:37:31,334 --> 00:37:41,871 basic coding will be able to be taken over by like the QA process, I think is really going to benefit from AI, but not the innovation, the creativity, the true, like the in-depth 499 00:37:41,871 --> 00:37:43,452 understanding of subject matter. 500 00:37:43,452 --> 00:37:46,974 That's not like AI doesn't understand that. 501 00:37:47,174 --> 00:37:49,936 And I have a, like a little example of that. 502 00:37:49,936 --> 00:37:54,858 um If we have a minute, my business partner, 503 00:37:55,851 --> 00:37:59,811 and I, we've been working, Greg Estes, he's the CEO of Bluestar. 504 00:37:59,811 --> 00:38:03,531 We've been working on these technology projects forever. 505 00:38:03,531 --> 00:38:06,651 And he's like a big ideas guy, but he can't code. 506 00:38:06,711 --> 00:38:09,131 He can't code to save his life. 507 00:38:09,131 --> 00:38:13,031 So it'd always be like, okay, Sarah, want you guys to build this. 508 00:38:13,031 --> 00:38:14,011 Let's try that. 509 00:38:14,271 --> 00:38:21,731 And now he can build his own apps with AI. 510 00:38:21,731 --> 00:38:25,683 There's a platform called Replit, which is... 511 00:38:25,683 --> 00:38:28,695 You basically, you know, say, I want to build this app, does this. 512 00:38:28,695 --> 00:38:34,370 And to be honest, like, I hate it so much because I'm like, I can't believe that works. 513 00:38:34,370 --> 00:38:44,161 Like, like he builds a Salesforce CRM, you know, and I'm like, okay, well, I can't, it is great, but could you update it? 514 00:38:44,161 --> 00:38:45,631 Do you understand how the code works? 515 00:38:45,631 --> 00:38:47,192 Is it doing everything you want it to? 516 00:38:47,192 --> 00:38:48,873 Where's the data store? 517 00:38:48,873 --> 00:38:53,981 People that don't have that in depth knowledge, you know, they can't. 518 00:38:53,981 --> 00:38:55,162 they don't know. 519 00:38:55,743 --> 00:39:04,050 The small things that aren't working, they try to like with a text prompt, update it and the whole product gets, you know, stops working. 
520 00:39:04,050 --> 00:39:13,737 So, you know, I think we're getting there to a point where people are going to be able to create more interesting things without having that product knowledge, but you still need, 521 00:39:14,358 --> 00:39:14,919 you know what I mean? 522 00:39:14,919 --> 00:39:21,444 Like you, I do believe that if people want to differentiate themselves, it's like, you still have to have that technology. 523 00:39:22,483 --> 00:39:29,860 technological background or understanding in order to create a product that really does work and is marketable, you know? 524 00:39:29,860 --> 00:39:31,611 Yeah, so I'm a software guy too. 525 00:39:31,611 --> 00:39:45,985 I came up the ranks as a uh software engineer and uh the the illities, uh scalability, maintainability, reliability, interoperability, usability, testability, portability, none 526 00:39:45,985 --> 00:39:49,766 of that exists today with when you're vibe coding. 527 00:39:49,766 --> 00:39:53,672 It's great for prototyping and proofs of concept. 528 00:39:53,672 --> 00:39:54,009 Yeah. 529 00:39:54,009 --> 00:40:02,395 but it's not, this is not production ready code and where we have to get to from where we are, it's a big, it's a big jump. 530 00:40:02,395 --> 00:40:03,466 Will we get there one day? 531 00:40:03,466 --> 00:40:09,761 I'm sure eventually we will, but all the illities today, if you crack open the code, I will say this. 532 00:40:09,761 --> 00:40:15,825 um One thing I have found uh AI really useful for is writing SQL. 533 00:40:15,825 --> 00:40:20,869 So that's the one area that I've kind of kept up to date because it doesn't. 534 00:40:20,869 --> 00:40:21,849 changed, right? 535 00:40:21,849 --> 00:40:25,369 SQL's almost the same today as it was 20 years ago. 536 00:40:25,369 --> 00:40:26,869 know, everything else has changed. 537 00:40:26,869 --> 00:40:33,269 If you're building a web app now versus 20 years ago, 20 years ago, was, you know, web forms. 538 00:40:33,269 --> 00:40:34,169 Let's see, what are they? 539 00:40:34,169 --> 00:40:34,989 2005? 540 00:40:34,989 --> 00:40:35,549 Yeah. 541 00:40:35,549 --> 00:40:37,609 Web forms and post back. 542 00:40:37,609 --> 00:40:43,789 And now it's all front end JavaScript focused and model view controller. 543 00:40:44,009 --> 00:40:46,749 um, so yeah. 544 00:40:47,209 --> 00:40:48,389 Oh yeah. 545 00:40:51,465 --> 00:40:53,163 Yeah, same. 546 00:40:53,163 --> 00:40:59,997 I don't, um, I don't, the only thing really technical that I do is, is, is write SQL and it's quite good at that. 547 00:40:59,997 --> 00:41:15,406 but I have seen, you know, I have seen the, the code from these, um, from like Replet and, a cursor and they're, they're utilities, they're, they're little productivity boosters, 548 00:41:15,406 --> 00:41:17,305 but if you're going to build an app, 549 00:41:17,305 --> 00:41:22,454 for anything other than a proof of concept, right now it's just, it's not there. 550 00:41:23,146 --> 00:41:28,266 It's kind of very difficult, know, like my partner would be like, yeah, it just has, this is just not working. 551 00:41:28,266 --> 00:41:29,746 Could you fix it? 552 00:41:30,006 --> 00:41:34,526 And so I go in the code and I'm like, I have no idea how this code's working. 553 00:41:34,526 --> 00:41:44,786 And it's super, it lays it in a very complex manner, like in non-human manner, which maybe is the most efficient and effective, but if a human wants to come in there and actually 554 00:41:44,786 --> 00:41:47,226 kind of dive in, it's almost impossible. 
555 00:41:47,983 --> 00:41:48,937 Totally. 556 00:41:48,937 --> 00:41:55,437 Yeah, so I hate it, but I think it'll get there. 557 00:41:55,437 --> 00:42:01,377 I think it'll get there pretty quickly, unfortunately, for people like us. 558 00:42:01,377 --> 00:42:03,457 But we'll see, I guess. 559 00:42:03,557 --> 00:42:07,797 No one's going to replace that subject matter expertise. 560 00:42:08,077 --> 00:42:08,907 It's so true. 561 00:42:08,907 --> 00:42:09,258 Yeah. 562 00:42:09,258 --> 00:42:15,300 And you know, you brought up a good point about the inability of, and this is an architectural limitation. 563 00:42:15,300 --> 00:42:20,222 Like this one's not solvable with the current architecture of LLMs. 564 00:42:20,323 --> 00:42:28,306 Um, that's pretty much the consensus, and that's around the ability to come up with novel ideas. 565 00:42:28,606 --> 00:42:28,996 Right? 566 00:42:28,996 --> 00:42:29,186 Yeah. 567 00:42:29,186 --> 00:42:29,427 Yeah. 568 00:42:29,427 --> 00:42:32,308 So it's like, it can't, but you know what it can do? 569 00:42:32,308 --> 00:42:34,929 It can reassemble 570 00:42:35,301 --> 00:42:42,005 um pieces of information into novel concepts, but if those component parts 571 00:42:42,005 --> 00:42:49,549 don't exist in its vector space, it can't, like, create a new mathematical proof. 572 00:42:49,549 --> 00:42:50,530 Not possible. 573 00:42:50,530 --> 00:42:51,180 Can't do it. 574 00:42:51,180 --> 00:42:53,742 If it hasn't seen it, it can't do it, right? 575 00:42:53,742 --> 00:43:00,695 Because um that usually requires a lot of, like, new abstract thinking. 576 00:43:00,956 --> 00:43:03,137 All the easy 577 00:43:03,973 --> 00:43:06,676 theorems have been proven. 578 00:43:06,676 --> 00:43:13,001 um But yeah, eventually, I don't think the current LLM architecture is the end state. 579 00:43:13,001 --> 00:43:17,425 I think this is a stepping stone to whatever's next. 580 00:43:17,930 --> 00:43:24,353 It's like thinking that dial-up was the end state of the internet, right? 581 00:43:24,954 --> 00:43:29,787 We're in a completely new uh generation of technology. 582 00:43:29,787 --> 00:43:31,177 It's gonna be real interesting. 583 00:43:31,177 --> 00:43:40,803 And to think, you know, I was just talking about this with my partner, and we've seen so many things in our lifetimes at our age. 584 00:43:40,803 --> 00:43:44,155 You know, we didn't have cell phones, and you know, we didn't have internet. 585 00:43:44,155 --> 00:43:45,001 We didn't have 586 00:43:45,001 --> 00:43:51,861 any of this technology, and to kind of be where we are today with this kind of new AI, you know, wave coming through. 587 00:43:51,861 --> 00:43:55,801 It's pretty amazing how much has changed. 588 00:43:56,421 --> 00:44:04,641 And people talk a lot, you know, I have daughters that are like 13 and 14, and they're like, oh, you know, AI is taking away jobs and, you know, that's bad. 589 00:44:04,641 --> 00:44:13,383 And I'm like, yeah, but people say that about, like, moving to kind of more sustainable energy. 590 00:44:13,383 --> 00:44:15,113 You know, like, it's taking away jobs. 591 00:44:15,113 --> 00:44:17,214 I'm like, yeah, but this is called evolution. 592 00:44:17,214 --> 00:44:18,374 This is how we evolve. 593 00:44:18,374 --> 00:44:21,424 I said, you know, out of anybody, I have a computer science degree. 594 00:44:21,424 --> 00:44:24,116 Like I, I should be upset about it.
595 00:44:24,116 --> 00:44:30,188 But no, like you are not entitled to have the same job for your entire life. 596 00:44:30,188 --> 00:44:32,078 And why would you want that? 597 00:44:32,158 --> 00:44:42,391 You know, like you should be constantly, you know, changing, learning uh and kind of adapting to what's new and upcoming. 598 00:44:42,391 --> 00:44:43,309 That's what will 599 00:44:43,309 --> 00:44:44,991 make you stay relevant. 600 00:44:44,991 --> 00:44:48,736 um So I think AI is great. 601 00:44:48,736 --> 00:44:50,317 I totally embrace it. 602 00:44:50,317 --> 00:44:52,820 um People should. 603 00:44:53,190 --> 00:45:01,755 It's just like capitalism rewards the efficient use of capital. 604 00:45:01,755 --> 00:45:13,292 um In a capitalistic society, skill sets have to change to align with those movements. 605 00:45:13,292 --> 00:45:15,563 You know, like, I mean, think about spreadsheets. 606 00:45:15,563 --> 00:45:19,110 So spreadsheets didn't exist in 1970. 607 00:45:19,110 --> 00:45:31,890 Um, they really hit their stride in the eighties, and there were, uh, there were 850,000 CPAs, and I think this was the U.S., I looked this number up one time for another 608 00:45:31,890 --> 00:45:35,650 podcast, and there are more than double that number. 609 00:45:35,650 --> 00:45:39,470 Now, it didn't kill the accounting profession. 610 00:45:39,590 --> 00:45:41,050 It changed it. 611 00:45:41,050 --> 00:45:41,570 Right. 612 00:45:41,570 --> 00:45:48,557 Um, you don't have to pay your accountants to build a pro forma or, you know, 613 00:45:48,557 --> 00:45:54,793 your revenue. You can do it with Excel, but they still have plenty to do, more than ever. 614 00:45:54,793 --> 00:45:58,812 Oh yeah, they're just doing it faster, better, with more insight. 615 00:45:58,812 --> 00:46:02,544 And there's going to be jobs created that we can't even fathom. 616 00:46:02,544 --> 00:46:07,564 Right now they're coming out with these browsers that are all AI powered, right? 617 00:46:07,584 --> 00:46:19,684 So you think about that, it's like, well, if you have any kind of platform, even if Microsoft doesn't have this AI embedded within, if you have a browser and you can query 618 00:46:19,684 --> 00:46:22,272 the data that you're looking at, 619 00:46:22,844 --> 00:46:33,086 you can Q&A, summarize, you know, stuff like that. I heard an example this morning where, you know, uh I think it was Claude, or I'm sorry, I can't remember what model it was, but they kind 620 00:46:33,086 --> 00:46:34,279 of were using a browser. 621 00:46:34,279 --> 00:46:44,933 And you could say, okay, you know, you're on LinkedIn, find me all the employees that used to work at, you know, um XYZ company, but no longer do. 622 00:46:44,974 --> 00:46:50,748 And before, you'd have to kind of go through and search it or get one of those tools, you know, like uh 623 00:46:50,748 --> 00:46:56,993 ZoomInfo, that gives you all the people's information, but now it's literally built into your browser. 624 00:46:57,534 --> 00:46:59,446 And that's the future, right? 625 00:46:59,446 --> 00:47:01,818 It's like, it's too easy. 626 00:47:01,818 --> 00:47:03,449 You know, you don't have to set things up. 627 00:47:03,449 --> 00:47:09,304 It's just like, you're just, you're getting more insight into data, and we're going to see some pretty cool stuff. 628 00:47:09,477 --> 00:47:12,477 Yeah, as you were talking there, I was just looking at my...
629 00:47:12,477 --> 00:47:16,237 I was using Comet, which is Perplexity's... 630 00:47:16,237 --> 00:47:17,777 Yeah, Comet. 631 00:47:17,777 --> 00:47:18,537 And they... 632 00:47:18,537 --> 00:47:20,377 I don't have a paid plan with Perplexity. 633 00:47:20,377 --> 00:47:21,097 I already have... 634 00:47:21,097 --> 00:47:22,177 I already pay for four. 635 00:47:22,177 --> 00:47:25,556 I pay for Grok, Gemini, Claude and ChatGPT. 636 00:47:25,556 --> 00:47:28,357 It's like, man, I'm not paying for Perplexity too. 637 00:47:28,357 --> 00:47:28,837 It's just... 638 00:47:28,837 --> 00:47:29,737 it's too much. 639 00:47:29,737 --> 00:47:30,657 I like Perplexity. 640 00:47:30,657 --> 00:47:31,957 I think it's great. 641 00:47:31,957 --> 00:47:37,677 But I just fired it up for the first time this morning and it told me I had to have a pro subscription. 642 00:47:37,677 --> 00:47:38,684 So, I'm gonna... 643 00:47:38,684 --> 00:47:52,575 Well, the most recent episode of Hard Fork, it's a podcast, they were given a look at Comet, and that's actually what they use. 644 00:47:52,575 --> 00:47:56,979 And I thought, wow, that's pretty eye-opening about where we're moving towards. 645 00:47:56,979 --> 00:47:59,541 And so I think that's pretty cool. 646 00:47:59,541 --> 00:48:01,653 um I guess we'll see what happens, you know? 647 00:48:01,653 --> 00:48:05,436 And like we talked about, we'll just have to adapt, right? 648 00:48:05,436 --> 00:48:07,047 Yeah. 649 00:48:09,344 --> 00:48:10,205 Yeah. 650 00:48:10,205 --> 00:48:13,738 We're almost out of time, but I wanted to say one quick thing. 651 00:48:13,738 --> 00:48:20,540 So I heard some really interesting analysis as to why, you know, ChatGPT is building a browser too. 652 00:48:20,540 --> 00:48:24,819 I was like, why are these, why are these AI companies building browsers? 653 00:48:24,819 --> 00:48:30,474 And in addition to just wanting to be your kind of operating system, they need data. 654 00:48:30,692 --> 00:48:32,512 So they're out of data. 655 00:48:32,512 --> 00:48:43,052 They've sucked it all in from the internet and, um, your browsing data gives them more training material, which I, I don't know why it never occurred to me. 656 00:48:43,052 --> 00:48:51,092 It made perfect sense when I heard it, but, um, you know, that's why there's such a race, uh, to, like, get the browser out. 657 00:48:51,092 --> 00:48:54,812 Perplexity has impressed me. I keep thinking they're going to die. 658 00:48:55,092 --> 00:48:59,212 Like ChatGPT is going to kill them, and they're pretty resilient. 659 00:49:00,028 --> 00:49:14,976 They put an offer in for Chrome, Perplexity did. And Google, I think, is probably one of the best AI creators out there. 660 00:49:15,017 --> 00:49:16,317 And we use it a lot. 661 00:49:16,317 --> 00:49:23,261 I don't necessarily think Gemini is, because its user interface and features may not be as great as ChatGPT or others. 662 00:49:23,261 --> 00:49:26,323 But when using it from an API perspective, it's awesome. 663 00:49:26,323 --> 00:49:28,379 um 664 00:49:28,379 --> 00:49:30,170 and cost effective and all this good stuff. 665 00:49:30,170 --> 00:49:32,090 I mean, why are they so good? 666 00:49:32,090 --> 00:49:34,191 Well, where do they get their data? 667 00:49:34,191 --> 00:49:36,132 They have Chrome, right? 668 00:49:36,132 --> 00:49:42,314 They also have YouTube, which they just took all the data, just took it.
669 00:49:42,334 --> 00:49:44,235 No permission to do so. 670 00:49:44,235 --> 00:49:50,037 They have everything, like all the input, all the interactions you make with anything that is a Google product. 671 00:49:50,037 --> 00:49:52,498 And that is why they're so good. 672 00:49:52,498 --> 00:49:56,039 And so, you know, why do you see Perplexity trying to buy Chrome? 673 00:49:56,039 --> 00:49:57,260 Same reason. 674 00:49:57,966 --> 00:49:58,910 Totally. 675 00:49:59,120 --> 00:50:00,922 Yeah, yeah, it's pretty interesting. 676 00:50:00,922 --> 00:50:10,260 uh People need to be really, it's a whole other conversation, but they need to already be thinking about, like, you know, kind of protecting themselves and 677 00:50:10,260 --> 00:50:13,132 their data, because there are no boundaries anymore. 678 00:50:13,132 --> 00:50:19,998 We're in a race, you know, a moon race for AI with these AI companies, and they just don't give a crap. 679 00:50:21,180 --> 00:50:24,343 They'd rather be litigated against because it's worth it. 680 00:50:24,343 --> 00:50:25,543 It's worth it. 681 00:50:25,572 --> 00:50:28,935 Even the ones who claim that they do, like Anthropic. 682 00:50:28,935 --> 00:50:40,924 I mean, you know, I think they're the best of the bunch, um but you know, they're still using pirated uh books and they just lost a major lawsuit. 683 00:50:40,924 --> 00:50:44,702 I think it's under appeal, but um well, this is 684 00:50:44,702 --> 00:50:46,107 took, oh sorry. 685 00:50:46,107 --> 00:50:46,578 oh 686 00:50:46,578 --> 00:50:47,029 that's okay. 687 00:50:47,029 --> 00:50:51,176 I was just going to say, um, I know I've kept you longer than I agreed to. 688 00:50:51,176 --> 00:50:52,648 So I apologize for that. 689 00:50:52,648 --> 00:50:58,648 But before we wrap up, like how do people find out more about you or your company? 690 00:50:58,648 --> 00:51:01,672 um Do a little self promotion real quick. 691 00:51:01,704 --> 00:51:04,375 Yeah, Sarah Thompson, you can find me on LinkedIn. 692 00:51:04,375 --> 00:51:08,655 Also, our company is bluestarcs.com. 693 00:51:08,655 --> 00:51:18,875 Find out more about what we do for lit support, or you can go to Siemly, like S-I-E-M, like a SIEM, L-Y, siemly.com, to find out about our Microsoft Investigations platform. 694 00:51:19,022 --> 00:51:19,592 Good stuff. 695 00:51:19,592 --> 00:51:27,415 Well, I appreciate you spending a few minutes with me this morning, and I'm sure we'll bump into each other sometime soon. 696 00:51:28,316 --> 00:51:29,976 All right, take care. 697 00:51:30,236 --> 00:51:31,197 Bye-bye. 698 00:51:31,277 --> 00:51:32,097 Bye.
11 00:00:31,223 --> 00:00:37,404 Stuff like M365 and AI and data and you've got a long history in legal tech. 12 00:00:37,404 --> 00:00:41,611 um You've been 20 plus years. 13 00:00:41,611 --> 00:00:45,447 Why don't you tell everybody kind of who you are, what you do and where you do it. 14 00:00:46,075 --> 00:00:46,845 Sure. 15 00:00:46,845 --> 00:00:48,396 My name is Sarah Thompson. 16 00:00:48,396 --> 00:00:50,976 I'm the Chief Product Officer for Blue Star. 17 00:00:50,976 --> 00:00:59,148 We're a litigation support shop at Chicago, but we operate worldwide, um but focus mainly on the United States. 18 00:00:59,289 --> 00:01:13,623 And uh what I do is really, uh we build legal tech solutions to kind of help out law firms and in-house counsel win their cases. 19 00:01:13,623 --> 00:01:15,437 I mean, that's the bottom line. 20 00:01:15,437 --> 00:01:23,044 and we do that both with, we have a investigations platform called Seamly, which kind of is pretty cool. 21 00:01:23,044 --> 00:01:34,894 It makes it so you don't have to collect data in Microsoft or at least start from a place of knowledge when you are doing collections, performing collections. 22 00:01:34,894 --> 00:01:40,319 And then we have, we build custom AI solutions to help, that are kind of matter-based. 23 00:01:40,319 --> 00:01:41,870 So we're doing a lot of really cool stuff. 24 00:01:41,870 --> 00:01:43,561 We've been doing it for, uh 25 00:01:43,857 --> 00:01:46,297 You know, like you said, I've been in the industry 20 years. 26 00:01:46,297 --> 00:01:49,637 Star has been actually been a litigation support shop for 20 years. 27 00:01:49,637 --> 00:01:53,737 So we kind of, we've seen it all done a lot. 28 00:01:54,477 --> 00:01:55,577 Don't know everything yet. 29 00:01:55,577 --> 00:01:57,237 Probably never will. 30 00:01:57,637 --> 00:01:58,941 So that's basically it. 31 00:01:58,941 --> 00:02:00,927 as you learn everything, it changes. 32 00:02:00,927 --> 00:02:03,331 So I don't think it's possible. 33 00:02:03,394 --> 00:02:06,096 I think if you learn it, it would be pretty boring. 34 00:02:06,217 --> 00:02:16,476 What I love about my job is that every day, you know, we're learning new stuff, especially with AI, it's like pretty mind blowing um how the pace that we're moving and the things 35 00:02:16,476 --> 00:02:21,491 that we can do today that we couldn't do even three months ago or yesterday, you know? 36 00:02:21,491 --> 00:02:23,250 So yeah, it's pretty cool. 37 00:02:23,250 --> 00:02:25,141 Yeah, I'm an enthusiast. 38 00:02:25,141 --> 00:02:27,893 don't call myself an AI expert. 39 00:02:27,893 --> 00:02:29,634 I'm more of an enthusiast. 40 00:02:29,634 --> 00:02:32,355 I don't even know what an AI expert really means. 41 00:02:32,355 --> 00:02:35,697 um Unless, you know, if you were... 42 00:02:35,697 --> 00:02:45,472 if you're working in the labs and engineering, okay, yeah, but you know, for the rest of us, I think just getting in and spending time with the tools and learning their 43 00:02:45,472 --> 00:02:50,456 capabilities and learning how to make them useful ah is best spent... 44 00:02:50,456 --> 00:02:52,408 actually doing both. 45 00:02:52,408 --> 00:02:53,579 We're into the code. 46 00:02:53,579 --> 00:02:55,220 into the rebuild agents, that kind of thing. 47 00:02:55,220 --> 00:02:57,732 That's, that's the really cool stuff. 48 00:02:57,732 --> 00:03:04,127 We definitely, yeah, we're looking at the products and we know what they do and it's really quite cool. 
49 00:03:04,127 --> 00:03:09,122 But I think that the future we're going to see is going to be beyond that. 50 00:03:09,122 --> 00:03:11,083 I don't think it's going to be tool-based. 51 00:03:11,083 --> 00:03:18,719 think they had, there's a very small window where people are going to be, you know, buying these cool AI tools and you know, very shortly they're going to be building them. 52 00:03:19,423 --> 00:03:20,913 I, we agree with you. 53 00:03:20,913 --> 00:03:27,045 So, you know, we're gonna, we're gonna talk about that in, in, in just a minute, but we, we, we agree with you. 54 00:03:27,045 --> 00:03:37,588 I think it's, terms of differentiation and how firms are going to go about that process, it's going to require, you're not going to be buying off the shelf tools to differentiate 55 00:03:37,588 --> 00:03:38,009 yourself. 56 00:03:38,009 --> 00:03:41,440 If your competitor can buy them, it's not differentiating. 57 00:03:41,440 --> 00:03:46,380 So, you know, you're going to need to use your data. 58 00:03:46,380 --> 00:03:52,331 So, well, let's talk a little bit about M365, because that's one area of overlap with you guys and us. 59 00:03:52,331 --> 00:03:59,753 So, InfoDash is an intranet extranet platform that's built on SharePoint Online, Azure and Teams. 60 00:03:59,993 --> 00:04:06,255 And we've been doing SharePoint legal probably longer than anyone out there. 61 00:04:06,255 --> 00:04:14,087 Started in 08 and we were a services company and built custom bespoke solutions. 62 00:04:14,087 --> 00:04:15,757 And then we productized 63 00:04:15,917 --> 00:04:31,517 We started the process in 2018 and then Microsoft completely nuked their development model in SharePoint in 2019 ish and released the SharePoint framework and know, switched from 64 00:04:31,517 --> 00:04:32,877 Angular to React JS. 65 00:04:32,877 --> 00:04:44,237 And so we kind of back to the drawing board, but we finally got across the goal line in and released in January of 2022 and things have been going great ever since. 66 00:04:44,237 --> 00:04:44,609 But 67 00:04:44,609 --> 00:04:53,799 How uh did your alignment with M365, by you, mean, Blue Star, how did that alignment with M365, how did that happen? 68 00:04:54,545 --> 00:05:04,508 Right, gosh, around 2012, we started creating a platform called eCloud Collect. 69 00:05:04,508 --> 00:05:08,819 And this was uh kind of a concept before its time, I want to say. 70 00:05:08,819 --> 00:05:10,409 It was like a cloud collection tool. 71 00:05:10,409 --> 00:05:18,852 So it would collect from Microsoft 365, Google, AWS, for discovery purposes. 72 00:05:18,852 --> 00:05:21,372 But it would also do remote computer collections. 73 00:05:21,393 --> 00:05:23,085 And it was quite big. 74 00:05:23,085 --> 00:05:25,166 before its time, you know? 75 00:05:25,207 --> 00:05:33,914 And then we ended up selling um this product over to a company called Zaproved, or some people call it Z-approved, it's Zaproved. 76 00:05:33,914 --> 00:05:35,255 It's actually now Xtero. 77 00:05:35,255 --> 00:05:44,543 uh And so they started like integrating our product into their, you know, kind of collection tool, because they're building kind of any discovery stack, right? 78 00:05:44,543 --> 00:05:47,415 So uh we were collecting from Microsoft already. 79 00:05:47,415 --> 00:05:49,806 And, you know, we learned a lot there. 80 00:05:49,806 --> 00:05:52,509 We had to do OneDrive and SharePoint. 81 00:05:52,592 --> 00:05:54,583 saw the challenges there. 
82 00:05:54,583 --> 00:05:57,985 And then we started moving. 83 00:05:58,005 --> 00:06:07,990 As it so happened, the product manager that headed up Microsoft eDiscovery over at Microsoft was a friend of mine from in the industry. 84 00:06:07,990 --> 00:06:09,671 His name was Rocky Messing. 85 00:06:09,671 --> 00:06:19,577 And we were chatting one day and he says, you know, it would be so great if you guys would build an integration with your legal hold tool into our Microsoft 365 eDiscovery 86 00:06:19,577 --> 00:06:20,907 preservation tool. 87 00:06:21,219 --> 00:06:29,366 And so that when people put a hold on and legal hold pro that they could, you it would actually go on and Microsoft and if they turned it off, you know, it would turn off 88 00:06:29,366 --> 00:06:30,658 anyways, the integration. 89 00:06:30,658 --> 00:06:32,600 So um we did that. 90 00:06:32,600 --> 00:06:40,406 And so we were the first, we built the first tool that integrated with the Microsoft Preservations with one, you know, in the legal tech space. 91 00:06:41,047 --> 00:06:42,789 And we learned a lot. 92 00:06:42,789 --> 00:06:43,710 It was not easy. 93 00:06:43,710 --> 00:06:47,665 um As you probably, as you know, you you alluded to. 94 00:06:47,665 --> 00:06:53,965 There's a lot of ins and outs, so we ended up having to use some weird technology. 95 00:06:54,385 --> 00:06:58,805 We had to use some old stuff, integrated PowerShell throughout. 96 00:06:58,805 --> 00:07:01,145 So a big learning curve. 97 00:07:02,085 --> 00:07:11,285 And so after that acquisition was complete, about three years later, I had gone over to disapprove for the acquisition. 98 00:07:11,285 --> 00:07:14,825 I came back with Bluestart, and we went back to the drawing board. 99 00:07:14,993 --> 00:07:17,553 we're kind of thinking about, you know, what am going to do next? 100 00:07:17,553 --> 00:07:29,413 And what I wanted to do was get, you know, really in on actual Microsoft investigations, you know, because, know, from a product perspective, you have like assumptions and, oh, 101 00:07:29,413 --> 00:07:37,953 this is how it works, you know, but it's nothing, there's nothing like actually doing it, like the user, you know, living that user's experience. 102 00:07:38,213 --> 00:07:41,426 So my team, of course, we do collect, like they do. 103 00:07:41,426 --> 00:07:46,486 and have been doing for years like forensic collections from Microsoft, all these employee investigations. 104 00:07:46,486 --> 00:07:50,506 So I got involved in these investigations. 105 00:07:51,166 --> 00:07:55,706 And what we kind of were thinking was, it's interesting. 106 00:07:56,246 --> 00:08:07,426 Having come from the forensic collection space just a little bit earlier, we were getting in these cases and they're all like, oh, let's get all the email, let's get all the 107 00:08:07,426 --> 00:08:10,577 SharePoint, let's get all the OneDrive files. 108 00:08:10,577 --> 00:08:13,197 for a relevant date range, which is a lot. 109 00:08:13,197 --> 00:08:15,817 And then let's get the computer as well. 110 00:08:15,817 --> 00:08:20,217 And then you start thinking to yourself, what are they actually looking for? 111 00:08:20,217 --> 00:08:22,797 So that question wasn't being asked, right? 112 00:08:22,797 --> 00:08:27,877 And so when we look at a computer forensic investigation, what are they looking at? 113 00:08:28,657 --> 00:08:29,877 What do the user do? 114 00:08:29,877 --> 00:08:31,197 What's the activity? 
115 00:08:31,197 --> 00:08:35,997 Because generally these are like, they could be employee theft, IP litigation, that kind of thing. 116 00:08:35,997 --> 00:08:37,557 What happened, right? 117 00:08:37,557 --> 00:08:38,651 So we... 118 00:08:38,651 --> 00:08:44,832 thought it was weird that nobody was looking at the activity in Microsoft, you know, because that is readily available. 119 00:08:44,832 --> 00:08:48,605 You don't need to collect a computer for 20 grand or 10 grand or whatever it is. 120 00:08:48,825 --> 00:08:51,746 it's all of it is there and you can just have a look. 121 00:08:51,746 --> 00:09:02,091 So we started looking into the activity logs from an investigatory perspective and it was just a goldmine, you know, who did what, when, who were they talking to? 122 00:09:02,091 --> 00:09:03,111 Who did they email? 123 00:09:03,111 --> 00:09:04,592 What files did they download? 124 00:09:04,592 --> 00:09:06,272 Like, what were they looking at? 125 00:09:06,272 --> 00:09:08,304 And we found that like, 126 00:09:08,304 --> 00:09:16,444 I mean, this sounds like crazy, but we found like we could solve cases, some, if the data was there, within like 24 to 48 hours without collection. 127 00:09:16,444 --> 00:09:19,624 So we could see exactly what the employees stole. 128 00:09:19,664 --> 00:09:25,584 We knew exactly the data collected email that showed the proof of that, what happened. 129 00:09:25,584 --> 00:09:27,384 And so we built a platform. 130 00:09:27,384 --> 00:09:36,744 That's the seamless platform that I was talking about that just takes the audit logs from Microsoft and kind of denoises it so that you can see what happened. 131 00:09:37,413 --> 00:09:42,433 And if you've ever looked at an audit log, it's really terrible to look at. 132 00:09:42,897 --> 00:09:48,410 It's verbose, there's a lot of noise, and we wanted to see what happened. 133 00:09:48,410 --> 00:09:53,153 so we built it from the perspective of investigations, like legal investigations. 134 00:09:53,153 --> 00:09:56,605 And so, yeah, that's my experience with Microsoft. 135 00:09:56,766 --> 00:10:05,976 And in building that platform, we, again, have a very deep um understanding of the kind of 136 00:10:05,976 --> 00:10:10,020 landscape in Microsoft because we had to delve into each source. 137 00:10:10,020 --> 00:10:16,846 If you want to, in the audit log, yeah, it says they looked at a document in SharePoint, but I won't tell you the name. 138 00:10:16,846 --> 00:10:22,182 So we take the SharePoint ID, we can dive into SharePoint, get the name of the file, all that stuff. 139 00:10:22,182 --> 00:10:27,136 So we got a really good understanding what's happening and it is complex. 140 00:10:27,877 --> 00:10:30,954 It's not as easy as you'd think it would be, right? 141 00:10:30,954 --> 00:10:32,865 Well, yeah, I've had to rip apart. 142 00:10:32,865 --> 00:10:39,204 was, I used to be on the SQL team at Microsoft years ago, 20, 26 years ago. 143 00:10:39,204 --> 00:10:40,112 It was crazy. 144 00:10:40,112 --> 00:10:49,941 um But so I've, I've ripped apart the SQL logs and, um IIS logs and Windows server logs. 145 00:10:49,941 --> 00:10:52,383 And yeah, it is, it is extremely painful. 146 00:10:52,383 --> 00:10:57,748 um You, we talked a little bit about the concept of like, 147 00:10:57,820 --> 00:11:00,863 modern attachments in Microsoft 365. 
148 00:11:00,863 --> 00:11:15,386 you know, for the listeners out there who have been super annoyed that when you try to attach a file to an email in Outlook, by default, it wants to, it wants it to be a modern 149 00:11:15,386 --> 00:11:15,948 attachment. 150 00:11:15,948 --> 00:11:20,621 It wants to basically host it in OneDrive and provide a link. 151 00:11:20,621 --> 00:11:24,754 And, you know, sometimes that works, but a lot of times it creates friction. 152 00:11:24,778 --> 00:11:34,455 I've had that where I'm trying to share with external parties and, and they don't have edit access or I lose track of where the file actually is. 153 00:11:34,455 --> 00:11:41,660 So sometimes it just, and for me more often than not, I revert back to a traditional attachment. 154 00:11:42,341 --> 00:11:47,745 just because I've had friction on and it's like, I don't have time to really like understand it. 155 00:11:47,745 --> 00:11:50,907 And I think a lot of people fall into that camp. 156 00:11:50,907 --> 00:11:52,299 So, um, 157 00:11:52,299 --> 00:11:58,202 Like what are modern attachments and how do they differ from traditional attachments? 158 00:11:58,202 --> 00:12:00,584 So modern attachments are just links. 159 00:12:00,584 --> 00:12:02,336 They are those links, the OneDrive links. 160 00:12:02,336 --> 00:12:04,898 They're links to files. 161 00:12:04,898 --> 00:12:13,756 uh Generally refer to files that are hosted uh in the uh productivity tool that you're using. 162 00:12:13,756 --> 00:12:17,270 So whether it's Microsoft 365 or Google Workspaces. 163 00:12:17,270 --> 00:12:20,432 So they're basically shared links to files. 164 00:12:20,432 --> 00:12:24,155 And so they help with collaboration, I think, in this day and age. 165 00:12:24,948 --> 00:12:30,252 much more prevalent than ever before after 2020, especially with the remote workers. 166 00:12:30,332 --> 00:12:33,344 So the problem, they've been around forever though, right? 167 00:12:33,344 --> 00:12:44,543 And the problem with traditionally discovery and how it's been is that you collect this email and there's a link there, but where's that file? 168 00:12:44,543 --> 00:12:52,272 So if you had an attachment, that attachment would be collected with the uh parent email and would be considered, you know, uh 169 00:12:52,272 --> 00:12:55,392 relevant to that email and that would be part of the discovery. 170 00:12:55,812 --> 00:12:58,832 Not with a modern attachment. 171 00:12:59,712 --> 00:13:10,872 Back in the day when we created eCloud Collect and we were collecting from Microsoft 365, we go to Microsoft and we're like, need that file. 172 00:13:11,052 --> 00:13:13,132 We need that file that's linked here. 173 00:13:13,132 --> 00:13:14,692 How do we get that? 174 00:13:14,692 --> 00:13:16,532 It's like, oh, you can't get that. 175 00:13:16,532 --> 00:13:19,872 This is like 13 years ago, granted. 176 00:13:20,824 --> 00:13:27,778 It's been a problem for a while and it's like you have to say to yourself, man, if the attachment is relevant, why is this not relevant? 177 00:13:27,778 --> 00:13:30,349 Like, why aren't we trying to get that file? 178 00:13:30,390 --> 00:13:38,049 So, um you know, like a lot of things in legal, it's like, well, what are you going to do? 179 00:13:38,049 --> 00:13:38,995 You can't get it, you know? 180 00:13:38,995 --> 00:13:41,296 And it's like, well, I think there's a better answer. 
181 00:13:41,296 --> 00:13:48,463 So we offer solutions where we can go and like kind of grab that file because, you know, you can see that file. 182 00:13:48,463 --> 00:13:49,534 what that file is. 183 00:13:49,534 --> 00:13:57,068 Like if you know understand programmatically, you know what that link is and you'll get the idea of the file, you could actually go get it. 184 00:13:57,068 --> 00:14:01,340 Now the problem is what file do you get, right? 185 00:14:01,581 --> 00:14:08,464 So you have a file like there was this like the term like, are you gonna get the contemporaneous copy or the live version? 186 00:14:08,464 --> 00:14:15,088 And so really what we're saying is what is the relevant file because a file 187 00:14:15,310 --> 00:14:27,641 as an attachment has a static state, it is one thing, but a file as a modern attachment, you know, is it, you know, on Monday it'll be this file, on Tuesday it'll be another file, 188 00:14:27,641 --> 00:14:31,674 people are modifying, getting it constantly, so what file are you getting? 189 00:14:31,695 --> 00:14:40,422 So for, you know, Microsoft came out with some features and they're like, okay, now you can get the file, but it was only the live version. 190 00:14:41,964 --> 00:14:42,864 Well. 191 00:14:42,884 --> 00:14:43,474 Hey, guess what? 192 00:14:43,474 --> 00:14:45,865 Now you can get all versions of the file. 193 00:14:45,865 --> 00:14:47,224 It's like, yeah, great. 194 00:14:47,224 --> 00:14:48,826 5,000 versions of one file. 195 00:14:48,826 --> 00:14:51,027 I mean, there can be a ton of versions, right? 196 00:14:51,027 --> 00:14:52,807 So is that what you want? 197 00:14:53,088 --> 00:15:04,391 So what they've now come out with is, and this is super new, this is like in preview currently in Microsoft, is they're now saying that you can get the contemporaneous 198 00:15:04,391 --> 00:15:04,872 version. 199 00:15:04,872 --> 00:15:06,123 ah 200 00:15:06,123 --> 00:15:15,443 And that if you do, if you turn on certain switches in Microsoft, which is like creating a retention, like a label that is for rediscovery. 201 00:15:15,603 --> 00:15:18,583 So they're trying to solve this problem right now. 202 00:15:18,583 --> 00:15:25,503 Like we've been solving this problem for some time, but there's a few things that still exist. 203 00:15:25,503 --> 00:15:25,963 Okay. 204 00:15:25,963 --> 00:15:29,003 So Microsoft, of course they have a discovery platform, right? 205 00:15:29,003 --> 00:15:33,083 Nobody uses it, they, you know, I mean, no offense to Microsoft, but they don't, right? 206 00:15:33,083 --> 00:15:34,803 Like people use relativity. 207 00:15:35,222 --> 00:15:37,893 Disco, they use other things, they use Everlo. 208 00:15:38,194 --> 00:15:48,371 So if you get data from Microsoft, yeah, you can get the cloud attachments, but they're not automatically linked to the parent email. 209 00:15:48,371 --> 00:15:52,002 So there has to be a secondary process that occurs with your vendor. 210 00:15:52,584 --> 00:15:58,147 That's something that we do where you have to of like marry the files with the parent attachments, and they make it very difficult. 211 00:15:58,147 --> 00:16:04,179 uh On purpose, I mean, sure, because they want people to use their platform, which I totally understand. 212 00:16:04,179 --> 00:16:06,189 So that's a lot of challenges there. 213 00:16:06,189 --> 00:16:08,338 um People can't ignore them. 
214 00:16:08,475 --> 00:16:23,705 Yeah, you know, I've heard some interesting perspectives on even the definition of what a file is, is really, it's not really a discrete object, binary object like it used to be. 215 00:16:23,705 --> 00:16:27,077 It is now a collaboration space, right? 216 00:16:27,077 --> 00:16:29,808 And it has multiple versions. 217 00:16:29,808 --> 00:16:34,691 It has uh conflict management. 218 00:16:34,851 --> 00:16:35,721 part of it, right? 219 00:16:35,721 --> 00:16:41,653 If I'm trying to edit the same piece of text that you're trying to edit, that somehow has to be resolved. 220 00:16:41,653 --> 00:16:44,504 um There are permissions associated. 221 00:16:44,504 --> 00:16:47,835 It's almost like, you know, files have now become... 222 00:16:47,835 --> 00:16:52,126 and I'm gonna use just Word docs because everybody knows what that is. 223 00:16:52,126 --> 00:16:59,938 It's almost like they're online wikis, um you know, rather than a traditional binary, discrete document. 224 00:16:59,938 --> 00:17:03,420 So, I would imagine that makes the whole e-discovery process much more challenging. 225 00:17:03,420 --> 00:17:11,620 Yeah, I really want people to start, know, attorneys and their let's support to start talking about what they're looking for, you know? 226 00:17:11,760 --> 00:17:19,000 So that, you know, while we try to have these conversations with our clients all the time, it's like, oh yeah, get me this, get me that, get me the other. 227 00:17:19,000 --> 00:17:21,140 And we could just say, do you want your cloud to happen? 228 00:17:21,140 --> 00:17:21,680 Sure. 229 00:17:21,680 --> 00:17:23,800 It's like, okay, you know, great. 230 00:17:23,800 --> 00:17:29,300 Now you're, instead of hosting 500 documents, you're hosting, you know, 17,000, you know? 231 00:17:29,300 --> 00:17:30,660 Like, that's awesome for us. 232 00:17:30,660 --> 00:17:33,380 We're charging you hosting, but is that really? 233 00:17:33,757 --> 00:17:35,458 You know, that kind of sucks, right? 234 00:17:35,458 --> 00:17:41,803 um we really, our conversations are now like, okay, exactly what are you looking for? 235 00:17:41,803 --> 00:17:45,745 You know, those files that you can identify as relevant. 236 00:17:45,745 --> 00:17:47,727 uh What is the date range? 237 00:17:47,727 --> 00:17:53,491 Are you looking for something that John did on a certain, what is the relevant date range that he did? 238 00:17:53,491 --> 00:18:01,656 Because you can say, well, instead of looking for all the files or all the emails, uh I want to know anything that John, you know, 239 00:18:01,656 --> 00:18:08,418 edited or modified or viewed during a certain period because just sending him the link doesn't mean he even opened it, right? 240 00:18:08,498 --> 00:18:12,079 So that's the level that we should be looking at it. 241 00:18:12,959 --> 00:18:18,681 you know, if that email is relevant, then we need to go dive into that, that attachment. 242 00:18:18,681 --> 00:18:22,172 And I don't, can think that could even be a secondary process, you know? 243 00:18:22,172 --> 00:18:24,483 So put everything on hold. 244 00:18:24,483 --> 00:18:25,423 Everything's preserved. 245 00:18:25,423 --> 00:18:26,463 It's there when you need it. 246 00:18:26,463 --> 00:18:27,104 Cool. 247 00:18:27,104 --> 00:18:28,484 Make sure of that. 248 00:18:28,764 --> 00:18:31,687 that you preserved it properly and there's retention labels and all this good stuff. 
249 00:18:31,687 --> 00:18:44,742 And then identify the emails where these relevant links you think might make sense, like you want to look at and decide what do I want to look at and when and why, you know, and 250 00:18:44,742 --> 00:18:45,803 get that. 251 00:18:46,104 --> 00:18:48,286 That will really reduce your review, you know. 252 00:18:48,286 --> 00:18:50,068 uh 253 00:18:50,466 --> 00:18:52,777 Yeah, that makes sense. 254 00:18:52,777 --> 00:19:01,322 you and I talked a little bit about bringing your data to AI and bringing your AI to data. 255 00:19:01,822 --> 00:19:12,298 I think we're aligned in that as the future of law, Big Law 2.0, kind of unfolds in front of us that 256 00:19:12,436 --> 00:19:21,198 it's going to be difficult, as I mentioned kind in the intro for firms to differentiate themselves by buying Harvey or Legora or Paxton or these off the shelf tools. 257 00:19:21,198 --> 00:19:22,639 They're a good starting point. 258 00:19:22,639 --> 00:19:25,800 Those tools have value, but they're not differentiating. 259 00:19:25,800 --> 00:19:36,062 It's the collective knowledge and wisdom and work product um that has led to successful outcomes for the firm's clients that is differentiating. 260 00:19:36,283 --> 00:19:40,156 like for us, for example, we're not really an AI company. 261 00:19:40,156 --> 00:19:42,134 We're intranet extranet platform. 262 00:19:42,285 --> 00:19:54,700 But as part of our uh install process, we stand up an Azure, I'll call it a virtual appliance, essentially that taps into all the back office systems that a law firm uses and 263 00:19:54,700 --> 00:20:01,112 then presents a unified security trimmed API that respects ethical wall boundaries. 264 00:20:01,233 --> 00:20:09,600 like we did that, the reason that we built that is because our web parts need to surface information from I manage, from elite, from Adderent, from. 265 00:20:09,600 --> 00:20:12,101 interaction from foundation from all these different places. 266 00:20:12,101 --> 00:20:14,042 So it made our job easier. 267 00:20:14,042 --> 00:20:26,327 But then when AI hit the scene and we, this thing has existed for many years, but then when AI hit the scene and Azure released Azure open AI, Azure AI search, formerly Azure 268 00:20:26,327 --> 00:20:33,830 cognitive search, it created, we now have a utility that enables the law firms to go, Hey, you know what? 269 00:20:33,830 --> 00:20:38,890 want to crawl an index, a SQL repository or this file share and 270 00:20:38,890 --> 00:20:41,461 leverage Azure AI search to crawl and index it. 271 00:20:41,461 --> 00:20:46,283 Then I want to use Azure Open AI and run some AI operations on it. 272 00:20:46,283 --> 00:20:52,265 So it allows firms to bring AI to their data rather than again, Harvey Legora. 273 00:20:52,265 --> 00:21:04,500 Yeah, you can in Harvey, you can go navigate your way to a matter workspace and I manage and bring in a set of documents for rag purposes and perform actions, but not what we're, 274 00:21:04,500 --> 00:21:07,592 what we're enabling is like wholesale AI operations. 275 00:21:07,592 --> 00:21:08,748 So you could take 276 00:21:08,748 --> 00:21:19,896 you know, a SQL database full of regulatory updates for labor and employment and all your clients employment agreements, employee handbooks, crawl and index it, Azure AI search, 277 00:21:19,896 --> 00:21:22,958 and then use Azure AI and flag exceptions. 278 00:21:22,958 --> 00:21:25,579 doing those whole, yeah. 
279 00:21:25,579 --> 00:21:32,385 But I don't know, what is your perspective on bringing AI to the firm's data? 280 00:21:32,385 --> 00:21:36,388 like firm meaning law firm or a type of corporation. 281 00:21:37,569 --> 00:21:49,518 Yeah, so law firms, it's interesting because a lot of these solutions that these new solutions like that are outside of kind of the traditional discovery workflow. 282 00:21:49,518 --> 00:21:55,763 They don't really work for law firms um from a business perspective, right? 283 00:21:55,763 --> 00:21:57,364 uh 284 00:21:57,592 --> 00:22:01,745 less data review, less hours, less build, et cetera, et cetera. 285 00:22:01,745 --> 00:22:04,497 And there are whole business models generally on hours. 286 00:22:04,497 --> 00:22:18,437 And I think that it's going to have to change because corporations are going to become more more uh agile and able to perform, get a look into their data more quickly and 287 00:22:18,437 --> 00:22:19,107 easily. 288 00:22:19,107 --> 00:22:25,976 so I think that, so right now I'm hearing a lot of outside counsel not really interested in that kind of like, 289 00:22:25,976 --> 00:22:29,328 agents in place, but the corporations are super interested. 290 00:22:29,328 --> 00:22:31,069 They're like, I need to know what we have. 291 00:22:31,069 --> 00:22:32,059 I want to know. 292 00:22:32,059 --> 00:22:34,210 And you see all this stuff in the productivity tools. 293 00:22:34,210 --> 00:22:39,623 Like you see like communication compliance, when things are like so that you can be litigation ready. 294 00:22:39,623 --> 00:22:43,115 ooh, oh that's a little, you know, sexual harassment there. 295 00:22:43,115 --> 00:22:44,366 Flag me, let me know. 296 00:22:44,366 --> 00:22:51,190 Like HR is like getting alerted when people are behaving in a way that may be, you know, cause future litigation. 297 00:22:51,190 --> 00:22:55,868 There's, um now you can actually, uh 298 00:22:55,868 --> 00:22:57,388 perform investigations. 299 00:22:57,388 --> 00:23:00,888 And this is again a preview feature in Microsoft utilizing AI. 300 00:23:00,888 --> 00:23:12,208 So you can kind of search and they're saying this is for, you know, kind of a breach purposes, but you could see it, it's an investigation. 301 00:23:12,208 --> 00:23:13,828 I mean, you know, I haven't used it yet. 302 00:23:13,828 --> 00:23:18,196 I haven't really kind of dove in enough to give a good kind of. 303 00:23:19,320 --> 00:23:21,782 perspective on how well it works. 304 00:23:21,782 --> 00:23:30,908 But I mean, really, you can see in the future that what you want to be doing in a corporation or even as a law firm is going into your clients and querying the data that's 305 00:23:30,908 --> 00:23:31,789 there. 306 00:23:31,789 --> 00:23:33,891 What kind of data? 307 00:23:33,891 --> 00:23:36,312 Where are the documents that discuss this? 308 00:23:36,312 --> 00:23:40,135 Is there anything in the SharePoint site that is relevant to XYZ? 309 00:23:40,135 --> 00:23:43,758 And that is 100 % possible. 310 00:23:43,758 --> 00:23:47,079 mean, like you said, you have the Azure OpenAI. 311 00:23:48,317 --> 00:23:50,238 API that's completely available. 312 00:23:50,238 --> 00:23:53,460 You have copilot people are going to you know, it's gonna get better and better. 
313 00:23:53,460 --> 00:24:02,505 So yeah, that's the future is like people are just going to be like using the whether it's an eDiscovery tool or just an agent which saying like hey, what's in here and they're 314 00:24:02,505 --> 00:24:04,366 starting that now. 315 00:24:04,366 --> 00:24:05,686 Whether or not it's great. 316 00:24:05,686 --> 00:24:06,607 It's not that great. 317 00:24:06,607 --> 00:24:09,709 um People are building tools to do that. 318 00:24:09,709 --> 00:24:13,020 Like so we have some some friends that are building in the industry. 319 00:24:13,020 --> 00:24:16,198 They're building some tools where it's like, hey, we're gonna spin up 320 00:24:16,198 --> 00:24:22,312 Just like you're doing, you're spinning up these virtual environments within a company's Azure environment. 321 00:24:22,332 --> 00:24:23,483 that's the way to go, right? 322 00:24:23,483 --> 00:24:29,157 So you want privacy, you want protection of your data, then don't let it leave. 323 00:24:29,157 --> 00:24:37,163 That model is kind of outdated where you're uploading data into a relativity or into some other tool, like an I managed to look at it. 324 00:24:37,163 --> 00:24:37,593 Why? 325 00:24:37,593 --> 00:24:43,487 I don't see that model lasting for too much longer. 326 00:24:43,667 --> 00:24:45,020 It's expensive. 327 00:24:45,020 --> 00:24:46,880 It's risky. 328 00:24:47,480 --> 00:24:52,320 Your data does not, the retention policies that apply to your data no longer apply. 329 00:24:52,700 --> 00:24:58,880 That data that resides wherever it is, is still subject to discovery. 330 00:24:59,460 --> 00:25:05,500 So you don't want to, I think that that's where we're going to see it more and more. 331 00:25:05,500 --> 00:25:14,440 I'm just interested to understand how like these software companies plan on monetizing their applications. 332 00:25:15,398 --> 00:25:23,116 with such powerful AI being able to kind of create these agents so easily internally. 333 00:25:23,116 --> 00:25:31,396 you know, and agents and interoperability, that was a big theme at Ilticon this year. 334 00:25:32,236 --> 00:25:45,656 so yeah, we heard, you know, I believe it was I manage and their adoption of model context protocol, MCP, which allow, you know, agent to agent communication. 335 00:25:46,196 --> 00:25:47,376 I 336 00:25:47,417 --> 00:26:00,137 I'm very bullish on the future of agents, but I'm a little bit bearish on their ability to be deployed in any high risk scenario now. 337 00:26:00,277 --> 00:26:05,097 So for a customer service agent, no problem. 338 00:26:05,097 --> 00:26:07,837 A sales and marketing agent, no problem. 339 00:26:08,517 --> 00:26:14,797 A new matter intake agent, you're gonna need a human in the loop on that. 340 00:26:14,997 --> 00:26:15,906 Yeah, well. 341 00:26:15,906 --> 00:26:20,980 Well, um because right now, LLMs are not deterministic. 342 00:26:20,980 --> 00:26:28,616 In other words, you can take the same prompt, ah run it, copy and paste it, run it again, you're going to get back different results, right? 343 00:26:28,616 --> 00:26:30,067 So, they're not deterministic. 344 00:26:30,067 --> 00:26:32,279 uh There's still hallucinations. 345 00:26:32,279 --> 00:26:35,552 They've gotten better, but there are still are hallucinations. 346 00:26:35,552 --> 00:26:37,473 They're not hallucination free. 347 00:26:37,534 --> 00:26:42,598 So, anything of a high-risk nature, I would bucket that in. 348 00:26:42,598 --> 00:26:45,462 That's a little bit further down the road. 
349 00:26:45,462 --> 00:26:57,575 Let's, let's check off the, let's check off the lower risk use cases first where, know, if something doesn't get classified properly, has big dollar economic implications. 350 00:26:57,575 --> 00:26:58,276 I just don't. 351 00:26:58,276 --> 00:27:12,550 um and, everybody may have different tolerances on this, but, um, given where, and things are moving so quickly, like honestly, uh, chat GPT five, I like less than I did, uh, prior 352 00:27:12,550 --> 00:27:14,380 where I could pick my model. 353 00:27:14,597 --> 00:27:19,916 And you know, yeah, because they backpedaled. 354 00:27:19,916 --> 00:27:29,196 reacted pretty quickly, but if you think about software, you know, it's generally, you know, a SaaS solution, right? 355 00:27:29,196 --> 00:27:33,656 You don't have like six versions of Salesforce you're using the current, right? 356 00:27:33,676 --> 00:27:37,907 But I mean, people have relationships with these models, which is a little bit different. 357 00:27:37,907 --> 00:27:43,827 They'll probably, you know, we're learning so much with like AI so new. 358 00:27:43,912 --> 00:27:47,971 that they're applying traditional kind of software models and releases. 359 00:27:47,971 --> 00:27:53,372 And they may just, they're gonna learn, they're gonna pivot, they're gonna do something that makes a little bit more sense to its users. 360 00:27:53,372 --> 00:27:56,692 But to be honest, I don't like chat GBT as much either. 361 00:27:57,592 --> 00:27:59,892 But some people really do. 362 00:27:59,892 --> 00:28:00,832 They're like, oh, it's super fast. 363 00:28:00,832 --> 00:28:03,463 I find it very slow, quite frankly. 364 00:28:03,463 --> 00:28:04,763 But I kind of agree with you. 365 00:28:04,763 --> 00:28:07,483 Just to come back to your point about... 366 00:28:08,234 --> 00:28:17,005 I just wanted to be double as advocate as to why you think they're not ready, these agents, to kind of do the things that are high risk. 367 00:28:17,005 --> 00:28:21,585 You have to kind of treat it like a junior associate. 368 00:28:21,585 --> 00:28:23,565 Like this stuff needs eyes on. 369 00:28:23,565 --> 00:28:34,605 And I think in pretty much in most respects, even if it's not high risk, if you're going to be repeating anything that you get out of AI, you should probably make sure that it's 370 00:28:34,605 --> 00:28:35,897 actually true. 371 00:28:35,897 --> 00:28:45,610 um Even, you know, even like, you know, facts about the news or this or that or the other, like, you know, this is not perfect. 372 00:28:45,610 --> 00:28:52,181 is getting data that it's been trained on and the training data may not be correct. 373 00:28:52,181 --> 00:28:57,723 um The people that are creating the agents, they have bias. 374 00:28:57,723 --> 00:29:03,035 They, you know, you don't have any transparency into how these are created or anything like that. 375 00:29:03,035 --> 00:29:03,785 So. 376 00:29:03,965 --> 00:29:09,268 We always like we do a lot of AI solutions and I would never say, all right, yeah, just send this out. 377 00:29:09,268 --> 00:29:18,653 It's like, you know, when we create something for our clients, we, we proof it and then we make sure that they proof it, you know, this is not a person. 378 00:29:18,653 --> 00:29:20,634 This is a machine. 379 00:29:20,634 --> 00:29:22,836 It is that it created this. 380 00:29:22,836 --> 00:29:24,046 So you, but it's real. 381 00:29:24,046 --> 00:29:25,167 mean, they're very effective. 
382 00:29:25,167 --> 00:29:26,107 They save a lot of time. 383 00:29:26,107 --> 00:29:30,260 Like we do production requests responses. 384 00:29:30,260 --> 00:29:32,878 We have a tool that does this for our clients. 385 00:29:32,878 --> 00:29:37,699 And it writes as the attorneys write and it has the same format and looks exactly like that. 386 00:29:37,699 --> 00:29:43,080 So we'll create a production request response um for the attorneys to start with. 387 00:29:43,080 --> 00:29:54,343 So it just saves them a lot of time just to even create that saves them like, you know, days, provides like sample arguments, you know, that can, you know, cool stuff like that. 388 00:29:54,343 --> 00:29:56,884 But you know, I would never say just send that out. 389 00:29:56,884 --> 00:30:00,651 Like you get, you know, it'll take them an hour instead of two days to do something. 390 00:30:00,651 --> 00:30:02,045 I think that's great. 391 00:30:02,185 --> 00:30:02,797 You know, 392 00:30:02,797 --> 00:30:03,068 Yeah. 393 00:30:03,068 --> 00:30:07,711 Like I have an agent that, it's still not perfected. 394 00:30:07,711 --> 00:30:12,684 I'm still kicking around on an eight and, using co-pilot and trying to figure out the right path. 395 00:30:12,684 --> 00:30:19,488 But like, I have, uh, I have started, this is a great example of something you can use an agent for today. 396 00:30:19,488 --> 00:30:20,238 That's low risk. 397 00:30:20,238 --> 00:30:25,631 And if it fails, it's not a big deal, but I get emails from, 398 00:30:27,891 --> 00:30:33,263 like think tanks, like, you know, I, I subscribe to Jeff, Brant's Penhawk newsletter. 399 00:30:33,263 --> 00:30:33,904 It's great. 400 00:30:33,904 --> 00:30:45,719 Uh, artificial lawyer, Bob Ambrosia, and I have them routed all to a folder and I have an agent that goes through and finds out the stuff that's compelling for thought leadership 401 00:30:45,719 --> 00:30:46,249 activities. 402 00:30:46,249 --> 00:30:52,752 Like gives me a bulleted list of, Hey, these are the things that happened yesterday in legal tech or, or AI. 403 00:30:52,752 --> 00:30:54,718 Um, that's a s 404 00:30:54,718 --> 00:30:55,374 cool. 405 00:30:55,374 --> 00:30:56,045 Yeah. 406 00:30:56,045 --> 00:30:57,926 super easy and it's low risk. 407 00:30:57,926 --> 00:31:03,629 know, I use, uh haven't, I don't have not orchestrated an agent to do this yet. 408 00:31:03,629 --> 00:31:06,070 It's still manual, but it's going to be an agent. 409 00:31:06,070 --> 00:31:10,692 When I, when I do a planning call for an agenda, I record it. 410 00:31:10,692 --> 00:31:11,992 I download the transcript. 411 00:31:11,992 --> 00:31:18,245 I load it into a custom called Claude project and it outputs, uh I've trained it in its training materials. 412 00:31:18,245 --> 00:31:22,957 have several handwritten agendas back when I have to use to have to do it myself. 413 00:31:22,957 --> 00:31:24,042 Eventually. 414 00:31:24,042 --> 00:31:35,136 You know, I'll wire up a Zapier integration or with N8n go through and have it just as soon as I end the call, I'll have a naming convention in the meeting planner in my Outlook 415 00:31:35,136 --> 00:31:41,269 calendar that it will know to go grab it, run it through the project, and then send me the result. 416 00:31:41,269 --> 00:31:44,780 You know, those are all really productive and they're real time savers. 
417 00:31:44,780 --> 00:31:53,215 Like it's, it's, it's, it's, um, I'm getting real time savings from it, but would I have it go through and 418 00:31:53,215 --> 00:31:54,550 Pay my bills? 419 00:31:54,796 --> 00:31:55,481 No. 420 00:31:55,481 --> 00:31:57,661 Nope, no, of course not. 421 00:31:58,061 --> 00:32:00,281 No, 100%, 100%. 422 00:32:00,281 --> 00:32:02,081 It's a huge time saver. 423 00:32:02,081 --> 00:32:06,001 People should be, you know, I'm just focusing on illegal. 424 00:32:06,001 --> 00:32:11,921 They should be using it for more than just thinking about, oh, I can do auto review for this. 425 00:32:12,041 --> 00:32:15,461 No, there's so, so many things that can be done. 426 00:32:15,781 --> 00:32:18,781 We integrated into all of our processes. 427 00:32:18,781 --> 00:32:25,561 We understand how to do that in a secure and, know, know, privacy first kind of manner, but. 428 00:32:26,019 --> 00:32:28,661 Like I want to go out and like teach corporations this stuff. 429 00:32:28,661 --> 00:32:35,505 Like I'm watching them uh buy these products for like six figures subs, you know, on manual basis. 430 00:32:35,505 --> 00:32:41,700 And, know, I'm thinking to myself, oh, she's just like, she's hired dude to carry these. 431 00:32:41,700 --> 00:32:51,186 Like you could create these internally, uh you know, that are probably more effective and more customized than then and more secure than you are currently, you know, paying these 432 00:32:51,186 --> 00:32:52,407 six figure subs. 433 00:32:52,407 --> 00:32:53,388 And what are they doing? 434 00:32:53,388 --> 00:32:55,481 You're not, they're using these like, 435 00:32:55,481 --> 00:32:57,061 products that are coming out. 436 00:32:57,061 --> 00:32:58,001 Oh, don't worry. 437 00:32:58,001 --> 00:33:01,501 We're going to use your, we'll use your API. 438 00:33:01,501 --> 00:33:02,681 Oh, great. 439 00:33:02,781 --> 00:33:05,381 So I pay for, you pay for the AI, right? 440 00:33:05,381 --> 00:33:07,821 And we're going to use your environment in Azure. 441 00:33:07,821 --> 00:33:08,801 Awesome. 442 00:33:08,921 --> 00:33:10,801 You're paying for the infrastructure. 443 00:33:10,801 --> 00:33:20,741 And so when you get these six figure, you know, I've been in this, the legal tech industry a long time with six figure subscriptions is normal for a large corporation. 444 00:33:21,321 --> 00:33:24,142 And these software companies, they're 445 00:33:24,142 --> 00:33:29,344 continuing to do so, but providing less and less value, right? 446 00:33:29,534 --> 00:33:30,504 you're providing the code. 447 00:33:30,504 --> 00:33:39,909 Well, the know, the news today on NPR was, you know, that people with a computer science degree, you know, used to be able to go to school and come out and get a six figure 448 00:33:39,909 --> 00:33:40,390 salary. 449 00:33:40,390 --> 00:33:49,023 Now they're one of the most, you know, unemployed, uh newly graduated comp sci students are the most unemployed sector of our economy. 450 00:33:49,023 --> 00:33:50,663 I have a computer science degree. 451 00:33:50,663 --> 00:33:52,576 I got it so I could get a job. 452 00:33:52,576 --> 00:33:55,659 Back in the day, I was into social science. 453 00:33:55,659 --> 00:33:59,442 I switched from international development to this. 454 00:34:00,384 --> 00:34:03,047 to think about that, I mean, that's a huge shift. 455 00:34:03,047 --> 00:34:06,940 So I think software companies, you're also going to see a shift. 
456 00:34:06,940 --> 00:34:15,039 So AI is affecting us in so many different ways that I don't think we can predict, but it will change everything. 457 00:34:15,039 --> 00:34:16,505 ah 458 00:34:16,505 --> 00:34:28,433 You know, and I hear a lot of banter along those lines, like, you know, Satya Nadella, who's the CEO of Microsoft, talked about kind of the death of SaaS. 459 00:34:28,433 --> 00:34:33,937 And I'll tell you right now, the death of SaaS isn't touching our business anytime soon. 460 00:34:33,937 --> 00:34:45,168 I'm sure there are scenarios, like maybe a CRM, where you can vibe code, you know, a basic contact database and the ability to have a tickler, and, you know, 461 00:34:45,168 --> 00:34:46,549 okay, yeah, maybe. 462 00:34:46,549 --> 00:34:58,241 But what we do, like all the integrations and all the management that comes with it, like when the API changes, we have to go change our code, like, 463 00:34:58,241 --> 00:35:00,053 for AI to go do that? 464 00:35:00,053 --> 00:35:05,858 And then the deployment, like we deploy in the client's tenant, like there's all these environmental variables. 465 00:35:05,858 --> 00:35:08,331 Like, it was really interesting. 466 00:35:08,331 --> 00:35:09,852 We had a client. 467 00:35:09,852 --> 00:35:15,236 We just migrated off of HighQ, which is kind of the incumbent in the extranet space. 468 00:35:15,497 --> 00:35:30,659 And, um, the law firm had two customers where we couldn't send invitations. We initiated the external sharing 469 00:35:30,659 --> 00:35:31,311 invitation, 470 00:35:31,311 --> 00:35:33,071 kind of bypassed the GUI, 471 00:35:33,071 --> 00:35:38,005 did it through the API, and we had two customers out of thousands, 472 00:35:38,005 --> 00:35:39,613 I think there were like 4,000, 473 00:35:39,613 --> 00:35:42,545 where it wouldn't work, and we're like, what the hell is going on here? 474 00:35:42,545 --> 00:35:43,836 But it worked in HighQ. 475 00:35:43,836 --> 00:35:57,485 So we had to pick it apart. There are two little check boxes, and both of these companies had headquarters in China, and way buried deep in the bowels of the Azure 476 00:35:57,485 --> 00:35:59,607 admin center 477 00:35:59,607 --> 00:36:06,791 there are a couple of check boxes, settings that enable or disable your ability to share with them. 478 00:36:06,791 --> 00:36:08,134 um 479 00:36:08,134 --> 00:36:13,686 I think they use some separate network in China for M365 customers. 480 00:36:13,686 --> 00:36:16,107 It's above my pay grade. 481 00:36:16,107 --> 00:36:17,997 Anyway, lots of little nuances like that. 482 00:36:17,997 --> 00:36:23,269 Like, yeah, you're never going to just create an AI agent and say, here, just go deploy this. 483 00:36:23,269 --> 00:36:24,169 I say never. 484 00:36:24,169 --> 00:36:26,510 Maybe never isn't right, but not anytime soon. 485 00:36:26,510 --> 00:36:31,552 Are you going to have a button where you just go deploy something as complex as what we have? 486 00:36:31,552 --> 00:36:32,664 um 487 00:36:32,664 --> 00:36:33,755 Let's talk about that a little bit. 488 00:36:33,755 --> 00:36:36,046 Can they write the code to do it? 489 00:36:36,626 --> 00:36:43,149 If they're given the proper instructions, if you give the AI the proper instructions on writing that code, then yes.
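A rough illustration of the API-first sharing Sarah describes in that exchange: sending an external sharing invitation through Microsoft Graph instead of the SharePoint GUI. The /invite action on a drive item is a documented Graph endpoint, but the permission scope, token handling, and parameter values below are assumptions for the sketch, not BlueStar's actual implementation.

```python
# Hypothetical sketch of initiating an external sharing invitation via Microsoft Graph.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = os.environ["GRAPH_ACCESS_TOKEN"]  # assumes an app granted Files.ReadWrite.All

def invite_external_user(drive_id: str, item_id: str, email: str) -> dict:
    """Send a read-only external sharing invitation for one drive item."""
    resp = requests.post(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/invite",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "recipients": [{"email": email}],
            "requireSignIn": True,
            "sendInvitation": True,
            "roles": ["read"],
            "message": "You've been invited to the matter workspace.",
        },
        timeout=30,
    )
    # Tenant-level collaboration settings (the kind of buried admin-center check boxes
    # mentioned above) can still make this call fail for specific external domains.
    resp.raise_for_status()
    return resp.json()
```

That last comment is the point of the anecdote: the call itself is simple, but environmental settings in each tenant decide whether it actually works.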
490 00:36:43,549 --> 00:36:52,253 But, um, you need the subject matter experts that understand it, because, you know, all AI is doing is learning from what's out there. 491 00:36:52,253 --> 00:36:57,314 So if you're doing something truly innovative, it does not know how to do that. 492 00:36:57,875 --> 00:37:02,457 And if there's no documentation or examples of how to do that, you're not going to have that. 493 00:37:02,457 --> 00:37:03,417 It'll just... 494 00:37:03,577 --> 00:37:12,711 Even coding, it hallucinates quite a bit. I love that it can code, it's great, it saves some time, but you cannot trust that code. 495 00:37:12,711 --> 00:37:21,835 People think you can, oh yeah, I can just, you know, no. You need these subject matter experts that understand the true workflow and process and that make sure that 496 00:37:21,835 --> 00:37:23,856 things are not overlooked, et cetera, et cetera. 497 00:37:23,856 --> 00:37:31,334 Like what you're talking about, some tasks, you know, the more rote tasks and testing and 498 00:37:31,334 --> 00:37:41,871 basic coding will be able to be taken over; the QA process, I think, is really going to benefit from AI, but not the innovation, the creativity, the true in-depth 499 00:37:41,871 --> 00:37:43,452 understanding of subject matter. 500 00:37:43,452 --> 00:37:46,974 AI doesn't understand that. 501 00:37:47,174 --> 00:37:49,936 And I have, like, a little example of that. 502 00:37:49,936 --> 00:37:54,858 um If we have a minute, my business partner 503 00:37:55,851 --> 00:37:59,811 and I, Greg Estes, he's the CEO of BlueStar, 504 00:37:59,811 --> 00:38:03,531 we've been working on these technology projects forever. 505 00:38:03,531 --> 00:38:06,651 And he's like a big ideas guy, but he can't code. 506 00:38:06,711 --> 00:38:09,131 He can't code to save his life. 507 00:38:09,131 --> 00:38:13,031 So it'd always be like, okay, Sarah, I want you guys to build this. 508 00:38:13,031 --> 00:38:14,011 Let's try that. 509 00:38:14,271 --> 00:38:21,731 And now he can build his own apps with AI. 510 00:38:21,731 --> 00:38:25,683 There's a platform called Replit, which is... 511 00:38:25,683 --> 00:38:28,695 You basically, you know, say, I want to build this app that does this. 512 00:38:28,695 --> 00:38:34,370 And to be honest, I hate it so much because I'm like, I can't believe that works. 513 00:38:34,370 --> 00:38:44,161 Like he builds, you know, a Salesforce-style CRM, and I'm like, okay, it is great, but could you update it? 514 00:38:44,161 --> 00:38:45,631 Do you understand how the code works? 515 00:38:45,631 --> 00:38:47,192 Is it doing everything you want it to? 516 00:38:47,192 --> 00:38:48,873 Where's the data stored? 517 00:38:48,873 --> 00:38:53,981 People that don't have that in-depth knowledge, you know, they can't answer that. 518 00:38:53,981 --> 00:38:55,162 They don't know. 519 00:38:55,743 --> 00:39:04,050 The small things that aren't working, they try to update with a text prompt, and the whole product, you know, stops working. 520 00:39:04,050 --> 00:39:13,737 So, you know, I think we're getting to a point where people are going to be able to create more interesting things without having that product knowledge, but you still need it, 521 00:39:14,358 --> 00:39:14,919 you know what I mean?
522 00:39:14,919 --> 00:39:21,444 Like, I do believe that if people want to differentiate themselves, you still have to have that 523 00:39:22,483 --> 00:39:29,860 technological background or understanding in order to create a product that really does work and is marketable, you know? 524 00:39:29,860 --> 00:39:31,611 Yeah, so I'm a software guy too. 525 00:39:31,611 --> 00:39:45,985 I came up the ranks as a software engineer, and the -ilities, scalability, maintainability, reliability, interoperability, usability, testability, portability, none 526 00:39:45,985 --> 00:39:49,766 of that exists today when you're vibe coding. 527 00:39:49,766 --> 00:39:53,672 It's great for prototyping and proofs of concept, 528 00:39:53,672 --> 00:39:54,009 Yeah. 529 00:39:54,009 --> 00:40:02,395 but this is not production-ready code, and where we have to get to from where we are, it's a big jump. 530 00:40:02,395 --> 00:40:03,466 Will we get there one day? 531 00:40:03,466 --> 00:40:09,761 I'm sure eventually we will, but all the -ilities today, if you crack open the code... I will say this. 532 00:40:09,761 --> 00:40:15,825 One thing I have found AI really useful for is writing SQL. 533 00:40:15,825 --> 00:40:20,869 So that's the one area that I've kind of kept up to date, because it hasn't 534 00:40:20,869 --> 00:40:21,849 changed, right? 535 00:40:21,849 --> 00:40:25,369 SQL's almost the same today as it was 20 years ago. 536 00:40:25,369 --> 00:40:26,869 You know, everything else has changed. 537 00:40:26,869 --> 00:40:33,269 If you're building a web app now versus 20 years ago, 20 years ago it was, you know, Web Forms. 538 00:40:33,269 --> 00:40:34,169 Let's see, what was that, 539 00:40:34,169 --> 00:40:34,989 2005? 540 00:40:34,989 --> 00:40:35,549 Yeah. 541 00:40:35,549 --> 00:40:37,609 Web Forms and postbacks. 542 00:40:37,609 --> 00:40:43,789 And now it's all front-end JavaScript focused and model-view-controller. 543 00:40:44,009 --> 00:40:46,749 Um, so yeah. 544 00:40:47,209 --> 00:40:48,389 Oh yeah. 545 00:40:51,465 --> 00:40:53,163 Yeah, same. 546 00:40:53,163 --> 00:40:59,997 The only thing really technical that I do is write SQL, and it's quite good at that. 547 00:40:59,997 --> 00:41:15,406 But I have seen the code from these, you know, from like Replit and Cursor, and they're utilities, they're little productivity boosters, 548 00:41:15,406 --> 00:41:17,305 but if you're going to build an app 549 00:41:17,305 --> 00:41:22,454 for anything other than a proof of concept, right now it's just not there. 550 00:41:23,146 --> 00:41:28,266 It's very difficult, you know. Like, my partner would be like, yeah, this is just not working. 551 00:41:28,266 --> 00:41:29,746 Could you fix it? 552 00:41:30,006 --> 00:41:34,526 And so I go into the code and I'm like, I have no idea how this code works. 553 00:41:34,526 --> 00:41:44,786 And it lays it out in a very complex, non-human manner, which maybe is the most efficient and effective, but if a human wants to come in there and actually 554 00:41:44,786 --> 00:41:47,226 kind of dive in, it's almost impossible. 555 00:41:47,983 --> 00:41:48,937 Totally. 556 00:41:48,937 --> 00:41:55,437 Yeah, so I hate it, but I think it'll get there. 557 00:41:55,437 --> 00:42:01,377 I think it'll get there pretty quickly, unfortunately, for people like us.
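Since both speakers single out SQL drafting as the one coding task they trust AI with, here is a minimal sketch of that workflow under stated assumptions: the schema is invented, the model name is a placeholder, and the query still gets human review before it runs, exactly as the conversation recommends.

```python
# Minimal sketch of the "AI writes the SQL" workflow: give the model the schema plus a
# plain-English question and get a candidate query back. Schema and model are illustrative.
from anthropic import Anthropic

SCHEMA = """
CREATE TABLE matters (matter_id INT PRIMARY KEY, client_name TEXT, opened_on DATE);
CREATE TABLE documents (doc_id INT PRIMARY KEY,
                        matter_id INT REFERENCES matters(matter_id),
                        custodian TEXT, collected_on DATE);
"""

def draft_sql(question: str) -> str:
    """Return a candidate SQL query answering a natural-language question about the schema."""
    client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    reply = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder model name
        max_tokens=512,
        messages=[{
            "role": "user",
            "content": f"Given this schema:\n{SCHEMA}\n"
                       f"Write a single SQL query (no commentary) that answers: {question}",
        }],
    )
    return reply.content[0].text

# A person still reviews and runs the query; the point is a faster first draft,
# not unattended execution.
print(draft_sql("How many documents were collected per custodian in 2024?"))
```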
558 00:42:01,377 --> 00:42:03,457 But we'll see, I guess. 559 00:42:03,557 --> 00:42:07,797 No one's going to replace that subject matter expertise. 560 00:42:08,077 --> 00:42:08,907 It's so true. 561 00:42:08,907 --> 00:42:09,258 Yeah. 562 00:42:09,258 --> 00:42:15,300 And you know, you brought up a good point about that inability, and this is an architectural limitation. 563 00:42:15,300 --> 00:42:20,222 Like, this one's not solvable with the current architecture of LLMs. 564 00:42:20,323 --> 00:42:28,306 Um, that's pretty much the consensus, and that's around the ability to come up with novel ideas. 565 00:42:28,606 --> 00:42:28,996 Right? 566 00:42:28,996 --> 00:42:29,186 Yeah. 567 00:42:29,186 --> 00:42:29,427 Yeah. 568 00:42:29,427 --> 00:42:32,308 So it can't, but you know what it can do? 569 00:42:32,308 --> 00:42:34,929 It can reassemble 570 00:42:35,301 --> 00:42:42,005 pieces of information into novel concepts, but if those component parts 571 00:42:42,005 --> 00:42:49,549 don't exist in its vector space, it can't, like, create a new mathematical proof. 572 00:42:49,549 --> 00:42:50,530 Not possible. 573 00:42:50,530 --> 00:42:51,180 Can't do it. 574 00:42:51,180 --> 00:42:53,742 If it hasn't seen it, it can't do it, right? 575 00:42:53,742 --> 00:43:00,695 Because that usually requires a lot of new abstract thinking. 576 00:43:00,956 --> 00:43:03,137 All the easy 577 00:43:03,973 --> 00:43:06,676 theorems have been proven. 578 00:43:06,676 --> 00:43:13,001 But yeah, eventually, I don't think the current LLM architecture is the end state. 579 00:43:13,001 --> 00:43:17,425 I think this is a stepping stone to whatever's next. 580 00:43:17,930 --> 00:43:24,353 It's like thinking that dial-up was the end state of the internet, right? 581 00:43:24,954 --> 00:43:29,787 We're in a completely new generation of technology. 582 00:43:29,787 --> 00:43:31,177 It's gonna be real interesting. 583 00:43:31,177 --> 00:43:40,803 And to think, you know, I was just talking about this with my partner, we've seen so many things in our lifetimes at our age. 584 00:43:40,803 --> 00:43:44,155 You know, we didn't have cell phones, we didn't have the internet, we didn't have 585 00:43:44,155 --> 00:43:45,001 any of 586 00:43:45,001 --> 00:43:51,861 this technology, and to kind of be where we are today with this kind of new AI wave coming through, 587 00:43:51,861 --> 00:43:55,801 it's pretty amazing how much has changed. 588 00:43:56,421 --> 00:44:04,641 And people talk a lot, you know, I have daughters that are like 13 and 14, and they're like, oh, you know, AI is taking away jobs, and that's bad. 589 00:44:04,641 --> 00:44:13,383 And I'm like, yeah, but people say that about moving to more sustainable energy, too. 590 00:44:13,383 --> 00:44:15,113 You know, like, it's taking away jobs. 591 00:44:15,113 --> 00:44:17,214 And I'm like, yeah, but this is called evolution. 592 00:44:17,214 --> 00:44:18,374 This is how we evolve. 593 00:44:18,374 --> 00:44:21,424 I said, you know, out of anybody, I have a computer science degree. 594 00:44:21,424 --> 00:44:24,116 I should be the one upset about it. 595 00:44:24,116 --> 00:44:30,188 But no, you are not entitled to have the same job for your entire life. 596 00:44:30,188 --> 00:44:32,078 And why should you want that?
597 00:44:32,158 --> 00:44:42,391 You know, you should be constantly changing, learning, and adapting to what's new and upcoming. 598 00:44:42,391 --> 00:44:43,309 That's what will 599 00:44:43,309 --> 00:44:44,991 make you stay relevant. 600 00:44:44,991 --> 00:44:48,736 So I think AI is great. 601 00:44:48,736 --> 00:44:50,317 I totally embrace it. 602 00:44:50,317 --> 00:44:52,820 People should. 603 00:44:53,190 --> 00:45:01,755 It's just that capitalism rewards the efficient use of capital. 604 00:45:01,755 --> 00:45:13,292 In a capitalistic society, skill sets have to change to align with those movements. 605 00:45:13,292 --> 00:45:15,563 You know, I mean, think about spreadsheets. 606 00:45:15,563 --> 00:45:19,110 So spreadsheets didn't exist in 1970. 607 00:45:19,110 --> 00:45:31,890 They really hit their stride in the eighties, and there were 850,000 CPAs, I think this was in the US, I looked this number up one time for another 608 00:45:31,890 --> 00:45:35,650 podcast, and there are more than double that number 609 00:45:35,650 --> 00:45:39,470 now. It didn't kill the accounting profession. 610 00:45:39,590 --> 00:45:41,050 It changed it. 611 00:45:41,050 --> 00:45:41,570 Right. 612 00:45:41,570 --> 00:45:48,557 You don't have to pay your accountants to build a pro forma or, you know, 613 00:45:48,557 --> 00:45:54,793 a revenue forecast; you can do it with Excel, but they still have plenty to do, more than ever. 614 00:45:54,793 --> 00:45:58,812 Oh yeah, they're just doing it faster, better, with more insight. 615 00:45:58,812 --> 00:46:02,544 And there's going to be jobs created that we can't even fathom. 616 00:46:02,544 --> 00:46:07,564 Right now they're coming out with these browsers that are all AI-powered, right? 617 00:46:07,584 --> 00:46:19,684 So you think about that, it's like, well, if you have any kind of platform, even if Microsoft doesn't have this AI embedded within it, if you have a browser and you can query 618 00:46:19,684 --> 00:46:22,272 the data that you're looking at, 619 00:46:22,844 --> 00:46:33,086 you can Q&A, summarize, you know, stuff like that. I heard an example this morning where, I think it was Claude, I'm sorry, I can't remember what model it was, but they kind 620 00:46:33,086 --> 00:46:34,279 of were using a browser. 621 00:46:34,279 --> 00:46:44,933 And you could say, okay, you know, you're on LinkedIn, find me all the employees that used to work at, you know, XYZ company but no longer do. 622 00:46:44,974 --> 00:46:50,748 And before, you'd have to kind of go through and search it, or get one of those tools, you know, like 623 00:46:50,748 --> 00:46:56,993 ZoomInfo, that gives you all the people's information, but now it's literally built into your browser. 624 00:46:57,534 --> 00:46:59,446 And that's the future, right? 625 00:46:59,446 --> 00:47:01,818 It's like, it's too easy. 626 00:47:01,818 --> 00:47:03,449 You know, you don't have to set things up. 627 00:47:03,449 --> 00:47:09,304 You're just getting more insight into data, and we're going to see some pretty cool stuff. 628 00:47:09,477 --> 00:47:12,477 Yeah, as you were talking there, I was just looking at my... 629 00:47:12,477 --> 00:47:16,237 I was using Comet, which is Perplexity's... 630 00:47:16,237 --> 00:47:17,777 Yeah, Comet. 631 00:47:17,777 --> 00:47:18,537 And they...
632 00:47:18,537 --> 00:47:20,377 I don't have a paid plan with Perplexity. 633 00:47:20,377 --> 00:47:21,097 I already have... 634 00:47:21,097 --> 00:47:22,177 I already pay for four. 635 00:47:22,177 --> 00:47:25,556 I pay for Grok, Gemini, Claude, and ChatGPT. 636 00:47:25,556 --> 00:47:28,357 It's like, man, I'm not paying for Perplexity too. 637 00:47:28,357 --> 00:47:28,837 It's just... 638 00:47:28,837 --> 00:47:29,737 it's too much. 639 00:47:29,737 --> 00:47:30,657 I like Perplexity. 640 00:47:30,657 --> 00:47:31,957 I think it's great. 641 00:47:31,957 --> 00:47:37,677 But I just fired it up for the first time this morning, and it told me I had to have a Pro subscription. 642 00:47:37,677 --> 00:47:38,684 So, I'm gonna... 643 00:47:38,684 --> 00:47:52,575 Well, the most recent episode of Hard Fork, it's a podcast, they were given a look into Comet, and that's actually what they use. 644 00:47:52,575 --> 00:47:56,979 And I thought, wow, that's pretty eye-opening about where we're moving towards. 645 00:47:56,979 --> 00:47:59,541 And so I think that's pretty cool. 646 00:47:59,541 --> 00:48:01,653 I guess we'll see what happens, you know? 647 00:48:01,653 --> 00:48:05,436 And like we talked about, we'll just have to adapt, right? 648 00:48:05,436 --> 00:48:07,047 Yeah. 649 00:48:09,344 --> 00:48:10,205 Yeah. 650 00:48:10,205 --> 00:48:13,738 We're almost out of time, but I wanted to say one quick thing. 651 00:48:13,738 --> 00:48:20,540 So I heard some really interesting analysis as to why, you know, ChatGPT is building a browser too. 652 00:48:20,540 --> 00:48:24,819 I was like, why are these AI companies building browsers? 653 00:48:24,819 --> 00:48:30,474 And in addition to just wanting to be your kind of operating system, they need data. 654 00:48:30,692 --> 00:48:32,512 They're out of data. 655 00:48:32,512 --> 00:48:43,052 They've sucked it all in from the internet, and your browsing data gives them more training material, which, I don't know why it never occurred to me. 656 00:48:43,052 --> 00:48:51,092 It made perfect sense when I heard it, but, you know, that's why there's such a race to get the browser out. 657 00:48:51,092 --> 00:48:54,812 Perplexity has impressed me; I keep thinking they're going to die. 658 00:48:55,092 --> 00:48:59,212 Like, ChatGPT is going to kill them, and they're pretty resilient. 659 00:49:00,028 --> 00:49:14,976 They put in an offer for Chrome, Perplexity did. And Google, I think, is probably one of the best AI creators out there. 660 00:49:15,017 --> 00:49:16,317 And we use it a lot. 661 00:49:16,317 --> 00:49:23,261 I don't necessarily think Gemini is, because its user interface and features may not be as great as ChatGPT or others. 662 00:49:23,261 --> 00:49:26,323 But when you use it from an API perspective, it's awesome, 663 00:49:26,323 --> 00:49:28,379 um, 664 00:49:28,379 --> 00:49:30,170 and cost-effective and all this good stuff. 665 00:49:30,170 --> 00:49:32,090 I mean, why are they so good? 666 00:49:32,090 --> 00:49:34,191 Well, where do they get their data? 667 00:49:34,191 --> 00:49:36,132 They have Chrome, right? 668 00:49:36,132 --> 00:49:42,314 They also have YouTube; they just took all the data, just took it. 669 00:49:42,334 --> 00:49:44,235 No permission to do so. 670 00:49:44,235 --> 00:49:50,037 They have everything, all the input, all the interactions you make with anything that is a Google product.
671 00:49:50,037 --> 00:49:52,498 And that is why they're so good. 672 00:49:52,498 --> 00:49:56,039 And so, you know, why do you see Perplexity trying to buy Chrome? 673 00:49:56,039 --> 00:49:57,260 Same reason. 674 00:49:57,966 --> 00:49:58,910 Totally. 675 00:49:59,120 --> 00:50:00,922 Yeah, yeah, it's pretty interesting. 676 00:50:00,922 --> 00:50:10,260 People need to, and this is a whole other conversation, start thinking about, well, they need to be thinking about it already, you know, protecting themselves and 677 00:50:10,260 --> 00:50:13,132 their data, because there are no boundaries anymore. 678 00:50:13,132 --> 00:50:19,998 We're in a moon race for AI, these AI companies, and they just don't give a crap. 679 00:50:21,180 --> 00:50:24,343 They'd rather be litigated against because it's worth it. 680 00:50:24,343 --> 00:50:25,543 It's worth it. 681 00:50:25,572 --> 00:50:28,935 Even the ones who claim that they do, like Anthropic. 682 00:50:28,935 --> 00:50:40,924 I mean, and I think they're the best of the bunch, but you know, they were still using pirated books, and they just lost a major lawsuit. 683 00:50:40,924 --> 00:50:44,702 I think it's under appeal, but, well, this 684 00:50:44,702 --> 00:50:46,107 just... oh, sorry. 685 00:50:46,107 --> 00:50:46,578 Oh, 686 00:50:46,578 --> 00:50:47,029 that's okay. 687 00:50:47,029 --> 00:50:51,176 I was just going to say, I know I've kept you longer than I agreed to. 688 00:50:51,176 --> 00:50:52,648 So I apologize for that. 689 00:50:52,648 --> 00:50:58,648 But before we wrap up, how do people find out more about you or your company? 690 00:50:58,648 --> 00:51:01,672 Do a little self-promotion real quick. 691 00:51:01,704 --> 00:51:04,375 Yeah, Sarah Thompson, you can find me on LinkedIn. 692 00:51:04,375 --> 00:51:08,655 Also, our company is bluestarcs.com. 693 00:51:08,655 --> 00:51:18,875 Find out more about what we do for lit support, or you can go to Siemly, like S-I-E-M, like a SIEM, L-Y, dot com, to find out about our Microsoft investigations platform. 694 00:51:19,022 --> 00:51:19,592 Good stuff. 695 00:51:19,592 --> 00:51:27,415 Well, I appreciate you spending a few minutes with me this morning, and I'm sure we'll bump into each other sometime soon. 696 00:51:28,316 --> 00:51:29,976 All right, take care. 697 00:51:30,236 --> 00:51:31,197 Bye-bye. 698 00:51:31,277 --> 00:51:32,097 Bye.
