Neill Pemberton

In this episode, Ted sits down with Neill Pemberton, Associate Partner at IBM Consulting, for a thought-provoking exploration of how AI is reshaping the legal industry. From leveraging smaller, greener models to overcoming cultural resistance within law firms, Neill shares his expertise in navigating the dynamic landscape of legal technology. Whether you’re curious about the shift from traditional AI to generative AI or looking for strategies to maximize ROI on AI adoption, this conversation offers valuable insights for legal professionals at every level.

In this episode, Neill shares insights on how to:

  • Integrate AI into legal workflows effectively
  • Balance innovation with cost efficiency in law firms
  • Navigate the shift from traditional AI to generative AI in legal practice
  • Use small AI models to address privacy and energy concerns
  • Overcome resistance to change within law firms

Key takeaways:

  • IBM’s Granite AI models use only 8 billion parameters, proving that smaller, efficiently trained models can achieve high performance while reducing costs and energy consumption, making them ideal for enterprise applications.
  • Law firms can maximize the value of generative AI by integrating it with their existing labeled data, enabling more accurate and cost-effective workflows for tasks like clause identification and document review.
  • Incrementally introducing AI through low-risk, back-office functions like internal policy management or HR tasks allows firms to build confidence in the technology while avoiding the risks associated with client-facing errors.
  • Overcoming lawyers’ deeply ingrained “lone wolf” mindset requires strategic leadership and innovation teams to create a culture that prioritizes collaboration and long-term investment in transformative technology.

About the guest, Neill Pemberton:

Neill Pemberton is a former solicitor in England and an expert in the use of Generative AI in professional services. After 10 years at the global law firm Dentons, Neill joined Orbital Witness, where he became Head of Legal Innovation and grew the Legal Engineering team from one qualified lawyer to seven in just two years. Neill is now an Associate Partner with IBM Consulting.

“We use our own models, but we use others too. We use Llama and all sorts in our day-to-day work, and we find we can get good results using small models. So I think it’s about how you use it, rather than what it is that you use.” – Neill Pemberton

Machine Generated Episode Transcript

1 00:00:02,579 --> 00:00:08,178 Neil Pemberton, how are you this morning or I guess afternoon in your side of the world? 2 00:00:08,178 --> 00:00:08,658 the world. 3 00:00:08,658 --> 00:00:19,004 Yeah I'm doing well Ted thanks it is the afternoon we're coming up to 4 30 over in the UK it's dark and like I say I'm hoping that you will be able to bring me some sunshine this 4 00:00:19,004 --> 00:00:22,698 afternoon with a with an interesting interesting conversation. 5 00:00:22,911 --> 00:00:24,412 Well, I'll do my best. 6 00:00:24,412 --> 00:00:25,823 I'll do my, but no guarantees. 7 00:00:25,823 --> 00:00:28,626 Um, well, good stuff. 8 00:00:28,626 --> 00:00:38,874 I took a look at your background and you know, we got, I think we had some, some conversation, uh, previously on LinkedIn and then, had a, had a chat that I thought was 9 00:00:38,874 --> 00:00:43,998 very insightful and I thought your background was super industry, interesting. 10 00:00:43,998 --> 00:00:52,917 You had spent some time, uh, in legal at Denton's, you were spent some time on the vendor side and now you're an associate partner at IBM. 11 00:00:52,917 --> 00:00:53,610 Mm-hmm. 12 00:00:53,610 --> 00:00:58,448 um, I was unaware of IBM's offering that aligns with legal. 13 00:00:58,448 --> 00:01:00,723 So I thought it'd be a great conversation. 14 00:01:00,723 --> 00:01:04,282 Why don't you tell us a little bit about who you are, what you do and where you do it. 15 00:01:04,282 --> 00:01:04,772 do it? 16 00:01:04,772 --> 00:01:05,502 Yeah, great. 17 00:01:05,502 --> 00:01:05,962 Will do. 18 00:01:05,962 --> 00:01:07,222 Thank you. 19 00:01:07,222 --> 00:01:11,782 Well, I started my legal career back in about 2005. 20 00:01:11,782 --> 00:01:22,282 Worked my way up from from the bottom, so to speak is as a paralegal 18 months training contract, which is what we do in the UK or England at least two years almost of that. 21 00:01:22,282 --> 00:01:25,770 I got six months off for time to count, which was good. 22 00:01:25,934 --> 00:01:31,796 Three years later, was at a regional firm in Bristol in the southwest of England where I live. 23 00:01:32,037 --> 00:01:44,282 And I was looking for bit of a new challenge really, having spent six years at my first firm, an opportunity at Dentons, as you say, I came up, joined them in about 2011 and 24 00:01:44,282 --> 00:01:46,763 worked there for 10 years very happily. 25 00:01:47,283 --> 00:01:54,566 But after 16 years or so of doing what was essentially the same thing, commercial real estate work, which I enjoyed for a long time, I... 26 00:01:54,606 --> 00:01:57,128 started to get a bit itchy and looking around for other alternatives. 27 00:01:57,128 --> 00:02:01,550 And once I started looking around, a whole world of opportunity opened up to me. 28 00:02:01,550 --> 00:02:12,377 So a mentor of mine that I'd ended up working with at Denton's and really went out on a limb for me that got me working in the technology media telecom space. 29 00:02:12,497 --> 00:02:16,259 I found the technology work just to be much, much more interesting. 30 00:02:16,680 --> 00:02:20,246 And yeah, once I started looking around, opportunities just... 31 00:02:20,246 --> 00:02:25,679 that I never thought were there, you know, came up, they're no in-house opportunities for commercial real estate lawyers over here. 
32 00:02:25,679 --> 00:02:36,464 So what happened was the company that I joined after Denton's, they were a startup, but just a two or three years before, and they had such a compelling proposition that when 33 00:02:36,464 --> 00:02:42,259 they'd raised money that meant they could afford me, I was jumping at an opportunity to go to go and work for them. 34 00:02:42,259 --> 00:02:48,662 A real first look at AI, pre-generative AI actually. 35 00:02:49,054 --> 00:02:55,516 to try and help them automate some real estate reporting, which is what my domain expertise was. 36 00:02:55,977 --> 00:02:56,747 I joined there. 37 00:02:56,747 --> 00:03:02,239 They'd already had a couple of what we called legal engineers working there, some very skilled people. 38 00:03:02,239 --> 00:03:04,220 But at one point it was just me. 39 00:03:04,540 --> 00:03:06,941 I grew that team up to about eight, nine people. 40 00:03:06,941 --> 00:03:11,143 And then generative AI came along, got really, really interesting. 41 00:03:11,143 --> 00:03:14,084 And eventually IBM just came knocking. 42 00:03:14,084 --> 00:03:18,195 And that to me was just too good of an opportunity not to explore it. 43 00:03:18,530 --> 00:03:24,634 You know, I joined the startup with a view to expanding my horizons and the horizons don't get much bigger than IBM. 44 00:03:24,634 --> 00:03:30,598 So when they came knocking, they were looking for someone who knew legal, someone who who'd had their hands on the tech. 45 00:03:30,598 --> 00:03:34,521 Um, and I was, I guess at the intersection of their Venn diagram. 46 00:03:34,521 --> 00:03:36,352 So here I am. 47 00:03:37,159 --> 00:03:37,749 Interesting. 48 00:03:37,749 --> 00:03:48,250 So yeah, I, I think I mentioned this, uh, in the intro, like I didn't realize that IBM had an offering aligned with the legal vertical. 49 00:03:48,250 --> 00:03:51,253 I, you know, I hear about the E Y's of the world. 50 00:03:51,253 --> 00:03:55,034 Um, and, other, you hear a lot of ALSPs. 51 00:03:55,815 --> 00:04:03,638 but I didn't know how big is the group that you work in and are you guys exclusively legal or is it broader than that? 52 00:04:03,638 --> 00:04:08,738 So yeah, I sit within IBM consulting, which globally is just huge. 53 00:04:08,738 --> 00:04:11,858 I didn't know much about IBM consulting before I joined. 54 00:04:11,858 --> 00:04:17,278 me, I did grow up in the States, as may be obvious from the whiteboard and the Denver Broncos helmet in the background. 55 00:04:17,278 --> 00:04:19,918 Sorry to anyone who's Kansas City fan. 56 00:04:21,678 --> 00:04:25,248 So IBM tech in the 80s was just huge. 57 00:04:25,248 --> 00:04:27,398 So I was well aware of that part of the business. 58 00:04:27,398 --> 00:04:30,512 I wasn't so aware of the consulting side. 59 00:04:30,890 --> 00:04:34,453 We have in-house lawyers and that's not really what my domain is. 60 00:04:34,453 --> 00:04:42,761 My domain is to work with professional services firms in general, which includes obviously legal, and just try and help their business. 61 00:04:42,761 --> 00:04:46,374 I IBM has been improving businesses for 100 plus years, right? 62 00:04:46,374 --> 00:04:53,920 So part of my job is taking the best of breed that we've got in-house in terms of technology. 63 00:04:54,112 --> 00:04:58,844 but we do partner with lots of other people, Microsoft, Adobe, Oracle, Salesforce, you name it. 64 00:04:58,844 --> 00:05:00,585 We will partner with other vendors. 
65 00:05:00,585 --> 00:05:02,386 We'll do what's best for the client. 66 00:05:02,386 --> 00:05:11,551 So we offer a traditional consulting, I suppose, with the untraditional, if that's a real word, aspect that we have this big technology offering behind us. 67 00:05:11,691 --> 00:05:18,344 And my job is to go out and look at ways that we can improve not just the practice of law, but the business of law as well. 68 00:05:18,717 --> 00:05:19,637 Interesting. 69 00:05:19,637 --> 00:05:26,220 Yeah, it's been almost exactly two years since ChatGPT made its debut. 70 00:05:26,220 --> 00:05:38,765 I think that really changed everyone's perspective way beyond the legal industry, but within the legal industry itself, the status quo is very sticky in legal. 71 00:05:39,246 --> 00:05:43,908 Lawyers tend to embrace status quo, not always 72 00:05:45,032 --> 00:05:46,034 Mm-hmm 73 00:05:46,709 --> 00:05:49,709 the most open to change. 74 00:05:50,589 --> 00:06:01,689 and you know, I think the, that really rattled some cages at senior levels, you know, at the executive committee levels in law firms, like, wow. 75 00:06:01,689 --> 00:06:10,349 And you know, we saw things like the Goldman report that came out that 44 % of legal tasks could be automated by AI, which I've said multiple times. 76 00:06:10,349 --> 00:06:15,749 I think that's a gross overestimate, maybe one day, but we are a long way from that one day. 77 00:06:15,749 --> 00:06:16,877 Um, 78 00:06:17,189 --> 00:06:21,829 And you the, when you saw a, you saw a trajectory. 79 00:06:21,829 --> 00:06:34,752 in, when in November of 2022, when three five was released and scored 60 some odd percentile on the bar and then four was released, I think six or eight months later and it 80 00:06:34,752 --> 00:06:38,986 scored over the initial indications was it scored over 90 on the bar. 81 00:06:38,986 --> 00:06:43,879 People really took notice like, you know, that's a very steep innovation curve. 82 00:06:44,026 --> 00:06:53,879 Things have flattened out since then, the smidge and there's a lot of talk in AI circles about scaling laws and whether more... 83 00:06:53,879 --> 00:07:01,945 is going to continue to produce the incremental improvements that we have seen previously. 84 00:07:01,945 --> 00:07:05,164 Um, I think that they're again, this is Ted's opinion here. 85 00:07:05,164 --> 00:07:07,235 I'm not an expert, but I I'm an enthusiast. 86 00:07:07,235 --> 00:07:08,485 I follow the space closely. 87 00:07:08,485 --> 00:07:12,309 You know, once you get over about a trillion parameters and I think the latest 88 00:07:12,309 --> 00:07:13,280 GPT models. 89 00:07:13,280 --> 00:07:13,820 not sure. 90 00:07:13,820 --> 00:07:16,512 think maybe four is about 1.8 trillion. 91 00:07:16,512 --> 00:07:19,004 I might have that number wrong, but it's somewhere in that vicinity. 92 00:07:19,004 --> 00:07:30,653 You know, once you get over a trillion parameters, I things start to level out a smidge and I don't know if throwing more parameters and more data at these models is going to 93 00:07:30,653 --> 00:07:34,996 ultimately get us back on that steep innovation curve. 94 00:07:34,996 --> 00:07:36,297 There's a lot of debate about it. 95 00:07:36,297 --> 00:07:39,349 I mean, it's, if you listen to 96 00:07:47,922 --> 00:07:52,542 noticed anecdotally a little bit of a flattening. 97 00:07:52,542 --> 00:07:58,462 So I don't know, do you have any sense of kind of the trajectory we're on versus where we started? 
98 00:08:00,575 --> 00:08:05,029 Yes, although not in the sense of how many trillions of parameters we might have. 99 00:08:05,029 --> 00:08:10,824 And in fact, to be a little contrarian, we do really well with way less. 100 00:08:10,824 --> 00:08:17,809 If you look at the IBM series of models, there's a series that we call granite, which is our in-house. 101 00:08:18,306 --> 00:08:19,916 variety and it's open source. 102 00:08:19,916 --> 00:08:22,127 So people are welcome to go and look at it and try it. 103 00:08:22,127 --> 00:08:22,827 Right. 104 00:08:22,827 --> 00:08:34,310 Um, we've just released granite 3.0 and it's got 8 billion and it, and and you look at the sums and say how on earth would eight, 8 billion compete with 1.8 trillion or whatever the 105 00:08:34,310 --> 00:08:38,602 number is being some significant, I'm not even gonna try and do the arithmetic on it. 106 00:08:38,602 --> 00:08:41,812 Cause I'm not that good in my head, but way less. 107 00:08:41,992 --> 00:08:48,044 I think the difference is, and can be that, like you say, you know, maybe, maybe we just don't need that many. 108 00:08:48,366 --> 00:08:49,046 parameters. 109 00:08:49,046 --> 00:08:52,816 mean, 1.2 trillion, 1.8 trillion, I can't even fathom what that looks like. 110 00:08:52,816 --> 00:08:55,266 I can't even fathom what 8 billion looks like. 111 00:08:55,266 --> 00:09:06,346 So, you know, we can get a lot out of small models, using them intelligently, training them on good data rather than just all data. 112 00:09:06,346 --> 00:09:09,036 And I think that's probably one of our key differentiators. 113 00:09:09,036 --> 00:09:10,306 And it's not the only one. 114 00:09:10,306 --> 00:09:16,328 But what we are quite keen on looking at is, what can we achieve the most with 115 00:09:16,328 --> 00:09:18,399 using the least, if I can put it that way. 116 00:09:18,399 --> 00:09:19,799 We use a small model. 117 00:09:19,799 --> 00:09:22,130 It's faster, it's cheaper, it's greener. 118 00:09:22,130 --> 00:09:24,720 And a lot of advantages to that. 119 00:09:24,720 --> 00:09:33,013 And to your point earlier about how quickly we scale and get to this 44 % number that we heard about, that's a while off. 120 00:09:33,013 --> 00:09:43,306 And to me, that takes an awful lot of time and effort and energy on the part of whoever's using the models, setting them up with the right workflows, doing the right prompting, et 121 00:09:43,306 --> 00:09:44,606 cetera, et cetera. 122 00:09:45,262 --> 00:09:54,882 I don't really have an idea on that, although I will say what's interesting from my perspective, trajectory wise, is how quickly some of the open source models have been 123 00:09:54,882 --> 00:09:57,132 catching up with some of the proprietary models. 124 00:09:57,132 --> 00:09:59,142 I find that quite interesting. 125 00:09:59,142 --> 00:10:02,122 And they say we use our own models, but we use others too, right? 126 00:10:02,122 --> 00:10:05,582 We use Llama and all sorts in our day-to-day work. 127 00:10:05,582 --> 00:10:09,022 And we find we can get good results using small models. 128 00:10:09,022 --> 00:10:13,237 So I think it's about how you use it rather than what it is that you use. 129 00:10:13,237 --> 00:10:14,377 Yeah, there are some better. 130 00:10:14,377 --> 00:10:15,787 There are some interesting. 131 00:10:15,787 --> 00:10:24,357 I've heard some interesting use cases for small models, specifically the ones that are downloadable and able to run locally on a laptop. 
132 00:10:24,357 --> 00:10:27,037 One is from a privacy perspective. 133 00:10:27,157 --> 00:10:35,077 So I spoke to someone, I can't remember if it was on a podcast or outside of that. 134 00:10:35,077 --> 00:10:42,639 Somebody was telling me about they created a process for a patent. 135 00:10:42,823 --> 00:10:47,727 attorney where he would download a small model. 136 00:10:47,727 --> 00:10:59,666 forget, it may have been llama and they built some sort of interface on top that would allow him to automate, um, these patent applications, which I don't know, he did three or 137 00:10:59,666 --> 00:11:09,470 four a week and then he would delete the models and download a fresh every time he needed to do this and which sounds like a big deal, but it's not. 138 00:11:09,470 --> 00:11:11,730 not with some of the smaller models. 139 00:11:11,730 --> 00:11:20,470 So that's one way to achieve privacy until we get to a place where these larger models have these enterprise-grade security controls in place. 140 00:11:20,470 --> 00:11:24,850 I know OpenAI does, as do some others. 141 00:11:25,470 --> 00:11:27,830 But I mean, I think there are interesting news cases. 142 00:11:27,830 --> 00:11:29,310 I did a quick search. 143 00:11:29,310 --> 00:11:31,253 Yeah, the Lama 3.1 has eight 144 00:11:31,253 --> 00:11:34,444 um, 70 billion. 145 00:11:34,444 --> 00:11:37,155 have a 405 billion parameter version. 146 00:11:37,155 --> 00:11:38,631 So yeah, I mean, I think 147 00:11:38,631 --> 00:11:41,103 Obviously OpenAI is the monster. 148 00:11:41,401 --> 00:11:47,079 I would imagine Anthropic is in a similar ballpark. 149 00:11:47,079 --> 00:11:50,973 But yeah, there are interesting applications to some of the smaller models. 150 00:11:51,032 --> 00:11:58,468 Well, yeah, I mean, one of the ways that we've done it, we as a company know what's gone into our models because we made them, right? 151 00:11:58,468 --> 00:11:59,489 We trained them. 152 00:11:59,489 --> 00:12:01,731 It's all enterprise data. 153 00:12:01,751 --> 00:12:09,137 It's not what you get on the classic phrase of Reddit blogs or whatever it is that is the criticism of the larger models. 154 00:12:09,138 --> 00:12:13,381 So in theory, there's no garbage in any of the models that we use. 155 00:12:13,381 --> 00:12:16,284 And we can tell people and underwrite what's in there. 156 00:12:16,284 --> 00:12:16,934 And we do. 157 00:12:16,934 --> 00:12:19,298 And we publish the information on all of that. 158 00:12:19,298 --> 00:12:24,711 you know, how we've gone about it's available and you can find it on IBM's website. 159 00:12:24,711 --> 00:12:28,263 So being open about it is good. 160 00:12:28,263 --> 00:12:35,848 I think what's that phrase, know, garbage in, garbage out, rubbish in, rubbish out, whatever the right terminology is. 161 00:12:35,848 --> 00:12:37,038 That's sort of our view. 162 00:12:37,038 --> 00:12:45,773 Well, that's the view that I'm hearing anyway, at least is that we train it on good stuff and we get better results than you might otherwise think with a small model. 163 00:12:45,794 --> 00:12:46,560 And, know, 164 00:12:46,560 --> 00:12:47,642 I go back to it. 165 00:12:47,642 --> 00:12:56,505 I wouldn't underestimate the importance of low cost, low energy consumption and low carbon emission for the people who need to report that sort of thing. 166 00:12:56,505 --> 00:12:59,148 And I think everybody's interested in low cost. 167 00:12:59,649 --> 00:13:02,207 that's certainly my experience, especially with lawyers. 
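To make the local, small-model workflow described above concrete, here is a minimal sketch of querying a model that runs entirely on your own machine, so documents never leave the laptop. It assumes a local runtime such as Ollama and a small Llama checkpoint have already been installed; the endpoint, model name, and prompt are illustrative rather than the setup discussed in the episode.

```python
# Minimal sketch: querying a small model running entirely on the local machine,
# so the document text never leaves the laptop. Assumes a local runtime such as
# Ollama is installed and a small model (e.g. "llama3.1:8b") has been pulled;
# the endpoint, model name, and prompt below are illustrative only.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def summarise_locally(document_text: str) -> str:
    """Send a privacy-sensitive document to a locally hosted small model."""
    payload = {
        "model": "llama3.1:8b",  # small, locally downloadable model
        "prompt": (
            "Summarise the key claims in the following patent draft in plain "
            "English, as bullet points:\n\n" + document_text
        ),
        "stream": False,  # return one complete response
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    with open("draft_application.txt", "r", encoding="utf-8") as fh:
        print(summarise_locally(fh.read()))
```

Because both the model and the document stay local, deleting the model afterwards, as in the patent example above, is just a matter of removing the downloaded weights.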
168 00:13:02,207 --> 00:13:03,278 Yeah, absolutely. 169 00:13:03,278 --> 00:13:03,428 Yeah. 170 00:13:03,428 --> 00:13:15,955 I've, I've even noticed as a consumer, uh, as a, someone who leverages the consumer paid versions of some of the big models I use, you know, the ChatGPT Pro and, the, Claude paid 171 00:13:15,955 --> 00:13:16,865 subscription. 172 00:13:16,865 --> 00:13:25,270 Every time I go into Claude now it's by default in concise mode because of high usage and, you know, they're trying to manage token consumption. 173 00:13:25,270 --> 00:13:30,367 I've also heard, this is unconfirmed, but there's been a big degradation, a 174 00:13:30,367 --> 00:13:40,530 dip in performance in Copilot, and there's a lot of suspicion and scuttlebutt that it's token throttling to manage resource consumption. 175 00:13:40,530 --> 00:13:53,363 yeah, I mean, every day I know myself as a consumer of these tools, every day I expand the scope and more and more usage every single day as I learn new ways to leverage the 176 00:13:53,363 --> 00:13:54,624 technology. 177 00:13:54,984 --> 00:13:59,571 That's not unique to me, you know, everybody's doing that and 178 00:13:59,571 --> 00:14:01,703 it's just going to create more and more demand. 179 00:14:01,703 --> 00:14:15,106 I think, you know, to your point, um, you know, finding ways to mitigate that situation is going to be a desirable outcome for both the providers and the consumers of the 180 00:14:15,106 --> 00:14:16,120 technology. 181 00:14:16,120 --> 00:14:27,735 Yeah, I think so, and I don't know what the motives are behind, you know, 4o mini and Microsoft's Phi models, you know, the smaller models that they're bringing out, there'll be a 182 00:14:27,735 --> 00:14:31,967 reason and it may be purely monetary, it may well be energy consumption. 183 00:14:31,967 --> 00:14:32,688 I don't know. 184 00:14:32,688 --> 00:14:37,730 But if you get the likes of the big software companies talking about 185 00:14:37,730 --> 00:14:46,695 building nuclear power stations to generate the energy that's going to go into building their, or rather powering their models, 186 00:14:46,695 --> 00:14:48,396 I think that says something. 187 00:14:48,756 --> 00:14:56,170 Just to put it into context, we've been in need of a new nuclear power station near where I am for a long time and it takes our government an awfully long time to do it. 188 00:14:56,170 --> 00:15:02,764 So I think Google and Microsoft are going to build them a lot faster than our government's going to do it. 189 00:15:02,764 --> 00:15:06,882 So you do it because you need to, and I suspect that 190 00:15:06,882 --> 00:15:13,238 doing that is not cheap, and using smaller models and getting similarly good results out is the way to go. 191 00:15:13,459 --> 00:15:14,190 Yeah. 192 00:15:14,190 --> 00:15:18,683 Yeah, I know there needs to be some innovation in the nuclear world as well. 193 00:15:18,683 --> 00:15:28,851 I mean, having a 15-year timeline and, you know, tens of billions of dollars is not, that's not a scalable approach and there has not been a tremendous amount. 194 00:15:28,851 --> 00:15:37,779 I'm not an expert in that field, but I've read up on it recently as a result of all this and found it really interesting and they are starting to come up with some new ways to 195 00:15:37,779 --> 00:15:40,771 approach this problem.
196 00:15:40,771 --> 00:15:43,032 And I think, 197 00:15:43,807 --> 00:15:56,082 And now there's a real motivation just because of the massive power consumption that, I don't know if you saw, did you see, Elon stood up a data center? 198 00:15:56,082 --> 00:16:10,888 I forget how many hundreds of thousands of Nvidia GPUs, but he did it in something like 130 days and it, it's a massive, multi-football-field-size data center that he, he stood up 199 00:16:10,888 --> 00:16:12,979 in like under four months, 200 00:16:13,350 --> 00:16:16,458 or maybe just over four months, but it was, uh, it's incredible. 201 00:16:16,458 --> 00:16:20,519 And where's all the power going to come from in those scenarios? 202 00:16:20,586 --> 00:16:24,531 I don't know, but I think he's probably the kind of guy that will figure that out very quickly. 203 00:16:24,531 --> 00:16:30,357 And, you know, he's, he's, he's also figuring out the energy distribution system for cars. 204 00:16:30,357 --> 00:16:33,151 So literally more power to the guy, right? 205 00:16:33,151 --> 00:16:35,443 He's, he's, he's got all the knowledge. 206 00:16:35,744 --> 00:16:37,205 No, I had not heard that. 207 00:16:37,205 --> 00:16:41,469 But that doesn't surprise me with someone like him. 208 00:16:41,758 --> 00:16:48,530 There's a real cool, for those that are curious, there's a real cool YouTube video out there that walks you through the end product. 209 00:16:48,530 --> 00:16:52,502 And it's just like three months, four months, whatever the number was. 210 00:16:52,502 --> 00:16:54,183 It's, it's outstanding. 211 00:16:54,183 --> 00:17:04,442 Well, getting back to AI and legal, um, you've been around AI and legal pre-LLM and you've kind of seen the transition. 212 00:17:04,442 --> 00:17:09,709 I would imagine, you know, there were, there was some machine learning application, 213 00:17:10,037 --> 00:17:22,345 um, applications that you were involved in, and neural networks. Like, what was the transition from those legacy AI models to LLMs? People think it's almost like an association that we 214 00:17:22,345 --> 00:17:28,909 have now, AI, people think LLMs, but AI has existed in legal for much longer than two years. 215 00:17:28,909 --> 00:17:29,839 Correct. 216 00:17:30,126 --> 00:17:30,966 Absolutely. 217 00:17:30,966 --> 00:17:31,366 Yeah. 218 00:17:31,366 --> 00:17:31,566 Yeah. 219 00:17:31,566 --> 00:17:36,146 Well, when I joined the startup company, that was Orbital Witness. 220 00:17:36,706 --> 00:17:43,366 Yeah, there was no generative AI, or if there was, we weren't using it and it wasn't sort of, it wasn't really available to us. 221 00:17:43,366 --> 00:17:51,166 That was, that was labeled data and supervised learning, sort of the old, old, old school way of doing things. 222 00:17:51,246 --> 00:17:58,758 For me still very fascinating, very interesting, and I felt cutting edge, you know, because a lot of people just weren't doing it. 223 00:17:58,758 --> 00:18:10,161 We've all been talking about AI automation for a long time, but use of it, actual use day to day inside of firms, at least in my network, not happening that much. 224 00:18:10,161 --> 00:18:12,762 And it wasn't happening that much, a bit more nowadays. 225 00:18:12,762 --> 00:18:24,595 So yeah, I started out my AI journey labeling documents with a team of other people, and it sounds easy, but it wasn't, you know, it's not just highlight and select, select your category.
226 00:18:24,595 --> 00:18:26,934 It's, you know, how am I going to think about 227 00:18:26,934 --> 00:18:37,147 cutting up this document in a way that means when I train the model, the model really knows what this paragraph is relating to, because some paragraphs relate to more than one 228 00:18:37,147 --> 00:18:37,957 thing. 229 00:18:38,478 --> 00:18:47,600 So there were whole taxonomies involved there, and it required quite a deep understanding of what you were doing to be able to use it, at least in my experience. 230 00:18:48,221 --> 00:18:53,042 And it was laborious, because to get a decent result, you needed, let's say, 231 00:18:53,302 --> 00:18:57,924 I don't know, a thousand things to do it really well, with enough variety in there. 232 00:18:57,924 --> 00:19:00,045 So access to that stuff is hard. 233 00:19:00,045 --> 00:19:05,987 Getting a thousand things labeled as a minimum, I would say, is hard, time consuming and expensive. 234 00:19:05,987 --> 00:19:14,671 So when generative AI came along and, you know, you could just effectively give a model a few keywords and, you know, some, what do they call it? 235 00:19:14,671 --> 00:19:19,733 Semantics, that it can just go and figure out what, what, what clause you're looking for. 236 00:19:19,873 --> 00:19:21,664 It just changed the landscape entirely. 237 00:19:21,664 --> 00:19:22,318 So. 238 00:19:22,318 --> 00:19:27,618 For me, I felt it was a bit of a shame to throw away some of the work that we'd done. 239 00:19:27,618 --> 00:19:33,758 And I think a lot of firms probably could still leverage what they have done in the past. 240 00:19:33,958 --> 00:19:44,898 Some firms I know were doing it for a long time and have got a big, big backlog of labeled data that they can, and in my view should, use as long as they can do it in a cost-effective 241 00:19:44,898 --> 00:19:46,628 way, because it's great for retrieval. 242 00:19:46,628 --> 00:19:49,538 You can get really high levels of accuracy with it. 243 00:19:50,530 --> 00:19:54,493 But I suppose generative AI created a bit more of a level playing field. 244 00:19:54,493 --> 00:20:03,839 And I don't know whether, I think we may have talked about it briefly before, but it was new at the time: Addleshaw Goddard had done a report whereby they'd given their associates, 245 00:20:03,939 --> 00:20:16,007 or selected people within the firm, effectively a prompt library that they could go and, you know, do retrieval jobs with for corporate support kind of work, where they would go out and 246 00:20:16,007 --> 00:20:20,290 find the nominated clauses that they decided to go and try and find. 247 00:20:20,780 --> 00:20:23,651 This is like super-powered Ctrl+F, right? 248 00:20:23,651 --> 00:20:27,972 They can go out and find all the clauses that they want, 249 00:20:28,012 --> 00:20:29,436 really just writing a few rules. 250 00:20:29,436 --> 00:20:32,013 I don't want to diminish the work they've done because it's incredible. 251 00:20:32,013 --> 00:20:36,594 And if people haven't read the report, it's worth a read. 252 00:20:36,875 --> 00:20:43,196 You now can catch up, I think, with a lot of these people who have been doing labeled data over the years. 253 00:20:43,196 --> 00:20:49,268 So don't throw it away, but maybe focus your efforts on things like that. 254 00:20:49,737 --> 00:20:51,298 Yeah, I've got the report. 255 00:20:51,298 --> 00:20:52,558 I think you shared it with me. 256 00:20:52,558 --> 00:20:53,849 It is interesting.
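For readers who have not worked with the pre-generative, labeled-data approach Neill describes, here is a minimal sketch of a supervised clause classifier trained on paragraphs a firm has already tagged by clause type. It uses scikit-learn purely for illustration, and the toy clauses and labels are invented; a real system would need the thousand-plus varied examples Neill mentions.

```python
# Minimal sketch of the labelled-data approach described above: train a simple
# clause classifier on paragraphs a firm has already tagged by clause type.
# scikit-learn is used purely for illustration; the toy examples are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# In practice this would be thousands of labelled paragraphs, not four.
paragraphs = [
    "This Agreement shall be governed by the laws of England and Wales.",
    "Either party may terminate this Agreement on 30 days' written notice.",
    "The Tenant shall pay the Rent quarterly in advance.",
    "Any dispute shall be referred to arbitration in London.",
]
labels = ["governing_law", "termination", "rent", "dispute_resolution"]

clause_classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                                  LogisticRegression(max_iter=1000))
clause_classifier.fit(paragraphs, labels)

# Once trained, new paragraphs can be routed to the right clause type -- the
# "super-powered Ctrl+F" retrieval that generative prompting now approximates
# without the labelling effort.
new_paragraph = "This lease is governed by the law of England."
print(clause_classifier.predict([new_paragraph])[0])
```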
257 00:20:53,849 --> 00:21:00,353 It's 50 pages and I have not, I've just kind of skimmed, but it is very interesting. 258 00:21:00,353 --> 00:21:08,847 you know, one thing that you'd mentioned earlier, you talked about business versus practice of law use cases. 259 00:21:08,847 --> 00:21:14,230 And, you know, I have a pretty strong opinion on that as well. 260 00:21:14,230 --> 00:21:19,603 I really feel like law firms should be focused on an incremental 261 00:21:19,879 --> 00:21:25,114 strategy or an incremental implementation to an AI strategy. 262 00:21:25,114 --> 00:21:35,363 And I do feel like the cost benefit ratio or the risk reward, however you want to frame it up, on the business of law side, works out a little better at the moment. 263 00:21:35,363 --> 00:21:40,197 And on the risk side, within the practice of law world, you've got a number of issues. 264 00:21:40,197 --> 00:21:41,669 You've got privacy. 265 00:21:41,669 --> 00:21:45,912 You've got client restrictions on generative AI use. 266 00:21:46,789 --> 00:21:58,953 And I think probably the biggest risk that doesn't get talked about enough is lawyers have a very low tolerance for missteps and wasting their time and rolling something out before 267 00:21:58,953 --> 00:22:06,755 it really is battle tested and has a clear ROI and can let allow them to leverage time. 268 00:22:06,755 --> 00:22:08,475 I think is a big mistake. 269 00:22:08,675 --> 00:22:15,037 And, um, I've seen, I'm seeing it happen now, like with copilot, Microsoft copilot, for example, 270 00:22:15,177 --> 00:22:16,838 I'm not a fan at the moment. 271 00:22:16,838 --> 00:22:18,548 know that Microsoft will get it right. 272 00:22:18,548 --> 00:22:21,199 think right now it needs a lot of work. 273 00:22:21,199 --> 00:22:30,001 It's I mean just you know, really bizarre challenges or I guess limitations with with copilot. 274 00:22:30,001 --> 00:22:32,302 So copilot has no no memory. 275 00:22:32,382 --> 00:22:42,895 So you know, even though it has vast access to vast troves of your writing when you when you draft in copilot or word, it doesn't leverage any of that. 276 00:22:43,278 --> 00:22:46,553 you basically have to upload a style document every 277 00:22:46,553 --> 00:22:53,528 when you're drafting and all of your, know, it has a very basic rag implementation where you can leverage three documents. 278 00:22:53,528 --> 00:22:55,470 They all have to be in one drive. 279 00:22:55,470 --> 00:23:01,804 And when you upload them into one drive, sometimes it takes up to 24 hours for them to show up for you to access. 280 00:23:01,804 --> 00:23:06,497 You basically throw a backslash in there, or maybe it's a forward slash to leverage the document. 281 00:23:06,497 --> 00:23:07,948 It's just not an efficient model. 282 00:23:07,948 --> 00:23:12,693 know Microsoft's going to get it right, but this is in my opinion, a beta beta product. 283 00:23:12,693 --> 00:23:15,293 and they're charging $30 a month for it. 284 00:23:15,293 --> 00:23:20,273 And all the marketing is selling firms and they're, I'm seeing it. 285 00:23:20,273 --> 00:23:21,103 They're pushing it out. 286 00:23:21,103 --> 00:23:25,053 In fact, it might, I don't know if it's, I can't remember the name of the firm. 287 00:23:25,053 --> 00:23:25,853 There are a couple. 288 00:23:25,853 --> 00:23:28,233 Clifford chance is one I know for sure. 289 00:23:28,233 --> 00:23:30,873 They, they released a case study. 290 00:23:30,873 --> 00:23:34,273 I have a lot of questions about the numbers in there. 
291 00:23:34,273 --> 00:23:38,741 Um, you know, I think it was kind of co, uh, it was put together in 292 00:23:38,741 --> 00:23:40,182 collaboration with Microsoft. 293 00:23:40,182 --> 00:23:43,546 So I don't know if those numbers are optimistic or realistic, but I don't know. 294 00:23:43,546 --> 00:23:50,673 What is your, what is your take on business versus practice of law and where to start and that sort of stuff. 295 00:23:51,650 --> 00:23:52,511 Yeah, it's a tough question. 296 00:23:52,511 --> 00:23:55,263 mean, well, you're a gym guy, right? 297 00:23:55,263 --> 00:24:00,117 So losing fat and building muscle at the same time is just sort of how I see it. 298 00:24:00,117 --> 00:24:02,398 Those two things are really hard. 299 00:24:03,560 --> 00:24:14,288 But I suspect that the management of firms is such that the, you can divide and conquer to a degree. 300 00:24:15,069 --> 00:24:21,234 And if there are savings to be had in the back office business support functions, then 301 00:24:21,742 --> 00:24:27,542 you can use those savings to leverage up and pay up on the front office support stuff. 302 00:24:27,542 --> 00:24:30,982 I agree with you in many ways on the copilot stuff. 303 00:24:30,982 --> 00:24:37,542 don't have an intimate knowledge of it myself to that extent of using it. 304 00:24:37,542 --> 00:24:44,202 Albeit, what I would say is that will come as a package, I'm sure, with what Microsoft offers. 305 00:24:44,382 --> 00:24:48,452 And there will be ways and means, I'm sure, of using it in the right kind of way. 306 00:24:48,452 --> 00:24:51,660 If it is of summarizing 307 00:24:51,688 --> 00:24:54,760 notes from meetings, that is useful, right? 308 00:24:55,621 --> 00:25:07,021 If you use it in such a way as you can engineer a series of small prompts that can generate a report for you that don't necessarily need a playbook sitting in the 309 00:25:07,021 --> 00:25:16,718 background, but you just ask a series of questions and chain them together of a document, and then you get a useful report out of it, that's a good use case, in my opinion. 310 00:25:17,319 --> 00:25:19,861 I'm sure there's plenty of people who could be doing on that. 311 00:25:21,014 --> 00:25:26,648 I guess I'm a little bit biased in that my personal preference is to try and the lawyers be more productive. 312 00:25:26,648 --> 00:25:28,139 That was my goal. 313 00:25:28,139 --> 00:25:32,382 IBM was certainly, we've done a lot of useful things in that space. 314 00:25:32,382 --> 00:25:36,876 We've done some projects with in-house legal as well. 315 00:25:36,876 --> 00:25:44,971 There was a case study we did with NatWest Bank, which is of the big banks over here in the UK where we help them ingest their own playbook. 316 00:25:46,032 --> 00:25:48,590 It was almost like a word plug-in where the 317 00:25:48,590 --> 00:25:56,710 model will read the playbook, it'll read the incoming clause, and it will make recommendations and all sorts of great stuff like that, like you can imagine. 318 00:25:57,390 --> 00:26:05,990 But we've been in international business machines, we've been working on the back office side of things for an awfully long time and whether that's the traditional model of 319 00:26:05,990 --> 00:26:10,990 outsourcing and now it's AI first business process outsourcing. 320 00:26:10,990 --> 00:26:18,224 So how can we move some work that is manual at the moment onto a model? 321 00:26:18,348 --> 00:26:24,622 That's an area that I think is really interesting and one I'm really keen to explore. 
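The prompt-chaining idea Neill describes, asking a document a series of small, focused questions and stitching the answers into a report, can be sketched in a few lines. The example below assumes the OpenAI Python client and an API key in the environment; the model name, questions, and file name are placeholders, not the setup used in the NatWest work.

```python
# Minimal sketch of prompt chaining: run a series of small, focused questions
# over one document and stitch the answers into a report. Assumes the OpenAI
# Python client (`pip install openai`) and an OPENAI_API_KEY in the environment;
# the model name and questions are placeholders.
from openai import OpenAI

client = OpenAI()

QUESTIONS = [
    "Who are the parties to this agreement?",
    "What is the governing law and jurisdiction?",
    "Summarise the termination rights of each party.",
    "List any clauses that deviate from a standard supplier-friendly position.",
]

def ask(document: str, question: str) -> str:
    """One small, self-contained prompt per question."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Answer only from the document provided."},
            {"role": "user", "content": f"Document:\n{document}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content

def build_report(document: str) -> str:
    """Chain the questions together and assemble one report."""
    sections = [f"## {q}\n{ask(document, q)}" for q in QUESTIONS]
    return "\n\n".join(sections)

if __name__ == "__main__":
    with open("incoming_contract.txt", encoding="utf-8") as fh:
        print(build_report(fh.read()))
```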
322 00:26:24,622 --> 00:26:37,959 You can imagine the potential use cases for things like generative AI in talent acquisition, the whole process of reviewing applications and arranging meetings and so on 323 00:26:37,959 --> 00:26:38,489 and so on. 324 00:26:38,489 --> 00:26:42,131 That's all well within the wheelhouse of what we have nowadays. 325 00:26:42,131 --> 00:26:46,253 Not all of it will be generative AI, of course, but a lot of it will be. 326 00:26:47,246 --> 00:26:52,146 I guess I see a lot of easy wins for the firms in the back office. 327 00:26:52,146 --> 00:27:02,726 And like your point earlier, you can't, I don't think too many of us are going to trust what the models produce straight out of the gate and send it to our client without it 328 00:27:02,726 --> 00:27:03,246 being checked. 329 00:27:03,246 --> 00:27:08,426 So there's always going to be that phrase of human in the loop for a while at least, right? 330 00:27:08,546 --> 00:27:15,849 It's great for an augmentation speeding up tool, but I see a lot of potential on the back office side of things. 331 00:27:15,849 --> 00:27:16,709 Yeah. 332 00:27:16,729 --> 00:27:24,395 Well, and that was one of the caveats in the Clifford chance study was it did a good job listing out some of the use cases. 333 00:27:24,395 --> 00:27:33,381 And one of them was summarization, but then it, the, the, you know, the asterisk was, but it, should still be manually reviewed. 334 00:27:33,381 --> 00:27:34,512 It's just like, wait a second. 335 00:27:34,512 --> 00:27:36,723 So, or something along those lines. 336 00:27:36,804 --> 00:27:39,055 And it's just like, you're not saving me any time. 337 00:27:39,055 --> 00:27:44,917 If I have to go read the entire thread because I can't trust the technology to summarize and capture the main points. 338 00:27:44,917 --> 00:27:48,217 then it's not helping me or it's helping me minimally. 339 00:27:48,217 --> 00:27:49,397 And don't get me wrong. 340 00:27:49,397 --> 00:27:53,507 There's, use AI 10, 20 times a day. 341 00:27:53,507 --> 00:28:09,757 So I find a lot of really valuable use for it where I think I run into challenges mentally getting to a place where, all right, how are we going to calculate ROI on a implementation 342 00:28:09,757 --> 00:28:11,507 of a platform? 343 00:28:11,507 --> 00:28:13,845 Well, it's got us on the timekeeper side. 344 00:28:13,845 --> 00:28:16,206 It's got to save them time, right? 345 00:28:16,326 --> 00:28:24,550 And if there's manual checking that has to go in, how does that impact that ROI equation? 346 00:28:25,010 --> 00:28:29,612 For drafting, again, this is not just a co-pilot. 347 00:28:29,612 --> 00:28:36,656 mean, just in general, I think that, yes, there will have to be some manual oversight. 348 00:28:36,656 --> 00:28:38,796 The human's in the loop, to your point. 349 00:28:40,049 --> 00:28:46,396 On the summarization side, again, I think that I use it for summarization quite frequently, but for low risk things, right? 350 00:28:46,396 --> 00:28:53,453 Like honestly, I'm going to stick that, um, that AG report in and have Claude summarize it for me. 351 00:28:53,453 --> 00:28:56,174 And if it misses a couple of points, it's not the end of the world. 352 00:28:56,174 --> 00:29:03,302 But if I'm, if I'm a client facing thread that, you know, deals with a important matter, I'm not going to trust AI to summarize it. 353 00:29:03,302 --> 00:29:04,453 I'm going to read it. 354 00:29:04,674 --> 00:29:05,464 Yeah, absolutely. 
355 00:29:05,464 --> 00:29:16,519 And I think a lot of firms are looking, I think, for new ways of doing, know, how can we use AI to open up new work methodologies and new work possibilities? 356 00:29:16,720 --> 00:29:24,603 I suppose the ideal scenario is you have an AI which is perfect and your clients just plug in and start getting what they need. 357 00:29:24,624 --> 00:29:28,189 And you have that dream scenario where you get paid while you're sleeping. 358 00:29:28,189 --> 00:29:30,356 know, everybody wants a bit of that, I think. 359 00:29:30,914 --> 00:29:34,157 Well, you've a long way to go before we get there. 360 00:29:34,157 --> 00:29:37,389 These models, you're going to have to be really sure that it's right. 361 00:29:37,389 --> 00:29:46,626 There are bound to be regulatory issues that people are going to have to grapple with, some of which you can probably navigate in terms of conditions, but probably not all. 362 00:29:46,807 --> 00:29:56,814 I see, though, the current state as still useful having the human in the loop in that, depending on how you structure the way you use the models, you could... 363 00:29:57,036 --> 00:30:05,331 collect an awful lot of ground truth data, which these firms may have currently unstructured sitting in their iManage account or wherever right now. 364 00:30:05,392 --> 00:30:21,953 If you sort of move that to a new world of generative AI produced data, which you then validate or confirm is correct or wrong, you will over time build up quite a additional 365 00:30:21,953 --> 00:30:24,294 set of data against which you can quickly monitor. 366 00:30:24,294 --> 00:30:27,058 So when the models do improve and 367 00:30:27,058 --> 00:30:36,256 when workflows, et cetera, improve, if you've got the right governance in place that allows you to manage and monitor all of these different models, which people are 368 00:30:36,256 --> 00:30:42,310 eventually gonna build up to, then swapping in a better model should be simple. 369 00:30:42,571 --> 00:30:52,098 And then people may well get to a point where their accuracy levels are so high that they're happy to, I'd love some those to use the word risk it, but you know. 370 00:30:52,226 --> 00:30:57,427 But it's probably no more risky than a person, than a human being doing the work at a certain point. 371 00:30:57,427 --> 00:31:07,370 So I think if you get the governance right, that's going to be critical for a lot of firms, especially when they do start using a lot of agents, or sorry, rather, assistants. 372 00:31:07,370 --> 00:31:09,451 Maybe they will use a lot of agents too. 373 00:31:10,651 --> 00:31:16,572 Today it's possible, I think, that you can build up a lot of assistants that will do an awful lot of stuff for you. 374 00:31:16,913 --> 00:31:21,930 And although the time is not necessarily, the time saving is not necessarily what you hope for. 375 00:31:21,930 --> 00:31:24,254 It's not wasted in my view. 376 00:31:24,445 --> 00:31:25,305 Yeah. 377 00:31:25,325 --> 00:31:25,676 Yeah. 378 00:31:25,676 --> 00:31:33,342 And to be clear, it is blatantly obvious where the most bottom line impact is going to come from in terms of use cases. 379 00:31:33,342 --> 00:31:36,354 It clearly is on the practice of law side. 380 00:31:36,354 --> 00:31:49,523 The opportunity cost for time spent on anything other than delivering work product is obviously very high for a thousand dollar plus an hour timekeepers. 
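Neill's point above about using human-in-the-loop review to accumulate ground truth can be sketched simply: log every reviewed output alongside the reviewer's verdict, then replay that set against any candidate replacement model. The file format, field names, and candidate_model callable below are illustrative assumptions, not a description of IBM's governance tooling.

```python
# Minimal sketch of the ground-truth idea above: log every human-reviewed model
# output, then replay the accumulated set against a candidate replacement model.
# The JSONL format, field names, and `candidate_model` callable are illustrative.
import json
from pathlib import Path
from typing import Callable

LOG_PATH = Path("reviewed_outputs.jsonl")

def record_review(task_input: str, model_output: str, approved: bool,
                  corrected_output: str | None = None) -> None:
    """Append one human-validated example to the ground-truth log."""
    entry = {
        "input": task_input,
        "model_output": model_output,
        "approved": approved,
        # The reviewer's correction becomes the reference answer when rejected.
        "reference": model_output if approved else corrected_output,
    }
    with LOG_PATH.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(entry) + "\n")

def evaluate_candidate(candidate_model: Callable[[str], str]) -> float:
    """Score a new model against the accumulated, human-validated examples."""
    entries = [json.loads(line) for line in LOG_PATH.read_text(encoding="utf-8").splitlines()]
    scored = [e for e in entries if e["reference"] is not None]
    hits = sum(candidate_model(e["input"]).strip() == e["reference"].strip() for e in scored)
    return hits / len(scored) if scored else 0.0
```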
381 00:31:50,365 --> 00:31:51,187 just, 382 00:31:51,187 --> 00:32:01,900 you know, having, let's say, KM for example, or marketing or finance, leveraging the tools, especially KM, that's ultimately going to support the timekeepers in the, in, in probably 383 00:32:01,900 --> 00:32:13,433 either KM or innovation, designing the strategies, providing the support, having them familiar and in a place where they're using the technology every day seems wise. 384 00:32:13,433 --> 00:32:19,845 But to your point, there are, there are, if you're looking for bottom-line impact, it's on that side of the business. 385 00:32:20,927 --> 00:32:27,629 But you, you and I talked about like different segments, kind of like large, mid and small law. 386 00:32:27,629 --> 00:32:31,240 We can define that any way we want, but for me, 387 00:32:31,280 --> 00:32:42,123 when I think about it from a vendor perspective, like small law is anything a hundred attorneys and under, again, everybody has different ways of, um, defining this, mid law 388 00:32:42,123 --> 00:32:47,404 feels like a hundred to 500 attorneys and large law feels like 500 and up. 389 00:32:47,404 --> 00:32:51,045 Um, do you feel like there are different 390 00:32:51,355 --> 00:32:59,399 value propositions in those different segments of the law firm world with respect to AI? 391 00:33:01,311 --> 00:33:03,061 Yeah, probably. 392 00:33:03,321 --> 00:33:14,163 Although I would, I was, I personally think a lot of the difference of value proposition is down to the work that they do, maybe more so than the size of the firm. 393 00:33:14,243 --> 00:33:20,965 I think we may have been talking about this in the context of, of workflow and how we think AI is going to improve workflow. 394 00:33:20,965 --> 00:33:28,146 And again, anecdotally, I've heard a lot of lawyers say, you know, what I do is so specialized, you can't stick a workflow on it. 395 00:33:29,102 --> 00:33:31,503 I would disagree with that to a large extent. 396 00:33:31,503 --> 00:33:37,304 Anything that you can write down into a set of rules can be automated. 397 00:33:38,205 --> 00:33:45,317 I see, over here we have some parts of the legal industry, conveyancing, will writing, probate. 398 00:33:45,317 --> 00:33:48,608 A lot of that is relatively formulaic. 399 00:33:48,608 --> 00:33:50,088 It's process driven. 400 00:33:50,088 --> 00:33:54,369 To some degree, entry-level sort of debt recovery litigation work. 401 00:33:54,389 --> 00:33:56,670 That is, to a large extent, 402 00:33:57,176 --> 00:33:57,947 form-filling. 403 00:33:57,947 --> 00:34:02,410 It isn't always small firms that do those, but it tends to be. 404 00:34:03,012 --> 00:34:08,716 I think they can get an awful lot out of old-school AI automation products. 405 00:34:09,898 --> 00:34:18,305 The new generative AI stuff, I guess for now, is probably within the domain of the bigger firms. 406 00:34:19,727 --> 00:34:22,489 It's difficult to tell, to be perfectly honest with you, it's... 407 00:34:23,382 --> 00:34:31,447 I think the small firms can certainly benefit from generative AI, but whether they need it or not, I'm not convinced entirely. 408 00:34:32,068 --> 00:34:35,670 It just depends, I think, on how much they're following a formula. 409 00:34:35,989 --> 00:34:36,539 Yeah. 410 00:34:36,539 --> 00:34:44,029 Where I see the difference, and maybe this is, this is subtle, is the clients that these different size firms serve. 411 00:34:44,029 --> 00:34:44,769 Right.
412 00:34:44,769 --> 00:34:55,849 So, you know, in the a hundred attorney and under in the small law space, for example, you have customers like my company and you know, we don't have outside council guidelines with 413 00:34:55,849 --> 00:35:01,745 restrictions about use on AI on our stuff and you know, big law and 414 00:35:01,745 --> 00:35:08,648 especially in the financial services world or really any firm that caters to heavily regulated industries. 415 00:35:08,869 --> 00:35:11,410 There's a lot that goes into that. 416 00:35:11,891 --> 00:35:17,933 So I feel like there's a ton of opportunity on the small, smaller end of the spectrum. 417 00:35:17,994 --> 00:35:25,098 And then conversely, you know, a small law firms not buying Harvey, right? 418 00:35:25,098 --> 00:35:27,479 They're not even in the target market. 419 00:35:27,479 --> 00:35:31,461 It's, they probably wouldn't even be able to get a demo. 420 00:35:31,586 --> 00:35:32,991 Correct, yeah. 421 00:35:33,462 --> 00:35:39,264 So they, but they do have access to, you know, um, some of the paid consumer tools out there. 422 00:35:39,264 --> 00:35:49,368 Obviously they have access to co-pilot and I feel like a smaller law firm as well could be, um, nimble in their, in their rollout, right? 423 00:35:49,368 --> 00:35:57,011 Big firms have to do things in very formally and, um, strategically. 424 00:35:57,011 --> 00:36:00,753 So yeah, I, it's interesting. 425 00:36:00,753 --> 00:36:01,505 Um, 426 00:36:01,505 --> 00:36:11,302 I think the clients that the law firms serve also maybe is going to have some influence until these tools get to a place that they're widely available to all ends of the 427 00:36:11,302 --> 00:36:12,492 spectrum. 428 00:36:12,685 --> 00:36:19,687 you know, the outside council guidelines aren't restrictive like they are in some cases now. 429 00:36:19,700 --> 00:36:21,060 Yeah, I think you're right. 430 00:36:21,060 --> 00:36:22,651 The clients are going to influence a lot. 431 00:36:22,651 --> 00:36:36,582 And funnily enough, I came across a very interesting case study internally not that long ago where we'd done a generative AI powered bot customer facing, it's probably not right 432 00:36:36,582 --> 00:36:40,956 to call it a bot, know, customer chat interface for banks. 433 00:36:40,956 --> 00:36:48,178 And we've done it for a few banks and some of the really big ones too, their customer complaints are dealt with largely through that. 434 00:36:48,178 --> 00:36:49,858 this pushes a lot of 435 00:36:50,170 --> 00:36:59,725 work away from, in our case, that's, you know, the legal people who would be very expensive when maybe you've got names and whatever it is, you know, you can do a lot more 436 00:36:59,725 --> 00:37:03,016 with a lot less in that sense, people get much faster responses. 437 00:37:03,016 --> 00:37:12,210 And I think a younger generation is going to be perfectly at ease dealing with a, you know, a chat interface, if they get the answer they want, as long as you can do it 438 00:37:12,210 --> 00:37:13,141 reliably. 439 00:37:13,141 --> 00:37:16,832 I've been thinking about how do I, how does that apply to legal? 440 00:37:16,896 --> 00:37:22,368 In my old world, there is no way that a lot of the clients I used to work for are going to be happy with that. 441 00:37:22,368 --> 00:37:28,836 They're going to email me or in my previous role and say, I want the answer to this, or I've got a new job for you for this. 
442 00:37:29,177 --> 00:37:37,744 So it doesn't immediately translate, albeit to the point about the smaller businesses, a lot of them probably can do that now. 443 00:37:37,744 --> 00:37:40,757 What's the update on my house acquisition right now? 444 00:37:40,757 --> 00:37:43,269 A lot of people won't care who they're dealing with. 445 00:37:43,269 --> 00:37:44,770 They'll just want to know. 446 00:37:44,800 --> 00:37:46,791 why haven't I had an answer on this for a week? 447 00:37:46,791 --> 00:37:47,732 What's going on? 448 00:37:47,732 --> 00:37:50,991 And go, okay, are some things to work through there. 449 00:37:50,991 --> 00:37:54,325 But what do you give the model access to in order to give them the answer? 450 00:37:54,325 --> 00:37:58,017 Because I'm sure there'll be bits and pieces of information you won't want to expose. 451 00:37:58,017 --> 00:38:05,781 Again, it's a sort of make sure you dot the I's and cross the T's and your governance is all done correctly. 452 00:38:05,982 --> 00:38:11,305 But actually inside of Big Law 2, I think you can apply that maybe to the lawyers. 453 00:38:11,305 --> 00:38:14,126 If you have visited the lawyer, 454 00:38:14,248 --> 00:38:17,040 as a client and your back office function. 455 00:38:17,040 --> 00:38:18,320 And we do this internally. 456 00:38:18,320 --> 00:38:19,431 We call it client zero. 457 00:38:19,431 --> 00:38:22,263 You know, we, do everything to ourselves first. 458 00:38:22,263 --> 00:38:30,907 So we have a, uh, an ask IBM system where if I need something from HR or it, I just ask through the system. 459 00:38:30,907 --> 00:38:34,069 And by and large, I get the answer without bothering anyone. 460 00:38:34,069 --> 00:38:40,473 So I think there's that kind of thing could be rolled out in different ways across large and small. 461 00:38:40,473 --> 00:38:43,134 Um, at least that's my, my hope. 462 00:38:43,571 --> 00:38:44,181 Yeah. 463 00:38:44,181 --> 00:38:45,292 Now that makes sense. 464 00:38:45,292 --> 00:38:53,935 you know, so we have rolled out, they are probably maybe just over the mid-law threshold, a firm. 465 00:38:53,935 --> 00:38:59,037 so we're an intranet extranet company and we work exclusively with law firms. 466 00:38:59,037 --> 00:39:05,490 don't have any customers outside of the law firm world, not even on the inside council side of the table. 467 00:39:05,490 --> 00:39:13,133 And, um, one of our clients, we built a chat bot internal facing where they can ask policy questions. 468 00:39:13,929 --> 00:39:15,574 into a chat interface. 469 00:39:15,574 --> 00:39:16,175 intranet. 470 00:39:16,175 --> 00:39:23,879 So this could be things about what is there, how many, how much time left do they have via their PTO allocation? 471 00:39:23,879 --> 00:39:30,703 What is their ethical threshold for, you know, um, vendor gifting? 472 00:39:30,703 --> 00:39:34,766 What is their laptop reimbursement policy, any policy question? 473 00:39:34,766 --> 00:39:35,456 And you know what? 474 00:39:35,456 --> 00:39:37,567 It's gone over really well. 475 00:39:37,567 --> 00:39:40,408 Um, it has internally, 476 00:39:41,697 --> 00:39:42,348 we're finding. 477 00:39:42,348 --> 00:39:48,373 So this system is about maybe three months deployed and they can't wait to increase the scope. 478 00:39:48,373 --> 00:40:00,573 They're taking an incremental strategy to this, but even busy lawyers who again have a low tolerance for BS and talking to a chat bot, they found they're getting really good 479 00:40:00,573 --> 00:40:01,773 adoption. 
480 00:40:02,054 --> 00:40:08,830 I think the key there is this is a highly curated dataset and the performance is excellent. 481 00:40:08,830 --> 00:40:10,741 Like you get back good answers. 482 00:40:10,741 --> 00:40:11,721 because it's been tested. 483 00:40:11,721 --> 00:40:13,944 It's a small corpus of data. 484 00:40:13,944 --> 00:40:23,313 We've been able to really, well, they've done the testing to make sure that when questions get answered, you know, and they got a little thumbs up, thumbs down. 485 00:40:23,313 --> 00:40:33,111 So someone can rate the response and they dig in and they do the work when, when they get a thumbs down, they figure out why and how can they, how can they do better next time? 486 00:40:33,111 --> 00:40:37,520 So I think there's, there's real opportunity for that in legal. 487 00:40:37,520 --> 00:40:38,100 Absolutely. 488 00:40:38,100 --> 00:40:46,465 I mean, I couldn't agree more if I was if I was a CEO of a big law firm, I think I'd be saying where can I apply this in a very safe environment? 489 00:40:46,625 --> 00:40:53,158 is it does matter if they get it wrong because you'll you'll annoy your internal people who you're trying to keep happy and recruitment. 490 00:40:53,158 --> 00:40:55,370 It's hard enough and you don't want to make it worse. 491 00:40:55,370 --> 00:40:56,130 But 492 00:40:56,642 --> 00:40:58,644 I think it's a big, big opportunity. 493 00:40:58,644 --> 00:41:07,782 And also for the lawyers out there who unfortunately every now and again still have to manually print their billing guides and then walk it around to the partner and decide it, 494 00:41:07,782 --> 00:41:09,073 et cetera, et cetera. 495 00:41:09,073 --> 00:41:16,759 There's a whole bunch of process there that could be looked at and automated and just improved significantly. 496 00:41:16,759 --> 00:41:19,942 And you then get an awful lot of time back. 497 00:41:19,942 --> 00:41:25,516 So to your point about searching for something, mean, if somebody's got to go onto an intranet site manually, 498 00:41:25,560 --> 00:41:28,871 try and find it, I even locating the right document can be hard. 499 00:41:28,871 --> 00:41:35,474 And a lot of policies as a person who hasn't written too many of them, they all look and sound the same. 500 00:41:35,474 --> 00:41:39,586 And I don't want to have to read through it to know where I'm going. 501 00:41:39,586 --> 00:41:46,318 I mean, when I had my first child, it took me forever to figure out how much paternity leave I was going to get. 502 00:41:46,979 --> 00:41:54,922 And that's an hour or whatever, maybe an hour and a half of billing time that I lost because I was too busy trying to figure out all. 503 00:41:55,106 --> 00:41:58,368 What am I gonna do when this child arrives? 504 00:41:58,609 --> 00:42:03,813 And it really should have been a simple type it into a chat interface, know, what do I do about my first child? 505 00:42:03,813 --> 00:42:05,795 And it presumably told me. 506 00:42:05,795 --> 00:42:08,610 So I think you've hit the nail on the head. 507 00:42:08,610 --> 00:42:08,870 Yeah. 508 00:42:08,870 --> 00:42:10,531 I think it's going to be really interesting. 509 00:42:10,531 --> 00:42:12,952 And again, that's kind of an internal facing. 510 00:42:12,952 --> 00:42:15,553 I'll call, I'll still call that a business of law. 511 00:42:15,553 --> 00:42:18,594 It touches the timekeepers, but it's a business of law use case. 
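Here is a minimal sketch of the kind of internal policy Q&A flow described above: retrieve the best-matching passage from a small, curated policy corpus and log thumbs-up/down feedback so poor answers can be investigated. It is an illustration only, not the deployed system discussed in the episode; the policy snippets, file names, and retrieval method (TF-IDF) are assumptions.

```python
# Minimal sketch of an internal policy Q&A flow like the one described above:
# retrieve the most relevant passage from a small, curated policy corpus and
# log thumbs-up/down feedback for later review. This is an illustration, not
# the deployed system discussed in the episode.
import csv
from datetime import datetime, timezone

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

POLICIES = {
    "pto": "Employees accrue 20 days of paid time off per year...",
    "gifts": "Gifts from vendors above $100 must be declared to compliance...",
    "laptops": "Laptop purchases are reimbursed up to $1,500 every three years...",
}

vectorizer = TfidfVectorizer().fit(POLICIES.values())
policy_matrix = vectorizer.transform(POLICIES.values())
policy_keys = list(POLICIES)

def answer(question: str) -> tuple[str, str]:
    """Return the best-matching policy passage for a question."""
    scores = cosine_similarity(vectorizer.transform([question]), policy_matrix)[0]
    best = policy_keys[scores.argmax()]
    return best, POLICIES[best]

def log_feedback(question: str, policy_key: str, thumbs_up: bool) -> None:
    """Record ratings so poor answers can be investigated and the corpus improved."""
    with open("feedback.csv", "a", newline="", encoding="utf-8") as fh:
        csv.writer(fh).writerow(
            [datetime.now(timezone.utc).isoformat(), question, policy_key, thumbs_up]
        )

if __name__ == "__main__":
    key, passage = answer("How much do I get back if I buy a new laptop?")
    print(passage)
    log_feedback("How much do I get back if I buy a new laptop?", key, thumbs_up=True)
```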
512 00:42:18,594 --> 00:42:28,158 Um, what about the, uh, you and I talked about the lone wolf mindset of lawyers and its impact on technology implementation. 513 00:42:28,158 --> 00:42:32,820 I mean, this is a, it's a well documented, um, you know, Dr. 514 00:42:32,820 --> 00:42:34,771 Larry Richard has 515 00:42:35,701 --> 00:42:48,180 studied tens of thousands of lawyers, and in his book, Lawyer Brain, he talks about how he ranks lawyers on several personality traits, one of which is autonomy, and they are off 516 00:42:48,180 --> 00:42:48,820 the chart. 517 00:42:48,820 --> 00:42:50,121 I forget what the number is. 518 00:42:50,121 --> 00:42:54,334 I think it's like, you know, in the 70th, 80th percentile, whatever the number is. 519 00:42:54,334 --> 00:42:56,456 So they kind of have this lone wolf mentality. 520 00:42:56,456 --> 00:43:04,161 How do you feel that that impacts the, you know, um, 521 00:43:04,667 --> 00:43:09,904 impacts technology implementation, especially when it comes to some of the stuff we're talking about here? 522 00:43:09,904 --> 00:43:12,391 Do you think there's an impact or no? 523 00:43:14,326 --> 00:43:17,609 I'll have to read that book, which I've not. 524 00:43:17,609 --> 00:43:23,213 But my gut instinct is yes, there's an impact. 525 00:43:25,336 --> 00:43:36,295 Even though law firms, in many ways, are big groups of partners, the way I look at it is very often you have a few really, really big partners that will make a lot of decisions, 526 00:43:36,295 --> 00:43:37,065 and 527 00:43:38,304 --> 00:43:42,316 they may effectively be operating their own firm within a firm. 528 00:43:42,316 --> 00:43:44,887 And a lot of firms are structured that way, actually, frankly, aren't they? 529 00:43:44,887 --> 00:43:53,640 You know, based out of Switzerland with all these different vereins that are underneath them, or in some cases, they're somewhat like a franchise. 530 00:43:53,640 --> 00:43:58,142 So I think it can only, it must be true. 531 00:43:58,942 --> 00:44:00,323 And I think it's a bit of a shame. 532 00:44:00,323 --> 00:44:07,806 And I suppose when you are faced with a decision that, 533 00:44:07,950 --> 00:44:18,320 you know, I could take home a million dollars this year, or a million pounds in my case, not my personal case, that would be nice, yeah, or 900,000. 534 00:44:18,320 --> 00:44:19,480 I'll take the million. 535 00:44:19,480 --> 00:44:30,510 And I know I'm exaggerating the numbers a bit, but the idea of spending money on something which might help me five years down the line, maybe not that long, but it's gonna take a 536 00:44:30,510 --> 00:44:32,030 bit of time to play out. 537 00:44:32,030 --> 00:44:33,270 Maybe I won't do that. 538 00:44:33,270 --> 00:44:35,928 And I don't think, I think a lot of firms have come a long way. 539 00:44:35,928 --> 00:44:41,632 They've set up innovation teams, they've done a lot of good stuff to recognize the need to invest in the future. 540 00:44:41,632 --> 00:44:49,928 Also, people are living longer, partners hang around longer, they make partner earlier nowadays, and maybe they see the value in future investment. 541 00:44:49,928 --> 00:44:54,371 But yeah, there's definitely somewhat of a lone wolf mentality going on, I think. 542 00:44:54,371 --> 00:45:05,068 And I think you can probably point to, again, going back anecdotally, you hear stories about things being a really good fit
543 00:45:05,068 --> 00:45:10,914 and have been tested and gone through various layers of approval, and then all of a sudden certain things are no longer approved. 544 00:45:10,914 --> 00:45:17,010 And I think that's probably down to some people saying, I just don't see the advantage to this kind of thing for me. 545 00:45:17,010 --> 00:45:18,451 So let's not do it. 546 00:45:18,772 --> 00:45:20,713 I've heard stories along those lines. 547 00:45:22,069 --> 00:45:22,729 Yeah. 548 00:45:22,729 --> 00:45:23,049 Yeah. 549 00:45:23,049 --> 00:45:32,309 And your point about kind of the power structure in big law, I think, is also interesting. 550 00:45:32,309 --> 00:45:40,289 You know, a lot of people rise through the ranks in law firm leadership because they're the best at lawyering. 551 00:45:40,289 --> 00:45:40,889 You know what I mean? 552 00:45:40,889 --> 00:45:46,429 As opposed to being the best leader or being the most capable person to sit in that leadership seat. 553 00:45:46,429 --> 00:45:50,604 And then you also have another dynamic of, you know, retirement horizon. 554 00:45:50,604 --> 00:45:51,027 Yeah. 555 00:45:51,027 --> 00:45:58,939 You know, how close, because for most of those partners, their retirement horizon is in sight. 556 00:45:58,939 --> 00:46:05,181 So if it's three years and the break-even on a project is five, am I going to vote for it? No, I'm not. 557 00:46:05,782 --> 00:46:15,275 You know, law firms operate on a cash basis, and capital expenditures don't really fit into that model. 558 00:46:15,275 --> 00:46:16,165 So 559 00:46:16,403 --> 00:46:19,107 yeah, well, this has been a really good conversation. 560 00:46:19,107 --> 00:46:22,582 Did you have some thoughts on that before we wrap up? 561 00:46:22,582 --> 00:46:32,322 I was only gonna say, the final point for me is I think a lot of firms have done really well just to set up innovation teams and hubs, allocate money like we do, right? 562 00:46:32,322 --> 00:46:36,292 We put our money into our pension, we never see it, it just is there for us for a rainy day. 563 00:46:36,292 --> 00:46:45,762 And I think a lot of firms have embraced that, and good on them for doing so, because they will need to. Let's be honest, we know that if you don't invest in the way 564 00:46:45,762 --> 00:46:50,278 you do your business in future, it's gonna start failing against the competitors that do so. 565 00:46:51,106 --> 00:46:52,836 That's it, really. 566 00:46:52,841 --> 00:47:02,325 Yeah, you know, there is a lot of, uh, real work in innovation and real investment in innovation in legal. 567 00:47:02,325 --> 00:47:05,706 But I would say, again, this is anecdotal. 568 00:47:05,706 --> 00:47:07,347 There's no way to measure this. 569 00:47:07,347 --> 00:47:09,618 There's probably... 570 00:47:09,618 --> 00:47:10,869 well, I'll say it like this. 571 00:47:10,869 --> 00:47:13,710 There's a significant amount of innovation theater as well. 572 00:47:13,710 --> 00:47:21,877 Um, you know, at least in the US, there's people, you know, who I know. I've been selling into the legal, or the KM, 573 00:47:21,877 --> 00:47:25,947 space for a long time, before the word innovation existed as a role. 574 00:47:25,947 --> 00:47:34,377 And then all of a sudden I start seeing friends of mine, you know, who are in KM, and all of a sudden instead of the CKO, they're the CKIO. 575 00:47:34,377 --> 00:47:37,237 And I reach out and say, hey, how has your role changed?
576 00:47:37,237 --> 00:47:40,297 And you know, it hasn't. 577 00:47:40,577 --> 00:47:45,587 They want to, you know, they want to create the appearance of innovation, right? 578 00:47:45,587 --> 00:47:48,557 Because their clients want them to be more innovative. 579 00:47:48,557 --> 00:47:50,483 They want them to adopt 580 00:47:50,803 --> 00:47:54,375 new, innovative ways of solving their problems. 581 00:47:55,557 --> 00:48:00,320 So yeah, there is, but again, not to downplay it, you're absolutely correct. 582 00:48:00,320 --> 00:48:04,323 I know really good innovation teams out there, and there's plenty of them. 583 00:48:04,323 --> 00:48:12,989 It's just sometimes firms are taking the easy route of slapping innovation on some titles and calling it a day. 584 00:48:13,560 --> 00:48:23,266 Yeah, I think a lot of them have got an opportunity to buy stuff in, and I think it's a full-time job just reading the legal press, trying to keep on top of what's out there. 585 00:48:23,266 --> 00:48:30,660 There's a whole lot of people coming out with GPT wrappers that, you know, pretend to do something. 586 00:48:30,860 --> 00:48:33,201 In many cases, they will do great things. 587 00:48:33,201 --> 00:48:35,423 In many cases, they will do average things. 588 00:48:35,423 --> 00:48:40,185 But if you're in that role, you've really got to have a look at everything. 589 00:48:41,046 --> 00:48:44,148 So that is a challenging, challenging job for sure. 590 00:48:44,148 --> 00:48:51,753 And I can see why, you know, I think some firms have really bought into it big, you know, they've appointed a new CIO, right? 591 00:48:51,753 --> 00:48:57,717 Like you said, it's a chief innovation officer right now, rather than just an information officer. 592 00:48:57,717 --> 00:49:06,203 And that's a big spend and a big commitment, and probably a necessary one with how much stuff there is out there to do. 593 00:49:06,203 --> 00:49:10,744 But from my side, I'm, you know, I'm hopeful that people will 594 00:49:10,744 --> 00:49:12,266 try different things. 595 00:49:12,266 --> 00:49:14,668 They'll do some innovation work themselves in-house. 596 00:49:14,668 --> 00:49:17,581 They'll have some people that can work with technology providers like me. 597 00:49:17,581 --> 00:49:18,942 That's what I'm here for. 598 00:49:19,644 --> 00:49:21,406 Use me for scaling stuff up. 599 00:49:21,406 --> 00:49:28,724 You know, when you get something that works and looks good, come and talk to me, and I'll try and find the right people to say, well, will it accelerate the growth of that thing that's 600 00:49:28,724 --> 00:49:29,134 working? 601 00:49:29,134 --> 00:49:31,636 And you know, if there's anything that's not working, ditch it. 602 00:49:31,967 --> 00:49:32,298 Yeah. 603 00:49:32,298 --> 00:49:34,834 Well, that's a good way to kind of tie a bow on this. 604 00:49:34,834 --> 00:49:41,831 How do people find out more about, um, IBM's offering and what you do? 605 00:49:41,888 --> 00:49:44,609 Yeah, well, there is a lot of offering, right? 606 00:49:44,609 --> 00:49:48,281 So the best thing to do is probably just to message me. 607 00:49:48,281 --> 00:49:50,422 LinkedIn is the right place, I suspect. 608 00:49:50,422 --> 00:49:53,133 A lot of people get my name wrong. 609 00:49:53,133 --> 00:49:58,485 It's N-E-I, two L's, for Neill, unusual, but I can't help that. 610 00:49:59,226 --> 00:50:01,407 So it's Neill Pemberton on LinkedIn.
611 00:50:01,947 --> 00:50:03,948 Just Google the name, you'll find it. 612 00:50:04,028 --> 00:50:06,640 And have a look around the IBM website. 613 00:50:06,640 --> 00:50:07,864 There is a whole... 614 00:50:07,864 --> 00:50:09,415 treasure trove of information on that. 615 00:50:09,415 --> 00:50:15,558 And as I said, there's a lot of open source stuff, so people can go and try it and just see what it's like. 616 00:50:15,558 --> 00:50:26,604 And as we talked about a little bit earlier, there are a lot of YouTube videos that IBM does as well that will explain all kinds of different things. 617 00:50:26,604 --> 00:50:28,645 We didn't even talk about agents today. 618 00:50:28,645 --> 00:50:33,588 We could do a whole session on something like that and the applicability of agents to legal work. 619 00:50:33,588 --> 00:50:35,029 Another conversation. 620 00:50:35,561 --> 00:50:36,355 Yeah. 621 00:50:37,132 --> 00:50:39,437 LinkedIn, IBM website, YouTube. 622 00:50:39,437 --> 00:50:41,353 I think those are probably good places to go. 623 00:50:41,353 --> 00:50:41,583 Yeah. 624 00:50:41,583 --> 00:50:47,637 And we'll post links in the show notes to help guide people in the right direction. 625 00:50:47,637 --> 00:50:58,373 And yeah, I would love to stay in touch and maybe have you on sometime in the future to talk about, you know, some of the new work that, you 626 00:50:58,373 --> 00:51:05,087 know, big vendors and technology-leading companies like IBM are doing in this space. 627 00:51:05,087 --> 00:51:07,989 So, um, let's keep in touch. 628 00:51:08,444 --> 00:51:09,605 Yeah, will do. 629 00:51:09,703 --> 00:51:11,557 Awesome, well, I appreciate your time here. 630 00:51:11,557 --> 00:51:14,440 Have a great weekend and we will chat again soon. 631 00:51:15,604 --> 00:51:16,965 Alright, thanks Neill.
19 00:01:07,222 --> 00:01:11,782 Well, I started my legal career back in about 2005. 20 00:01:11,782 --> 00:01:22,282 Worked my way up from from the bottom, so to speak is as a paralegal 18 months training contract, which is what we do in the UK or England at least two years almost of that. 21 00:01:22,282 --> 00:01:25,770 I got six months off for time to count, which was good. 22 00:01:25,934 --> 00:01:31,796 Three years later, was at a regional firm in Bristol in the southwest of England where I live. 23 00:01:32,037 --> 00:01:44,282 And I was looking for bit of a new challenge really, having spent six years at my first firm, an opportunity at Dentons, as you say, I came up, joined them in about 2011 and 24 00:01:44,282 --> 00:01:46,763 worked there for 10 years very happily. 25 00:01:47,283 --> 00:01:54,566 But after 16 years or so of doing what was essentially the same thing, commercial real estate work, which I enjoyed for a long time, I... 26 00:01:54,606 --> 00:01:57,128 started to get a bit itchy and looking around for other alternatives. 27 00:01:57,128 --> 00:02:01,550 And once I started looking around, a whole world of opportunity opened up to me. 28 00:02:01,550 --> 00:02:12,377 So a mentor of mine that I'd ended up working with at Denton's and really went out on a limb for me that got me working in the technology media telecom space. 29 00:02:12,497 --> 00:02:16,259 I found the technology work just to be much, much more interesting. 30 00:02:16,680 --> 00:02:20,246 And yeah, once I started looking around, opportunities just... 31 00:02:20,246 --> 00:02:25,679 that I never thought were there, you know, came up, they're no in-house opportunities for commercial real estate lawyers over here. 32 00:02:25,679 --> 00:02:36,464 So what happened was the company that I joined after Denton's, they were a startup, but just a two or three years before, and they had such a compelling proposition that when 33 00:02:36,464 --> 00:02:42,259 they'd raised money that meant they could afford me, I was jumping at an opportunity to go to go and work for them. 34 00:02:42,259 --> 00:02:48,662 A real first look at AI, pre-generative AI actually. 35 00:02:49,054 --> 00:02:55,516 to try and help them automate some real estate reporting, which is what my domain expertise was. 36 00:02:55,977 --> 00:02:56,747 I joined there. 37 00:02:56,747 --> 00:03:02,239 They'd already had a couple of what we called legal engineers working there, some very skilled people. 38 00:03:02,239 --> 00:03:04,220 But at one point it was just me. 39 00:03:04,540 --> 00:03:06,941 I grew that team up to about eight, nine people. 40 00:03:06,941 --> 00:03:11,143 And then generative AI came along, got really, really interesting. 41 00:03:11,143 --> 00:03:14,084 And eventually IBM just came knocking. 42 00:03:14,084 --> 00:03:18,195 And that to me was just too good of an opportunity not to explore it. 43 00:03:18,530 --> 00:03:24,634 You know, I joined the startup with a view to expanding my horizons and the horizons don't get much bigger than IBM. 44 00:03:24,634 --> 00:03:30,598 So when they came knocking, they were looking for someone who knew legal, someone who who'd had their hands on the tech. 45 00:03:30,598 --> 00:03:34,521 Um, and I was, I guess at the intersection of their Venn diagram. 46 00:03:34,521 --> 00:03:36,352 So here I am. 47 00:03:37,159 --> 00:03:37,749 Interesting. 
48 00:03:37,749 --> 00:03:48,250 So yeah, I, I think I mentioned this, uh, in the intro, like I didn't realize that IBM had an offering aligned with the legal vertical. 49 00:03:48,250 --> 00:03:51,253 I, you know, I hear about the E Y's of the world. 50 00:03:51,253 --> 00:03:55,034 Um, and, other, you hear a lot of ALSPs. 51 00:03:55,815 --> 00:04:03,638 but I didn't know how big is the group that you work in and are you guys exclusively legal or is it broader than that? 52 00:04:03,638 --> 00:04:08,738 So yeah, I sit within IBM consulting, which globally is just huge. 53 00:04:08,738 --> 00:04:11,858 I didn't know much about IBM consulting before I joined. 54 00:04:11,858 --> 00:04:17,278 me, I did grow up in the States, as may be obvious from the whiteboard and the Denver Broncos helmet in the background. 55 00:04:17,278 --> 00:04:19,918 Sorry to anyone who's Kansas City fan. 56 00:04:21,678 --> 00:04:25,248 So IBM tech in the 80s was just huge. 57 00:04:25,248 --> 00:04:27,398 So I was well aware of that part of the business. 58 00:04:27,398 --> 00:04:30,512 I wasn't so aware of the consulting side. 59 00:04:30,890 --> 00:04:34,453 We have in-house lawyers and that's not really what my domain is. 60 00:04:34,453 --> 00:04:42,761 My domain is to work with professional services firms in general, which includes obviously legal, and just try and help their business. 61 00:04:42,761 --> 00:04:46,374 I IBM has been improving businesses for 100 plus years, right? 62 00:04:46,374 --> 00:04:53,920 So part of my job is taking the best of breed that we've got in-house in terms of technology. 63 00:04:54,112 --> 00:04:58,844 but we do partner with lots of other people, Microsoft, Adobe, Oracle, Salesforce, you name it. 64 00:04:58,844 --> 00:05:00,585 We will partner with other vendors. 65 00:05:00,585 --> 00:05:02,386 We'll do what's best for the client. 66 00:05:02,386 --> 00:05:11,551 So we offer a traditional consulting, I suppose, with the untraditional, if that's a real word, aspect that we have this big technology offering behind us. 67 00:05:11,691 --> 00:05:18,344 And my job is to go out and look at ways that we can improve not just the practice of law, but the business of law as well. 68 00:05:18,717 --> 00:05:19,637 Interesting. 69 00:05:19,637 --> 00:05:26,220 Yeah, it's been almost exactly two years since ChatGPT made its debut. 70 00:05:26,220 --> 00:05:38,765 I think that really changed everyone's perspective way beyond the legal industry, but within the legal industry itself, the status quo is very sticky in legal. 71 00:05:39,246 --> 00:05:43,908 Lawyers tend to embrace status quo, not always 72 00:05:45,032 --> 00:05:46,034 Mm-hmm 73 00:05:46,709 --> 00:05:49,709 the most open to change. 74 00:05:50,589 --> 00:06:01,689 and you know, I think the, that really rattled some cages at senior levels, you know, at the executive committee levels in law firms, like, wow. 75 00:06:01,689 --> 00:06:10,349 And you know, we saw things like the Goldman report that came out that 44 % of legal tasks could be automated by AI, which I've said multiple times. 76 00:06:10,349 --> 00:06:15,749 I think that's a gross overestimate, maybe one day, but we are a long way from that one day. 77 00:06:15,749 --> 00:06:16,877 Um, 78 00:06:17,189 --> 00:06:21,829 And you the, when you saw a, you saw a trajectory. 
79 00:06:21,829 --> 00:06:34,752 in, when in November of 2022, when three five was released and scored 60 some odd percentile on the bar and then four was released, I think six or eight months later and it 80 00:06:34,752 --> 00:06:38,986 scored over the initial indications was it scored over 90 on the bar. 81 00:06:38,986 --> 00:06:43,879 People really took notice like, you know, that's a very steep innovation curve. 82 00:06:44,026 --> 00:06:53,879 Things have flattened out since then, the smidge and there's a lot of talk in AI circles about scaling laws and whether more... 83 00:06:53,879 --> 00:07:01,945 is going to continue to produce the incremental improvements that we have seen previously. 84 00:07:01,945 --> 00:07:05,164 Um, I think that they're again, this is Ted's opinion here. 85 00:07:05,164 --> 00:07:07,235 I'm not an expert, but I I'm an enthusiast. 86 00:07:07,235 --> 00:07:08,485 I follow the space closely. 87 00:07:08,485 --> 00:07:12,309 You know, once you get over about a trillion parameters and I think the latest 88 00:07:12,309 --> 00:07:13,280 GPT models. 89 00:07:13,280 --> 00:07:13,820 not sure. 90 00:07:13,820 --> 00:07:16,512 think maybe four is about 1.8 trillion. 91 00:07:16,512 --> 00:07:19,004 I might have that number wrong, but it's somewhere in that vicinity. 92 00:07:19,004 --> 00:07:30,653 You know, once you get over a trillion parameters, I things start to level out a smidge and I don't know if throwing more parameters and more data at these models is going to 93 00:07:30,653 --> 00:07:34,996 ultimately get us back on that steep innovation curve. 94 00:07:34,996 --> 00:07:36,297 There's a lot of debate about it. 95 00:07:36,297 --> 00:07:39,349 I mean, it's, if you listen to 96 00:07:47,922 --> 00:07:52,542 noticed anecdotally a little bit of a flattening. 97 00:07:52,542 --> 00:07:58,462 So I don't know, do you have any sense of kind of the trajectory we're on versus where we started? 98 00:08:00,575 --> 00:08:05,029 Yes, although not in the sense of how many trillions of parameters we might have. 99 00:08:05,029 --> 00:08:10,824 And in fact, to be a little contrarian, we do really well with way less. 100 00:08:10,824 --> 00:08:17,809 If you look at the IBM series of models, there's a series that we call granite, which is our in-house. 101 00:08:18,306 --> 00:08:19,916 variety and it's open source. 102 00:08:19,916 --> 00:08:22,127 So people are welcome to go and look at it and try it. 103 00:08:22,127 --> 00:08:22,827 Right. 104 00:08:22,827 --> 00:08:34,310 Um, we've just released granite 3.0 and it's got 8 billion and it, and and you look at the sums and say how on earth would eight, 8 billion compete with 1.8 trillion or whatever the 105 00:08:34,310 --> 00:08:38,602 number is being some significant, I'm not even gonna try and do the arithmetic on it. 106 00:08:38,602 --> 00:08:41,812 Cause I'm not that good in my head, but way less. 107 00:08:41,992 --> 00:08:48,044 I think the difference is, and can be that, like you say, you know, maybe, maybe we just don't need that many. 108 00:08:48,366 --> 00:08:49,046 parameters. 109 00:08:49,046 --> 00:08:52,816 mean, 1.2 trillion, 1.8 trillion, I can't even fathom what that looks like. 110 00:08:52,816 --> 00:08:55,266 I can't even fathom what 8 billion looks like. 111 00:08:55,266 --> 00:09:06,346 So, you know, we can get a lot out of small models, using them intelligently, training them on good data rather than just all data. 
112 00:09:06,346 --> 00:09:09,036 And I think that's probably one of our key differentiators. 113 00:09:09,036 --> 00:09:10,306 And it's not the only one. 114 00:09:10,306 --> 00:09:16,328 But what we are quite keen on looking at is, what can we achieve the most with 115 00:09:16,328 --> 00:09:18,399 using the least, if I can put it that way. 116 00:09:18,399 --> 00:09:19,799 We use a small model. 117 00:09:19,799 --> 00:09:22,130 It's faster, it's cheaper, it's greener. 118 00:09:22,130 --> 00:09:24,720 And a lot of advantages to that. 119 00:09:24,720 --> 00:09:33,013 And to your point earlier about how quickly we scale and get to this 44 % number that we heard about, that's a while off. 120 00:09:33,013 --> 00:09:43,306 And to me, that takes an awful lot of time and effort and energy on the part of whoever's using the models, setting them up with the right workflows, doing the right prompting, et 121 00:09:43,306 --> 00:09:44,606 cetera, et cetera. 122 00:09:45,262 --> 00:09:54,882 I don't really have an idea on that, although I will say what's interesting from my perspective, trajectory wise, is how quickly some of the open source models have been 123 00:09:54,882 --> 00:09:57,132 catching up with some of the proprietary models. 124 00:09:57,132 --> 00:09:59,142 I find that quite interesting. 125 00:09:59,142 --> 00:10:02,122 And they say we use our own models, but we use others too, right? 126 00:10:02,122 --> 00:10:05,582 We use Llama and all sorts in our day-to-day work. 127 00:10:05,582 --> 00:10:09,022 And we find we can get good results using small models. 128 00:10:09,022 --> 00:10:13,237 So I think it's about how you use it rather than what it is that you use. 129 00:10:13,237 --> 00:10:14,377 Yeah, there are some better. 130 00:10:14,377 --> 00:10:15,787 There are some interesting. 131 00:10:15,787 --> 00:10:24,357 I've heard some interesting use cases for small models, specifically the ones that are downloadable and able to run locally on a laptop. 132 00:10:24,357 --> 00:10:27,037 One is from a privacy perspective. 133 00:10:27,157 --> 00:10:35,077 So I spoke to someone, I can't remember if it was on a podcast or outside of that. 134 00:10:35,077 --> 00:10:42,639 Somebody was telling me about they created a process for a patent. 135 00:10:42,823 --> 00:10:47,727 attorney where he would download a small model. 136 00:10:47,727 --> 00:10:59,666 forget, it may have been llama and they built some sort of interface on top that would allow him to automate, um, these patent applications, which I don't know, he did three or 137 00:10:59,666 --> 00:11:09,470 four a week and then he would delete the models and download a fresh every time he needed to do this and which sounds like a big deal, but it's not. 138 00:11:09,470 --> 00:11:11,730 not with some of the smaller models. 139 00:11:11,730 --> 00:11:20,470 So that's one way to achieve privacy until we get to a place where these larger models have these enterprise-grade security controls in place. 140 00:11:20,470 --> 00:11:24,850 I know OpenAI does, as do some others. 141 00:11:25,470 --> 00:11:27,830 But I mean, I think there are interesting news cases. 142 00:11:27,830 --> 00:11:29,310 I did a quick search. 143 00:11:29,310 --> 00:11:31,253 Yeah, the Lama 3.1 has eight 144 00:11:31,253 --> 00:11:34,444 um, 70 billion. 145 00:11:34,444 --> 00:11:37,155 have a 405 billion parameter version. 146 00:11:37,155 --> 00:11:38,631 So yeah, I mean, I think 147 00:11:38,631 --> 00:11:41,103 Obviously OpenAI is the monster. 
148 00:11:41,401 --> 00:11:47,079 I would imagine Anthropic is in a similar ballpark. 149 00:11:47,079 --> 00:11:50,973 But yeah, there are interesting applications to some of the smaller models. 150 00:11:51,032 --> 00:11:58,468 Well, yeah, I mean, one of the ways that we've done it, we as a company know what's gone into our models because we made them, right? 151 00:11:58,468 --> 00:11:59,489 We trained them. 152 00:11:59,489 --> 00:12:01,731 It's all enterprise data. 153 00:12:01,751 --> 00:12:09,137 It's not what you get on the classic phrase of Reddit blogs or whatever it is that is the criticism of the larger models. 154 00:12:09,138 --> 00:12:13,381 So in theory, there's no garbage in any of the models that we use. 155 00:12:13,381 --> 00:12:16,284 And we can tell people and underwrite what's in there. 156 00:12:16,284 --> 00:12:16,934 And we do. 157 00:12:16,934 --> 00:12:19,298 And we publish the information on all of that. 158 00:12:19,298 --> 00:12:24,711 you know, how we've gone about it's available and you can find it on IBM's website. 159 00:12:24,711 --> 00:12:28,263 So being open about it is good. 160 00:12:28,263 --> 00:12:35,848 I think what's that phrase, know, garbage in, garbage out, rubbish in, rubbish out, whatever the right terminology is. 161 00:12:35,848 --> 00:12:37,038 That's sort of our view. 162 00:12:37,038 --> 00:12:45,773 Well, that's the view that I'm hearing anyway, at least is that we train it on good stuff and we get better results than you might otherwise think with a small model. 163 00:12:45,794 --> 00:12:46,560 And, know, 164 00:12:46,560 --> 00:12:47,642 I go back to it. 165 00:12:47,642 --> 00:12:56,505 I wouldn't underestimate the importance of low cost, low energy consumption and low carbon emission for the people who need to report that sort of thing. 166 00:12:56,505 --> 00:12:59,148 And I think everybody's interested in low cost. 167 00:12:59,649 --> 00:13:02,207 that's certainly my experience, especially with lawyers. 168 00:13:02,207 --> 00:13:03,278 Yeah, absolutely. 169 00:13:03,278 --> 00:13:03,428 Yeah. 170 00:13:03,428 --> 00:13:15,955 I've, I've even noticed as a consumer, uh, as a, someone who leverages the consumer paid versions of some of the big models I use, you know, the chat GPT pro and, the, Claude paid 171 00:13:15,955 --> 00:13:16,865 subscription. 172 00:13:16,865 --> 00:13:25,270 Every time I go into Claude now it's by default in concise mode because of high usage and to, you know, they're trying to manage token consumption. 173 00:13:25,270 --> 00:13:30,367 I've also heard this is unconfirmed, but there's been a big degradation. 174 00:13:30,367 --> 00:13:40,530 dip in performance and co-pilot that there's a lot of suspicion and scuttlebutt that it's a token throttling to manage resource consumption. 175 00:13:40,530 --> 00:13:53,363 yeah, I mean, every day I know myself as a consumer of these tools, every day I expand the scope and more and more usage every single day as I learn new ways to leverage the 176 00:13:53,363 --> 00:13:54,624 technology. 177 00:13:54,984 --> 00:13:59,571 That's not unique to me that, you know, everybody's doing that and 178 00:13:59,571 --> 00:14:01,703 It's just going to create more and more demand. 179 00:14:01,703 --> 00:14:15,106 I think, you know, to your point, um, you know, finding ways to mitigate that situation is, going to be a desirable outcome for both the providers and the consumers of the 180 00:14:15,106 --> 00:14:16,120 technology. 
181 00:14:16,120 --> 00:14:27,735 Yeah, I think so and I don't know what the motives are behind, you know for a mini and Microsoft Fies model, know, the smaller models that they're bringing out there'll be a 182 00:14:27,735 --> 00:14:31,967 reason and it may be only monetarily it may well be Energy consumption. 183 00:14:31,967 --> 00:14:32,688 I don't know. 184 00:14:32,688 --> 00:14:37,730 But if you get the likes of the big software companies talking about 185 00:14:37,730 --> 00:14:46,695 building nuclear power stations to generate the energy that's going to go into building their or rather powering their models. 186 00:14:46,695 --> 00:14:48,396 I think that says something. 187 00:14:48,756 --> 00:14:56,170 Just put it into context, we've been in need of a new nuclear power station near where I am for a long time and it takes our government an awfully long time to do it. 188 00:14:56,170 --> 00:15:02,764 So I think Google and Microsoft are going to build them a lot faster than our government's going to do it. 189 00:15:02,764 --> 00:15:06,882 So you do it because you need to and I suspect that 190 00:15:06,882 --> 00:15:13,238 doing that is not cheap and using smaller models and getting similarly good results out is the way to go. 191 00:15:13,459 --> 00:15:14,190 Yeah. 192 00:15:14,190 --> 00:15:18,683 Yeah, I know there needs to be some innovation in the nuclear world as well. 193 00:15:18,683 --> 00:15:28,851 mean, having a 15 year timeline and know, tens of billions of dollars is not, that's not a scalable approach and there has not been a tremendous amount. 194 00:15:28,851 --> 00:15:37,779 I'm not an expert in that field, but I've read up on it recently as a result of all this and found it really interesting and they are starting to come up with some new ways to 195 00:15:37,779 --> 00:15:40,771 approach this problem. 196 00:15:40,771 --> 00:15:43,032 And I think, 197 00:15:43,807 --> 00:15:56,082 And now there's a real motivation just because of the massive power consumption that, I don't know if you saw, did you see, Elon stood up a data center? 198 00:15:56,082 --> 00:16:10,888 I forget how many hundreds of thousands of Nvidia, GPUs, but he did it in something like 130 days and it, it's massive multi football field size data center that he, he stood up 199 00:16:10,888 --> 00:16:12,979 in like under four months. 200 00:16:13,350 --> 00:16:16,458 or maybe just over four months, but it was, uh, it's incredible. 201 00:16:16,458 --> 00:16:20,519 And where's all the power are going to come from in those scenarios. 202 00:16:20,586 --> 00:16:24,531 I don't know, but I think he's probably the kind of guy that will figure that out very quickly. 203 00:16:24,531 --> 00:16:30,357 And, you know, he's, he's, he's also figuring out the energy distribution system for cars. 204 00:16:30,357 --> 00:16:33,151 So literally more power to the guy, right? 205 00:16:33,151 --> 00:16:35,443 He's, he's, he's got all the knowledge. 206 00:16:35,744 --> 00:16:37,205 No, I had not heard that. 207 00:16:37,205 --> 00:16:41,469 But that doesn't surprise me with someone like him. 208 00:16:41,758 --> 00:16:48,530 There's a real cool, you, for those that are curious, there's a real cool YouTube video out there that walks you through the end product. 209 00:16:48,530 --> 00:16:52,502 And it's just like three months, four months, whatever the number was. 210 00:16:52,502 --> 00:16:54,183 It's, it's outstanding. 
211 00:16:54,183 --> 00:17:04,442 Well, getting back to the AI and legal, um, you've been around AI and legal pre LLM and you've kind of seen the transition. 212 00:17:04,442 --> 00:17:09,709 I would imagine, you know, there were, there was some machine learning application. 213 00:17:10,037 --> 00:17:22,345 Um, applications that you were involved in and neural networks, like what the transition from those legacy AI models to LLMs people think it's almost like a association that we 214 00:17:22,345 --> 00:17:28,909 have now AI people think LLMs, but AI has existed in legal for much longer than two years. 215 00:17:28,909 --> 00:17:29,839 Correct. 216 00:17:30,126 --> 00:17:30,966 Absolutely. 217 00:17:30,966 --> 00:17:31,366 Yeah. 218 00:17:31,366 --> 00:17:31,566 Yeah. 219 00:17:31,566 --> 00:17:36,146 Well, when I joined the startup company, I was out of all to witness. 220 00:17:36,706 --> 00:17:43,366 Yeah, we there was no generative AI or if there was we weren't using it and it wasn't sort of it wasn't really available to us. 221 00:17:43,366 --> 00:17:51,166 That was that was labeled data and supervised learning sort of old old old school way of doing things. 222 00:17:51,246 --> 00:17:58,758 For me still very fascinating, very interesting and and I felt cutting edge, you know, because a lot of people just weren't doing it. 223 00:17:58,758 --> 00:18:10,161 we've all been talking about AI automation for a long time, but use of it, actual use day to day inside of firms, at least in my network, not happening that much. 224 00:18:10,161 --> 00:18:12,762 And it wasn't happening that much, a bit more nowadays. 225 00:18:13,082 --> 00:18:24,595 So yeah, I started out my AI journey labeling documents with a team of other people and sounds easy, but it wasn't, you it's not just highlight and select, select your category. 226 00:18:24,595 --> 00:18:26,934 It's, know, how am I going to think about 227 00:18:26,934 --> 00:18:37,147 cutting up this document in a way that means when I train the model, the model really knows what this paragraph is relating to, because some paragraphs relate to more than one 228 00:18:37,147 --> 00:18:37,957 thing. 229 00:18:38,478 --> 00:18:47,600 So there was whole taxonomies involved there, and it required quite a deep understanding of what you were doing to be able to use it, at least in my experience. 230 00:18:48,221 --> 00:18:53,042 And it was laborious, because to get a decent result, you needed, let's say, 231 00:18:53,302 --> 00:18:57,924 I don't know, a thousand things to do it really well with enough variety in there. 232 00:18:57,924 --> 00:19:00,045 So access to that stuff is hard. 233 00:19:00,045 --> 00:19:05,987 Getting a thousand things labeled as a minimum, I would say is, hard, time consuming and expensive. 234 00:19:05,987 --> 00:19:14,671 So when generative AI came along and you know, you could just effectively give a model a few keywords and you know, some, what do they call it? 235 00:19:14,671 --> 00:19:19,733 Semantics that it can just go and figure out what, what, what clause you're looking for. 236 00:19:19,873 --> 00:19:21,664 I just changed the landscape entirely. 237 00:19:21,664 --> 00:19:22,318 So. 238 00:19:22,318 --> 00:19:27,618 For me, I felt it was a bit of a shame to throw away some of the work that we've done. 239 00:19:27,618 --> 00:19:33,758 And I think a lot of firms probably could still leverage what they have done in the past. 
240 00:19:33,958 --> 00:19:44,898 Some firms I know were doing it for a long time and have got a big, big backlog of labeled data that they can and in my view should use as long as they can do it in a cost-effective 241 00:19:44,898 --> 00:19:46,628 way, because it's great for retrieval. 242 00:19:46,628 --> 00:19:49,538 You can get really high levels of accuracy with it. 243 00:19:50,530 --> 00:19:54,493 But I suppose generatively, I created a bit more of a level playing field. 244 00:19:54,493 --> 00:20:03,839 And I don't know whether we think we may have talked about it briefly before, but it was new at the time Adelshield Goddard had done a report whereby they'd given their associates 245 00:20:03,939 --> 00:20:16,007 or selected people within the firm, effectively a prompt library that they could go and, you know, do retrieval jobs for corporate support kind of work where they would go out and 246 00:20:16,007 --> 00:20:20,290 find the nominated clauses that they decided to go and try and find. 247 00:20:20,780 --> 00:20:23,651 This is like super powered control F, right? 248 00:20:23,651 --> 00:20:27,972 They can go out and find all the clauses that they want. 249 00:20:28,012 --> 00:20:29,436 Really writing a few rules. 250 00:20:29,436 --> 00:20:32,013 I don't want to diminish the work they've done because it's incredible. 251 00:20:32,013 --> 00:20:36,594 And if people haven't read the report, it's worth a read. 252 00:20:36,875 --> 00:20:43,196 You now can catch up, I think, with a lot of these people who have been doing labeled data for the years. 253 00:20:43,196 --> 00:20:49,268 so don't throw it away, but maybe focus your efforts on things like that. 254 00:20:49,737 --> 00:20:51,298 Yeah, I've got the report. 255 00:20:51,298 --> 00:20:52,558 I think you shared it with me. 256 00:20:52,558 --> 00:20:53,849 It is interesting. 257 00:20:53,849 --> 00:21:00,353 It's 50 pages and I have not, I've just kind of skimmed, but it is very interesting. 258 00:21:00,353 --> 00:21:08,847 you know, one thing that you'd mentioned earlier, you talked about business versus practice of law use cases. 259 00:21:08,847 --> 00:21:14,230 And, you know, I have a pretty strong opinion on that as well. 260 00:21:14,230 --> 00:21:19,603 I really feel like law firms should be focused on an incremental 261 00:21:19,879 --> 00:21:25,114 strategy or an incremental implementation to an AI strategy. 262 00:21:25,114 --> 00:21:35,363 And I do feel like the cost benefit ratio or the risk reward, however you want to frame it up, on the business of law side, works out a little better at the moment. 263 00:21:35,363 --> 00:21:40,197 And on the risk side, within the practice of law world, you've got a number of issues. 264 00:21:40,197 --> 00:21:41,669 You've got privacy. 265 00:21:41,669 --> 00:21:45,912 You've got client restrictions on generative AI use. 266 00:21:46,789 --> 00:21:58,953 And I think probably the biggest risk that doesn't get talked about enough is lawyers have a very low tolerance for missteps and wasting their time and rolling something out before 267 00:21:58,953 --> 00:22:06,755 it really is battle tested and has a clear ROI and can let allow them to leverage time. 268 00:22:06,755 --> 00:22:08,475 I think is a big mistake. 269 00:22:08,675 --> 00:22:15,037 And, um, I've seen, I'm seeing it happen now, like with copilot, Microsoft copilot, for example, 270 00:22:15,177 --> 00:22:16,838 I'm not a fan at the moment. 271 00:22:16,838 --> 00:22:18,548 know that Microsoft will get it right. 
272 00:22:18,548 --> 00:22:21,199 think right now it needs a lot of work. 273 00:22:21,199 --> 00:22:30,001 It's I mean just you know, really bizarre challenges or I guess limitations with with copilot. 274 00:22:30,001 --> 00:22:32,302 So copilot has no no memory. 275 00:22:32,382 --> 00:22:42,895 So you know, even though it has vast access to vast troves of your writing when you when you draft in copilot or word, it doesn't leverage any of that. 276 00:22:43,278 --> 00:22:46,553 you basically have to upload a style document every 277 00:22:46,553 --> 00:22:53,528 when you're drafting and all of your, know, it has a very basic rag implementation where you can leverage three documents. 278 00:22:53,528 --> 00:22:55,470 They all have to be in one drive. 279 00:22:55,470 --> 00:23:01,804 And when you upload them into one drive, sometimes it takes up to 24 hours for them to show up for you to access. 280 00:23:01,804 --> 00:23:06,497 You basically throw a backslash in there, or maybe it's a forward slash to leverage the document. 281 00:23:06,497 --> 00:23:07,948 It's just not an efficient model. 282 00:23:07,948 --> 00:23:12,693 know Microsoft's going to get it right, but this is in my opinion, a beta beta product. 283 00:23:12,693 --> 00:23:15,293 and they're charging $30 a month for it. 284 00:23:15,293 --> 00:23:20,273 And all the marketing is selling firms and they're, I'm seeing it. 285 00:23:20,273 --> 00:23:21,103 They're pushing it out. 286 00:23:21,103 --> 00:23:25,053 In fact, it might, I don't know if it's, I can't remember the name of the firm. 287 00:23:25,053 --> 00:23:25,853 There are a couple. 288 00:23:25,853 --> 00:23:28,233 Clifford chance is one I know for sure. 289 00:23:28,233 --> 00:23:30,873 They, they released a case study. 290 00:23:30,873 --> 00:23:34,273 I have a lot of questions about the numbers in there. 291 00:23:34,273 --> 00:23:38,741 Um, you know, I think it was kind of co, uh, it was put together in 292 00:23:38,741 --> 00:23:40,182 collaboration with Microsoft. 293 00:23:40,182 --> 00:23:43,546 So I don't know if those numbers are optimistic or realistic, but I don't know. 294 00:23:43,546 --> 00:23:50,673 What is your, what is your take on business versus practice of law and where to start and that sort of stuff. 295 00:23:51,650 --> 00:23:52,511 Yeah, it's a tough question. 296 00:23:52,511 --> 00:23:55,263 mean, well, you're a gym guy, right? 297 00:23:55,263 --> 00:24:00,117 So losing fat and building muscle at the same time is just sort of how I see it. 298 00:24:00,117 --> 00:24:02,398 Those two things are really hard. 299 00:24:03,560 --> 00:24:14,288 But I suspect that the management of firms is such that the, you can divide and conquer to a degree. 300 00:24:15,069 --> 00:24:21,234 And if there are savings to be had in the back office business support functions, then 301 00:24:21,742 --> 00:24:27,542 you can use those savings to leverage up and pay up on the front office support stuff. 302 00:24:27,542 --> 00:24:30,982 I agree with you in many ways on the copilot stuff. 303 00:24:30,982 --> 00:24:37,542 don't have an intimate knowledge of it myself to that extent of using it. 304 00:24:37,542 --> 00:24:44,202 Albeit, what I would say is that will come as a package, I'm sure, with what Microsoft offers. 305 00:24:44,382 --> 00:24:48,452 And there will be ways and means, I'm sure, of using it in the right kind of way. 
306 00:24:48,452 --> 00:24:51,660 If it is of summarizing 307 00:24:51,688 --> 00:24:54,760 notes from meetings, that is useful, right? 308 00:24:55,621 --> 00:25:07,021 If you use it in such a way as you can engineer a series of small prompts that can generate a report for you that don't necessarily need a playbook sitting in the 309 00:25:07,021 --> 00:25:16,718 background, but you just ask a series of questions and chain them together of a document, and then you get a useful report out of it, that's a good use case, in my opinion. 310 00:25:17,319 --> 00:25:19,861 I'm sure there's plenty of people who could be doing on that. 311 00:25:21,014 --> 00:25:26,648 I guess I'm a little bit biased in that my personal preference is to try and the lawyers be more productive. 312 00:25:26,648 --> 00:25:28,139 That was my goal. 313 00:25:28,139 --> 00:25:32,382 IBM was certainly, we've done a lot of useful things in that space. 314 00:25:32,382 --> 00:25:36,876 We've done some projects with in-house legal as well. 315 00:25:36,876 --> 00:25:44,971 There was a case study we did with NatWest Bank, which is of the big banks over here in the UK where we help them ingest their own playbook. 316 00:25:46,032 --> 00:25:48,590 It was almost like a word plug-in where the 317 00:25:48,590 --> 00:25:56,710 model will read the playbook, it'll read the incoming clause, and it will make recommendations and all sorts of great stuff like that, like you can imagine. 318 00:25:57,390 --> 00:26:05,990 But we've been in international business machines, we've been working on the back office side of things for an awfully long time and whether that's the traditional model of 319 00:26:05,990 --> 00:26:10,990 outsourcing and now it's AI first business process outsourcing. 320 00:26:10,990 --> 00:26:18,224 So how can we move some work that is manual at the moment onto a model? 321 00:26:18,348 --> 00:26:24,622 That's an area that I think is really interesting and one I'm really keen to explore. 322 00:26:24,622 --> 00:26:37,959 You can imagine the potential use cases for things like generative AI in talent acquisition, the whole process of reviewing applications and arranging meetings and so on 323 00:26:37,959 --> 00:26:38,489 and so on. 324 00:26:38,489 --> 00:26:42,131 That's all well within the wheelhouse of what we have nowadays. 325 00:26:42,131 --> 00:26:46,253 Not all of it will be generative AI, of course, but a lot of it will be. 326 00:26:47,246 --> 00:26:52,146 I guess I see a lot of easy wins for the firms in the back office. 327 00:26:52,146 --> 00:27:02,726 And like your point earlier, you can't, I don't think too many of us are going to trust what the models produce straight out of the gate and send it to our client without it 328 00:27:02,726 --> 00:27:03,246 being checked. 329 00:27:03,246 --> 00:27:08,426 So there's always going to be that phrase of human in the loop for a while at least, right? 330 00:27:08,546 --> 00:27:15,849 It's great for an augmentation speeding up tool, but I see a lot of potential on the back office side of things. 331 00:27:15,849 --> 00:27:16,709 Yeah. 332 00:27:16,729 --> 00:27:24,395 Well, and that was one of the caveats in the Clifford chance study was it did a good job listing out some of the use cases. 333 00:27:24,395 --> 00:27:33,381 And one of them was summarization, but then it, the, the, you know, the asterisk was, but it, should still be manually reviewed. 334 00:27:33,381 --> 00:27:34,512 It's just like, wait a second. 
335 00:27:34,512 --> 00:27:36,723 So, or something along those lines. 336 00:27:36,804 --> 00:27:39,055 And it's just like, you're not saving me any time. 337 00:27:39,055 --> 00:27:44,917 If I have to go read the entire thread because I can't trust the technology to summarize and capture the main points. 338 00:27:44,917 --> 00:27:48,217 then it's not helping me or it's helping me minimally. 339 00:27:48,217 --> 00:27:49,397 And don't get me wrong. 340 00:27:49,397 --> 00:27:53,507 There's, use AI 10, 20 times a day. 341 00:27:53,507 --> 00:28:09,757 So I find a lot of really valuable use for it where I think I run into challenges mentally getting to a place where, all right, how are we going to calculate ROI on a implementation 342 00:28:09,757 --> 00:28:11,507 of a platform? 343 00:28:11,507 --> 00:28:13,845 Well, it's got us on the timekeeper side. 344 00:28:13,845 --> 00:28:16,206 It's got to save them time, right? 345 00:28:16,326 --> 00:28:24,550 And if there's manual checking that has to go in, how does that impact that ROI equation? 346 00:28:25,010 --> 00:28:29,612 For drafting, again, this is not just a co-pilot. 347 00:28:29,612 --> 00:28:36,656 mean, just in general, I think that, yes, there will have to be some manual oversight. 348 00:28:36,656 --> 00:28:38,796 The human's in the loop, to your point. 349 00:28:40,049 --> 00:28:46,396 On the summarization side, again, I think that I use it for summarization quite frequently, but for low risk things, right? 350 00:28:46,396 --> 00:28:53,453 Like honestly, I'm going to stick that, um, that AG report in and have Claude summarize it for me. 351 00:28:53,453 --> 00:28:56,174 And if it misses a couple of points, it's not the end of the world. 352 00:28:56,174 --> 00:29:03,302 But if I'm, if I'm a client facing thread that, you know, deals with a important matter, I'm not going to trust AI to summarize it. 353 00:29:03,302 --> 00:29:04,453 I'm going to read it. 354 00:29:04,674 --> 00:29:05,464 Yeah, absolutely. 355 00:29:05,464 --> 00:29:16,519 And I think a lot of firms are looking, I think, for new ways of doing, know, how can we use AI to open up new work methodologies and new work possibilities? 356 00:29:16,720 --> 00:29:24,603 I suppose the ideal scenario is you have an AI which is perfect and your clients just plug in and start getting what they need. 357 00:29:24,624 --> 00:29:28,189 And you have that dream scenario where you get paid while you're sleeping. 358 00:29:28,189 --> 00:29:30,356 know, everybody wants a bit of that, I think. 359 00:29:30,914 --> 00:29:34,157 Well, you've a long way to go before we get there. 360 00:29:34,157 --> 00:29:37,389 These models, you're going to have to be really sure that it's right. 361 00:29:37,389 --> 00:29:46,626 There are bound to be regulatory issues that people are going to have to grapple with, some of which you can probably navigate in terms of conditions, but probably not all. 362 00:29:46,807 --> 00:29:56,814 I see, though, the current state as still useful having the human in the loop in that, depending on how you structure the way you use the models, you could... 363 00:29:57,036 --> 00:30:05,331 collect an awful lot of ground truth data, which these firms may have currently unstructured sitting in their iManage account or wherever right now. 
364 00:30:05,392 --> 00:30:21,953 If you sort of move that to a new world of generative AI produced data, which you then validate or confirm is correct or wrong, you will over time build up quite a additional 365 00:30:21,953 --> 00:30:24,294 set of data against which you can quickly monitor. 366 00:30:24,294 --> 00:30:27,058 So when the models do improve and 367 00:30:27,058 --> 00:30:36,256 when workflows, et cetera, improve, if you've got the right governance in place that allows you to manage and monitor all of these different models, which people are 368 00:30:36,256 --> 00:30:42,310 eventually gonna build up to, then swapping in a better model should be simple. 369 00:30:42,571 --> 00:30:52,098 And then people may well get to a point where their accuracy levels are so high that they're happy to, I'd love some those to use the word risk it, but you know. 370 00:30:52,226 --> 00:30:57,427 But it's probably no more risky than a person, than a human being doing the work at a certain point. 371 00:30:57,427 --> 00:31:07,370 So I think if you get the governance right, that's going to be critical for a lot of firms, especially when they do start using a lot of agents, or sorry, rather, assistants. 372 00:31:07,370 --> 00:31:09,451 Maybe they will use a lot of agents too. 373 00:31:10,651 --> 00:31:16,572 Today it's possible, I think, that you can build up a lot of assistants that will do an awful lot of stuff for you. 374 00:31:16,913 --> 00:31:21,930 And although the time is not necessarily, the time saving is not necessarily what you hope for. 375 00:31:21,930 --> 00:31:24,254 It's not wasted in my view. 376 00:31:24,445 --> 00:31:25,305 Yeah. 377 00:31:25,325 --> 00:31:25,676 Yeah. 378 00:31:25,676 --> 00:31:33,342 And to be clear, it is blatantly obvious where the most bottom line impact is going to come from in terms of use cases. 379 00:31:33,342 --> 00:31:36,354 It clearly is on the practice of law side. 380 00:31:36,354 --> 00:31:49,523 The opportunity cost for time spent on anything other than delivering work product is obviously very high for a thousand dollar plus an hour timekeepers. 381 00:31:50,365 --> 00:31:51,187 just 382 00:31:51,187 --> 00:32:01,900 you know, having let's say KM for example, or marketing or finance, leveraging the tools, especially KM that's ultimately going to support the timekeepers in the, in, in probably 383 00:32:01,900 --> 00:32:13,433 either KM or innovation, designing the strategies, providing the support, having them familiar and in a place where they're using the technology every day seems, wise. 384 00:32:13,433 --> 00:32:19,845 But to your point, there are, there are, if you're looking for bottom line impact, it's on that side of the business. 385 00:32:20,927 --> 00:32:27,629 But you, you and I talked about like different segments, kind of like large, mid and small law. 386 00:32:27,629 --> 00:32:31,240 We can define that any way we want for me. 387 00:32:31,280 --> 00:32:42,123 When I think about it from a vendor perspective, like small law is anything a hundred attorneys and under again, everybody has different ways of, um, defining this mid law 388 00:32:42,123 --> 00:32:47,404 feels like a hundred to 500 attorneys and large law feels like 500 and up. 389 00:32:47,404 --> 00:32:51,045 Um, do you feel like there are different? 390 00:32:51,355 --> 00:32:59,399 value propositions in those different segments of the law firm world with respect to AI. 391 00:33:01,311 --> 00:33:03,061 Yeah, probably. 
392 00:33:03,321 --> 00:33:14,163 Although I would, I was, I personally think a lot of the difference of value proposition is down to the work that they do, maybe more so than the size of the firm. 393 00:33:14,243 --> 00:33:20,965 I think we may have been talking about this in the context of, of workflow and how we think AI is going to improve workflow. 394 00:33:20,965 --> 00:33:28,146 And again, anecdotally, I've heard a lot of lawyers say, know what I do is so specialized to you, you can't stick a workflow on it. 395 00:33:29,102 --> 00:33:31,503 I would disagree with that to a large extent. 396 00:33:31,503 --> 00:33:37,304 Anything that can write down into a set of rules can be automated. 397 00:33:38,205 --> 00:33:45,317 I see, over here we have some parts of the legal industry, conveyancing, wheel writing, probate. 398 00:33:45,317 --> 00:33:48,608 A lot of that is relatively formulaic. 399 00:33:48,608 --> 00:33:50,088 It's process driven. 400 00:33:50,088 --> 00:33:54,369 To some degree, entry level sort of debt recovery litigation work. 401 00:33:54,389 --> 00:33:56,670 That is to a large extent. 402 00:33:57,176 --> 00:33:57,947 form-filling. 403 00:33:57,947 --> 00:34:02,410 It isn't always small firms that do those, it tends to be. 404 00:34:03,012 --> 00:34:08,716 I think they can get an awful lot out of old school AI automation products. 405 00:34:09,898 --> 00:34:18,305 The new generative AI stuff, I guess for now, is probably within the domain of the bigger firms. 406 00:34:19,727 --> 00:34:22,489 It's difficult to tell, to be perfectly honest with you, it's... 407 00:34:23,382 --> 00:34:31,447 I think the small firms can certainly benefit from generative AI, but whether they need it or not, I'm not convinced entirely. 408 00:34:32,068 --> 00:34:35,670 It just depends, I think, on how much they're following a formula. 409 00:34:35,989 --> 00:34:36,539 Yeah. 410 00:34:36,539 --> 00:34:44,029 Where I see the difference and maybe this is, this is subtle is that the clients that these different size firms serve. 411 00:34:44,029 --> 00:34:44,769 Right. 412 00:34:44,769 --> 00:34:55,849 So, you know, in the a hundred attorney and under in the small law space, for example, you have customers like my company and you know, we don't have outside council guidelines with 413 00:34:55,849 --> 00:35:01,745 restrictions about use on AI on our stuff and you know, big law and 414 00:35:01,745 --> 00:35:08,648 especially in the financial services world or really any firm that caters to heavily regulated industries. 415 00:35:08,869 --> 00:35:11,410 There's a lot that goes into that. 416 00:35:11,891 --> 00:35:17,933 So I feel like there's a ton of opportunity on the small, smaller end of the spectrum. 417 00:35:17,994 --> 00:35:25,098 And then conversely, you know, a small law firms not buying Harvey, right? 418 00:35:25,098 --> 00:35:27,479 They're not even in the target market. 419 00:35:27,479 --> 00:35:31,461 It's, they probably wouldn't even be able to get a demo. 420 00:35:31,586 --> 00:35:32,991 Correct, yeah. 421 00:35:33,462 --> 00:35:39,264 So they, but they do have access to, you know, um, some of the paid consumer tools out there. 422 00:35:39,264 --> 00:35:49,368 Obviously they have access to co-pilot and I feel like a smaller law firm as well could be, um, nimble in their, in their rollout, right? 423 00:35:49,368 --> 00:35:57,011 Big firms have to do things in very formally and, um, strategically. 424 00:35:57,011 --> 00:36:00,753 So yeah, I, it's interesting. 
425 00:36:00,753 --> 00:36:01,505 Um,
426 00:36:01,505 --> 00:36:11,302 I think the clients that the law firms serve are also maybe going to have some influence until these tools get to a place that they're widely available to all ends of the
427 00:36:11,302 --> 00:36:12,492 spectrum,
428 00:36:12,685 --> 00:36:19,687 and, you know, the outside counsel guidelines aren't restrictive like they are in some cases now.
429 00:36:19,700 --> 00:36:21,060 Yeah, I think you're right.
430 00:36:21,060 --> 00:36:22,651 The clients are going to influence a lot.
431 00:36:22,651 --> 00:36:36,582 And funnily enough, I came across a very interesting case study internally not that long ago where we'd done a generative AI powered bot, customer facing, it's probably not right
432 00:36:36,582 --> 00:36:40,956 to call it a bot, you know, a customer chat interface for banks.
433 00:36:40,956 --> 00:36:48,178 And we've done it for a few banks, and some of the really big ones too, their customer complaints are dealt with largely through that.
434 00:36:48,178 --> 00:36:49,858 This pushes a lot of
435 00:36:50,170 --> 00:36:59,725 work away from, in our case, that's, you know, the legal people who would be very expensive when maybe you've got names and whatever it is, you know, you can do a lot more
436 00:36:59,725 --> 00:37:03,016 with a lot less in that sense, and people get much faster responses.
437 00:37:03,016 --> 00:37:12,210 And I think a younger generation is going to be perfectly at ease dealing with a, you know, a chat interface, if they get the answer they want, as long as you can do it
438 00:37:12,210 --> 00:37:13,141 reliably.
439 00:37:13,141 --> 00:37:16,832 I've been thinking about how does that apply to legal?
440 00:37:16,896 --> 00:37:22,368 In my old world, there is no way that a lot of the clients I used to work for are going to be happy with that.
441 00:37:22,368 --> 00:37:28,836 They're going to email me, or me in my previous role, and say, I want the answer to this, or I've got a new job for you for this.
442 00:37:29,177 --> 00:37:37,744 So it doesn't immediately translate, albeit to the point about the smaller businesses, a lot of them probably can do that now.
443 00:37:37,744 --> 00:37:40,757 What's the update on my house acquisition right now?
444 00:37:40,757 --> 00:37:43,269 A lot of people won't care who they're dealing with.
445 00:37:43,269 --> 00:37:44,770 They'll just want to know,
446 00:37:44,800 --> 00:37:46,791 why haven't I had an answer on this for a week?
447 00:37:46,791 --> 00:37:47,732 What's going on?
448 00:37:47,732 --> 00:37:50,991 And, okay, there are some things to work through there.
449 00:37:50,991 --> 00:37:54,325 But what do you give the model access to in order to give them the answer?
450 00:37:54,325 --> 00:37:58,017 Because I'm sure there'll be bits and pieces of information you won't want to expose.
451 00:37:58,017 --> 00:38:05,781 Again, it's a sort of, make sure you dot the i's and cross the t's and your governance is all done correctly.
452 00:38:05,982 --> 00:38:11,305 But actually inside of big law too, I think you can apply that, maybe to the lawyers.
453 00:38:11,305 --> 00:38:14,126 If you treat the lawyer
454 00:38:14,248 --> 00:38:17,040 as a client of your back office function.
455 00:38:17,040 --> 00:38:18,320 And we do this internally.
456 00:38:18,320 --> 00:38:19,431 We call it client zero.
457 00:38:19,431 --> 00:38:22,263 You know, we do everything to ourselves first.
458 00:38:22,263 --> 00:38:30,907 So we have a, uh, an Ask IBM system where if I need something from HR or IT, I just ask through the system.
459 00:38:30,907 --> 00:38:34,069 And by and large, I get the answer without bothering anyone.
460 00:38:34,069 --> 00:38:40,473 So I think that kind of thing could be rolled out in different ways across large and small.
461 00:38:40,473 --> 00:38:43,134 Um, at least that's my hope.
462 00:38:43,571 --> 00:38:44,181 Yeah.
463 00:38:44,181 --> 00:38:45,292 Now that makes sense.
464 00:38:45,292 --> 00:38:53,935 You know, so we have rolled out, at a firm that's probably maybe just over the mid-law threshold.
465 00:38:53,935 --> 00:38:59,037 So we're an intranet extranet company and we work exclusively with law firms.
466 00:38:59,037 --> 00:39:05,490 We don't have any customers outside of the law firm world, not even on the in-house counsel side of the table.
467 00:39:05,490 --> 00:39:13,133 And, um, for one of our clients, we built an internal-facing chatbot where they can ask policy questions
468 00:39:13,929 --> 00:39:15,574 into a chat interface
469 00:39:15,574 --> 00:39:16,175 on their intranet.
470 00:39:16,175 --> 00:39:23,879 So this could be things like, how much time do they have left in their PTO allocation?
471 00:39:23,879 --> 00:39:30,703 What is their ethical threshold for, you know, um, vendor gifting?
472 00:39:30,703 --> 00:39:34,766 What is their laptop reimbursement policy? Any policy question.
473 00:39:34,766 --> 00:39:35,456 And you know what?
474 00:39:35,456 --> 00:39:37,567 It's gone over really well.
475 00:39:37,567 --> 00:39:40,408 Um, it has internally,
476 00:39:41,697 --> 00:39:42,348 we're finding.
477 00:39:42,348 --> 00:39:48,373 So this system is maybe three months deployed and they can't wait to increase the scope.
478 00:39:48,373 --> 00:40:00,573 They're taking an incremental strategy to this, but even with busy lawyers who, again, have a low tolerance for BS and for talking to a chatbot, they found they're getting really good
479 00:40:00,573 --> 00:40:01,773 adoption.
480 00:40:02,054 --> 00:40:08,830 I think the key there is this is a highly curated dataset and the performance is excellent.
481 00:40:08,830 --> 00:40:10,741 Like you get back good answers,
482 00:40:10,741 --> 00:40:11,721 because it's been tested.
483 00:40:11,721 --> 00:40:13,944 It's a small corpus of data.
484 00:40:13,944 --> 00:40:23,313 We've been able to, well, they've done the testing to make sure that questions get answered, you know, and they've got a little thumbs up, thumbs down.
485 00:40:23,313 --> 00:40:33,111 So someone can rate the response, and they dig in and they do the work when they get a thumbs down; they figure out why and how they can do better next time.
486 00:40:33,111 --> 00:40:37,520 So I think there's real opportunity for that in legal.
487 00:40:37,520 --> 00:40:38,100 Absolutely.
488 00:40:38,100 --> 00:40:46,465 I mean, I couldn't agree more. If I was a CEO of a big law firm, I think I'd be saying, where can I apply this in a very safe environment?
489 00:40:46,625 --> 00:40:53,158 It does matter if they get it wrong, because you'll annoy your internal people who you're trying to keep happy, and recruitment,
490 00:40:53,158 --> 00:40:55,370 it's hard enough and you don't want to make it worse.
491 00:40:55,370 --> 00:40:56,130 But
492 00:40:56,642 --> 00:40:58,644 I think it's a big, big opportunity.
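The policy chatbot Ted describes reduces to a small pattern: retrieve from a curated policy corpus, answer only from what was retrieved, and log every thumbs-up or thumbs-down so the misses get reviewed. The Python below is a rough sketch of that loop, not the actual product discussed in the episode; the snippet contents, the keyword-overlap retriever, and the FeedbackLog class are illustrative stand-ins.

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import List


@dataclass
class PolicySnippet:
    title: str  # e.g. "PTO allocation", "Laptop reimbursement"
    text: str


@dataclass
class FeedbackLog:
    """Thumbs-up / thumbs-down ratings, so every thumbs-down can be reviewed and fixed."""
    entries: List[dict] = field(default_factory=list)

    def record(self, question: str, answer: str, thumbs_up: bool) -> None:
        self.entries.append({"question": question, "answer": answer, "thumbs_up": thumbs_up})

    def needs_review(self) -> List[dict]:
        return [entry for entry in self.entries if not entry["thumbs_up"]]


def retrieve(question: str, corpus: List[PolicySnippet], top_k: int = 2) -> List[PolicySnippet]:
    """Crude keyword-overlap retrieval over a small, curated policy corpus."""
    question_words = Counter(question.lower().split())

    def overlap(snippet: PolicySnippet) -> int:
        return sum((question_words & Counter(snippet.text.lower().split())).values())

    return sorted(corpus, key=overlap, reverse=True)[:top_k]


# Example: answer a PTO question from the curated corpus, then record the lawyer's rating.
corpus = [
    PolicySnippet("PTO allocation", "Associates accrue 25 days of PTO per calendar year."),
    PolicySnippet("Laptop reimbursement", "Laptops are refreshed every three years through IT."),
]
log = FeedbackLog()
question = "How much PTO time do I have left this year?"
hits = retrieve(question, corpus)
answer = hits[0].text if hits else "No matching policy found."
log.record(question, answer, thumbs_up=True)
```

A production version would presumably swap the keyword overlap for embedding search and have a language model phrase the reply, but the curated corpus and the thumbs-down review queue are what the conversation credits for the good answers.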
493 00:40:58,644 --> 00:41:07,782 And also for the lawyers out there who unfortunately every now and again still have to manually print their billing guides and then walk it around to the partner and decide it,
494 00:41:07,782 --> 00:41:09,073 et cetera, et cetera.
495 00:41:09,073 --> 00:41:16,759 There's a whole bunch of process there that could be looked at and automated and just improved significantly.
496 00:41:16,759 --> 00:41:19,942 And you then get an awful lot of time back.
497 00:41:19,942 --> 00:41:25,516 So to your point about searching for something, I mean, if somebody's got to go onto an intranet site manually,
498 00:41:25,560 --> 00:41:28,871 try and find it, even locating the right document can be hard.
499 00:41:28,871 --> 00:41:35,474 And a lot of policies, as a person who hasn't written too many of them, they all look and sound the same.
500 00:41:35,474 --> 00:41:39,586 And I don't want to have to read through it to know where I'm going.
501 00:41:39,586 --> 00:41:46,318 I mean, when I had my first child, it took me forever to figure out how much paternity leave I was going to get.
502 00:41:46,979 --> 00:41:54,922 And that's an hour or whatever, maybe an hour and a half of billing time that I lost because I was too busy trying to figure out,
503 00:41:55,106 --> 00:41:58,368 well, what am I gonna do when this child arrives?
504 00:41:58,609 --> 00:42:03,813 And it really should have been a simple type-it-into-a-chat-interface thing, you know, what do I do about my first child?
505 00:42:03,813 --> 00:42:05,795 And it presumably would have told me.
506 00:42:05,795 --> 00:42:08,610 So I think you've hit the nail on the head.
507 00:42:08,610 --> 00:42:08,870 Yeah.
508 00:42:08,870 --> 00:42:10,531 I think it's going to be really interesting.
509 00:42:10,531 --> 00:42:12,952 And again, that's kind of internal facing.
510 00:42:12,952 --> 00:42:15,553 I'll still call that a business of law.
511 00:42:15,553 --> 00:42:18,594 It touches the timekeepers, but it's a business of law use case.
512 00:42:18,594 --> 00:42:28,158 Um, what about the, uh, you and I talked about the lone wolf mindset of lawyers and its impact on technology implementation.
513 00:42:28,158 --> 00:42:32,820 I mean, this is a, it's a well documented, um, you know, Dr.
514 00:42:32,820 --> 00:42:34,771 Larry Richard has
515 00:42:35,701 --> 00:42:48,180 studied tens of thousands of lawyers, and in his book, Lawyer Brain, he talks about how he ranks lawyers on several personality traits, one of which is autonomy, and they are off
516 00:42:48,180 --> 00:42:48,820 the chart.
517 00:42:48,820 --> 00:42:50,121 I forget what the number is.
518 00:42:50,121 --> 00:42:54,334 I think it's like, you know, in the 70th, 80th percentile, whatever the number.
519 00:42:54,334 --> 00:42:56,456 So they kind of have this lone wolf mentality.
520 00:42:56,456 --> 00:43:04,161 How do you feel that that impacts the, you know, um,
521 00:43:04,667 --> 00:43:09,904 impacts technology implementation, especially when it comes to some of the stuff we're talking about here?
522 00:43:09,904 --> 00:43:12,391 Do you think there's an impact or no?
523 00:43:14,326 --> 00:43:17,609 I'll have to read those books, which I've not.
524 00:43:17,609 --> 00:43:23,213 But my gut instinct is yes, there's an impact.
525 00:43:25,336 --> 00:43:36,295 Even though law firms, in many ways, are big groups of partners, the way I look at it is very often you have a few really, really big partners that will make a lot of decisions
526 00:43:36,295 --> 00:43:37,065 and
527 00:43:38,304 --> 00:43:42,316 they may effectively be operating their own firm within a firm.
528 00:43:42,316 --> 00:43:44,887 And a lot of firms are structured that way, actually, frankly, aren't they?
529 00:43:44,887 --> 00:43:53,640 Let's say, you know, based out of Switzerland and all these different vereins that are underneath them, or in some cases, they're somewhat like a franchise.
530 00:43:53,640 --> 00:43:58,142 So I think it can only, it must be true.
531 00:43:58,942 --> 00:44:00,323 And I think it's a bit of a shame.
532 00:44:00,323 --> 00:44:07,806 And I suppose when you are faced with a decision that, I,
533 00:44:07,950 --> 00:44:18,320 you know, I could take home a million dollars this year, or a million pounds in my case, not my personal case, that would be nice, yeah, or 900,000.
534 00:44:18,320 --> 00:44:19,480 I'll take the million.
535 00:44:19,480 --> 00:44:30,510 And I know I'm exaggerating the numbers a bit, but the idea of spending money on something which might help me five years down the line, maybe not that long, but it's gonna take a
536 00:44:30,510 --> 00:44:32,030 bit of time to play out.
537 00:44:32,030 --> 00:44:33,270 Maybe I won't do that.
538 00:44:33,270 --> 00:44:35,928 And I don't think, I think a lot of firms have come a long way.
539 00:44:35,928 --> 00:44:41,632 They've set up innovation teams, they've done a lot of good stuff to recognize the need to invest in the future.
540 00:44:41,632 --> 00:44:49,928 Also, people are living longer, partners hang around longer, they make partner early nowadays, and maybe they see the value in future investment.
541 00:44:49,928 --> 00:44:54,371 But yeah, there's definitely somewhat of a lone wolf mentality going on, I think.
542 00:44:54,371 --> 00:45:05,068 And I think you can probably point to, again, going back anecdotally, you hear stories about things being a really good fit
543 00:45:05,068 --> 00:45:10,914 and having been tested and gone through various layers of approval, and then all of a sudden certain things are no longer approved.
544 00:45:10,914 --> 00:45:17,010 And I think that's probably down to some people saying, I just don't see the advantage to this kind of thing for me.
545 00:45:17,010 --> 00:45:18,451 So let's not do it.
546 00:45:18,772 --> 00:45:20,713 I've heard stories along those lines.
547 00:45:22,069 --> 00:45:22,729 Yeah.
548 00:45:22,729 --> 00:45:23,049 Yeah.
549 00:45:23,049 --> 00:45:32,309 And your point about kind of the power structure in big law, I think, is also interesting.
550 00:45:32,309 --> 00:45:40,289 You know, a lot of people rise through the ranks in law firm leadership because they're the best at lawyering.
551 00:45:40,289 --> 00:45:40,889 You know what I mean?
552 00:45:40,889 --> 00:45:46,429 As opposed to being the best leader or being the most capable person to sit in that leadership seat.
553 00:45:46,429 --> 00:45:50,604 And then you also have another dynamic of, you know, retirement horizon.
554 00:45:50,604 --> 00:45:51,027 Yeah.
555 00:45:51,027 --> 00:45:58,939 You know, how close, because for most of those partners, their retirement horizon is in sight.
556 00:45:58,939 --> 00:46:05,181 So if it's three years and the break-even on a project is five, am I going to vote for it? No, I'm not.
557 00:46:05,782 --> 00:46:15,275 You know, it's, you know, law firms operate on a cash basis, and capital expenditures, um, don't really fit into that model.
558 00:46:15,275 --> 00:46:16,165 So
559 00:46:16,403 --> 00:46:19,107 Yeah, well this has been a really good conversation.
560 00:46:19,107 --> 00:46:22,582 Did you have some thoughts on that before we wrap up?
561 00:46:22,582 --> 00:46:32,322 I was only gonna say, just the final point for me is, I think a lot of firms have done really well just to set up innovation teams and hubs, allocate money like we do, right?
562 00:46:32,322 --> 00:46:36,292 We put our money into our pension, we never see it, it just is there for us for a rainy day.
563 00:46:36,292 --> 00:46:45,762 And I think a lot of firms have embraced that, and good on them for doing so, because they will need to. Let's be honest, we know that if you don't invest in the way
564 00:46:45,762 --> 00:46:50,278 you do your business in future, it's gonna start failing against the competitors that do so.
565 00:46:51,106 --> 00:46:52,836 That's it, really.
566 00:46:52,841 --> 00:47:02,325 Yeah, you know, there is a lot of, uh, real work in innovation and real investment in innovation in legal.
567 00:47:02,325 --> 00:47:05,706 But I would say, again, this is anecdotal.
568 00:47:05,706 --> 00:47:07,347 There's no way to measure this.
569 00:47:07,347 --> 00:47:09,618 There's probably...
570 00:47:09,618 --> 00:47:10,869 Well, I'll say it like this.
571 00:47:10,869 --> 00:47:13,710 There's a significant amount of innovation theater as well.
572 00:47:13,710 --> 00:47:21,877 Um, you know, at least in the US, there's people, you know, who I know, I've been selling into the legal, or the KM,
573 00:47:21,877 --> 00:47:25,947 space for a long time, before the word innovation existed as a role.
574 00:47:25,947 --> 00:47:34,377 And then all of a sudden I start seeing friends of mine, you know, who are in KM, and all of a sudden instead of the CKO, they're the CKIO.
575 00:47:34,377 --> 00:47:37,237 And I reach out and say, hey, how has your role changed?
576 00:47:37,237 --> 00:47:40,297 And you know, it hasn't.
577 00:47:40,577 --> 00:47:45,587 They want to, you know, they want to create the appearance of innovation, right?
578 00:47:45,587 --> 00:47:48,557 'Cause their, their clients want them to be more innovative.
579 00:47:48,557 --> 00:47:50,483 They want them to adopt
580 00:47:50,803 --> 00:47:54,375 new innovative ways of solving their problems.
581 00:47:55,557 --> 00:48:00,320 So yeah, there is, but again, not to downplay it, you're absolutely correct.
582 00:48:00,320 --> 00:48:04,323 I know really good innovation teams out there and there's plenty of them.
583 00:48:04,323 --> 00:48:12,989 It's just sometimes firms are taking the easy route of slapping innovation on some titles and calling it a day.
584 00:48:13,560 --> 00:48:23,266 Yeah, I think a lot of them have got an opportunity to buy stuff in, and I think it's a full time job just reading the legal press, trying to keep on top of what's out there.
585 00:48:23,266 --> 00:48:30,660 There's a whole lot of people coming out with GPT wrappers that, you know, pretend to do something.
586 00:48:30,860 --> 00:48:33,201 In many cases, they will do great things.
587 00:48:33,201 --> 00:48:35,423 In many cases, they will do average things.
588 00:48:35,423 --> 00:48:40,185 But if you're in that role, you've really got to have a look at everything.
589 00:48:41,046 --> 00:48:44,148 So that is a challenging, challenging job for sure.
590 00:48:44,148 --> 00:48:51,753 And I can see why, you know, I think some firms have really bought into it big, you know, they've appointed a new CIO, right?
591 00:48:51,753 --> 00:48:57,717 Like you said, it's a chief innovation officer right now, rather than just an information officer.
592 00:48:57,717 --> 00:49:06,203 And that's a big spend and a big commitment, and probably a necessary one with how much stuff there is out there to do.
593 00:49:06,203 --> 00:49:10,744 But from my side, I'm, you know, I'm hopeful that people will
594 00:49:10,744 --> 00:49:12,266 try different things.
595 00:49:12,266 --> 00:49:14,668 They'll do some innovation work themselves in-house.
596 00:49:14,668 --> 00:49:17,581 They'll have some people that can work with technology providers like me.
597 00:49:17,581 --> 00:49:18,942 That's what I'm here for.
598 00:49:19,644 --> 00:49:21,406 Use me for scaling stuff up.
599 00:49:21,406 --> 00:49:28,724 You know, when you get something that works and looks good, come and talk to me and I'll try and find the right people to, say, well, accelerate the growth of that that's
600 00:49:28,724 --> 00:49:29,134 working.
601 00:49:29,134 --> 00:49:31,636 And you know, if there's anything that's not working, ditch it.
602 00:49:31,967 --> 00:49:32,298 Yeah.
603 00:49:32,298 --> 00:49:34,834 Well, that's a good way to kind of tie a bow on this.
604 00:49:34,834 --> 00:49:41,831 How do people find out more about, um, IBM's offering and what you do?
605 00:49:41,888 --> 00:49:44,609 Yeah, well, there is a lot of offering, right?
606 00:49:44,609 --> 00:49:48,281 So the best thing to do is probably just to message me.
607 00:49:48,281 --> 00:49:50,422 LinkedIn is the right place, I suspect.
608 00:49:50,422 --> 00:49:53,133 A lot of people get my name wrong.
609 00:49:53,133 --> 00:49:58,485 It's N-E-I, two Ls, for Neill, unusual, but I can't help that.
610 00:49:59,226 --> 00:50:01,407 So it's Neill Pemberton on LinkedIn.
611 00:50:01,947 --> 00:50:03,948 Just Google the name, you'll find it.
612 00:50:04,028 --> 00:50:06,640 And have a look around the IBM website.
613 00:50:06,640 --> 00:50:07,864 There is a whole
614 00:50:07,864 --> 00:50:09,415 treasure trove of information on that.
615 00:50:09,415 --> 00:50:15,558 And as I said, there's a lot of open source stuff so people can go and try it and just see what it's like.
616 00:50:15,558 --> 00:50:26,604 And as we talked about a little bit earlier, there's a lot of YouTube videos that IBM do as well that will explain all kinds of different things.
617 00:50:26,604 --> 00:50:28,645 We didn't even talk about agents today.
618 00:50:28,645 --> 00:50:33,588 We could do a whole session on something like that and the applicability of agents to legal work.
619 00:50:33,588 --> 00:50:35,029 Another conversation.
620 00:50:35,561 --> 00:50:36,355 Yeah.
621 00:50:37,132 --> 00:50:39,437 LinkedIn, IBM website, YouTube.
622 00:50:39,437 --> 00:50:41,353 I think those are probably good places to go.
623 00:50:41,353 --> 00:50:41,583 Yeah.
624 00:50:41,583 --> 00:50:47,637 And we'll post links in the show notes to help guide people in the right direction.
625 00:50:47,637 --> 00:50:58,373 And yeah, I would love to stay in touch and maybe have you on sometime in the future to talk about, you know, some of the new work that, you
626 00:50:58,373 --> 00:51:05,087 know, big vendors and leaders, technology-leading companies like IBM, are doing in this space.
627 00:51:05,087 --> 00:51:07,989 So, um, let's keep in touch.
628 00:51:08,444 --> 00:51:09,605 Yeah, will do.
629 00:51:09,703 --> 00:51:11,557 Awesome, well, I appreciate your time here.
630 00:51:11,557 --> 00:51:14,440 Have a great weekend and we will chat again soon.
631 00:51:15,604 --> 00:51:16,965 Alright, thanks Neill.
