Dr. Anna Popowicz-Pazdej and Maz Araghrez

In this episode, Ted sits down with Dr. Anna Popowicz-Pazdej, Global Senior Data Privacy Lawyer at Dentons, and Maz Araghrez, Director of IT at Dentons, to discuss how AI is reshaping law firm operating models, governance, and risk management. From rethinking ROI in early-stage AI adoption to addressing cybersecurity and data protection at scale, Anna and Maz share their combined expertise across privacy, technology, and firm-wide transformation. Grounded in real-world experience, this conversation challenges law firms to move beyond experimentation and build sustainable, secure foundations for AI-enabled legal practice.

In this episode, Dr. Anna Popowicz-Pazdej and Maz Araghrez share insights on how to:

  • Rethink traditional law firm operating models in the age of AI
  • Approach AI investment with a long-term transformation mindset rather than short-term ROI
  • Build governance structures before scaling AI and new technologies
  • Strengthen cybersecurity and data protection as AI adoption increases
  • Foster collaboration between legal, IT, and leadership teams to drive meaningful change

Key takeaways:

  • ROI is often the wrong metric in the early stages of AI adoption
  • Governance and operating models must evolve before technology can scale
  • Cybersecurity and risk management are foundational, not optional
  • Collaboration across disciplines is essential for successful legal transformation
  • Law firms are data rich but often lack the structures to use that data safely and effectively

About the guest, Dr. Anna Popowicz-Pazdej

Dr. Anna Popowicz-Pazdej is a Global Senior Data Protection Lawyer at Dentons, holding a PhD in privacy and data protection and certified as a CIPP/E privacy expert. She advises on complex data protection matters and lectures extensively on privacy, cybersecurity, artificial intelligence, and IT technologies, combining deep legal expertise with a strong understanding of emerging tech. A frequent international speaker and member of the International Neural Network Society, Anna is recognized as a rising leader at the intersection of AI, privacy, and regulation.

With the use of technological advancements, with cloud, with processing, with AI, law firms really need to think about security, because more and more often, fines will be imposed on law firms.

About the guest, Maz Araghrez

Maz Araghrez is a business technology consultant with deep experience delivering technology transformation and service improvement programs in large, complex legal environments. He works closely with law firms to modernize operations, strengthen cross-functional collaboration, and align technology strategy with business objectives.

I would urge anyone who wants to take some of this seriously and practically: Google the Operating Model Canvas and the Business Model Canvas, and please start with those two things when you go into transformation, improvement, AI adoption, whatever it may be. You don't have to go deep. A one-page plan is better than no plan. Start there.

Machine Generated Episode Transcript

1 00:00:02,582 --> 00:00:05,153 Maz, Anna, how are you today? 2 00:00:06,876 --> 00:00:08,899 All right, Ted, very well, thank you. 3 00:00:09,858 --> 00:00:12,451 Good, excellent. 4 00:00:12,451 --> 00:00:14,354 this episode's been a long time in the making. 5 00:00:14,354 --> 00:00:20,332 I think we first started having conversations about this shortly after Ilt. 6 00:00:20,332 --> 00:00:22,314 It's been a couple months, right? 7 00:00:23,504 --> 00:00:25,555 Good things come to those who wait, Ted, right? 8 00:00:25,555 --> 00:00:28,005 oh 9 00:00:28,005 --> 00:00:28,986 it's a great agenda. 10 00:00:28,986 --> 00:00:31,349 I know it's going to be a really good conversation. 11 00:00:31,430 --> 00:00:39,200 But before we jump into the agenda, let's just do some quick introductions, kind of who you are, what you do, where you do it. 12 00:00:39,261 --> 00:00:40,852 Maz, why don't we start with you? 13 00:00:41,040 --> 00:00:41,680 Yeah, thank you, Ted. 14 00:00:41,680 --> 00:00:43,001 Appreciate it. 15 00:00:43,001 --> 00:00:45,003 I am a technology consultant. 16 00:00:45,003 --> 00:00:53,569 I've got experience in delivering technology transformation and generally service improvement programs in typically large and complex legal environments. 17 00:00:53,589 --> 00:00:59,373 I typically advise firms on modernization operations and strengthen and sort of the cross-functional performance. 18 00:00:59,513 --> 00:01:03,216 I am pivoting towards the SaaS space in January 2026. 19 00:01:03,216 --> 00:01:07,249 I'm joining the startup to help build up the Canadian capability. 20 00:01:07,249 --> 00:01:09,570 So I'm really excited for that. 21 00:01:10,615 --> 00:01:11,923 Awesome, Anna? 22 00:01:12,340 --> 00:01:23,069 So I'm a global privacy lawyer uh engaged in various privacy matters, but on a daily basis, I'm onboarding vendors, basically analyzing contracts. 23 00:01:23,249 --> 00:01:29,814 Particularly, I'm focusing on onboarding uh AI models, AI-related vendors. 24 00:01:30,075 --> 00:01:35,982 So on a daily basis, working with different teams, IT, security, and... 25 00:01:35,982 --> 00:01:48,978 so that my advice is the best one and to the point analyzing data flows, architecture diagrams, really to take into account all the security related knowledge as well. 26 00:01:49,259 --> 00:01:59,924 Because apart from working at Dentons, I'm also a PhD providing lectures on IT technology and cybersecurity for lawyers at my university in Poland. 27 00:01:59,924 --> 00:02:05,740 So basically I'm trying on a daily basis combining practical and theoretical knowledge to provide the best. 28 00:02:05,740 --> 00:02:06,994 advice again. 29 00:02:07,544 --> 00:02:13,797 Well, there's a lot of work to do with respect to InfoSec and AI. 30 00:02:13,978 --> 00:02:26,405 I spent years in risk management roles at Bank of America, both in corporate audit, consumer risk management. 31 00:02:26,625 --> 00:02:32,429 I was in compliance, anti-money laundering, and we always had that risk management lens. 32 00:02:32,429 --> 00:02:35,110 And when I look at AI through that lens, 33 00:02:35,392 --> 00:02:41,485 I see lots of opportunities um that need to be addressed. 34 00:02:41,485 --> 00:02:43,146 I'm sure you have your work cut out for you. 35 00:02:43,146 --> 00:02:53,160 um Well, the last time that we spoke and we were kind of working through the agenda, you guys had mentioned, and you used to work together at Denton's, correct? 36 00:02:55,451 --> 00:02:59,244 Yes, I've been to October of this year. 
37 00:02:59,244 --> 00:03:00,154 OK. 38 00:03:00,235 --> 00:03:16,302 And um you had mentioned the target operating models for legal transformation and how one of the things we discussed was how technology alone doesn't equal transformation. 39 00:03:16,423 --> 00:03:23,656 Maz, why don't you kick us off with what you meant by that and um your general take on that topic. 40 00:03:24,633 --> 00:03:25,443 Yes, appreciate it. 41 00:03:25,443 --> 00:03:28,394 Thank you, Ted. 42 00:03:28,394 --> 00:03:31,915 I think law firms face a mismatch, right? 43 00:03:31,915 --> 00:03:43,008 So their operating model was designed and developed when success meant leveraging junior associates to really do repetitive work and then bill by the hour. 44 00:03:43,588 --> 00:03:46,809 But today, the client demands are shifting and changing. 45 00:03:46,809 --> 00:03:53,301 Technology is putting pressure on firms and alternative legal providers are continuing to grow. 46 00:03:54,374 --> 00:04:01,446 Competition is coming from those who are redesigning the operating model from scratch. 47 00:04:02,127 --> 00:04:07,609 Now the way that most firms respond to that is by hiring a person, is to put a body behind it. 48 00:04:07,769 --> 00:04:14,472 For example, since we brought the topic up, the role of chief AI officer or a director of AI, et cetera. 49 00:04:15,312 --> 00:04:23,455 I argue that you cannot modernize a 20th century operating model by bolting on 21st century tools. 50 00:04:25,103 --> 00:04:29,205 There was a really interesting uh piece of content I engaged with on LinkedIn just yesterday. 51 00:04:29,205 --> 00:04:40,242 Somebody read an article about the CSO role slowly dying away in some industries, in some organizations, much like the data officer role did. 52 00:04:40,242 --> 00:04:40,893 Why? 53 00:04:40,893 --> 00:04:46,716 Primarily because it was hard to measure ROI and demonstrate that ROI. 54 00:04:47,076 --> 00:04:49,398 Before joining the call today, I'd like to read it out. 55 00:04:49,398 --> 00:04:51,929 Actually, since we're on this topic, I Googled 56 00:04:52,239 --> 00:04:55,901 what KPIs and responsibilities the chief AI officer has. 57 00:04:56,822 --> 00:04:58,984 I'll just briefly touch on that. 58 00:04:58,984 --> 00:05:06,128 Responsible for overseeing enterprise-wide AI strategy, governance and implementation to drive innovation and business value. 59 00:05:06,569 --> 00:05:07,549 Great. 60 00:05:07,710 --> 00:05:14,514 KPIs, return on investment, revenue growth, cost savings, time to market, et cetera. 61 00:05:14,514 --> 00:05:16,636 You can put any role under that description. 62 00:05:16,636 --> 00:05:22,219 The CEO role, the CFO role, the data officer role, the technology guy, whatever. 63 00:05:22,583 --> 00:05:24,944 It is just generic stuff. 64 00:05:25,865 --> 00:05:28,346 You are not going to get competitive. 65 00:05:28,346 --> 00:05:30,026 You are not going to win in the market. 66 00:05:30,026 --> 00:05:31,457 You are not going to differentiate yourself. 67 00:05:31,457 --> 00:05:36,429 You're not going to be successful if all we do is put in a role behind the potential problem. 68 00:05:36,429 --> 00:05:46,603 And so what I'd really like to see, I guess, is for law firms to think about the operating model as it engages the market, because that's the only way that they can become 69 00:05:46,603 --> 00:05:50,455 competitive and crucially understand what an operating model means. 70 00:05:50,902 --> 00:05:51,742 Yeah.
71 00:05:51,742 --> 00:05:58,684 So you know what's interesting is I've been beating the drum on this whole ROI conversation around AI. 72 00:05:58,944 --> 00:06:15,219 I've got, I've got a take on this that may be a little controversial, but I think that in the early stages of this AI transformation, ROI is really the wrong metric to really even 73 00:06:15,219 --> 00:06:17,190 be having conversations about at this point. 74 00:06:17,190 --> 00:06:18,650 And the reason is, 75 00:06:19,820 --> 00:06:23,092 the ROI in the early stages is learning. 76 00:06:23,673 --> 00:06:26,754 It's really hard to put that into a spreadsheet. 77 00:06:26,754 --> 00:06:42,204 And if you over-index on generating ROI that has a numerical value at the bottom of your spreadsheet, you're not going to be able to articulate a way to get to a positive ROI 78 00:06:42,204 --> 00:06:43,145 today. 79 00:06:43,145 --> 00:06:45,406 So you really need to think about this, 80 00:06:46,958 --> 00:06:51,618 this transformation and the steps that you need to take, as an R and D exercise. 81 00:06:51,858 --> 00:07:04,778 You know, when Google allocated 20 % time, which was basically in the early 2000s, they allowed their engineering team to take one day a week and just do a pet project, something 82 00:07:04,778 --> 00:07:09,478 that could, and Gmail, all these things came out of that process. 83 00:07:09,478 --> 00:07:13,278 But if someone, if an executive were to go, well, what's the ROI on that? 84 00:07:13,278 --> 00:07:14,398 Well, it's zero. 85 00:07:14,398 --> 00:07:16,086 It's actually negative. 86 00:07:16,086 --> 00:07:25,033 And that's okay because this is an R and D exercise and we need to be thinking three, four, five, 10 years down the road. 87 00:07:25,414 --> 00:07:26,790 And, but you know what? 88 00:07:26,790 --> 00:07:30,158 I see firms all the time that are going, what's the ROI? 89 00:07:30,158 --> 00:07:35,903 And you know, the reality is AI may come in and actually displace some of your revenue. 90 00:07:35,903 --> 00:07:39,326 So you're going to have a negative number in that ROI column. 91 00:07:39,326 --> 00:07:45,150 But if you don't, if you don't take action, there's CONI, the cost of not investing. 92 00:07:45,440 --> 00:07:59,684 And that you could potentially put a real value to um because the enterprise value of your law firm is going to go to zero when we get to a place um however many years down the road 93 00:07:59,684 --> 00:08:06,296 where AI is delivering a significant portion of or managing a significant portion of legal service delivery. 94 00:08:06,296 --> 00:08:08,697 So, Anna, I don't know. 95 00:08:08,697 --> 00:08:10,207 What is your take on this? 96 00:08:10,207 --> 00:08:13,428 Are you aligned or do you see this differently? 97 00:08:13,952 --> 00:08:26,509 I align with you, Ted, because uh when you are starting investing in AI, you need to first think about important things like the training and then maybe measure just the 98 00:08:26,509 --> 00:08:36,604 effectiveness by taking maybe some samples of a couple of lawyers, people that are really using these tools effectively. 99 00:08:36,665 --> 00:08:41,197 And then maybe try to actually adjust certain metrics, 100 00:08:41,197 --> 00:08:45,050 but not basically ROI for a couple of years, as you mentioned. 101 00:08:45,050 --> 00:08:50,024 That will be very visible, as you pointedly highlighted.
102 00:08:50,024 --> 00:08:56,358 That will be very visible when you are losing with the competition when other law firms start implementing it. 103 00:08:56,358 --> 00:08:59,781 The lawyers are starting using it quite effectively. 104 00:08:59,781 --> 00:09:08,407 Maybe they are changing the pricing models because this is actually very important because now if you are charging for an hour it won't be the same. 105 00:09:08,419 --> 00:09:17,988 uh So you need to really think maybe about the outcome, charging per document for specific work rather than the pricing model. 106 00:09:18,469 --> 00:09:28,359 And then maybe if you gather all the data and through a couple of years, then you can actually translate it into the specific ROI. 107 00:09:29,162 --> 00:09:29,752 Exactly. 108 00:09:29,752 --> 00:09:30,683 Yeah. 109 00:09:30,683 --> 00:09:33,254 And it sounds like you guys are advocating. 110 00:09:33,254 --> 00:09:47,943 And I think what I'm advocating for is let's take a step back from how we traditionally evaluate and deploy technology in our environment, because it is reasonable. 111 00:09:47,943 --> 00:09:56,218 Like ROI is not an unreasonable thing to ask for, but given that we're like at the top of the first inning, 112 00:09:56,504 --> 00:09:59,475 That's an American baseball metaphor, Anna. 113 00:09:59,475 --> 00:10:06,558 um I'm not sure what the parallel would be in your world, but it's the beginning of this journey. 114 00:10:06,558 --> 00:10:17,723 And we're so early on that it's very difficult to, um again, place too much emphasis on this or it will paralyze us. 115 00:10:17,903 --> 00:10:24,546 And Maz, it sounds like you're also advocating a kind of back to basics, like target operating model. 116 00:10:24,546 --> 00:10:30,814 with governance, processes, culture, measures, like tell us, tell us about how you see that. 117 00:10:33,142 --> 00:10:33,844 Indeed. 118 00:10:33,844 --> 00:10:37,492 And if I may just cling on to something that was mentioned now. 119 00:10:39,714 --> 00:10:41,765 session both Anna and Ted yourself just had. 120 00:10:41,765 --> 00:10:46,227 think a lot of the time law firms will look outside and say, what is this law firm doing? 121 00:10:46,227 --> 00:10:47,367 What does that law firm do? 122 00:10:47,367 --> 00:11:00,683 But the R &D exercise is such a costly initial investment that potentially we should look at other sectors like the AstraZeneca, the medicine manufacturers, because their initial R 123 00:11:00,683 --> 00:11:02,723 &D outlay is very high. 124 00:11:02,723 --> 00:11:08,105 So they're used to this long-term vision, long-term thinking, long-term strategies. 125 00:11:10,906 --> 00:11:19,886 Whereas law firms, we may have a three to five year strategy, our annual budget cycle, the annual, our yearly, so every year it can change. 126 00:11:19,886 --> 00:11:28,346 So how could you innovate and experiment if every year you can potentially cut a chunk of what you've put towards achieving your objective? 127 00:11:28,586 --> 00:11:34,746 what I'm advocating is for, let's take a step back and analyze our business model first. 128 00:11:34,966 --> 00:11:36,046 Okay, so what is a model? 129 00:11:36,046 --> 00:11:38,819 A model is a visual representation of something, right? 130 00:11:38,819 --> 00:11:45,481 If you model a house, you can have a model house made out of clay or computer aided design, whatever it may be. 131 00:11:45,562 --> 00:11:48,563 And a business model is where you define your value proposition, right? 
132 00:11:48,563 --> 00:11:59,127 What product or service we're delivering, the value chain, how we're going to deliver that product or service, what the cost structure is, where the revenue is going to come from 133 00:11:59,127 --> 00:12:02,648 and how, what the distribution channel is done, et cetera, et cetera. 134 00:12:02,648 --> 00:12:07,694 That's important because when we think about alternative billing arrangements, that is the 135 00:12:07,694 --> 00:12:09,895 fundamental business model change. 136 00:12:10,455 --> 00:12:16,818 And we don't step back and look at that and how it will impact the organization and what needs to be done, how we structure it, restructure it. 137 00:12:16,994 --> 00:12:23,041 Once that front end is done, once the business model is in place, what you then say to yourself, how do I operationalize that? 138 00:12:23,041 --> 00:12:24,882 How do I make it a reality? 139 00:12:25,322 --> 00:12:26,793 Here comes the operator model. 140 00:12:26,793 --> 00:12:33,746 Again, the visual representation of how you're going to operate to achieve the things you put in and set out in your business model. 141 00:12:35,898 --> 00:12:38,358 And we don't do that today. 142 00:12:39,658 --> 00:12:44,778 Most organizations on planet Earth don't do that inherently, but particularly the law firm space. 143 00:12:44,778 --> 00:12:47,078 I've been a management consultant for a number of years. 144 00:12:47,078 --> 00:12:51,538 I've worked with professional services and legal firms, and we don't do that first step. 145 00:12:51,538 --> 00:12:56,698 You wouldn't make changes to a house if you didn't architect it first. 146 00:12:56,898 --> 00:13:01,838 You wouldn't put a rocket on a horseback and say, go faster. 147 00:13:02,398 --> 00:13:04,739 Sorry, Anna, you wanted to come in there. 148 00:13:04,739 --> 00:13:10,012 Yeah, because I'm also a lawyer and I'm talking with other lawyers from our organization. 149 00:13:10,012 --> 00:13:22,809 So what is also extremely important is that legal innovation team within the law firm will actually speak with the lawyers that are going to use it with the associates, senior 150 00:13:22,809 --> 00:13:25,050 associates, like partners. 151 00:13:25,050 --> 00:13:34,446 They will actually collaborate how the tool could be used because obviously lawyers are using certain tools, but a lot needs to be done in terms of the training. 152 00:13:34,446 --> 00:13:39,379 So providing specific training, we really can change the situation. 153 00:13:39,379 --> 00:13:47,965 And it doesn't matter how you onboard the vendor, like how you're thinking about all the security measures that you have in place. 154 00:13:47,965 --> 00:13:53,168 What really matters, especially in terms of ROE, is how the lawyers will be using it. 155 00:13:53,168 --> 00:14:00,883 Whether they know exactly to what kind of task the AI models could support them. 156 00:14:00,883 --> 00:14:04,565 Because this is where effectiveness comes from. 157 00:14:04,821 --> 00:14:10,100 they need to be aware of is the broad area of the usage. 158 00:14:11,636 --> 00:14:26,298 And you, when we spoke, you guys both mentioned like some practical lessons from like the service level framework and the support op model um in your experience. 159 00:14:26,399 --> 00:14:31,643 I don't know who wants to go first and share some of those lessons. 160 00:14:31,643 --> 00:14:35,506 What was learned through the process? 161 00:14:37,593 --> 00:14:38,793 That's a good question. 
162 00:14:38,793 --> 00:14:43,393 I'll pick up first, think Ted and I'll lean on Anna because I did some of the work. 163 00:14:43,393 --> 00:14:51,673 But before I do, I just wanted to touch on something Anna mentioned about, you know, the lawyer needs to understand what tasks need to be done or can be done by AI or whatever. 164 00:14:51,673 --> 00:14:58,613 And I think that's a very interesting point because I think it's again to step back. 165 00:14:58,613 --> 00:15:05,513 AI, because it's just the topic of everybody's discussion now, will potentially remove some tasks. 166 00:15:05,513 --> 00:15:07,057 So the question is, 167 00:15:07,673 --> 00:15:13,013 not only what tasks will AI do, do we need to do those tasks at all? 168 00:15:13,353 --> 00:15:23,213 as we deliver these services to our clients, as we transform the way we deliver legal services, do we need to do certain business processes, certain tasks? 169 00:15:23,273 --> 00:15:31,993 Obviously some of them you will do such as, I don't know, your customer and risk management onboard and et cetera, but do we just remove some completely? 170 00:15:31,993 --> 00:15:34,113 they still needed in this day and age? 171 00:15:34,113 --> 00:15:36,693 know, when people do transformation, I've always, 172 00:15:36,885 --> 00:15:41,687 almost always notice that they are bending technology to their business processes. 173 00:15:41,687 --> 00:15:44,648 And the business process was designed 10, 20 years ago. 174 00:15:45,128 --> 00:15:55,923 Why not redesign that as well and use the transformation as an opportunity to step back and ask yourself, can we do something better here before we put the technology underneath 175 00:15:55,923 --> 00:15:56,553 it? 176 00:15:58,318 --> 00:16:00,099 But that is a nice segue. 177 00:16:00,099 --> 00:16:08,804 So I've left Dentons now, but I've done this work with other, you you could take this example and lift and shift it to our professional services, including legal firms. 178 00:16:08,804 --> 00:16:16,028 The first thing I always do and I always find beneficial as a lesson learned is to identify and define expectations. 179 00:16:16,068 --> 00:16:24,813 When they say to me, we would like you to help us define and design an operating model, I ask them, what does an operating model mean to you? 180 00:16:24,813 --> 00:16:26,164 What are you expecting out of me? 181 00:16:26,164 --> 00:16:30,017 Because if it's just an org chart structure, it's a completely different thing. 182 00:16:30,177 --> 00:16:30,438 Right? 183 00:16:30,438 --> 00:16:39,065 And so to do an org model is as these ramifications, these benefits, these cons to just do org structure as this different expectations, different outputs. 184 00:16:39,065 --> 00:16:42,217 Almost always people think it's just an org structure thing with some processes. 185 00:16:42,217 --> 00:16:46,741 And when I redefine the problem and reframe it, whole expectation changes. 186 00:16:46,741 --> 00:16:52,054 And that's one thing I think that the immediate lesson learned from any organization. 187 00:16:52,836 --> 00:16:54,977 Law firms specifically, think, 188 00:16:55,575 --> 00:17:03,428 I've learned to also take a step back away into the business model because, and this is something you mentioned on LinkedIn a couple of days ago, Ted, I think you were talking 189 00:17:03,428 --> 00:17:05,489 about alternative business models. 
190 00:17:05,509 --> 00:17:16,754 Unlike in the US, I know in the UK we had legislation, I think, in 2007, so we have more uh freedom for non-lawyers to basically be partners and have ownership in legal firms. 191 00:17:16,754 --> 00:17:24,289 But the reason why I mention that is because when it came to decision-making of what an operating model should look like, it almost always 192 00:17:24,289 --> 00:17:30,395 reverted back to a number of partners who made the final decision. 193 00:17:30,395 --> 00:17:36,341 And no matter what we did through the design principles, sometimes that got overwritten. 194 00:17:36,341 --> 00:17:41,926 um Sometimes there was an explanation, sometimes there wasn't. 195 00:17:42,287 --> 00:17:47,392 And so we need to take a step back and really think about how law firms are run. 196 00:17:47,392 --> 00:17:49,060 If it's a Swiss verein, 197 00:17:49,060 --> 00:17:51,632 what are the decision-making frameworks and structures in place? 198 00:17:51,632 --> 00:17:55,476 If it's just a partnership model, what are the decision-making structures in place there? 199 00:17:55,476 --> 00:18:05,003 We need to have this governance piece in place first before we undertake any design work because we can spend six to 12 months doing it and then suddenly someone comes in and 200 00:18:05,003 --> 00:18:05,934 overturns everything. 201 00:18:05,934 --> 00:18:11,809 I think those are fundamentally the two key areas to keep in mind. 202 00:18:12,438 --> 00:18:14,939 Yeah, you know, just a comment on that. 203 00:18:15,059 --> 00:18:29,903 yeah, the post was really pointing out how poorly aligned the US law firm partnership model is to post-transformation big law. 204 00:18:29,943 --> 00:18:34,164 It's very poorly aligned for a whole multitude of reasons. 205 00:18:34,164 --> 00:18:41,314 Um, you know, internal firm compensation models, the client engagement model, um, 206 00:18:41,314 --> 00:18:57,503 the consensus driven decision making process that you just mentioned, cash basis accounting, which clears the books, law firm partnerships are optimized for profit taking. 207 00:18:58,344 --> 00:19:08,130 operating on a cash versus an accrual basis, it creates different um optimizations and 208 00:19:08,130 --> 00:19:17,738 the reality is that we need to accrue some year over year expense in order to allocate resources towards R &D. 209 00:19:17,890 --> 00:19:23,604 I think the reality, and I'm not sure law firms have really wrapped their heads around this yet, 210 00:19:23,604 --> 00:19:29,449 I think some have, is that you're not going to buy off the shelf tools to differentiate yourself. 211 00:19:29,449 --> 00:19:33,612 Like you buying Harvey or Legora, your competitor down the street can do that. 212 00:19:33,612 --> 00:19:36,044 How you're going to differentiate yourself 213 00:19:36,172 --> 00:19:44,504 in the post AI world is through all of these documents and knowledge that was used to deliver winning outcomes to your clients. 214 00:19:44,504 --> 00:19:49,586 How do you capitalize on that with off the shelf tools that are hosted? 215 00:19:49,586 --> 00:19:51,946 That's bringing your data to AI. 216 00:19:51,946 --> 00:20:02,989 I'm a proponent of bringing AI to your data because that's where all of that knowledge and all of those documents and all of that know-how exists. 217 00:20:03,049 --> 00:20:04,490 And Anna, you're shaking your head here. 218 00:20:04,490 --> 00:20:05,783 I'm assuming you agree.
219 00:20:05,783 --> 00:20:18,907 Yes, because, like, over the years we've been discussing this issue, you know, in our firm, because like law firms are specific and lawyers are specific, because all of our like 220 00:20:18,907 --> 00:20:29,430 invaluable precious thing is about like documents and knowledge, and lawyers are not really willing to share their knowledge, you know, even within like a global law firm. 221 00:20:29,430 --> 00:20:34,421 So that's the first thing, because I completely agree with you. 222 00:20:34,421 --> 00:20:46,019 It's because only when you actually add, through a retrieval augmented generation technique or another thing, so like when you add your documents to the processing and 223 00:20:46,019 --> 00:20:53,905 you will train maybe the specific model, maybe you fine tune it or you change the model in a way that it actually suits your needs. 224 00:20:53,905 --> 00:20:57,007 This is how the competitive advantage could start, right? 225 00:20:57,007 --> 00:21:02,211 So this is why it's extremely important to use the internal knowledge. 226 00:21:02,211 --> 00:21:03,049 I think 227 00:21:03,049 --> 00:21:13,595 that's the biggest, one of the biggest challenges in the operating model that we've faced and we are facing and that will be a problem. 228 00:21:13,595 --> 00:21:17,467 But hopefully the lawyers will need to understand this. 229 00:21:17,467 --> 00:21:19,028 That's the first thing. 230 00:21:19,028 --> 00:21:31,014 And the second thing I forgot to add, on the operating model, to the previous point, I think what is also important is not only what kind of tasks the AI will be used for, but also how to use it. 231 00:21:31,014 --> 00:21:32,888 So basically prompt engineering, 232 00:21:32,888 --> 00:21:36,709 to teach lawyers how to formulate the prompt. 233 00:21:36,709 --> 00:21:52,674 And maybe it seems to be a really em easy, straightforward exercise, but from my experience, it's really important to really get the point how to use it in order to get what you 234 00:21:52,674 --> 00:21:54,054 really need. 235 00:21:54,054 --> 00:21:56,355 And I'm using different tools on a daily basis. 236 00:21:56,355 --> 00:21:59,500 And I'm sometimes really uh 237 00:21:59,500 --> 00:22:12,440 astonished or even, you know, thrilled by what AI could actually do for me and how it could replace the existing tools that I have, right? 238 00:22:12,440 --> 00:22:16,343 With translation, with conversion, these kinds of things. 239 00:22:16,343 --> 00:22:19,085 I'm not always aware that I can use it, but I'm trying. 240 00:22:19,085 --> 00:22:26,011 So this is also maybe important, to try different things and learn how to uh utilize the AI. 241 00:22:26,011 --> 00:22:29,391 So that's also, I think, very important, to 242 00:22:30,007 --> 00:22:41,109 find the communication again between onboarding legal teams, innovation teams, and the lawyers that are actually using it in practice, because that's the most important thing. 243 00:22:42,438 --> 00:22:43,538 Agreed. 244 00:22:43,698 --> 00:22:51,981 you know, um given your cybersecurity focus and expertise, we definitely want to spend some time there. 245 00:22:52,001 --> 00:22:53,801 So let's talk about that a little bit. 246 00:22:53,801 --> 00:23:06,515 um You know, when we spoke last preparing for this call, you had mentioned the cybersecurity baseline as a prerequisite for data sharing and AI adoption. 247 00:23:06,515 --> 00:23:10,976 And I'm not sure that's the sequence 248 00:23:10,976 --> 00:23:14,229 of things, of how things are playing out today.
249 00:23:14,229 --> 00:23:26,679 It's a uh friend of mine, Tom Baldwin has a uh company called integrata and they, they do a lot of business of law rationalization. 250 00:23:26,679 --> 00:23:41,076 I'll call it and um getting, you know, his, his job out there and his objective is to get, get the house in order, get the foundation laid. 251 00:23:41,076 --> 00:23:45,651 upon which to build robust AI processes. 252 00:23:45,651 --> 00:23:50,795 And part of that is security, ethical walls, all of those sorts of things. 253 00:23:50,795 --> 00:23:58,823 But tell me, Anna, from your perspective, in terms of sequence, are we starting where we need to broadly? 254 00:23:58,823 --> 00:24:00,024 And I don't just mean at your firm. 255 00:24:00,024 --> 00:24:02,066 I mean, just as an industry. 256 00:24:02,467 --> 00:24:06,540 I'm not seeing as much dialogue about this as I would expect. 257 00:24:06,915 --> 00:24:10,558 Yes, so in my view, it's extremely important. 258 00:24:10,558 --> 00:24:19,223 Lawyers are custodians of very sensitive data, confidential data, legal privilege data, business information, trade secrets. 259 00:24:19,223 --> 00:24:24,547 So basically, are obliged, ethically obliged to protect the data. 260 00:24:24,547 --> 00:24:35,434 And if we are using the models that are not properly onboarded, then we can actually, our data could be leaked like very easily. 261 00:24:35,434 --> 00:24:37,035 The prompts like... 262 00:24:37,103 --> 00:24:44,629 could be everything what we are including in the props could be leaked to the general public knowledge. 263 00:24:45,731 --> 00:25:00,395 on boarding I couldn't like say more but it's like extremely important to actually think about uh safeguards that should be put in place uh during AI adoption. 264 00:25:00,395 --> 00:25:11,710 And it's not only about the reputation, it's not only even about the carrying of personal data for the client or confidential data, but it's also about compliance with the existing 265 00:25:11,710 --> 00:25:12,480 laws. 266 00:25:12,480 --> 00:25:21,144 In US, we have a CCPA, data protection laws, specifically because this is when the fines are really enormous. 267 00:25:21,424 --> 00:25:27,267 CCPA in US, GDPR in Europe, like all these laws mandate 268 00:25:27,267 --> 00:25:36,470 to implement specific security measures, to think about the technology from the perspective of so-called privacy by design. 269 00:25:36,470 --> 00:25:42,471 So from the very beginning to think about the security controls so that our data will not be leaked. 270 00:25:42,471 --> 00:25:50,043 Like we are using the system we start from the systems that's been uh operating on prem, right? 271 00:25:50,043 --> 00:25:51,474 Everything was on premise. 272 00:25:51,474 --> 00:25:54,554 We didn't want to go to the cloud because the data could be revealed. 273 00:25:54,554 --> 00:25:56,716 That was very important for the office. 274 00:25:56,716 --> 00:26:08,821 And now it's sometimes when I hear, for like smaller law firms, they do not think about like the AI usage when they actually put the data and they are copying to the internet. 275 00:26:08,821 --> 00:26:19,206 So I think it's extremely important to understand last week, I've been attending a conference in Warsaw organized by Bar Association. 276 00:26:19,206 --> 00:26:22,687 And I was really also astonished like how many... 277 00:26:22,759 --> 00:26:29,023 lawyers are not thinking about the usage of AI and not putting a specific security guard. 
278 00:26:29,023 --> 00:26:43,673 Law society in UK has indicated from the very beginning of AI adoption that security is one of the most important factors when uh implementing the AI models. 279 00:26:44,154 --> 00:26:51,359 And starting from security, maybe thinking about the certifications, these kind of things are like security baseline is 280 00:26:51,509 --> 00:26:53,298 enormously important. 281 00:26:54,094 --> 00:27:06,101 So, Moz, I spent 10 years at Bank of America, as I mentioned, in risk management roles, and there are four lines of defense, which are the line of business. 282 00:27:06,101 --> 00:27:15,606 So this would be the consumer bank, the investment bank, the wealth bank, the people doing the work on the front lines. 283 00:27:15,626 --> 00:27:22,722 The second line of defense is they have a dedicated risk management function with a separate reporting structure that, and, 284 00:27:22,722 --> 00:27:26,405 there are groups that are aligned to each line of business. 285 00:27:26,405 --> 00:27:36,691 So they oversee the, um, their LOP partners to make sure that they have the proper controls in place. 286 00:27:36,691 --> 00:27:39,033 Third line of defense is corporate audit. 287 00:27:39,033 --> 00:27:46,508 So internal audit comes and evaluates both the second and first line of defense and evaluates the control environment and looks for gaps. 288 00:27:46,508 --> 00:27:49,740 I, I sat in that seat for, for many years. 289 00:27:50,081 --> 00:27:52,738 And the fourth line of defense, my listeners have 290 00:27:52,738 --> 00:27:59,501 heard me say this before, that's the Wall Street Journal, because that's where you end up if the first three fail. 291 00:28:00,222 --> 00:28:08,787 And it's a very bad day when the fourth line of defense um is where you end up. 292 00:28:08,787 --> 00:28:16,891 So in legal, in law firms, we don't have these sorts of structures typically, at least I've never seen them. 293 00:28:17,152 --> 00:28:21,874 You've got an InfoSec team and an IT team, but there's not these 294 00:28:21,926 --> 00:28:26,149 layered um mechanisms to ensure. 295 00:28:26,149 --> 00:28:33,154 so in the third line of defense in corporate audit, we had frameworks like COBIT, control objectives for IT. 296 00:28:33,154 --> 00:28:47,343 And that was a framework we would use to, so very mature, like COSO, um ISACA, like all of these organizations had tools that we would leverage as a corporate audit function to 297 00:28:47,343 --> 00:28:50,265 ensure that we didn't end up in the Wall Street Journal. 298 00:28:50,265 --> 00:28:51,846 um 299 00:28:52,002 --> 00:28:57,852 This seems like a glaring gap in law firms, Maz. 300 00:28:57,852 --> 00:28:59,496 What's your take on that? 301 00:29:00,976 --> 00:29:05,838 I echo that completely and I'll lean on Anna in a minute, but I think so. 302 00:29:09,111 --> 00:29:15,291 Law firms are data rich and resource poor for the most part. 303 00:29:15,831 --> 00:29:25,691 And I have discussed it, but in Canada, you've got something like 35,000 law firms, in the US, you've got something like 430,000 law firms based on different stats. 304 00:29:25,691 --> 00:29:28,211 Most of them are probably small organizations, fine. 305 00:29:28,891 --> 00:29:36,161 But as soon as you work on a matter, even if you're a single person lawyer, that's sensitive confidential information, you could be doing work for the government. 306 00:29:36,161 --> 00:29:42,095 far as we know, So sensitive stuff can be as Anna mentioned, ah leaked. 
307 00:29:42,697 --> 00:29:45,119 Ted, you mentioned the business layer, right? 308 00:29:45,119 --> 00:29:48,581 I think that's the biggest risk profile, right? 309 00:29:48,581 --> 00:29:50,573 Us, and Anna has alluded to that. 310 00:29:50,573 --> 00:29:58,479 We've got ChatGPT and Claude AI, you pay 20 bucks a month, you can copy and paste whatever information you want there, it goes somewhere in the ether, nobody knows what 311 00:29:58,479 --> 00:29:59,080 happens to it. 312 00:29:59,080 --> 00:30:03,233 That risk profile has increased a thousandfold. 313 00:30:04,354 --> 00:30:05,415 And so, 314 00:30:06,784 --> 00:30:12,239 we can implement, and I guess we can implement frameworks, and we've been doing this for decades. 315 00:30:12,239 --> 00:30:13,170 We've still had issues. 316 00:30:13,170 --> 00:30:14,501 We still had data breaches. 317 00:30:14,501 --> 00:30:20,636 I think the bottom line is as long as humans are involved in a chain somewhere, there's always going to be a problem. 318 00:30:20,636 --> 00:30:22,077 It's how we minimize that. 319 00:30:22,077 --> 00:30:31,665 And today with those AI tools available to each single one of us at the click of a button, I don't think we've managed to figure out how we reduce that risk profile, whether it's 320 00:30:31,665 --> 00:30:33,587 through education or some other. 321 00:30:33,807 --> 00:30:35,378 Is that stuff stopping people copying things over? 322 00:30:35,378 --> 00:30:36,199 I don't know, right? 323 00:30:36,199 --> 00:30:37,910 We should discuss that. 324 00:30:39,492 --> 00:30:47,417 But all that to mention is that I think law firms are battling this problem alone right now. 325 00:30:47,478 --> 00:30:48,879 So Dentons is doing something. 326 00:30:48,879 --> 00:30:50,070 I know we have been doing. 327 00:30:50,070 --> 00:30:54,023 Other law firms I'm speaking to are doing this. 328 00:30:54,023 --> 00:30:55,984 Vendors are doing some stuff. 329 00:30:56,425 --> 00:30:59,107 But if we are sharing 330 00:30:59,485 --> 00:31:10,234 matters, if we are sharing courts, if we're sharing client information, if we're sharing vendors, if we're sharing infrastructure, if everything is shared at the front end, why is 331 00:31:10,234 --> 00:31:14,375 it at the back end we are not collaborating on something as uh a community? 332 00:31:14,375 --> 00:31:16,750 Why is everybody going at this alone? 333 00:31:16,750 --> 00:31:21,834 It's not a competitive advantage for you to have a robust infrastructure system and a policy in place. 334 00:31:21,834 --> 00:31:24,927 It is basic, you know, client confidence. 335 00:31:24,927 --> 00:31:27,967 That's your duty as a law firm to do. 336 00:31:27,967 --> 00:31:39,340 And so one thing that Anna and I are trying to, and we started this when I was at Dentons, and we will try to get the ball rolling, but it is to propose a 337 00:31:42,352 --> 00:31:52,583 platform or a system that is built by law firms for law firms, with the industry participating, and that helps every single law firm, small, medium or large, simply improve 338 00:31:52,583 --> 00:31:57,518 the infrastructure security posture with minimal effort, minimal financial costs. 339 00:31:57,518 --> 00:32:06,793 Because as you mentioned, you yourself, it's very hard for the law firms to get anything invested in this, especially if it's hard to prove what the ROI is. 340 00:32:06,793 --> 00:32:08,183 And so I just think that 341 00:32:08,361 --> 00:32:10,042 you and I are not going to be able to solve it.
342 00:32:10,042 --> 00:32:12,593 Anna and I and you are not going to be able to solve it alone. 343 00:32:12,593 --> 00:32:18,437 We need to have a cohesive, coherent strategy that is led by the industry. 344 00:32:18,437 --> 00:32:21,318 We all come together around the table and we ask ourselves these questions. 345 00:32:21,318 --> 00:32:25,840 We put our resources in the same pot and we develop something that benefits everybody. 346 00:32:26,541 --> 00:32:28,156 I think that's the future. 347 00:32:28,156 --> 00:32:33,884 I love the idea, but what organization would spearhead this? 348 00:32:34,406 --> 00:32:37,330 One doesn't come to mind, maybe ILTA? 349 00:32:37,992 --> 00:32:39,894 You have some thoughts on that, Anna? 350 00:32:40,255 --> 00:32:42,326 Yes, like we've been discussing with ILTA. 351 00:32:42,326 --> 00:32:51,461 So we are in the conversation, to your point, I actually also have experience working in a bank for years. 352 00:32:51,461 --> 00:33:00,776 So we had something like, in Poland, we have something like a Recommendation H, which also assumes that you have three layers of defense. 353 00:33:00,776 --> 00:33:04,228 So I understand very well the concept. 354 00:33:05,449 --> 00:33:09,197 so what I think is that, you know, like we, it's very 355 00:33:09,197 --> 00:33:19,152 hard sometimes to translate it, because if you are a financial institution, if you are like a bank, you have a really great structure and this kind of regulation recommendation that 356 00:33:19,152 --> 00:33:21,113 you need to comply with. 357 00:33:21,754 --> 00:33:28,648 For huge companies, for sure, I think the audit function should be somewhere. 358 00:33:28,648 --> 00:33:38,323 Based on my experience, it's actually a very, uh very good function that actually checks whether the controls are in place. 359 00:33:38,654 --> 00:33:48,048 When you would like to translate it into the operation of the law firms, I think it would be uh good to focus on some kind of certification. 360 00:33:48,048 --> 00:33:59,652 This is what I was talking about, basically external auditors who will look at your infrastructure, who will assess your security posture as external bodies that are 361 00:33:59,652 --> 00:34:05,504 companies that are not involved in your processing on a daily basis. 362 00:34:05,504 --> 00:34:07,275 So I think this kind of 363 00:34:08,173 --> 00:34:11,035 second layer could be implemented in this kind of way. 364 00:34:11,035 --> 00:34:23,604 This is why we've been thinking with Maz about this platform that could provide the lawyers with certain guidance playbooks, how to implement certain security controls in a 365 00:34:23,604 --> 00:34:37,313 very simple way, especially for smaller law firms who do not have so many resources, who cannot put uh many resources, uh money, investment into the security. 366 00:34:37,313 --> 00:34:54,088 Because it's very easy to say for a huge global law firm to implement certain ISO 27001 or 42001 for the management of AI, but for smaller law firms, it could be really 367 00:34:54,088 --> 00:34:54,728 challenging. 368 00:34:54,728 --> 00:35:06,403 So I think the second layer of defense is possible, and I think it could be done for bigger law firms who can em collaborate with the vendors that will help, 369 00:35:06,403 --> 00:35:14,088 that can help them to uh provide the readiness assessment for the certification. 370 00:35:14,088 --> 00:35:18,180 But for smaller law firms, this is what we've been thinking about with Maz.
371 00:35:18,180 --> 00:35:32,479 Maybe we will collaborate with ILTA to help us really build some, this kind of free, vendor-neutral platform that could serve specifically these uh smaller law firms. 372 00:35:33,263 --> 00:35:37,974 Yeah, and what are your, so SOC 2 Type 2, ISO? 373 00:35:37,974 --> 00:35:40,741 um This would be different. 374 00:35:40,741 --> 00:35:41,802 I take it 375 00:35:43,586 --> 00:35:49,344 this would be geared specifically towards the needs of law firms instead of industry wide. 376 00:35:51,358 --> 00:35:57,762 It should be started from ISO 27001, about the information security management systems. 377 00:35:57,762 --> 00:36:13,821 uh SOC 2 Type 2 is also now uh very uh often chosen as basically also a requirement from the clients, because if you are dealing with huge clients they are also imposing on you 378 00:36:13,821 --> 00:36:18,133 obligation to uh possess certain certification. 379 00:36:18,354 --> 00:36:21,996 SOC 2 Type 2 is very often mentioned. 380 00:36:22,937 --> 00:36:37,628 ISO 27001 helps, but of course, these controls, without this translation into the specific situation, without like uh support from the third uh parties, you cannot easily implement 381 00:36:37,628 --> 00:36:40,067 it, because these are not very easy controls that 382 00:36:40,067 --> 00:36:40,977 could be implemented. 383 00:36:40,977 --> 00:36:43,728 There is nothing about multi-factor authentication. 384 00:36:43,728 --> 00:36:48,079 There is nothing about the access controls, encryption that should be put in place. 385 00:36:48,079 --> 00:37:00,384 There are very generic statements about what kind of controls should be put in place in order to assess your whole infrastructure and basically have certain guardrails in place. 386 00:37:00,384 --> 00:37:01,654 But what kind of guidelines? 387 00:37:01,654 --> 00:37:04,015 You need an expert for this. 388 00:37:04,015 --> 00:37:05,045 And this is 389 00:37:05,148 --> 00:37:07,791 why we've been thinking about such platforms. 390 00:37:07,791 --> 00:37:16,503 But you are right that like ISO 27000, maybe 27701, it's about the compliance within the data protection, that could be also useful. 391 00:37:17,646 --> 00:37:23,726 So, Maz, what are you guys doing about this and what's the payoff for you? 392 00:37:23,726 --> 00:37:25,986 You're spending time on this. 393 00:37:26,386 --> 00:37:35,821 Is it just really kind of a be a good citizen of the community effort or is there, what's in it for you? 394 00:37:35,821 --> 00:37:42,642 Yeah, well I'll give you the raw answer to that question. 395 00:37:45,094 --> 00:37:57,613 I was made aware of the problem when I was in Dentons and I was asked to lead the ISO 27000 implementation from the global team. 396 00:37:57,613 --> 00:38:05,748 And, you know, I realized that given the size of Dentons at the time, we still, and, you know, we had people in the CISO team who were ex-military, right? 397 00:38:05,748 --> 00:38:12,072 One of our directors, Scott Applegate, is a brilliant man, he's ex-military, you know, he's an awesome guy. 398 00:38:12,173 --> 00:38:14,035 And even with all the knowledge that we had, 399 00:38:14,035 --> 00:38:19,240 I still had to go and find an external contractor to come and do some work for us. 400 00:38:20,938 --> 00:38:25,942 going through business cases, uh FDWs, onboarding.
401 00:38:25,942 --> 00:38:33,548 I quickly realized that, you know, if Dentons is struggling to get this done on their own with all the resources that we have available, all these other thousands of small and 402 00:38:33,548 --> 00:38:38,432 medium sized law firms have no hope. 403 00:38:38,813 --> 00:38:43,447 And I'm interested, I think as Anna is, keenly in just problem solving, right? 404 00:38:43,447 --> 00:38:47,589 I think when you identify a problem, I'm not someone who just kind of brushes it under the rug. 405 00:38:47,589 --> 00:38:49,221 I want to find a solution to it. 406 00:38:49,221 --> 00:38:51,112 The topics are very interesting. 407 00:38:51,144 --> 00:38:56,676 AI coming into play, into the fold, just magnifies the potential problem and the issue. 408 00:38:57,697 --> 00:39:10,052 But also, narcissistically, it just helps me personally grow and develop myself, to learn about new things, learn how to solve them, work with interesting people, get your 409 00:39:10,052 --> 00:39:12,183 brand out there as well. 410 00:39:12,183 --> 00:39:18,836 But fundamentally, it is about the bad actors coming together. 411 00:39:19,296 --> 00:39:21,827 They've picked the red color, the team jersey. 412 00:39:21,948 --> 00:39:23,588 They're fighting against us. 413 00:39:23,588 --> 00:39:26,480 We're fighting against them independently. 414 00:39:26,480 --> 00:39:28,101 We need to become a single team as well. 415 00:39:28,101 --> 00:39:32,423 We need to pick the blue jersey, whatever color you want to pick, and be the opposition team, right? 416 00:39:32,423 --> 00:39:36,225 To face them head on, because otherwise, individually, we're just going to lose, right? 417 00:39:36,225 --> 00:39:37,736 So we need to come together. 418 00:39:37,736 --> 00:39:41,318 That's really my personal objective. 419 00:39:41,572 --> 00:39:49,956 From my perspective, lawyers will need to understand what our modern legal practice will look like with these AI models. 420 00:39:49,956 --> 00:40:03,643 So I think we need to also see each other as a society that should support each other, especially uh when this new advanced technology is coming. 421 00:40:04,004 --> 00:40:09,026 And we will inevitably need to use it in order to... 422 00:40:09,026 --> 00:40:10,221 uh 423 00:40:10,221 --> 00:40:13,793 to have a competitive advantage or just to stay in the market. 424 00:40:15,034 --> 00:40:31,272 So that's, I think, some kind of work that should be done, and if we could somehow also help lawyers to understand how this kind of usage could impact and how important security is 425 00:40:31,272 --> 00:40:40,287 now, in order to actually uh defend lawyers, our profession, our legal obligations, 426 00:40:40,387 --> 00:40:42,299 in a very responsible way. 427 00:40:42,299 --> 00:40:43,730 So that's one thing. 428 00:40:43,730 --> 00:40:49,995 The second thing is maybe more Dentons related, it's about the data sharing, right? 429 00:40:49,995 --> 00:40:55,098 Because each global law firm is structured in a way that there are a lot of partnerships. 430 00:40:55,098 --> 00:41:05,386 So basically some kind of independent law firms, very often smaller ones, that do not have enough resources, investments to uh implement certain security measures, right? 431 00:41:05,386 --> 00:41:07,149 So it's also 432 00:41:07,149 --> 00:41:15,219 for these kinds of smaller entities, law firms, to responsibly implement security baselines at least. 433 00:41:15,842 --> 00:41:16,192 Yeah.
434 00:41:16,192 --> 00:41:26,146 And I'll share another unpopular opinion of mine, which is that the law firm partnership model does not have the appropriate governance mechanisms in place. 435 00:41:26,146 --> 00:41:37,811 So for example, at Bank of America, we had a chief risk officer who, when I was there 20 years ago, Amy Brinkley, she reported directly to the CEO. 436 00:41:38,131 --> 00:41:45,644 And um when I worked in audit, we had a chief auditor who reported directly to the audit committee. 437 00:41:45,964 --> 00:42:01,144 So there were, there were controls in place so that if the third line of defense found something and had pressure from the business, a very profitable part of the business that 438 00:42:01,144 --> 00:42:07,029 would, this issue would cause problems for their ability to continue to generate profits. 439 00:42:07,029 --> 00:42:15,394 You create a conflict and that reporting structure bypasses that conflict because the audit. 440 00:42:15,394 --> 00:42:22,776 function reports to the audit committee, audit committee, which is a subset of the board of directors. 441 00:42:22,896 --> 00:42:36,770 So in the governance structures that a big business, a C-corp has in place, and ultimately those boards of directors are installed by shareholders and they hold management 442 00:42:36,770 --> 00:42:44,012 accountable for delivering on, you know, not just the financial objectives, but the risk management objectives. 443 00:42:44,342 --> 00:42:55,069 None of this exists in law firms today and lawyers who are leading law firms today because of the bespoke nature of legal. 444 00:42:55,409 --> 00:42:59,432 The lawyers are the front line, the first, second and third lines of defense. 445 00:42:59,432 --> 00:43:04,095 Nobody checks the lawyer's work before it gets sent to me as the consumer, right? 446 00:43:04,095 --> 00:43:09,470 But as we get into this tech enabled world, we're going to need to have 447 00:43:09,470 --> 00:43:17,194 systemic approaches to evaluating technology, making sure the proper controls are in place, making sure that there's no conflicts. 448 00:43:17,313 --> 00:43:22,357 And at the top, it probably shouldn't be necessarily run by lawyers. 449 00:43:22,737 --> 00:43:29,841 Maybe these are professionals from big business or big tech or other areas of discipline. 450 00:43:29,841 --> 00:43:34,083 The lawyer perspective is very important in this whole equation. 451 00:43:34,084 --> 00:43:38,932 But from what I've seen, the C-suite in 452 00:43:38,932 --> 00:43:48,939 legal is not always sufficiently empowered the way they would be in a traditional C Corp governance model, right? 453 00:43:48,939 --> 00:43:57,574 With very intentional reporting structures that ensure and eliminate conflicts like the one I spoke of. 454 00:43:57,935 --> 00:44:08,442 And we got a lot of maturing to do to get from where we are today to where we're going to need to be in five years when technology is delivering so much. 455 00:44:08,926 --> 00:44:19,309 And I see it as a huge gap and I wave this flag all the time and I get told by really like by lawyers in leadership who I really respect, but they've never worked in a big corporate 456 00:44:19,309 --> 00:44:22,790 environment and seen all of this risk management rigor applied. 457 00:44:22,790 --> 00:44:25,591 And they're like, yeah, we're not doing that. 458 00:44:25,591 --> 00:44:31,313 I'm like, okay, you're able to get away with that today, but what about tomorrow when things look different? 
459 00:44:31,313 --> 00:44:35,440 So I don't know, Anna, do you, do you agree that we need more? 460 00:44:35,440 --> 00:44:36,121 Exactly. 461 00:44:36,121 --> 00:44:46,248 Especially with this new technological advancement, really, like 10 years ago, I've been actually, I'm coming from a family where my mother had a law firm. 462 00:44:46,248 --> 00:44:54,233 Everything was processed on servers and the clients' data, everything that she had, was protected in a very specific way. 463 00:44:54,233 --> 00:45:02,719 Even if we've been traveling, she couldn't leave any document in the car because she was so aware about it. 464 00:45:03,087 --> 00:45:19,692 And now with usage of technological advancement, with cloud, with processing, with usage of uh AI, we really need, like law firms, we really need to think about security, because 465 00:45:19,692 --> 00:45:25,364 like more and more often the fines will be imposed on the law firms. 466 00:45:25,364 --> 00:45:28,074 And that is already happening in the UK. 467 00:45:28,074 --> 00:45:29,895 Just in the last... 468 00:45:30,069 --> 00:45:34,380 year there were really a lot of fines imposed on the law firms. 469 00:45:34,380 --> 00:45:38,982 In Europe it's also some kind of scrutiny imposed on law firms. 470 00:45:38,982 --> 00:45:44,834 So I think now lawyers will need to start thinking about these layers of defense. 471 00:45:44,834 --> 00:45:56,216 And you are right, Ted, that maybe, and in my view, it shouldn't be the work for the lawyers, because they should be focused on their work, but they should put in place 472 00:45:56,216 --> 00:46:03,222 some guidelines, engage some vendors, some third parties that would uh actually protect them. 473 00:46:04,263 --> 00:46:16,114 But like again, even if they will engage some third parties, there is nothing now in place on the internet that could really support how to implement certification if you do not have, 474 00:46:16,114 --> 00:46:16,814 you know, 475 00:46:17,731 --> 00:46:23,834 two to three vendors at least, you know, one to assess the readiness of your architecture. 476 00:46:23,834 --> 00:46:27,316 The other one would help with the certification itself. 477 00:46:27,316 --> 00:46:28,917 So it's very costly. 478 00:46:28,917 --> 00:46:44,616 So I think that, for the future, if we would like to survive as a profession, and really a profession that is trusted, where reputation is really important, 479 00:46:44,616 --> 00:46:46,837 we would need to start thinking 480 00:46:46,837 --> 00:46:55,155 about implementation of the security measures as soon as possible, because that is extremely important. 481 00:46:55,155 --> 00:47:07,627 And I totally agree with the layers of defense, but again, this kind of very complicated structure is not possible, but alternative ones, for sure, there should be. 482 00:47:07,635 --> 00:47:10,056 we need to start inching our way there. 483 00:47:10,177 --> 00:47:16,052 It's, um you know, and the biggest law firm in the world by revenue is Kirkland and Ellis. 484 00:47:16,052 --> 00:47:17,453 They're around eight billion in revenue. 485 00:47:17,453 --> 00:47:20,906 They wouldn't even qualify for the Fortune 500 if they were public. 486 00:47:20,906 --> 00:47:21,256 Right. 487 00:47:21,256 --> 00:47:24,529 So there's been a big ceiling on scale. 488 00:47:24,529 --> 00:47:29,433 And I've had, I've talked to people who've told me, well, maybe law firms don't want to scale.
489 00:47:29,433 --> 00:47:35,337 I was like, okay, I don't know too many lawyers that don't want to make more money. 490 00:47:35,948 --> 00:47:36,458 Right? 491 00:47:36,458 --> 00:47:39,719 Because ultimately that's the motivation to scale a business. 492 00:47:39,719 --> 00:47:42,380 The primary motivation. 493 00:47:42,380 --> 00:47:44,081 There may be others as well. 494 00:47:44,081 --> 00:47:46,798 Well, this has been a fantastic conversation. 495 00:47:46,798 --> 00:47:54,275 Maz, how do people find out more about what you're trying to do, and how can my listeners help support this effort? 496 00:47:54,275 --> 00:47:58,196 Because it sounds like a fantastic step in the right direction. 497 00:48:00,153 --> 00:48:01,653 That's a very good question. 498 00:48:01,653 --> 00:48:03,935 One I must admit I should have thought of. 499 00:48:04,956 --> 00:48:10,270 You can find Anna and me on LinkedIn, or through this podcast, and reach out to us. 500 00:48:10,270 --> 00:48:16,974 We are working on some public-facing content. 501 00:48:16,974 --> 00:48:25,880 We're working with a Canadian legal technology magazine on potentially publishing our thesis on this, right? 502 00:48:25,880 --> 00:48:29,123 Our argument, which will hopefully be published sometime in 2026. 503 00:48:29,123 --> 00:48:29,980 And so 504 00:48:29,980 --> 00:48:38,986 we are putting out that sort of content. Just keep an eye on that, engage with it, reach out to Anna and myself, or reach out to you maybe, Ted, and you can put 505 00:48:38,986 --> 00:48:40,127 us in touch with everybody. 506 00:48:40,127 --> 00:48:50,364 The one thing I just want to say on everything we've discussed today: I would urge any listener who wants to take some of this seriously and practically, Google Operating Model 507 00:48:50,364 --> 00:48:58,930 Canvas and Google Business Model Canvas, and please start with those two things when you go for transformation, improvement, AI adoption, 508 00:48:58,930 --> 00:49:01,366 whatever it may be. You don't have to go deep. 509 00:49:01,366 --> 00:49:04,452 A one-page plan is better than no plan. 510 00:49:04,453 --> 00:49:07,770 Start there and it'll help you on your journey. 511 00:49:08,866 --> 00:49:09,826 Well, that's great advice. 512 00:49:09,826 --> 00:49:16,132 I'll include some links in the show notes to your LinkedIn profiles, and people are always welcome to reach out to me. 513 00:49:16,132 --> 00:49:18,313 I'm very easy to find on LinkedIn. 514 00:49:19,195 --> 00:49:26,980 You know, I'm always posting, and sometimes I have to step back and go, man, I hope I'm not just trying to be critical 515 00:49:27,201 --> 00:49:28,762 in pointing these things out. 516 00:49:28,762 --> 00:49:31,825 And sometimes I wonder if I'm coming across that way. 517 00:49:31,825 --> 00:49:38,390 I'm pointing these things out because I think there's a lack of awareness and urgency 518 00:49:38,858 --> 00:49:44,542 around this, and our business is entirely dependent on it. You know, as for the what's in it for me, 519 00:49:44,542 --> 00:49:49,706 what's in it for me is that my customers are still around in five years. 520 00:49:49,706 --> 00:49:53,568 I really want law firms to succeed.
521 00:49:53,688 --> 00:50:05,887 And so, you know, I am aware that pointing these things out can sometimes ruffle feathers, but the reality is, if we don't start taking steps in this direction, the 522 00:50:05,887 --> 00:50:08,200 entire industry is at risk. 523 00:50:08,200 --> 00:50:11,213 And that would be bad for me, and it would be bad for both of you. 524 00:50:11,213 --> 00:50:13,586 But this has been... 525 00:50:13,586 --> 00:50:16,138 Yeah. 526 00:50:16,389 --> 00:50:21,752 Vendors often throw around the term strategic partnership willy-nilly. 527 00:50:21,752 --> 00:50:31,006 A strategic partnership means that you have a critical friend, somebody who can tell you what's wrong while keeping your best interests in mind. 528 00:50:31,006 --> 00:50:32,316 So I think that's what you're doing, Ted. 529 00:50:32,316 --> 00:50:36,578 I think that's what Anna and I are trying to do as well, just to help the industry improve. 530 00:50:36,578 --> 00:50:38,903 Vendors are innovating right now. 531 00:50:38,903 --> 00:50:40,100 Legal firms need vendors. 532 00:50:40,100 --> 00:50:40,850 So, 533 00:50:40,850 --> 00:50:45,006 the more closely we work with each other to co-create value, I think the better. 534 00:50:45,580 --> 00:50:46,775 Yes, awesome. 535 00:50:46,775 --> 00:50:52,734 Well, thank you both for the work that you do. It's been a pleasure to speak with you this afternoon. 536 00:50:53,441 --> 00:50:54,896 Likewise, thank you very much for having 537 00:50:54,896 --> 00:50:55,336 us, Ted. 538 00:50:55,336 --> 00:50:56,588 Thank you, Anna. 539 00:50:56,829 --> 00:50:57,910 Have a good day.
446 00:43:04,095 --> 00:43:09,470 But as we get into this tech enabled world, we're going to need to have 447 00:43:09,470 --> 00:43:17,194 systemic approaches to evaluating technology, making sure the proper controls are in place, making sure that there's no conflicts. 448 00:43:17,313 --> 00:43:22,357 And at the top, it probably shouldn't be necessarily run by lawyers. 449 00:43:22,737 --> 00:43:29,841 Maybe these are professionals from big business or big tech or other areas of discipline. 450 00:43:29,841 --> 00:43:34,083 The lawyer perspective is very important in this whole equation. 451 00:43:34,084 --> 00:43:38,932 But from what I've seen, the C-suite in 452 00:43:38,932 --> 00:43:48,939 legal is not always sufficiently empowered the way they would be in a traditional C Corp governance model, right? 453 00:43:48,939 --> 00:43:57,574 With very intentional reporting structures that ensure and eliminate conflicts like the one I spoke of. 454 00:43:57,935 --> 00:44:08,442 And we got a lot of maturing to do to get from where we are today to where we're going to need to be in five years when technology is delivering so much. 455 00:44:08,926 --> 00:44:19,309 And I see it as a huge gap and I wave this flag all the time and I get told by really like by lawyers in leadership who I really respect, but they've never worked in a big corporate 456 00:44:19,309 --> 00:44:22,790 environment and seen all of this risk management rigor applied. 457 00:44:22,790 --> 00:44:25,591 And they're like, yeah, we're not doing that. 458 00:44:25,591 --> 00:44:31,313 I'm like, okay, you're able to get away with that today, but what about tomorrow when things look different? 459 00:44:31,313 --> 00:44:35,440 So I don't know, Anna, do you, do you agree that we need more? 460 00:44:35,440 --> 00:44:36,121 exactly. 461 00:44:36,121 --> 00:44:46,248 Especially with this new technological advancement, really like 10 years ago, I've been actually, coming from the family when my mother had a low pyramid. 462 00:44:46,248 --> 00:44:54,233 Everything was processed on servers and the client's data, everything what she had was protected in a very specific way. 463 00:44:54,233 --> 00:45:02,719 Even if we've been traveling, she couldn't leave any document in the car because she was so aware about it. 464 00:45:03,087 --> 00:45:19,692 And now with usage of technological advancement, with cloud, with processing, with usage of uh AI, really need, like law firms, we really need to think about security because that 465 00:45:19,692 --> 00:45:25,364 like more and more often the fines will be imposed on the law firms. 466 00:45:25,364 --> 00:45:28,074 And that is already happening in UK. 467 00:45:28,074 --> 00:45:29,895 There are a couple of last... 468 00:45:30,069 --> 00:45:34,380 year there were really a lot of fines imposed on the law firms. 469 00:45:34,380 --> 00:45:38,982 In Europe it's also some kind of scrutiny imposed on law firms. 470 00:45:38,982 --> 00:45:44,834 So I think now lawyers will need to start thinking about these layers of defense. 471 00:45:44,834 --> 00:45:56,216 And you are right Ted, that maybe that, and in my view, it shouldn't be the work for the lawyers because they should be focused on their work, but they should have in place. 472 00:45:56,216 --> 00:46:03,222 put some guidelines, engage some vendors, some third parties that would uh actually protect them. 
473 00:46:04,263 --> 00:46:16,114 But like again, even if they will engage some third parties, there is nothing now in place on the internet that could really support how to implement certification if you do not have, 474 00:46:16,114 --> 00:46:16,814 you 475 00:46:17,731 --> 00:46:23,834 know, two or three vendors at least, one to assess the readiness of your architecture. 476 00:46:23,834 --> 00:46:27,316 The other one would help with the certification itself. 477 00:46:27,316 --> 00:46:28,917 So it's very costly. 478 00:46:28,917 --> 00:46:44,616 So I think that, in the future, if we would like to survive as a profession, and really a profession that is trusted, where reputation is really important, 479 00:46:44,616 --> 00:46:46,837 we would need to start thinking 480 00:46:46,837 --> 00:46:55,155 about implementation of the security measures as soon as possible, because that is extremely important. 481 00:46:55,155 --> 00:47:07,627 And I totally agree with the layers of defense, but again, this kind of very complicated structure is not possible here, but alternative ones, for sure, there should be. 482 00:47:07,635 --> 00:47:10,056 We need to start inching our way there. 483 00:47:10,177 --> 00:47:16,052 It's, um, you know, the biggest law firm in the world by revenue is Kirkland & Ellis. 484 00:47:16,052 --> 00:47:17,453 They're around eight billion in revenue. 485 00:47:17,453 --> 00:47:20,906 They wouldn't even qualify for the Fortune 500 if they were public. 486 00:47:20,906 --> 00:47:21,256 Right. 487 00:47:21,256 --> 00:47:24,529 So there's been a big ceiling on scale. 488 00:47:24,529 --> 00:47:29,433 And I've talked to people who've told me, well, maybe law firms don't want to scale. 489 00:47:29,433 --> 00:47:35,337 I was like, um, okay, I don't know too many lawyers that don't want to make more money. 490 00:47:35,948 --> 00:47:36,458 Right? 491 00:47:36,458 --> 00:47:39,719 Because ultimately that's the motivation to scale a business. 492 00:47:39,719 --> 00:47:42,380 Um, the primary motivation. 493 00:47:42,380 --> 00:47:44,081 There may be others as well. 494 00:47:44,081 --> 00:47:46,798 Well, this has been like a fantastic conversation. 495 00:47:46,798 --> 00:47:54,275 Uh, Maz, how do people find out more about what you're trying to do, and how can my listeners help support this effort? 496 00:47:54,275 --> 00:47:58,196 Cause it sounds like a fantastic step in the right direction. 497 00:48:00,153 --> 00:48:01,653 That's a very good question. 498 00:48:01,653 --> 00:48:03,935 One I must admit I should have thought of. 499 00:48:04,956 --> 00:48:10,270 You can find Anna and me on LinkedIn, I think, or through this podcast, and reach out to us. 500 00:48:10,270 --> 00:48:16,974 We are working on some public-facing content. 501 00:48:16,974 --> 00:48:25,880 So we're working with a technology magazine, a Canadian legal technology magazine, on potentially publishing our thesis on this, right? 502 00:48:25,880 --> 00:48:29,123 Our argument will be published hopefully sometime in 2026. 503 00:48:29,123 --> 00:48:29,980 And so 504 00:48:29,980 --> 00:48:38,986 we are putting out that sort of content, so, you know, just keep an eye on that, engage with it, reach out to Anna and myself, or reach out to you, maybe, Ted, and you can put 505 00:48:38,986 --> 00:48:40,127 us in touch with everybody.
506 00:48:40,127 --> 00:48:50,364 The one thing I just want to say on everything we've discussed today: I would urge any listener who wants to take some of this seriously and practically, Google Operator Model 507 00:48:50,364 --> 00:48:58,930 Canvas and just Google Business Model Canvas, and please start with those two things when you go for transformation, improvement, AI adoption, 508 00:48:58,930 --> 00:49:01,366 whatever it may be, you don't have to go deep. 509 00:49:01,366 --> 00:49:04,452 A one-page plan is better than no plan. 510 00:49:04,453 --> 00:49:07,770 Start there, and it'll help you on your journey. 511 00:49:08,866 --> 00:49:09,826 Well, that's great advice. 512 00:49:09,826 --> 00:49:16,132 I will include some links in the show notes to your LinkedIn profiles, and people are always welcome to reach out to me. 513 00:49:16,132 --> 00:49:18,313 I'm very easy to find on LinkedIn. 514 00:49:19,195 --> 00:49:26,980 You know, I'm always posting, and you know, sometimes I have to step back and go, man, I hope I'm not trying to be critical, 515 00:49:27,201 --> 00:49:28,762 like, pointing these things out. 516 00:49:28,762 --> 00:49:31,825 And sometimes I wonder if I'm coming across that way. 517 00:49:31,825 --> 00:49:38,390 I'm pointing these things out because I think there's a lack of awareness and urgency 518 00:49:38,858 --> 00:49:44,542 around this, and our business is entirely dependent on, you know, the what's in it for me. 519 00:49:44,542 --> 00:49:49,706 Like, what's in it for me is that my customers are still around in five years. 520 00:49:49,706 --> 00:49:53,568 I really want law firms to succeed. 521 00:49:53,688 --> 00:50:05,887 And so, you know, I am aware that pointing these things out sometimes can ruffle feathers, but the reality is, if we don't start taking steps in this direction, the 522 00:50:05,887 --> 00:50:08,200 entire industry is at risk. 523 00:50:08,200 --> 00:50:11,213 And that would be bad for me, and it would be bad for both of you. 524 00:50:11,213 --> 00:50:13,586 um But this has been... 525 00:50:13,586 --> 00:50:16,138 Yeah. 526 00:50:16,389 --> 00:50:21,752 Vendors often throw around the word strategic partnership willy-nilly and randomly. 527 00:50:21,752 --> 00:50:31,006 Strategic partnership means that you have a critical friend, somebody who can tell you what's wrong, with your best interest in mind. 528 00:50:31,006 --> 00:50:32,316 So I think that's what you're doing, Ted. 529 00:50:32,316 --> 00:50:36,578 I think Anna and I, that's what we're trying to do as well, is just to help the industry improve. 530 00:50:36,578 --> 00:50:38,903 Vendors are innovating right now. 531 00:50:38,903 --> 00:50:40,100 Legal firms need vendors. 532 00:50:40,100 --> 00:50:40,850 So. 533 00:50:40,850 --> 00:50:45,006 The more we work closely with each other to co-create value, I think the better. 534 00:50:45,580 --> 00:50:46,775 Yes, awesome. 535 00:50:46,775 --> 00:50:52,734 Well, thank you both for the work that you do, and it's been a pleasure to speak with you here this afternoon. 536 00:50:53,441 --> 00:50:54,896 Likewise, thank you very much. 537 00:50:54,896 --> 00:50:55,336 Thank you for having us, Ted. 538 00:50:55,336 --> 00:50:56,588 Thank you, Anna. 539 00:50:56,829 --> 00:50:57,910 Have a good day.
