Enterprise AI Adoption: A Board Member’s Perspective


Multi-industry board director and former Chief Information Officer Andi Karaboutis shares a board perspective on enterprise AI adoption — balancing opportunity, risk, governance, and organizational readiness, on CXOTalk episode 866.

56:09

Jan 10, 2025

This week on CXOTalk, multi-industry board director Adriana Karaboutis offers a unique look at how boards of directors shape AI adoption in large organizations. She explains how to strike the right balance between driving innovation and mitigating risk. This episode provides essential guidance for boards navigating the complexities of AI integration.

Key topics include:

  • Current State of AI Adoption: Understand the latest trends and challenges in enterprise AI implementation.
  • The Board's Role: Discover how boards can effectively oversee AI strategy, balancing innovation with risk mitigation.
  • Bridging the Technical Gap: Learn strategies for non-technical board members to engage meaningfully in AI discussions.
  • Effective Governance: Explore approaches for establishing AI governance, managing risk, and ensuring ethical considerations.
  • Asking the Right Questions: Gain practical tips on how boards can ask insightful questions about AI initiatives, even with limited time.
  • Measuring Success: Learn how to define and track key metrics for evaluating the impact of AI investments.

This episode offers valuable perspectives, whether you're a board member, executive, or simply interested in the future of AI in the enterprise.

Episode Participants

Adriana Karaboutis is an independent board director at Aon plc, Perrigo plc, Autoliv, and Savills plc. She has held prior director roles at AspenTech, Advance Auto Parts, and Blue Cross Blue Shield of Massachusetts. Her career highlights include being the Global Chief Information Officer at Dell Technologies and the Group Chief Information & Digital Officer at National Grid plc. Adriana was Biogen's EVP of Technology, Business Solutions & Corporate Affairs. She is also an advisor to iGreenTree.ai.

Michael Krigsman is a globally recognized analyst, strategic advisor, and industry commentator known for his deep expertise in business transformation, innovation, and leadership. He has presented at industry events worldwide and written extensively on the reasons for IT failures. His work has been referenced in the media over 1,000 times and in more than 50 books and journal articles; his commentary on technology trends and business strategy reaches a global audience.

Transcript

Michael Krigsman:

How are boards of directors approaching enterprise AI and AI adoption in the enterprise, and what should boards do about members who don't understand technology? Let's get into it on CXOTalk number 866 with Adriana Karaboutis. She's an independent board member across multiple industries and has also held senior leadership roles at companies like Dell, Biogen, and National Grid.

Where are we in terms of the adoption of AI in the enterprise? What are you seeing right now?

Current State of AI Adoption by Industry

Andi Karaboutis:

It is different by industry and company, but one thing that is absolutely consistent is that it is permeating and growing by the day. The current state of adoption, I would say, differs by industry. Tech and software are clearly leading the way; it's part of the fabric of what they're doing: virtual assistants, chatbots, AI-driven cloud services, et cetera.

Financial services is another industry way up there relative to adoption, using it around fraud, risk management, and checking algorithms, and healthcare is being pressured and using it quite a bit. You also have retail and manufacturing. And then you see some that, and I wouldn't call them laggards, Michael, but I would call them a little bit slower to adopt: your regulated environments or governments, and maybe some industries in agriculture.

In terms of adoption inside the companies, in other words, among these companies and industries trying to drive AI and really become consumers of it, internally you still have about a 10-40-40-10 split: 10% experts, 40% who are experimenting, 40% who are still in the novice phase, and then 10% that are very reticent to adopt. They're the laggards: oh, we shouldn't use this, shut it down, et cetera.

I would say some of that comes from a lack of education or understanding of it, and some of it just comes from an abundance of caution, from being overly conservative. So there's quite a spread, but one thing is for sure, and it's what I said at the start: the adoption is going faster and faster and faster. We're moving quickly, Michael.

Drivers and Roadblocks to AI Adoption

Michael Krigsman:

When we talk about large organizations, what do you see as the drivers of AI adoption, and also the obstacles or the roadblocks?

Andi Karaboutis:

The drivers are around competitive pressures. Companies, especially public and private companies, I shouldn't say especially public, but companies are always looking to do more with less: higher quality, lower cost, better customer satisfaction, improved employee experience, et cetera. Those pressures cause you to look at tools and capabilities that will help you achieve that, and so that's some of the drivers.

And as peers in industries adopt more of these tools and are able to actually show more capability and more of the outcomes that we're looking for, then that drives competitive pressures to all the companies in that industry to stay abreast and continue to improve. Some of it is around newer products and offerings. Some of it is around getting a better bottom line, which is cost efficiency, reduced SG&A costs, et cetera.

Some of the roadblocks are around change; there is just human nature and a reticence to adopt, Michael, plus a lack of skills and understanding of the tools and technology. Then there is risk aversion, and boards in particular have a big, big role to play around risk management: making sure that the company operates within what we call its risk appetite and that anything that is adopted stays within that risk appetite, whether you're talking about cyber, et cetera, and in particular with artificial intelligence and its tools. There are ethical issues, there's privacy, there's security, there's ensuring fairness and the absence of bias, et cetera.

Regulatory compliance, legal compliance. All of those things are huge and could be seen as roadblocks to overcome, but certainly slow us down in our thinking and sort of our foray into artificial intelligence.

Balancing Innovation and Risk in AI

Michael Krigsman:

Andi, you talk a lot about risk. Of course, AI is all about innovation, and innovation implies something new, that's changing and that's different, which, of course, is risk. And so, as a board member, how do you see this balance?

Andi Karaboutis:

Boards are around governance and ensuring the company has the right strategy. Management teams do day-to-day operations that execute the strategy. So, as board members, we concern ourselves with do we have the right strategy in order to meet our stakeholder commitments and the outcomes that we want, right? And so we have a unique position. It's not overlapping with management, it augments management. There's a trust and a relationship and a collaboration that needs to happen with management.

Many have heard the expression "noses in and fingers out." That is sort of the layman's expression for a board. In other words, we ensure, we govern, we provide oversight, et cetera, and we ensure that risk management stays, as I said before, within an appetite level that we set for the company. So, in that sense, boards increasingly recognize the impact of AI. Everything that we do should be in line with our business strategy, our regulatory requirements, our legal requirements, et cetera, and so we always make sure that how a management team is going to execute, to deliver a strategy, is within those parameters.

Tension Between AI Goals and Risk Management

Michael Krigsman:

Can you dive into that tension between the desire to make use of that technology to achieve the organizational objectives and strategy versus the restraining factors due to the risks?

Andi Karaboutis:

I'll use a comparison that should make it clear. Sports teams don't just run offensive plays; they have defensive plays. So in order to win the game, you go on offense and defense, right? Offense is we try new things; we look to technology to help us meet or exceed our stated business goals and objectives. And, interestingly enough, when board directors were asked by the National Association of Corporate Directors what trends they worry about most, tech, AI, and cyber were in the top six, along with the geopolitical climate, et cetera.

So, given that, while we worry about it as a defensive play, ensuring, as I said before, that we're positioned to stay within our risk appetite, we also want to leverage those tools, and we want to see our companies and management leveraging the tools to accelerate and meet the business strategy.

So there's the offensive and the defensive.

It's a 360-degree, wide-angle lens, and it's critically important that boards don't fall into one extreme or the other. If it's all offense, you could end up with a lot of privacy, regulatory, et cetera, issues. If it's all defense, where we're trying to preserve our ways, we lock everything out and we say we don't want to bring, for example, generative AI tools, machine learning, or artificial intelligence into the company because we don't want to run the risk. Nobody ever cost-cut or risk-avoided their way to prosperity.

So there is that balance, and informed decision-making is critical: making sure the parameters are understood for how the company is using AI. Do they have a group that says, we're looking at the use cases for what we want to do? What are the risks of the use cases, and what are the benefits? Those are very important things for management and, for large-scale applications, for boards to worry about as well. So it's that balanced, informed decision-making, offensive and defensive plays, in order to win the game.

Enabling AI Adoption Within Organizations

Michael Krigsman:

Please subscribe to the CXOTalk newsletter. This balance of informed decision-making, of offensive and defensive plays, reminds me of the beginning of social media. There needs to be a kind of more thoughtful, in-depth approach that addresses the reality of how people live and work.

Andi Karaboutis:

Technology is ubiquitous; we've heard that before, and we're led by what individuals do. I would say a great proportion of the population now, at least in the US, is using, you know, ChatGPT, DALL-E. We've seen students at school using these to write papers, et cetera. The exposure is there. Therefore, when people go to the office, we don't turn that off, right? And we look for tools, enabled by the tech organization, et cetera, to be able to help deliver the outcomes.

I fundamentally believe people always want to do the right thing, and they want to be helpful, be productive, and deliver the outcomes. These tools enable that. So organizations need to say "yes, and how" versus "no, because," right? And again, it goes back to my offensive and defensive plays: within these large organizations, how do we deliver the capabilities to use generative AI to create new content? Text to photo, to create, you know, illustrations that could support new products, new services?

Unfortunately, regulation sometimes lags behind, as does the infrastructure of these large companies, which, by the way, is important to renew and refresh so that you've got good, clean data accessible. All of that must be in place, so we need to consider these things.

The tools are here; they are coming in. If we don't enable them within the enterprise, the worst of all worlds is where people take data outside, and I'm not saying anybody's done this, but we did see it some years ago with Samsung, hopefully done inadvertently, where people take data outside into these tools to either create code or get insights and then bring it back in. You have now exposed potentially secure and confidential company information to the outside. So it has to be "how do we bring these in," not "how do we create steel walls" so people can't access these tools for 10 or 20 days.

I mean, when I was at one of my former companies, I said we're going to lock down for a period of 10 or 15 days, but we let the workforce know this is coming. We just need to bring in our own instance to ensure that we keep the security and the safety in. And, by the way, that is a really important point, Michael. Communication on how, when, how far, which data is available, et cetera, is super important, because, absent communication, employees may not know what they can or can't do, or what they should or shouldn't do, and so all of that is critically important.

Board Mindsets on AI: Competitive Advantage vs. ROI

Michael Krigsman:

I really like not defaulting to no. For so many years, the default answer from enterprise technologists and CIOs when somebody wants something new has been, well, no, we can't do it, and then later we can take a look and see whether maybe we can do it. The default is no, so I really like your approach. Why don't we jump to some questions from LinkedIn and Twitter? There's a bunch coming in, and I encourage everybody who's watching to ask your questions, because literally, when else will you be able to ask somebody like Andi Karaboutis pretty much whatever you want? So take advantage of it. Okay. The first question comes from Isaac Sacolick, who will actually be a guest on CXOTalk very soon, and he asks: where are boards' mindsets today on AI?

Is it a gold rush for competitive advantage or are they looking for pragmatic ROI? Or how are boards relating to AI?

Andi Karaboutis:

Software and tech companies, I think, are rushing for advantage, no question. Whether it's the Microsofts, the Googles, et cetera, and beyond, they're really taking a look at how they can leverage AI for competitive advantage, and I would imagine that their board conversations are full of discussion around the opportunity as well as the risk. But on that topic, as you move into what I would call the more traditional businesses, I think the discussions are still happening, and some board members are quite bullish: what are the advantages? What can we use? How can it help us? That's the big thing. How does it help us, top line, bottom line, customer satisfaction, employee satisfaction, job availability, et cetera? But it's an intellectual curiosity that I think is actually necessary in any board member, and so the conversations are there.

Most boards, in some of the more traditional businesses, are asking for more training and understanding. Now, they don't need to understand how you ingest data and all of the bits and bytes, but they do need to understand the fundamentals around what the prerequisites are. As I said before, your infrastructure needs to be simplified and your data needs to be available. And, by the way, this is a huge opportunity for the technical CIOs and CTOs in the room: figure out and practice your game on how to communicate, because boards are asking for the education. Please don't bury us in a bunch of tech speak around neural networks, et cetera.

Practice your game to really be able to explain the technology, the capability, the risks, and how it could help your specific company and industry. That may be the difference between what I'll call faster success and a prolonged realization of value. So, Isaac, it varies is what I would say.

The Importance of Clear Communication About AI

Michael Krigsman:

I really want to emphasize a point you just made around the importance of simple, straightforward, clear communication and how important that is.

Andi Karaboutis:

Absolutely, Michael. My teams would recognize and probably smile at the next phrase I'm going to use. I always say, please start me with the forest before you take me to photosynthesis. So understand the big umbrella: when it's explaining artificial intelligence, it's the overarching umbrella, and then there's machine learning and all of the categories, generative AI, agentic AI, et cetera.

And so when you take a board that is noses in and fingers out, that has to synthesize a lot of information, comprehend it, and come to insight, you have to ask yourself: what do they need to know? What do they need for informed decision-making? That's a big capability that, and I'll put my CIO and CTO hat on, we as technology people need to work on in order to help boards be supportive and be able to balance that offensive play with the defensive play.

Michael Krigsman:

You know, just this morning I saw a segment from an interview with Sam Altman, the co-founder and CEO of OpenAI, and he said that one consistent trait among every single successful startup founder is the ability to summarize and express what they're doing in 25 words or less, in a really clear way.

Andi Karaboutis:

Absolutely right. Call it the elevator pitch, call it "in 10 seconds you've lost my attention," call it whatever you want, but be very clear. There's the old adage of what do I want to say versus what do others need to understand, and, by the way, who are they? That's super important. I can't emphasize it enough. With any big breakthrough technology or any big concept, even if it isn't tech, you have to be able to really communicate: What are we trying to do? How are we going to do it? And then, I'll quote Simon Sinek, connect people to the why. That doesn't change for board members either.

Measuring the Success of AI Initiatives at the Board Level

Michael Krigsman:

Let's jump to the next question from LinkedIn and keep your questions coming on LinkedIn and on Twitter, and this is from Deepak Adinarayana.

Deepak says how do you measure the success of AI initiatives at the board level? What metrics or indicators do you rely on to ensure these technologies are delivering both value and alignment to the company's long-term vision?

Andi Karaboutis:

As with any initiative, major program, effort, or transformation that we do at big or small companies, there is an expected outcome. When you embark on something like this, which is a huge investment for the company, and I don't just mean monetary but resources, culture change, effort, discomfort, et cetera, there's an outcome that's expected. So the measures are very simple, tried and true: return on investment, return on equity, and return on what we said we were going to do. How do we measure the success as we're going? KPIs. I know these are not big and sexy and different, but fundamentally, when we're doing something that's that big of an investment, what is the return we expect, and are we getting it in the timeframe we expected?

I wish I could give a fancier answer than that, but fundamentally that's what we look for. It's very important to be clear at the outset: what are we getting for this? Most often there are hard measures, and sometimes there are also soft measures or soft outcomes, say, improved employee satisfaction. Those are real; they're important. So we state them, do what we say, and say what we do; that's how we look at these outcomes. And sometimes an outcome could be learning: as I said earlier, only 10% of the organization is expert, and we want to get to 20%. That's an outcome. It might not be monetary, but it is along the scale of what we said we want to do in order to adopt something that could then provide stronger returns.

Management vs. Board Roles in Measuring AI Initiatives

Michael Krigsman:

And this is from Cheryl Falks Bendy, and she has a question about the relationship between management and the board, especially when it comes to AI. Following on Deepak's question, she asks: are you even measuring AI initiatives discretely, or leaving that to the management team and sticking to business outcomes that AI might be a component of?

Andi Karaboutis:

When the initiatives are big and the investment is significant, or, once again, it's a significant change, like we're trying to drive a different culture or organizational adoption of AI, the board will get involved and take a look at it, and overall we look at the maturity of artificial intelligence. What we don't do is go in and look at every artificial intelligence initiative and ask how each of these is going, percentage-wise, and things like that. That's for management to do. Again, I can't emphasize enough: day-to-day operation and execution of initiatives to deliver on the strategy is the job of management. The board assesses the risk, the progress against commitments, the outcomes, et cetera, as it relates to the big, overarching business strategy of the company. It's a clear distinction.

When Should the Board Drill Down into AI Initiatives?

Michael Krigsman:

It's a clear distinction, but it can also be murky at times. And how do you decide when you need to kind of drill in more, because at some point you know the board member may have to get involved, you may be able to offer advice, or there could be many, many different reasons. How do you balance that? How do you make that choice?

Andi Karaboutis:

I get asked a lot about how you know the difference and when to step in. It depends by company and by board, right? And it's steered by risk. If it's not big enough for a board to worry about in comparison to other things on the agenda, then it's very subjective when the time is. A lot of people ask me how often boards should get a review of AI initiatives. I don't know. It depends: are those AI initiatives integral and core to achieving the business strategy on the table? If that's the case, and there's a timeframe, a dollar amount, and an importance level to it because it's the pivot point that needs to happen to get us there, then I would say the board visits it quite a bit. But if it's not, then we could get it in a pre-read, or just in an update, or it could be infrequent. So it depends. I wish I could say there was a recipe behind it, but there isn't.

Michael Krigsman:

Yeah, I can imagine. These are complex issues that will vary according to the company culture and the situation.

Andi Karaboutis:

That's right. In general, though, we do track whether we are staying abreast of technological advancement. So, even if there isn't a specific initiative, we need to ask leading questions like: how is the adoption of AI tools going? What are we using it for? What are some of the use cases we're going after? Is it inventory management, for example, in a retail space? Are we using it to monitor manufacturing quality and for adaptive learning?

Certainly, if you're in the auto industry, with autonomous vehicles, and I'm not on an automotive board, so it's an opinion, I would say those boards are probably talking quite a bit about the artificial intelligence that is self-learning around autonomous driving, if they have autonomous vehicles. So, again, it is very subjective to the industry and the company.

The Need for Tech Expertise on the Board

Michael Krigsman:

This is from Arsalan Khan. He's a regular listener to CXOTalk, and he asks: as tech becomes more and more part of our lives, do boards need only a tech advisor, or a permanent tech board member who is fit to do this, the CIO, the chief digital officer, the CTO? So, if you can address that; I think it also gets to the larger issue of technology capability on the board.

Andi Karaboutis:

Boards of directors have what's called a skills matrix, and I would say all public company boards, and I think private company boards as well, have these. A skills matrix is where a company assesses what capabilities it needs on its board. So, for example, financial literacy and financial expertise is one of the metrics; it may be sales and marketing, it may be product development, et cetera. More and more, the skills matrix includes cybersecurity, digital transformation, et cetera. I suspect we're going to start saying tech savvy, artificial intelligence, and a deepening of that.

To have a token member on a board, where you say "there is our digital tech person," is a start, but it's not sufficient, in my opinion, because technology impacts every part of the company, from, you know, customer satisfaction measurement to product development. It is ubiquitous.

So, in my opinion, we have to have tech-savvy board members permeating the board. That doesn't mean they need to know how to do microprocessor programming or write a Python program, but it does mean they need to stay abreast. We need to be abreast of the technological advancements: how they disrupt, how they augment, what risks they pose, and what opportunities and benefits they offer our company and industry. That broad-based tech knowledge, as I call it, is imperative, if not for every member of the board, then for the majority of the board. At one point many years ago, on another board, someone said, "Well, thank goodness we have Andi on this board, because she's watching out for that." Well, that may have been fine back then, but it's certainly not the right mindset now.

So tech-savvy is important.

And again, I'm going to reiterate: not everybody needs to go get a degree in computer science or computer engineering, but being abreast, staying in tune, being intellectually curious, learning, et cetera. I myself have joined as an advisor to an awesome company called iGreenTree.ai. It's an artificial intelligence company helping augment human intelligence for the utilities industry. Even though I have a computer science degree and I'm tech-savvy, that's how I stay abreast, because even people who have the degree can become not tech-savvy. So, I'll use a phrase, it takes a village. That's how I feel about it.

Addressing Board Members with Limited Tech Understanding

Michael Krigsman:

What about the stereotype of board members who are old school, have very little comfort or understanding of technology, who are used to having secretaries maybe in the past, who don't have the background? What do you do if you're a board and you have esteemed members who are in that situation?

Andi Karaboutis:

I'm an optimist and a glass-half-full person, but I would say they're retiring, because the pressures are such that you can't really operate that way anymore. I'm honored and pleased to be on the boards I'm on because I, hand on heart, don't see that. Still, I think they're retiring, because you just cannot advocate for, and fulfill, the duty of loyalty and duty of care that you owe as a board member if you refuse to stay abreast of your industry and the technology that could impact it.

Michael Krigsman:

All right. So that problem is a problem that is taking care of itself over time.

Andi Karaboutis:

Yes, essentially, that's my view.

Measuring the Impact of AI Tool Choices

Michael Krigsman:

Yes, this would be an excellent time to subscribe to the CXOTalk newsletter. Go to CXOTalk.com and subscribe to the newsletter and we'll notify you about upcoming events just like this, which we do all the time.

So this question is from Chris Peterson. He's another regular listener and he says for those mature AI adopters, how do they measure positive and negative impacts of the tool and product choices that they make?

Andi Karaboutis:

Tool and product choices are not a decision for the board, so I will say that right off the bat. The board may ask questions, you know, why did you bring in an instance of ChatGPT or DALL-E, but it is not a question for the board. As for management and what they choose to adopt, I can assume and believe that they make that determination based on the benefits, the availability, and the costs.

For sure, and something boards will ask about is how much we're allocating, the cost of the AI tools, the risks, and the openness: is it open source or is it not?

For companies that are doing that—and I apologize—I'm sort of going between what the board would do and what management would do. I assume they're balancing the equation as we did whenever we bought other kinds of tools around all of those parameters: commercial relationship, viability, durability, price performance, et cetera.

Balancing AI Use with ESG Concerns

Michael Krigsman:

Here's a question from Lisa Pratico and she says AI, used everywhere, is showing a negative impact on sustainability with energy and water consumption. Are boards, or to what degree are boards, concerned about balancing the use of AI against negative ESG outcomes or to ensure that there are positive ESG outcomes?

Andi Karaboutis:

As with any new technology, Lisa, there are what I call swings and roundabouts. Boards recognize that, while we have our ESG commitments, and they remain strong, and we want to adopt the new technology, we have to monitor and understand both, and there may be a time when one is, you know, usurping the other.

But we monitor it, and growing pains with this technology are inevitable. I can't say that we sit and measure how much AI and compute power we're using and what the offset is, but we are, in general terms, watching that as board directors, because we have ESG commitments, and we have commitments on how to improve, do more with less, and provide better offerings. So it's not a one-for-one, but I think in the round it all comes back together. We have a wide-angle lens and a 360-degree view of these things.

The Board's Indirect Role and Avoiding Frustration

Michael Krigsman:

Andi, as a board member, you don't directly have your hand on the rudder of the ship. You are giving guidance, and then management is actually steering the ship in a more direct way. Is that frustrating for you at all, the fact that you have this indirect role in terms of what the organization does, as opposed to a very direct leadership and management role?

Andi Karaboutis:

When I stepped down from my last operational role, people said are you retiring? And I said no, I'm rewiring and a good friend of mine actually coined that phrase, so I won't take credit for it. And when you rewire, you realize that you're going into a different set of roles and, by the way, I've been doing boards for 10 years, so it wasn't sort of I stopped full time and went into boards. I've been doing them since 2015.

But it's a different role, and with every new role, you ask yourself: what is my accountability? What is my responsibility? What is it that I'm bound to? You heard me say duty of loyalty, duty of care, et cetera. And how do I best do that to show up well for my boards? Now, I will tell you.

I went through growing pains. I've been quoted as saying that if I could go back to my first board, I would give them all an apology, because it was that transition from management, where I wanted to know why are you doing this, et cetera, to a board position. I'm going to use your boat analogy; it was an excellent one. Management is driving the boat, but the board is overseeing, and the board doesn't come in and take the wheel.

Instead, we ask: have we delivered? Is the business strategy on track? Are the financials the way we expected them to come in? We ensure that we're delivering on the commitments made, and we help course correct, whether it's succession planning, looking at the allocation of spend, or looking at competitive pressures when things have changed.

Therefore, maybe we need to do a course correction. So there is the pilot that is piloting or the captain that is driving, but then there's also the control tower that is monitoring, and so the control tower plays its position, knows what it does and has to synthesize a lot of information quickly to be able to be helpful and stay in the role that they're in.

Aligning AI Strategy with Responsibility

Michael Krigsman:

So staying in the role is a key part of success for a board and individual board members.

Andi Karaboutis:

Yes.

Michael Krigsman:

Yes, yes, Mario Garcia asks. He says AI is a game changer, but aligning strategy with responsibility, that's the real challenge. What do you think companies are missing?

Andi Karaboutis:

First of all, understanding the true power of AI. So, education. I don't know that they're missing it, but certainly companies need to truly understand artificial intelligence, its capabilities, and the tools and offerings from software and tech companies and what they could provide. Then, clear tying of initiatives to the outcomes and the strategy of the company and what it's trying to achieve.

Uber-important: measuring, and not being afraid to course correct, because course correction is good. Staying with a strategy, a tool, or a capability that isn't working is just throwing good money after bad, and a sunk cost is a sunk cost. So there's clarity of your strategy, the deliverables to get to your strategy, the outcomes that must happen, the tools and programs that are going to help deliver that, and what I call stringing the thread through all of that. If there's anything, and I'll go back to your direct question of what companies might be missing, it's really understanding the capabilities of AI and what it can or cannot provide for the achievement of the organization's business strategy.

AI Change: Upskilling vs. Culture Change for Boards

Michael Krigsman:

Let's jump to another question, this time again from LinkedIn, from Dimitrios Burampas, and this is a really good question. He asks: do board members face the AI change as an upskilling necessity or as a more holistic culture change?

Andi Karaboutis:

It is an upskilling, as I said before. As boards evolve and realize the dramatic change that tech, cyber, and AI are bringing, board skills, board awareness, and education need to move commensurately.

It is also a big culture change, a culture change within the company. Technology is now not just for programmers, right? I'm not going to make a bold statement and say you never need programmers anymore, but you are moving away from the direction of programmers toward anybody being able to be a citizen developer getting the output they need, or at least more people can; I don't want to offend anybody in the viewership. So I think it's both. It's the composition and focus of the board, it's the composition, focus, and education of the management team, and it is a big culture change across the spectrum, from employees all the way to the board.

Ensuring Ethical and Trustworthy AI

Michael Krigsman:

And this is a question from Lisbeth Shaw, who says how are boards helping their companies address the ethical use of AI and trustworthy AI, especially since doing so may cut into maximizing profit?

Andi Karaboutis:

I'm very proud that, and it's not just the boards I'm on but most boards, ethics trumps profits, right? I will just say it is a darn shame when we find the one or two unicorns that don't operate that way. Ethics trumps profits. So how do boards ensure ethical usage? Through audits, through reviews, through stakeholder interviews, through third-party assessments. We do this on many fronts, around strategy, et cetera, and around financial controls; we have our internal audits.

We have three stages, a first, second, and third line of defense, for financial controls. Well, it should be the same for AI, and I would say it is turning in that direction of three lines of defense around AI; some companies, I believe, do this where they have an AI ethics group.

So the board asks management these questions and ensures these controls are in place. Do we have an ethics committee? What's the latest audit? Is our data secure? How do we know we don't have biases in these programs? By asking these questions and pressing for the responses, and by looking at the data associated with them, that's how we ensure, and make ourselves comfortable, that we are in fact staying ethical.

Board Involvement in Navigating Roadblocks to AI Adoption

Michael Krigsman:

Okay, and on a related question, Justin Mennen on LinkedIn raises again the issue of the critical role of board involvement in navigating roadblocks to AI adoption while maintaining an appropriate risk profile. So, can you elaborate a little bit on this point, which really is how boards can ensure innovation while, at the same time, not taking the ostrich approach and hiding from change?

Andi Karaboutis:

The easiest way is we ask the question: what will it take? What will it take to adopt artificial intelligence and AI tools in order to achieve X outcome or Y outcome? Then the board tries to support management in dedicating resources or funds in capital allocation to remove the roadblocks and achieve the outcomes. Not open-ended, though; "how do we get AI tools in" is not something the board would say, right? But if, for example, we were saying we want to improve inventory management and reduce shrinkage in a retail company, we would ask what it would take to bring tools in to do that.

Management needs to be prepared, and again it's very much a two-way street, Michael, but management needs to be prepared with: our infrastructure is legacy, here's what we need, this is what it would take, or these are the tools we need to invest in, or these are the resources we need to go hire or get help with. Right? But it's: how do we achieve the outcome? What are the roadblocks we can help remove? And then setting up the support structure for management to do that, but again, not just to bring in AI tools, but to achieve the outcome of what these tools could promise for the stakeholders.

The Board's Role in Driving the Organizational Agenda

Michael Krigsman:

Correct me if I'm wrong. In a sense, what you're saying is that the board's responsibility is to drive the organizational agenda while balancing competing forces.

Andi Karaboutis:

It's to agree to the organizational agenda and the company's strategy and then see what the board needs to do to support removing those roadblocks while constantly monitoring that we stay within our risk appetite.

Michael Krigsman:

The way that you're describing it, if I can be totally blunt, sounds kind of like corporate speak. That's pretty much what I said, or am I missing something? What am I missing here?

Andi Karaboutis:

You missed the risk appetite.

Michael Krigsman:

Okay.

Andi Karaboutis:

You said the board will remove things, but there's a check and balance. Look, I'll take the corporate speak out of it, and I love that you called that out, Michael, but there are checks and balances. Again, I'll go back to what I said 40 minutes ago.

There's an offensive play, but we must ensure that we stay within our regulatory, legal, ethical boundaries and I'll always repeat that.

Michael Krigsman:

Very helpful. So, checks and balances are an essential element of this.

Andi Karaboutis:

Absolutely, absolutely. So if I'm asked as a board member, "Andi, would you give us your view and opinion on ChatGPT," and it's great, it's this, it'll do this, and here's what we're training on, and we use six million data sets, and all the rest of it. That's fantastic, but in the same breath I'm going to ask: what are we doing to ensure that corporate data does not get out as we're delivering on this? I can't emphasize it enough; I know I've said it about 12 times, so forgive me.

Addressing Execution Gaps in AI Strategy

Michael Krigsman:

Well, it's clearly an essential part of the board role, so it's great that you're emphasizing it. We have a number of questions left. I'm going to ask you to answer these really quickly because I want to leave time for your advice on how technology leaders can become board members. So, very quickly, from Twitter, from Arsalan Khan again: strategy without execution is hallucination. Sometimes execution is a fraction of what the strategy wanted. What do boards do?

Andi Karaboutis:

We make sure that the plans for execution are sufficient to achieve the strategy. And, by the way, some of those are not just tech plans or AI plans. So this goes to threading the needle: strategy, strategic initiatives to achieve that strategy and that outcome, and execution in line with those initiatives. I know I'm making it sound very simplistic. Quite often, companies have strategic goals but don't have the right strategy to achieve them, which means they don't have the right execution focus and programs. But it's threading the needle.

Board Involvement in AI Training and Change Management

Michael Krigsman:

Let's go back to LinkedIn very quickly again from Isaac Sacolick. Are boards diving into how effective management is in investing in leaders and employees for training around AI and effectiveness on change management, or is this left to be a management issue?

Andi Karaboutis:

If we have set up the outcomes, the commitments, and the strategy we're trying to achieve, there is a set of capabilities, programs, initiatives, efforts, culture changes, et cetera that have been set forth to achieve that, and you measure whether those things are being achieved. Boards will not go in and ask, just sort of ad hoc, how many AI specialists do you have? We will ask: do we have enough specialists to achieve this? Do you have the workforce of the future? Are they located in the right places? Do they have the right reward mechanisms? Are their objectives aligned?

We will ask those questions. Except for the chief executive, we don't hire and fire the management team, et cetera. We have a lot of input, but we make sure that companies are resourced, in the same way that we make sure capital allocation from a financial perspective is sufficient to achieve the outcomes we're trying to achieve.

Michael Krigsman:

Again, you've got the checks and balances to ensure that resources and people are in place to carry out the organization's vision and strategy.

Andi Karaboutis:

That's exactly right. Again, we put our noses in, fingers out: we sniff it and we ensure, but we don't execute. That sort of thing.

Criteria for Joining a Company Board

Michael Krigsman:

We have a question from Cheryl Foulkes Bendy again, who says what do you look for in a company when you're considering joining a board?

Andi Karaboutis:

First of all, is it an industry that I am interested in and can contribute to? Is it a company that I believe I could be interested in, can contribute to, and that can benefit from my expertise?

Is it a board that I can work with? I look at each board member, by name and background, to see if that is a board I can learn from and contribute to. Is it an ethical company? I look at the financials; sometimes they're great, sometimes you can find an opportunity to grow them, but that's not typically a key criterion for me. And through that you start realizing: is it a place where, if I go play on that field, I can actually contribute, give good guidance and good insight, and where my capabilities are needed?

From my personal experience, I consider it a duty and an honor to be on a board.

Effective Collaboration Between Management and the Board

Michael Krigsman:

What advice do you have to business leaders inside an organization on being effective in working with the board?

Andi Karaboutis:

So I have seen management teams that try to manage the board, and to some degree there is a level in that collaboration where you need to manage each other's expectations, et cetera. But for me, it's around collaboration: what does the board need in order to do its job? What does the board need to be effective at its oversight and governance and to help give us guidance? That's what I would look for in a management team.

Quite often, when board members ask questions, the reaction is: oh my gosh, the board asked for this. Relax, right? Why is the board asking? What is the board looking for? And, this is going to sound very trite, Michael, but just like in a good relationship, when someone asks you something, don't go on the offensive or defensive; ask what it is they're looking for and how you can help achieve that, so that they can help you, through oversight, to deliver. That's how I would view it.

Michael Krigsman:

So, really understanding, trying to understand the board member's perspective. What do they need, and how can I, as a member of the company's leadership or management, help that board member achieve whatever it is that they're looking for?

Andi Karaboutis:

That's right, and quite often it's fair to ask, "Why do you ask?" because it could be something completely different from what management might interpret. It could be just curiosity, it could be some other reason, or the board member may say, "I can't really divulge, but I need to understand this bit of information," though typically we go to the CEO for that.

Advice for Tech Leaders Seeking Board Membership

Michael Krigsman:

What advice do you have for technology leaders, CIOs, CTOs and so forth who want to become a board member?

Andi Karaboutis:

Recognize there's no recipe. If someone says to you, "go do this, go do this, go do this," and I've seen that quite often, it is not a recipe. There is chemistry in bringing people onto boards. There are experiences that certain companies are looking for; I mentioned the capability matrix, et cetera. But here is what I would recommend to every tech person: figure out business.

If you're going for a particular company or industry, start understanding that industry. Your trade may be technology and that's what you bring, but you're going on a board for governance oversight, fiduciary responsibility, financial literacy, et cetera. So hone those skills, bring yourself up into that broader umbrella, and lose the tech speak. Quite often we are so impressed with ourselves over all of the deep technology that we know, and we should be very proud of it, but that is not the level boards operate at. Occasionally, tech-heavy boards may want to go deep into things, but you really need to raise your game up a level.

Providing the Right Level of Technical Detail to the Board

Michael Krigsman:

You mentioned this earlier. Can you give us an example to make this point concrete? If I'm a CIO and I'm talking with the board, how much technical detail should I provide?

Andi Karaboutis:

Enough so that they get the picture of the benefit, the risk, and the required dollar allocation, if that's what's being asked for, and so that they can do their job. So recognize the board's job and their oversight and governance role: what do they need to know in order to do their job well? And I'll say something else: I use a lot of analogies and comparisons.

We were trying to describe threat vectors, how people can infiltrate, what the bad actors do, for example, on cyber. So draw some analogies. I used to say, if you put a lock on the front door but the jewels are sitting on the table, that's not good. So maybe you have a safe, and then you have it under the floorboards in the kitchen. That's defense in depth. I would explain it in terms they could relate to. It's not dumbing it down; it's just relating it to something they know.

Michael Krigsman:

That's an interesting point. It's not dumbing it down, but it's presenting it within the context of what's important to them and where their focus lies.

Andi Karaboutis:

Right.

Conclusion and Thank You

Michael Krigsman:

Okay, well, with that, we are out of time, so I want to say an enormous thank you to Adriana Karaboutis, who is a multi-industry board member. She's held senior leadership roles at large organizations like Dell, Biogen and National Grid. Andi, thank you so much for being with us and sharing your expertise with us today.

Andi Karaboutis:

Thank you, Michael. It's a pleasure being here. I appreciate you having me back.

Michael Krigsman:

And a huge thank you to everybody who asked such amazing questions and who was watching today. Before you go, please subscribe to the CXOTalk newsletter; just go to CXOTalk.com. Subscribe to our YouTube channel. We really have extraordinary shows coming up and great discussions, and you should be a part of it, folks. Thank you so much, everybody. I hope you have a great day, and we will see you again next time.

Published Date: Jan 10, 2025

Author: Michael Krigsman

Episode ID: 866