MIT's Andrew McAfee discusses AI culture in business, geek culture impact, ethical AI integration, and future work strategies in CXOTalk #812.
Building an AI Culture: Strategies for Business Leaders
In CXOTalk episode 812, Michael Krigsman speaks with Andrew McAfee, a principal research scientist at MIT, for a detailed discussion on creating a business culture that supports AI. As the author of 'The Geek Way,' McAfee shares lessons drawn from his extensive research on how technological advancements impact business operations and organizational culture.
This conversation is particularly valuable for business leaders interested in learning how to create a culture that supports the strategic role of AI in their organizations.
Key highlights from this episode include:
- The Intersection of AI and Business Culture: Insight into how AI is reshaping business strategies and influencing organizational dynamics.
- 'Geek Culture' in Organizations: Exploration of the concept of 'geek culture' within enterprises and its significance in fostering innovation.
- Ethical and Strategic Implications: Discussion on the ethical aspects of AI integration and strategies for effective implementation in corporate settings.
- Adapting to Technological Change: Guidance on how businesses can evolve to embrace technological advancements and the future of work.
Andrew McAfee is a Principal Research Scientist at the MIT Sloan School of Management, co-founder and co-director of MIT’s Initiative on the Digital Economy, and the inaugural Visiting Fellow at the Technology and Society organization at Google. He studies how technological progress changes the world. His next book The Geek Way will be published by Little, Brown in 2023. His previous books include More from Less and, with Erik Brynjolfsson, The Second Machine Age.
McAfee has written for publications including Foreign Affairs, Harvard Business Review, The Economist, The Wall Street Journal, and The New York Times. He's talked about his work on CNN and 60 Minutes, at the World Economic Forum, TED, and the Aspen Ideas Festival, with Tom Friedman and Fareed Zakaria, and in front of many international and domestic audiences. He's also advised many of the world's largest corporations and organizations, ranging from the IMF to the Boston Red Sox to the US Intelligence Community.
Michael Krigsman is an industry analyst and publisher of CXOTalk. For three decades, he has advised enterprise technology companies on market messaging and positioning strategy. He has written over 1,000 blogs on leadership and digital transformation and created almost 1,000 video interviews with the world’s top business leaders on these topics. His work has been referenced in the media over 1,000 times and in over 50 books. He has presented and moderated panels at numerous industry events around the world.
Transcript
Table of Contents
- About Andy McAfee’s book, The Geek Way
- The culture of AI in business
- Why culture drives corporate performance
- Lessons for creating an AI culture
- Does building an AI strategy require a unique culture?
- Microsoft’s corporate culture under Satya Nadella
- The “industrial era playbook” of workplace culture fails today
- How to change corporate culture
- How to manage resistance to culture change from a leadership team
- Corporate culture examples: SpaceX vs. NASA
- Overcoming challenges and obstacles to culture change
- How to manage the risks of culture change
- How business leaders can prepare for changes in work culture
- About Workhelix from Andy McAfee and Erik Brynjolfsson
- How organizations can drive employee engagement around culture change
- Ethical considerations when driving long-term culture change
- Andy McAfee on the future of work
Michael Krigsman: Welcome to Episode 812 of CXOTalk. We're discussing how to build a culture for AI. Our guest today is Andrew McAfee. He is a principal research scientist at MIT.
I've known Andy for years. With that, Andy McAfee, welcome to CXOTalk.
Andrew McAfee: Michael, I appreciate how you didn't specify the number of years. I think you and I are both happier if we leave that vague.
About Andy McAfee’s book, The Geek Way
Michael Krigsman: Andy, it's great to see you. You just wrote a book called The Geek Way. Tell us about your work and tell us about your book.
Andrew McAfee: I keep on getting interested in this question of how technological progress changes X and X varies. I've written books about how it changes jobs and wages and the impact on the labor force.
I wrote a book about how it changes our relationship with the environment. Eric and I wrote a book about how it changes business models that are possible and smart.
This book is a little bit different because, instead of the X part, this is a book about another kind of technological progress. I'm not trying to get cute here. The company is a technology. The company is a thing that we human beings invented to help us accomplish our aims.
The reason I wrote this book was I became convinced that the technology that is the company has received a big upgrade, and I wanted to write about that upgrade.
The culture of AI in business
Michael Krigsman: A lot of this book relates to culture. When we talk about this culture of AI, how is it different from any other type of culture?
Andrew McAfee: This book, The Geek Way, is largely about culture. And if anybody had told me, even five years ago, that I would write a book about anything that felt or sounded like organizational culture, I would have laughed in their face.
I didn't get a lot of value out of the stuff that I read about culture. I had no desire to contribute to it. It wasn't anywhere on my radar. Then this book came out.
The only explanation that I have was this. As I tried to pattern match and understand what was allowing some companies to do it all (to be agile, to keep innovating, to execute, all at the same time, and to do it as they grew, as they scaled, as they became large companies), all the explanations that I came up with, the things that made sense to me, were not aspects of their capital allocation process. They weren't aspects of their technology stack. They were aspects of how they went about getting things done and what the environment in the company was like. For now, that's a decent definition of culture.
I said to myself, "All right. You're writing a book about culture," because as far as I can tell, that is what separates these companies that I call "geek companies" that are really just performing like crazy versus the kinds of companies that I've also been studying that are getting left behind by the geeks.
Why culture drives corporate performance
Michael Krigsman: Culture then became the determinant, if you will, of corporate performance.
Andrew McAfee: You've heard the old saying. I looked it up. It's actually not that old. People think Peter Drucker said it, but there's no evidence that he did.
"Culture eats strategy for breakfast." Most of us believe that, but I don't think we've been talking about culture in ways that are as helpful as they could be.
Michael Krigsman: What's the right way to talk about it?
Andrew McAfee: Less hand-waving and less virtue signaling are two things that would come in handy. I have always found discussions about culture to be kind of vague, and they always have some kind of... Not always, but they very often have some kind of lesson associated with them. In other words, this is a good culture because it's nice or because it treats people well.
CEOs talked about the culture that they built that allowed them to be as successful as they were, and you could just kind of hear them patting themselves on the back.
I didn't love the whole conversation around culture. But as part of my research for the book, I came across a definition of culture that I adore and that turns it from this kind of vague, handwavy, scolding kind of a conversation into a really, really pragmatic conversation, so I want to read it for you.
It comes from the anthropologist Joe Henrich. He says, "By culture, I mean the large body of practices, techniques, heuristics, tools, motivations, values, and beliefs that we all acquire while growing up, mostly by learning from other people."
Now I want to stress two things there. Joe Henrich concentrates here on people acquiring culture as they grow up, the cultures that they're born in. Let's take that "while growing up" phrase out of it. It's also what happens when you go to work inside an organization.
You are surrounded by people. You absorb the culture. You teach and you learn and you become part of the culture.
The other thing that I love about this definition is that it is rock solid and it's really pragmatic. Culture is how you go about getting things done inside of any kind of human group, including a company. So, I can take this and run with it.
Michael Krigsman: As you said, very often culture is discussed in a holier-than-thou way. This enables me (as the expansion of who I am) to accomplish, if I'm the CEO, or, as you said, it can be very vague and handwavy. Now you're making it practical.
Lessons for creating an AI culture
Can we apply this now to companies that are trying to transform to incorporate, to integrate AI? What are some of the implications there?
Andrew McAfee: The reason I got really excited and decided to write this book was two things came together. The first was all these observations that I was accumulating over the years as I went to go study companies that were just excelling at whatever they chose to do. A lot of those were concentrated in Silicon Valley in northern California, but not all of them.
I was trying to do my pattern matching, trying to figure out what made these companies able to perform at such a high level. That was the first thing. The second thing was I came across a body of scholarly work that I thought explained why the geeks were performing as well as they were.
It's a relatively young discipline. It goes by a few different names. I like the name "cultural evolution" because, to oversimplify a ton, this discipline kind of asks the question, "Why are we human beings the only species on the planet that launches spaceships?"
That's true. There's no controversy about that statement. Chimps launching spaceships makes for good sci-fi. It's not going to happen anytime soon. But why not?
When you dig in on that question, you learn a couple of things. One of the important things you learn is that other species have cultures. They don't have rapid cultural evolution.
There's a great phrase from a psychologist named Steve Stewart-Williams, and he nails it. He says, "Ten thousand years ago, the pinnacle of chimpanzee culture was sticking a twig in a termite mound to get termites out. Today, the pinnacle of chimpanzee culture is sticking a twig in a termite mound to get termites out."
The cultures of other species on this planet don't evolve quickly. Ours absolutely do, and that was kind of a eureka moment for me because, wait a minute; this discipline explains how we are able to change, to evolve our cultures, to get more tools and techniques in our cultural toolkit.
That's kind of the same thing as saying, "How does a company transform itself? How does it innovate? How does it accomplish big, complicated efforts?" These are all flavors of cultural evolution. We should absolutely be importing the insights from the discipline of cultural evolution and applying them to the work of having our companies run better in the ways that we want them to.
Does building an AI strategy require a unique culture?
Michael Krigsman: Is there something unique about AI culture and the attributes of a culture in a company that will be successful in integrating AI into their operations?
Andrew McAfee: I don't think so, so let me turn that question around. What kinds of companies do you think are going to be most successful at incorporating generative AI, the older flavors, machine learning? Are they going to be the mainstays of the industrial era? Is generative AI finally what allows General Electric to make a big comeback? Maybe.
Doesn't it seem a lot more plausible that the companies that have been hitting the ball out of the park (with all the technologies in the 21st century so far) are going to be the ones most successful, most likely to incorporate the newest flavors of technology, including generative AI, and put them to work inside the enterprise? I absolutely think so.
To make that concrete, who is going to do more interesting work with generative AI? Is it going to be Netflix or HBO? I'll bet on Netflix.
Michael Krigsman: The geek culture, as you've been describing it, really is the successful underpinning of corporate performance, in general. But in particular, adoption of any kind of new technology, whether it's AI or anything else.
Andrew McAfee: I believe that. I also believe that the geeks have experimented and iterated their way into a set of cultural practices for managing large, complicated efforts. That can be anything from bringing gen AI into the enterprise to figuring out how to lay a subway system for an entire metro area faster and cheaper than we've been able to do it in the past.
This geek toolkit is very, very broad. That sounds like too strong a claim. Let me boil it down a little bit.
I think what the geeks have figured out how to do most fundamentally is to evolve their cultures in the desired direction no matter what that direction is. It's all cultural evolution, and the geeks have again iterated and experimented their way into a faster pace of cultural evolution.
Once you've got that capability, you can point it at lots of different things. AI is clearly one of them.
Microsoft’s corporate culture under Satya Nadella
Michael Krigsman: What should companies do that don't have this culture because, in a way, you're kind of almost writing off a large swath of enterprise, of business?
Andrew McAfee: Let me explain why I don't think I am. I think any company can get geekier, and the entire book is based on that premise.
My Exhibit A for a company getting geekier in a huge hurry is Microsoft (under Satya Nadella).
Michael, your career, like mine, is long enough to remember all phases of Microsoft's existence: the go-go years from founding up until about the turn of the century, and then the first 10 or 15 years of this century, when Microsoft was a very large corporation but was dead in the water.
The stock price went absolutely nowhere. They might as well have taken their R&D budget and lit it on fire in the front yard for all the good that it did them. They were a deeply sclerotic, bureaucratic company. The in-fighting was kind of terrible. It was this politicized environment.
They had aged in a very, very ungraceful way, and not many people were expecting that to change when Nadella took over. But he became CEO in 2014 and has accomplished one of the great corporate comebacks, I think the great corporate comeback, of my career. He unlocked a crazy amount of value.
I had the chance to interview him for the book. As he was describing what he did, I just found myself going, "Check. Check. Check." This is all just stuff that's in the geek playbook:
- Decentralized authority.
- Push decision-making down.
- Get rid of a lot of the needless bureaucracy that's jamming things up.
- Try to walk away from a culture of defensiveness and toward a culture of more openness.
- Be willing to admit that you're wrong and that a pivot is necessary.
All the things I talk about throughout the book, Nadella talked about them, too. So, I do believe that any company, even if they're in a fairly jammed-up place, can get geekier. What I don't believe is that companies that continue to follow the industrial era playbook are going to continue to be competitive once the geeks come to town.
The “industrial era playbook” of workplace culture fails today
Michael Krigsman: Can you elaborate on that? What do you mean by that, "Companies that follow the industrial era playbook"?
Andrew McAfee: That was the playbook that I learned from and that I started teaching from early in my career. My career is long enough that it goes back to the 20th century.
We believed a few things. We believed that big efforts need to be very carefully planned out in advance. We believed in business process reengineering and in defining all the processes of an organization, specifying all the cross-functional touchpoints that needed to happen, baking those into software, and then just having the entire company kind of run from that playbook.
We believed that people at the top of the org chart got to make the big decisions because their judgment, their intuition, their expertise, and their experience let them rise to that pretty high level. And we believed that the mantra should be the title of one of Jack Welch's books, which was Winning. Your job is to win all the time.
I believe some of those things explicitly. I think I believed others of them implicitly. And now I think that they're all fairly bad advice.
I think that the way you make fast progress is by iterating instead of planning. I believe that the people at the top of the org chart are as overconfident as the rest of us.
They're way too fond of their own ideas and opinions. And they need to be forced to justify their decisions with evidence like everybody else. That's what the scientific method is all about.
I believe in getting rid of a lot of the cross-functional communication and coordination that has characterized a lot of companies and really pushing authority down, decentralizing, building small teams with high autonomy.
I certainly think that winning is important. Growth and profits and all that stuff are what companies are here for. But the way you get there is by being willing to fail, without having failure be a career penalty, and by getting away from this notion that I have to win everything all the time.
That makes people dig in their heels. It makes them unwilling to admit mistakes or missteps. And it turns into this inherently kind of antagonistic, defensive organization, which can be a fairly miserable place to work.
And so, I respect the geeks and their ability to say, "Hey, I was wrong about that. It's time to pivot. This project was a failure, but at least we learned," and you're not going to get fired for taking a big swing and missing. These are all pretty big departures from what I would consider to be the playbook of the 20th century, of the industrial era.
Michael Krigsman: I find it interesting that you're so focused on culture because, when we think of research scientists, I think of very highly quantifiable issues, and culture seems not to be of that nature.
Andrew McAfee: It certainly has been hard to pin down, and I think a couple of things have changed. One is that I came across the definition of culture that I read to you a little while ago and that I like a lot more. Another is that I believe corporate cultures are becoming a lot more quantifiable.
A longtime colleague of mine is a guy named Don Sull. He's at MIT with me now. He and his brother Charlie had a really good idea a while back.
Working with Glassdoor, they took all those free text reviews that people leave online about their experience working at a company. They trained up a machine learning system to recognize when a review was saying very positive things about innovation or very negative things about agility. Then they ran the body of all these Glassdoor reviews through that machine learning system.
They were looking at nine different values, the values that companies talk most often about in their mission statements. But I was really interested in three of those values, and they are innovation, agility, and execution.
Those are the ones that if you polled business academics, those three would be really at the top of the list for connection to performance.
It turns out that for the companies I am talking about here, these geek companies concentrated in Silicon Valley and in the tech industry, if you look at how their own people talk about the culture there, their own people are saying that these companies do have (especially compared to the rest of the economy) very high levels of innovation, execution, and agility simultaneously.
I'm with you. Culture is a hard thing to pin down, and that's why there's so much handwaving and happy talk about it. We're getting better at pinning it down. And the things that we learn as we analyze culture reinforce my belief that something really interesting started out West and it's now diffusing.
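To make the Sull brothers' idea concrete, here is a deliberately simplified sketch of that kind of pipeline: score each free-text employee review on whether it speaks positively or negatively about a cultural value such as "innovation," then average across a company's reviews. The real work trained a machine learning classifier over Glassdoor reviews; this keyword-matching toy (all term lists and function names are my own, purely illustrative) only shows the shape of the approach.

```python
# Hypothetical, simplified illustration of scoring employee reviews
# on a cultural value. Real systems use trained classifiers, not
# keyword lists; this is only a sketch of the pipeline's structure.

VALUE_TERMS = {"innovation": {"innovative", "innovation", "experiment", "creative"}}
POSITIVE = {"great", "encourages", "rewarded", "fast", "love"}
NEGATIVE = {"stifled", "slow", "punished", "bureaucratic", "never"}

def score_review(text: str, value: str) -> int:
    """Return +1, -1, or 0: does this review speak positively or
    negatively about the given cultural value?"""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if not words & VALUE_TERMS[value]:
        return 0  # the review doesn't mention the value at all
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    return (pos > neg) - (neg > pos)  # sign of the sentiment balance

def company_score(reviews: list[str], value: str) -> float:
    """Average sentiment toward a value across reviews that mention it."""
    mentions = [s for s in (score_review(r, value) for r in reviews) if s != 0]
    return sum(mentions) / len(mentions) if mentions else 0.0

reviews = [
    "Management encourages experiment and creative work, great place",
    "Innovation is punished here, bureaucratic and slow",
    "Free snacks in the kitchen",
]
print(company_score(reviews, "innovation"))  # 0.0: one positive, one negative mention
```

Running this over thousands of reviews per company, with a real classifier in place of the keyword lists, is what lets you compare companies quantitatively on values like innovation, agility, and execution.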
Michael Krigsman: Please subscribe to our newsletter and subscribe to our YouTube channel. Check out CXOTalk.com, because we have incredible shows coming up.
How to change corporate culture
Andy, as I speak with senior business leaders on CXOTalk, very often they will talk about trying to create a culture of data, a data culture, and make that pervasive throughout an organization. How should folks go about driving that kind of change?
Andrew McAfee: That's the first of the four great geek norms that I talk about. The bulk of the book is devoted to these four norms.
Norms are an important part of culture. Norms are what the people around you expect of you. They're community policing, so they're very, very strong in any human culture. When you violate a norm, you feel it, and it feels really bad to you.
The first norm that I talk about is the one that you're referring to, which is a norm of science. As you point out, it's about following evidence instead of intuition or judgment or letting evidence be the final decider about things.
The other aspect of science, which is a little bit less appreciated, is that it is an inherently argumentative process. I don't mean argumentative in the sense of people screaming insults at each other, but science is a process of debate.
Science is an ongoing argument with a ground rule: If you believe A and I believe B, we are going to agree on the test, the experiment, the evidence that will tell us whether it's A or B. Then we're going to go gather that evidence. Then we're going to move forward from there.
But we're always going to be arguing. That's inherent to the game. And we have a way to settle those arguments.
We're going to agree in advance on how we're going to settle those arguments. Great. You can bring that norm into a company fairly quickly.
One of the biggest failure modes, though, is that you still collect a lot of data, and you present it, and then you make the decision based on the HiPPO. HiPPO is my new favorite business acronym. It stands for the highest-paid person's opinion.
It's just disheartening because the analytics teams do their work, they bring a recommendation, and then the HiPPO (the person at the top of the org chart) says, "Okay. Thanks. I'm still going to follow my gut on this one." It's extraordinarily unscientific. I think it's also demoralizing.
One of the first things to do is be aware if you're making decisions based on HiPPOs and start doing science instead.
How to manage resistance to culture change from a leadership team
Michael Krigsman: But it's so difficult to make that kind of a change. If you're working in a company and the boss is saying, "Well, we need to do this," and it contradicts what you're trying to do in terms of making data-driven decisions, what are you going to do?
Andrew McAfee: That's a tough problem, and it becomes tougher if your company doesn't have the fourth and final great geek norm that I talk about, which is openness. In a company with that norm, you as the junior person would say, "Hey, boss. We decided we were going to make our decisions based on evidence here, and you're just following your judgment. You're just doing what we would have done without all of the analysis. What gives? Why are we not being scientific about that?"
In a lot of companies, that would be a career-limiting move. That's why I concentrate on it and talk a fair amount about openness in my book The Geek Way because it's a way for people to speak truth to power and for the organization to stay on track and stay true to its stated values or the things that it says it's doing.
However – and I'm sure you're anticipating this – openness is also very fragile. If people see junior people get shot down when they try to speak truth to power, when they see that there's low psychological safety in a particular organizational culture, you're not going to have a lot of openness.
Michael Krigsman: Therefore, the change has to be driven both from the top and from inside the organization.
Andrew McAfee: I think that's right. Some of this stuff does not percolate up all the way. If the people running the company want to make decisions based on their gut, if they don't want to take any risks, or if they refuse to tolerate any failure, or if having a couple of bad outcomes on a project is a career limiting move, those signals get sent and received, and you wind up back in the inherently kind of defensive cultures of the industrial era.
I'll say this again. When that happens, maybe that'll work for a while. When the geeks come to town, when they come to your sector or your industry, you're going to be in trouble because they can evolve their cultures quicker than you can.
Corporate culture examples: SpaceX vs. NASA
Michael Krigsman: What does that mean when the geeks come to your sector, when they have you in their sights?
Andrew McAfee: Let me answer that question as concretely as I can. SpaceX, I believe, is either 21 or 22 years old, founded by Elon Musk, who is an archetypal geek who got started at PayPal, an online payments company.
There's nothing that inherently screams a successful transfer from online payments to building rockets and satellites. That is not an obvious move. But Elon, a space geek from way back, founded SpaceX.
Let's look at what that company has accomplished in its two decades of existence. It's the only company that builds commercially viable, reusable rockets, and it reuses them: they've had more than 100 re-flights. That's where their huge cost advantage in putting payload up into space comes from.
Last year, SpaceX put up, I think, between 80% and 90% of all the satellites that left Earth's surface and 60% of all the payload by weight that left Earth's surface for space. They are now dominating the business of putting things into space. That's chapter one out of three.
Chapter two is that SpaceX is the only organization on the planet that had any ability to deploy thousands of rugged, reliable, high bandwidth, portable satellite Internet terminals into Ukraine, into a warzone after Russia invaded. Nobody else could do that (as far as I can tell) at all.
Chapter three is that, in 2014, NASA gave both Boeing and SpaceX a contract to build a crewed capsule, the thing on top of the rocket that takes astronauts to the ISS and, hopefully, on to the moon. Then, if Elon has his way, on to Mars. So far, SpaceX has completed at least nine missions for NASA, and missions for a few other organizations, with their crewed Dragon capsule with no loss of life. Boeing has not yet had a crewed test of its capsule.
NASA itself has no such ability. It has no vessel of its own that can take human beings off Earth's surface and put them on the ISS or anyplace else.
Now, this is fairly remarkable performance in two decades. To me, it brings up a very closely related question. What have the incumbents in the space industry been doing all this time?
I kind of walked around with a lazy assumption that we were at the frontier of what people could do to get themselves and satellites and everything else into space. SpaceX has come along and, in two decades, shown me how wrong all my assumptions were. It really brings up the question, "What has the entire rest of the global space industry been doing all this time?" I think they should be ashamed.
Michael Krigsman: From your perspective, this distinction is the culture – that's it.
Andrew McAfee: Elon does not understand physics better than the people who work at Lockheed or Boeing or NASA. Absolutely not. He didn't have any of the patents when he started that company. There's nothing in the realms of physics or technology that would give you any idea that SpaceX would be able to do that.
Elon is able to build a company that can evolve its culture extraordinarily rapidly, especially compared to everybody else in the industry.
Overcoming challenges and obstacles to culture change
Michael Krigsman: There are folks in a company who recognize that they must adopt AI throughout their operations. It's new to them. They're not sure what to do. What should they do, what are the obstacles that they are likely to face, and how do they overcome those obstacles?
Andrew McAfee: For something like adopting AI, the great geek norm of speed becomes really important. By speed, I don't just mean how fast people are trying to complete a project or how hard they're working.
I mean speed of iteration. How often are they building something, getting it out there, getting valid and useful feedback about it, and then building the next version and getting it out there?
Michael, you remember the old waterfall method for building software, which was this very logical, linear process. You do this, then you do this, and then you do this. The original diagram, which was drawn in 1970, gave the method its name because it looked like a waterfall splashing down through a series of pools.
The problem is it just doesn't work. You get the requirements wrong. The customers wind up really unhappy. The projects wind up being late. It was kind of a nightmare.
Michael Krigsman: I wrote endlessly about IT project failures, many of which were based on the waterfall method.
Andrew McAfee: You're super-familiar with this, and then you also know that the sea change happened starting one February weekend in 2001 when a bunch of people (who were probably as pissed off about waterfall and the failures of software development as you were) got together in Snowbird, Utah, gathered around a whiteboard, and that was the birth of the agile movement, which is this inherently fast cycle, iterative, high cadence method: build it, get feedback, incorporate it, get the next version out there.
I forget the exact wording of the agile manifesto, but it says our highest priority is to deliver value via the continuous delivery of valuable software. It's about as far away from the waterfall as you can imagine, but it's the right way to manage a complicated project in an uncertain time, and especially if the underlying technology is weird and in flux or not totally known. Then the importance of cadence, feedback, and iteration go up.
When we're talking about AI (and particularly about generative AI, which is just weird), the value, the importance of that iterative approach goes way, way up. If companies are going about AI the way they went about their ERP project in 1996, they've got real problems.
The challenge is that we humans very often fall victim to the planning fallacy. In other words, we think we can get out the sheet of paper, get in front of the whiteboard, and chart out a course into the future, and that's how it's actually going to unfold. It's a very, very deep-rooted human tendency.
Danny Kahneman named it "The Planning Fallacy," and fallacy is the important word there. We overestimate. We are overconfident in our ability to plan things out and get things done.
There is a minimum viable plan out there, and I'm the co-founder of a company that will help an enterprise develop its generative AI plan. But it's the minimum viable plan. Then you need to get out there and start doing things, iterating, and learning as you go. The problem is that's uncomfortable for a lot of organizations.
How to manage the risks of culture change
Michael Krigsman: We have an interesting question from Twitter. How do you factor in the risk of experimentation or managing the risk? That's another aspect of this because experimentation can blow up.
Andrew McAfee: It can literally blow up, and SpaceX has literally blown up many, many rockets. That is on their development path. They realize that when they start a project.
Now, it's important that they blow them up over very unpopulated areas and there are no people on these rockets when they blow up. But I think one of the central insights of geek companies – and I'll use SpaceX again – is the way you build a capsule that is safe enough to take human beings up into space is by iterating a lot and probably blowing things up along the way.
We very often make an assumption that if what you're doing has to be unbelievably safe and has to be error-proof, then the process for building it also has to be very, very conservative and error-proof. I think that's absolutely wrong. I think you let it rip during the planning and during the development phase, during the iterating phase so that you come up with something rock solid and very, very safe at the end.
I'll say it again. SpaceX is the only American organization certified by NASA to take human beings into space. That tells us something.
Michael Krigsman: If we look at a public company like Microsoft, what did they do? What did Satya Nadella do to enable the acceptability of risk-taking?
Andrew McAfee: He did a couple of things. I found it a super interesting conversation.
He described Microsoft as an incredibly defensive organization. He said that if you wanted to be successful at Microsoft, or have a good reputation there, you could never show up at a meeting without the numbers on the tip of your tongue, never be wrong about your idea, and never back down from a position that you held. It was just this deeply entrenched, heels-dug-in, focused-on-winning organization all the time.
Part of Nadella's brilliance, I think, was he knew that he had to inject some vulnerability into Microsoft. Those are two words that—Michael, like you know—they really didn't go together for most of the 21st century.
He talks about a couple of very clever ways that he did it. One thing that he did was, in the absolute top management meeting, he brought in a psychologist who pulled a really clever stunt.
His name is Michael Gervais, and he said, "Okay, who in this room wants to have an amazing experience?" Or he said, "Would you all like to have an amazing experience?"
Of course, the whole room was like, "Yeah. By definition, that sounds great."
He goes, "Okay. I need a volunteer," and then nobody stood up for a long time.
Gervais said, "What's going on here? You all agreed that an amazing experience would be a good thing to have. None of you are willing to stand up in front of your peers and volunteer for something like that. There's just way too much defensiveness here."
That's kind of a clever little demonstration. But then what you do is you lead by example, and you show vulnerability.
I also interviewed Yamini Rangan, who is the CEO of HubSpot based here in New England, in Cambridge. I asked her for an example of that same behavior. She said, with her direct reports, she shared the performance review she got from the company's board: good news and bad news, not just the good news.
She said to her team, "These are the things that I heard from the board. These are the things that I'm going to work on. Here's how I'm going to go about it."
Then, of course, that kind of behavior cascades. Even though Rangan didn't mandate that everybody go do the same thing, her top-level reports went off and a lot of them started to share their performance reviews downward as well.
We humans are incredibly attuned to what high-status people around us do. If the high-status people are behaving in less defensive ways, if they're being vulnerable, if they're saying, "That didn't work. I screwed that up. I'm sorry. Here's where I'm going to do better," that signal is incredibly powerful. Those behaviors stand a much better chance of spreading.
Michael Krigsman: I'm wondering: if one were to listen to senior executives during earnings calls, looking for clues about this cultural acceptance of taking managed risks, could you almost use that as an investing angle when thinking about corporate performance?
Andrew McAfee: I wonder, and I don't know. Someone is going to go try it, pretty clearly, if they haven't already. I think those earnings calls are typically so highly scripted that they might not give you a ton of information.
Now, if you could go plant bugs in executive conference rooms around corporate America – and I want to be super clear; I am not advocating that anybody go plant bugs in corporate conference rooms. Don't do that.
If you could, and then you throw that into gen AI to do the transcription, and then you do a sentiment analysis of it, oh, man, could you trade off that! Again, do not do this.
But you could get a very good idea of the tenor of the organization, what kind of organization is it, just in that kind of speech-to-text and sentiment analysis exercise. You'd learn a ton.
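The speech-to-text-plus-sentiment pipeline McAfee describes can be sketched in miniature. This is a purely illustrative toy, not anyone's actual methodology: the "defensive" and "open" word lists are invented assumptions, and real work would use a trained sentiment model rather than a hand-made lexicon.

```python
# Toy sketch of scoring the "tenor" of a meeting transcript.
# The word lists below are illustrative assumptions, not a validated
# instrument; a real pipeline would use a trained sentiment model.

DEFENSIVE = {"wrong", "fault", "blame", "never", "impossible", "excuse"}
OPEN = {"sorry", "learn", "mistake", "try", "feedback", "help"}

def tenor_score(utterance: str) -> int:
    """Positive = more open language, negative = more defensive."""
    words = {w.strip(".,!?").lower() for w in utterance.split()}
    return len(words & OPEN) - len(words & DEFENSIVE)

def meeting_tenor(transcript: list[str]) -> float:
    """Average tenor score across a transcript's utterances."""
    return sum(tenor_score(u) for u in transcript) / len(transcript)
```

Run over a body of transcripts, even a crude score like this hints at how one organization's conversational style compares with another's, which is the exercise being described.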
Michael Krigsman: I have to tell you, we have been talking with someone at Harvard Business School about doing exactly this kind of analysis on CXOTalk transcripts, because we have this large body of transcripts from senior business leaders.
Andrew McAfee: Yeah! Love it.
Michael Krigsman: We discussed exactly this, looking for these cues.
Andrew McAfee: And that's a flavor of research that is going to happen because our toolkit has improved so much. We no longer need people all over the world transcribing, or expert raters deciding whether a conversation is open and vulnerable or not. We have technology that'll do that for us.
I hope – I think and I hope that there'll be a round of coaching for people that will just listen to a lot of their transcripts and go, "You know you're kind of 80% defensive here. We need to get that dialed back, and here's how we can do that." I think that'll come.
How business leaders can prepare for changes in work culture
Michael Krigsman: We have a really interesting question from LinkedIn from Jason Talbot who says, "In 2029, they say humans will be in touch with AI on another level. How do we prepare and adapt accordingly to this kind of progress?"
Andrew McAfee: I do not know where we will be in our relationship with technology in 2029, AI or anything else. That's actually... Wow. That's six more years of the kind of progress that we have seen, so I don't know.
But the way to get there, the way to have a healthy relationship is to start, again, iterating, trying things, experimenting. These new flavors of AI are so fascinating and so amenable to this geeky style of learning by doing because we typically interact with them by doing this just with normal human speech. It's a very, very easy onramp to interacting with the technology and learning how to get good at it.
A great piece of advice – I think I heard this first from Hal Varian, who is the chief economist at Google – is you want to be a very expensive complement to something that's getting cheap very quickly. AI is making a lot of things cheap. If you are a good partner, a good complement to the AI, that's a really valuable set of skills for you to take into the workforce.
Michael Krigsman: This one comes from Bruno Aziza, who is at Google. He's my friend and been a guest on CXOTalk a few times. Bruno says he loves your comments on culture. Culture is what we do not what we say.
Your geek norm reminds him a lot of Ray Dalio's principles where the norms apply beyond the tech industry. There is a question around how do we do what we do, and how does emotional intelligence fit into your model?
Andrew McAfee: Emotional intelligence is most closely related to openness, I think, where you're not just shouting people down. You're not screaming at your underlings.
Linus Torvalds had to step away from maintaining the Linux kernel for a while because his style was just so emotionally tone-deaf, and he drove people out of the community.
I think you can increase your EQ, and I think you can learn to be a more open person and to change your style of interacting with others to make them feel a little bit less defensive. That's homework we can all take on.
Michael Krigsman: We have another question from LinkedIn. This is from Naeem Hashmi. He says he's been studying the IT landscape of a Fortune 500 company and only a very small number of apps are actually suited to AI, even though those apps may have high impact. His point is that we need to understand the problem before jumping on the AI hype, and he wants to know your thoughts.
Andrew McAfee: I categorically agree. That's why our startup, which is called Workhelix, tries to help an enterprise get its plan together, stage its AI projects in an intelligent, economically smart way.
Companies vary a huge amount in their suitability for AI and in where AI should be applied. But every company has some kind of easy onramps.
About Workhelix from Andy McAfee and Erik Brynjolfsson
Michael Krigsman: Andy, do you want to tell us briefly about your startup?
Andrew McAfee: Michael, like you know, Erik Brynjolfsson is the guy that I've written several books and lots of things with. He and I and Daniel Rock, who is one of Erik's former doctoral students, and James Milin, who is our CEO, we started a company because we noticed that people trying to run enterprises in lots of different industries kept on saying (as we talked about artificial intelligence), "I believe you. I know it's coming. I don't know how to get started. Help me put a plan together. How do I do that?"
We realized that, in the era of pretty good data availability, you don't have to rely on our allegedly expert opinion. You can size up what the company does, size up where the workforce is being deployed, how many of their tasks are amenable to today's gen AI and tomorrow, and use that to do kind of a ranked order list of opportunities. Put that against the technology landscape, and you have a ranked order list of projects. Then you can just start iterating and executing on those.
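The "ranked order list of opportunities" idea described above can be sketched as a simple scoring exercise. This is a hypothetical illustration with invented numbers, not Workhelix's actual methodology: score each job function by the share of its tasks amenable to today's gen AI multiplied by headcount, then sort.

```python
# Hypothetical sketch of ranking AI opportunities across a workforce.
# The scoring rule and the numbers are invented for illustration only.

def rank_opportunities(functions: dict[str, tuple[float, int]]) -> list[tuple[str, float]]:
    """functions maps name -> (share of tasks amenable to gen AI, headcount).
    Returns (name, score) pairs sorted from biggest opportunity to smallest."""
    scored = {name: share * heads for name, (share, heads) in functions.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Invented example data: three job functions at a hypothetical company.
example = {
    "customer support": (0.6, 400),
    "software eng":     (0.4, 300),
    "field sales":      (0.1, 500),
}
```

The output is a ranked project list of the kind McAfee describes; a real version would work from task-level data rather than a single amenability share per function.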
Again, I loosely think of it as a minimum viable plan. You do need a plan. Willy-nilly is not the way to get anything done inside an enterprise. But once you've got that plan, the key is to start trying things and learning as you go.
Michael Krigsman: The company name is Workhelix.
Andrew McAfee: Workhelix.
How organizations can drive employee engagement around culture change
Michael Krigsman: We have another question from Twitter. "How do you help organizations lose that fear of making mistakes or running afoul of the leadership? Where do you even start?"
Andrew McAfee: It is so hard. I found myself, Michael, a while back at a cocktail party standing next to Jeff Bezos, and I was not going to miss this opportunity. [Laughter]
I ambushed him. I said, "Hey, Jeff. When you see other people running great big companies, what is the most common mistake you see them make?"
And he didn't hesitate. He said, "They just become too risk-averse. They just become too afraid to take chances, to fail, to stick their necks out and not have it succeed."
He said, "I spend a huge amount of my time and energy trying to keep Amazon a place where failure is not awesome, but it's okay. And we need to take bigger bets as the scale of the company increases."
He said in one of his annual reports, I believe, "If we're not incubating multibillion-dollar failures inside Amazon right now, we're not trying to innovate at the right scale."
Now I think Alexa, especially in the era of gen AI, is probably one of those multibillion-dollar failures, but Jeff is not going to hang his head in shame. Andy Jassy is not going to hang his head in shame. That's the scale at which you need to be taking risks.
It's hard to inculcate in a company. It's hard to maintain it over time because our natural human tendency is not to fail, to win all the time, and to preserve the status quo. This stuff is hard.
This is a great question. The geek way is not easy, and I think the main reason it's not easy is because it goes counter to a lot of tendencies that are part of being human. You can channel it, and the geek way is brilliant at channeling those tendencies, but we have those tendencies.
Michael Krigsman: Really tough in a public company and especially as a company gets large. I suppose in a startup where the future is totally open and you don't have any past, failure is easy.
Andrew McAfee: I think that's right. But again, we can look at Tesla. Tesla is a public company, and they have this inherently iterative, agile approach. They've missed their deadlines, they've screwed some things up, and they just continue to swing for the fences.
I have no idea if the Cybertruck is going to be a real thing or not. But they're willing to try it. Good for them.
Ethical considerations when driving long-term culture change
Michael Krigsman: Andy, as we finish up, are there ethical considerations that one needs to take into account as you're driving, trying to drive this kind of change?
Andrew McAfee: There are always ethical considerations with a change. But when I look at the evidence about how much people like working at the kinds of companies that try to follow these norms, I get encouraged that these are healthier places to work, and it might be unethical not to try to make them a little geekier.
In 2016, LinkedIn did this really interesting exercise where they looked at the most attractive places to work. In other words, LinkedIn has this massive trove of data about what people are clicking on. What companies—I'm being informal—were getting clicked on the most? What kinds of companies (according to LinkedIn's data) were the most attractive for the professionals on the LinkedIn network?
The top 11 places, numbers 1 through 11, were all West Coast companies, all in what we would call the tech space, and then Tesla, which we incorrectly call a tech company. They make cars. They're just categorized in a different industry.
The top 11 places were all those kinds of companies. I look at that, and I think, "Wow. The rest of the business world might need to wake up a little bit."
Andy McAfee on the future of work
Michael Krigsman: What does all of this mean for the future of work and how people work together with remote work, hybrid work?
Andrew McAfee: I think the geek way is going to spread for the very, very simple reason that it works better. And in a competitive environment, the better way tends to win out.
An analogy here is that Alexander the Great, as he was conquering a huge portion of the known world, never lost a battle. You can bet that, after Alexander's death, his methods were studied very, very carefully, and the way that war was being waged changed. His practices, his norms diffused because they worked better.
When I look at the competitive battles between geeks and industrial-era companies, they're kind of lopsided. I think, until the rest of the world starts to realize the power of what the geeks are doing, it's going to continue to be lopsided. The geek way is going to spread because it works better.
Michael Krigsman: With that, I'm afraid we're out of time. A huge thank you to Principal Research Scientist at MIT, Andrew McAfee. Andy, thank you so much for taking the time to be with us today.
Andrew McAfee: Michael, it is always a pleasure. Thanks for having me.
Michael Krigsman: Thanks to everybody who watched, especially those folks who ask such great questions. You guys are an awesome audience.
Now before you go, please subscribe to our newsletter and subscribe to our YouTube channel, and check out CXOTalk.com because we have incredible shows coming up.
We will see you again next time. Have a great day, everybody.
Published Date: Nov 10, 2023
Author: Michael Krigsman
Episode ID: 812