How to Manage Data and Analytics Teams?

Despite the importance of data and analytics, managing data science teams presents a range of opportunities and challenges. We speak with Sol Rashidi, Chief Analytics Officer at Estée Lauder, who explains various types of data and offers advice on managing analytics teams.

About Sol Rashidi

Sol Rashidi holds 7 patents, with 21 more filed, in the Data & Analytics space, and is a keynote speaker at technology conferences on topics such as machine learning, data and analytics, and emerging operating models for organizations taking on transformations in the D&A space. Prior to joining Estée Lauder as Chief Analytics Officer, Sol was Chief Data & Analytics Officer for Merck, EVP and CDO for Sony Music, and Chief Data & Cognitive Officer for Royal Caribbean.

Transcript

Sol Rashidi: Machines aren't doing the job for us. We're still training the models and you've got companies, vendors, and software providers saying, "We've got embedded ML. We've got embedded AI."

I'm like, well, you've got business rules and SQL scripts running behind the scenes. It's not AI. We just overuse the word.

What does a chief analytics officer do?

Michael Krigsman: We're speaking with Sol Rashidi, Chief Analytics Officer of Estée Lauder. Tell us briefly about your background.

Sol Rashidi: My running joke is that no one ever grows up saying, "Hey, I want to go into data and analytics," and gets juiced up about it. You hear, "I want to be a doctor. I want to be an astronaut, a president," but being in data and analytics wasn't a thing.

I went for management consulting and my first gig in the industry was as a chief data and cognitive officer for Royal Caribbean. The scope of the work was really around the data ecosystem. We were producing and building the data ecosystem that was going to power all the digital products the digital team was building. We were essentially a service provider.

From there, I went to become the CDO for Sony Music. From there, a very short stint at Merck as their chief data and analytics officer. My career has progressed from being the chief data officer to now the chief analytics officer with Estée Lauder.

Michael Krigsman: What does a chief analytics officer actually do?

Sol Rashidi: The end result is ultimately about insights. We talk about insights, but a feeder into that is the analytics.

In the data world, depending on the scope of the CDO role (in the organization that they're in) it's around aggregation; consolidation; first-party, second-party, third-party data ecosystems; connecting the information, not just collecting the information; data quality; data fidelity; what I call sort of the defensive playbook. It's all around the backend ecosystem that's going to support insights and analytics.

The offensive playbook is really around the analytics that you're going to drive from the data. It's really around the insights that you're going to generate from the analytics. And it's really around working with the business of, "Hey, we're picking up on these market trends. If I use historical past performance data that we've leveraged with these market trends, here's how I think the business is going to operate in the future or portfolios that we should diversify."
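To make the offensive playbook concrete, here is a minimal sketch (not Sol Rashidi's actual method) of projecting future performance from historical data and adjusting it with a market-trend signal; all numbers, including the 3% market factor, are hypothetical:

```python
import numpy as np

# Hypothetical quarterly revenue history (in $M)
history = np.array([42.0, 44.1, 45.9, 47.8, 50.2, 51.9])
quarters = np.arange(len(history))

# Fit a simple linear trend to past performance
slope, intercept = np.polyfit(quarters, history, deg=1)

# Assumed external market-trend adjustment, e.g., +3% category growth
market_trend = 1.03

# Project the next four quarters, blending the trend with the market signal
future_quarters = np.arange(len(history), len(history) + 4)
forecast = (slope * future_quarters + intercept) * market_trend

for q, rev in zip(future_quarters, forecast):
    print(f"Q{q + 1}: ${rev:.1f}M")
```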

The chief analytics officer is really focused on the offensive playbook because data is a means to the end, but the end is really around analytics and insights. So, in this particular case, it's going to be an ecosystem of both data and analytics versus mostly just focused on data.

Michael Krigsman: What are the outcomes that you're trying to achieve? You're looking at the full data lifecycle, determining what to collect, collecting it, aggregating it, and then doing something with it. What is it that you're trying to do with that data?

Sol Rashidi: Regardless of the title, and regardless of the company, the biggest objective is supporting the presidents who run the businesses and have a P&L to manage. They need to either continue current growth, capture additional market share to accelerate their growth, or maintain their current positioning within a very competitive environment. We are a support service for that P&L maintenance and growth. So, it's all around supporting the individuals who run the P&L.

We're not collecting data just for the sake of collecting data. We're collecting data to do something with it. We're rallying around generating analytics and insights.

How do we empower? How do we work with the business to say, "These are your objectives. We can be a service provider to support you with elements, components, analytics, and insights to support your objectives"?

Data acquisition and data gathering

Michael Krigsman: Where are you getting the data? How are you deciding what kind of data you should be collecting?

Sol Rashidi: You can take one of two postures. One is: let's collect any and every data element and data set that we can get our hands on, whether it's first-party, second-party, or third-party. Or you can take the posture of being use case driven: I have a use case; I need the data to support it.

To be honest with you, I've managed both. I don't think either is ideal. If you take a very use-case-by-use-case approach, you're always going to be limited to the data sets that support that use case. I don't think that's going to give you the insights, necessarily, because the breadth, depth, and span of data sets you have by default are limited to the existing use cases that have come about.

Just being a data hoarder also doesn't solve the problem. There has to be a rhyme or reason.

I think, if you take a look at those two postures, I'm working on finding the balance in between: identifying the fundamental data sets that we need—consumer, product—and prioritizing those first and foremost, then going to the next data sets, and then the next, so that, at a minimum, we have prioritized the data sets that are fundamental and foundational to the organization and our aperture of data sets is comprehensive. And then give people an arena to be able to use that.

I've gone from, "Let's go data hoarding," to "Let's support it use-case-by-use-case," to now, "Let's prioritize the data sets that we think are going to run our business in the future."
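As a rough illustration of that prioritization schema, the sketch below ranks data sets by how foundational they are rather than hoarding everything or chasing one use case at a time; the tiers and data set names are invented assumptions, not an actual catalog:

```python
# Illustrative tiers: rank data sets by how foundational they are.
FOUNDATIONAL_TIERS = {
    1: ["consumer", "product"],             # fundamental to the organization
    2: ["sales_transactions", "loyalty"],   # the next wave of data sets
    3: ["third_party_market", "social"],    # breadth beyond the basics
}

def onboarding_order(catalog):
    """Return the data sets in a catalog in priority order."""
    ranked = []
    for tier in sorted(FOUNDATIONAL_TIERS):
        ranked += [d for d in FOUNDATIONAL_TIERS[tier] if d in catalog]
    # Anything unranked is deferred rather than hoarded up front.
    return ranked

print(onboarding_order(["social", "consumer", "loyalty", "clickstream"]))
# -> ['consumer', 'loyalty', 'social']
```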

Michael Krigsman: How do you advise business executives or business leaders to navigate these different approaches? Again, as I talk with business leaders and also data scientists, it seems like this is a really hard challenge.

Sol Rashidi: In my experience, they don't care. [Laughter] I don't think that's for them to decide. I think it's for whoever the CDO is or, if you don't have a CDO, the CIO or, maybe, if there's one, the CAO.

I think, ultimately, whoever has been empowered to get this organized to enable and empower the business, the decision is on that individual but with the right objectives, purpose, influences as to why these things are important and how it's going to support the business of the future. Then you get everyone else aligned to the prioritization schema that you've established.

To be honest with you, the businesses don't want to get involved in those details. They're just like, "I need an answer to my question and I'm tired of waiting four months for a report to be spun up," or "I need an answer to my question, and any time I ask my data team, it takes them weeks because they have to go on an Easter egg hunt to be able to find all the data sets." They just kind of want that stuff fixed.

Michael Krigsman: So, they're not looking at the technology aspects. They're saying, "It's taking too long to get a report. Give me my answer."

Sol Rashidi: Nothing is quick enough and nothing is comprehensive enough. As soon as you do provide something that's quick, they want more. The demand and supply, you're just always in a net negative game there.

Yeah, they don't care about the decisions being made behind the scenes. All they want is better results in terms of timeframe, cohesiveness, comprehensiveness, and your acumen in understanding their business. When you do present something, you need to have validated the numbers so that the first thing they spot isn't, "Well, this is wrong," because if your numbers don't align with the way they're reporting their numbers, you completely lose credibility. And so, you can't do this alone.

If there's a business analyst or a data analyst or a finance counterpart supporting that business, no matter what you've put together, you've got to make sure your data models, your logic, and your data sets align with the financial individuals who are providing the numbers to make sure there's integrity and fidelity in what you're presenting to the business executive. Otherwise, they'll dismiss it right off the bat.

How to align data science and analytics with Finance?

Michael Krigsman: Sol, you mentioned the real importance of having that understanding of the business and especially aligning closely with the financial folks. How do you achieve that understanding because you can't be a domain expert across every part of the company? How do you do that?

Sol Rashidi: It's a three-pronged approach. One, you build relationships and you find the people who are willing to just have another 30-minute meeting with you or have an hour meeting with you and tell you the way things are. You go to them for coaching, counseling, and guidance: "I need to produce this, but I've got four different agendas to appease. I'm not sure which one to align to, and the last thing I need to do is contain the scope of this so I can deliver something in a few weeks or a few months."

You kind of need your inner circle, your advisory council, and they've got to be individuals who have been with the company a really long time. It's really hard (in the first six months) to identify who they are, but some, as you're doing your one-on-ones when you join a company or a new team, are just more open to it. Gravitate towards them, and use them as your bench and your council. I would say that's the first prong.

The second is when you have a use case, business problem, project, or program that you need to run for a business: use that opportunity, because they've now pulled you in. You've been invited to the dinner table. You officially have an invitation. I think the more you express interest in learning their business, the better off you'll be.

Now, you've got to balance how many people to include in that meeting (as you're learning the business). You can't bring your entire team who needs to learn the business. But that's the opportunity to really start unpacking how they think through things and also start establishing the maturity level of the business. That's going to give you an indication of how much you can be hands-off versus hands-on in the discovery process, requirements gathering, solutioning, development, engineering, and all of the above.

Build the relationships. Use your councils. Use different use cases or projects (that you're being pulled into) to really understand the business because they've now invited you to the dinner table, and now is an opportunity.

The third is, I would actually sit with their finance team. Every business team has a financial operation that they go to, to say, "Tell me about my performance across the following categories, sectors, product portfolios." Those are the numbers that they eat, live, and breathe.

Attend the financial meetings. Sit with them and understand their models. Where are they pulling the data sets from? What are their sources of truth?

I think, if you combine those three, you now can start producing deliverables or products or insights that align with the way they've been viewing their numbers historically or the data historically. And you've automatically built some trust.

Michael Krigsman: It seems like you've got to have a really clear understanding (from a product or a service standpoint) of what these folks are delivering and, at the same time, be really clear with the financial folks.

Sol Rashidi: Yes.

Michael Krigsman: It sounds like you're doing both really strongly at the same time.

Sol Rashidi: One hundred percent. The finance team is, at the onset, your very first data and analytics team. All they do is spreadsheets, aggregations, models, and forecasting. It's where it all started.

If the company is depending on a core group of individuals to serve each business, partner with them because what they do matters. Now, you may need to aggregate not only the financial numbers but also consumer-level data, product-level data, whatever it may be, but at the root of it all, everyone wants to understand performance, growth opportunities, how certain product portfolios are performing, and where the areas of opportunity are to capture and capitalize on. It all starts with finance.

Data and analytics vs. opinion and intuition

Michael Krigsman: We have a couple of questions from Twitter. Arsalan Khan—who is a regular listener and I always thank him because he asks such outstanding questions—says, "How do you balance between what the data is telling you versus what your business executives believe to be true?"

Sol Rashidi: [Laughter] Yes.

Michael Krigsman: "Sometimes, business executives veto the analytics and go with their gut."

Sol Rashidi: You're never going to overcome that until you have one or two wins under your belt and then they'll start listening to you. I'll share a story.

At one of the companies, I was responsible for building a 360-view of the consumer. Because consumer care was going to one portal, customer service was going to another portal, the finance team was going to another portal, the marketing teams were going to another portal, the sales team, et cetera, the consumer data was fragmented across about 11 different applications. In total, there were 76 source systems or platforms that held consumer information, but we decided that the ones that matter come down to 21. Just like that prioritization schema: we're not going to hoard all the information, but what are the data sets that matter?

We determined that, of the 76, 21 mattered. Now, we could have solved a convenience problem, in that we could, yeah, build a single UI and consolidate these eleven applications into one, but I think the intention was really around new insights.

So, we had built this 360-view of the consumer, single-pane UI. It was sexy, cool, fun, great buzz. The business loved it because what had happened was, instead of them having to go on an Easter egg hunt across multiple applications, it was a one-stop shop for them. But that wasn't really why we built it.

We built it to generate new insights about the consumers that we were serving. But the business wasn't using it that way. Okay, no problem.

Every Friday afternoon, I had sort of a rule of, like, "All right. Happy hour, we're just going to noodle over the things that we've built, opportunities. Let's think big. Let's get away from our laptops and computers and let's just do a huddle."

We decided that we were going to look at our own product and see if we could find new insights. Lo and behold, what we discovered was that our loyalty members, who were cherished dearly within the company and whom we would bend over backward to appease (because there was a notion that we were making the most revenue from them), had margins that were single-digit and sometimes negative.

What had happened was, when we finally combined the financial data with the customer service data and the platform data at the business level, we realized that we had four call centers and the systems weren't integrated. Our highest loyalty members figured out that if they called and complained to Wichita, we'd comp them; they could call and complain to Tallahassee, and we'd comp them again. They could call and complain to Tacoma; we'd comp them again.

When you looked at the overall net spend, because of all the comps, we weren't making the money that we thought we were. The margins were minimal if not negative.
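As a hypothetical reconstruction of that discovery, the sketch below joins per-member revenue with comps logged in separate call-center systems and recomputes the margin. All figures, names, and columns are invented for illustration:

```python
import pandas as pd

# Invented per-member financials for three loyalty members
revenue = pd.DataFrame({
    "member_id": [1, 2, 3],
    "revenue":   [9_000, 12_000, 7_500],
    "cost":      [7_800, 10_300, 6_600],
})

# Comps logged separately in each (unintegrated) call center's system
comps = pd.DataFrame({
    "member_id":   [1, 1, 1, 2, 3, 3],
    "call_center": ["Wichita", "Tallahassee", "Tacoma",
                    "Wichita", "Tacoma", "Wichita"],
    "comp_value":  [600, 550, 700, 400, 500, 450],
})

# Only once the comps are combined does the true margin appear
total_comps = comps.groupby("member_id")["comp_value"].sum()
merged = revenue.set_index("member_id").join(total_comps).fillna(0)
merged["margin_pct"] = (
    (merged["revenue"] - merged["cost"] - merged["comp_value"])
    / merged["revenue"] * 100
)

print(merged[["margin_pct"]].round(1))
# Member 1: (9000 - 7800 - 1850) / 9000 -> -7.2%, a negative margin
```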

I approached my boss. I said, "I think we've got something. We've been literally noodling over this for about a month and we've been wracking our brain."

He's like, "All right. Then bring it up in the next EC meeting."

We were in the executive steering committee meeting and it was my turn to give a status. "Here are the products. Here's the data ecosystem. Here's how we're doing."

The CEO asked, "All right. Any net new insights?"

I said, "Well, we've got something but we need to vet it out a little bit more."

My boss looked at me. "It's okay. Share. You guys have done your due diligence."

I said, "There's a possibility that there's a consumer class that we think we're making the most money on. There's a potential that it's not double-digit margins but single-digit to potentially negative margins. But let me do my homework and I'll come back."

The CEO (when the meeting ended) patted me on the back, "That's why you're here."

My boss said, "Great job on delivery," and patted me on the back. "That's why you're here."

The brand president looked at me and said, "You will never work for us again."

Well, that's pretty strong. Well, why? Because the business had been operating under the notion that this consumer class was extremely profitable. Here's this new individual, new to the company, who, yes, knows the tech stacks and knows modern ways of doing analytics, blah-blah-blah, but who hadn't been a part of the business, hadn't run a P&L, and had no credibility with them just yet. They completely dismissed it – completely.

What I ended up doing was just trying to save face like, "I'm sorry. I apologize. We are probably wrong, and I will publicly announce that."

What I ended up doing is I brought their analyst into the picture. I said, "I need you to troubleshoot what we did wrong because I would like to announce that this was incorrect. Our bad. Next time, we're going to do some more due diligence."

Well, it turns out their analyst came in and said, "Actually, you guys aren't wrong. Shit." [Laughter] Pardon my French.

Michael Krigsman: [Laughter]

Sol Rashidi: They had a moment like that because, based on experience, based on instincts, based on the number of years of running a business that's been successful, they hadn't had to view it in a different way. When the analyst told them something different, they were like, "I'm a successful business. I know how to run my business. Don't tell me how to run my business," even though we were right.

Anyway, from there, that's when I learned it's not a matter of the analytics, per se. It's about how involved you get the business in the analytics you're trying to generate and the new insights you're trying to present. Don't take credit for it. Let them take credit for it because only they can call their baby ugly. You can't call someone else's baby ugly.

That's another mistake that I learned from: you've got to include the business from the very onset. You'll always have challenges getting them engaged and keeping them engaged. They just want the answers. It can't work that way.

You've always got to have one person that's on point that you could constantly reach out to and go, "I need you to validate this. I need you to validate this." In every status, you've got to be maniacal about saying, "So-and-so from the business validated this."

Then, when you present, you show the list of individuals you worked with from the business, that they validated it, and so that when you're generating analytics or presenting the new insights, there's credibility that their own team was involved in the journey. Now it's their story, not an outsider's team telling them their story.

How to reduce bias in data science?

Michael Krigsman: I guess really what you're talking about is the tension between what is true (based on the data) versus what feels good based on our pre-existing bias. Let's not have a discussion now about politics and vaccines and all of that.

Sol Rashidi: [Laughter] Yeah. Yeah, there's an old adage, right? "Feelings aren't facts," but, in business, sometimes they are.

Michael Krigsman: I can see the approach that you took. It's very skillful in terms of bringing them into the conversation because it's not just the data; it's the trust and the confidence in how we arrived at that data.

Sol Rashidi: That's it, 100%.

Michael Krigsman: That seems to be a really fundamental issue here in terms of being successful doing the kinds of work that you do.

Sol Rashidi: One hundred percent, and I think that's why sometimes data science teams struggle. Most of what's built becomes shelf-ware.

Unless you're in this very specific industry (like the financial industry, insurance, or whatever it may be) where the stuff that you're building legitimately goes into products, most data science teams are doing one-off use cases and they're building decks to share their results.

But you share it and then what ends up happening? They listen. They go, "That's interesting."

But I'm willing to put money on the table that says, "Okay, what are they doing with that now and how are they acting different or making different business decisions?"

More times than not, it ends up as shelf-ware. You're always going to have analysts or data scientists on the D&A team who are okay with that. But for me, I always say that no matter what position I've taken, I always want to leave one lasting impression: without the existence of our group, that business would still be doing things the way they always have and wouldn't have gained a business benefit.

Always leave a legacy behind. For me, when you bring the business along with the journey, the stickiness factor kicks in and it sticks.

Now, in two years, people are going to forget who even did it, who created it, or what the net new insight was. It'll just be like, "Well, how did we live without this information?" because it's so foundational, just fundamental, to the business.

It doesn't matter. You know that you had influence. You've changed the way the business is operating, and the stickiness factor only kicks in when they're along for the ride. The only caveat to that is it's going to take time because they're never available when you want them to be available.

Michael Krigsman: We have a question from Twitter from Gus Bekdash who asks a really good question. He wants to know what kind of training folks can get to better understand how to use data (executives, leaders). I'll broaden that to maybe you can tell us how to get started, how to get started doing data and analytics in this kind of way that you've been describing and, along the way, what do business leaders need to know?

Sol Rashidi: We've raised the acumen but no one takes an interest to the level that we take an interest in. You can't expect that. It's our business. It's not their business.

They're running the P&L. That's their job. How do I maintain market share? How do I grow market share? How do I diversify my portfolio? That's their job and that's where a majority of their capacity is focused.

But if an organization is investing in this, or they've invested in it but it hasn't really kicked off the way they need it to, the one thing that I have sold to the powers above me, so the C-suite above my C-suite, is we should probably have an executive boot camp around data and analytics. I say two days is enough, but never back-to-back, and it should probably be four hours day one, three hours day two. It's non-negotiable. It's mandatory.

The way I make a business case for it is you're about to spend millions of dollars in a practice that may not take flight because the pure dependency on this is based on the understanding of the executives. For me, the ROI is it's four hours of the executives' time on one day, three to four hours on the second day. We have a higher rate of success in terms of any of the things that we want to deploy and sponsorship. It's worth the time.

I'll be honest. Half the time, I've been successful. Half the time, I haven't been. But the ones where I have been successful, I've worked with other partners (because it's always important that you're not the one pitching the story).

You may know the story. You may know the area. But I've partnered with McKinsey before. I've partnered with Deloitte before. I've partnered with multiple partners in the past of we need to do an executive boot camp around data and analytics.

What is the difference between data science, data engineering, data analysts, business analysts, modeling, aggregation? Just basic fundamentals because it's not all D&A. Who is doing what and what goes into producing something?

One, they have an appreciation for it. Two, they understand how things are generated and that not everything leads to a report or a dashboard. It can be interactive portals; it can be self-service. There are different ways of performing analytics. A boot camp gives them a base knowledge around data and analytics, and that has better influence. I have found that helps.

Also, to supplement that, I've always asked them, "Designate someone in your team that's going to be your point person for D&A. It's got to be your number two or number three person."

When I go to a business, I'm like, "Who is your number two or number three? Can you give them the responsibility for data and analytics so that I can work with them?" because if it's coming from that source, by default, the executive is going to listen. You've also got to find the right allies in the business.

Michael Krigsman: To what extent are business leaders willing to accept this kind of training when, as you said earlier, many of them, I'm sure, simply want to come to you and say, "I need this report. Can you make sure I get it faster? And I don't want to know anything more about it"?

Sol Rashidi: You're always going to get that mixed bag of nuts, and that's just the reality of our jobs. I always say there are just certain jobs that are very thankless. [Laughter] You've just got to get comfortable with it. You're never going to change that paradigm. I've been trying to for many years.

What are critical aspects of organizational maturity in data science and analytics?

Michael Krigsman: We have a very interesting and important question from Twitter. This is from Dr. Alexander Bockelmann. "You spoke earlier about organizational maturity. What are the key maturity dimensions to look at, and what are the most critical aspects of organizational maturity for you?"

Sol Rashidi: I think there are a few dimensions that go into maturity level. One, I look at the hard skills. With many of the businesses, whether they're structured by country, region, product line, brand, or label, you've got to look at the structure as a whole. How many different dissections occur? Most of us operate in a very matrixed environment.

Within those pods of teams and groups that you have to serve, I do look at the hard skills. Do they even have a designated D&A team: a data analyst, a business analyst? Who is on point for this stuff?

If they don't have one, that's a telltale sign that, on the pendulum of one to five, they're not there because there just isn't a focus and they haven't invested in it (for a multitude of reasons).

If they do have individuals, I look at their hard skills. Can they actually write SQL queries? Are they advanced enough? Do they have statisticians? Can they write R scripts? Have they hired data scientists? Can they write Python scripts? Then I look at the workbenches that they work at.

Sometimes, we tag people as data scientists who are really doing data analyst jobs. They're not leveraging clustering functions, k-NN, or random forests. They're just writing basic SQL select statements, and that's how they're doing their job.
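A hedged illustration of that distinction, with invented table and feature names: the first snippet is the descriptive, select-statement work of a data analyst; the second is the kind of model training that warrants the data scientist tag:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Analyst work: a basic descriptive pull of historical performance
ANALYST_QUERY = """
SELECT region, SUM(revenue) AS total_revenue
FROM sales
WHERE fiscal_year = 2020
GROUP BY region;
"""

# Data-science work: training a random forest on (toy) member features
X = np.random.rand(200, 4)                  # hypothetical features
y = (X[:, 0] + X[:, 3] > 1.0).astype(int)   # hypothetical churn label

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```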

I try to look at whether they even have a designated team. If they have a designated team, what's its function? I look at the hard skills because, if they don't have any hard skills and they only have soft skills like champions or translators or business analysts, that's another indicator of their maturity level.

But then, if I have the time, I look at all the reports or dashboards or portals that they have access to, to see how they're viewing the information at hand. If it's continually around descriptive analytics—so, I want to understand performance based on historical data—that's another indication for me.

I hate to say it but the last-last one, and this is probably going to anger a few folks, anyone who says that we're doing AI as a broad-brushstroke term, that's a clear indication for me as well of their maturity level. The notion of AI is a very big, broad-brushstroke notion. It's the kitchen sink of a multitude of practices.

Most people who are seasoned veterans will never say, "I do AI," or "We are doing AI." They will legitimately talk about their models, their techniques, the natural language processing they used to extract sentiment, and the dictionaries and references they created. They will never say, "I'm doing AI."

If I combine the assessment of whether they have a team, hard skills, soft skills, their use of the term AI, and how much of their budget is allocated towards this investment, that gives me a good indication of where they are on the maturity curve.
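One way to picture combining those indicators is the toy scoring function below, mapped onto the one-to-five pendulum Sol mentions; the weights and the budget cap are illustrative assumptions, not a rubric from the interview:

```python
def maturity_score(has_team: bool, hard_skills: bool, soft_skills_only: bool,
                   says_we_do_ai: bool, budget_pct: float) -> float:
    """Rough 1-to-5 maturity estimate from the indicators discussed."""
    score = 1.0
    if has_team:
        score += 1.0                      # a designated D&A team exists
    if hard_skills:
        score += 1.0                      # SQL, R, Python, statisticians
    if soft_skills_only:
        score -= 0.5                      # only champions/translators
    if says_we_do_ai:
        score -= 0.5                      # broad-brushstroke "we do AI"
    score += min(budget_pct / 5.0, 1.0)   # capped credit for investment
    return max(1.0, min(5.0, score))

print(maturity_score(has_team=True, hard_skills=True, soft_skills_only=False,
                     says_we_do_ai=False, budget_pct=4.0))  # -> 3.8
```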

Michael Krigsman: How important is this to you, this level of maturity in the organization?

Sol Rashidi: For me, fundamentally, because we can't be everywhere and we can't be everything to everyone, it helps to know which teams need sort of a one-stop-shop approach towards analytics versus those that can be self-service based or can, quite frankly, be autonomous and on their own but tap into you for best practices, for ideas, or when there are opportunities to partner on things we may be able to do faster, whatever it may be.

This actually helps me with capacity planning. It also helps me understand which teams to partner with and which ones not to partner with because, if on a maturity scale, there is tremendous interest because they're being great corporate citizens but there's no investment, there's no talent, and they're not giving time, it's hard to invest in a team like that because you don't know if it's lip service. You don't know if they're too small for the organization and you probably shouldn't prioritize there.

If there's a team that's not necessarily producing the most, however, they've made the investments, they have a team in place, they just need to accelerate what they're doing, well, that's a team you can really support. You want to be helping them in their journey to growth. It helps me with capacity planning because we can't be everywhere at the same time.

Michael Krigsman: Why this point about AI, the point that if they're using AI as a buzzword, that sends a strong negative signal for you?

Sol Rashidi: When I left management consulting and went into industry, my last gig in management consulting was launching Watson. I really learned about that space, what it means to truly productionalize AI at scale, and how immature we really are.

Machines aren't doing the job for us. We're still training the models and you've got companies, vendors, and software providers saying, "We've got embedded ML. We've got embedded AI."

I'm like, well, you've got business rules and SQL scripts running behind the scenes. It's not AI. We just overuse the word.
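A minimal sketch of what that often looks like in practice: decision logic marketed as "embedded AI" that is really hand-written business rules. The segment names and thresholds are invented:

```python
def embedded_ai_segment(customer: dict) -> str:
    """Vendor-labeled 'AI' that is actually hand-coded business rules:
    nothing here was learned from data, and no model was trained."""
    if customer["lifetime_spend"] > 10_000 and customer["complaints"] < 2:
        return "VIP"
    if customer["days_since_purchase"] > 365:
        return "AT_RISK"
    return "STANDARD"

print(embedded_ai_segment(
    {"lifetime_spend": 12_000, "complaints": 0, "days_since_purchase": 40}
))  # -> VIP: deterministic if/else, the same answer every time
```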

Then I always ask, "Are we talking about augmented intelligence, automated intelligence?" because nothing is artificial. We're still fingers to keyboard in training those models. There's nothing artificial.

It's not a silver bullet. You don't activate something and it gives you an answer. That world doesn't exist.

And so, I think the word AI, one, there's a lot of hype just like IoT back in the day, digital transformation. They're just broad-brushstroke terms.

Two, I think it gets really dangerous because people view it as a silver bullet and it's going to solve all their problems and it doesn't. We're still fingers to keyboard. It's still extremely and heavily involved.

Three, I just don't believe there's anything artificial yet. It's coming. I don't think we're there yet, but it's a personal point of view, so please take it with a grain of salt.

How to encourage organizational cooperation?

Michael Krigsman: We have another question from Twitter. Again, this is from Arsalan Khan, who says, "Data and analytics teams are trying to change the status quo. So are many other departments: IT, operations. How can we ensure that these different departments do not overstep on each other's goals and work in conflict as opposed to working towards the same goal?"

Sol Rashidi: I don't think that's possible yet. I always say, in a position like mine, you need a backbone and not a wishbone. There's going to be conflict because you've now taken a classic organizational structure where you've got business and you've got IT. The two have always remained separate, but the business of data and analytics is a multidisciplinary field that fuses business and technology together. We are that Venn diagram in between the two circles.

We naturally—and this is the reality of what we do—are never a part of the business because we're not running the P&L. We're kind of like at the side dinner table during Thanksgiving when the real dinner table has outgrown its seats and so they're just going to tack on something.

You're never really a part of the business, but you're serving the business. You're never really a part of IT, but you have to partner with IT to unlock and enable because a lot of what we do has a technology dependence.

You've got this group of individuals who have to learn the business to generate the analytics and insights, but they have to have a technical acumen because everything we do is around data engineering and data modeling. It's always going to be slightly awkward unless this D&A team has complete technical ownership and sits within the business or has complete technical and business ownership and sits within the IT team.

This is my fourth post now, and I've never been in IT. I've always been within the business, and that Venn diagram just exists. It's uncomfortable for the first six to nine months because you're trying to figure out roles and responsibilities because data and analytics fundamentally is owned by everyone and no one.

You're crossing over into everyone's swim lane, and your job is not to solve world hunger or boil the ocean. You've got to pick and choose where the opportunities and gaps are: areas that are claimed but haven't seen the investment, so they need subject matter experts to run them, or gaps you can claim by saying, "The symptoms we're feeling are a result of these root causes. This is what we're going to fix as a team, and here's how we're going to start."

Then you can say, "Hey, let's talk about roles and responsibilities, RACIs, operating models," and those conversations are never fun. Like I said, we did not choose this field. It chose us. [Laughter] You've just got to ride the wave.

Michael Krigsman: What I find fascinating is how success in this highly technical field comes down to these human and political issues.

Sol Rashidi: Always. We are the team that crosses over into swim lanes. Sometimes we're welcomed in, and sometimes we're not welcomed in.

Michael Krigsman: We have another comment from Twitter. This is again from Dr. Alexander Bockelmann, who I find to be amazing. He says that, for him, culture is the key dimension. We were talking earlier about metrics and organizational maturity. He says culture is the key dimension for him. If data-driven decisioning and data affinity are not part of the culture, there will be no pull to use data and to develop data use cases.

Sol Rashidi: I would agree with that. The only facet or caveat I would add is, if you are a new company, it will be naturally a data-driven culture because you're just hungry for information and facts, and you're making business decisions based on that.

But if you're a company that's been around 25, 50, 75, 100+ years, you will always have executives that have been there 20 years, 30 years, have retired from that company. They're not changing. The culture is codified and it is set. You can bring in new leaders to infuse a new way of working, infuse a new operating model, infuse a data-first mindset, infuse being data-driven not just data-rich, but unless there's critical mass, you're still always going to be fighting the big machine of the culture that existed before you came. That's the reality of it.

I butt my head up against the wall because I'm a change agent, I'm a disruptor, I like to build. I'm brought into these positions to start helping us pivot and sometimes you do and sometimes you don't.

Depending on how deep and thick that culture is codified with business leaders who are very respectable and have taken the company to someplace and grown it to where it is, if they are tenured, you're not going to change that. Not unless you're willing to be there for 20 to 30 years and you've got new leaders who are also willing to be there 20, 30 years, and we fundamentally talk different, act different, think different, and the company has no choice but to pivot because you have enough new leaders infused into the working stream to drive that change in mentality.

Advice for business leaders

Michael Krigsman: What final thoughts or advice do you have for business leaders that are listening to this who say, "Yeah, I want to drive change using data"? What should they do?

Sol Rashidi: The easy one is just to embrace your D&A team more. Loop them in. It could be weekly status meetings. It could be quarterly reviews. It could be when you're reviewing major strategic initiatives that you have to unlock or activate. Bring a member of your D&A team into the fold because you'd be amazed.

There are a lot of us who are passionate and care about what we do. Again, you don't choose this field. It chooses you. But once you're in it, it means you really are dedicated to it.

Bring them into the fold. Treat them like one of you. That's what's going to help your story. They're not a tangential sidebar at the Thanksgiving dinner table, like the children at the kids' table.

Fold them in early. Let them provide a perspective. It doesn't mean you have to agree, but that will really make a statement for your organization, to your leaders, to their leaders, and you never know what golden nugget you're going to discover that you didn't know before.

Michael Krigsman: Okay. Lots of words of wisdom. Sol Rashidi, thank you so much for being with us today.

Sol Rashidi: Of course. My pleasure. Thank you.

Michael Krigsman: Thank you for watching. We've been speaking with Sol Rashidi, Chief Analytics Officer at Estée Lauder.

Before you go, please subscribe to our YouTube channel and hit the subscribe button at the top of our website so we can send you our great newsletter and let you know about upcoming shows. And tell a friend. Thanks so much, everybody, and especially to the folks who contributed and asked such great questions. We will see you again next time and have a great time. Thanks a lot. Bye-bye, everybody.

Published Date: Jun 04, 2021

Author: Michael Krigsman

Episode ID: 710