Data Governance, Trends, and Challenges 2020

The increasing role of data in business, government, society, and culture has created technology and policy challenges for leaders in business and government. This episode explores the issues and challenges of data in the year 2020.

Duration: 43:49


With the growth of machine learning and AI, data governance strategy and data protection issues have become significant technology and policy challenges for leaders in business and government. This episode explores big data policy, strategy, and trends in the year 2020. Our guests on this episode are Amy Webb and Dr. David Bray.

Amy Webb is a quantitative futurist and professor of strategic foresight at the NYU Stern School of Business and the Founder of the Future Today Institute, a leading foresight and strategy firm. Now in its second decade, the Future Today Institute helps leaders and their organizations prepare for deep uncertainties and complex futures.

Dr. David A. Bray serves as Executive Director for the People-Centered Internet coalition, co-founded by Vint Cerf, focused on providing support and expertise for community-focused projects that measurably improve people's lives using the internet. He has received numerous honors and awards for his work.

Transcript

This transcript has been lightly edited.

Introduction

Amy Webb: Synthetic content is not an avatar. Think of a deep fake intended for benevolent purposes.

Michael Krigsman: In 2020, what does data actually mean? Where is data going?

Amy Webb: I’m Amy Webb. I'm a quantitative futurist, a professor of strategic foresight at NYU Stern School of Business, and the founder of the Future Today Institute.

Michael Krigsman: Our second guest and guest co-host is Dr. David Bray.

Dr. David A. Bray: I've been Executive Director of what's called the People-Centered Internet coalition. I am now with the Atlantic Council on a new project that we'll share more details about in two or three weeks.

Why is Data so Important in 2020?

Michael Krigsman: We're talking about data. Why is this such a crucial and pressing topic for 2020?

Amy Webb: I think it's useful, though, to think about data with some context and historical perspective. We tend to get collectively very excited about things and, usually, that excitement begins with some misunderstanding and, typically, a significant amount of optimism regarding what's possible. That's usually followed by some misdirected fear around what it could all mean. Then, at some point, that hype cycle levels out and people start talking about scaling, business opportunities, and the like.

I think, probably, as we begin 2020, that hype cycle has started to even out a little bit. We're going into a new decade with a number of technologies capable of mining and refining data in real-time that will lead to all kinds of potential opportunities and, of course, risk.

The other piece of this, of course, is generating and collecting all of that information using next-generation collection technologies and network infrastructures. If you're constantly hearing the words "data," "big data," "data lakes"—you know, there are lots of different permutations—this is why.

I would, as we kick off this conversation, note that having a talk about data is a little bit like having a talk about the Internet. It's an umbrella term that means many different things, so this can be a sweeping, wide-ranging conversation.

Michael Krigsman: David, why has data taken on these attributes of the hype cycle that Amy was just describing?

Dr. David A. Bray: Building on what Amy said, I completely agree that we go through waves of euphoria, then misdirected anger and fear, and then, finally, we figure it out. In fact, the research shows it takes between 10 and 25 years to really understand what anything new means. Why are we seeing this now? Well, for the last two decades we have been steadily increasing the number of people connected to the Internet, but also the number of devices connected to the Internet.

2013 was the year when there were the same number of people in the world as there were network devices: 7.1 billion network devices, 7.1 billion people. It doubled two years later. We're now at about 40 to 45 billion network devices on the planet relative to 7.6 billion people.

The amount of data on the planet that's now being produced by these devices is also doubling about every two years. In 2013, we saw about four zettabytes, four billion terabytes, of data on the planet. By 2022, which is only two to two-and-a-half years away, we're going to see 96 to 100 zettabytes of data, 100 billion terabytes, which some say is three times all the conversations we've ever had as a species.
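As a rough sanity check on the growth rate David describes, here is a minimal Python sketch of a doubles-every-two-years projection starting from roughly four zettabytes in 2013. The starting point and doubling interval come from his figures; the smooth exponential interpolation between years is an assumption.

```python
# Minimal sketch: project global data volume assuming it doubles every two
# years from ~4 zettabytes in 2013 (figures as cited in the conversation).
def projected_zettabytes(year, base_year=2013, base_zb=4.0, doubling_years=2.0):
    """Projected global data volume, in zettabytes, for a given year."""
    return base_zb * 2 ** ((year - base_year) / doubling_years)

for year in (2013, 2015, 2017, 2020, 2022):
    print(year, round(projected_zettabytes(year), 1), "ZB")
# 2013 4.0 ZB, 2015 8.0 ZB, 2017 16.0 ZB, 2020 45.3 ZB, 2022 90.5 ZB
```

The projection lands in the same ballpark as the 96 to 100 zettabytes David cites for 2022.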

We are drowning in data and the question is, what can we do with this data for good? What can we do with this data for not so good? How do we even make sense of it? Are there going to be people left behind if they're not literate in how they use this data?

We really have these three doors. Do we have a world in which these sensors, technologies, and data result in surveillance capitalism; one in which they result in surveillance states; or do we find some way toward data dignity where we still have choice in this new world going forward?

Amy Webb: There are some real-world challenges that I just don't see being addressed in meaningful ways, and data governance is one of them. David is right, and he just gave a terrific background on some of the challenges that are upcoming. I do not see strategic conversations happening about data governance in the right places—in the private sector, in the public sector—at the highest levels.

I totally get that data governance may not sound like the sexiest topic to [laughter] center a meeting around, but it's important. At some point, the people who make critical decisions about our companies, our infrastructure, and all of the services, devices, and tools that we use are going to have to have conversations around things like data hygiene and data storage. The numbers that David just mentioned are, to some extent, incomprehensible. It's hard to wrap your head around them.

We also have some storage issues. Whether you're a government person, local or national, or you're a business, you're going to have to start making decisions around how those data are stored, because that involves personal information in some cases and proprietary information, but there's also a cost associated with it. How are you going to continue to maintain those data going forward as our systems and devices change and we move more and more into ambient computing? Who gets access?

As the geopolitical landscape changes, who has sovereign domain over those data? Who gets to decide what is done with those data?

If you want to get even more complicated, what about the packet transfer? This is the crazy thing. We think about airspace overhead as planes are flying across and around countries. Well, what about the transfer of data? Are there domain issues as a packet moves?

We're moving into, in some ways, a paradoxically much more exciting, easy living situation where so much will be automated. On the other side of that are many complex issues that I don't see being addressed head-on or, worse, that I see being shoved to the outer fringes of an organization where maybe just the IT department deals with them.

This is nothing against IT departments. I like IT people a lot. But they alone shouldn't be making these kinds of governance decisions for an entire organization.

Challenges of Data Literacy for Senior Executives

Michael Krigsman: Amy just mentioned IT people. As she was talking, I thought to myself, "Oh, great. She's just turned all of us into IT people and, at the same time, she's gone to the other extreme and turned us into metaphysical analysts because of the range of issues." How do we even grapple with that?

Dr. David A. Bray: This is a revolution that makes what happened with the printing press and the availability of books to everyone pale in comparison. This is like that times ten in an ultra-compressed period of time. You're absolutely right.

The good news is, IT is becoming more and more strategic. It used to be a function that reported only to chief financial officers and then, later, chief operating officers. Now, if you are not a data-literate CEO, odds are your company may not still be in existence five or six years from now. These actually have to be board-level conversations, CEO-level conversations.

You're also right, Michael. These are issues that need to be discussed, and they impact local cities, states, and nation-states. The trouble is, I'm not sure most of our politicians are even digitally literate enough to have these conversations. It's all happening at the same time and it's causing massive disruption.

Amy Webb: My favorite/least favorite story of 2019 took place in the city I'm currently talking to you from; I have a house in Baltimore and also in New York. The City of Baltimore was locked out of its own systems for months because hackers had exploited a vulnerability. There was no forward thought and there were no resources devoted to preventing it.

Baltimore is not a tiny town. It's a huge city, one of America's largest, and not for a few days, not for a few weeks, but for a few months, citizens could not pay their parking tickets. They couldn't pay their water bills. I know that doesn't sound like the end of the universe, except that the water bill that couldn't get paid doubled. Not everybody is great at managing their finances, so a whole bunch of people got hit with a very large bill that they weren't able to pay right away.

It's more than a minor inconvenience. For a lot of American cities, cash flow is an issue. Between the two of us, David and I could probably rattle off a hundred examples over the past year of problems that would have been totally preventable if somebody had really been thinking about the future of data.

The last thing. I was, over the summer, at MIT at a symposium for people who work in executive leadership positions within data, so chief data officers. It was a little more academic but there were plenty of people from the corporate world and from government as well.

It just struck me that A) there are not enough people with enough leadership training who also have backgrounds in data science and that B) a lot of the data capabilities are still housed within areas like marketing. Data is not just predictive analytics. We're talking about core functions of businesses and organizations. We are beyond, I think, the point at which we can have a casual conversation. People need to start thinking much more holistically and specifically.

Dr. David A. Bray: I agree. I don't think any of us took any courses in middle school or elementary school about data, but I think we need to start doing that because being data literate and understanding data is going to become almost as important as being able to write and do math. This also means we've got to play catch-up for those who have not had that training.

If this is relegated just to the data scientists, data officers, and IT people, you run into the challenge Amy indicated with Baltimore. In Baltimore, the IT department probably asked for more money and was probably turned down because no one saw the value of doing it.

Amy Webb: Right.

Data Issues Across National and International Jurisdictions and Borders

Dr. David A. Bray: It looks like it's the IT department's fault but, in fact, if nobody funds it…

I think the question is, how do we play catchup in this period of massive change? Meanwhile, you've got Europe with General Data Protection Regulation, with that approach. You've got China, which is now drafting possible laws that'll say it'll be illegal to have any data in the country that is not available to that government, even if you're a foreign national or foreign company. I'm actually waiting for the train wreck between GDPR and China to happen and see how that plays itself out.

Amy Webb: And California. We even have a state-by-state system in the United States that's not consistent, which means that, again, if you as a decision-maker are not really paying attention to this, the cost of compliance could go through the roof for your organization.

Dr. David A. Bray: Then on top of it, you can just imagine a little popup that says, "Are you a Californian? Are you from Europe?" Of course, if you have little popups whenever anyone visits a website, that's also an easy opening for a phishing attack, and it can be really bad.

Amy Webb: Right.

Dr. David A. Bray: Yeah, it's going to be messy but it has to be solved because this is now commerce. In some respects, I would even venture to say this interesting proposition, which is, "Data is money and money is data." The question is, of course, who is making that money? Is it only in the hands of a few? Is it something where we can benefit?

You see Europe and you see the U.K. now talking about setting up what they call data trusts because, the reality is, my data by itself is not that valuable. But if I'm willing to put it into a data trust that keeps it fairly anonymized but lets it be used for health research or things like that, as part of a larger aggregate with another million people, that could actually produce revenue, almost like an equity that pays me on an annual basis for the ROI brought in by the data I contributed.
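As a minimal illustration of the data-trust model David sketches, here is a hypothetical Python example: members contribute records, the trust strips direct identifiers before the data is used in aggregate, and any revenue is paid back pro-rata. The class names, the identifier fields, and the flat per-record payout rule are illustrative assumptions, not a description of any actual trust.

```python
# Hypothetical sketch of a data trust: contributions go in, a de-identified
# aggregate view goes out for research, and revenue is split pro-rata.
from dataclasses import dataclass

@dataclass
class Contribution:
    member_id: str          # known only inside the trust
    records: list[dict]     # raw records, e.g. health readings

def deidentify(records: list[dict]) -> list[dict]:
    """Drop direct identifiers before records leave the trust."""
    return [{k: v for k, v in r.items() if k not in {"name", "email"}} for r in records]

def payouts(contributions: list[Contribution], annual_revenue: float) -> dict[str, float]:
    """Split revenue pro-rata by number of records contributed (illustrative rule)."""
    total = sum(len(c.records) for c in contributions)
    return {c.member_id: annual_revenue * len(c.records) / total for c in contributions}

members = [
    Contribution("m-001", [{"name": "A", "email": "a@x", "heart_rate": 61}]),
    Contribution("m-002", [{"name": "B", "email": "b@x", "heart_rate": 72},
                           {"name": "B", "email": "b@x", "heart_rate": 70}]),
]
research_view = [r for c in members for r in deidentify(c.records)]
print(research_view)            # identifiers removed from the shared aggregate
print(payouts(members, 300.0))  # {'m-001': 100.0, 'm-002': 200.0}
```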

Amy Webb: Mm-kay.

Dr. David A. Bray: I think this is where recognizing the role of banks, the role of public institutions, and the role of private institutions comes in. This will change the world in the next decade.

Data Trends for 2020

Michael Krigsman: This is a show about data trends in 2020, so let's start identifying some of the key trends. What's going to happen? Amy, do you want to jump in first?

Amy Webb: One of the somewhat surprising key findings for this coming year has to do with synthetic data. I say surprising because we've been covering things like synthetic content and synthetic data for a while, but we've just started to see some inflections and some crossovers where I think "synthetic" is going to be something you hear a lot next to "data" going forward. That ranges from things like synthetic genomic data to synthetic content.

Synthetic content is not an avatar. Think of a deep fake intended for benevolent purposes, so characters that act out a generative storyline, like a soap opera timeline, something like that. Then there's synthetic media in general, which would be algorithmically generated music compositions or art. Then synthetic data in general.

There are some reasons why we're starting to see this, especially within the health sciences and medical space. We have a lot of restrictive regulations in the United States, but things like AI show quite a bit of promise.

Because there isn't a system yet to enable everyday people to knowingly, and with all of the information needed to make a good decision, give others access to de-identified data, big tech companies are having to generate synthetic data sets so that they can test and run their models. There is good and bad with that. Anyhow, we're starting to see a lot in the synthetic space.
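Here is a minimal, hypothetical Python sketch of the kind of synthetic-data generation Amy is describing: fit simple per-column statistics on a small, pretend "real" table and sample new rows from them, so models can be exercised without touching real people's records. The toy columns and the independent-Gaussian assumption are illustrative; real pipelines also try to preserve correlations between columns and to bound re-identification risk.

```python
# Minimal sketch: generate a synthetic table whose columns roughly match the
# statistics of a small (pretend) real sample, for model development/testing.
import random
import statistics

real = [  # illustrative, already de-identified readings
    {"age": 34, "systolic_bp": 118},
    {"age": 51, "systolic_bp": 131},
    {"age": 63, "systolic_bp": 142},
    {"age": 45, "systolic_bp": 125},
]

def synthesize(rows, n, seed=0):
    """Sample n synthetic rows column-by-column from fitted normal distributions."""
    rng = random.Random(seed)
    cols = {k: [r[k] for r in rows] for k in rows[0]}
    fits = {k: (statistics.mean(v), statistics.stdev(v)) for k, v in cols.items()}
    return [{k: round(rng.gauss(mu, sigma), 1) for k, (mu, sigma) in fits.items()}
            for _ in range(n)]

synthetic = synthesize(real, n=100)
print(synthetic[:3])  # plausible-looking rows, none tied to a real person
```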

We're also seeing a lot of diagnostic capabilities, whether that's devices in your home determining when firmware upgrades are needed and making some of those decisions using the data around them, or the enormous number of home diagnostics that are going to wind up in our bathrooms and kitchens at some point, notably because there is so much data ripe for collecting and because it's those data that are necessary to make decisions.

Scoring is another huge finding and a big area of our research this year. Everybody is being scored. It's not all nefarious, but it's definitely not all transparent either. Those are some big areas that then lead to things like traceability and transparency going forward. There's a lot of confusion; certainly, a lot of conversation around what can and should be done.

Then, of course, there's this sort of enormous rush to try to capitalize on doing something with all that data, whether it's mining it to make predictions or protecting other people. There's this crazy company called DID. It's Israeli. David knows exactly what I'm talking about. They do a bunch of things, but one of the prototypes that exists is a way to create a deep fake of a famous person that looks slightly different.

David is a famous person. I could deep fake him but add a mustache. David doesn't currently have a mustache, right? So you can think of this as somebody taking a really popular song and tweaking the melody just a tiny bit to evade copyright law. That's the other kind of crazy thing we're seeing emerge. It's a very different world that we are heading straight into.

Dr. David A. Bray: In some respects, synthetic datasets may provide a way that we can still remain individually anonymous while our data still has value, if we can make sure the synthetic data is representative enough and diverse. But we should also recognize that synthetic data can be misused, whether for parody or for something more malevolent, the equivalent of deep fakes for data. I think that's something to be aware of.

In terms of three things I would add to what Amy talked about: first, I think there need to be conversations about the future of data and the future of work at the same time because, in the past, if you bought an ERP system or a manufacturing system, as that IT system got older, you depreciated it because it was older and out of date. Data is actually the reverse. You don't want to buy the AI that has had no training whatsoever. You want to buy the one that has had five or ten years' worth of data fed to it, almost like fine wine, because it'll actually be better at its job the more it's been trained, as opposed to new.

That blows up accounting models. That blows up how you work with it. How do you compensate workers if what the worker is doing is training the machine but, in the end, it actually means they no longer have to do the task they're doing because they've now trained the machine? Future of work and future of data, I think that's key for the next one to two years trying to sort that out.

Second is this idea of what's called three-factor authentication with data. It's not just that we know it's you and that it's data about you. It's actually a question of, have you given your consent for that data to be used?

It's something that you can do on a continuous basis. If you turn off the app on your phone, just because they know you're there doesn't mean you've given consent, and so that data, in some respects, can be cryptographically made untraceable back to you. It's the idea that maybe we can seize back some locus of choice and control in this era by saying it's not just about whether it's me, but whether I've actually said you can know it's me, and having that be a continuous on/off switch or rheostat: you can know I'm over 21, but you can't know my specific age, something like that.
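A minimal sketch of that consent idea, with hypothetical names: the raw attribute stays in a personal vault, outside parties can only ask yes/no predicates the person has switched on, and consent can be revoked at any time. A real system would back this with cryptographic attestations rather than a trusted Python object.

```python
# Minimal sketch: attribute-level, revocable consent. Callers can ask only the
# predicates the person has granted, and never see the underlying value.
from __future__ import annotations
from datetime import date

class ConsentVault:
    def __init__(self, birth_date: date):
        self._birth_date = birth_date          # raw attribute never leaves the vault
        self._allowed: set[str] = set()        # predicates the person has switched on

    def grant(self, predicate: str) -> None:
        self._allowed.add(predicate)

    def revoke(self, predicate: str) -> None:  # consent is continuous, not one-time
        self._allowed.discard(predicate)

    def query(self, predicate: str) -> bool | None:
        """Answer an allowed yes/no question; return None if consent is absent."""
        if predicate not in self._allowed:
            return None
        if predicate == "over_21":
            age = (date.today() - self._birth_date).days // 365  # approximate age
            return age >= 21
        raise ValueError(f"unknown predicate: {predicate}")

vault = ConsentVault(birth_date=date(1990, 6, 1))
print(vault.query("over_21"))   # None: no consent given yet
vault.grant("over_21")
print(vault.query("over_21"))   # True, without disclosing the actual age
vault.revoke("over_21")
print(vault.query("over_21"))   # None again after consent is withdrawn
```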

Then, finally, I did a paper in October with MIT Sloan about the need for what I would call data ombudsman, or ombuds, functions: who in the organization is responsible for making sure that data is useful, appropriate for the question we're asking, and diverse? Then, when conclusions or decisions are made from that data, who makes sure those decisions are ones the organization is comfortable with, that they're ethical, and that they're things we should do?

You've seen a lot of AI ethics conversations. I think we actually need to talk about data first before we rush to AI.

Amy Webb: Mm-hmm. That's right. Nothing drove that point home as much over the past couple of weeks as watching everybody share their statistics on social media. Think about all of the different connected services that you use, whether it's Google Maps sending you a map of all of the locations and the number of cities you've been to, or Spotify telling you what you listened to over the past decade.

I don't think the average person is all that into the quantified-self movement. I think, however, we are all susceptible to nostalgia and to ego. There's a piece of this that feeds our ego.

What really concerns me, going forward, is that we are exchanging cool graphics and maps for the more challenging questions like, what does it mean that Google was able to send me a neatly tailored data visualization of literally every place I've been? What are the broader downstream implications of that?

To David's excellent, excellent point, we need to have more public conversations. A data ombudsman in an organization is great, especially if that person is also having conversations with everyday people to help them understand what's at stake.

How to Develop a Culture that Respects Data Privacy?

Michael Krigsman: We have a few questions from Twitter. Sal Rasa asks, "How do you develop a culture of becoming aware and respecting these data issues you were just describing?" It seems like a really important issue.

Amy Webb: We primarily work with the executive leadership of Fortune 100s, very large organizations, and everybody is grappling with the same question. Again, this is going to come out sounding like I am not in love with the IT folks within organizations. It's not true. I love the IT people. I married an IT person.

Part of the problem is that, for too long, data was conflated with the hardware that helped to power organizations. They really are different functions. Then the other side of this coin was that all of the data science was sort of relegated to marketing. When you silo some of your core business functions, you are not able to create a culture of transformation where some of these more pressing, modern issues are at the fore.

Where I've seen success with organizations is when a chief data officer position is created and that person and that team sit almost as connective tissue between all of the other departments and teams, because you're going to need those people to deal with supply chain and logistics or compliance. That's the other thing: sometimes the data function is housed within the compliance and risk function inside the legal part of the organization. An easy way to spark transformation (maybe not so easy if not everybody is in alignment) is to create a situation where those experts in data are at the center, almost like a hub, working together with all of the other parts of the organization, because it should be much more of a seamless collaboration.

How Should Boards of Directors Relate to Data and Data Privacy?

Michael Krigsman: David, on a related topic, Arsalan Khan asks, "How can CEOs and boards even manage all of these issues because they're so complex, so detailed, and there's just so much of it that's coming at them with regards to data?" What should business executives do?

Dr. David A. Bray: I think it's recognizing that you wouldn't have anybody on your board who didn't know what a P&L, a profit and loss statement, was or anything like that. To get on the board, you need to have that expertise.

I'm not saying that everyone on the board should have the ability to go deep on data issues, but if you don't have anybody on the board who can go deep—and not just one person; have a handful—then don't be surprised if your board isn't able to ask the right questions of the organization and of the CEO, and then the CEO themselves may not be able to ask the right questions either. I think part of it is recognizing that you're going to need people who aren't necessarily in the data weeds but are able to understand how data can be transformative for your company.

The other thing, to build on the previous question and on what Amy said: I think there's a lack of data empathy. I think this is the big challenge. How do you step back and say, "My target audience might be 20-somethings, but maybe I need to think about those who don't necessarily have the knowledge, when they bring in this device or something like that, to understand what the choices are and what's being done with their data?"

One of the things that we've done with the People-Centered Internet—not on data but on the Internet, though the same model could be applied—is work with Native American tribes on getting connectivity. We drew from public health because I actually have a master's in public health.

I'm a big fan of the idea that the best way to bring about change is not to rush in and say, "I have the answers." It's actually listening and asking, "What do you want? How do you want this done?"

That's where I think companies may need to think about it, in whatever function it appropriately falls under. Maybe it's the chief data officer. Maybe it's marketing. Maybe it's outreach. But go to your stakeholders, your customers, and say, "We'd like to empower you with choice." It doesn't have to be a billion choices, but it's a menu of choices. Then, from there, instead of a one-size-fits-all approach to data, you can allow people different options.

I think the same thing needs to be done by local and national governments. Imagine if you actually had the choice: either A) have more data going in regularly for filing your taxes, or B) just do it once a quarter or once a year. If you file more regularly, then instead of having to file taxes at the end of the year, we'll send you what we think you owe or we owe you, and you can correct it if it's wrong. If you don't want that, you can do it the other, more conventional way. It's giving choice architectures to people. I think that's going to have to happen, and it's only going to happen when you have, all the way from the technical level to the strategy level to the board level, people who can speak the language of data.

Michael Krigsman: Amy, we have another question from Twitter. Zachary Jeans asks, "Do you see a day when consumers will have a right to read all communications that companies and governments have about them and their data?"

Amy Webb: Technically, you are supposed to be able to do some of that now. There are a handful of companies that act as intermediaries that effectively score you, score consumers. Much like you have a FICO score or much like there are scores regarding your financial health, there are also scores that third parties can access to help determine algorithmically what you are likely to pay, for example, for a roll of toilet paper. There's some dynamic decision-making that happens in an automatic way as a result.

Now, theoretically, you can contact these companies and request your files the same way that you could contact the FBI and ask them for whatever files there are about you. But it's important to note that the entire infrastructure of our digital era is not static. At any given moment of any day, the numbers could be fluctuating and go up and down.

I actually think it's worthwhile to go back to something David just said a couple of minutes ago. That has to do with choice. I think the problem is that we have given up on uncertainty. As a futurist, I will be the first person to tell you that I cannot predict the future. It is not mathematically possible. The math doesn't work out.

Uncertainty is okay and my job is about reducing uncertainty. I think when it comes to a lot of these questions around data, my observation is that we try to impose our existing frameworks of thought when the very nature of the ecosystems we're talking about is dynamic, in flux, and is going to continue to develop over a very long time. We have to shift our thinking a little bit.

I don't think we can regulate companies or tell companies they must follow these ten rules for transparency or whatever. I think we have to create a situation in which we are incentivizing them to lean into uncertainty, to help us lean into uncertainty, and to offer more transparency. We haven't even gotten into the interoperability question yet. Yes, there's a ton of data, but who actually owns it and how does it move around? Putting people more at the center and in focus, I think there's plenty of money to be made if we figure out ways to incentivize people rather than just monetizing and productizing the data.

Dr. David A. Bray: I agree 100%. In fact, on that note, as you both know, I try to be a good nonpartisan, which means I have no top cover because I don't pick a party. Here in the D.C. area, I've actually tried to bring together people from both sides of the aisle to try to figure it out.

We know there's polarization happening, not just in the United States, but also in Canada and Europe. An interesting thing that I've heard from people on both sides of the aisle who have actually worked on political campaigns—I've never done one—is that it's a lot easier right now to build data models that get a political candidate to win by galvanizing and polarizing their base.

Amy Webb: Yeah.

Dr. David A. Bray: It's a lot harder to build data models about how you win through the middle. Here we are, surprised that the world seems to be getting more and more polarized in Europe and the United States, and this might actually just be because, as Amy said, we've not understood the data and the uncertainty, and we're surprised as our societies get more and more fragmented and splintered.

Amy Webb: Right.

Dr. David A. Bray: I think the same thing is true with commercial companies. It's got to be that we find a way to motivate both them and customers to pursue transparency and their own self-interests as opposed to something that we try to force because any regime that we try and force will be out of date by the time we pass a law, and they'll find ways around it by the time the law is in effect.

Amy Webb: That's right. You actually just made me think of something I hadn't thought of before, which is, again, from an architecture point of view. My academic background is in economics and game theory, which I should just mention really quickly.

If I were trying to build a model to attract the greatest number of people to interact with something I've published, systems like Facebook are, I think, still pretty clunky. It makes sense to me that the easiest way to reach people is by going to whatever the extreme is because the extremes tend to be binary.

Trying to approach the gradients, which I really do think much more represents who we really all are from a technical standpoint, is a heck of a lot more challenging, which now makes me think maybe we are, at least in the U.S., not quite as divided as we all feel that we are.

You get bad data in. Bad data in and bad data out, right? Maybe this is sort of like a bad data out problem. I think we are divided, but maybe we're not quite as divided as we're being told that we are.

Dr. David A. Bray: I agree. I think it sells.

Amy Webb: Yeah.

Dr. David A. Bray: We know the number one way to make something go viral on the Internet is to make it angry. The number two way is to make it fearful. It's not to make everybody angry. It's to make one group angry and the other group angry in response.

Amy Webb: Right. Right.

Dr. David A. Bray: It bounces back and forth. This is why data matters and why people say it's not just the numbers.

I think the other thing is that we may be trying to apply 20th-century geographical boundaries, whether it's states or nations. As you said, where is the packet of information? Where is the data?

If you really wanted to blow up the data regime or regulatory framework, I actually had this idea about five years ago. If anyone wants to do it, you're welcome to. What if you actually had your data stored in encrypted form, as ciphertext, but you had a system where the key to unlock the data changed the country it was in every ten seconds?

Amy Webb: Can you describe that more?

Dr. David A. Bray: Yeah, so the data is encrypted.

Amy Webb: Right. Right.

Dr. David A. Bray: But the key to unlocking the data is moved from country to country in electronic form every ten seconds.

Amy Webb: Oh, like a TorGuard, like a proxy for your data.

Dr. David A. Bray: Kind of, yeah.

Amy Webb: That's kind of crazy.

Dr. David A. Bray: If someone has to try to get a warrant to get the key to your data, they're never going to know which country the key is in, because it changes every ten seconds.
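Purely as a sketch of the thought experiment David describes: the ciphertext stays put while custody of the key hops jurisdictions on a timer. The country list, the interval, and the deterministic schedule are assumptions; real encryption and key transfer are abstracted away, and a real scheme would presumably randomize the hops.

```python
# Illustrative sketch: data stays in one place as ciphertext, while the
# decryption key is hosted in a different jurisdiction every ten seconds.
import time
import secrets

JURISDICTIONS = ["CH", "IS", "NZ", "SG", "EE"]   # hypothetical key-hosting countries
ROTATION_SECONDS = 10

decryption_key = secrets.token_bytes(32)          # stand-in for a real key
ciphertext = b"...encrypted records stay put..."  # stand-in for the stored data

def key_location(now: float, start: float) -> str:
    """Which jurisdiction holds the key under this (deliberately simple) schedule."""
    hops = int(now - start) // ROTATION_SECONDS
    return JURISDICTIONS[hops % len(JURISDICTIONS)]

start = time.time()
for offset in (0, 10, 20, 35):
    print(f"t+{offset:>2}s: key held in {key_location(start + offset, start)}")
# t+ 0s: CH, t+10s: IS, t+20s: NZ, t+35s: SG
```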

Amy Webb: Right. That's an amazing idea and a horrible idea. [Laughter]

Dr. David A. Bray: I know. Like all things—

Amy Webb: Right.

Dr. David A. Bray: –technology will empower wonderful things and awful things at the same time.

Amy Webb: Right. Right. Right.

Dr. David A. Bray: I think that's why it's so important that we need to have data literacy within the public and within our leaders, whether they be private sector or public sector. If we don't have conversations about data now—

Amy Webb: Yeah.

Dr. David A. Bray: –we will find autocracies. We will be surprised, but we will find, five to ten years from now, that it's being used by autocratic regimes that are oppressive.

Michael Krigsman: Why should businesspeople care about these issues? Their job is to make money. You're saying, "Well, they should be focused on these broader societal issues and broad influence campaigns." Why?

Dr. David A. Bray: If your job is making money, you need to have marketplaces to sell to. If you're not careful and if you don't think about these issues now, you may not have open marketplaces in five years.

Michael Krigsman: Sorry, David. That's not my problem.

Dr. David A. Bray: It is your board's problem if they're thinking beyond five years. Your board should be asking these questions of you.

The other thing is also that if you don't figure out ways to thrive in this new regime, your competitors will, and so someone else will eat your lunch. It's not just the desire to have open markets. You will be disrupted by someone else if you don't figure out how to be a disrupter yourself.

Big Data in Healthcare

Amy Webb: I can offer just a very pragmatic reason why this matters. I could do this industry-by-industry. We don't have all day, so I will just start with health and medicine. The way that our current laws are written in the United States, there's something called HIPAA. That protects the privacy of your individual health data and, as a result, there are many, many hoops that doctors and healthcare providers have to jump through. There are forms. There are a lot of technical requirements that they must observe and obey.

The interpretation and application of HIPAA start to get really confusing and murky when we're talking about big tech companies that are wading into the areas of health and medicine but not technically performing health or medical tasks, right? I'm thinking about Apple Watches and the data, the personal data that's being delivered. I'm thinking about Amazon and Amazon's connected devices and the various components that go along with it. I'm thinking about Google's devices. That's big tech.

There's a whole other area where there are startups in the sort of popup genomic space. In the U.K., in one of the grocery stores there, recently there was a popup where you could get your DNA tested and it would spit out nutritionally what's going to be best for you to buy.

If you are a doctor or you're in the medical community, it's going to be hard for you to come up with a lot of those same business applications. If I were anywhere in the pharmaceutical, hospital, or healthcare-provider space, which is a sizable chunk of America's economy, I would be scared out of my mind right now if I didn't have a plan and some kind of good point of view—and by good, I mean comprehensive—on what to do with all of the data that my organization has access to. Better yet, on how to mitigate existential risk from those third parties that also have access, perhaps to even more data than we do.

I could go through, industry-by-industry, and give you specific stories that relate to data and adjacently related companies or tech companies that stand to totally destroy the way that that industry operates over the next ten years.

Dr. David A. Bray: Right. Yes.

Amy Webb: Yes, we should all care, and I do, about the greater societal implications. But, from a P&L, like a dollars and cents point of view, if you are not very closely watching what's happening, I would hedge [laughter] and say that you're probably going to find yourself in trouble.

Dr. David A. Bray: In the health space alone, there are estimates that in the United States, by itself, health data and that exchange is a $60 billion industry. You can bet that there are people that want to come in, compete, and disrupt that market. The question is, do you want to be the next Blockbuster or Kodak, or do you want to be something else?

Advice on Big Data to Business Leaders in 2020

Michael Krigsman: Let me ask each of you to share your advice to business leaders in 2020 when it comes to data.

Amy Webb: David made a really great point, which is that, if you're in the C-suite, you don't have to wake up tomorrow and also be a data scientist. I don't think there's any expectation of that. But you have to have enough fluency in the lexicon to have informed conversations. As the leader of your organization, you need to lead your data strategy as well and not simply relegate it to the few people in the organization who have some understanding of what data is, how to use it, and everything else.

Here's a good question to take back with you and just mull over for the next couple of days. Stop for a moment and think about the number of people that would be qualified to do something with the data that your organization has. See if you can quantify them. How many are there? Then, look at the industries and the companies that stand to disrupt you and ask the inverse. How many people on their staff do they have that could fill the functions of whatever it is that you're doing?

Here's a specific example. If you were, let's say, a central bank and you went through and tried to count the number of people who have degrees—Ph.D.s, let's say, in data science, AI, computer science, those kinds of fields—stop and try to figure that out. Then consider this. Amazon, at last count, at least publicly, has more than 150 Ph.D. economists, right? That should tell you something [laughter] about your future.

You have to be smarter about this going forward. Plain and simple.

Dr. David A. Bray: Real quick, I'll give three points for any CEO or C-suite leader. First, it is worth polling your direct reports and core staff, "Why are we really here? What's our business model?" because I don't think companies revisit that enough. If you haven't revisited your business model in the last five years, let alone ten years, it may be that you need to revisit why you're here and how your business model is changing as a result of data.

That gets to the second point: ask as many people as you can, how is our data strategy moving us forward faster and providing a competitive edge? There are a lot of places that actually don't have a data strategy, or where what they're doing with data is holding them back when, in fact, it should be moving them forward.

Then, last, the third one, and I'm going to have to say this, Michael, just because it needs to happen: apply the data golden rule, which is, do unto others as you would have them do unto you. If you were not at that company, if you were on the outside as a customer or a member of the public, would you want what that company is doing to be done unto you? Those are the three points.

Data and Privacy Advice to the Public in 2020

Michael Krigsman: Great. Let me ask you each then one final question, which is, what advice do you have for the public and for communities with respect to data in 2020? Amy, do you want to start with that?

Amy Webb: We are at a point where everybody needs to have a certain amount of digital street smarts. This is something that you are better off developing now before a lot of our ecosystems start to become extraordinarily complicated, more complicated than they are now. There are plenty of resources online to help you develop some of those street smarts and skills, but the easiest single thing that you can do all the time is to ask, "And then what?"

If somebody is sharing some funny app where you can see yourself age, stop and ask yourself, "And then what?" Right? Who has access to this data? How do I know what's going to happen to it next? Who stands to gain something besides me?

Do that all the time. Do that every time somebody asks you to smile for a photo, enter your thumbprint, or hand over your personal details. Apply the same scrutiny to all of the cool new consumer gadgets and gear that you would if a stranger asked you for your Social Security number. Right? If you stop and do that over and over again, you will start to develop some digital street smarts that will certainly help you in 2020 and beyond.

Dr. David A. Bray: You're absolutely right. You need to develop data street smarts. Be data curious. Find out what you can online through different videos and, of course, Michael, I've got to say, subscribe to CXOTalk because it can help inform you as to what you need to know about data as well.

Michael Krigsman: We've been talking with a lot of intellectual horsepower today with Amy Webb and David Bray. Thank you both. Amy, thank you for taking time to be with us today again.

Amy Webb: Thank you, Michael. It was great to be here.

Michael Krigsman: David Bray, thank you as well for taking your time to be here today.

Dr. David A. Bray: Thank you, Michael, and thank you, Amy. It was a great conversation.

Michael Krigsman: Everybody, be sure to subscribe on YouTube and hit the subscribe button at the top of our website. You can get our newsletter and we'll send you great information. We have fantastic shows coming up, so check out CXOTalk.com. Thanks so much for watching and have a great day.

Published Date: Jan 10, 2020

Author: Michael Krigsman

Episode ID: 641