Digital Privacy with Michelle Dennedy, Chief Privacy Officer, Cisco

Privacy is one of the most important, yet complicated, topics in our digital lives. On this episode, we learn about privacy engineering from one of the top experts in the world.

46:02

Jun 24, 2016

Michelle Finneran Dennedy currently serves as VP and Chief Privacy Officer at Cisco. She is responsible for the development and implementation of the organization's data privacy policies and practices, working across business groups to drive data privacy excellence across the security continuum. She is co-author of The Privacy Engineer's Manifesto.

Before joining Cisco, Michelle founded The iDennedy Project, a public service organization addressing the privacy needs of sensitive populations, such as children and the elderly, and of emerging technology paradigms. Michelle is also a founder and editor-in-chief of a new media site, TheIdentityProject.com, which was started as an advocacy and education site and is currently focused on the growing crime of child ID theft.

Michelle was previously Vice President for Security & Privacy Solutions at Oracle Corporation, where her team worked closely with customers to give them confidence that their information was protected and could be leveraged as an asset. Before the Oracle acquisition of Sun, Michelle was Chief Data Governance Officer within the Cloud Computing division at Sun Microsystems, Inc. She worked closely with Sun's business, technical, and legal teams to create the best possible data governance policies and processes for cloud computing, building trust in cloud environments through vendor transparency. Michelle also served as Sun's Chief Privacy Officer.

Michelle is a sought-after and provocative public speaker, evangelizing new approaches and business justifications for soundly-defined, transparent security and privacy policies and systems that protect healthy, safe global businesses.

Michelle has a JD from Fordham University School of Law and a Bachelor of Science degree with university honors from The Ohio State University. In 2009, she was awarded the Goodwin Procter-IAPP Vanguard award for lifetime achievement and the EWF-CSO Magazine Woman of Influence award for work in the privacy and security fields. In 2012, she was recognized by the National Diversity Council as one of California's Most Powerful & Influential Women. In 2014, she was cited as an AlwaysOn Power Player in On-Demand Computing and was honored with the Gold Stevie Award as Woman of the Year in Technology.

Transcript

Michael Krigsman:

Welcome to episode number 179 of CXOTalk. I'm Michael Krigsman, and what an interesting show we have today. We're talking about privacy and privacy engineering with one of the preeminent people on the planet, one of the greatest experts there is on this topic. And that is Michelle Dennedy, who is the Chief Privacy Officer at Cisco and also author of one of the definitive books on the subject. Michelle, how are you?

Michelle Dennedy:

I'm doing well. Thanks so much for having me on your show, Michael.

Michael Krigsman:

Well, it's great to see you again. We were on a panel together in Stockholm, what, three or four weeks ago? We had a lot of fun and I certainly learned a lot, so I'm excited to hear what you're doing. Tell us about Cisco, tell us about privacy, and give us a little bit of background about yourself.

Michelle Dennedy:

Yeah, and thank you again for having me; I really appreciate this opportunity. Privacy is a relatively new field, and I was fortunate enough to be one of the first named Chief Privacy Officers. I come from an intellectual property litigation background, but there are a lot of different paths people take into this profession: through the IT field proper, through IT security, through marketing, through HR, anything that has to do with curating assets about human beings. So that's how I happened to come to it, through the legal path. At Cisco, I'm actually in a discrete business unit called the Security and Trust Office.

So we bring together security, privacy, quality, privacy engineering, and security engineering, as well as basic research in these specific fields. We build out those tools for our customers, act as advisors to those folks, and work with public policy and legal in that capacity. And we also work on our own future and current systems to make them really centric around data, data assets, and the human beings who use them, so those human beings can leverage them as assets in their businesses and in their lives.

Michael Krigsman:

Michelle, privacy on the surface may seem like a simple thing: you know, we use Facebook, we need to set the settings. But it's actually quite a bit more complex than that, and you talk about privacy engineering, which I think indicates some of the layers of complexity that exist. So maybe you can tell us: what is privacy engineering, and why is this whole field so fraught with challenge, difficulty, and complexity?

Michelle Dennedy:

Yeah, it's a great question, and I'm so thrilled that you asked it. I think privacy engineering is often confused with, and kind of equated with, privacy by design. Privacy by design is the public policy idea that entities running systems that hold information about people are tasked with thinking about privacy before they build and deploy: having settings that are easy for customers to use, et cetera. Great aspirational public policy. To build toward those aspirations, we need what we call privacy engineering.

And privacy engineering, I look at not just as a technical field; it really brings together people, process, and technology. But it leverages basic infrastructure and traditional engineering concepts, such as business activity diagramming, process flow diagramming, and user interface design, and it even integrates new technologies like artificial intelligence, using analytics to build systems that actually respect identities in their full form and support the legitimate use of information about people.

So, in a sentence, the way I define privacy in functional terms is that it is the authorized processing of personally identifiable information according to fair, legal, moral, and ethical principles.
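To make that functional definition concrete, here is a minimal illustration of my own (not anything Dennedy describes building): read "authorized processing" as a precondition, so every operation on personally identifiable information must first be matched to a purpose the data subject has authorized. All names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PIIRecord:
    subject_id: str
    data: dict
    # Purposes the data subject has actually authorized, e.g. {"billing"}.
    authorized_purposes: set = field(default_factory=set)

class UnauthorizedProcessing(Exception):
    """Raised when processing is attempted without an authorized purpose."""

def process(record: PIIRecord, purpose: str, operation):
    """Run `operation` on the record's data only for an authorized purpose."""
    if purpose not in record.authorized_purposes:
        # Authorization is a precondition of any processing at all.
        raise UnauthorizedProcessing(
            f"purpose {purpose!r} not authorized by subject {record.subject_id}"
        )
    return operation(record.data)

# Billing is authorized; ad targeting is not.
rec = PIIRecord("user-42", {"email": "a@example.com"}, {"billing"})
print(process(rec, "billing", lambda d: f"invoice sent to {d['email']}"))
# process(rec, "ad-targeting", print)  # would raise UnauthorizedProcessing
```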

Michael Krigsman:

Wow. So please, please dissect that for us, because it's bringing together so many elements.

Michelle Dennedy:

Yeah, so basically I'll give you a little bit of the origin of the book. There are three co-authors, Jonathan Fox, Tom Finneran, and myself, and we come from different perspectives. Jonathan is the operational guru; he came in initially as a pioneer in digital licensing. Tom Finneran, who coincidentally is also my father, has been in the computer architecture, services, and security business for the last half century. Then I come in from the legal side, as a Chief Privacy Officer. Basically, we came together and said that we're finally talking a lot in public about what it means to be a digital citizen. Does it mean you have to give up all your information to do things like search? Do you have to have a business that's run only on your own personal information, like a social networking platform that's free? Or is it something different? Should there be a different model, where we are able to extend command and control over elements of our identity?

And I really think it's the latter that creates what I call values to value, which means we value our integrity and individualism, the special powers we bring to bear as individual customers, as individual employees, as individual citizens, and we also respect the aggregate. How do we secure, protect, and monitor that ecosystem so that we know people are safe and that information can be verified?

So we take those basic concepts and we say, how do you functionalize that? What does that mean to the engineer? Because public policy is great, but when you talk to the people who are building things, they're thinking in terms of zeros and ones. Switching: where does the traffic go? What does it mean to process information fairly? What does it mean to have proportional access to information?

And the answer to those questions is actually a breakdown of various techniques. So I'll give you an example that's kind of an easy one. How many privacy policies have you read recently, Michael? Zero?

Michael Krigsman:

Yeah, zero to one. I see them, and, you know, I think the feeling is they're too complicated, and what's the point anyway?

Michelle Dennedy:

Exactly, so, perfect reaction. I think that's the reaction of many people. And the person like myself, the governance officer in charge of presenting this information to you, is actually serving many masters. When we write a privacy notice, we're required by law to make it very complicated, and yet those same regulators say, "Why isn't this simple enough for consumers to use?"

We also recognize that we live in a multinational world. Look at what happened just last night with the Brexit vote, with the United Kingdom voting to leave the European Union. What a shock for this generation; that's going to be a huge sea change. At the same time, we are heavily interconnected globally as commercial and cultural entities. So when I look at that, I can either say this is too hard and therefore there is no privacy, let's give up the whole thing. Or I can sit back and say these are policy decisions that have been made.

What are policy decisions? They're business rules. And how do we create systems? We look at business rules, we look at the functionalities, the requirements, and the specifications, and then we figure out what is possible to build with technology. What can you create with context: training, more accessible language, better platforms, using more senses, using video rather than flat text, for example? All of that can be brought to bear.
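As a purely illustrative sketch of the "policy decisions are business rules" idea (mine, not from the show): a policy choice can be written as a declarative rule table that engineers can implement and test, rather than as prose in a notice. The names and rules here are hypothetical.

```python
# A policy decision expressed as a declarative business rule that an
# engineering team can implement and test, rather than prose in a notice.
RULES = [
    # (jurisdiction, processing activity, condition that must hold)
    ("EU", "marketing_email", "explicit_opt_in"),
    ("US", "marketing_email", "opt_out_not_exercised"),
]

def may_send_marketing_email(jurisdiction: str, consent_state: set) -> bool:
    """Evaluate the rule table for one activity in one jurisdiction."""
    for rule_jur, activity, condition in RULES:
        if rule_jur == jurisdiction and activity == "marketing_email":
            return condition in consent_state
    return False  # no rule means no processing: default-deny

# An EU user who never opted in:
print(may_send_marketing_email("EU", {"opt_out_not_exercised"}))  # False
# An EU user who explicitly opted in:
print(may_send_marketing_email("EU", {"explicit_opt_in"}))        # True
```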

So for the future of the privacy notice, I think we can either say it's going to be a footnote in history, the realm of lawyers and regulators. Or we can say this is another opportunity to really create context, where we say to you, Michael: the choice of giving us your information is to give it here, and this is where I'm going to share it, and this is how it will be used. Or this is the choice of not using the platform, or maybe paying more for a platform that doesn't share your information with advertisers. So you can still get the kinds of information services you want without the tradeoff of giving up your own identity, or being tracked, or having your face or some other personal aspect used to run their business.

And it is a requirement to have a sustainable business. So you can't just say, make it all so that the citizen gets to choose everything. If I got to choose everything, I would tell the IRS, our taxing agency in the US, that I didn't make any money last year. But I certainly wouldn't want to tell my bank or my creditors that.

So there's a choice to be made that is your individual choice. There's another to be made that is your relationship choice, and that is the exquisite opportunity in privacy engineering, and a really important reason why I'm doing it, and where I'm doing it, at Cisco right now: because we're all about flow. We're about the network that connects every one of these services, whether it's a brick-and-mortar shop or a wholly information-driven social network. There's a network underneath making decisions about where information goes. So we want to build those systems to really give you more tools to control your data, and to understand what collective choices are made for monitoring, surveillance, and getting a warrant if there is a suspected crime going on. And I think all of those requirements are just perfect storms for innovation.

Michael Krigsman:

Right. So given all of these pieces that you're looking at, which range from how a business or an individual should respond to privacy or think about privacy online, to building products, to, as you mentioned, business process flows, even designing organizations, and then of course government policy: what is the guiding principle or the central thread that links all of these disparate pieces together?

Michelle Dennedy:

It's such a good question. I think at the highest order, the most comprehensive answer I can give you (I should say comprehensive rather than fluffy) is that it's really about respect and ethics. Respect for the individual as a person: I respect your information as an individual, but there's also our common respect for the informational economy.

And then ethics. There's a lot that we can do with technology; we can do mass surveillance at a scale never before dreamt of. And the question of ethics is: should we, and where should we? And, to stay on the United Kingdom simply because it's in the news today: how do you respect the 51 percent of a nation that says we want to decide independently of our neighbors, while remaining absolutely dependent economically and culturally on trade and interaction with those same people? Decisions at that level of ethics, morality, and respect have to be respected within the networks.

So when you are an American online, dealing with American ethics and our kind of rugged-individuality ethic as a nation, you may have different requirements and settings than when you're in France, where there's a very different perspective based on what happened during World War Two with anonymous tip lines. It's unlawful to have an anonymous tip line in France, which is kind of an interesting thing; it seems antithetical to privacy, but that's one of the choices they make.

So when you figure out how that flow goes from network to network, you have to have enough complexity that individuals, individual countries, and individual entities like businesses have some choice. But at the same time, you have to have enough standardization and common language that people can really interoperate with these platforms and get things done, and are able to go back and audit and police those same networks.
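One hypothetical way to picture that balance between standardization and local choice, using her France example: a single shared schema of policy questions (the common language), with per-jurisdiction answers plugged in (the choice). This sketch is mine; the names are invented.

```python
# A common schema (the standardization) with per-jurisdiction values (the choice).
POLICY_SCHEMA = {"anonymous_tip_lines", "opt_in_marketing", "data_residency_required"}

JURISDICTION_POLICIES = {
    "US": {"anonymous_tip_lines": True,  "opt_in_marketing": False, "data_residency_required": False},
    "FR": {"anonymous_tip_lines": False, "opt_in_marketing": True,  "data_residency_required": True},
}

def policy_for(jurisdiction: str) -> dict:
    """Look up a jurisdiction's settings; the shared schema keeps them interoperable."""
    policy = JURISDICTION_POLICIES[jurisdiction]
    assert set(policy) == POLICY_SCHEMA  # every jurisdiction answers the same questions
    return policy

print(policy_for("FR")["anonymous_tip_lines"])  # False: unlawful in France
```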

Michael Krigsman:

We are talking with Michelle Dennedy, who is the Chief Privacy Officer at Cisco, and you can ask her questions on Twitter using the hashtag #cxotalk. And we have a question from Twitter, Michelle, from Arsalan Khan, who's asking about the relationship of technology... let me put on my glasses; I can barely see without them. He asks: how tech-savvy do privacy officers need to be in order to give functional requirements to engineers? So I think the broader question is the relationship between the policy requirements and the engineering part, the technical part.

Michelle Dennedy:

Okay, I'm going to sound very narcissistic here for just ten seconds, but the only reason I busted my butt to write a book is that I needed to answer that question. A big part of success in this field, I believe, and the really interesting conversations and innovations, will pop out when legal people learn to speak tech and tech people learn to understand legal. And I'll give you an example.

At a prior organization, I went to India and was working with a developer doing very large-scale, back-end, enterprise-level tagging of systems, and it was for security purposes. So, a very virtuous purpose. And yet it's one of those things where suddenly, instead of handing a dumb client, a dumb piece of technology, to your customer, you have all this information coming back and creating big data sets, so that you can manipulate those data sets to discover anomalies, patterns, and threats. And at the same time, you still have that information.

So I flew out and worked with the engineering team from the beginning, which, to answer the question directly, is the number one thing: get there as early as you can. When I was discussing what we were expecting from a privacy perspective, I felt like I was being very clear. Here are the requirements; here's proportionality; here's the nation-state issue; here's how to recognize people who have to be bumped off the system and the notice that must be given; blah, blah, blah.

I got home, and about six months later I got a call from the team. They were so excited: they'd built something really cool and wanted to show me. And I said, well, show me the aspects of the technology that we talked about when I was in Mumbai. And they said, "Oh, we just thought you were going to write a disclaimer." And I just thought, you know, it's like Cool Hand Luke: "What we have here is a failure to communicate."

So that was really the impetus. We had been talking about doing a book for a long time, to capture all these really rich discussions we were having about the overlay of technology, policy, law, and business. In chapter 4 of The Privacy Engineer's Manifesto, we specifically address how you convert all these principles, fair access principles, things like [unclear, 17:02], and all these other specific privacy laws.

How do you convert that language into the language of engineering, which is requirements and specifications? And when and where do you fit that into either an agile or a waterfall development process? The same discussion applies when you're doing M&A transactions, where you're trying to get the value out of, usually, a smaller company, though sometimes it's same-size companies merging together. And what you're looking at as value from the economic perspective is often the customer base.

If you fail to engineer a process that respects the privacy of the customers you're buying, you may find that you paid millions and millions for an asset you can't touch, because touching it is against the law.
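A small, hypothetical illustration of the conversion she describes from legal language into requirements and specifications: a retention limit restated as an executable acceptance test, which slots into either an agile or a waterfall process. The 90-day figure and all names are assumptions for the example.

```python
import datetime
import unittest

RETENTION_DAYS = 90  # hypothetical business rule derived from a legal retention limit

def expired(collected_on: datetime.date, today: datetime.date) -> bool:
    """A record is past retention once it is older than RETENTION_DAYS."""
    return (today - collected_on).days > RETENTION_DAYS

class RetentionRequirement(unittest.TestCase):
    """The legal principle, restated as an executable acceptance test."""

    def test_old_records_are_flagged_for_deletion(self):
        collected = datetime.date(2016, 1, 1)
        today = datetime.date(2016, 6, 24)
        self.assertTrue(expired(collected, today))

    def test_fresh_records_are_retained(self):
        collected = datetime.date(2016, 6, 1)
        today = datetime.date(2016, 6, 24)
        self.assertFalse(expired(collected, today))

if __name__ == "__main__":
    unittest.main()
```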

So I think the other end of that question is, how much does a lawyer need to know? My answer is: as much as you can, and I'm sorry, because the law itself is complicated. You need to get geeky. I was a psychology undergraduate, and then I went to law school. Granted, I went into intellectual property law because I love technology that's new and not obvious. But you really do have to get down and dirty and ask the questions, so that you know when you're getting shined on by people who want to dismiss you and say, we're just going to look at the compliance checklist and give you a check mark.

And you also want to be able to give your input and say, "You know, it's not going to work this way, but how about this?" And it's the "how about this" that, to me, really inspired The Privacy Engineer's Manifesto, because it's the new authentication model. It's understanding what all these new big data and analytics models have to do to support what we're trying to do, which should lessen the complexity of data management for the individual and the individual business, while also making sure that we have standards that are clear enough and ethical enough to pass scrutiny.

So I know I answered probably four different questions, but the short answer is yes: get to know your technologists, take them out to lunch, figure out their language, and you'll get a lot further. And maybe you'll even get some of their budget, which is key for many professionals.

Michael Krigsman:

Okay, we have another question from Twitter. Wayne Anderson says that today, in the world of customer experience, personalization is everything. So from a privacy standpoint and an ethics standpoint, where do the boundaries lie, the appropriate boundaries?

Michelle Dennedy:

Okay, this is one of my favorite questions, because I often hear from marketing executives in particular, when I'm talking to them, "We need more, we need more." Well, it's true that you need more information about someone to become more personal, but I think the best analogy is the first-date analogy.

If you have gone online, and we all can now, and googled everything, and called that person's neighbors, and called their bosses, and followed them around town before the first date, and then you start presenting things based on all of your extensive surveillance of that individual, I will guess that you probably won't get date number two. Because guess what? Super creepy.

The online world isn't all that different. There are times when I simply want to be looking at shoes, because in the back of my head I'm trying to work on a really difficult business problem. I don't want you in my face, following me around the net with those shoes. I don't want you personalizing and sending me shoes I've already purchased, which drives me nuts. We're collecting all this information, and you're trying to sell me things I've already bought.

So personalization is critical, and this is why privacy engineering is so profitable, as well as necessary from a compliance perspective. And this is where I'm really focusing at Cisco today. My team in particular is looking at business models that are enhanced by having a grip on the complexity of human information: knowing when a person is coming down the purchasing funnel, or entering into a sister type of business. Air travel and rental cars have often been a great analogy, going back to the early days of the federated identity discussion.

That's the key: understanding who owns what, who can deliver on what, and when that perfect moment arrives when the person wants all of those information streams joined. And, really importantly for your compliance efforts going forward in light of the latest European legislation, how do you disjoin sections of that information appropriately, so that you don't lose the whole customer when they say, "I don't want some sort of subscription mailing from you"?

So by really thinking hard about what the components of a relationship are, and how we can lift and separate and curate them, we come to understand that we actually have a much greater asset to combine at a new layer.
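Her point about disjoining information streams without losing the whole customer can be pictured as per-stream consent records: revoking one stream leaves the rest of the relationship intact. A minimal sketch of mine, with hypothetical stream names.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerConsent:
    customer_id: str
    # Each information stream is consented to independently, so revoking one
    # does not tear down the whole relationship.
    streams: dict = field(default_factory=lambda: {
        "transactional_email": True,   # order confirmations, receipts
        "subscription_mailing": True,  # newsletters, promotions
        "partner_sharing": False,      # sister businesses, e.g. air + rental car
    })

    def revoke(self, stream: str):
        self.streams[stream] = False

    def allowed(self, stream: str) -> bool:
        return self.streams.get(stream, False)  # default-deny for unknown streams

c = CustomerConsent("cust-7")
c.revoke("subscription_mailing")          # the customer opts out of one mailing
print(c.allowed("subscription_mailing"))  # False: that stream is disjoined
print(c.allowed("transactional_email"))   # True: the relationship survives
```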

Michael Krigsman:

So compliance alone does not lead to greater profitability. But if I'm understanding you correctly, understanding the bounds of ethics and privacy and what consumers care about, and I'm not trying to put words in your mouth, means that you can design products that will be desirable to those consumers, and they will trust you. That is what leads to a greater relationship, and therefore they'll buy more from you, and obviously you create profit from that.

Michelle Dennedy:

Absolutely. We actually talk about that in the book, about Facebook. I think it's such an interesting example. It's always one of the more notorious ones, always in the crosshairs of everyone in this discussion. They have a wonderful team there, so I'm not casting aspersions on Facebook; it's a platform that I use in my own life.

However, when they first started, let's think about what the landscape looked like. MySpace was king, until a young girl met someone online whom she didn't know and went to meet him, and it turned out he was a murderous rapist. Suddenly there was bad press everywhere: who knows who's on MySpace? It was no longer this cool music-sharing platform; now it was a dangerous place where you don't know anyone. At that perfect psychological moment, in comes, whether it's the twins or Mr. Zuckerberg or whomever, these nice clean-cut boys from Harvard, and the only people who can get on the platform are nice clean-cut people from Harvard, and then nice clean-cut people from the London School of Economics, and then nice clean-cut people from Stanford.

So it was the ultimate velvet rope, if you will, when they first got started and first started building momentum. And if you think about it, it was not an open Internet at all; it was the ultimate in private. You had to be a student who could either earn or buy your way into one of these elite universities to even get on the platform. Very desirable.

Their next move, after they went beyond the .edu stage, was something they called circles: only people in your circle could see the page you posted, which was a static page at the time. Until one day I looked at my circles and found that one of them was London. I'm a friendly gal, but I don't know everyone in London. And so suddenly you have the whole world, and then Facebook as we know and love it today exists.

So the interesting thing to me, economically, is how they beat out MySpace, how they beat out Homestead, how they beat out At Home, all these names that many people probably don't even remember, that were going strong in the Valley. I was living in Silicon Valley at the time Facebook really got started, and at that point we'd heard 10 to 20 pitches for different platforms that sounded exactly the same. So if you were being pitched Facebook, you'd heard this story a hundred times. It was the authentication piece that really won the day.

So I think it's a really interesting and topical story about building in gradual trust, at least in the marketing around the product, and I hope in the engineering behind it. I'm not saying to take that trust away later. I'm saying build in that trust, really leverage a platform that is private, that is something people can trust over time, and I think you will find something that builds and grows, and people will stick with the platform, because otherwise it's really not all that different from any other messaging platform.

Michael Krigsman:

So essentially what you're saying is, build trust into your product, into your platform, and there will be customer loyalty behind that, and privacy plays an important role in that trust.

Michelle Dennedy:

Yeah, and I think privacy and trust are absolutely married to each other. They don't exist unless they exist over time. So, just as with the question of personalization, which is so important: don't be afraid to take a little time. Don't be afraid to actually build a true human relationship, and to respect that with tools. You know, loyalty cards that actually treat their loyalty guests with meal deals, those things are sticky. The things that last, the things that can be trusted over time, are the ones whose quality is the same and consistent over time.

I think those are the things that really build it. Where we get led astray, particularly in the privacy discussion, is this feeling that younger people just want to give away everything and share everything. And the other thing that's not recognized about the younger generation is that they will jump from platform to platform if they don't like what you're doing. They will hide one persona from their parents; they will share a very different one with their girlfriends, and a different one yet again with boyfriends, or someone they just met, or employers.

I think that behavior is what we need to build into every one of our systems, because that's the right pace. A system that figures out how we can be our complex, diverse human selves over time is a living and delightful platform for its users.

Michael Krigsman:

So I want to talk about government policy and privacy policy, but first one point relative to what you were just saying, from a practical standpoint: you mentioned young people. How should young people, and the parents of young people, be thinking about privacy? On one level that's kind of a far cry from the next part of what we have to talk about, the government policies and regulations and all of that, but I think it's what people care about. Everybody's nervous about their privacy online, and for good reason.

Michelle Dennedy:

Yeah, and I actually think it should be upside down. I think we should be talking first to our families, our kids, and our communities about what their expectations and desires are for privacy, with both the "I want it now" view of the younger brain and the wisdom of the parents who say, "I wanted it now back then too, and here's what happened."

It's called judgment. You know, we are not antiques; we really are useful to our kids, whether they acknowledge it or not. I think we should actually take these requirements and specifications for what we want as a society first, and bring those to the public policy debate, rather than sitting back and saying, "How are we going to control this? How are we going to get in front of it? How do we slow this down and stop it?" That approach has been chronically ineffective; if you read our history, we know it doesn't work. So I think those two notions are very much wedded together. And even the least technologically savvy person has a story to tell their kids about reputation, about how what you share, and the kinds of people you share it with, change over time. Great stories about how you keep those friends. Sad stories about how you lose a friend.

All of those things are reflected in our online communications as well, and all of them are really important basic values and principles to build into our discussions at the very earliest stages. We're seeing one-year-olds who know how to use these smartphones better than their parents do. So it can't be too early to start talking about your brand, your reputation, your personality.

You know: what's your story? What story do you want to tell? If people saw you and just read what you put out there, what story would they tell about you? These are the kinds of exercises we can do before we ever get into the cool discussions of what platform to use, how we code, or what the new professions of the future look like that will actually curate, regulate, and manage these online digital assets. And I do believe they are assets. I do believe Digital Asset Manager will be a job title in the future.

Michael Krigsman:

And I know you care a lot about this topic of children and privacy. You started an organization called The Identity Project.

Michelle Dennedy:

Yeah, The Identity Project was started after I was working as a consultant with a company called AllClear ID. I just got on their platform to see how it worked, and lo and behold, I discovered that my own daughter had had her identity stolen two times: once 11 years before her birth, before I had even met her father, and another time when she was about five.

One identity was used mostly commercially, to acquire credit cards and credit; you know, it ran up some debts and abandoned them. The other was used to traffic human beings across the border into the States and to acquire utility accounts, also abandoned. So by the time she was eight years old she had the worst credit score possible, what I call a financial birth defect, because there's no way I could have protected her identity 11 years before she was even born.

So I was really surprised, and I think the larger story is that I had already been a Chief Privacy Officer for probably about ten years when I made this discovery. I'm very aware, I teach people, I have my settings set and all that, and still this was yet another thing that hit me by surprise and could have impacted her future in a very material way.

Michael Krigsman:

Wow, your daughter. So, I want to talk about things like AI and privacy, and we need to talk about government regulation again, but just one more point on Facebook. How do we handle, what is, Facebook's responsibility? Because we are at their mercy from a privacy standpoint, and of course, as you said earlier, they have the standard disclaimer language and the contract, essentially, that we agree to. But when Facebook becomes part of the fabric of our society, our culture, which it is, where do their responsibilities begin and end? Look at it from a privacy standpoint, from a legal standpoint, from a moral standpoint. How do we dissect that?

Michelle Dennedy:

Yeah, so, obviously I am not their privacy officer, so I can't speak for them as anything other than a consumer of their services, which I do like. But I can speak in generalities about these social network platforms, and I think it's very much a two-way street. I hate to fork over too much responsibility to the consumer, but you really do have to understand what you're getting into.

Think about how much money you're paying for it. Are you paying any dollars and cents? No. So what are you paying with? Those little ads that you're seeing on the side...

Michael Krigsman:

You're trading your privacy.

Michelle Dennedy:

You're trading in data. Sometimes it's private data; sometimes it's metadata, trends, analysis. I will argue that most of the valuable advertising dollars go to the non-personalized giant data sets. I think those are much more effective, and I also think that by using the so-called softer sciences of psychology, sociology, and anthropology, we're going to figure out people's perspectives and behavioral trends a lot faster than by watching some teenage girl click on, you know, the blue scarf versus the red scarf.

So I'll say that I think the trend is going to be more and more big data married to big research, rather than training on discrete activity, and I think that is actually a protective trend for customers and consumers. I also think leveraging the tools that platforms do give you to manage your identity is important. And this is again where that conversation comes in, and it's more process than protest: before you get on the platform, think about what you are using that platform for.

Are you using it for photo sharing with grandparents, or for connectivity with friends far away? And if so, how personal do you want the photography to get? Privacy is so married to feelings, and when I do this work, I do it in a very industrial sense of curating an asset, a critical human asset. But think about what you're doing online. I share a lot online; I experiment on these social networks extensively because of what I do for a living and my personal and professional interests.

But not everyone does, and you have to understand what the platform is and what it does, and how many people can see what you're doing and judge it in a very flat sense. They don't know me in person, but they can make a lot of judgments. And then there's understanding basic safety for your kids. I see kids posting things online like, "Mom and dad aren't home, yo, I'm partying," with an address visible in the background: I'm home alone. So there's a lot to consider before you get onto the platform and use it as a tool.

So that's number one. On the other side, what is the responsibility of private and public organizations? This is where I'm really excited about the work we're doing with the IEEE on ethics. There is law, and all of these organizations, the larger ones in particular, work very hard. They have great teams. I know many of the Facebook team; I know many of the LinkedIn team. They work really hard, they're very sophisticated, and they're attempting to comply with the various laws and regulations of over 200 countries.

But compliance is one thing, and human behavior and human desire are another. So I think the focus should move more toward ethics, and instead of a "do no harm" or "don't be evil" kind of mantra, this is where I'm excited about really digging in with the IEEE teams and asking: is there a framework, like the privacy engineering framework? Can we use business activity diagrams and create business rules for ethics? What is your decision structure? I think that is a really critical responsibility for all of us who have fiduciary and operational control over data.

Michael Krigsman:

And you're referring to an initiative by the IEEE, which, as most people are aware, is a standards body, one that you and I are both involved with, on the ethics and policy-making issues around privacy with respect to artificial intelligence and autonomous systems. So why is this so important? And when we talk about AI and autonomous systems, why does privacy become so thorny in that domain?

Michelle Dennedy:

Well, okay, I'm going to back up; let's talk about artificial intelligence first. There are the rules of robotics to think about, you know, robots should do no harm, those kinds of things, and then you think about robots that ignore human commands because they're trying to make judgments on our behalf, that kind of science fiction artificial intelligence.

So there's artificial intelligence in the sense of simulated human intelligence: can we have a machine that basically exercises judgment? And then there's the artificial intelligence of questions like: what is creepy buying behavior, what is creepy monitoring behavior, what does an anomalous interaction start to look like when there's a suicidal child making statements on a public platform?

There's a world of different applications. There's even artificial intelligence just for deciding how quickly we fail over a back-end system to keep critical infrastructure running: keep the power on, keep the food supply safe, and keep any sort of negative or false information out of the system. So I think the applications for artificial intelligence are as wide as the imagination. It's not just robots that will come and walk and talk and commit [unclear, 40:03].

So that's number one: what is this? The reason I think it's so thorny is that we don't have a universal set of laws even around what should be done with automated decision making, with analytics, with big data set quality. You know, when is a data set large enough that you can rely on its quality? And are you really pushing quality down to the middle of the curve, to whoever is under the main part of the bell curve, when some applications, like specialized cancer treatment and personalized medicine, are really about the sigma out on the tail?

So you may have only two subjects, and there's a really interesting traditional statistical-reliability discussion that is itself ethics. There's the legal discussion of ethics. There's the cultural discussion of ethics. Think about how different the ethics of certain groups in the Middle East are compared to those in Canada, compared to those in China, compared to those in other places in the world.

So when you have that kind of innate, bedrock diversity, are there enough common threads that we can come up with a framework for at least decision making, if not a degree of commonality that will be accepted by as many people as possible, when you're using these advanced technologies that are largely invisible to the common user?

Michael Krigsman:

So we're almost out of time, but one of the key things that you're saying is that the laws are not keeping pace with the development of technology.

Michelle Dennedy:

They never will. They never will, and God bless them. You know, if the lawmakers could keep up with us, they should be here building stuff. We want them doing what they do, and we're going to do what we do, but I don't think the law will ever catch up. And I think there's a trend, probably less than 20 years old, that technologists, and even very, very small companies, need to understand: they have to get involved in the public policy debate. We've already asked lawyers to become commercial engineers. We've asked engineers to understand law, and now policy. But I think when we really look at this as a series of requirements and specs for what we're going to build, then you understand where your dance card is.

You don't have to figure out gun control; you just have to figure out this one piece of this one thing that you know something about, that the folks in DC and Brussels and Beijing may not, and you have to figure out a way to share that information so that we're not making dumb decisions in our legislation, because it kind of has to be a one-size-fits-many model. When I started out in privacy, I would have told you that by 2016 we would have a harmonized treaty for data flows, much like we have for shipping lanes and airlines and space travel. And we don't. We're balkanizing, and I think what we've seen in Europe in just the last 24 hours, Europe and the UK I should say now, shows we are separating ourselves more than we are unifying. So the complexity of this issue is not going to go away. And the good news is that where there's complexity, we need innovation, innovators, creativity, and even artists to tell us what it is we're dreaming of and what the impacts of those dreams are.

Michael Krigsman:

Okay, and I have just one last question for you, because unfortunately the time has just flown by. We have a lot of Chief Information Officers in our audience and constituency. What advice do you have for CIOs on dealing with these privacy issues and all of the complexity you've been describing?

Michelle Dennedy:

One of the most concrete things you can do, if you don't have a privacy officer sitting on your staff already: find them in legal, find them in public policy, find them in your Chief Privacy Office if you have one, and I think you should get one; it's working very well for us. Get them on your staff and talk to them early about what your objectives are, even if you don't think it has anything to do with privacy per se. Because metadata and all the things they are running for efficiency, systems that run well, systems that are efficient, things that are cost-effective, many of those decisions are going to be information-centric. That's my kids arguing in the background, by the way; I'll go beat them later!

Michael Krigsman:

Okay. Well, we have been talking with Michelle Dennedy, who is with her family in a hotel room at Disney World. Michelle, thank you so much for being here today.

Michelle Dennedy:

Oh thank you so much for having me. It's such an important and exciting topic, and I really appreciate your time.

Michael Krigsman:

And thank your kids as well because they've been really good.

Michelle Dennedy:

Thank you guys

Michael Krigsman:

Everybody, thanks again for watching episode number 179 of CXOTalk. Next Friday there won't be a show, because it's the run-up to the July 4th holiday here in the US; we'll be back the week after. Have a great weekend, everybody, and we will see you soon.

Companies mentioned on today’s show:

Cisco                            www.cisco.com

Facebook                     www.facebook.com

LinkedIn                       www.linkedin.com

Michelle Dennedy:

Blog                             https://blogs.cisco.com/author/michelledennedy

Twitter                        https://twitter.com/mdennedy

LinkedIn                       www.linkedin.com/in/michelledennedy

Published Date: Jun 24, 2016

Author: Michael Krigsman

Episode ID: 361