Disinformation, Cognitive Security, and Influence

Fighting disinformation (and misinformation) attacks has become a crucial part of information security (Infosec). On this episode of CxOTalk, three experts explain these dangerous attacks, which are based on influence and manipulation, and how to disrupt them.

Sara-Jayne Terp is a data nerd with a long history of working on the hardest data problems she can find. Her background includes designing unmanned vehicle systems, transport, intelligence, and disaster data systems with an emphasis on how humans and autonomous systems work together; developing crowdsourced advocacy tools; managing innovations; teaching data science to Columbia's international development students; designing probabilistic network algorithms; working as a pyrotechnician; and serving as CTO of the UN's big data team. Her current interests are focused on misinformation mechanisms and counters: she founded Bodacea Light Industries to focus on this, is working with the Global Disinformation Index to create an independent disinformation rating system, and runs a Credibility Coalition working group on the application of information security principles to misinformation. Sara-Jayne holds degrees in artificial intelligence and in pattern analysis and neural networks.

Pablo Breuer is currently the director of the U.S. Special Operations Command Donovan Group and senior military advisor and innovation officer to SOFWERX. He has served at the National Security Agency and U.S. Cyber Command and was Director of C4 at U.S. Naval Forces Central Command. He is a DoD Cyber Cup and DEF CON Black Badge winner, has been adjunct faculty at National University and California State University Monterey Bay, and was a Visiting Scientist at Carnegie Mellon's CERT/SEI. He has taught classes for various U.S. government agencies and industry on topics ranging from malware reverse engineering and exploit development to cyber policy and authorities. Pablo is also a founder and board member of The Diana Initiative, an InfoSec event focused on advancing the careers of women in cybersecurity, and is on the staff of BSides Las Vegas and CircleCityCon. Pablo holds degrees in computer science and is a PhD candidate in information science.

Dr. David A. Bray was named one of the top "24 Americans Who Are Changing the World" under 40 by Business Insider in 2016. He was named a Young Global Leader by the World Economic Forum for 2016-2021, accepted the role of Co-Chair for an IEEE committee focused on artificial intelligence, automated systems, and innovative policies globally for 2016-2017, and has been serving as a Visiting Executive In-Residence at Harvard University since 2015. He was also named a Marshall Memorial Fellow for 2017-2018 and will travel to Europe to discuss trans-Atlantic issues of common concern, including exponential technologies and the global future ahead. Since 2017, he has served as Executive Director for the People-Centered Internet coalition, co-founded by Vint Cerf, which provides support and expertise for community-focused projects that measurably improve people's lives using the internet. He also provides strategy to, and advises, start-ups applying human-centric principles to technology-enabled decision making in complex environments.

Transcript

Michael Krigsman: We hear terms such as misinformation, disinformation, and campaigns of influence designed to manipulate us. That's our topic on CXOTalk. Sara-Jayne Terp, tell us about your work and the things you're involved with.

Sara-Jayne Terp: I'm a data scientist. I've been working on misinformation and disinformation for the past few years. Before that, for many years, I worked on how algorithms affect human beings, specifically in autonomy theory and robotics.

All my work at the moment is misinformation. It's just my life. [Laughter] I work half of my time on misinformation that's financially motivated, so tracking down pages that are fake news sites and related sites. The rest of my time, I work on large-scale misinformation.

Michael Krigsman: Our second guest is Pablo Breuer. Please, tell us about your work and your affiliation.

Pablo Breuer: Currently, I am the Military Director for the Donovan Group, which is U.S. Special Operations Command's future studies and think tank, where I think about all of the things that nobody else is thinking about so that we can plan for them in the future. One of those things is the prevalence of misinformation, and so I've been working with S.J. for the last nine months or so on trying to solve this problem.

Michael Krigsman: Our third guest, guest cohost and subject matter expert, is Dr. David Bray. He's a well-known figure here on CXOTalk. David, welcome back to CXOTalk.

Dr. David A. Bray: Thank you, Michael. It's great to be here and great to be with S.J. and Pablo and to learn from their wisdom and expertise.

Michael Krigsman: When we talk about disinformation and misinformation, what are we talking about? Set the stage for us.

Pablo Breuer: Yeah. Yeah, there are a lot of arguments about the ontology of disinformation versus misinformation. What we've defined it as is the deliberate promotion of false, misleading, or misattributed information in either content or context.

Sara-Jayne Terp: That actually matters a lot. People think of misinformation as just being fake news, as maybe putting out text that's completely wrong. Most misinformation is actually true but set in the wrong context, where that context might be who you think it's coming from, when it's coming from, or where it's coming from. Those trails and those contexts become incredibly important, both for production and for tracking. Pablo, to you.

Pablo Breuer: No, no. That's absolutely right. In order for misinformation to work, it's got to be 95%, or better, true. The reason is that if it's totally false or mostly false, it's very easy to identify as false, and so it's got to be mostly true and just false enough that you're likely to believe it.

Sara-Jayne Terp: Yeah, I mean you're not trying to get people to believe lies most of the time. Most of the time, you're trying to change their belief sets. In fact, really, deeper than that, you're trying to change, to shift their internal narrative slightly or at least use their internal narratives.

Dr. David A. Bray: To pull on that, because that's fascinating, S.J. and Pablo, you're saying what it really is, is not necessarily deliberately putting false information out there, but taking things out of context in a way that shapes what people think. Can you tell us a little bit more about the narratives that are there?

Sara-Jayne Terp: It's also about the way that they feel. One great example: a lot of people say Black Lives Matter, and Black Lives Matter is an actual group, a real group, but some of the fake Black activism happened back in 2016. There are genuine problems; a lot of the work done was to exacerbate the emotions around those problems, to exacerbate the splits in society.

As an aggressor, you're trying to weaken an opponent, where that opponent could be a nation-state. It quite often is a nation-state but, these days, it could be a company as well. Pablo, again, let's play tennis.

Pablo Breuer: Mostly what these narratives try to do is take advantage of the listeners' or the viewers' cognitive and social biases. You already have to be predisposed in some way to believe the narrative that you're being presented. Otherwise, you end up with cognitive dissonance and you actually reject the message. You have to be predisposed in some way.

Now, one of the points that you made was about completely false information. As we get to deep fakes and fake videos and fake audio, which are, at least on the surface, indistinguishable from the real thing, it'll be interesting to see how far that gets pushed. That's certainly a great concern.

It's one thing when you're reading static text or when you're looking at a static picture and maybe the narrative that you're being presented about that picture is different than the narrative from which it was originally told. It's quite another thing when what your eyes see and what your ears hear can no longer be believed.

Sara-Jayne Terp: When we say narrative, there is some confusion around that, so I guess we have to clear that one up fairly early. Narratives, in the frame we use when we talk about misinformation, are the set of stories that are the baseline for your culture. That might be the baseline for your culture as an American, or as whatever kind of American you happen to be, like Irish American or Italian American. Name your set, except British American. We don't seem to exist. [Laughter]

For example, we're geeks. Geeks have a baseline set of stories. The layer on which a lot of the incidents, a lot of the attacks, happen is in shifting those stories: using those narratives, those base layers, and using stories and memes to push at them, to attack them.

Michael Krigsman: I have heard the term "Cyber Kill Chain."

Sara-Jayne Terp: We are the people who came up with this idea of mapping the Cyber Kill Chain to misinformation and it works quite well. Pablo, all yours.

Pablo Breuer: Sure. The concept of a kill chain, and it's unfortunate that it's got the word "kill" in there because that's not necessarily accurate.

Sara-Jayne Terp: We can call it an influence chain. That works.

Pablo Breuer: Yeah, so the concept is that, in order to accomplish misinformation, there are certain actions that you must take. If you envision this as a chain, if any one of those links is either skipped or broken, the attempt at misinformation fails completely.

What S.J. and I and the rest of the MisinfoSec Working Group did is, we took a look at how similar problems had been viewed in the past. One of those was cyber warfare, and so Lockheed Martin had written this Cyber Kill Chain, and others had been using it. Then MITRE had developed the ATT&CK framework, which is the kill chain plus associated tactics, techniques, and procedures to accomplish each link in the chain.

We went, "Hey, this is great. This actually works," and so we developed AMITT, which is our framework for adversarial misinformation and influence tactics and techniques. We've defined all of the necessary tasks to complete misinformation and we started cataloging real-world incidents, what the techniques are to accomplish each of those tasks.

Sara-Jayne Terp: When we talk about a chain, we talk about it in terms of the stages you would need to go through. For example, we have planning stages. We have content stages. We have a people stage for things like creating the fake personas you would need and/or recruiting the real humans you would need to create a full influence campaign, a misinformation campaign.

This is a lovely sort of horizontal versus vertical. This is where I wave my hands around, because I would love to show you the diagrams; the sketch below gives the rough shape.
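To make the chain idea concrete, here is a minimal Python sketch of that stage-and-technique structure. The stage names and techniques are illustrative placeholders drawn from the conversation, not official AMITT catalog entries.

```python
from dataclasses import dataclass, field

@dataclass
class Technique:
    """One way of accomplishing a single link in the chain."""
    name: str

@dataclass
class Stage:
    """One link in the influence chain."""
    name: str
    techniques: list[Technique] = field(default_factory=list)

# Illustrative stages only, loosely following the conversation above.
chain = [
    Stage("Planning", [Technique("Identify target audiences")]),
    Stage("Content", [Technique("Create memes"),
                      Technique("Reframe true material out of context")]),
    Stage("People", [Technique("Create fake personas"),
                     Technique("Recruit real amplifiers")]),
]

def attempt_fails(completed_stages: set[str]) -> bool:
    """The kill-chain insight: if any one link is skipped or broken,
    the misinformation attempt fails completely."""
    return any(stage.name not in completed_stages for stage in chain)

print(attempt_fails({"Planning", "Content"}))  # True: the People stage was disrupted
```

Breaking incidents down this way is what lets a defender ask, stage by stage, which link is cheapest to disrupt.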

Dr. David A. Bray: Do we see times when misinformation is being directed at all sides? Not just trying to sway one way but, in fact, trying to create polarizing wedges, social wedges, from multiple dimensions: taking groups that may not naturally like each other, but are not really that vehemently in disagreement, and making them vehemently disagree, with emotions running high? Is that what they're trying to do?

Sara-Jayne Terp: All the time. This is the point. This is just one form of a misinformation campaign. There are many other types, but this is one that's been well used against the U.S. Some of it has gotten physical. There have been a few times.

Amusingly, in the pro- and anti-Beyoncé campaigns after the Super Bowl incident, the opposing groups were given fake meetups and physical protest calls and turned up in the street opposite each other. We think that was the Russians playing, you know, testing stuff. This isn't just people shouting on the Internet. The potential is much, much higher and wider.

Pablo Breuer: Generally speaking, though, we talk about five overarching strategies for misinformation, and we call those the five Ds. One of those, which you alluded to, David, is divide, where you take a population and bifurcate it so that they're on opposing sides of the same issue. You get them fighting with each other; therefore, they're not paying attention to anything outside of their own population. We certainly saw that, or attempts to do that, during the 2016 presidential election.

Other strategies are distort, which just takes a narrative and distorts the actual facts. The Russians might say, "We're not invading Ukraine. We're liberating ethnic Russians."

Another one is dismiss: the typical "dismiss the accusation and make your own counteraccusations." China, certainly, every time they're accused of stealing intellectual property or engaging in cyber warfare, their standard narrative is to say, "We don't do that. However, we're often the victims of U.S. aggression in cyberspace."

Distract is another one of those strategies. Distract tries to ignore the current narrative and start a completely different narrative. A recent example of that is the MH17 shootdown. The Russians never directly addressed whether or not that was a Russian missile. Instead, they asked why a commercial airliner was flying through an active combat zone.

Then the last one is dismay, which is an ad hominem attack. These are attacks that are so outlandish that, even by responding to them and saying that they're outlandish, you lend them credence. Probably the best-known one of those is the Pizzagate during the last election.

Those are the typical tactics: distort, dismiss, distract, divide, and dismay. Any of those can be used either to coalesce people to get a narrative going or to bifurcate people and get them warring with each other so that you can do other things.

Sara-Jayne Terp: Yeah, those are the five Ds. Ben Nimmo came up with the original four Ds and we added "divide." Originally, they were talking about Russia against the U.S. but, as you hear from Pablo, this is all over the world now. Pretty much everybody who can stand up a misinformation campaign is doing it, and pretty much everybody who could be subject to one is seeing it in their feeds.
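As a minimal sketch, the five Ds can be encoded as a small taxonomy, here in Python, with the examples Pablo gives above attached as illustrations; the one-line glosses are paraphrases, not official definitions.

```python
from enum import Enum

class Strategy(Enum):
    """The five Ds: Ben Nimmo's four Ds plus 'divide'."""
    DISTORT = "twist the facts of an existing narrative"
    DISMISS = "deny the accusation and make counteraccusations"
    DISTRACT = "ignore the current narrative and start a different one"
    DIVIDE = "bifurcate a population against itself"
    DISMAY = "outlandish attacks that gain credence from any response"

# Pablo's examples, tagged with the strategy each illustrates.
EXAMPLES = {
    Strategy.DISTORT: "'We're not invading Ukraine; we're liberating ethnic Russians.'",
    Strategy.DISMISS: "'We don't steal IP; we're the victims of U.S. cyber aggression.'",
    Strategy.DISTRACT: "'Why was an airliner flying through a combat zone?' (MH17)",
    Strategy.DIVIDE: "Opposing groups called to the same street protest",
    Strategy.DISMAY: "Pizzagate",
}

for strategy, example in EXAMPLES.items():
    print(f"{strategy.name.lower():<8} - {strategy.value}  e.g. {example}")
```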

Dr. David A. Bray: I guess that gets to the question, why now? Is this just because the Internet is more widespread globally, or have these things been going on since the printing press, radio, and television, and it's now just that anyone can be a producer of misinformation?

Pablo Breuer: If you look historically, the ability to reach a mass audience was limited to a select few. Originally, when you're talking about hand-written scrolls, the church and governments were the only ones that were literate and had the manpower and the finances to do those things. Then you move forward to movable type and, again, it was expensive, so very few people could transmit. You could reach a wider audience because literacy became more prevalent, and anybody could receive these as long as you were within physical distance.

Now, certainly, one of the things that we also see occurring throughout history is that the purveyors of this kind of technology never envisioned how it could go awry. When the Catholic Church allowed for the Gutenberg press and printed the Gutenberg Bibles, they certainly didn't envision Martin Luther mass-printing his 95 theses and nailing them to church doors.

Then you fast-forward to radio and telegraph, and it's the same thing. Very few people have access to the studios to transmit, more and more people have access to receive the transmission, and it becomes the de facto new standard. Certainly, nobody envisioned that an entertainment show, War of the Worlds, would be mistaken for actual news.

Then you go to television and we are where we are with television. But, in the 1980s, if you were an American, you got your news from ABC, CBS, and NBC. When you went to talk to your neighbor, you could agree or disagree with the news. However, you at least saw the same news.

Each of those phases of the evolution of information would allow the limited few, i.e., the President of the United States, who could talk to ABC, CBS, and NBC and say, "Hey, look. I want to talk to the whole U.S. populace," to reach wider and wider audiences.

Now, what happened with the Internet is, we've now democratized who can reach mass audiences, so anybody can hop on social media. We now live in a world where an entertainer like Katy Perry can reach twice as many people as the President of the United States via social media, and there are no gatekeepers anymore.

We've now completed the circle. We've gone from very few can reach a mass audience to anybody can reach a mass audience.

Sara-Jayne Terp: This is also the dark side of the Internet, the dark side of big data, if you will. Back in the old days, and we're talking about ten years ago, we talked about the three Vs: volume, variety, and velocity. There's a lot more data, it's a lot faster, and it's across an awful lot of different platforms. Misinformation travels in the same places as well. It's just an awful lot easier.

I'm a data scientist. It's very easy to set up something that broadcasts across to a lot of people very fast.

Michael Krigsman: We have an interesting and important question from Twitter. Sal Rasa asks, "What can people do? How can people understand what's happening around them with this stuff?"

Sara-Jayne Terp: There's a whole set of things. Part of the reason we built things like AMITT, and broke misinformation campaigns and misinformation incidents down into techniques, was so we could look at each technique, look at each stage, and say, "What can we do against these techniques, against these stages?"

There is this idea of left of boom when we're talking about stage-based models. Left of boom is before. It's an old term from--

Pablo Breuer: Bomb.

Sara-Jayne Terp: Bomb disposal. Thank you. [Laughter] It's the stuff that happens before the thing goes off versus the stuff that happens after the thing goes off. In this case, the thing going off is the misinformation reaching the public.

Most of the time, if it's reached the public, it's getting late. We'd really like to deal with it before that. We're working on processes to do that.

If it's got to you, be critical. Be critical of what you're reading. Be critical of what you're sharing. If it looks too good or too exciting to be true, check it. Look at the provenance on it. Look at the date on it. I still get friends sharing stuff that's two, three years old because it's really exciting or really hits them in the feels.

It hits everybody. I work with this all the time, and I too have shared misinformation and been called out by my friends, who are wonderful and useful, which is why we need to do left of boom. Think about what you're reading. There are some very good explainers out there. There's a lovely one by the State Department that talks about pineapple pizza: how people are being divided over whether pineapple belongs on pizza, and some of the tactics to watch for.

Pablo Breuer: There are things you can do just as a consumer of media, and there are things you can do as a purveyor of media and platforms. As an average consumer, as S.J. mentioned, not only be critical, but recognize that most of us tend to follow news sources that match our own biases. You should also go back and listen to the news sources that absolutely enrage you, get the other side if possible, and then make a more informed decision because, realistically, the truth is somewhere in the middle.

The way misinformation often works, and S.J. again alluded to this, is that it tries to provoke an emotional reaction. We're all familiar with clickbait. We're all familiar with these fantastic headlines; then you click and you find out there's not quite as much meat there as the headline promised. If you find yourself having an emotional reaction, that's the time to take a deep breath and really examine what you're reading. Oftentimes, these stories say, "Well, party A said B and party B reported C," so, hey, go back and make sure that that's actually what was reported.

One of the other things you can do, and this is hard, is to try as much as possible to verify the provenance of the information. If the information is coming from a legitimate news source, and I leave it to the consumer to decide what that is, it's probably been verified by actual journalists rather than by somebody putting out an editorial blog. The things to keep in mind are: consider the source; consider where the original material came from.
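The consumer-side checks described here, stale dates, engineered emotional reactions, unclear provenance, can be summed up in a small sketch. The threshold, word list, and function name below are invented purely for illustration.

```python
from datetime import datetime, timezone

# Illustrative values only; real checks would be far richer.
EMOTIVE_WORDS = {"shocking", "outrageous", "destroyed", "furious"}
STALE_AFTER_DAYS = 180

def share_warnings(headline: str, published: datetime, source_known: bool) -> list[str]:
    """Run the checks discussed above before re-sharing a story."""
    warnings = []
    age_days = (datetime.now(timezone.utc) - published).days
    if age_days > STALE_AFTER_DAYS:
        warnings.append(f"Story is {age_days} days old; check that it is still current.")
    if any(word in headline.lower() for word in EMOTIVE_WORDS):
        warnings.append("Headline is engineered for an emotional reaction; pause and verify.")
    if not source_known:
        warnings.append("Provenance unclear; find the original reporting.")
    return warnings

print(share_warnings("Shocking pineapple pizza scandal",
                     datetime(2016, 9, 1, tzinfo=timezone.utc),
                     source_known=False))
```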

Dr. David A. Bray: Think about whether you want to look at the People-Centered Internet. There are other groups as well that are trying to do demonstration projects that show more resilient community approaches to this and other events. Try to play a role in encouraging what President Lincoln said, which is, "I do not like that person. I must get to know them better."

The same thing can be applied to either people or ideas. If you see an idea that makes you feel like you don't like it, take the time to at least get to know the other side, to articulate it well enough that you can understand where it's coming from.

Michael Krigsman: Are there centers of concentration today for organized disinformation and misinformation campaigns? In other words, where is it coming from today?

Sara-Jayne Terp: Well, there are the classics, like the Internet Research Agency in Russia, but there are also some emerging industries. For instance, we're starting to see what I guess are misinformation farms in places like the Philippines. We're seeing companies starting to look at misinformation as a service.

There are different types of misinformation. For example, China does these amazing charm campaigns. The information is about the omission of Tiananmen Square and the presentation of the nation in a good light.

Dr. David A. Bray: I'll point out that, in their framework, they have this great analogy of a triangle: the tip is on top and the base is wide. Oftentimes, those who are trying to do misinformation, or share or promote misinformation or disinformation, start with a campaign plan, then initiatives that flow from that campaign plan, then the narratives they're trying to change or shape and, finally, the artifacts of all that.

They can see everything that's happening and they understand the grand campaign. The people trying to defend, or to create communities that are more resilient to the misinformation being thrown their way, start with the artifacts first. They're trying to piece together what's happening and, oftentimes, it's not until long after that they go further up the chain and see the narratives, then the initiatives and, if they're lucky, maybe actually get to the source of the campaign.

That's what makes the counter-misinformation realm much like the cyber realm: the advantage currently goes to the offending side. The challenge for the defenders is piecing it together fast enough and, as S.J. and Pablo said, getting left of that boom, left of that event, fast enough to mount an effective response.
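One way to picture that asymmetry is as a linked structure running from campaign down to artifact: the attacker builds it top-down, while the defender reconstructs it bottom-up from whatever artifacts surface. A minimal sketch follows; the levels come from David's triangle, while the labels and field names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    level: str                       # "campaign", "initiative", "narrative", or "artifact"
    label: str
    parent: Optional["Node"] = None  # attackers know the parent; defenders must infer it

def trace_upward(artifact: Node) -> list[str]:
    """The defender's path: from an observed artifact back toward the campaign."""
    path, node = [], artifact
    while node is not None:
        path.append(f"{node.level}: {node.label}")
        node = node.parent
    return path

campaign = Node("campaign", "unknown; attribution is only ever probabilistic")
initiative = Node("initiative", "exacerbate an existing social split", campaign)
narrative = Node("narrative", "divisive framing of a genuine grievance", initiative)
meme = Node("artifact", "meme observed in a social feed", narrative)

print("\n".join(trace_upward(meme)))
```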

Pablo Breuer: Part of the problem there is, when you get to the Internet, it's just the vast amount of information. I love the Internet Minute infographic that comes out every year or two and shows you how much information traverses the Internet in a minute. It's millions of hours of YouTube video, and it's hundreds of millions of tweets and Facebook likes.

The first problem is, how do you look at all of that information? Then the next one is, how much of it is false or misleading? Then why is it misleading? Is there an intent there? Did somebody make a simple mistake? I could quote you all sorts of statistics, but I'd be lying about those statistics and I don't want to contribute to the misinformation.

The intent goes a long way. Sometimes people just make mistakes and there's no intent there. Discerning that and finding out where that information came from and what the intent was is a hard problem. It really requires not just a technical look, but a sociological look at this. This is not a technology problem. This is a sociology problem that is enabled by technology.

This is a sociotechnical system. Part of what that requires is looking at it from various different fields: policy, journalism, technical fields, social fields, and economic fields.

Part of the problem is we've all been looking at this problem differently. We've all been using very different languages, and that's also part of the reason we wrote the AMITT framework: to have a lingua franca, if you will, where different groups with different backgrounds can talk about the same problems in a way that all of the other groups can understand.

This is not going to be a silver bullet problem. This is going to be a thousand bullet problem.

Michael Krigsman: That explains why it's so complex to get rid of. I want to ask S.J. just to elaborate on something you said earlier about the Chinese charm campaigns. I'm quite interested in that because I was actually touched by that personally. It was quite fascinating, so can you just elaborate a little bit on that?

Sara-Jayne Terp: There's work on all channels. I tend to be careful talking about what I'm tracking because attribution is really hard. Generally, if we're talking about who did this, we can only ever put a probability on somebody having done it, a probability that a nation-state was involved.

We can say things like, there's a high probability Iran did this, or there's a high probability that China has done this, based on the way it's done and on the intent we believe is behind it. Yeah, China has done a lot of work in lots of different ways.
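That probabilistic framing can be made explicit. A tiny sketch, with made-up confidence bands, of how an attribution claim might be recorded as a probability rather than a flat assertion:

```python
def describe_attribution(actor: str, probability: float) -> str:
    """Express attribution as a confidence band, never as a certainty."""
    if not 0.0 <= probability < 1.0:
        raise ValueError("attribution is never certain")
    # Band cut-offs are arbitrary illustrations, not an analytic standard.
    band = "high" if probability >= 0.8 else "moderate" if probability >= 0.5 else "low"
    return f"{band} probability that {actor} was involved"

print(describe_attribution("a nation-state actor", 0.85))
```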

Dr. David A. Bray: Is this being done more for their own population or external?

Sara-Jayne Terp: Oh, okay.

Dr. David A. Bray: What are the reasons why?

Sara-Jayne Terp: Some is aimed at their own population. Some is aimed at external populations. We tend to see more of the external-population misinformation, well, disinformation, than we see of the internal.

Pablo Breuer: Here's an important point: different audiences are going to require different methods of delivery and different messages. That's because they've got these preexisting social and cognitive biases. Certainly, if you talk to the average Chinese citizen, they absolutely believe that the Great Firewall of China is not there for censorship. They believe it's there because the People's Republic of China and the Chinese Communist Party want to protect their citizenry, and they absolutely believe that's a good thing.

If the U.S. government tried to sell that narrative, we would absolutely lose our minds and say, "No, no, no. This is a violation of our First Amendment rights."

The ingroup and outgroup messaging often have to be different. Certainly, there is a lot of internal messaging that China does with its own people, and a lot of external messaging they do to the greater world. Part of this is just standard diplomacy; information has always been one of these instruments of national power. But there is no doubt that China has stated that they want to be the preeminent world superpower, and part of that is, "Hey, we want this Belt and Road Initiative. Here's why it's great for you."

Part of this is kind of funny. If this were a company, we would call it advertising. Because it's being done by a country rather than a corporation, we may call it something else.

Sara-Jayne Terp: False advertising in some cases, though.

Dr. David A. Bray: I have a question for you too. Going further, is this to demonstrate that representative forms of government can't work, to make us fight amongst each other? Is there money in it? Or is it simply to have influence on the world stage, using this as a tool of national power?

Sara-Jayne Terp: All of the above.

Dr. David A. Bray: Do you all worry? Since you're researchers in this space, are you concerned that they're ever going to come after you or other researchers in this space? How do you make sure the laser beam doesn't get focused on us?

Sara-Jayne Terp: Well, there is a reason I don't have a home address. [Laughter]

Dr. David A. Bray: Okay.

Sara-Jayne Terp: I think part of this is, we've created a discipline and communities. That there are a lot of people working in this field now means that the risk is spread, which is useful.

Dr. David A. Bray: Mm-hmm.

Sara-Jayne Terp: Yes, initially, there was concern. It was very comforting to go and hang out with the Special Forces for a while. That kind of helped a lot. Thanks, Pablo. [Laughter]

Michael Krigsman: Why? Why was that helpful to spend time with Special Forces and what was the nature of that hanging out and interaction?

Pablo Breuer: I wear two hats, and I mentioned the one hat, Military Director of the Donovan Group, which is that future studies and think tank. In my other hat, I'm what they call an innovation officer.

I'm one of two innovation officers at SOFWERX, which is a completely unclassified 501(c)(3) nonprofit funded by U.S. Special Operations Command. That's so we can get after nontraditional problems and nontraditional tactics and work with nontraditional partners. It allows us to get into one room S.J., a data scientist; maybe somebody from one of the social media companies; maybe a few Special Forces operators; and some folks from the Department of Homeland Security, to talk in a non-attribution, open environment in an unclassified way so that we can collaborate better, more freely, and really start to change the way we address some of these issues without worrying so much about--pardon the expression--the cultural China.

Now, am I worried about being targeted with misinformation? I think we should all be worried about being targeted by misinformation. If you're on social media or if you're surfing the Internet, you are constantly bombarded by messaging. Unlike on radio and television, these ads don't have to identify themselves as ads. If they're political ads, they don't have to identify who the special interest group is that funded them. Perhaps, it's time to reevaluate some of those things.

Am I concerned personally about being attacked? I don't think my ego is quite that big yet, but I don't think I'm going to be going to Beijing any time soon. If I do, I don't think I'm going to be taking any of my personal electronics.

Dr. David A. Bray: The other question is, what gives you hope? Is this a problem we can figure out new solutions to? Is this something where maybe we need new human methods? What gives you hope?

Sara-Jayne Terp: InfoSec gives me hope that another community has seen a similar-sized problem and dealt with it, created an entire ecosystem around dealing with it. It's not perfect, but there is a path that's already trodden. We've already picked up a lot of their methods, so they've given us an acceleration on this.

Also, humans; humans are resilient. We will adapt to this. We have adapted to Internet worms. We adapted to spam. We'll adapt to this as well.

Michael Krigsman: How is this different, misinformation/disinformation different from InfoSec, Information Security? I think the layman tends to lump it all together as bad stuff happening on computers.

Sara-Jayne Terp: The endpoints are humans. In most InfoSec, the things being attacked are computers: you're shutting down computer networks, you're shutting down individual computers, you're infecting them, or you're getting information off them.

Here, the things you are shutting down are human networks. You're poisoning human networks. You're affecting individual humans. It's carried on the Internet, but it's very much about the wetware, about the people.

There is a long tradition of humans being part of the attacks in InfoSec, of social engineering. You will attack the humans to get at the systems. Now, this is attacking the systems to get at the humans.

Recently, I was quite cheered. We were at the ISAO meetings when the Cognitive Security ISAO got announced, and the three layers now are physical security, cybersecurity, and cognitive security. We're now part of the layers that need to be protected. So the answer is yes and no at the same time.

Michael Krigsman: What do we do about this? We agree it's a bad problem. How do we solve it? How do we fix it?

Pablo Breuer: There are a couple of things that we can do. This ties into, "What gives you hope?" When S.J. and I first started talking about this about a year ago, nobody was talking about it. There were still arguments going on about whether or not there were misinformation campaigns. Now, everybody is talking about it.

You see it covered on the news, regardless of which news channel that you're watching. You hear it being talked about on radio stations. We're talking about it here on podcasts. That's hopeful.

Now that we've got a bit of a framework that multiple disciplines can use, the Department of Homeland Security has helped stand up this Information Sharing and Analysis Organization, or ISAO. There's a recognition that there's more to this than ones and zeros, that there is a social aspect to it.

We've received very positive responses. We've gone around and talked about what's going on. There's been really very little pushback, and there's been no pushback at all on the idea that we really need to address it.

There are some things we can do as technologists. If you're developing a platform, think about how that platform could be abused. We want to give everybody a voice, but how do we guard the whole against the one or two bad actors?

If you're a consumer: how do I make sure that I get a varied viewpoint and don't close myself off? How do I engage in civil discourse, which is something we've gotten further and further away from?

We can do these things. Having policy people and government talk about it, having technical people talk about it, and having nonprofits talk about it is all helpful.

I think there's been a recognition by journalists that, in the drive for more clicks and more advertising dollars, the producers are driving the news as opposed to the journalists driving the news. I know I've had discussions with journalists about this, and there is great concern. I think we're going to take a round turn on that shortly. There is something all of us can do to make this better.

Sara-Jayne Terp: Yeah. We take some of the money out of it so we don't get those paid-for amplifiers. The next part of the MisinfoSec Working Group's work is on counters. We're starting to collect where countermeasures have been tried and whether they have worked. We're starting to look at both the stages and the techniques within them to see how each could be countered individually.

This is similar to the work that was done in InfoSec on counters to common techniques there. We just keep going. We just keep moving forward on this.
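In the same spirit as the stage sketch earlier, a counters catalog could be keyed by stage, so that a defender detecting an incident at one stage can look up candidate responses at that stage and every earlier, left-of-boom stage. The stage names and entries below are illustrative examples, not the working group's actual catalog.

```python
# Illustrative stage names and counters only.
COUNTERS_BY_STAGE = {
    "Planning": ["expose the campaign early", "take the money out of amplification"],
    "Content": ["label or downrank false-context material", "provenance checks"],
    "People": ["remove fake personas", "rate-limit artificial megaphones"],
    "Exposure": ["media literacy: be critical of what you read and share"],
}

STAGE_ORDER = ["Planning", "Content", "People", "Exposure"]

def counters_left_of(stage: str) -> dict[str, list[str]]:
    """Candidate counters at the detected stage and every earlier stage
    ('left of boom' being everything before the public sees it)."""
    cutoff = STAGE_ORDER.index(stage) + 1
    return {s: COUNTERS_BY_STAGE[s] for s in STAGE_ORDER[:cutoff]}

for stage, counters in counters_left_of("Content").items():
    print(stage, "->", counters)
```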

Michael Krigsman: Where does the funding come from to solve these problems?

Sara-Jayne Terp: We're self-funded on this, but yes. David?

Dr. David A. Bray: You hit the nail on the head. I would say there is a nonprofit called the People-Centered Internet coalition through which we are trying to galvanize different players from the for-profit sector, from nonprofits, and from governments around the world, because you're absolutely right: this is going to be a hard challenge if it hasn't got some fuel to make it happen.

What both S.J. and Pablo were saying, to me, is a lot like how epidemiology became a field. When the first epidemiologist ever, John Snow, turned off the pump because it was a source of cholera, he didn't even know what bacteria were; we didn't even have germ theory. But they knew enough from what they were seeing that the pump was a source of the problem. Then the science evolved into having a framework, much like the one Pablo and S.J. have put forward. Now they're becoming more rigorous in thinking about which counters work, so it can actually become an evidence-based scientific field, but that's going to require fuel.

The good news is, we now at least have a common vocabulary whereas, a year or two ago, we didn't have one, and that gap itself was sometimes used to take things out of context. You're seeing it right now, and C-suite executives should help play a role in the creation of this new field, much like how cybersecurity has been maturing over the last 25 years.

Michael Krigsman: What should this field be called? What's the term?

Sara-Jayne Terp: We've called it MisinfoSec but, also, cognitive security covers a lot of what needs to be protected. You want a word for the positive thing you want to protect, rather than for misinformation, the thing you want to get rid of, because there'll always be another such thing.

Michael Krigsman: Is cognitive security the same as misinformation?

Sara-Jayne Terp: No. Cognitive security is the thing you want to have. You want to protect that cognitive layer. Basically, it's about pollution. Misinformation/disinformation is a form of pollution across the Internet.

Just because we're going to get comments about this, my position on this is clear, has always been clear. We don't want to remove people's voices. What we're trying to remove are artificial megaphones.

Michael Krigsman: A question from Twitter: What's the difference between misinformation and disinformation?

Pablo Breuer: That's an epistemic argument we'll never dig our way out of. It comes down to how it's perpetrated, how it's transmitted, and who is doing it. Misinformation is usually changing the context or content. Disinformation is completely made-up facts, generally speaking.

Michael Krigsman: Okay. Zachary Jeans asks a great question. Thank you, Zachary Jeans. "What conferences or meetups do you recommend around this subject?"

Sara-Jayne Terp: MisinfoCon is a series of conferences around this. There was a World Wide Web conference that had a side conference on it. If you look on the MisinfoCon website, I think a list is held there of all the conferences connected to misinformation.

For MisinfoSec, we're talking about a conference. Just watch that slot.

Pablo Breuer: I really hope the media conferences, the journalism conferences, and the policy conferences start talking about this. I think most industries should be talking about this in some form or fashion.

Michael Krigsman: How is this different from the work of various journalistic organizations and academic institutions with journalism centers? How is what they're doing different from what you're doing? It seems very similar to me.

Dr. David A. Bray: I would say the difference is, journalists, especially in the post-WWII United States, were trying to make sure stories were objective and had perspectives from all sides. I don't think there was ever an expectation that actual actors would deliberately manipulate things out of context, misusing these social media platforms and other outlets to create polarizing social wedges. What's different now, as the Internet has become more available and its immediate reach has surpassed television, is that it's about trying to change thousands of narratives, as opposed to the three or four news channels that used to shape what you think, and then trying to involve you as a human to amplify things that may be out of context, creating polarizing wedges even further.

This is something that can't be solved by any one sector. It's going to require nonprofits. It's going to require governments. It's going to require for-profits. It's going to require journalists. It's going to require all of us. Of course, the challenge is how to do that at scale, given that all of us are already busy to begin with, in a way that's manageable for open societies.

Sara-Jayne Terp: It's also at scale and in real time, so it's this idea of real-time response. Journalists tend to either do pieces on something or look after their own stories.

Michael Krigsman: David, there are many organizations, journalistic organizations, academic institutes. Craig Newmark, who we both know, is funding great efforts around journalistic and information integrity. I'll just call it generally fake news, even though I realize it's not the right term. How are those efforts different from what S.J. and Pablo are working on?

Dr. David A. Bray: It's the diversity of people coming together, as you saw. S.J. is a data scientist. Pablo is with the Donovan Group and SOFWERX. We've got Vint Cerf, one of the co-creators of the Internet, at the PCI. The difference is, we are trying to bring together a diversity of perspectives, which first requires a common vocabulary. That's not to say those efforts aren't needed; those are great. But it's really that big umbrella in which you have different groups and different perspectives, because this is going to require almost a whole-of-society response.

Sara-Jayne Terp: It's also the InfoSec perspective but, more importantly than that, it's that most efforts are right of boom at the moment. We're looking left of boom as well.

Dr. David A. Bray: Right, or before the misinformation is out there as opposed to clean up afterward.

Sara-Jayne Terp: Yes. Yeah.

Michael Krigsman: Back to the funding question, S.J., you mentioned that you are self-funded. The question is, why? With all of this topic so much in the news, it affects the government on such a national level, why are you self-funded?

Sara-Jayne Terp: There's an independence that lets us move fast but, also, getting funding that quickly for something that different wasn't realistic. Originally, the idea of MisinfoSec was just so different from what anybody else was doing that the funding wasn't going to arrive in time. We wanted to build this as quickly as we could.

We've just started a new company, CogSecTech, to build out from that into consultancy, so we may not be self-funded forever, but it mattered.

Pablo Breuer: There are other sectors that are funding this. The original discussion about this was funded by the Donovan Group.

Sara-Jayne Terp: Yeah.

Pablo Breuer: We had a radical speaker series in December of last year. You can find all of those talks, including S.J.'s, on YouTube by searching for SOFWERX. Facebook has just announced a new prize challenge where they will create deep fakes and ask people to look at them. DARPA has numerous challenges concerning misinformation and deep fakes.

We're just now starting to see that there are other funding sources out there. The problem is that nobody really saw this as a problem a year ago.

Sara-Jayne Terp: We did get a little bit of money from Craig Newmark, from the Newmark Foundation, for the MisinfoSec Working Group. We need to mention that too. Thank you. Thank you, Craig.

Pablo Breuer: Yeah, as a society, we are just now starting to recognize that this is a problem, and so you're starting to see government, industry, and educational institutions get interested. Hopefully, what we do correctly is share resources, share findings, and share research so that we can optimize the funds that are available to address this issue.

Dr. David A. Bray: If I could real quick amplify that: it is exactly as you said. What's clearly different from a year ago is that there are at least a dozen, if not more, flowers blooming in this space. But if we really want to be a force multiplier, there needs to be communication across these different efforts.

As they said, Michael, the challenge in the past was just having a common vocabulary to describe the problem you were facing, much less convincing funders that you actually had a way to tackle it, because people don't usually want to spend money unless they are convinced they're going to get a return on what they're funding.

I think, now that we've begun this work: one, their framework is amazing; two, we have a common vocabulary; and three, now that there are all these different efforts blooming, what we really need to make sure of is that there are ways of peer review and peer sharing of knowledge, so that we can be cumulative in our lessons learned as opposed to each doing things without any cumulative advance together.

Michael Krigsman: I would like to thank Sara-Jayne Terp, Pablo Breuer, and Dr. David Bray. Thank you all for taking time to be here. It's a very fascinating and important discussion, so thanks, everybody.

Pablo Breuer: Thank you, Michael.

Dr. David A. Bray: Thank you.

Michael Krigsman: Everybody, please subscribe on YouTube. Hit the little subscribe button at the top of our website and we'll send you our newsletter, which is chock full of information on upcoming shows and great guests. Thanks so much, everybody, and I hope you have a great day. We will see you again next time. Bye-bye.

Published Date: Sep 06, 2019

Author: Michael Krigsman

Episode ID: 619