Pandemic Venture Investment Series - Episode 7 | SALT Talks #134

“Because you've got a large audience [with the pandemic], it provides a large opportunity for nefarious actors to use it as a hook to either push new types of disinformation narratives or to connect longstanding disinformation narratives to exploit this new opportunity.”

Dan Brahmy is Co-Founder & CEO of Cyabra, a software company designed to identify disinformation. Vincent O'Brien is a State Department Foreign Service Officer and former US Army Special Operations Officer who recently worked at the Global Engagement Center to identify Russian disinformation.

Misinformation and disinformation have been around since long before the Internet, spreading through whatever medium corresponded to the period. Social media represents the latest iteration in communication and has served as a uniquely effective medium for misinformation and disinformation. The global pandemic has only exacerbated the volume and effects of disinformation because so many people are paying attention. “Because you've got a large audience [with the pandemic], it provides a large opportunity for nefarious actors to use it as a hook to either push new types of disinformation narratives or to connect longstanding disinformation narratives to exploit this new opportunity.”

Russia, China and Iran represent the three major state actors engaged in disinformation intended to hurt the United States. Russia and Iran are more focused on simply creating chaos among other nations and exploiting any domestic political divisions. Beijing has the more explicit goal of shaping a new world order that sees China at the top.


SPEAKERS


Dan Brahmy

Co-Founder & Chief Executive Officer

Cyabra


Vincent O'Brien

State Department Foreign Service Officer & Former US Army Special Operations Officer

EPISODE TRANSCRIPT

John Darsie: (00:08)
Hello, everyone. Welcome back to SALT Talks. My name is John Darsie. I'm the managing director of SALT, which is a global thought leadership forum and networking platform at the intersection of finance, technology, and public policy. SALT Talks are a digital interview series with leading investors, creators, and thinkers. What we try to do on these SALT Talks is replicate the experience that we provide in our global conferences, the SALT Conference, which we hold twice a year, once in the United States and once internationally.

John Darsie: (00:40)
What we try to do at those conferences and what we're trying to do on these talks is provide a window into the mind of subject matter experts, as well as provide a platform for what we think are big ideas that are shaping the future. We are thrilled today to welcome you to the seventh and final installment of our pandemic venture investment series, where top entrepreneurs, investors, and business leaders dive deep into the challenges and opportunities arising from the pandemic crisis and discuss breakthrough technologies that address issues from coronavirus prevention and cure to social distancing and food supply.

John Darsie: (01:17)
A reminder, this series is presented in partnership with our friends at OurCrowd, which is a leading global venture investment platform. Today's episode is called "The Tech Cure for Fake News and Deepfakes." It features Dan Brahmy, who's the co-founder and chief executive officer of Cyabra, and Vincent O'Brien, a State Department Foreign Service Officer and a former US Army Special Operations Officer. Today's episode will be moderated by OurCrowd president and CIO, Andy Kaye. Just a reminder, if you have any questions for any of today's panelists, you can enter them in the Q&A box at the bottom of your video screen on Zoom. And with that, I'll turn it over to Andy for the interview.

Andy Kaye: (01:57)
Thank you very much, John. As mentioned, I'm the president and CIO of OurCrowd. And joining us today is Dan Brahmy from Cyabra. Dan was born and raised in Paris, moved to Israel at the age of 14, and graduated from the IDC International Business School. During his career, he interned at Google, founded a startup called B2G, and then went on to become a senior strategy consultant at Deloitte before founding Cyabra in 2018. Cyabra is a platform that uses an AI lens to detect disinformation and filter the real from the fake.

Andy Kaye: (02:39)
We're also joined by Vincent O'Brien from the US State Department. Vincent is a State Department Foreign Service Officer and a former US Army Special Operations Officer, and recently worked at the Global Engagement Center to identify and counter Russian disinformation. He has served in many places around the globe, including Russia, China, Sweden, Germany, Estonia, Afghanistan, Southeast Asia, Africa, and Central Asia. Vincent got an MBA from Northwestern University's Kellogg School of Management. Today, he is representing himself and his personal views, and in no way represents the views and opinions of the State Department. With that, thank you very much for joining us, gentlemen. As the first question, Vince, could you lay out the definitions of misinformation and disinformation and provide some guidelines or examples for the audience?

Vincent O'Brien: (03:41)

Sure, can do. Thank you. Greetings from Washington, D.C., by the way. Let's just talk about misinformation, disinformation, propaganda, and their history. Certain state actors have been weaponizing information for decades. Even today, there are new technologies being employed that make it easier and faster to do so. In the past, they used newspapers and television and word of mouth and the like, but social media has become, at this stage of its development, a very useful medium for those who seek to achieve their goals through deception. Most of what we talk about now will be in the context of digital media, social media, and basically the internet.

Vincent O'Brien: (04:27)

Let's make sure we're clear on terms. Misinformation is sometimes used when the speaker actually means disinformation. That's probably because this is a new field. But misinformation is actually when someone mistakenly passes on information that they think is true, but it's actually false, kind of like when a friend or a relative forwards you an email that contains an urban legend that they think is real, or an alternative cure for the flu or even COVID, and that information is wrong and it's actually harmful. The content can be a lie, but the intention of the sender was not to lie or to harm.

Vincent O'Brien: (05:05)

Disinformation is different. That is when a group or a person deliberately creates and spreads a lie or manipulated information in order to harm another person or group, or to deflect attention away from an issue they believe to be detrimental to their own interests. I like to remember it this way: misinformation is a mistake, disinformation is deliberate. A recent example of disinformation would be the claim that the US military created COVID-19 as a bioweapon, brought it to Wuhan, China, and released it. That's disinformation. Misinformation might be, "It's a bioweapon that was released from a lab. We're not sure about that." There you have it.

Vincent O'Brien: (05:44)

Propaganda is something different. That's the act of arranging or organizing information in such a way as to bring about a perception that would not normally be held if the facts were presented in a straightforward way. I'll give you an example. It's an old example, but it's one we've been dealing with throughout my time in the US government, because much of that time I spent dealing with Russia.

Vincent O'Brien: (06:11)

Here's an old propaganda example. During Soviet times, the Russian and US national crew teams decided to engage in a race. There were no other competitors, just those two crew teams. The US team won. The headlines in the Russian state media outlets read, "Russian rowing team takes second place in international contest. US team finishes next to last." Now, that's propaganda. It's true in some ways, but if you look at the facts in a straightforward way, you know that they are totally different from the impression you were left with. Those are basically the definitions of the three areas of what we'll call deception in social media.

Andy Kaye: (06:54)
Thank you very much. Dan, can you explain the difference between fake and deepfake?

Dan Brahmy: (06:59)
Thanks again for this opportunity. The way that we look at disinformation, from our perspective and our own experience, is that it's sort of a triangle. When we look at the triangle, we see that the problem is divided into identity fraud, visual content fraud, and written content fraud. A lot of people, when they speak of fake news, for example, may not necessarily understand that they're referring to the written content aspect, meaning anyone can write anything and then go figure out if it's the truth or not. That's one aspect of disinformation.

Dan Brahmy: (07:36)
But the other two, which are really interesting and a really big part of what it is... You spoke about deepfake. It's a broader topic around the ability to generate highly realistic visual pieces of content, pictures and videos, that may be generated from scratch. Absolutely nothing. You press a button on a publicly available website, and then you can create a picture of someone who never existed before. The consequences are, of course, you can have a really great application for this, for the entertainment world, high net worth individuals, celebrities, that's wonderful. But there's a really bad side to this, because it means you can create identities and there's no backtrace around it. You can literally create people from the ground up.

Dan Brahmy: (08:27)
The other problem around deepfake is the video aspect of it, meaning I can make myself reenact someone else's facial expressions, even words. I can do what we call face swaps or facial reconstructions and put words into Vincent's mouth. And that would be a catastrophe if it's something really negative in terms of brand imagery, or in terms of a governmental agency or even a country as a whole. And so we look at the problem of visual manipulation as a fast-rising problem and threat under that larger umbrella called disinformation. Those are the two real distinctions from our perspective.

Andy Kaye: (09:15)
Thank you. Now that we've explained these basic definitions, why has COVID, or the pandemic, exacerbated those effects? Maybe you want to take that, Vincent, initially.

Vincent O'Brien: (09:28)

Sure. Just unmuting myself. Well, it's a global interest, first of all. It provides an opportunity for anyone to comment on it: you have governments commenting on it, you have pharmaceutical companies, you have political pundits. Like I said, because it's a global interest, everyone worldwide is tuning into it. Because you've got a large audience, it provides a large opportunity for nefarious actors to use it as a hook to either push new types of disinformation narratives or to connect longstanding disinformation narratives to exploit this new opportunity. I should have said in the beginning, I deal mostly with state-sponsored disinformation, so many of my examples will be about state-sponsored disinformation.

Vincent O'Brien: (10:24)

For the longest time, Russia has spread disinformation about US assistance to countries in their near abroad, in Central and Eastern Europe, in developing biolabs. These are biolabs for health purposes, to be able to detect cholera and other diseases, not for weapons or the like. And so because we have this global pandemic, Russia is now using it as a platform to push their narrative that these labs are actually a danger to the people in those countries and are an offensive weapon to be used against Russia, which is not true. That's one reason.

Andy Kaye: (11:11)
Do you want to add anything to that?

Dan Brahmy: (11:14)
Sure. On my end, obviously, it's a less elaborate and clever answer than what Vincent would say. But on the simpler side, I'd say that the actors Vincent mentioned earlier see an opportunity in the fact that people are staying at home. People don't have to go to their work offices and they don't have activities outside, or at least much less than pre-COVID-19, and that represents an ocean of opportunity, for good and for bad, from a disinformation standpoint, because people are what they call wired. They are wired constantly; they are consuming content and news and pieces of information much more broadly than they were previously. That's the simple explanation: the more people wired in, the more eyeballs I can draw into a certain snowball effect. And the larger the number of eyeballs, the higher the impact that can be created around things that I would like to promote actively.

Andy Kaye: (12:30)
Thank you. Who are the players in this world of deepfake and disinformation and what are they really out to gain?

Vincent O'Brien: (12:38)

Andy, actually, if I could just step back, something I'd like to add there: one of the other reasons why COVID is exacerbating these effects is because it's new. It's a new phenomenon and the actual information on it isn't clear. That's why it's a great space for disinformation, because even the experts who are trying their very best to define it and to explain it and to help the public don't have all the information. It's exacerbated by the fact that the point of origin of the virus, the country that it came from, China, has not been completely transparent about it. The learning curve has been a lot steeper than it should have been. I think that's just one more reason why COVID has exacerbated the effects of disinformation.

Vincent O'Brien: (13:22)

But then I'll go back to who the players in the world of deepfakes and disinformation are. From the US side, on the government side, the US and our allies have identified three major state players. That would be Russia, China, and Iran. Jihadist extremist organizations also play a role, but theirs is mostly around recruiting through websites, and they put out disinformation for that reason, but it's different. Russia and China generally have different goals in mind as well. I'll often say the Kremlin; when I say Russia, the Kremlin is interchangeable.

Vincent O'Brien: (13:55)
The Kremlin seeks to create chaos in order to just disrupt the current world order. Chaos is their goal. They want to weaken countries that they see as threats by manipulating their information environments to polarize domestic political conversations, to destroy the public faith in good governance and democratic principles, destroy the public faith in independent media and in science and in this case COVID. That's their general goal. In other words, more like if we can't have it, we want to make sure that you can't have it either.

Vincent O'Brien: (14:33)

Beijing's got another goal. They seek to deliberately reshape the current world order to their advantage because of their overall goals of expanding, getting access to resources, and setting things up in such a way that they would have uncontested global leadership. They deploy comprehensive, coordinated, whole-of-government influence campaigns. They have the resources to do so and that's the way they do things. 1.4 billion people in China, so they do things big. The point of these global information campaigns is to promote and maintain the Chinese Communist Party's narratives domestically and globally. They use not just disinformation, but censorship, coercion, and sometimes intimidation of dissidents and others to counter and silence criticism and portray the PRC, the People's Republic of China, as a benign, positive, and non-interventionist power, when exactly the opposite is probably the direction they're heading.

Vincent O'Brien: (15:33)

Iran's goals are similar to Russia's, but on a regional scale. They attempt to sow discord and mistrust in Iraq, especially between Kurds and other groups, and in Syria and elsewhere in the Middle East, in an effort to advance their goal of becoming a dominant regional power.

Andy Kaye: (15:49)
Understood. Thank you very much. Dan, anything to add? Anyone from the private sector playing here, or individuals?

Dan Brahmy: (15:55)
There are. I'd like to pick up where Vincent left off and say Vincent was able to give an extremely detailed answer from a public sector, governmental perspective. I think that there are additional players. If we're trying to look at this from a different angle, there are companies like ours, private sector companies that are for-profit, or sometimes even NGOs that are here to do some good and bring some transparency to the world. There's a very thin line between the regulatory space, the governmental space, the academic space, and private companies. From our perspective as an early stage startup, it feels like there's a problem that is trying to be solved as we speak. Everyone has a play around this, but there seems to be a need for a certain level of cooperation.

Dan Brahmy: (17:02)
Vincent was right again. The problem is very new... Disinformation has probably existed for much longer than we think. But the methodologies being applied today, thanks to the advance of social media platforms, of blogs and forums and so on, make everything more accessible. That's the problem. And so that's why we probably need deep technological companies like ours that are trying to build technological solutions that can be used by the private sector, whether it's consumer brands, CPG, food and beverage and so on, but also sometimes by the public sector itself. Because, as much as we would like to believe otherwise, sometimes the public sector also might turn to NGOs, academia, and private companies for the sake of strengthening their own internal capabilities up to a certain point. My point of view is there must be some sort of cooperation between the four. That's how we look at this.

Vincent O'Brien: (18:18)

If I can echo that real quickly, Dan brings up a great point. At the Global Engagement Center, and with the Department of State and Department of Defense and others, we actually work very closely with the private sector and with NGOs and those in civil society who, over the course of the last five or six years, have become real experts in identifying disinformation, keeping track of it, keeping a database of it, and developing ways to counter it and to educate society. But we really couldn't do what we do without the private sector. I like to say there's the speed of government and there's the speed of relevance. And the private sector and companies like Dan's and others like it move at the speed of relevance. This is a partnership that's not going to go away. It's just going to make things better and we're going to work more and more closely in the future. I can see it coming.

Andy Kaye: (19:12)
Is that really our way of protecting ourselves, or are there any other technologies or methodologies out there which can help us?

Dan Brahmy: (19:22)
I think the more we move forward, from 2016 onward, people are starting to understand the meaning of that threat, slowly but surely. It might not be the role of a private company like ours, which is solely focused on technology, but I think that someone should take the responsibility. The government is starting to do that really well, by the way: to educate people, to make them understand and teach them. I don't want to say what is right from wrong, because that's a subjective point of view. I would rather say what is real from fake, and how you can educate yourself really quickly to distinguish much better. Because today the technology is out there, whether it's for visual manipulation, where we spoke about deepfakes and such, or whether it's simply encountering a group of 50 bots and sock puppets that are aggressively promoting an agenda.

Dan Brahmy: (20:35)
I think that people need to understand that before you engage and before you do something, you ought to be cautious, to read between the lines. It only takes a few seconds to be much, much more clever and more accurate, even as an individual. That's the one thing that is really important besides the technological advancements and all the AI that Cyabra or any company like Cyabra could develop. I think there's a five-second gap that people need to put into their minds and just implement, and say, "Wait a second, before I do something, should I?" That's a really good point, I think.

Andy Kaye: (21:20)
Okay. It certainly sounds like we've picked up on governments, companies, and others. How would you best describe the ecosystem here and how should we really visualize it?

Vincent O'Brien: (21:36)

Yeah. That's interesting. From our perspective at the Global Engagement Center, when I was with them, we put together a document, and you can find it, it's out there in the public space. It's called Russia's Pillars of Propaganda and Disinformation. There are basically five pillars. We say that these five pillars make up an ecosystem because they work in concert with each other. From our analysis of the Russian disinformation system, we look at it and say these five pillars have, on one end of the spectrum, official government communication, which is open and out there but may contain disinformation. And then on the other end of the spectrum, it's cyber-enabled disinformation.

Vincent O'Brien: (22:25)

I think the greatest example of that would be the 2016 US Democratic National Committee cyber hack, where the hacking and release of emails from John Podesta revealed a few things in the lead-up to our election. That was cyber-enabled disinformation. Another recent example: the Lithuanian Ministry of Foreign Affairs website was hacked, and a false press release was put on there stating that the US government was moving assets from Incirlik, Turkey, up to Lithuania. It was completely made up, and it happened when our secretary of state was actually in Europe talking about similar issues.

Vincent O'Brien: (23:12)

That's kind of the other end, cyber-enabled disinformation. A complete hack. The Lithuanian government was able to immediately identify that as disinformation. But that's something that takes a lot of assets. It's very deliberate. It takes more than one arm of government to do that. And really, only governments can accomplish that; no one else in the spectrum has the power. So on the two ends, you've got official government communications and cyber-enabled disinformation. In between, you've got state-funded global messaging like RT and Sputnik, which are outward-facing news organizations funded by Russia. And then you've also got proxy sources.

Vincent O'Brien: (23:52)

This is the hardest, murkiest area, where you've got websites that are consistently telling a narrative or policy line of a government, but they're not directly connected to the government. Their source of funding is murky. Oftentimes they get it from advertising, and they exist because they make money off of advertising. But they're always pushing a government line, whether it's a longstanding narrative or one that's very, very topical, like, for example, COVID. And then finally, there's the weaponization of social media. That's also in the center, and also very difficult to find: false personas on social media. Perhaps they make two different false personas and have them take opposite ends of an issue, only to exacerbate the differences within the actual organic conversations on that particular platform.

Vincent O'Brien: (24:46)

That's kind of the ecosystem. All of this is fitting for creating chaos because it doesn't require any type of harmonization among the different pillars. Sometimes the story can start with a government press release, or sometimes it could just start in the middle on a proxy site. And then, because it goes viral, you have a state-sponsored media organization talking about it. And then it could jump and gain traction in the mainstream media worldwide. This is why we call it an ecosystem. It doesn't require a memorandum or a secret dead drop to pick up and read and say, "Okay, let's all go do this." The system understands itself and just reacts at various times to whatever they know to be a direction the state wants to take a narrative.

Andy Kaye: (25:39)
To understand the five pillars, are they directed toward governments, or is the private sector also a prime target for these attacks?

Vincent O'Brien: (25:53)
Dan, do you want to take that one?

Dan Brahmy: (25:55)
Yeah, absolutely. I can only speak, I assume, from the private sector side. I don't know what's happening within the governmental sector, but this is not an assumption. This is what we see on a weekly basis. We've been working with Fortune 500 companies in the news and media world, which are the gatekeepers. They are the gatekeepers of information moving from the ones gathering it to the ones tuning in and listening. Sometimes they are having a hard time filtering and vetting what is going through that pipe.

Dan Brahmy: (26:36)
We're seeing consumer brands in the field of film production, and celebrities, and even high net worth individuals. None of them are government affiliated, and a lot of them are suffering from... you would not imagine. Suddenly, you see a rumor around a football player, and it's a rumor being propagated by tens of thousands of non-existent, completely inauthentic profiles on social and traditional media that have the power to drive an outcome that could hurt tremendously, from a financial standpoint and from a brand-image standpoint as a sports athlete. That's something that we've been seeing.

Dan Brahmy: (27:28)
We've even seen e-commerce websites that have seen what we call a decrease in their sales volume throughout the Black Friday period. It's absolutely insane. It's becoming more and more accessible, meaning that anyone can go into that system that was created a few years ago and can skew public opinion for better or for worse. So, absolutely, private companies are being targeted and are being skewed by bad and fake actors. No doubt in my mind. We're seeing this on a weekly basis.

Vincent O'Brien: (28:07)

[crosstalk 00:28:07]. Sorry, Andy, just quickly on the question of whether it's directed at the private sector or government: obviously, it's directed at both. It's also directed at national strategic industries, which are obviously run by the private sector but are of great interest to governments as well. For example, you'll see lots of disinformation when, let's just say, Japanese, American, and Russian nuclear power plant development companies are up for a contract in South America. You're going to see lots of disinformation around that. Of course, it's of national interest to many players there who gets the contract to develop that commercial nuclear power plant.

Vincent O'Brien: (28:59)

You'll also see it in the areas of oil and gas and other energy. Nord Stream 2, once again, is a public-private consortium to bring gas from Russia straight to Germany, but there's a lot of private sector business around it. There are very, very strong differing opinions in Europe about how gas should be coming to Western Europe and from what sources. It's something that really affects both sides. And so you'll see a lot of that in that area as well.

Andy Kaye: (29:35)
Thank you very much. Dan, starting with the private sector, what are the challenges, and how can technology be used to solve this? Is it the good versus the bad? Who's leading that race?

Dan Brahmy: (29:54)
That's a tricky question. I'll answer as humbly as I can. I'd say that first and foremost, as much as technology can do great work at raising the flag really fast and analyzing the authenticity of everything that's happening out there, I think that for the time being, and I hope Vincent agrees with me, there will always need to be, as the last mile of that trail, some sort of context. And so while technology can do a lot of things, creating context around a vast amount of data is something that is, again, very difficult for technology to do.

Dan Brahmy: (30:46)
I would say that currently technologies have the ability to gather the relevant pieces of information, roll everything into one big snowball, and start deep diving into just gigantic amounts of data. That is something within the technological boundaries that exist today, and it saves a lot of time for that one last mile, which is crucial and almost indispensable. Now that the technology has found the exposure, the level of impact, and the level of realness, the authenticity, of a certain disinformation threat, the last mile will always be: what are we going to do about it?

Dan Brahmy: (31:36)
The countermeasure is something that, at least from my angle, we haven't seen the companies doing the detection take on. You mentioned the good and the bad, and I'm always joking about the subjectivity of it, but we believe that Cyabra is doing the good side of it because we're not part of the system. We're trying to find a cure for it. We will never be entangled with the countermeasure aspect. I think there are other people, other players, other organizations that should take on that aspect, and there's a very clear line in the sand around this. That's how I look at this. I think the ability to go deep into vast amounts of data and reach a high level of accuracy, 75%, 80%, 90% and above, filtering between the real and the fake, is something that can tremendously help with that last, indispensable mile.

Andy Kaye: (32:39)
Vincent, is the government relying on the private sector or solving this on its own?

Vincent O'Brien: (32:45)

Well, we work in concert with the private sector. I think I mentioned before that the private sector can move much more quickly. They're developing AI products and machine learning products and big data analysis products that government, simply by its very nature, cannot go out and design itself, just like we don't go out and design the office furniture and the computers that we use. It's good that there is a private sector for this. We work with certain private sector companies to go out and do this for us, but in a certain special kind of way.

Vincent O'Brien: (33:26)

A normal process would be: if you're a company that sells shoes, or you make movies and you have stars that you want to promote, you would go out and find a digital marketing company that has the ability, not so much to manipulate social media, but to use social media to promote that product. Well, we have to do the opposite. When that product gets promoted, there's a cluster of conversation around it, around the release of those basketball sneakers or the release of a film in a certain country, and all of that concentrates in a certain period of time. We see it differently.

Vincent O'Brien: (34:10)
What happens when we see disinformation is say there's a concentration of false personas around an election in a place like Chile, or in the Democratic Republic of Congo, or the like. We have to start at that point, look at it and say, "Okay, which of these are false personas that are attempting to direct the conversation and why?" These technology companies like Cyabra and like others that we use can help us identify that and then reverse engineer it and bring it back to the source. Whereas in the opposite and the totally legitimate world of capitalism, the world of commerce, you know the source. You're the company, you want to sell shoes. You purchase the services of a digital marketing company, and you go about and do what I just described.

Vincent O'Brien: (34:50)
It's very, very murky going in the other direction, so we need the help of companies that have designed these systems to help reverse engineer them and get us back to the source. One thing I also want to agree with very strongly: Dan and I go back a long way, and in many ways we've developed our views on this together as the situations developed. By all means, it comes down to the last mile. The technology companies can put the data together, bring it to you, and hopefully describe it in a way your C-suite can make decisions on, but it really just comes down to a decision from leadership, one way or the other, about what to do.

Vincent O'Brien: (35:35)
It's a threat, but it's also a cost to your company in many ways, like anything else is a cost. And so it comes down to an executive decision. What the technology companies can do is bring you an interpretation of this data, but in the end, you've got to be able to make a decision on it. I think the best technology companies are the ones that explain it to leadership in a way that lets them make the very best decision with the information they have at the time. If it comes to it, I can talk about what I believe is a good protection plan for a company, a good way to go about that. If there's interest, I can talk in some detail on it.

Andy Kaye: (36:16)
You were very careful to say there is a handover. Do you see governments as the body that will come back and deal with that threat once it's been interpreted, or would it be a different type of company that could potentially jam or react to what's been identified as fake and endangering the corporation or the government?

Dan Brahmy: (36:40)
I assume that question was pointed at me this time.

Andy Kaye: (36:42)
Well, it's both. I think, yeah. But please start off and then [crosstalk 00:36:48].

Dan Brahmy: (36:47)
Sure, sure. I assume there are things that the governmental sector is without any doubt able to do that a private sector company will never be able to do. We've been working with a few public sector agencies and a few private sector companies, and we see the difference in the remediation, the way they would like to solve the problem. We understand that yes, of course, the governmental sector has the will and the power to solve the issue, and they are much, much more educated nowadays than the private sector is. They represent an excellent way, in a sense, to complete the equation. I absolutely believe that. I think they are a major part of this equation in order to solve disinformation. Am I saying that the government has figured it out, and you count to 365 and, poof, no disinformation? That's not what I'm saying. Vincent knows I appreciate everything that is being done.

Dan Brahmy: (38:04)
I'm saying it's a tough problem. And it's so recent and so new that while we're figuring it out and researching it from a tech perspective, an academic perspective, and a regulatory perspective, you need to remember that we're all on the same side of the boat. On the other side of the boat are those bad and fake actors, who are getting better and better every time. While we learn from our mistakes and from the gaps, those gaps are still out there, because sometimes it's simply a cat-and-mouse game. We'll get better at something, but they will too. I hope that answers the question.

Andy Kaye: (38:54)
It does. Fascinating. Vincent, do you want to add anything to that?

Vincent O'Brien: (38:59)
Yeah. It is obviously the responsibility of government to respond to national threats, and threats that are persistent and directed at... that are actually crimes. We run into... and this is nefarious actors using our system of openness against us, because we do run into an issue with... In the United States, we call it the First Amendment, but it's just freedom of speech, and it exists in all democracies. And so it becomes difficult to say, how do you go after this? Do you zap somebody and burn their IP address? They'll just start another one. And also, you have to look at it and say, "These are lies. But when in the course of human history have we ever been able to stop people from lying?" They're just lying better, and they're lying on a medium that has a global reach and moves faster. But they're still lies.

Vincent O'Brien: (40:00)
When you talk about trying to limit that, you also get into the area of limiting free speech. And that's a very, very slippery slope. The Supreme Court in the United States has had a lot of opinions on this. They've long held the First Amendment, the freedom of speech in the United States, to be sacred, and it extends to political advocacy. But one Supreme Court opinion that I think sums it up perfectly is that the remedy for speech that is false is speech that is true, and this is the ordinary course in a free society.

Vincent O'Brien: (40:43)
Now, that's hard to accept when you know someone is lying about you on the internet, in a chat room or on a website, and they're doing it deliberately to harm you. But in a society where we enjoy the great democratizer, the internet, it's advanced citizenship, and it comes with advanced problems that we have to learn to accept. Now, getting around that, while there are certain legal recourses and certain things that are the responsibility of government, the most important thing is to build resilience against it.

Vincent O'Brien: (41:21)
Dan touched on that earlier, educating people... I think the real solution lies in education, critical thinking skills, critical analysis, and a recognition on the part of individuals and societies that someone out there in the digital space is trying to manipulate your decisions and your viewpoints through lies and through manipulated information, deepfakes, and the like. They appear plausible, but you have to recognize that indulging some natural reflex to only view information that confirms your existing bias, or to believe what you see because it's easy and feels good to believe, is intellectually lazy.

Vincent O'Brien: (42:03)
It is really individuals' and societies' responsibility to develop critical thinking skills, or at least to apply the ones we have to what's happening in the digital space. That will counter this far better than laws or decisive, kinetic countermeasures and the like, because the greatest way to defeat a lie is for everyone to know that it is a lie, and for people to understand the truth and demand the truth.

Andy Kaye: (42:38)
For one, it's extremely frustrating, sure. How should we view this? What's a crime? What's not a crime? Where's the line? I certainly know from the world of finance we come from that it's very well defined where the red lines are and what the potential punishments are. Do you think the same should exist here, or does it exist here?

Vincent O'Brien: (43:01)
Well, libel and slander are crimes in many countries, and it's hard to prove. It's also hard to find the source in social media. I described how proxy sites are a great way to mask where it's coming from. It is really hard to get to who actually committed the crime. That's not to say there shouldn't be a body of laws in countries, and a body of laws agreed upon internationally, just like there are in many other fields: human rights, the Geneva Conventions and the conduct of warfare, maritime laws that we all agree upon. There can be laws, or rules and regulations, that we agree upon internationally about how information should be conducted on the internet.

Vincent O'Brien: (43:58)
But it's going to take some time to get to a consensus on that, and I highly doubt that everyone will agree with it. I think in the end it becomes, what are you doing to build your resistance and resilience to it versus what are you doing to actually prosecute the purveyors of it? I don't think we should try to do that, but I think, realistically, it will be very, very hard to get consensus on what constitutes a crime in terms of disinformation, and also on what the punishment should be.

Andy Kaye: (44:32)
Dan, do you want to add anything to that?

Dan Brahmy: (44:36)
I think Vincent touched on the crucial points. Really nothing to add to that.

Andy Kaye: (44:42)
Just from a country perspective and continuing on this line which I think is really fascinating, do you see the West at least reaching a consensus and is there a global acceptance that it really is Russia, China, and Iran against the rest of us, or are there other gray areas and gray countries and gray organizations as part of this too?

Vincent O'Brien: (45:06)
I think the latter. It's not just those three; they're simply the ones we focus on, and I take the opportunity to put that out there because it's where most of it's coming from at the moment. But if it proves to be an effective medium, others will have already gotten into the game. It may not be publicly available information, and it may be boring to actually read about in the news, but the truth is that governments are working together really closely. We are working to have a common playbook on how to deal with disinformation so that it works broadly across regions. I know that in South and Central America, countries are working together on this from a common approach. The Europeans, the Baltic states, the Nordic states, and other Northern European countries have come together and worked informally from a common playbook on how to recognize, how to be resilient against, and how to respond to disinformation.

Vincent O'Brien: (46:13)
It's going to get harder when it comes from more than the sources I've mentioned, and also if those sources are murky and hidden inside proxy sites and false personas on social media. But there is a general consensus on how to go about this. When organizations ask us, "What do we do about this?" I always say the three R's: recognition, resilience, and response. Governments, other organizations, and civil society organizations are getting really good at helping groups recognize what disinformation is and how to be resilient against it. That's where this is going, and I think you'll see countries, and groups across countries, especially civil society, working from a common playbook right now. Whether it's global or not, let's talk about that in a few years.

Andy Kaye: (47:16)
It really sounds as if you've teed Cyabra up in many ways.

Vincent O'Brien: (47:21)
I didn't mean to, but it just comes so naturally, because they really are right there. That's an interesting point too. The private sector should probably be in the business of analyzing the data and getting it to the place where decisions can be made. Anything past that is something governments should do. But Cyabra is actually one of the organizations that have really done a good job at that, identifying the information and presenting it to you in a way that lets you best make a decision about it. Dan, that was just a bald-faced plug for your company. I hope that's okay with you.

Dan Brahmy: (48:05)
Absolutely. Thank you.

Andy Kaye: (48:07)
Dan, anything you want to add?

Dan Brahmy: (48:12)
After those flattering words, why would I?

Andy Kaye: (48:15)
How big is the market though, from a private company perspective [inaudible 00:48:19], we're investors in your company too. How big is this market, and how fast do you think it's growing, from the private sector side at least?

Dan Brahmy: (48:27)
Well, we've never had a doubt about the potential of... what we call the TAM, the total addressable market, simply because the further we move forward, the easier it becomes to see the nefarious impact and reach being created around very well-known brands: CPG companies, food and beverage companies, automotive companies, and even the financial sector. Imagine what would happen if a publicly traded stock were hurt by a very well-orchestrated snowball effect around a certain piece of propaganda or information that flies fast enough and deep enough, and you suddenly see the stock price dropping and the valuation going down by half a billion dollars.

Dan Brahmy: (49:36)
It has happened in the past, and it will continue to happen as long as we don't have a good filtering mechanism. The addressable market for a company like Cyabra, or any similar company, is, I'd like to say, almost endless. At Cyabra, we're not currently targeting small and medium businesses actively, but anything mid-market and above that has the potential to be positively or negatively impacted by disinformation, by those bad and fake actors in question, could be a potential customer of ours. Absolutely. We've already started working with them.

Andy Kaye: (50:22)
That's highlighting the issue, but do you see the company evolving to take that next step, which you've been so careful not to go into, and say, "We will develop an active tool to respond, maybe turn off the switch, something like the Chinese Great Firewall, or react accordingly"?

Dan Brahmy: (50:45)
You're asking me as Cyabra or-

Andy Kaye: (50:46)
Yeah, definitely.

Dan Brahmy: (50:50)
If my co-founder were on the call, he would be dying inside at that question. Now, let me answer. I do not see Cyabra getting into remediation. Let me be clear on that. Everyone on this talk understands that we are leaving an enormous amount of money on the table by saying no to the countermeasure and to the active part of remediation, but we sleep extremely well at night, because we started the company with a moral compass that helps us understand and draw that line in the sand pretty easily. That was the longer version of the explanation.

Andy Kaye: (51:39)
No, we really appreciate that obviously and I think the audience will too. Vincent, do you want to add anything to that?

Vincent O'Brien: (51:45)
Yeah. From a government perspective, Cyabra is on the right track there; it's part of the reason why we associate with them. On the government side, once a company crosses the tracks over to doing something like that, it becomes difficult for a government to actually work with it, because we're not sure who they took those active measures against in the past. But also, we have our own red lines, and so we appreciate companies that have those moral and ethical red lines. We will never, ever use disinformation as a tool against disinformers, because when you do that, you're really no better than them, and then your argument goes away completely.

Vincent O'Brien: (52:37)
Number two, the truth really is the best tool. Go to any US Embassy around the world at 9:00 AM, go to the consular section near the door where people line up to get visas, and count the number of people waiting in line to come to the United States. We don't have to lie to push what we have to offer. The truth is so much stronger than that. And so we'll never, ever step into that side of the business, because then we are no better than the people who are our adversaries, and we believe that we are better. Once you get into that field, once you get into that world, once you start targeting people, you become a target. It's Murphy's Law of Combat: if the enemy is within range of your weapons, then you're in range of theirs. It's not a good route to go down. We appreciate companies that don't do that kind of thing.

Andy Kaye: (53:39)
Thank you. As we're coming toward the end of our hour, I think your last statement was certainly very positive. But do you think we're on our way to a good outcome here?

Vincent O'Brien: (53:52)
I do. I think we're in a lot better place than we were. I don't want to talk about elections, and I certainly won't talk about current elections, but we'll go back to 2016. In the United States elections, it didn't come out to the public in a major way until after the election that states were attempting to alter the outcome. There's no evidence they succeeded, but they were attempting to. And so you had this entire electorate saying, "Wait, was my vote manipulated? Did I vote based on false information?" Well, since then, a thriving counter-disinformation community has come up, and Cyabra, which I came across in 2018, is a perfect example. So many other companies have formed since then.

Vincent O'Brien: (54:36)
This huge counter-disinformation community is comprised of governments that are cooperating and civil society organizations that have either grown up out of this or completely repivoted, so that they're now 50% involved in how to recognize disinformation and how to promote critical thinking in the information space. Academia has moved in this direction, along with the press and the private sector, and citizens around the world are refusing to tolerate these tactics and pushing back.

Vincent O'Brien: (55:07)
I think the evidence of that is that going into our latest election, and many other elections around the world that we monitor through the State Department and through other... others monitor, everyone who goes in to vote now knows that someone is attempting to manipulate them. So they go in and make their best choice. I think we're in a much better place than we were. Will we ever defeat it? Like I said, what's the chance of stopping people from lying? It's never been done before. I don't think we will, but we'll get a lot better at recognizing the lies, pushing back, and preserving the integrity of the information space. I think we're on the right track.

Andy Kaye: (55:46)
Thank you. Dan, your closing thoughts?

Dan Brahmy: (55:50)
Look, Andy, I was born and raised in France. I'm a romantic by nature. I'm optimistic. No, jokes aside, I'm optimistic, and I agree with Vincent. There's actually an expression in France about popping up like mushrooms after the rain, and that's what we're seeing with companies in our field. It's a good thing. I don't see competition as a bad thing, because it only increases the exposure and the education side that Vincent so elegantly explained earlier. The more companies we have trying to solve this huge problem for the private and public sectors, the better we are at understanding it, the more mature we become, and the closer we get to the last pieces of the puzzle to solve it through technology.

Dan Brahmy: (56:51)
I'm optimistic. I think we're on the right path. There's always that 1% in my heart, because I'm always thinking of what my co-founders, coming from this information warfare and cybersecurity background, taught me: "You're never sure. You're never sure; they always get better at what they do. You always see small improvements." But we are optimistic, and we can see that even from a Cyabra standpoint. We're moving along really well. We've got wonderful investors supporting us and customers giving us the right proof points. I am optimistic.

Andy Kaye: (57:33)
Thank you very much, gentlemen. It's been a pleasure. Thank you, Dan. Thank you, Vincent. I think we've discussed some very interesting, nuanced points today and throughout the last seven episodes. Thank you to the speakers, and thank you to our partners at SALT. If you want more information on OurCrowd, please go to www.ourcrowd.com. Thank you, John.