University of Southern California Election Cybersecurity Initiative [1]
Briefers: Adam Clayton Powell III, Executive Director, University of Southern California Election Cybersecurity Initiative; Judy Kang, Special Project Manager
Date: 2024-10
THE WASHINGTON FOREIGN PRESS CENTER, WASHINGTON, D.C.
MODERATOR: Hello, and welcome to the Washington Foreign Press Center’s briefing on the University of Southern California Election Cybersecurity Initiative. My name is Elise Crane, and I am the moderator for today’s briefing.
As a reminder, this briefing is on the record. We will post the transcript and video of this briefing later today on our website fpc.state.gov. For journalists joining us via Zoom, please take a moment now to rename yourself in the chat window with your name, media outlet, and country.
Our briefers today are Adam Clayton Powell, III, executive director of the USC Election Cybersecurity Initiative and director of the USC Center on Communication, Leadership, and Policy’s Washington programs, and Judy Kang, program manager with the USC Election Cybersecurity Initiative.
Finally, before opening remarks, a reminder that our briefers are independent subject-matter experts, and the views expressed by those not affiliated with the Department of State are their own and do not necessarily reflect those of the U.S. Government. Their participation in Foreign Press Center programming does not imply endorsement, approval, or recommendation of their views.
And with that, I’m going to turn it over to Mr. Clayton Powell.
MR POWELL: Thank you, and thank you for inviting us. We’re going to give a brief presentation and then take your questions. We’ve been doing this for quite a while, and as you can see, this election cycle we’re focusing on fake videos and artificial intelligence, so we’ll be talking a bit more about that.
This is the two of us. We began planning this years ago, and we’ve talked to people who are in government and also people who’ve run political campaigns – including presidential campaigns – here in the United States. In one meeting back in 2019, we met with Bob Shrum, who ran John Kerry’s presidential campaign, and with Mike Murphy, who ran Jeb Bush’s presidential campaign. And I said, “We’re like you guys. We’re going to run a 50-state campaign, every one of the 50 states, training people on election cybersecurity, but we don’t have a candidate.” And Bob Shrum said, “Adam, you’re wrong. Your candidate is democracy.” I thought, whoa, okay. That’s going to be our slogan, and Bob, we’ll give you credit.
So here’s – our biggest single program was in 2020. We went to every one of the 50 states. Now COVID hit in March of 2020, so we had to set up state Zoom networks after that. But here we are. The state of Ohio gave us their state capitol for our program. And so here we are under the dome in Ohio, courtesy of Secretary of State Frank LaRose and the governor.
We also have been invited to do programs in other countries. This is one we did in South Africa. The speaker pictured here is from the South African Electoral Commission, and he said that they vote on paper in South Africa. The problem is that they maintain their voter rolls, as almost every country does, on computers. And so people from Russia – not necessarily the Russian Government, maybe just criminals – would attack them, asking for money. Here’s another Africa conference – a two-day program we did last June on artificial intelligence and elections in Africa, with participants from across the continent.
And we’ve also done programs here.
MS KANG: So we participated in both the RNC and DNC as part of the USC Annenberg Roundtable, where we presented on election cybersecurity in the age of AI and hosted a panel discussion on fake videos and election cybersecurity in the age of AI. The panel featured notable experts, including a chief information security officer and professors from Northwestern University and New York University.
As a follow-up, we recently hosted an event on fake videos and election cybersecurity in the age of AI, led by our Election Cybersecurity Fellow Scott Bates. Earlier this year, we were involved in election cybersecurity events in Singapore, Taiwan, and South Africa. And our upcoming events include participation in Cyprus later this month and in Australia in early December.
MR POWELL: And these are all open. You’re free to join, either in-person or online. And if you want to be added to our list, just let us know later.
We’re now in the month of October, which is a key month in the United States election cycle – the final four weeks before the election. And as you can see in this clip from The New York Times just a few days ago, officials are warning that cyberattacks are going to speed up during this month. And the attackers are from different countries and with different interests.
So here you see an article saying that Iran is hacking into the Trump campaign, along with the accounts of Washington journalists. China, which is expected to be a big player in this election, is doing something different. They are uncertain which candidate they detest more, since neither is very friendly towards China. So they’re focusing on local political races around the country, and they’re conducting influence operations to try to really disrupt confidence in the election.
And Russia – here’s an article that was just published on Russian paid influencers stepping up efforts to influence the election. It’s very difficult – the United States has more than 8,000 election districts around the country, each of them with different times of voting, days of voting, and methods of voting. And so it’s very difficult to change an American election by hacking. But what you can do is sow doubt about the candidates and, even more so, about democracy. This is true in elections all over the world.
Here is one tool that has really come to the fore this year: disinformation focusing on local races, aimed at county clerks and secretaries of state around the country. Most of these 8,000 election districts are run by people with little or no computer science or technical background. So it’s really what one person at the Washington Post called an asymmetrical form of warfare. You’ve got experts attacking, and you have people without much training defending, which is what we do. We try to show the people without much training how they can defend.
Also new this year: TikTok. TikTok videos, which can now be created with artificial intelligence in large numbers and with realistic appearance, are being used by our adversaries – using artificial intelligence and ChatGPT to generate fake but realistic-looking news articles and large amounts of fake but realistic-looking information.
This may be the first major example of a fake video going public. And it wasn’t here in the United States; it was in Russia. Somebody did a fake video of Vladimir Putin, and it was supposed to be Putin declaring martial law and military mobilization. He never said that. But it was so good it fooled Russian television. They put it on the air. Well, of course, the Kremlin immediately contacted them and said this is a hack, stop. But you can see where this can be a major, major source of disruption and disinformation.
And what we always have are surprises. And I’ll end with this because this was the big surprise for us in 2020. We don’t know what it’s going to be this year. Two weeks before the election, the U.S. Government got wind of a possible attack from overseas on the Associated Press headquarters in New York. Why would they attack the Associated Press headquarters in New York? Well, the man in this picture is Gary Pruitt, the president of the Associated Press. He’s on our advisory board. And after the election, I asked him – Gary, what keeps you up at night? He said, I’ll tell you what kept me up at night on election night 2020: we had thousands of sophisticated attacks from overseas trying to freeze our computer system, trying to take down the AP central computer system.
Now, I’ve been in journalism, so I know what that means, but I asked Gary to speak to one of our workshops. I said, Gary, you have to explain to the election directors here what would happen if the Associated Press computer had been taken down on election night. He said, oh, there are no election returns. All of the election returns in the United States come from one computer on election night. Two weeks later, they’re certified by the states and local governments. But imagine, Gary said, sitting in Washington or Tokyo or London or Hong Kong, you turn on your television or your laptop, and you’re told, oh, there was a presidential election in the United States; we won’t have any returns for two weeks. He said, think of what that would do to the credibility of the United States and the credibility of democracy.
So that’s who we are. Here’s how you join us. And Judy, why don’t you tell them about some of the things that we do.
MS KANG: So we publish a weekly newsletter every Sunday that includes all the programs and partnered events, mostly hosted at our USC Capital Campus. The newsletter also features selected articles on election cybersecurity, disinformation, misinformation, and local election news. You can subscribe just by visiting our website. And if you have any further questions, please feel free to email [email protected].
MR POWELL: Thank you.
MODERATOR: Great, thank you so much for those remarks. And now I would like to open it up for questions. Just a reminder for journalists participating via Zoom: Please be sure your screenname includes your name and outlet. To ask a question, click the “raise hand” icon at the bottom of your screen. But we will start with questions from journalists in the room. Please raise your hand if you have a question, and state your name and media outlet before asking your question.
Alba.
QUESTION: I’m Alba from El Independiente, Spain. And I would like to ask how often this happens – what’s the size of this kind of disinformation, if we have a way to measure it? And also, what’s the way to confront it? I mean, does the government usually take these videos down? Is there a way to prosecute whoever does this? I imagine it’s difficult if they’re in foreign countries, but there might be people who are in the United States. How is the government facing this? Thank you.
MR POWELL: In terms of how often – the secretary of state of Minnesota, who is also the president of the National Association of Secretaries of State, has a wonderful image. He said this is like a relay race without a finish line. So instead of thinking about how often, it’s really that we can never stop, because there are cyberattacks coming all the time. And so we have to be constantly detecting and defending against attacks.
In terms of controlling mis- and disinformation traffic – that’s not our function. Our function is to work with election directors and campaign workers to help them defend against attacks. What we do tell them in terms of misinformation and disinformation is that if a campaign, for example, is hit with disinformation – deliberate distortion about the candidate, beyond the normal political back and forth – the new rule is: Call the FBI and find out who this is. Because if it’s coming from outside the United States, it could be legally problematic. And that’s something that no one thought of here six years ago; this is new.
MS KANG: And just to add to that answer, during our workshops we had three modules – election cybersecurity 101; misinformation and disinformation; and crisis response. So a lot of the slides in the presentation gave the audience free resources and told them where, or who, to reach out to within their states.
MODERATOR: Yes, Anil.
QUESTION: I’m Anil Takalkar from Pudhari, from India. We – I represent a newspaper (inaudible). Also I have three questions. Should I ask one by one, or —
MODERATOR: Let’s start with one, and maybe we’ll see if they’re related, and if not we can come back later.
QUESTION: The first thing you have already explained – the harm that AI and disinformation are doing to this election. But what are the other most significant cybersecurity challenges facing this current election? And how can election officials and campaign managers mitigate these risks?
MR POWELL: So you’re talking about previous elections – what AI has done and what cybersecurity has done is take what had been a problem before on paper, or maybe on radio, and has really brought it into an entirely new dimension. Some people compare it to the difference between conventional and nuclear war, that suddenly we’re facing a much – the military term would be a stronger threat environment. We – I’m sorry, I guess I’ve only touched on one of your questions, but —
QUESTION: (Inaudible) significant challenges, cybersecurity challenges, facing this election.
MR POWELL: Standards?
QUESTION: Cybersecurity challenges, apart from (inaudible) —
MR POWELL: We – let me back up. The U.S. Government, with other democracies, has developed some best practices. We have a local law here in the United States, the First Amendment, which protects speech, including the press, including freedom of expression.
And so there’s a tension here. We cannot stop Americans from speaking. What we can do is address people from outside the United States trying to influence an election. That’s a pretty clear line. In fact, President Obama was the one who issued the first executive order. He said that U.S. intelligence should deal with threats from outside the United States, and the FBI should deal with threats inside the United States and determine whether or not they cross that threshold of: is this protected speech in the United States or not. People debate that. People debate what that standard should be, and it’s different even among democracies. It’s different in the UK; it’s different in Australia; it’s different in France.
So there really isn’t – there are few hard and fast rules. There are some things – trying to get money to stop something, that’s clearly illegal. It’s called ransomware if it’s done online, but it’s illegal if you try to do it in person too. So this is an area of law which is evolving, and it will continue to evolve. I mean, artificial intelligence is changing how one statute in America, Section 230, is being interpreted. Section 230 has been critical to governing online social media; with artificial intelligence, they’re now saying maybe we have to interpret it a different way. I’m glad I’m not a lawyer; I don’t have to keep up with all this.
MODERATOR: Great. Thank you. Anil, let’s go – Marta, did you have a question?
QUESTION: (Inaudible) EFE news services. Donald Trump and the Republican Party, they are always saying that they will only respect the election results if the process seems trustworthy. Is it – are there any real risks, or is it just a way of saying that they only accept a win?
MR POWELL: What we, again, do – we focus on international adversaries, because our view is if you can defend against the experts who are attacking from Russia, China, Iran, DPRK/North Korea, you can defend against anything in the United States. So Trump – Trump’s speech – he’s a presidential candidate. It’s really protected, and we’re seeing now, we’re seeing today in the news, the special prosecutor releasing information about things which he says Trump did which are illegal. We don’t know whether or not they’re illegal yet. That’s going to be up to the courts.
So Trump – what Trump is saying can be partisan politics. Does it rise to a level where it’s illegal? That’s a problem. If somebody creates a fake image of Trump saying it, that is illegal. We have, I believe, at least 19 states, maybe more than 20 by now, that have made it a law that if you create a fake image of a candidate without the candidate’s consent, that is against the law, and in some states it’s a criminal offense. You can go to jail for doing that. But if the candidate says it himself – even and including a presidential candidate – it really has to rise to a level which is difficult to defend.
MODERATOR: All right, thank you. We actually have one pre-submitted question from Mohamed Maher of Al-Masry Al-Youm newspaper, on-TV outlet, Egypt. He asked: “How do you assess the coordination between the election cybersecurity initiative and federal agencies like the Department of Homeland Security in monitoring and responding to cyber threats during the election period? And how are you collaborating with social media companies and tech platforms to combat election-related disinformation and ensure that these platforms do not become tools for spreading false narratives or foreign interference?”
MR POWELL: Great question. We do work with Homeland Security. In 2020 the point person in Homeland Security, the director of election cybersecurity, was an expert named Matt Masterson. He’s now at Microsoft. He appeared in many of our workshops in person and by video. But beyond that, we want to know from Homeland Security: Can you share with us what we can be – what we should be sharing with election officials and with U.S. citizens? What do you see as the latest threats and how to defend against them?
So yes, in 2022, after Matt left and went to Microsoft, it was the secretary of state of Washington – oh —
MS KANG: Kim Wyman.
MR POWELL: I’m sorry?
MS KANG: Kim Wyman.
MR POWELL: Kim Wyman, thank you. And she was in that role, and she appeared in every one of our workshops in 2022. Again, what’s new, what are you hearing?
What we don’t do is work with agencies that work with classified material – the National Security Agency, the CIA, and others. And it’s just a decision we’ve made. I – whenever we go into a briefing or into a closed-door meeting, and they go into classified information, I get up and leave. I don’t want it in my laptop. I don’t want it in my phone. I don’t want it in my head. Because we want to – we will be dealing with people who do not have security clearances, so our information has to be unclassified. So we appreciate any guidance that we get from the security agencies, but we don’t work with them closely because we don’t want to be – we don’t want to know classified information.
MODERATOR: And thank you. We have one more pre-submitted question, then we’ll come back to the room. The question is: “What are you seeing in swing states specifically, and do you assess it will make a difference in election outcomes?”
MR POWELL: We have seen cases where software was altered. We don’t know by whom. But again, our election districts are so small that it makes such a tiny difference that, as far as we can tell, no tampering has come close to changing the result of any election. But if you go back to 2015, which is when our initiative began, this really wasn’t on people’s agenda. And you can talk to people in government agencies here in the United States who will say that if this happened in 2014, no one was really watching for it – we can’t find evidence that it did, but no one was watching for it.
Now we’re watching for it very carefully. And we’re sharing information with democracies all over the world – in Asia, Africa, Europe, now Australia, Latin America – because the same three or four countries that are most active in attacking elections are the adversaries faced all over the world, in different measures. In Australia they’re most worried about the Chinese; in Canada, too, they’re most worried about the Chinese. If you go to much of Africa, they’re most worried about Russia. And obviously, if you go to South Korea, they’re most worried about North Korea. So it’s the same adversaries in different mixes using some of the same techniques. I don’t know if that’s a complete answer, but that’s what we see.
MODERATOR: Great, thank you. Any last questions from the room? Anil.
QUESTION: What role does Google play in supporting this cybersecurity initiative? Google’s role, I mean.
MR POWELL: Our support.
QUESTION: Ah.
MR POWELL: We were fortunate. Back in 2015, the person who suggested that we do this, in a meeting here in Washington in our office, was the person who co-invented the internet, Vint Cerf. He said you should create a team focusing on cybersecurity to defend everything connected to the internet – secure everything connected to the internet. And everyone around the table said, yeah, that seems like a good idea. Everything connected to the internet? That’s a big job. He said, we’ll give you some money for it – he’s a vice president at Google now. We’ll give you some money; we have some planning money.
USC’s president augmented that from his budget. He said I will give you money to help organize. And so we organized five USC professional schools – communication, engineering, public policy, business, and law, plus political science in the college. And the president of the university actually paid the deans to organize that.
In 2016, the next year, the Democratic National Committee email incident occurred, and suddenly people were coming to us saying, you have the team, please help us. The National Governors Association came to us and said, please help us – we actually run elections in the United States and we don’t have any money to do this. And I said, well, we don’t have much money either. We got some money from a foundation, and we started out with six states here in 2018. And in 2019, Judy and I were walking down a hallway in Salt Lake City and ran into some people we’d never met before from Google, and the result of meeting them was that Google gave us millions of dollars to set up in all the states.
So we don’t take any government money. We have the luxury of saying that we have a nonpartisan, arm’s length gift from Google which has funded us. And my first question was, “Oh, so we have to show Google hardware and software?” “No, you have to be hardware and software agnostic. This is a USC project.” “Okay, we have to feature Google speakers, right?” “No, feature USC professors.” “Oh.” So they’ve been a great funder.
MODERATOR: Great. All right. Well, I believe that brings us to the end of our time. I’d like to turn it over to the briefers for any last comments.
MR POWELL: No, just thank you for inviting us. Again, if you want to get our weekly updates or get more information, it’s a very easy email address: truevote – all one word – [email protected]. And that’s how you reach us. So thank you for coming.
MS KANG: Thank you.
MODERATOR: Great. I’d like to give a special thanks today to Mr. Clayton Powell and Ms. Kang for sharing their time with us and to the journalists who joined us in person and also via Zoom. Thank you. This concludes today’s briefing.
[END]
---
[1] Url:
https://www.state.gov/briefings-foreign-press-centers/university-of-southern-california-election-cybersecurity-initiative