Speakers
Condoleezza Rice, Co-Chair, Aspen Strategy Group; 66th U.S. Secretary of State, U.S. Department of State
Teresa Hutson, Corporate Vice President, Technology for Fundamental Rights, Microsoft
Anna Makanju, Vice President, Global Affairs, OpenAI
Moderator: Alex Ward, National Security Reporter, POLITICO
Full Transcript
Alex Ward
We're going to talk about AI and democratic resilience, and I want to be very clear that we're talking about the system of government, democracy, not the Democratic Party. That resilience question is a separate one; we're not going to talk about that today. But it's good to see you guys. Since we only have a short amount of time, I will just get into it. Over these very short four days, by the way, thank you all for being here, we've been hearing a lot of optimism about AI, what it could do, how it's going to change the world. I have also been hearing the more skeptical side, maybe the more pessimistic side. For example, people pointing out that a recent ODNI, Office of the Director of National Intelligence, report mentioned Russia, China, and other countries creating AI-generated information to sway the American election, to the point of using Southern vernacular or Midwestern terminology to reach a certain audience. And so I'm wondering, at the core of this entire AI debate, how do you get the right information out to the people before the wrong information gets out? So, since we have such esteemed people here, I might leave that to you. How is the defense-versus-offense moment that we're in? How's that going?
Anna Makanju
Well, one thing that should give us a lot of optimism is the fact that we're having conversations like this. We've really had many years of learning and a much more resilient ecosystem, even beyond companies like OpenAI and Microsoft and others. But we have made it as difficult as possible to use our tools to do this kind of work, and what we have seen, because we also have quite robust investigative capacity at our company, is that we are able to leverage this technology to do really significant work, in particular against influence operations. We released a report on this in May. We can do things in minutes right now to investigate this kind of activity that used to take us days, because of the amount of information you can process with these tools, because of the velocity with which you can do it, because of the instantaneous translation we have. We were able to take down several influence operations. But also what we saw when we were taking them down is that, like everybody else, the bad guys don't really know how to use this stuff yet. And they are,
Alex Ward
But they will,
Anna Makanju
Yeah, well, but that’s why we need to stay ahead of them.
Teresa Hutson
Well, I mean, it turns out what they're good at is propaganda, and so they use whatever tools they have. And some of that is, you know, not that modern technology; it's video editing. I think what we have seen, similarly, is that the nation-state attacks haven't been that effective. Actually, the good news about election-related content is that the crowds help counter the narratives almost immediately. The trickier stuff is the one-to-one contacts. But I do think the responsibility lies with the tech companies to help build that resilience, through the technology and through public education. Both of our companies are doing that: building in tools to make sure that the safety architecture is good, building in content provenance technology so you can, you know, sign content as being authoritative. So if the Russians slap a BBC logo on something, you should be able to determine whether it's actually coming from the BBC itself. We've actually done a lot of work with the BBC around content authentication, using our technology to detect, you know, deepfakes on the systems and then take them down, giving people tools to report them when they see them on the systems, and we've undertaken a lot of public awareness campaigns. We went out ahead of the EU elections, the UK and France elections, and now we were at the RNC this week, and we'll be at the DNC in Chicago, training political parties and candidates on deepfakes: how easy it is to create them, how hard it is to spot them, and then how to protect yourself as a campaign. Because it does require that sort of layered approach, I think, to counteract what you've described.
Alex Ward
Secretary Rice, you're an expert on democracy, global democracy. There are a decent number of elections happening in the world this year, some of which have already happened, of course, not including just our own. I'm curious, having studied this issue for so long, have you ever seen a tougher environment for citizens to make the most informed decisions?
Condoleezza Rice
Oh, absolutely. One where terrorism is threatening you all the time. I mean, I think we have to not overstate this problem. It is a problem. And I do think the technologists, I live in the Silicon Valley, so I talk to these folks all the time, and I do think that technology companies are trying very hard to find ways, through the technology itself, to ameliorate some of the problems of the technology. And I applaud that work, and it needs to continue. The other point that I'll make, though, is that if trolls, foreign trolls, want to try to get in and stir up trouble, we've given them plenty of ammunition with which to do it, because technology is rarely the cause of a problem. It can exacerbate a problem. So when we have the kinds of deep divisions that we're seeing, it's somewhat easier to do. You know, nothing's new under the sun. Joseph Stalin had something called fifth columns within democratic societies, and he went after populations that had reason to be disaffected. And one of his strongest campaigns in the United States was in the African American community, because in the 1920s and 1930s there was a lot to be disaffected about. And so I think we also have to go back and look at why this is a problem. Yes, the technology is exacerbating it. Yes, it makes it faster. Yes, the tools are better. But the underlying problem is that people are believing things because of the disaffection. And it's not just among minority populations; it's among populations that feel, as we were talking about yesterday, that they've been left out, et cetera, et cetera. So we need to work on both ends of the problem: the technology, but also the disaffection that allows the technology to exacerbate it.
Alex Ward
I mean, at the core of this conversation is really, how do we, how do people, how do we stop citizens from getting punked, right? I understand that there could be the undemocratic megaphone of technology, as you were describing, but I'm curious, actually, we have a decent crowd here. How many of you, when you go online, feel that you can get the right information that you want to find? Raise your hands. All right. For those watching on video, that's probably 70%. Okay. Now, how many of you would trust it if a company or a government said, this is the right information?
Teresa Hutson
That's a confusing question. I think what you're getting at is the question of who we trust as a trustworthy speaker, right? This is a big challenge. And you know, to your point, it's 100% true that the problem is that we have fissures in society. We don't have trusted speakers, we don't have trusted institutions. So we have to rebuild trust in institutions. And so, you know, we've talked about this sort of technology layer as creating some trust. But, you know, part of this is on the technology company itself as an institution. It needs to build trust that it's building technology that helps humanity rather than harms humanity. But we're not alone in that. You know, it is the government that needs to rebuild trust in itself as an institution. It is the media that needs to rebuild trust in institutions. Virtually anybody here who works with an institution probably faces this problem. Higher ed, you know, how do we do that? And I think some of it is, you know, there's some stuff around being transparent about how you make your decisions: transparency, accountability, leadership. These are human problems. They're not technology problems.
Alex Ward
Well, this is why I admit that was an unscientific poll, first of all, and my question, excuse the nerves, was not perfectly worded. But I guess what I was trying to demonstrate with that, even with this audience here, is that if the goal is to rebuild trusted institutions, if that's what we need, if we're trying to sew up the fissures in society, right now you have, even in an elite audience, general concern that maybe they still can't trust these kinds of institutions to give the answers that you need to get ahead of that misinformation.
Condoleezza Rice
But isn't it true that Americans have always been skeptical? We're kind of natural skeptics, right? So maybe that's not a terrible thing. I don't know that I want to live in a world in which there is 100% trust in everything that's said to me. So let's not make that the goal.
Alex Ward
The unscientific poll, for those who care, right? Right? Sorry, PhD people.
Condoleezza Rice
So I think the more important question is, how do we make certain that the information that is getting there isn't somehow rigged? And that was your point about being punked, right? How do we make certain of that? Because I actually think during covid, let's take covid as an example, I don't mind that there were people who were skeptical, I was not personally, but skeptical about the efficacy of masks. I don't mind that there were people who were skeptical about the six-foot separation, because it turns out later on that the six-foot separation actually wasn't scientific. So I think sometimes we say skepticism is a problem, when skepticism actually is a bit of a defense against exactly the kind of thing that you're thinking of.
Alex Ward
The American Revolution was skeptical
Condoleezza Rice
The American Revolution, bunch of skeptics. That’s right,
Anna Makanju
Although, to give a more boring answer to part of the question I think you were trying to ask: I do think we need some new ecosystem tools here. Each of our companies can only do so much about the content that is being generated by the tools that we built. But we think we really need to invest in more of these ecosystem plays, like C2PA, if you've heard of this, which is basically a digital passport that travels with a piece of content. And the great thing about something like this is that it's not just that when you see a DALL·E image from OpenAI you know that that's where it's from; news organizations are using this, camera companies are using this. And so if you could have tools like this, where you can look at a piece of content from any source and have identifying information, then that is much more helpful than just each AI company identifying its own content. So I think we do need to build more of these, basically, ecosystem tools that generate trust in content across the board.
Alex Ward
So let me say: there was an official from a government here that I promised I would not reveal. And this official said that when, you know, they received basically disinformation attacks on their public, and then the government tried to correct it with the correct information, by their estimation only about 20 to 30% of the people who received the disinformation got access to the correct information. Sorry, does that make sense to people? I'm bad at math, but it's
Teresa Hutson
To put it differently: a lie goes, like, halfway around the world before the truth, right? So whatever that statement is, I think that's kind of what you're getting at.
Alex Ward
Precisely, right? So if that is, I guess, maybe one of the core issues here, and we're already at about the halfway point of our conversation, help me solve that problem with what you guys are doing, with what you know. How do we get the truth out faster than the lie? If we can.
Anna Makanju
Well, I mean, a lot of what we do is try to prevent people from being able to spread the lie to begin with. You know, this is why we don't let people generate images of real people using the image software. It's why we haven't released the voice or video software until we have absolute confidence in the safety mitigations. And to a certain extent, we need to invest in, as Teresa was talking about, all of the tools that help people identify truth to begin with and help them understand what impact, if any, this technology will have.
Teresa Hutson
I think part of the challenge is that it's not exactly, I mean, it's partly a technology question, because of the distribution question, it's distributed fast, but it's actually that foundational question: why does the lie travel faster? Why doesn't the truth resonate? Is it because the person who's speaking it isn't trusted, right? So I think it is partly a technology problem. But if lies and truth are distributed in the same way, why does one take hold? It's a good question for us as a society.
Condoleezza Rice
I mean, do you ever find, do you have any research that shows that the first thing that you hear is what sticks? That's what I would say to your government official, right? You were late in getting to the party with your information, and so you were now already swimming upstream, because people had come to believe a particular thing. I remember that we had this problem in government all the time, because you would wake up in Washington, DC, and the time zone almost anyplace else was already 8, 10, 12 hours ahead, and some lie had spread. So a very famous one was that the United States military had flushed a Koran down a toilet in one of our bases, right? Fundamentally untrue. But while we were checking out the story, because we didn't want to go and say that something wasn't true if it in fact was true, the story had now taken off across all of the Middle East, and no matter what we said about, no, no, no, it turns out not to have been true, it was too late. And so once the lie is out there, that's why I like your notion that you have to try to make sure that the lie doesn't get out there in the beginning. It's actually kind of human nature: the first thing you hear is what sticks. And I think that's one of the real problems with then trying to go back and so-called correct the record.
Alex Ward
So maybe let's broaden things out to a bit of a geopolitical question. In preparation for this, I've read some AI books, one of which was published around six, seven years ago. And the main thesis of that book was: China has won the AI race. It's over. It's done. There's no way back for America. And then you heard Eric Schmidt, on, I believe, the first panel of this forum, say that actually the gap is getting wider, with the US in the lead, and is perhaps somewhat unrecoverable. We don't have to go too much into the ramifications for national security, the economy, et cetera. But I'm curious, in terms of the democratic conversation, this feels like a good news story, right? That the United States is winning this race, in terms of,
Condoleezza Rice
Well, here's why it's not just a good news story, but an absolutely critical story. Do the thought experiment that Nazi Germany or the Soviet Union wins the nuclear race ahead of the United States. We're having a debate, a discussion, about the upsides of AI, but also the downsides if something goes wrong. We will have investigative reporters, we will have congressional hearings, you will have whistleblowers. I can guarantee you that won't happen in Beijing. And so whatever the downsides of a particular technology, I want that technological race to be won by a democracy, because that democracy will be open to the discussion of the problems with that technology. And so for me, this is: run fast, run hard, United States of America; run fast, run hard, Great Britain. Not much of the rest of the world matters right now. And as you run fast and run hard, don't do things in terms of government regulation and government decision making that slow our progress, because you're also not going to see that in China. And maybe Eric mentioned, you know, this is kind of an innovation leap, because before generative AI, you would have said that China was training on so much data, they had so much data, they didn't have privacy concerns. But of course, two things have happened. They can't do generative AI, because they've been denied the chips to do it. Eventually they will catch up. But the other thing that has happened is that they have tried to control from the top. Xi Jinping thought has to be in everything that is put into the models, and they will probably continue to retard their own growth. But we have to run faster and harder.
Alex Ward
No Winnie the Pooh in their models, I guess.
Anna Makanju
And can I add one very important thing? This is absolutely a correct list, but one other really important thing on this list is that China produces by far the largest number of AI researchers in the world, and half of them work here. And in fact, if you look at all of the top labs in the United States, you will find that a pretty large percentage of those people are on visas, are naturalized citizens, are not native-born Americans. That's one of the huge advantages we have, because one of the biggest bottlenecks in this industry is the talent, the leading researchers, and our ability to, you know, attract them has been a really important piece. That is something we really need to keep in mind; we should be accelerating it and doing everything we can to increase our ability to continue to attract this talent.
Teresa Hutson
And maybe just one thing to add. I do think this technology is something that gives us both hard and soft power. You know, the more people around the world using ChatGPT versus a Chinese model of that, the better. So, you know, let's embrace the innovation for both of those reasons as well. There will be national security uses of AI, but there's also just the power of American technology, which is itself quite a powerful signal about innovation and what the West can bring to the world.
Alex Ward
This brings us, actually, to a bit of a revelation, not a revelation, but the RNC this week. The platform is that, if the Republicans come back, they're not looking to regulate AI; they're looking to basically let companies lead in terms of, you know, how they develop the technology. Is that appealing to you, in terms of, one, just not having as much government regulation on this? And is it also necessary in order to develop it in the way you need to develop it?
Teresa Hutson
I mean, we've actually called for regulation. We do think we need some rules of the road, and we'd also prefer not to have this regulated at the state level. You know, 50 states regulating this will make business impossible; it's already challenging enough to be a global business. And then I think there are some things that we can all just agree should be regulated, things like: we shouldn't be able to use this technology to create what is known as non-consensual sexual imagery, or porn. We shouldn't be able to use this to create fake porn of teenage girls. I think we can agree on that. And there are some places where we just have baseline values, and we should be okay regulating the technology there. So I don't think we're in a position to say, like, don't regulate us. The technology is new, it is novel, and we do think there need to be some guardrails on it.
Anna Makanju
Look, I think everyone right now is struggling to figure out the right balance between allowing innovation and generating the kind of trust that is necessary for people to actually adopt and use these tools. And in addition to the fragmentation at the state level, if the United States is not in the lead on the regulatory infrastructure, we're also going to have multiple global models. And so this is one of the areas where it actually is important for the US to take the lead. And in fact, I would argue this has been one of the areas where there has been more bipartisan consensus than perhaps on any other topic. If you look at the number of bipartisan pieces of legislation that have been introduced in the last year on AI, I struggle to think of another area where there's been as much consensus. So I do think there is hope for figuring out this balance.
Condoleezza Rice
Yeah, I don’t think you have to worry. You will end up being regulated.
Anna Makanju
We already are
Condoleezza Rice
I'd be careful what I ask for, because regulators will regulate, even if they don't understand what they're regulating, right? So this is the problem, and I see really three issues here. One is that the understanding in Congress, and even less in the executive branch, of what the future might look like is really pretty imperfect, and in fact, you can't even tell them what the future will look like. So let's not get out ahead. One thing we're trying to do at Hoover, actually, because Senator Mark Warner was with us at one point and said he wanted it, is to go to the labs and say to Fei-Fei Li or the folks there: what's coming up? And how can we explain to our leaders what the implications are for various aspects of national security, the economy, and so forth? So let's try to educate before we start regulating. That would be my first point. The second point is the one that you made. We have a real problem now, because the innovative states are the United States and, to a certain extent, Great Britain, and Europe is the regulator without the innovation. And if we get a separation of the innovator from the regulator, we're really going to start to have problems. And so is there a way to increase the dialogue with our like-minded, democratic European colleagues, to get out of a world in which they're just going after regulation and we're going after innovation? I think those are two real danger points right now.
Anna Makanju
And, yeah, I mean, I absolutely agree with you. I always joke that I think we're probably the most heavily regulated per-capita company on Earth, because there are already lots of areas where AI is being regulated. Just because you are using AI to commit fraud doesn't make fraud legal. So I think people should understand that we're not sort of the wild, wild west where nothing applies. There are already a lot of areas where there are many regulators who have the power to intervene when they see harm coming from this technology. But at the same time, one of the things that we're really bullish on is something like the AI Safety Institute, because they can have an approach that is science-based, that is fact-based, that creates a common language. Because right now there's just a lot of conversation that is vibes-based. We don't even know what, you know, catastrophic risk means, or how to measure when it happens, and we really need that scientific basis on which to base regulation, to your point, so that we're not just coming out there with proposals that don't actually reflect the state of the technology itself.
Alex Ward
Don't worry, AI isn't the only thing that vibes govern these days. I think we have time for one question. If anyone's got it, right here.
Question 3
Thank you so much. Kelsey Frierson, one of the rising leaders. Until about three weeks ago, I was an AI fellow in the Senate, so a lot of this is kind of what I spent the last year thinking about. I'm curious, on the innovation side of things, what incentives, whether through direct R&D investment or just ways to direct private investment, should the government be considering to keep our innovative edge, especially in a very kind of austere political environment when it comes to increasing spending on anything right now? So just curious, do you have any thoughts on how to maintain that innovative edge?
Alex Ward
I did not plan to call on an AI fellow, to be clear, that was purely coincidental.
Teresa Hutson
You know, one incentive I do think actually goes to your talent point, which is our immigration system. We should make it easier for people to be here, to stay here, and to bring the best brains working on these issues to the United States, adding to the innovation system in the United States. That's my first point.
Condoleezza Rice
Yeah, I think you're asking an extremely important question, which is, what is the proper role of government at this point? And I think the proper role of government is not choosing winners and losers and trying to decide where the technology is going. So just a very quick vignette. A man named Bill Perry was the Under Secretary of Defense for Research and Engineering for Carter, and then Secretary of Defense for Clinton. He's one of the smartest people I know. Bill said that in 1978 he testified, and he was asked, what about this thing called the personal computer? He said, I don't see why anybody would ever want a personal computer, right? So I'm really thrilled he was wrong. Exactly. Government isn't very good at picking winners and losers. The second point is, we do have a model that has worked extremely well in the United States. It goes all the way back to a man named Vannevar Bush, who recognized that if the government was willing to support fundamental research in places like universities, you would get the kind of output that we've gotten in the Silicon Valley, where Stanford and Berkeley and other places have created the innovation, it's been commercialized, and the entire economy has benefited. The government, on the fundamental research side, is falling down. NSF numbers are down. By the way, NIH numbers are not, because, as a friend of mine said, baby boomers are determined to do something about that 100% death rate. So NIH we're going to fund. But everything else on the fundamental side is suffering. And then the final point I'll make is that we have, right now, an interesting dilemma. There is not a university, nor a combination of universities, that can do what Microsoft does. We don't have the compute power. So you have to ask whether or not this extremely transformative technology, at the leading edge, is only going to be in the commercial sector. Now, I'm as dyed-in-the-wool a capitalist as you will ever find, but I don't really think that we want to have just commercial incentives pushing where the technology is going. And so as a country, we're going to have to think about what alternatives there are to get that kind of compute power into non-commercial settings. It may be partnerships, it may be the national labs, which may have a role in this. But that proper-role-of-government question, I think, is the hardest one that we're facing, and we'd better get on it, because things are moving.
Anna Makanju
The one thing I would add, building on Secretary Rice's point, is that we are increasingly seeing that infrastructure is destiny. Essentially, it will be essential in order to leverage this technology and to stay ahead on it. And obviously this administration has done quite a bit to make sure that, you know, our adversaries have a harder time catching up, and to invest in ensuring that we have domestic semiconductor production. But there is a lot more to be done on this front, because the demand for this is just going to be incredible, across the hardware and the energy. And how do we solve that without exacerbating other dynamics?
Alex Ward
I don't know about you, but I could have had this conversation for a lot longer. This is so fascinating. First of all, thank you. You've made it to the second-to-last panel; there's one more to come. I think you know what it is, so stay in your seats and stay excited. And as you sit in your seats and stay excited, please, thanks.