Need a nudge toward a World Data Organization on the WTO model?
Ian Bremmer thinks so. Tech, Govern, Future.
Ian Bremmer promotes his new book; Gerard Baker promotes right thinking:
GERARD BAKER: I think the third threat that you identify is a particularly interesting one, and one that I think hasn't been as much explored, which is technology. We're all familiar with cyber security concerns. You particularly talk about that, but you also talk about artificial intelligence and quantum computing and how much of a threat they can be. Just explain, first of all, what that is, the threat it poses, and how well you think we are positioned to resist it?
IAN BREMMER: You're right, this crisis is the one that's getting the least attention right now. Everyone's been talking about COVID and climate and Russia for the last couple of years. Who's really talking about disruptive technologies, and what are they talking about when they discuss it? Because everyone's problem is different: is it that the tech companies are monopolies with too much power, or is it free speech and cancel culture, or is it political polarization and disinformation? All this stuff. What I'm focusing on broadly is that we are developing technologies which are incredibly dangerous to the development of our kids, to the persistence of democracy as a political system, and even to the existence of the species. We've had experience with one disruptive technology in the 20th century: nuclear weapons. We knew how dangerous it was, and we did everything we could to contain the proliferation, and we were largely very successful at that. But we were successful precisely because it was a very complicated technology that required both very dangerous and fairly rare natural elements to put together, and that meant that governments coming together had an easier time preventing proliferation. I am deeply concerned that when you look at cyber weapons, at AI algorithms and disinformation, at lethal autonomous drones, even at quantum computing, the difficulty of containing those disruptive technologies, of stopping them from proliferating, is orders of magnitude greater than with nuclear weapons, and maybe undoable. Yet these technologies are potentially, and perhaps even very likely, as dangerous if not more dangerous than nuclear proliferation. So how can we not, as governments and as other actors with power over these technologies, start to address them as an existential crisis?
GERARD BAKER: But how do we? This, again, requires a remarkable degree of international cooperation. How do we achieve that kind of cooperation, that sense of solidarity, which doesn't seem to be there at the moment? Instead, people view technology not as an existential threat to the globe. It's not a scenario of aliens coming from outer space and invading the earth. They see it as a great opportunity, a great advantage to secure their own benefit, to secure their own dominance in the world. How do you persuade them to back off that and somehow see it as a common threat?
IAN BREMMER: Well, they see it as both. After the Colonial Pipeline hack occurred, Biden met with Putin a year ago in Geneva and didn't even bring up Ukraine. He said, "Look, if you guys don't cut that out, this is going to lead to direct conflict between our two countries." The Russians actually did tell some of these cyber gangs to knock off the attacks on critical infrastructure as a consequence of that conversation. I think people do come to understand some of the nature of these threats in time, but you're absolutely right, Gerry, that mostly when we talk about tech, we talk about convenience, we talk about click-through, we talk about all the money that's made, and certainly the business models are not doing anything to try to protect us, to try to defend us.
GERARD BAKER: State actors and bad actors see it as a weaponized opportunity to secure advantage over somebody else. They're not incentivized to share what they know; they're incentivized to gain more and more of an advantage, whether it's in cyber or AI or all of these things, so they can actually inflict damage on their rivals. Isn't that right?
IAN BREMMER: Again, I think that they see it in both ways, but to the extent that there is no architecture, there are no guardrails, there is no nudging towards more responsible behavior, then you have a collective action problem. What these individual actors will do is say, "Well, if it's mostly offensive technology, I'm going to make sure I'm really good at it." It's what the Americans do, it's what the Chinese do, it's what the Russians do, it's what the Israelis do. So you say, "What do we do about it?" I think there are a couple of things that we do. One is you educate the public about it so that they get outraged and start pushing for changes in behavior … Well, it's not too late on AI either. That's one thing you do. A second thing you do is you recognize that we have none of the institutions and architecture that would actually allow us to identify which of these issues really would benefit from collaboration, would benefit from common rules of the road, because otherwise we're going to destroy ourselves. We did that with nuclear weapons. We haven't done that yet with AI and disruptive technologies. We have a World Trade Organization, and a lot of money has been made by the Americans and the multinational corporations on the back of it, and a global middle class has emerged as a consequence of it. We don't have a world data organization.
It seems fairly clear that we need one, because data actually is responsible for driving so much of the global economy, but also for creating so many of these dangerous practices that undermine national security and personal security. The only reason we don't have it is that when we were in institution-building mode, data wasn't a thing for the global economy or national security; now it is. It's pretty clear that the world will need to start creating that architecture. We're going to need to start with countries that trust each other, but it needs to be suitably open that chapters can be opened for anyone willing to behave in accordance with those values. Another thing I will say is that unlike the WTO, which was an organization of governments, of states, a world data organization, and all of the regulatory framework and all of the rules of the road that need to be created to deal with at least some of these issues, cannot just be about governments, because corporations are actually sovereign in their digital space. Big tech companies create the walled gardens, they build the algorithms, they determine the rules of the road. They know. Even in cybersecurity: Ukraine is getting attacked, and the Americans and NATO are defending them in terms of Javelin weapons and Stinger missiles, but in terms of cyber, it's Microsoft, it's Google. It's not the United States government. The multilateral framework you're going to need to create to start to respond to these problems cannot just be through governments; it will need to be multi-stakeholder from day one, and that's new. That's completely new in the way we think about global governance.
Source: Free Expression Podcast, 17 May 2022,
with ['1st Class' PPE graduate back when it meant something] Gerry Baker