
Nicolas Berggruen on the Dynamics of Power, Wisdom, and Ideas in the Age of AI

Published
June 1, 2021

  • What wisdom consists of
  • The role of ideas in society and civilization
  • The increasing concentration of power and wealth
  • The technological displacement of human labor
  • Democracy, universal basic income, and universal basic capital
  • Living an examined life

 

Check out Nicolas Berggruen's thoughts archive here

Transcript

Lucas Perry: Welcome to the Future of Life Institute Podcast. I'm Lucas Perry. Today's episode is with Nicolas Berggruen and explores the importance of ideas and wisdom in the modern age of technology. We explore the race between the power of our technology and the wisdom with which we manage it, what wisdom really consists of, why ideas are so important, the increasing concentration of power and wealth in the hands of the few, and how technology continues to displace human labor. We also get into democracy and the importance of living an examined life.

For those not familiar with Nicolas, Nicolas Berggruen is an investor and philanthropist. He is the founder and president of Berggruen Holdings, and is a co-founder and chairman of the Berggruen Institute. The Berggruen Institute is a non-profit, non-partisan think and action tank that works to develop foundational ideas about how to reshape political and social institutions in the face of great transformations. They work across cultures, disciplines and political boundaries, engaging great thinkers to develop and promote long-term answers to the biggest challenges of the 21st century. Nicolas is also the author, with Nathan Gardels, of Intelligent Governance for the 21st Century: A Middle Way between West and East, as well as Renovating Democracy: Governing in the Age of Globalization and Digital Capitalism. And so, without further ado, let's get into our conversation with Nicolas Berggruen.

So, again, thank you very much for doing this. And to set the stage a little for the interview and the conversation, I wanted to paint a picture of wisdom and technology and this side of ideas, which isn't always a focus when people look at worldwide issues. And I felt that this Carl Jung quote captured this perspective well. He says that "indeed it is becoming ever more obvious that it is not famine, not earthquakes, not microbes, not cancer, but man himself who is man's greatest danger to man for the simple reason that there is no adequate protection against psychic epidemics, which are infinitely more devastating than the worst of natural catastrophes." So, I think this brings us to a point of reflection where we can think about, for example, the race between the power of our technology and the wisdom with which we manage it. So, to start things off here, I'm curious if you have any perspective about this race between the power of our technology and the wisdom with which we manage it, and in particular what wisdom really means to you.

Nicolas Berggruen: So, I think it's an essential question. And it's becoming more essential every day because technology, which is arguably something that we've empowered and accelerated, is becoming increasingly powerful, to a point where we might be at the cusp of losing control. Technology, I think, has always been powerful. Even in very early days, if you had a weapon as technology, well, it helped us humans on one side to survive, likely by killing animals, but it also helped us fight. So, it can be used both ways. And I think that can be said of any technology.

What's interesting today is that technology is potentially, I think, in the zone of risk, or of opportunity, where the technology itself takes on a life of its own. Go back to the weapon example. If the weapon is not being manned somehow, well, the weapon is inert. But today, AIs are beginning to have lives of their own. Robots have lives of their own. And networks are living organisms. So, the real question is, when these pieces of technology begin to have their own lives, or are so powerful and so pervasive that we are living within the technology, well, that changes things considerably.

So, going back to the wisdom question, it's always a question. When technology is a weapon, what do you do with it? And technology's always a weapon, for the good or for the less good. So, you've got to have, in my mind at least, wisdom, intention, an idea of what you can do with technology, what might be the consequences. So, I don't think it's a new question; I think it's a question since the beginning of time for us as humans. And it will continue to be a question. It's just maybe more powerful today than it ever was. And it will continue to become more potent.

Lucas Perry: What would you say wisdom is?

Nicolas Berggruen: I think it's understanding and projection together. So, it's an understanding of maybe a question, an issue, and taking that issue into the real world and seeing what you do with that question or that issue. So, wisdom is maybe a combination of thinking and imagination with application, which is an interesting combination at least.

Lucas Perry: Is there an ethical or moral component to wisdom?

Nicolas Berggruen: In my mind, yes. Going back to the question, what is wisdom? Do plants or animals have wisdom? And why would we have wisdom, and they not? We need to develop wisdom because we have thoughts. We are self-aware. And we also act. And I think it's the interaction of our thinking and our actions that creates the need for wisdom. And in that sense, a code of conduct or ethical issues, moral issues become relevant. They're really societal, really cultural questions. So, they'll be very different depending on when and where you are. If you are sitting, as it seems we both are, in America today, in 2021; or if we were sitting 2,000 years ago somewhere else; or even today, if we're sitting in Shanghai or in Nairobi.

Lucas Perry: So, there's this part of understanding and there's this projection and this ethical component as well and the dynamics between our own thinking and action, which can all interdependently come together and express something like wisdom. What does this projection component mean for you?

Nicolas Berggruen: Well, again, to me, one can have ideas, can have feelings, a point of view, but then how do you deal with reality? How do you apply it to the real world? And what's interesting for us as humans is that we have an inner life. We have, in essence, a life with ourselves. And then we have the life that puts us in front of the world, makes us interact with the world. And are those two lives in tune? Are they not? And how far do we push them?

Some people in some cultures will say that your inner life, your thoughts, your imagination are yours. Keep them there. Other cultures and other ways of being as individuals will make us act those emotions, our imaginations, make them act out in the real world. And there's a big difference. In some thinking, action is everything. For some philosophers, you are what you do. For others, less so. And that's the same with cultures.

Lucas Perry: Do you see ideas as the bridge between the inner and the outer life?

Nicolas Berggruen: I think ideas are very powerful because they activate you. They move you as a person. But again, if you don't do anything with them, in terms of your life and your actions, they'll be limited. What do you do with those ideas? And there, the field is wide open. You can express them or you can try to implement them. But I do think that unless an idea is shared in any way that's imaginable, unless that idea is shared, it won't live. But the day it's shared, it can become very powerful. And I do believe that ideas have and will continue to shape us humans.

We live in a way that reflects a series of ideas. They may be cultural ideas. They may be religious ideas. They may be political ideas. But we all live in a world that's been created through ideas. And who created these ideas? Different thinkers, different practitioners throughout history. And you could say, "Well, these people are very creative and very smart and they've populated our world with their ideas." Or we could even say, "No, they're just vessels of whatever was the thinking of the time. And at that time, people were interested in specific ideas and specific people. And they gained traction."

So, I'm not trying to overemphasize the fact that a few people are smarter or greater than others and everything comes from them, but in reality, it does come from them. And the only question then is, were they the authors of all of this? Or did they reflect a time and a place? My feeling is it's probably a bit of both. But because we are humans, because we attribute things to individuals one way or another, ideas get attributed to people, to thinkers, to practitioners. And they're very powerful. And I think, undoubtedly, they still shape who we are and I think will continue to shape who we are at least for a while.

Lucas Perry: Yeah. So, there's a sense that we've inherited basically hundreds of thousands of years of ideas and thinking from our ancestors. And you can think of certain key persons, like philosophers or political theorists and so on, who have majorly contributed. And so, you're saying that they may have partially been a reflection of their own society, and that their thought may have been an expression of their own individuality and their own unique thinking.

So, just looking at the state of humanity right now and how effective we are at going after more and more powerful technology, how do you see our investment in wisdom and ideas relative to the status of and investment that we put into the power of our technology?

Nicolas Berggruen: To me, there's a disconnect today between, as you say, the effort that we put in developing the technologies versus the effort that's being invested in understanding what these technologies might do and thinking ahead, what happens when these things come to life? How will they affect us and others? So, we are rightly so impressed and fascinated by the technologies. And we are less focused on the effects of these technologies on ourselves, on the planet, on other species.

We're not unaware, and we're getting more and more aware. I just don't know if, as you say, we invest enough of our attention, of our resources there. And also, if we have the patience, and you could almost say the wisdom, to take your word, to take the time. So, I see a disconnect. And going back to the power of ideas, you could maybe ask the question in a different way: ideas or technology, which one is more influential? Which one is more powerful? I would say they come together. But technologies alone are limited, or at least historically have been limited. They needed to be manifested, or let's say empowered, by the owners or the creators of the technologies. They helped people. They helped the idea-makers or the ideas themselves enormously. So, technology has always been an ally to the ideas. But technology alone, without a vision, I don't think ever got that far. So, without ideas, technology is a little bit like an orphan.

And I would argue that the ideas are still more powerful than the technologies because if you think about how we think today, how we behave, we live in a world that was shaped by thinkers a few thousand years ago, no matter where we live. So, in the West, we are shaped by thinkers who lived in Greece 2,000 or 3,000 years ago. We are shaped by beliefs that come from religions that were created a few thousand years ago. In Asia, the cultures have been shaped by people who lived also 2,000 or 3,000 years ago. And the technology, which has changed enormously in every way, East or West, may have changed the way we live, but not that much. The way we behave with each other, the way we use the technologies, those still reflect thinking and cultures and ideas that were developed 2,000 or 3,000 years ago. So, I would almost argue the ideas are more powerful than anything. The technologies are an ally, but they themselves don't change the way we think, behave, feel, at least not yet.

It's possible that certain technologies will truly... and this is what's so interesting about living today, that I think some technologies will help us transform who we are as humans, potentially transform the nature of our species, maybe help us create a different species. That could be. But up to now, in my mind at least, the ideas shape how we live; the technologies help us live maybe more deeply or longer, but still in a way that reflects the same ideas that were created a few thousand years ago. So, the ideas are still the most important. So, going back to the question, do we need, in essence, a philosophy for technology? I would say yes. Technologies are becoming more and more powerful. They are powerful. And the way you use technology will reflect ideas and culture. So, you've got to get the culture and the ideas right because the technologies are getting more and more powerful.

Lucas Perry: So, to me, getting the culture and the ideas right sounds a lot like wisdom. I'm curious if you would agree with that. And I'm also curious what your view might be on why it is that the pace of the power of our technology seems to rapidly outpace the progress of the quality of our wisdom and culture and ideas. Because it seems like today we have a situation where we have ideas that are thousands of years old that are still being used in modern society. And some of those may be timeless, but perhaps some of them are also outdated.

Nicolas Berggruen: Ideas, like everything else, evolve. But my feeling is that they evolve actually quite slowly, much more slowly than possible. But I think we, as humans, are still analog, even though we live increasingly in a digital world. Our processes and our ability to evolve are analog and still fairly slow. So, the changes that happened over the last few millennia, which are substantial, even in the world of ideas, things like the Enlightenment and other important changes, happened in a way that was very significant. They changed entirely the way we behave, but it took a long time. And technology helps us, but it's so much part of our lives that there's a question at some point: are we attached to the technology? Meaning, are we driving the car, or is the car driving us? And we're at the cusp of this, potentially. And it's not necessarily a bad thing, but, again, do we have wisdom about it? And can we lose control of the genie, in some ways?

I would argue, for example, social media networks have become very powerful. And the creators of it, even if they control the networks, and they still do in theory, they really lost control of them. The networks really have a life of their own. Could you argue the same for other times in history? I think you could. I mean, if you think of Martin Luther and the Gutenberg Bible, you could say, "Well, that relates ideas and technologies." And in a way that was certainly less rapid than the internet, technology, in the case of the printed material, really helped spread an idea. So, again, I think that the two come together. And one helps the other. In the example I just gave, you had an idea; the technology helped.

Here, what's the idea behind, let's say, social networks? Well, giving everybody a voice, giving everybody connectivity. It's a great way to democratize access and a voice. Have we thought about the implications of that? Have we thought about a world where, in theory, everyone on earth has the same access and the same voice? Our political institutions, our cultures really are only now dealing with it. We didn't think about it ahead. So, we are catching up in some ways. The idea of giving every individual an equal voice, maybe that's a reflection of an old idea. That's not a new idea. The instrument, meaning let's say social media, is fairly new. So, you could say, "Well, it's just a reflection of an old idea."

Have we thought through what it means in terms of our political and cultural lives? Probably not enough. So, I would say half and half in this case. The idea of the individual is not new. The technology is new. The implications are something that we're still dealing with. You could also argue that this is the nature of anything new: an idea, in this case helped by technology, where we don't really know where the journey leads us. It was a different way of thinking. It became incredibly powerful. You didn't know at the beginning how powerful and where it would lead. It did change the world.

But it's not the technology that changed the world; it's the ideas. And here, the technology, let's say social networks, is really an enabler. The idea is still the individual. And the idea is democratizing access and voices, putting everybody on the same level playing field, but empowering a few voices, again, because of the network. So, it's this dance between technology and humans and the ideas. At the end, we have to know that technology is really just a tool, even though some of these tools are becoming potential agents themselves.

Lucas Perry: Yeah. The idea of the tools becoming agents themselves is a really interesting idea. Would you agree with the characterization then that technology without the right ideas is orphaned, and ideas without the appropriate technology is ineffectual?

Nicolas Berggruen: Yes, on both.

Lucas Perry: You mentioned that some of the technology is becoming an agent in and of itself. So, it seems to me then that the real risk there is that if that technology is used or developed without the wisdom of the appropriate ideas, that that unwise agentive technology amplifies that lack of wisdom because being an agent, its nature is to self-sustain and to propagate and to actualize change in the world of its own accord. So, it seems like the fact that the technology is becoming more agentive is like a calling for more wisdom and better ideas. Would you say that that's fair?

Nicolas Berggruen: Absolutely. So, technology in the form of agents is becoming more powerful. So, you would want wisdom, you would want governance, you would want guidance, thinking, intention behind those technologies, behind those agents. And the obvious ones that are coming are everything around AI. But you could say that some of the things we are living with are already agents, even though they may not have been intended as agents. I mentioned social networks.

Social networks, frankly, are living organisms. They are agents. And no matter if they're owned by a corporation and that corporation has a management, the networks today are almost like living creatures that exist for themselves or exist as themselves. Now, can they be unplugged? Absolutely. But very unlikely that they'll be unplugged. They may be modified. And even if one dies, they'll be replaced most likely. Again, what I'm saying is that they've become incredibly powerful. And they are like living organisms. So, governance does matter. We know very well from these agents that they are amazingly powerful.

We also know that we don't know that much about what the outcomes are, where the journey may lead and how to control them. There's a reason why in some countries, in some cultures, let's say China or Russia or Turkey, there's been a real effort from a standpoint of government to control these networks because they know how powerful they are. In the West, let's say in the US, these networks have operated very freely. And I think we've lived with the real ramifications as individuals. I don't know what the average engagement is for individuals, but it's enormous.

So, we live with social networks. They're part of us; we are part of them equally. And they've empowered political discourse and political leaders. I think that if these networks hadn't existed, certain people may not have gotten elected. Certainly, they wouldn't have gotten the voice that they got. And these are part of the unintended consequences. And it's changed the nature of how we live.

So, we see it already. And this is not AI, but in my mind it is. Social networks are living creatures.

Lucas Perry: So, following up on this idea of technology as agents and organisms: I've also heard corporations likened to organisms. They have a particular incentive structure and they live and die by their capacity to satisfy that incentive, which is the accumulation of capital and wealth.

I'm curious, in terms of AI, and you were at Beneficial AI 2017, what your view is of how ideas play a role in value alignment with regard to technology that is increasingly agentive, specifically artificial intelligence. There's a sense that we need to train and imbue AI systems with the appropriate values and ideas and objectives, yet at the same time we're dealing with something that is fundamentally alien, given the nature of machine learning and deep learning. So, I'm curious about your perspective on the relationship between ideas and AI.

Nicolas Berggruen: Well, you mentioned corporations. And corporations are very different from AIs, but at the same time, the way you mentioned corporations I think makes them very similar to AI. And they are a good example because they've been around for quite a while. Corporations, somebody from the outside would say, "Well, they have one objective: to accumulate capital, make money." But in reality, money is just fuel. It's just, if you want, the equivalent of energy or blood or water. That's all it is. Corporations are organisms. And their real objective, as individual agents, if you want, as sort of creatures, is to grow, expand, survive. And if you look at that, I would say you could look at AIs very similarly.

So, any artificial intelligent agent, ultimately any robot if you put it in an embodied form, if they're well-made, if you want, or if they're well-organized, if they're going to be truly powerful, a bit like a corporation is really very powerful and it's helped progress, it's helped... if you think capitalism has helped the world, in that sense, it's helped. Well, strong AIs will also have the ability over time to want to grow and live.

So, going back to corporations. They have to live within society and within a set of rules. And those change. And those adapt to culture. So, there's a culture. Some of the very old corporations, think of the East India Company, employed slaves. That wouldn't be possible today for the East India Company. Fossil fuels were really the allies of some of the biggest corporations that existed about 100 years ago, even 50 years ago. Probably not in the future. So, things change. And culture has an enormous influence. Will it have the same kind of influence over AI agents? Absolutely.

The question is, as you can see from criticism of corporations, some corporations are thought to have become too powerful, not under the control or governance of anyone, any country, supranational, if you want. I think the same thing could happen to AIs. The only difference is that I think AIs could become much more powerful because they will have the ability to access data. They'll have the ability to self-transform in a way that hasn't really been experienced yet. And we don't know how far... it'll go very far. And you could imagine agents being able to access all of the world's data in some ways.

And the question is, what is data? It's not just information the way we think of information, which is maybe sort of knowledge that we memorize, but it's really an understanding of the world. This is how we, as creatures and animals, are able to function: we understand the world. Well, AIs, if they really get there, will sort of understand the world. And the question then is, can they self-transform? And could they, and this is the interesting part, begin to think and develop instincts and maybe access dimensions and senses that we as humans have a tough time accessing? And I would speculate that, yes.

If you look at AlphaGo, which is the DeepMind Google AI that beat the best Go players, the way that it beat the best Go players, and this is a complicated game that's been around for a long time, is really by coming up with moves and strategies and a way of playing that the best human players over thousands of years didn't think of. So, a different intuition, a different thinking. Is it a new dimension? Is it having access to a new sense? No, but it's definitely a very creative, unexpected way of playing. To me, it's potentially a window into the future, where AIs and machines become in essence more creative and access areas of thinking, creativity and action that we humans don't see. And the question is, can it even go beyond?

I'm convinced that there are dimensions and senses that we, humans, don't access today. It's obvious. Animals don't access what we access. Plants don't access what animals do. So, there was change in evolution. And we are certainly missing dimensions and senses that exist. Will we ever access them? I don't know. Will AIs help us access them? Maybe. Will they access them on their own by somehow self-transforming? Potentially. Or are there agents that we can't even imagine, who we have no sense of, that are already there? So, I think all of this is a possibility. It's exciting, but it'll also transform who we are.

Lucas Perry: So, in order to get to a place where AI is that powerful and has senses and understanding that exist beyond what humans are capable of, how do you see the necessity of wisdom and ideas in the cultivation and practice of building beneficial AI systems? So, I mean, industry incentives and international racing towards more and more powerful AI systems could simply ruin the whole project because everyone's just amplifying power and taking shortcuts on wisdom or ideas with which to manage and develop the technology. So, how do you mitigate that dynamic, that tendency towards power?

Nicolas Berggruen: It's a very good question. And interestingly enough, I'm not sure that there are many real-world answers or that the real-world answers are being practiced, except in a way that's self-disciplined. What's interesting in the West is that government institutions are way, way behind technology. And we've seen it even in the last few years when you had hearings in Washington, D.C. around technology, how disconnected or maybe how naive and uninformed government is compared to the technologists. And the technologists have, frankly, an incentive and also an ethos of doing their work away from government. It gives them more freedom. Many of them believe in more freedom. And many of them believe that technology is freedom, almost blindly believing that any technology will help free us as humans. Therefore, technology is good, and that we'll be smart enough or wise enough or self-interested enough not to mishandle the technology.

So, I think there's a true disconnect between the technologies and the technologists that are being empowered and sort of the world around it, because the technologists, and I believe it, at least the ones I've met and I've met many, I think overall are well-intended. I also think they're naive. They think whatever they're doing is going to be better for humanity without really knowing how far the technology might go or in whose hands the technology might end up. I think that's what's happening in the West. And it's happening mostly in the US. I think other parts of the West are just less advanced technologically. When I say the US, I include some of the AI labs that exist in Europe that are owned by US actors.

On the other side of the world, you've got China that is also developing technology. And I think there is probably a deeper connection, that's my speculation, a deeper connection between government and the technologies. So, I think they're much more interested and probably more aware of what technology can do. And I think they, meaning the government, the government is going to be much more interested and focused on knowing about it and potentially using it. The questions are still the same. And that leads to the next question. If you think of beneficial AI, what is beneficial? In what way, and to who? And it becomes very tricky. Depending on cultures and religions and cultures that are derivatives of religions, you're going to have a totally different view of what is beneficial. And are we talking about beneficial just to us humans or beyond? Who is it beneficial for? And I don't think anybody has answered these questions.

And if you are one technologist in one lab or little group, you may have a certain ethos, culture, background. And you'll have your own sense of what is beneficial. And then there might be someone on the other side of the world who's developing equally powerful technology, who's going to have a totally different view of what's beneficial. Who's right? Who's wrong? I would argue they're both right. And they're both wrong. But they're both right to start with. So, should they both exist? And will they both exist? I think they'll both exist. I think it's unlikely that you're going to have one that's dominant right away. I think they will co-exist, potentially compete. And again, I think we're early days.

Lucas Perry: So, reflecting on Facebook as a kind of organism, do you think that Mark Zuckerberg has lost control of Facebook?

Nicolas Berggruen: Yes and no. No, in the sense that he's the boss of Facebook. But yes, in the sense that I doubt that he knew how far Facebook and other, I would say, engines of Facebook would reach. I don't think he or anyone knew.

And I also think that today, Facebook is a private company, but it's very much under scrutiny, not just from governments, but actually from its users. So, you could say that the users are just as powerful as Mark Zuckerberg, maybe more powerful. If tomorrow morning, Mark Zuckerberg turned Facebook or Instagram or WhatsApp off, what would happen? If they were tweaked or changed in a way that's meaningful, what would happen? It's happening all the time. I don't mean the switch-off, but the changes. But I think the changes are tested. And I think the users at the end have an enormous amount of influence.

But at the end of the day, the key is simply that the engine, or this kind of engine, has become so powerful that it's not in the hands of Mark Zuckerberg. And if he didn't exist, there would be another Facebook. So, again, the argument is that even though one attributes a lot of these technologies to individuals, a little bit like ideas are attributable to individuals and they become the face of an idea, and I think that's powerful, incredibly powerful even with religions, the ideas and the technologies are way beyond the founders. They reflect the capability of technology at the time when they were developed. There are a number of different social networks, not just one. And they reflect a culture, or a cultural shift in the case of ideas, of religions.

Lucas Perry: So, I have two questions for you. The first is, as we begin to approach artificial general intelligence and superintelligence, do you think that AI labs and leaders of them like Mark Zuckerberg may very well lose control of the systems and the kind of inertia that they have in the world, like the kind of inertia that Facebook has as a platform for its own continued existence? That's one question. And then the second is that about half the country is angry at Facebook because it deplatformed the president, among other people. And the other half is angry because it was able to manipulate enough people through fake news and information, and allowed Russian interference in advertising certain ideas.

And this makes me think of the Carl Jung quote from the beginning of the podcast about there not being adequate protection against psychic epidemics, kind of like there not being adequate protection against collectively bad ideas. So, I'm curious if you have any perspective, both on the leaders of AI labs losing control, and then on maybe some antivirus or malware protection for the human mind, if such a thing exists.

Nicolas Berggruen: So, let's start with the second question, which is the mind and mental health. Humans are self-aware. Very self-aware. And who knows what's next? Maybe another iteration, even more powerful. So, our mental health is incredibly important.

We live in our minds. We live physically, but we really live in our minds. So, how healthy is our mind? How healthy is our mental life? How happy or unhappy? How connected or not? I think these are essential questions in general. I think that in a world where technology and networks have become more and more powerful, that's even more important for the health of people, nations, countries and, in the end, the planet. So, addressing this seems more important than ever. I would argue that it's always been important. And it's always been an incredibly powerful factor, no matter what. Think of religious wars. Think of crusades. They are very powerful sorts of mental commitments. You could say diseases, in some cases, depending on who you are.

So, I would say the same afflictions that exist today, that make a whole people think something or dream something healthy, or maybe in some cases not so healthy, depressed, or the opposite, euphoric or delusional, these things have existed forever. The difference is that our weapons are becoming more powerful. This is what happened half a century ago or more with atomic power. So, our technology is becoming more powerful. The next one obviously is AI. And with it, I also think that our ability to deal with some of these is also greater. And I think that's where we have, on one side, a threat, but, on the other side, I think an opportunity. And you could say, "Well, we've always had this opportunity." And the opportunity is really, going back to your first question, around wisdom. It's really mental. We can spend time thinking these things through, spending time with ourselves. We can think through what makes sense. Let's say what's moral in a broad sense. So, you could say that's always existed.

The difference, in terms of mental health, is that we might have certain tools today that we can develop, that can help us be better. I'm not saying that it will happen and I'm not saying that there's going to be a pill for this, but I think we can be better and we are going to develop some ways to become better. And these are not just AI, but around bio-technology. And we'll be able to affect our mental states. And we will be able to do it through... and we do already through drugs, but there will also be implants. There may be editing. And we may one day become one with the AIs, at least mentally, that we develop. So, again, I think we have the potential of changing our mental state. And you could say for the better, but what is better? That goes back to the question of wisdom, the question of who we want to be, and what constitutes better.

And to your other question: do the developers or the owners of some of the AI tools control them? Will they continue to control them? I'm not sure. In theory, they control them, but you could argue, in some cases, "Well, they may have the technology, the IP. And in some cases, they have so much data that is needed for the AIs that there's a great synergy between the data and the technology." So, you need it almost in big places like a Facebook or Google or Tencent or an Alibaba. But you could very well say, "The technology's good enough. And the engineers are good enough. You can take it out and continue the project." And I would argue that at some point, if the agents are good enough, the agents themselves become something. They become creatures that, with the right help, will have a life of their own.

Lucas Perry: So, in terms of this collective mental health aspect, how do you view the project of living an examined life, or the project of self-transformation, and the importance of this approach to building a healthy civilization that is able to apply wisdom to the creation and use of technology? And when I say "examined life," I suppose I mean it a bit in the sense the Greeks used it.

Nicolas Berggruen: The advantage that humans have is that we can examine ourselves. We can look at ourselves. And we can change. And I think that one of the extraordinary things about our lives, and certainly I've witnessed that in my life, is that it's a journey. And I see it as a journey of becoming. And that means change. And if you are willing to self-examine and if you are willing to change, not only will life be more interesting and you will have a richer, fuller life, but you will also probably get to a place that's potentially better over time. For sure, different. And at times, better.

And you can do this as an individual. You can do that as many individuals. And as we have longer lives now, we have the opportunity to do it today more than ever. We also have not only longer lives, but longer lives where we can do things like what we are doing now, discussing these things. At the time of Socrates, few people could do it; now many people can. And I think that that trend will continue. So, the idea of self-transformation, of self-examination, I think, is very powerful. And it's an extraordinary gift.

My still favorite book today is a book by Hermann Hesse called Siddhartha, which, the way I look at it, one way to read it, is really a journey of self-transformation, of chapters of life, where each chapter is not necessarily an improvement, but each chapter is part of living and each chapter is what constitutes maybe a full life. And if you look at Siddhartha, Siddhartha had totally different lives all within one. And I think we have this gift given to us to be able to do a lot of it.

Lucas Perry: Do you think, in the 21st century, that given the rapid pace of change, of the power of our technology, that this kind of self-examination is more important than ever?

Nicolas Berggruen: I think it's always important. It's always been important as a human because it makes our lives richer on one side, but it also helps us deal with ourselves and our excitement, but also our fears. In the 21st century, I think it's more important than ever because we have more time, not only in length, but also in quantity, within a quantum of time. And also because our effect on each other is enormous. Our effect on the planet is enormous. By engaging in social networks, by doing a podcast, by doing almost anything, you influence so many others, and not just others as humans, but you influence almost everything around you.

Lucas Perry: So, in this project of living an examined life in the 21st century, who do you take most inspiration from? Or who are some of the wisest people throughout history who you look to as examples of living a really full human life?

Nicolas Berggruen: Right. So, what is, let's call it the best life, or the best example of an examined life? And I would argue that the best example that I know of, since I mentioned it, even though it's an incredibly imperfect one, is the life, at least the fictional life, in Hermann Hesse's book Siddhartha, where Siddhartha goes through different chapters, in essence different lives, during his life. And each one of them is exciting. Each one of them is a becoming, a discovery. And each one of them is very imperfect. And I think that reflects the life of someone who makes it a mission to understand and to find themselves or find the right life. And it tells you how difficult it is. It also tells you how rich it can be and how exciting it can be, and that there is no right answer.

On the other hand, there are people who may be lucky enough who never question themselves. And they may be the ones who live actually the best lives because, by not questioning themselves, they just live a life almost as if they were dealt a set of cards, and that's the beginning and the end. And they may be the luckiest of all, or the least lucky because they don't get to live all the potential of what a human life could be.

So, it's a long-winded answer to say I don't think there is an example. I don't think there is a model life. I think that life is discovery, in my mind, at least for me. It's living, meaning the experience of life, the experience of change, allowing change. And that means there will never be perfection. You also change. The world changes. And all of these become factors. So, you don't have a single answer. And I couldn't point to a person who is the best example.

That's why I go back to Siddhartha, because the whole point of the story of Siddhartha, at least the story by Hermann Hesse, is that he struggled going through different ways of living, different philosophies, different practices. All valid. All additive. And even the very end of the story, where in essence before his death he becomes one with the world, is actually not the answer. So, there is no answer.

Lucas Perry: Hopefully, we'll have some answers to some of these technological questions in the 21st century. So, when you look at our situation with artificial intelligence and nuclear weapons and synthetic biology and all of the really powerful emerging tech in the 21st century, what are some ideas that you feel are really, really important for this century?

Nicolas Berggruen: I think what we've discovered through millennia now, but also through what the world looks like today, is more and more the coexistence, hopefully peaceful coexistence, of very, very different cultures. We see that we have two very powerful factors. We have the individual and the community. And what is important, and it sounds almost too simple and too obvious, but I think very difficult, is to marry the power, the responsibilities of the individual with those of community. And I'm mentioning it on purpose because these are totally different philosophies, totally different cultures. I see that there's always been a tension between those two.

And the technologies you're talking about will empower individual agents even more. And the question is, will those agents become sort of singular agents, or will they become agents that care about others or in community with others? And the ones who have access to these agents or who control these agents or who develop these agents will have enormous influence, power. How will they act? And will they care about themselves? Will they care about the agents? Will they care about the community? And which community? So, more than ever, I think we have those questions. And in the past, I think philosophers and religious thinkers had a way of dealing with it, which was very constructive in the sense that they always took the ideas to a community or the idea of a community, living the principles of an idea one way or another. Well, what is it today? What is a community today? Because the whole world is connected. So, some of these technologies are technologies that will have an application way beyond a single culture and a single nation or a single system.

And we've seen, as an example, what happened with the COVID pandemic, which, in my mind, accelerated every trend and also made every sort of human behavior and cultural behavior more prevalent. And we can see that with the pandemic, technology answered pretty quickly. We have vaccines today. Capital markets also reacted quickly. Funded these technologies. Distributed them to some extent. But where things fell down was around culture and governance. And you can see that everybody really acted for themselves in very different ways, with very little cooperation. So, at a moment when you have a pandemic that affects everyone, did we have global cooperation? Did we have sharing of information, of technology? Did we have global practices? No. Because we didn't, we had a much worse health crisis, incredibly unevenly distributed. So, health, but also economic and mental health outcomes, very different depending on where you were.

So, going back to the question of the powerful technologies that are being developed, how are we going to deal with them? When you look at what happened recently, and the pandemic is obviously a negative event, but powerful event. You could say it's technology. It's a form of technology that spread very quickly, meaning the virus. Well, look at how we behaved globally. We didn't know how to behave. And we didn't behave.

Lucas Perry: It seems like these really powerful technologies will enable very few persons to accumulate a vast amount of wealth and power. How do you see solutions to this problem, so that wealth is shared more evenly with the rest of humanity, as technologies increasingly empower a few individuals to have control and power over that wealth and technology?

Nicolas Berggruen: In my mind, you're right. I think that the concentration of power, the concentration of wealth will only be helped by technology. With technology, with intellectual property, you create more power and more wealth, but you need less and less people and less and less capital. So, how do you govern it? And how do you make it fair?

Some of the thinking that I believe in, and that we've also been working on at the Institute, is the idea of sharing the wealth, but sharing the wealth from the beginning, not after the fact. So, our idea is simply, from an economic standpoint, as opposed to redistribution, which is redistributing the spoils through taxes, which is not only toxic but means you're transferring from the haves to the have-nots. So, you always have a divide.

Our thinking is, make sure that everybody has a piece of everything from the beginning. Meaning, let's say tomorrow Lucas starts a company, and that company is yours. Well, as opposed to it being all yours, maybe it's 80% yours, and 20% goes to a fund for everyone. And these days, you can attribute it to everyone as individuals through technology, through blockchain. You can give a piece of Lucas' company to everyone on paper. So, if you become very successful, everybody will benefit from your success. It won't make a difference to you, because whether you have 80% or 100%, you'll be successful one way or another, but your success will be the success of everyone else. So, everyone is at least in the boat with Lucas' success. And this kind of thinking I think is possible, and actually very healthy, because it would empower others, not just Lucas. And so, the idea is very much, as technology makes wealth even more uneven, make sure it's shared. And as opposed to it being shared through redistribution, make sure everybody is empowered from the beginning, meaning, has a chance to access it economically or otherwise.
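To make the arithmetic of this example concrete, here is a minimal sketch of the split being described, assuming the 20% figure and equal per-citizen claims exactly as in the conversation. The fund class, the company name, and the population figure are hypothetical illustrations, not the Berggruen Institute's actual proposal mechanics.

```python
# A toy sketch of the "universal basic capital" arithmetic described above.
# Assumptions (illustrative only): a flat 20% of each new company's equity
# goes to a shared fund, and every citizen holds an equal claim on the
# income from that fund.

UBC_SHARE = 0.20  # slice of new equity assigned to the shared fund

class UniversalFund:
    """Pools the citizens' slice of every new company and splits income equally."""

    def __init__(self, num_citizens: int):
        self.num_citizens = num_citizens
        self.holdings: dict[str, float] = {}  # company name -> fund's stake

    def register_company(self, name: str) -> float:
        """Record the fund's stake in a new company; return the founder's share."""
        self.holdings[name] = UBC_SHARE
        return 1.0 - UBC_SHARE

    def payout_per_citizen(self, name: str, dividend: float) -> float:
        """Split a company's dividend pro rata to the fund, then per citizen."""
        fund_income = dividend * self.holdings[name]
        return fund_income / self.num_citizens

# Lucas founds a company: he keeps 80%, and everyone shares the remaining 20%.
fund = UniversalFund(num_citizens=330_000_000)  # roughly the US population
founder_share = fund.register_company("LucasCo")                # 0.8
per_citizen = fund.payout_per_citizen("LucasCo", dividend=1e9)  # ~$0.61 each
print(founder_share, round(per_citizen, 2))
```

The sketch makes the point in the passage: the founder's position barely changes (80% instead of 100%), while every citizen holds a standing, compounding claim on every new company rather than receiving an after-the-fact transfer.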

The issue still remains governance. If whatever you're creating is the most powerful AI engine in the world, what happens to it? Besides the economic spoils, which can be shared the way I described it, what happens to the technology itself, the power of the technology itself? How does that get governed? And I think those are very early days. And nobody has a handle on it, because if it's yours, you, Lucas, will design it and design the constraints, or lack of constraints, of the engine. And I do think that has to be thought through. It can't just be negative; it also has to be positive. But they always come together. Nuclear power creates energy, which is beneficial, and empowers weapons, which are not. So, every technology has both sides. And the question is, you don't want to kill the technology out of fear. You also don't want to empower the technology to where it becomes a killer. So, we have to get ahead of thinking these things through.

I think a lot of people think about it, including the technologists, the people who develop it. But not enough people spend time on it and certainly not across disciplines and across cultures. So, technologists and policymakers and philosophers and humans in general need to think about this. And they should do it in the context of let's call it Silicon Valley, but also more old-fashioned Europe, but also India and China and Africa, so that it includes some of the thinking and some of the cultural values that are outside of where the technology is developed. That doesn't mean that the other ideas are the correct ones. And it shouldn't mean that the technology should be stopped. But it does mean that the technology should be questioned.

Lucas Perry: It seems like the crucial question is: how do we empower people in a century in which basically all of their power, particularly the power of their labor, is being transferred to technology?

Right, so you said that taxing corporations and then using that money to provide direct payments to a country's population might be toxic. And I have some sense of the way in which that is enfeebling, though I have heard you say in other interviews that you see UBI as a potential tool, but not as an ultimate solution. And so it seems like this, what you call universal basic capital, where, say, 20% of my company is collectively owned by the citizens of the United States, puts wealth into the pockets of the citizenry, rather than them being completely disconnected from the companies and not having any ownership in them.

I'm curious whether this really confers the kind of power that would be truly enabling for people, because the risk seems to be that people lose their ability to perform work and to have power at workplaces, and then they become dependent on something like UBI. And then the question is whether democracy is representative enough of their votes to give them sufficient power and say over their lives, over what happens, and over how technology is used.

Nicolas Berggruen: Well, I think there's a lot of pieces to this. I would say that the... let's start with the economic piece. I think UBC, meaning universal basic capital, is much more empowering and much more dignified than universal basic income, UBI. UBI is, in essence, a handout to even things out. But if you have capital, you have participation and you're part of the future. And very importantly, if you have a stake in all the economic agents that are growing, you really have a stake, not only in the future, but in the compounding, in terms of value, in terms of equity, of the future. You don't have that if you just get a handout in cash.

The reason why I think that one doesn't exclude the other is that you still need cash to live. So, the idea is that you could draw against your capital accounts for different needs: education, health, housing. You could start a business. But at times you just need cash. If you don't have universal basic capital, you may need universal basic income to get you through it. But if it's well done, I think universal basic capital does the job. That's on the economic side.

On the side of power and on the side of dignity, there will be a question, because I think technology will allow us, that's the good news, to work less and less in the traditional way. So, people are going to have more and more time for themselves. That's very good news. 100 years ago, people had to work many more hours in a shorter life. And I think that the trend has gone the other way. So, what happens to all the free time? A real question.

And in terms of power, well, we've seen it through centuries, but increasingly today, power, and not just money, but power, is more concentrated. So, the people who develop or who control, let's say, the technological engines that we've been talking about really have much more power. In democracies, that's really balanced by the vote of the people because even if 10 people have much more power than 100 million people, and they do, the 100 million people do get to vote and do get to change the rules. So, it's not like the 10 people drive the future. They are in a very special position to create the future, but in reality, they don't. So, the 100 million voters still could change the future, including for those 10 people.

What's interesting is the dynamics in the real world. You can see it with big tech companies. This is ironic. Big tech companies in the West are mainly in the US. And the bosses of the big tech companies, let's say Google or Facebook, Amazon, really haven't been disturbed. In China, interestingly enough, at Alibaba, Jack Ma was removed. And it looks like there's a big transition now at ByteDance, which is the owner of TikTok. So, you can see, interestingly enough, that in democracies, where big changes could be made because voters have the power, they don't make the changes. And in autocracies, where the voters have no power, the changes actually have been made. It's an ironic fact.

I'm not saying it's good. I am not saying that one is better than the other. But it's actually quite interesting that in the case of the US, Washington frankly has had no influence, voters have had pretty much no influence, while at the other side of the world, the opposite has happened. And people will argue, "Well, we don't want to live in China where the government can decide anything any day." But going back to your question, we live in an environment where even though all citizens have the voting power, it doesn't seem to translate to real power and to change. Voters, through the government or directly, actually seem to have very little power. They're being consulted in elections every so often. Elections are highly polarized, highly ideological. And are voters really being heard? Are they really participants? I would argue only in a very, well, manipulated way.

Lucas Perry: So, as we're coming to the end here, I'm curious if you could explain a future that you fear and also a future that you're hopeful for, given the current trajectory of the race between the power of our technology and the wisdom with which we manage it.

Nicolas Berggruen: Well, I think one implies the other. And this is also a philosophical point. I think a lot of people's thinking sort of isolates one or the other. I believe in everything being connected. It's a bit like if there is light, that means there is dark. And you'll say, "Well, I'm being a little loopy." It's a bit like...

Lucas Perry: It's duality.

Nicolas Berggruen: Yeah. And duality exists by definition. And I would say, in the opportunities that exist in front of us, what makes me optimistic... and my feeling is, if you live, you have no choice but to be an optimist. But what makes me optimistic is that we can, if we want, deal with things that are planetary and global issues. We have technologies that are going to hopefully make us healthier, potentially happier, and make our lives more interesting. That gives us the chance, but also the responsibility, to use it well. And that's where the dangers come in.

We have, for the first time, two totally different political and cultural powers and systems that need to coexist. Can we manage it?

Lucas Perry: China and the US.

Nicolas Berggruen: Yes. China and the US. Technologically, between AI, gene editing, quantum computing, we are developing technologies that are extraordinary. Will we use them for the common good, and wisely? We have a threat from climate, but we also have an opportunity. The opportunity is to address those issues, a little bit like what happened with the pandemic, to create sort of the vaccines for the planet, if you want, because we are forced to do it. But then the question is, do we distribute them correctly? Do we do the fair thing? Do we do it in a way that's intelligent and empowering? So, the two always come together. And I think we have the ability, if we're thoughtful and dedicated, to construct, let's say, a healthy future.

If you look at history, it's never been in a straight line. And it won't be. So, there'll be, I hate to say, terrible accidents and periods. But over time I think our lives have become richer, more interesting, hopefully better. And in that sense, I'm an optimist. The technologies are irresistible. So, we'll use them and develop them. So, let's just make sure that we do it in a way that focuses on what we can do with them. And then, what are the minimums, in terms of individuals, economically, in terms of power, voice and protection? And what are the minimums in terms of cooperation between countries and cultures, and in addressing planetary issues that are important and have become more front and center today?

Lucas Perry: All right. So, as we wrap up here, is there anything else that you'd like to share with the audience? Any final words to pass along? Anything you feel like might be left unsaid?

Nicolas Berggruen: I think your questions were very good. And I hope I answered some of them. I would say that the journey for us, humans, as a species, is only getting more exciting. And let's just make sure that we are... that it's a good journey, that we feel that we are at times the conductor and the passenger both, not so bad to be both, in a way that you could say, "Well, listen, we're very happy to be on this journey." And I think it very much does depend on us.

And going back to your very first question, it depends on some of our wisdom. And we do have to invest in wisdom, which means we have to invest in our thinking about these things because they are becoming more and more powerful, not just in the machines. We need to invest in the souls of the machines. And those souls are our own souls.

Lucas Perry: I really like that. I think that's an excellent place to end on.

Nicolas Berggruen: Well, thank you, Lucas. I appreciate it. Very good questions. And I look forward to listening.

Lucas Perry: Yeah. Thank you very much, Nicolas. It was a real pleasure, and I really appreciated this.
