
Peter Thiel: 2024, The Year Musk Gave Up on Mars

2025-07-16 15:14
Peter Thiel believes we are living in an era hemmed in by technological stagnation, institutional rigidity, and governance by fear, and that AI may be the only variable capable of breaking the deadlock.
Original Article Title: The original tech right power player on A.I., Mars and immortality.
Original Article Author: Ross Douthat (host)
Original Article Translation: Peggy, Johyyn, BlockBeats


Editor's Note:


In recent years, observers around the world have been puzzled by the on-again, off-again relationship between Silicon Valley's tech elite and Trump. People first reached for all sorts of explanations for why Musk would support Trump, and once the relationship soured, they began finding reasons to claim they had long predicted the rupture. This podcast offers a fresh lens for understanding Silicon Valley's choices in the Trump era: How did Peter Thiel so deeply influence Musk? And what is the deepest anxiety of Silicon Valley's tech elite, the thing they call "technological stagnation"?


The host of this conversation, Ross Douthat, is a New York Times columnist and a well-known conservative author who has published several books on religion, politics, and society. In this in-depth conversation about AI, politics, and faith, he describes Peter Thiel as one of the most influential right-wing intellectuals in the world over the past two decades. Thiel restates his long-held judgment: since the 1970s, technological progress has been slowing, social structures have grown more rigid, and humanity may be drifting into a soft form of authoritarianism under the banner of "order and security." He talks about why he initially supported Trump, why he holds only cautious expectations for AI, and why he is wary of the potential techno-authoritarianism of environmentalism and "global governance." He believes the Antichrist may not arise from a technological breakthrough at all, but from a compromise made in the name of "order and security."


This article greatly inspired the BlockBeats editorial team, and we hope it resonates with you as well. The following is the original content (lightly reorganized for readability):


Ross Douthat: (Opening) Is Silicon Valley still ambitious? What should we fear more: the end of the world, or stagnation? Why is one of the world's most successful investors worried about the Antichrist?


My guest today is the co-founder of PayPal and Palantir, as well as an early investor in Donald Trump and JD Vance's political careers. Peter Thiel is the original tech right-wing power player, known for funding various conservative and even countercultural ideas. But what we're here to talk about this time is his own views, because despite the slight drawback of being a billionaire, there is a good reason to believe that he is one of the most influential right-wing intellectuals of the past 20 years.


Peter Thiel, welcome to the "Interesting Times" podcast.


Peter Thiel: Thank you for having me.


Technological Stagnation: Why Do We No Longer Have a Sense of the Future?


Ross Douthat: I want to take you back to about thirteen or fourteen years ago. At that time, you wrote an article for the conservative magazine National Review titled "The End of the Future." The basic argument of the article was: on the surface, the modern world appears dynamic, fast-paced, and constantly changing, but in reality, it lacks the energy people think it has. We have long entered an era of technological stagnation. The digital age has indeed brought some breakthroughs, but it has not fundamentally transformed the world as people originally expected. Overall, we are actually stuck in place.


Peter Thiel: Yes.


Ross Douthat: It wasn't just you who raised such a point at the time, but coming from you, it carried particular weight—after all, you are an "insider" in Silicon Valley, personally involved and enriched by the Internet revolution. So I'm curious: by 2025, do you still think that assessment holds true?


Peter Thiel: Yes, I still broadly agree with the "technological stagnation" view. The argument was never absolute. We are not saying the whole world has ground to a halt, only that the pace of development has, to a meaningful degree, slowed. It hasn't dropped to zero, but from 1750 to 1970, for more than two hundred years, we lived in an era of continuous acceleration: ships got faster, railways got faster, cars got faster, planes got faster. That trend culminated in the era of supersonic jets and the Apollo moon landings. Since then, development on most fronts has slowed.


I have always seen the "digital world (the world of bits)" as an exception, which is why we have witnessed the development of computers, software, the internet, and mobile internet. Then, in the past ten to fifteen years, we have seen the emergence of cryptocurrency and the artificial intelligence revolution. I do think this is indeed a significant development. But the question is: is this really enough to free us from that pervasive sense of stagnation?


In the "Back to the Future" series of articles, you can start with an epistemological question: how do we determine whether we are in a state of stagnation or acceleration? Because a key feature of late modernity is the high degree of human specialization. For example, unless you spend half of your life studying string theory, can you not judge whether physics has progressed? What about quantum computing? And how should we assess progress in cancer research, biotechnology, and all these verticals? Furthermore, how does progress in cancer treatment weigh against a breakthrough in string theory? You must "weight" these different fields to evaluate overall technological progress.


In theory, this is an extremely difficult question to settle. And the very fact that it is so hard to answer is itself suspicious: more and more domains of knowledge are controlled by small circles of experts who answer only to themselves and validate one another. That built-in insularity is reason enough to be skeptical of claims of technological progress.


So, yes, I believe that overall, we still live in a rather stagnant era, but that does not mean that everything has come to a complete standstill.


Ross Douthat: You just mentioned "Back to the Future." Recently, we took the kids to watch the original first movie, the one starring Michael J. Fox.


Peter Thiel: The plot travels from 1955 to 1985, a span of 30 years. The timeline of "Back to the Future Part II" runs from 1985 to 2015, a "future" that is now already ten years in our past. The movie has flying cars; its vision of 2015 was vastly different from the reality of 1985.


Ross Douthat: "Back to the Future Part II" indeed shaped a character similar to Donald Trump—Biff Tannen somehow wielded power, so in a sense, it still had some foresight. But more striking is how different the physical environment of that future world looks compared to the reality of 1985.
So, the most convincing point I've heard about "technological stagnation" is: if you put someone in a time machine and have them travel from one era to another, they would find themselves in a completely different world.


For example, if you were to travel back to 1860 —


Peter Thiel: Or say, from 1890 to 1970: that's an 80-year span, roughly a lifetime. It's that kind of feeling.


Ross Douthat: But for my kids, even a child living in 2025, looking back at 1985, their feeling is: Well, cars seem a bit different, people didn't have smartphones yet, but overall, the world actually seems quite similar.


This is certainly not a statistical judgment, but —


Peter Thiel: This is a very intuitive common-sense judgment.


Ross Douthat: It's a kind of common-sense understanding. But what evidence would it take to convince you that we are in a takeoff period? Is it just economic growth? Or productivity improvement? What specific indicators of "stagnation" versus "vitality" do you usually pay attention to?


Peter Thiel: Of course, you can look at an economic indicator: How does your standard of living compare with your parents'? If you are a 30-year-old millennial, how are you doing compared with your baby-boomer parents when they were 30? What was their situation back then?


There are also some cognitive issues: How many real breakthroughs have we actually made? How should these achievements be quantified? Where is the return on investment in research reflected?


In science, and in academia more broadly, the returns are clearly diminishing. Perhaps that is why these institutions so often feel cold, even Malthusian: you have to keep investing more and more just to get the same output. At some point people give up, and the whole system collapses.


The Cost of Stagnation: When Society Loses Its Upward Trajectory


Ross Douthat: Let's stay on this topic. Why do we want growth and vitality at all? Because, as you have argued elsewhere, the Western world went through a cultural shift in the 1970s, which you see as the period when society began to slow down and drift into stagnation. People grew anxious about the costs of growth, above all the environmental costs.


The core of that view is: We are already wealthy enough. If we keep chasing more wealth, the Earth may not be able to bear it, and ecological deterioration of every kind will follow. So we should be content with what we have. What is wrong with that argument?


Peter Thiel: I think there is a deeper reason behind this stagnation. When faced with history, people usually ask three questions: first, what exactly happened? Second, how should we respond? But there is also a question sandwiched in between, often overlooked: why did it happen?


People started to run out of new ideas. I believe the system itself has also degenerated to some extent, becoming more risk-averse; some of these cultural shifts can be traced. But at the same time, I think people do have some very legitimate worries about the future: if we keep accelerating technological progress, does that also mean we are accelerating toward an environmental disaster, a nuclear disaster, or some other kind of endgame?


But if we cannot find a path "back to the future," I do feel that society will... I can't quite articulate it, but in any case it will start to unravel and stop functioning. I define the middle class as people who expect their children to live better than they do. Once that expectation collapses, we no longer have a middle-class society in any meaningful sense. There might still be some kind of system, a feudal society, say, in which everything stays stagnant and unchangeable; or there might be some path to a completely different social structure. But that is not the operating logic of the Western world, and certainly not the trajectory of America's first 200 years.


Ross Douthat: Do you think ordinary people will eventually not accept this stagnation? Will they choose to rebel and in the process bring down the surrounding order?


Peter Thiel: They may rebel. Or perhaps our institutions themselves are starting to fail—because the premise of these institutions is continuous growth.


Ross Douthat: Our fiscal budget, of course, is built on growth expectations.


Peter Thiel: Yes. I'm not sure, but look at Reagan and Obama. Reagan stood for "consumer capitalism," which is a contradiction in terms: a capitalist is supposed to get rich by saving, not by borrowing. And Obama stood for "low-tax socialism," which is just as contradictory as "consumer capitalism."


Of course, I prefer low-tax socialism to the high-tax kind, but I worry that it is not sustainable. One day either taxes will rise or the "socialist" policies will be abandoned. It is inherently very, very unstable. That is also why people lack optimism: nobody thinks we have arrived at a stable, "Greta-style" end state. Perhaps that model could work, but we clearly haven't gotten there yet.


Ross Douthat: Since her name is likely to come up again in this conversation: we are referring to Greta Thunberg, the Swedish environmental activist widely known for her climate protests. To you, I think, she symbolizes an anti-growth, essentially authoritarian, environmentally driven vision of the future.


Peter Thiel: Yes. But we are not there yet. Not yet. If society truly stagnates, it will be a completely different society—


Ross Douthat: If you really live in a "degrowth," Scandinavian-style small village.


Peter Thiel: I'm not sure if it will be like North Korea, but it will definitely be very oppressive.


Ross Douthat: One thing that has always struck me—when a society feels stagnant, stuck in some kind of "decadence," a word I often use to describe this situation, people often start to crave a crisis, longing for a turning point to arrive so they have the opportunity to fundamentally change the direction of society. Because I tend to think that in affluent societies, when people's wealth reaches a certain level, they become too comfortable, too risk-averse, and if there isn't a crisis, it's very hard to get out of "decadence" and move towards some new possibility.


So for me, the original example was this: after the 9/11 attacks, there was a general mindset among the conservative foreign-policy establishment that we had been mired in decadence and stagnation for too long, and that it was now time to wake up and launch a new "Crusade" to reshape the world. Obviously, that turned out very badly. But a similar mood...


Peter Thiel: But that was also when Bush told everyone to go shopping.


Ross Douthat: So that wasn't really anti-decadence at all, was it?


Peter Thiel: Broadly speaking, yes. There were certainly some neoconservative foreign-policy circles trying to escape decadence through a kind of live-action role-play (LARPing). But the mainstream was still the Bush administration telling everyone, "It's time to go shopping."


Ross Douthat: So, in order to escape decadence, how much risk should people be willing to take on? Indeed, there seems to be a danger: those who want to resist decadence often need to embrace massive uncertainty. They have to stand up and say: you see, we currently have a beautiful, stable, comfortable society, but guess what? Maybe we need a war, a crisis, or even a complete government restructuring. They must face danger, even willingly dive into it.


Peter Thiel: Well, I'm not sure I can give an exact answer, but my directional judgment is: we should take on more risk, do more things. The scope of our actions should far exceed what it is now.


I can go through these verticals one by one. Take biotech: diseases like dementia and Alzheimer's. Over the past four or five decades we have made almost no progress. People have been stuck on the beta-amyloid pathway, which clearly doesn't work. By now it looks more like an absurd game of vested interests, with industry insiders constantly reinforcing and endorsing one another. So yes, in this area we really do need to take on more and bigger risks.


Ross Douthat: To make the discussion more specific, I'd like to continue using this example for a bit longer. My question is: when we say "we need to take on more risk in anti-aging research," what does that specifically mean? Does it mean that the FDA needs to step aside and allow anyone with a new Alzheimer's therapy to sell directly on the open market? In the medical field, what does "taking risks" actually look like?


Peter Thiel: Yes, you definitely need to take on more risk. If you have a life-threatening disease, you may already be willing to try more radical methods. Researchers should also be able to take on more risk.


Culturally, what comes to mind for me is early modernity: a time when people believed we would eventually cure every disease, even achieve radical life extension. Immortality was one of the grand goals of early modernity, from Francis Bacon to Condorcet. Perhaps that notion is anti-Christian, perhaps it is a continuation of Christian thought; either way, it is competitive: if Christianity promises you bodily resurrection, then science, to "win," must promise the same.


I remember around 1999 or 2000, when we were still running PayPal, one of my co-founders, Luke Nosek, was particularly fascinated by Alcor and human cryopreservation, believing that people should freeze themselves. One time, we even took the whole company to a "cryonics party." Do you know what a "Tupperware party" is? It's the kind of event where plastic food containers are promoted at a gathering. At the "cryonics party," they weren't selling food containers...


Ross Douthat: Just freezing the head? Or is the entire body supposed to be frozen?


Peter Thiel: You can choose to freeze the entire body or just the head.


Ross Douthat: The "head-only cryopreservation" option is a bit cheaper.


Peter Thiel: What was particularly unsettling was that the printer was broken that day, so the cryonics contracts couldn't even be printed out.


Ross Douthat: Once again, a manifestation of technological stagnation, right?


Peter Thiel: But in hindsight, even that was a sign of decline. In 1999 the idea was never mainstream, but there was still a small group of Baby Boomers who firmly believed they could live forever, and that was probably the last generation to believe it. So even though I have always criticized the Boomers, perhaps, even in that marginal, narcissistic fantasy, we did lose something. At least some of them believed science would eventually cure all their diseases. Today, no millennial believes anything of the sort.


Political Bet: Why Support Trump and Populism?


Ross Douthat: However, I believe there are still some people who believe in another form of immortality. I think people's fascination with AI is somewhat related to the imagination of "transcending human limitations." I'll ask you specifically about that later. But for now, I want to talk about politics. When you initially raised the point of "stagnation," you were mainly focused on the technological and economic aspects, and what impressed me was: this point can actually be applied to a wide range of fields. When you wrote that article, you were also interested in "seasteading"—trying to establish a completely new political community outside the ossified Western system. But you later made a shift in the 2010s.


You were one of the very few prominent Silicon Valley figures to openly support Donald Trump early on, possibly the only one. You also backed a handful of carefully chosen Republican Senate candidates, one of whom is now the Vice President of the United States. As an observer, having read your discussions of "social decay," my reading is that you were engaged in a form of political venture capital: betting on potential disruptors to break the status quo, and judging that such risks were worth taking. Is that what you were thinking at the time?


Peter Thiel: Of course, there were many different strands at the time. On one hand there was that hope, the hope that we could steer the Titanic away from the iceberg, or whatever metaphor you prefer; the point was to genuinely change society's trajectory.


Ross Douthat: Through political reform.


Peter Thiel: Perhaps a narrower wish is that at least we can have a conversation around these issues. So when Trump says "Make America Great Again" — is this actually a positive, optimistic, ambitious agenda? Or is it an extremely pessimistic assessment of the status quo, believing that we are no longer a great country?


I did not have high hopes that Trump could truly bring about positive change. But at the time, I felt that, for the first time in 100 years, a Republican was not repeating those syrupy, hollow Bushian platitudes. This did not mean that society had progressed, but at least we could start a genuine conversation. Looking back now, this idea was a foolish fantasy.


In 2016, I actually had two thoughts — those kinds of thoughts that just hovered at the edge of my consciousness — but I couldn't connect them at the time. The first was: if Trump lost, no one would be mad at me for supporting him; the second was: I felt he had a 50% chance of winning. And I had an implicit...


Ross Douthat: Why wouldn't anyone be mad at you if he lost?


Peter Thiel: It was just a very strange thing, and it really didn't matter. But at the time I thought he had even odds, because the problems really were serious and the stagnation was maddening. The reality, though, was that people weren't ready to confront any of this. Perhaps we have now taken that step, and in 2025, a decade into the Trump era, we can finally have this conversation. Of course, Ross, you are not one of those left-wing zombies...


Ross Douthat: I've been labeled with all sorts of tags, Peter.


Peter Thiel: But as long as we can make some progress, I'm willing to accept it.


Ross Douthat: From your perspective, there seem to be two levels. The first level is: this society needs to be disrupted, needs to take risks; and Trump himself is a kind of disruption and risk. The second level is: Trump does dare to speak some truth about America's decline.


So, as an investor, a venture capitalist, what did you gain from Trump's first term?


Peter Thiel: Well...


Ross Douthat: Do you think, during Trump's first term, there were any anti-decadence or anti-stagnation measures taken? If any. Of course, it's also possible that you think there were none at all.


Peter Thiel: I think the whole process has taken longer than I expected, and moved more slowly. But at least we have reached a moment when many people are starting to recognize that there really are problems. That is not a conversation I could have sparked back in 2012 to 2014, when I debated these very issues with Eric Schmidt (former CEO of Google), Marc Andreessen (founder of a16z), and Jeff Bezos (founder of Amazon).


My position at the time was: "We are facing a stagnation problem," while their attitudes were all some version of "everything is developing smoothly." But I think that at least these three have made some corrections and adjustments to varying degrees later on. The whole Silicon Valley has also changed.


Ross Douthat: However, Silicon Valley has already done more than just "adjust."


Peter Thiel: Yes, regarding the stagnation problem.


Ross Douthat: Yes. But by 2024, a significant portion of Silicon Valley eventually supported Trump. The most famous of them, of course, is Elon Musk.


Peter Thiel: That's right. As I understand it, this is deeply connected to the stagnation issue. These things are always very complicated, and I don't want to speak for everyone, but take Mark Zuckerberg, or Facebook, Meta. I don't think he really has a strong ideological stance; he hasn't thought that deeply about these issues. The default stance is liberalism, and when liberalism doesn't work, what do you do? For years the answer was: do more of it. If something doesn't work, escalate. Add another dose, spend a few more billion dollars, go fully "woke," and the result is that everyone starts to hate you.


At some point, people will think: Well, maybe this approach just doesn't work at all.


Ross Douthat: So they turned.


Peter Thiel: But that doesn't mean they support Trump.


Ross Douthat: Right, it's not the same as supporting Trump. But in both public and private discussions there was a real sense that, in the context of 2024, unlike 2016 when you were essentially the lone supporter, Peter, "Trumpism" or "populism" could actually become a driver of technological innovation, economic vitality, and so on.


Peter Thiel: Your statement is really, really too optimistic.


Ross Douthat: I know you are pessimistic. But people—


Peter Thiel: When you express it in an optimistic way, you are actually saying that these people will eventually be disappointed, they are destined to fail, something like that.


Ross Douthat: I mean, people did express a lot of optimism at the time. That's my point. Even though Elon Musk has voiced doomsday anxieties about the budget deficit leading to the extinction of humanity, when he entered the government, he and the people around him were basically saying, "We are partnering with the Trump administration to achieve technological greatness." I think they were genuinely optimistic.


Your position leans more towards pessimism, or rather realism. What I want to ask is your own judgment—rather than theirs. Do you think that Trump 2.0-style populism can become a vehicle for technological vitality?


Peter Thiel: For now, it is still our best option. Is Harvard really going to cure Alzheimer's by standing pat and doing the same things that haven't worked for the past 50 years?


Ross Douthat: This sounds more like "It can't get any worse anyway, so let's try disruption." But the criticism of current populism is: Silicon Valley chooses to ally with populists, but these people don't care about science. They are not willing to invest in science. They just want to cut off funding for Harvard because they hate it. As a result, you end up not getting the kind of future-oriented investment that Silicon Valley originally hoped for. Doesn't this criticism hold?


Peter Thiel: To some extent, it does hold. But we need to go back to a more fundamental question: how well does our existing scientific establishment actually work? The people of the New Deal era, for all their problems, really did push science aggressively: you fund it, you pour in money, you scale it up. Today, if another "Einstein" wrote a letter to the White House, it would probably get lost in the mailroom. Something like the Manhattan Project is simply unimaginable now.


We still call some things a "moonshot," as when Biden talks about cancer research. But the moonshot of the 1960s actually went to the moon. Today, a "moonshot" usually means something completely fictional that is never going to happen. When you hear "this needs a moonshot," it really means: this is a lost cause. The point is not that we need another Apollo program; it's that we need to recognize the thing being promised is never going to materialize.


Ross Douthat: So it sounds like you're still in a position where, for you, perhaps unlike others in Silicon Valley, the value of populism lies in exposing illusions, lifting the veil. We're not yet at the stage where you expect the Trump administration to embark on a "Manhattan Project," a "moonshot" phase. It's more like—populism has helped us see that everything is fake.


Peter Thiel: You have to try to reconcile both. These two things are actually intertwined.


Take the deregulation of nuclear energy, for example. Someday we will start building new nuclear power plants again, or design better ones, perhaps even fusion reactors. So yes, part of it is about deregulation, about tearing things down. But then you have to begin the real construction; that is how things move forward. In a sense, you have to clear the site before you can build, maybe...


Ross Douthat: But are you personally no longer funding politicians?


Peter Thiel: I am conflicted on this issue. I think it is extremely important, but also highly toxic. So I am constantly wrestling with whether I should or should not...


Ross Douthat: What does "highly toxic" mean to you personally?


Peter Thiel: It is toxic for everyone involved. Because this is a zero-sum game, it's too crazy. To some extent...


Ross Douthat: Is it because everyone hates you, and they associate you with Trump? What has it done to you personally?


Peter Thiel: The toxicity lies in the fact that it happens in a zero-sum world. You feel the stakes are unusually high.


Ross Douthat: Have you gained enemies through this that you didn't have before?


Peter Thiel: It's harmful for everyone involved, in all sorts of ways. And it ties into the political proposition of "back to the future." You can't... this is one of the things I discussed with Elon in 2024. We talked a lot. I also floated a seasteading-style idea with him: I said that if Trump didn't win, I wanted to leave the U.S. Elon replied: there's nowhere to go, we have nowhere to go.


And then, as always, you think of the rebuttal afterwards. About two hours after dinner, when I got home, it finally hit me: wow, Elon, you no longer believe in going to Mars. 2024 was the year Elon stopped believing in Mars, not as a scientific and engineering project, but as a political project. Mars was originally a political proposition, a way to build an alternative model of society. But in 2024, Elon came to believe that even if you go to Mars, the socialist U.S. government and the woke AIs will come along with you.


We were the ones who brokered a meeting between Elon and DeepMind's CEO, Demis Hassabis.


Ross Douthat: Demis leads an artificial intelligence company.


Peter Thiel: Yes. The crux of that conversation was Demis telling Elon, "I am working on the most important project in the world; I am building a superhuman-level AI."


Elon's response was, "I too am working on the most important project in the world; I am making humanity a multiplanetary species." Then Demis said, "You know, my AI will go to Mars with you." Elon fell silent after hearing this. But in my telling of this history, this idea took several years to truly resonate with him. It wasn't until 2024 that he really grappled with this issue.


Ross Douthat: But that doesn't mean he no longer believes in Mars itself. It just means he thinks that in order to get to Mars, he first has to win the battle over budget deficits or woke culture.


Peter Thiel: Yes, but what does Mars signify?


Ross Douthat: What does Mars signify?


Peter Thiel: Is it merely a scientific project? Or is it, as in Heinlein, a libertarian paradise, the Moon as a proving ground for an ideal society?


Ross Douthat: A vision of a new society, inhabited by many... descendants of Elon Musk.


Peter Thiel: Hmm, I'm not sure the vision has been concretized to that extent. But once you do start to concretize it, you realize that Mars shouldn't just be a scientific project; it should be a political project. And once you make it concrete, you have to ask seriously: will the woke AI come with you? Will the socialist government come with you? Then you may not simply be "going to Mars"; you will have to think of something else.


The Light and Shadow of AI: Is it a Growth Engine or a Mediocrity Amplifier?


Ross Douthat: So, artificial intelligence, woke or otherwise, seems to be an exception, at least in this period of stagnation: it is one of the few areas that has made genuinely significant progress, progress that has surprised many people. It is also an exception in the political realm we were just discussing. In my view, the Trump administration has, to some extent, given AI investors what they wanted: stepping back and deregulating, while also promoting public-private partnerships. So AI sits both at the frontier of technological progress and at a point where government has re-engaged.


You are also an investor in the AI field. What do you think you are investing in?


Peter Thiel: This is a long story with many layers. We can start with the question: how important do I actually think AI is? My clumsy answer is: it is definitely not empty hype, but neither is it going to completely transform society. My current estimate is that it is roughly on the scale of the internet in the late 1990s. I'm not sure it is enough to truly end the long stagnation, but it is probably enough to produce some great companies.


The internet, for instance, added roughly a percentage point to GDP growth every year for ten or fifteen years, along with some gains in productivity. My starting estimate for AI is at about that level.


Right now it is our only growth engine. In a way, the fact that everything is riding on AI alone is somewhat unhealthy. I would rather we made progress on many fronts at once: the Mars mission, beating dementia, and so on. But if AI is all we have for now, I'll take it. Of course it carries risks; the technology is undoubtedly dangerous. But it also brings...


Ross Douthat: So are you skeptical of the "superintelligence cascade theory"? The rough idea of this theory is: once AI is successful enough, it will become extremely smart, thereby driving breakthroughs in the physical world for us. In other words, humans may not be able to cure dementia, may not figure out how to build the perfect rocket factory, but AI can.


Once it exceeds a certain threshold, what it brings is not only digital progress but also sixty-four other paths of progress. It sounds like you don't quite believe it, or you think this possibility is unlikely?


Peter Thiel: Yes. I'm not sure intelligence is really the gating factor here.


Ross Douthat: What does "key" mean? What does the "gating factor" you mentioned refer to?


Peter Thiel: There is a kind of Silicon Valley ideology at work here. Counterintuitively, it is probably more liberal than conservative. People in Silicon Valley are exceptionally obsessed with IQ; everything revolves around the "smart people": if only we had more smart people, they would create more great things.


But the counterargument, from an economic perspective, is that in reality the smarter people are, the more adrift they often become. They are not necessarily more productive, because they don't know how to apply their intelligence, and our society doesn't know what to do with them, so they struggle to fit into the mainstream. Which suggests that the real bottleneck may not be "how intelligent" at all, but something broken in our social structures themselves.


Ross Douthat: So is this ultimately a limitation of intelligence itself, or is it a problem inherent to the type of personality that superintelligence has brought about?


I'm actually not very supportive of the view that "as long as we elevate the level of intelligence, all problems can be easily solved." Previously, when I was doing a podcast with an artificial intelligence accelerationist, we discussed this issue. For example, we raise intelligence to a certain level, and Alzheimer's could be conquered; we enhance intelligence, and artificial intelligence can design a process to manufacture a billion robots overnight. My skepticism about intelligence lies in the fact that I believe it ultimately has limits.


Peter Thiel: Yes, this is indeed hard to prove. Such matters have always been difficult to falsify.


Ross Douthat: Until we truly possess superintelligence.


Peter Thiel: But I do agree with your intuition. Because in reality, we already have many highly intelligent people, yet many things are still stuck for other reasons. So maybe the problem is fundamentally unsolvable, which is the most pessimistic view. Perhaps dementia cannot be cured at its core, and perhaps death itself is an unsolvable problem.


Or it could be a problem of cultural structure. The issue may lie not with any particular intelligent individual, but with how they are received by society. Can we tolerate heterodox smart people? You probably need those nonconformist intelligent individuals to drive the crazy experiments. But if AI is merely conventionally "smart," and if we understand "woke" simply as excessive conformity or political correctness, then that kind of intelligence may not produce real breakthroughs.


Ross Douthat: So are you concerned about a possible future: Artificial intelligence itself becoming the representative of a "new kind of stagnation"? It's highly intelligent and creative, but everything is within a framework, like the algorithm of Netflix: constantly producing "okay" movies, content that people are willing to watch but not necessarily love; generating a lot of mediocre ideas; marginalizing human labor without new breakthroughs. It alters existing structures but, in a sense, deepens stagnation. Is this the scenario you are worried about?


Peter Thiel: This is entirely possible. It is indeed a risk. But ultimately, I will still come to this judgment: I still think we should try AI. In comparison, the alternative is complete stagnation.


Yes, it may bring about many unforeseen scenarios. For example, the combination of AI and military drones could be very dangerous, very dystopian, and unsettling, but it will ultimately bring about some kind of change. But if you don't have AI at all, then really nothing will happen.


In fact, there was a similar debate about the internet: whether it exacerbated conformism, whether it made society more "woke." The reality is that it did not deliver the explosion of ideas and diversity that the liberals fantasized about in 1999. But if you ask me, I still believe a world with the internet is better than one without it. And in my view AI is the same: it is better than nothing, and nothing is its only alternative.


You see, we are only discussing AI itself, which is actually silently admitting that, apart from it, we are almost at a standstill.


Ross Douthat: But clearly, there is also a group of people in the AI field: their expectations of AI are much grander, more transformative, and even more utopian than you have expressed. You just mentioned that modern society once promised radical life extension for humans—and now such promises are disappearing. But obviously, many deeply involved in AI actually see it as a path to "transhumanism," a tool to transcend the constraints of the flesh—either to create "posthuman species," or to achieve the fusion of the human brain with a machine.


Do you think these are all far-fetched fantasies? Or is this a "high-concept" for fundraising? In your view, are they hype? Delusions? Or are you genuinely concerned about this?


Peter Thiel: Well, yes.


Ross Douthat: I think you still hope for the continuation of humanity, right?


Peter Thiel: Um—


Ross Douthat: You're hesitating.


Peter Thiel: I don't know. I, I would...


Ross Douthat: That's a long hesitation!


Peter Thiel: There are too many implicit questions in this.


Ross Douthat: So let me ask you directly: Should humanity continue to exist?


Peter Thiel: Yes.


Ross Douthat: Okay.


Peter Thiel: But I also hope that we can fundamentally address these issues. So... I'm not very sure, yes—this is "transhumanism." Its ideal state is to transform our human natural body into an immortal body.


These views are often compared to gender transition. For example, in the context of transgender issues, some people are crossdressers, crossing gender expression through clothing; while others are transsexuals, possibly undergoing surgery to change their reproductive organs from male to female or vice versa. Of course, we can discuss what these surgeries actually change and how much they change.


But the criticism of these transitions is not that they are "strange" or "unnatural," but rather: Isn't this too insignificant? What we want is not just cross-dressing or organ replacement, we want a more profound transformation—a change in a person's inner self, mind, and even the entire body.


By the way, orthodox Christianity's criticism of this kind of transhumanism is not that it is too radical, but that it is far from sufficient. You have changed the body, but you have not yet changed the soul, changed the entire being.


Ross Douthat: Hold on. I basically agree with the view you hold—that religion should be a friend of scientific and technological progress. I also believe that any faith in divine intention must acknowledge one fact: we have indeed made progress and achieved many things that were almost unimaginable to our ancestors.


But the ultimate promise of Christianity still seems to be: through God's grace, one can attain a perfected body and a perfected soul. And the person who attempts to achieve this goal through a bunch of machines may ultimately become a character in a dystopian story.


Peter Thiel: Well, let's make this question a bit clearer.


Ross Douthat: Of course, you could also have a kind of heretical form of Christianity that would offer a different interpretation.


Peter Thiel: Yes, I don't know. But I noticed that the word "nature" is not mentioned once in the entire Old Testament. In this sense, the revelation tradition of Jewish Christianity, as I understand it, is actually a spiritual tradition that transcends nature. It talks about transcendence, talks about overcoming. And the closest expression to "nature" is probably: man is fallen. From a Christian perspective, this "fall" can almost be seen as a natural state: man is flawed, incomplete. This statement is true. But in a sense, faith means that you have to rely on God's power to transcend it, to overcome it.


Ross Douthat: Exactly. But those who are currently trying to build the "God of the Machine," of course, not including you, do not see themselves as collaborating with Yahweh, the Lord of Hosts.


Peter Thiel: Of course, of course. But...


Ross Douthat: They believe they are using their own power to construct immortality, right?


Peter Thiel: In fact, we are discussing many levels of this. Returning to my earlier point, my criticism is actually that they are not ambitious enough. From a Christian perspective, they are far from radical enough.


Ross Douthat: What they lack is moral and spiritual ambition.


Peter Thiel: So, are they ambitious enough even on the physical level? Are they truly transhumanists? Honestly, the whole cryonics idea now feels like a retro relic of 1999, and hardly anyone actually does it anymore. So they are not transhumanists in the bodily dimension. Maybe they have shifted to the "mind upload" path instead? But frankly, I would rather have my own body than merely a computer simulation of myself.


Ross Douthat: I agree with that as well.


Peter Thiel: So, in my view, even uploading is a step backward from cryonics. But it is all part of the picture, and that makes it hard to assess. I'm not saying they are all making it up, or that it's all fake, but I also...


Ross Douthat: Do you feel that some of it is fake?


Peter Thiel: I wouldn't say it's fake because "fake" implies they are lying. But what I mean is, these are not their real priorities.


Ross Douthat: I see.


Peter Thiel: So, we see a lot of abundance language, an optimistic narrative. A few weeks ago I talked with Elon about this; he said that in ten years the United States will have a billion humanoid robots. I said: if that's true, then you no longer need to worry about the budget deficit, because growth will be so rapid that the economy will solve the problem by itself. But he is still worried about the deficit. That certainly doesn't mean he doesn't believe in the billion robots; it may mean he hasn't thought through the implications of that expectation, or that he doesn't think it will fundamentally restructure the economy, or simply that the expectation carries a great deal of uncertainty. So, in a way, these visions of the future have not really been thought all the way through.


If there's one criticism I have of Silicon Valley, it's that: it always avoids the question of the "meaning of technology." Discussions often get stuck at the micro level, such as "What is AI's IQ-ELO score?" or "How should we define AGI?" We get bogged down in these never-ending technical details, but we overlook those more meaningful middle-level questions, which are actually the ones that are truly important: such as what does it mean for the fiscal deficit? What does it mean for the economic structure? What does it mean for geopolitics?


One recent issue we discussed is: Will artificial intelligence change the course of some human wars? If we are entering an accelerating AI revolution, then at the military level, will other countries fall behind? From an optimistic perspective, this gap might create a deterrent effect: other countries will know they have lost. But from a pessimistic perspective, it might instead prompt them to act more quickly—because they realize that they either act now or miss the opportunity forever. If they don't fight now, they may never have a chance again.


Either way, this will be an extremely significant development. But the problem is that we haven't thought these questions through. We haven't seriously discussed what AI means for geopolitics, nor its impact on the macroeconomy. Those are the questions I hope we can collectively dig into.


Dystopian Imagination: Who is the Real "Antichrist"?


Ross Douthat: You have also been focusing on another, more macroscopic issue, so let's continue along the religious line of thought. Lately you have frequently invoked the concept of the Antichrist, a term from the Christian tradition, bound up with eschatology. What does the "Antichrist" mean to you? How do you understand the concept?


Peter Thiel: How much time do we have left?


Ross Douthat: We have enough time to talk about the Antichrist.


Peter Thiel: Well, actually, I can talk about this topic for a long time. I think when we talk about existential risks or challenges facing humanity, we always face a framing problem. These risks are often wrapped in a sci-fi syntax of "technological runaway, heading towards a dystopia": such as nuclear war, environmental disaster, or more specifically, climate change, although we can list many other risks, such as bioweapons, various sci-fi doomsday scenarios. Of course, artificial intelligence does bring certain types of risks.


But I have always felt that if we really want a framework for discussing existential risk, we should also discuss the possibility of another kind of bad singularity, which I call the one-world totalitarian state. Because for all of the risks above, the default political solution tends to be some form of "global governance." How do we control nuclear weapons, for example? Imagine a truly empowered United Nations that controls all nuclear weapons and coordinates governance through a single global political order. Likewise, when we talk about how to deal with artificial intelligence, we arrive at similar answers: we need "global compute governance," a world government that regulates every computer and logs every keystroke, to make sure no one can write a dangerous AI program. I keep wondering whether that path is really just jumping out of the frying pan and into the fire.


The atheistic version of the statement is "one world or none." That is the title of a short film produced by the Federation of American Scientists in the 1940s. The film opens with a nuclear apocalypse destroying the world and concludes that, to avoid destruction, we must establish a world government: one world or none. The Christian version poses a parallel question: Antichrist or Armageddon? Either you accept a single world order ruled by the Antichrist, or we sleepwalk into Armageddon, the climactic global war of the biblical end times. In the end, "one world or none" and "Antichrist or Armageddon" are different expressions of the same question.


I have many thoughts on this, but there is one key flaw that always bothers me: narratives about the Antichrist tend to skip over a central question, namely how exactly the Antichrist takes over the world. The books say he persuades everyone with his demonic speeches and hypnotic language. That sounds like a kind of demonic deus ex machina.


Ross Douthat: Yes, that's completely implausible.


Peter Thiel: Indeed, it is a glaring plot hole. But I think we have actually found an explanation for it in our own time: the Antichrist takes over the world not by charming people but by constantly manufacturing apocalyptic anxiety. He keeps insisting that Armageddon is coming, that existential risk is imminent, and uses that as the reason to regulate everything. He is not the "evil tech genius" of the 17th- or 18th-century imagination, not a mad scientist in a lab inventing a doomsday machine. In reality, people are far more cautious and fearful than that.


In our era, what truly resonates politically is not technological liberation but technological fear. People say: we need to stop, we can't let technology run amok any longer. In the older imagination, a techno-authoritarian figure, a Dr. Strangelove or Edward Teller type, might take over the world; today, that role is far more likely to be played by someone like Greta Thunberg.


Ross Douthat: I would like to propose a middle way possibility. The Antichrist we used to fear was a technosorcerer with superpowers; now, we are more likely to fear the person who promises to "control technology and ensure security." In your view, this is leading toward a kind of universal stagnation, right?


Peter Thiel: Yes, that description is closer to the path of events as I see it unfolding.


Ross Douthat: But do you think people are still afraid of a "17th-century-style" Antichrist? Are we still afraid of a Dr. Strangelove-like figure?


Peter Thiel: Yes, deep down, people still fear that old-fashioned Antichrist image.


Ross Douthat: But are you saying that a true Antichrist would leverage that fear and say, "You must follow me to escape the surveillance state, the Terminator, and nuclear doomsday"?


Peter Thiel: Exactly.


Ross Douthat: My view is that, given the current state of the world, for people to believe in this fear as real, there needs to be some form of technological breakthrough that makes that doomsday threat tangible. In other words, I can understand if the world truly believes AI is about to destroy humanity, it may indeed turn to a leader promising "peace and regulation." But to reach that state, there must be some kind of "technological leap" first, meaning the apocalyptic scenario of accelerationism must be partially realized.


To usher in the "peace and security" Antichrist you're describing, the prerequisite is actually a major technological breakthrough. A fundamental flaw of 20th-century totalitarianism, for example, was the knowledge problem: it could not know what was happening across the world. So you would need AI or some other new technology to solve that information bottleneck and supply the data a totalitarian regime requires. In other words, even the worst-case scenario you envision depends on a genuine technological leap, which is then harnessed to maintain a stagnant totalitarian order. We can't jump straight from our current technological state to that.


Peter Thiel: Well, that path does exist —


Ross Douthat: And the reality right now is that Greta Thunberg is protesting Israel from a ship in the Mediterranean. I just don't see "security through AI," "tranquility through technology," or "safety through climate control" becoming a powerful global political rallying cry today. Without real technological acceleration, or a visceral fear of disaster, that rhetoric is unlikely to gain much traction.


Peter Thiel: These are indeed very difficult issues to assess, but I do think environmentalism is a very powerful force. I'm not sure if it's powerful enough to bring about a "unified global" totalitarian state, but it certainly has power.


Ross Douthat: In the current situation, I don't think it's quite there yet.


Peter Thiel: I would say that in Europe, environmentalism may be the only thing people still believe in. They believe in the green future more than they worry about sharia law or authoritarian rule. A "future," after all, is by definition something that looks different from the present. In Europe, only three pictures of the future remain on offer: the green transition, sharia law, and authoritarianism. And green is clearly the dominant narrative.


Ross Douthat: That is a Europe in decline, one that no longer dominates the world.


Peter Thiel: Of course, it is always embedded in a particular context. Look back at the history of nuclear technology: in the end we did not move toward a global authoritarian model. But one explanation of the stagnation that set in by the 1970s is that the sheer speed of technological progress had frightened people, and that the Baconian scientific project effectively ended at Los Alamos.


Since then, society seems to have made up its mind to stop there and go no further. And when Charles Manson took LSD in the late '60s and eventually turned to murder, what he saw in the drug was an extreme vision of freedom: you could act like the antihero of a Dostoevsky novel, where everything is permitted.


Of course, not everyone became Charles Manson. But in that period of history that I am telling, everyone became as paranoid as he was, and ultimately, the hippies took over the culture...


Ross Douthat: But Charles Manson did not become the Antichrist, nor did he take over the world. We were just talking about doomsday, and you...


Peter Thiel: But in my view, the story of the 70s is this: the hippies won. We landed on the moon in July 1969, and three weeks later, Woodstock opened. Looking back from today, that was the watershed moment of technological progress. The hippies won that cultural battle. Of course, I am not saying Charles Manson literally won.


Ross Douthat: Okay, let's circle back to the Antichrist and wrap this up. Earlier you sounded like you were fighting a rearguard action: environmentalism is already anti-progress enough, so let's focus on that for now. Fine, we'll accept that assessment for the moment.


Peter Thiel: I am not retreating; I am just pointing out that this force is indeed very powerful.


Ross Douthat: But the reality is that we are not currently living under the rule of the Antichrist. We are simply in a state of stagnation. What you are describing is a possible future: an order driven by fear that would make stagnation permanent. My view is that if that were to happen, it would have to be accompanied by some dramatic technological leap, something on the scale of Los Alamos, that truly frightens humanity.


I want to ask you a specific question. You are an AI investor, deeply involved in Palantir, in military technology, surveillance technology, the technologies of war. In the scenario you just described, an Antichrist who builds a global order on humanity's fear of technological change would quite likely use the very tools you are building. He might say, "We no longer need technological progress," while adding, "but I am quite satisfied with what Palantir has already achieved."


Isn't that something to worry about? Couldn't there be a historical irony here: the person most worried about the Antichrist ends up inadvertently accelerating his arrival?


Peter Thiel: Well, there are many different possibilities here. But I obviously do not think I am doing what you are suggesting.


Ross Douthat: I don't really believe you are doing that either. I just want to understand how a society can reach the point of willingly accepting "permanent authoritarian rule."


Peter Thiel: There are many levels on which to read this. But is what I just said, as a macro description of global technological stagnation, really so absurd? The whole world has lived under a kind of "peace-and-safety-ism" for fifty years. That is 1 Thessalonians 5:3: the slogan of the Antichrist is "peace and safety."


We have already handed over decision-making power to the FDA: it not only regulates drugs in the US, but actually sets global standards because other countries default to obeying it. And the US Nuclear Regulatory Commission, in a practical sense, also controls global nuclear projects. You cannot design a modular nuclear reactor in Argentina and start construction directly because they will not trust their own regulatory body and will ultimately defer to the US's opinion.


So, in the end, this fifty-year technological stagnation really is something that needs explaining. One common answer is that we have simply run out of new ideas. Another is that, at the cultural level, some force no longer allows us to move forward. That cultural explanation can be bottom-up: perhaps humanity itself has changed, becoming more docile, more accepting of restrictions. Or it can be top-down: the system of government itself has evolved into a mechanism that enforces stagnation.


You see, nuclear power was supposed to become the key energy source of the 21st century. But it has already been globally "taken offline."


Ross Douthat: So, by your logic, we are in effect already under a mild version of the Antichrist's rule. Let me ask you a more ontological question, then: Do you believe God governs history?


Peter Thiel: (Pause) I would say that human free will and choice always exist. These things are not completely predetermined by some determinism.


Ross Douthat: But God will not let us live forever under this lukewarm, stagnant Antichrist system, right? The story shouldn't end like that, should it?


Peter Thiel: Blaming everything on God's will is very problematic. I can quote many scriptures to explain this point, such as John 15:25, "They hated me without a cause." That is to say, those who persecuted Christ had no real reason. If we understand this verse as an expression of "ultimate causality," those people would say: I did it because God made me do it, God arranged everything.


But the traditional Christian understanding is actually against Calvinistic determinism. God is not the ultimate perpetrator behind everything. If you say everything is caused by God...


Ross Douthat: Wait a minute, but God did...


Peter Thiel: Then you are making God a scapegoat.


Ross Douthat: But indeed God allowed Jesus to enter history because He did not want us to be forever trapped in a decaying, stagnant Roman Empire. So God will eventually intervene to save, right?


Peter Thiel: I am not a Calvinist. And...


Ross Douthat: But that isn't Calvinism; that's just basic Christian belief. God will not leave us forever staring at our phone screens, being lectured by Greta Thunberg. He will not abandon us to that fate.


Peter Thiel: I believe, for better or worse, the scope of human action, human freedom always exists. If I really believe everything is predetermined, we can only accept fate—the lion is coming, then meditate and wait to be eaten. But I don't think we should give up resistance.


Ross Douthat: I agree with that. The reason I ask these questions is that I hope, as we resist the Antichrist, we can exercise human free will with hope. You agree with that, right?


Peter Thiel: We can reach consensus on this point.


Ross Douthat: Great. Peter Thiel, thank you very much for your insights today.


Peter Thiel: Thank you.


Welcome to join the official BlockBeats community:

Telegram Subscription Group: https://t.me/theblockbeats

Telegram Discussion Group: https://t.me/BlockBeats_App

Official Twitter Account: https://twitter.com/BlockBeatsAsia
