The world’s richest man, Elon Musk, rarely follows the ordinary path. He keeps launching technological bets and business plans that sound almost crazy yet are highly forward-looking. This time, however, the Chinese have taken the lead, and the race to send computing power into space has only just begun.
Tesla Roadster in Space
In 2018, Musk sent his red Tesla Roadster into space on the maiden flight of SpaceX’s Falcon Heavy rocket. To demonstrate the rocket’s heavy-lift capacity, he wanted to send something genuinely memorable into outer space and ultimately chose his own Tesla.

It was the first time a human-made car had appeared in outer space. A dummy wearing a SpaceX spacesuit sat inside, David Bowie’s “Space Oddity” played on a loop, and the words “Don’t Panic” were displayed on the center console, a tribute to the science-fiction classic “The Hitchhiker’s Guide to the Galaxy”.
Seven years later, the car is still drifting in a heliocentric orbit, an artificial asteroid that circles the sun roughly once every one and a half years. Of course, cosmic radiation and meteoroid impacts have pitted the body and left the interior beyond recognition.
Sending a car into space sounded incredible at first, but it became the most expensive and most imaginative advertisement in history. It was wildly successful brand marketing for SpaceX, Tesla, and Musk himself, bringing Tesla many potential customers in the following years and laying part of the foundation for Musk to become the world’s richest man.
For years, the dream of colonizing Mars and exploring space has been part of Musk’s personal aura. Having sent a car into space, the world’s richest man now has a new plan: building a super AI data center in space. At the US-Saudi Investment Forum last week, Musk publicly elaborated on this idea again, proposing to launch an AI satellite station within five years, with future AI data centers located in outer space.
The Energy Hunger of Data Centers
At first hearing, this sounds even more unbelievable than sending a Tesla into space.
Indeed, we are in an era of competition over AI computing power. Tech giants are frantically building AI infrastructure around the world, pouring enormous sums into super data centers. Over the next three to five years, the related capital expenditures of major US tech giants such as Amazon, Google, Microsoft, and Meta are expected to reach $1.2 trillion. McKinsey analysts predict that by 2030, total global investment in data centers could reach $6.7 trillion, of which about $5.2 trillion would be directly driven by AI workloads.
However, building data centers requires an astonishing amount of power, large tracts of land for server halls, purchases of sky-high-priced chips, and efficient cooling systems. Most tech giants therefore choose to build in inland states such as Arizona and Tennessee, where land and power are relatively cheap and governments offer tax incentives.
At the US-Saudi Investment Forum last week, Musk and Jensen Huang jointly announced that xAI plans to partner with Humain, Saudi Arabia’s state-owned AI enterprise, to invest $25 billion in a new 500-megawatt data center on the outskirts of Riyadh, the Saudi capital, built on NVIDIA’s AI chips. That is more power than xAI’s largest computing center anywhere in the world, the 300-megawatt Colossus in Tennessee.
A data center in a hot desert? That is nothing unusual. Arizona is also a hot desert, and the United Arab Emirates is building super data centers as well. The most obvious advantage is solar power: Saudi Arabia has some of the world’s richest solar resources, with more than 3,000 hours of sunshine per year on average.
With the full backing of the Saudi government, Musk’s data center can count on ample financing and energy supply, and land and approvals are not a problem. Heat dissipation, however, will require innovative technologies such as immersion liquid cooling, and cooling is estimated to account for up to 20% of operating costs.
In fact, the primary consideration in siting a data center is power supply, because stable, inexpensive electricity has become the biggest constraint on AI development. Take the United States: in 2014, data centers accounted for only 1.8% of total US electricity consumption, but Bain Capital predicts that the share may reach 9% by 2030.
According to a report released by the International Energy Agency in 2025, global data-center power consumption reached 415 terawatt-hours in 2024, about 1.5% of total global electricity consumption. More worrying, this figure is expected to more than double to 945 terawatt-hours by 2030, roughly the total annual electricity consumption of Japan.
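For a sense of scale, here is a quick back-of-envelope check of the IEA figures quoted above; it is only a rough sketch using the article’s rounded numbers.

```python
# Back-of-envelope check of the IEA figures quoted above (all values rounded).
dc_2024_twh = 415        # global data-center consumption, 2024 (TWh)
dc_share_2024 = 0.015    # ~1.5% of global electricity use
dc_2030_twh = 945        # projected consumption, 2030 (TWh)

global_2024_twh = dc_2024_twh / dc_share_2024   # implied global total, ~27,700 TWh
growth_factor = dc_2030_twh / dc_2024_twh       # ~2.3x, i.e. "more than double"

print(f"Implied global electricity use, 2024: {global_2024_twh:,.0f} TWh")
print(f"Data-center growth 2024 -> 2030: {growth_factor:.1f}x")
```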
AI giants are so hungry for power that they have started building their own generation facilities. Musk’s xAI is using gas turbines for temporary power; Microsoft has signed an agreement to restart a nuclear power plant; and OpenAI is lobbying the government to work with industry to add 100 gigawatts (GW) of new capacity every year.
Endless Solar Energy Supply
This is precisely the greatest advantage of building a data center in space. Musk explained at the forum that even to reach a small fraction of a Kardashev Type II civilization (one that fully harnesses its star’s energy), the energy required for AI computing will always be several orders of magnitude beyond what the Earth can provide. Terawatt-year-scale AI computing, for example, is impossible on Earth no matter how many power plants are built, and the cooling capacity cannot keep up either.
Solar-powered AI satellites and computing clusters in space are therefore an inevitable future, and the cost-effectiveness of power and computing in space will soon far exceed that on Earth. In Earth orbit, solar irradiance is about 1.36 times the peak value at the Earth’s surface. In polar or geosynchronous orbits, the energy utilization rate can reach 99%, far higher than the 30%-40% achievable on the ground. More importantly, in a suitable orbit (such as a sun-synchronous orbit), satellites are in sunlight almost around the clock, with no day-night cycle or weather to worry about.
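To put the orbital solar advantage in rough numbers, the sketch below sizes the array needed to average 100 gigawatts of electrical output in orbit. The solar constant is a physical value and the 99% duty cycle comes from the article; the 20% panel efficiency is an illustrative assumption, not a figure from Musk or any announced design.

```python
# Rough sizing of a solar array in Earth orbit (illustrative assumptions only).
SOLAR_CONSTANT = 1361.0    # W/m^2, irradiance above the atmosphere
PANEL_EFFICIENCY = 0.20    # assumed cell efficiency; not from the article
DUTY_CYCLE = 0.99          # near-continuous sunlight in a sun-synchronous orbit

def array_area_km2(target_power_gw: float) -> float:
    """Array area (km^2) needed to average `target_power_gw` of electrical output."""
    electrical_w_per_m2 = SOLAR_CONSTANT * PANEL_EFFICIENCY * DUTY_CYCLE
    area_m2 = target_power_gw * 1e9 / electrical_w_per_m2
    return area_m2 / 1e6

print(f"Array area for 100 GW: ~{array_area_km2(100):.0f} km^2")   # roughly 370 km^2
```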
Musk also mentioned another key factor: heat dissipation. Today, most of the mass and volume of a supercomputer rack (such as the GB300) is actually cooling equipment. In space, most of that cooling gear can be eliminated, because heat can be radiated directly into the cold vacuum without water cooling or fans.
Although heat cannot be dissipated through air convection, the roughly minus 270 degrees Celsius background of deep space can be used for efficient radiative cooling. Research suggests this approach is three times as efficient as cooling on the ground, and it consumes no precious water.
Heat can be carried by heat pipes or pumped-fluid loops to radiator panels on the satellite’s surface and then emitted directly into deep space as infrared radiation. Although this requires large radiator areas, it fundamentally resolves a heat-dissipation challenge that ground-based data centers are pushing close to its physical limits.
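The radiator-area trade-off can be sketched with the Stefan-Boltzmann law. The radiator temperature and emissivity below are illustrative assumptions rather than figures from any announced design.

```python
# Radiator sizing via the Stefan-Boltzmann law (illustrative assumptions only).
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9        # assumed emissivity of the radiator coating
T_RADIATOR = 300.0      # assumed radiator temperature, K (~27 C)
T_SPACE = 3.0           # deep-space background, K (~ -270 C)

def radiator_area_m2(heat_load_mw: float, double_sided: bool = True) -> float:
    """Radiator area (m^2) needed to reject `heat_load_mw` of waste heat to space."""
    flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)   # ~413 W/m^2 per face
    faces = 2 if double_sided else 1
    return heat_load_mw * 1e6 / (flux * faces)

print(f"Area per MW of waste heat: ~{radiator_area_m2(1):.0f} m^2")  # ~1,200 m^2 with both faces radiating
```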
Over the past few weeks, Musk has been talking at various events about using Starship rockets to launch a new generation of solar-powered Starlink satellites. Equipped with high-speed laser links, these satellites would form a data center directly in orbit.
The day after the US-Saudi Investment Forum, Musk emphasized again that these AI satellites could add 100 gigawatts of solar power each year, equivalent to roughly a quarter of average US electricity demand. “We have fully planned it out, and it will be very crazy.”
So how do you get chips, solar panels, and all the other hardware into space? Musk went on to elaborate on his concept of a space AI center:
The Starship rocket could put roughly 300 gigawatts (perhaps even 500 gigawatts) of solar-powered AI satellites into orbit every year. Average US power consumption is about 500 gigawatts. So if 300 gigawatts of AI computing power is launched each year, within two years the AI computing power in space would exceed the current electricity consumption of the entire US economy, with all of that power devoted purely to intelligent processing.
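The arithmetic behind the “two years” claim is simple enough to check directly with the figures Musk quotes above.

```python
# The arithmetic behind the "two years" claim, using the figures quoted above.
launch_rate_gw_per_year = 300   # solar AI satellite power launched per year
us_average_power_gw = 500       # approximate average US electrical demand

years = 0
cumulative_gw = 0
while cumulative_gw <= us_average_power_gw:
    cumulative_gw += launch_rate_gw_per_year
    years += 1

print(f"{cumulative_gw} GW in orbit after {years} years")   # 600 GW after 2 years
```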
Orbital transport capacity would be solved entirely by Starship, and solar panels are not a problem either: annual production capacity for ground-based solar cells already far exceeds 1,500 gigawatts. The most critical missing piece is chip production, which is why Tesla must build a terawatt-scale wafer fab; otherwise there is no solution at sufficient scale.
These figures are already staggering by Earth’s standards, yet they are nothing against the Kardashev scale. To truly move toward a higher-level civilization, solar-powered AI satellites would have to be mass-produced on the Moon, pushing annual computing power past the 100-terawatt level.
Tech Giants Marching Forward Together
In fact, this seemingly crazy idea is not Musk’s alone. Many AI giants share the vision and have already begun exploring it.
Their core logic is the same: the demand for power from AI will eventually be so large that the Earth cannot supply it, so they must go to space.
Jeff Bezos, the founder of Amazon, said last month: “Within the next twenty years, the cost of space data centers will beat that of ground-based ones. Space will ultimately become one of the places that keeps making the Earth a better place.” His company, Blue Origin, has just achieved a vertical rocket landing, becoming the second space company after SpaceX to master the technique.
Although Amazon has not officially disclosed detailed plans for a space data center, Blue Origin is accelerating development of the New Glenn rocket to drive down launch costs further. Given Amazon Web Services’ more than 30% share of the global cloud-computing market, Amazon could eventually integrate a space data center deeply with its AI cloud services to build an orbital version of AWS.
Amazon’s Project Kuiper plans to launch a low-Earth-orbit satellite internet service in Australia in mid-2026, competing with Starlink. In 2025, the company successfully launched its first batch of 27 satellites, and it plans to deploy in-orbit AI data-processing nodes in the future, combined with AWS’s edge-computing capabilities.
Alphabet, Google’s parent company, recently announced “Project Suncatcher”, which aims to launch a constellation of compact, solar-powered satellites carrying Google’s advanced Tensor Processing Units (TPUs). It plans to launch two test satellites in early 2027 in cooperation with Planet Labs, each carrying four TPUs.
Notably, Google’s space data center would not be a single giant building like those on Earth but a constellation of about 81 satellites flying in a cluster roughly 1 kilometer in radius, each equipped with the TPUs Google uses to drive its AI. The scale of this constellation would “far exceed any satellite constellation in history and at present”.
Google CEO Sundar Pichai said, “Like all moonshot projects, this requires us to solve many complex engineering challenges.” He also acknowledged that the project faces significant technical and logistical obstacles and that its final scale may change.
Energy is the most direct factor. The Project Suncatcher description states that AI’s energy demand is already very hard to meet, and that the best solution is to launch “swarms of satellites” into space to capture energy directly from the sun. In a “suitable orbit”, solar panels can be up to eight times more productive than on Earth.
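The “eight times” figure can be roughly reconstructed by combining the higher irradiance above the atmosphere with near-continuous illumination. The 17% ground capacity factor below is an illustrative assumption, not a number from Google.

```python
# Rough reconstruction of the "up to 8x" panel-productivity claim (illustrative).
orbit_irradiance = 1361.0      # W/m^2 above the atmosphere
ground_irradiance = 1000.0     # W/m^2 peak at the surface on a clear day
orbit_duty_cycle = 0.99        # near-continuous sunlight in a suitable orbit
ground_capacity_factor = 0.17  # assumed average for ground solar (night, weather, angle)

ratio = (orbit_irradiance * orbit_duty_cycle) / (ground_irradiance * ground_capacity_factor)
print(f"Orbit vs ground energy per unit of panel area: ~{ratio:.1f}x")   # ~7.9x
```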
A Hot Sector in Space Exploration
In fact, space computing already has plenty of explorers, and projects from several US space startups have validated the trend toward orbital computing power that Musk describes. According to the latest report from the European Space Policy Institute (ESPI), a European think tank, about $81 million in private capital has flowed into space data center projects or directly related key technologies over the past five years.
At the beginning of this year, Florida-based lunar exploration company Lonestar ran an operational test of a data center en route to the Moon and in lunar orbit, although the lander mission was not entirely successful. Axiom Space launched a prototype orbital data center to the International Space Station in August. Crusoe, the AI infrastructure startup that has drawn attention for its work with OpenAI, also plans to deploy its cloud platform on a Starcloud satellite due to launch by the end of 2026.
Earlier this month, the startup Starcloud sent a satellite equipped with an NVIDIA H100 GPU into space. The Starcloud-1 satellite weighs only about 60 kilograms, yet its NVIDIA processor can run AI models in orbit, including variants of Google’s Gemini.
Jonathan Cirtain, the CEO of Axiom Space, noted at a Deutsche Bank conference last week that about 90% of the data generated in space is currently lost, a problem more infrastructure could solve: “If you can generate information products directly in space and transmit them back to the ground, you can immediately create new value for the assets people deploy in space.”
Of course, moving a data center into space is by no means that simple, and this vision still faces many technical difficulties and doubts. “As an engineer, just saying ‘I’m going to throw a