Elon Musk Pushes Forward with Orbital Data Centers, Merging SpaceX and xAI for Space-Based AI Infrastructure
Elon Musk is advancing his vision for orbital data centers with increasing seriousness, following SpaceX's recent filing with the Federal Communications Commission for a network of one million satellites designed to function as space-based data centers. While the idea may have initially sounded like a bold exaggeration, it is now backed by concrete steps, including the formal merger between SpaceX and Musk's AI venture xAI, completed on Monday. The integration brings Musk's space and artificial intelligence ambitions under a single entity, pointing to a long-term strategy of deploying AI computing infrastructure in orbit.

The FCC accepted the filing on Wednesday and opened a public comment period, a routine step, but FCC Chairman Brendan Carr made the unusual move of sharing the document on X, signaling potential support. Carr has previously shown favor toward Musk, who remains aligned with President Donald Trump, so regulatory approval is likely to proceed smoothly.

Musk has also begun publicly articulating the rationale for space-based computing. In a recent episode of Patrick Collison's podcast "Cheeky Pint," featuring Dwarkesh Patel, Musk argued that space offers a decisive advantage for powering AI systems: solar panels in orbit generate roughly five times more energy than those on Earth, thanks to the absence of atmospheric attenuation and near-constant sunlight. That, he claims, could drastically reduce one of the largest costs of running data centers, namely energy. A rough back-of-envelope check of the five-times figure appears at the end of this piece.

Critics such as Patel, however, point out gaps in the argument. While solar energy is more abundant in space, data centers need more than power: cooling, maintenance, latency, and hardware reliability are all major challenges. The GPUs used in AI training fail regularly, and repairing or replacing them in orbit would be far more complex and costly than on Earth. These technical hurdles remain significant even if the energy argument is compelling.

Despite the concerns, Musk remains confident. He predicted that within 30 to 36 months, roughly by 2028, the most economically viable location for AI computing will be space. He went further, claiming that by 2030 the amount of AI processing done in orbit will exceed the total processed on Earth since the start of the digital age.

The scale of this ambition is staggering. Global data center capacity is expected to reach around 200 gigawatts by 2030, representing roughly a trillion dollars in infrastructure investment. If Musk's vision materializes, a meaningful share of that investment could shift to space-based systems. Given that SpaceX earns revenue from launching payloads into orbit, the initiative aligns neatly with its business model, and the upcoming IPO of the combined SpaceX-xAI entity, expected in the coming months, will likely draw even more attention to the plan. As tech giants continue to spend hundreds of billions of dollars on data centers each year, moving some of that infrastructure into space is no longer just science fiction; it is becoming a serious contender in the race for the future of computing.
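
As a rough sanity check of the "five times more energy" claim, the sketch below compares average solar output per square meter in orbit and on the ground using round, generic figures: the solar constant above the atmosphere (~1,361 W/m²), clear-sky peak irradiance at the surface (~1,000 W/m²), a typical utility-scale capacity factor of about 22 percent, and a near-100 percent sunlight fraction for a dawn-dusk sun-synchronous orbit. These inputs are illustrative assumptions, not numbers from SpaceX's filing or Musk's remarks.

```python
# Back-of-envelope comparison of average solar output in orbit vs. on the ground.
# All constants are rough, generic assumptions chosen for illustration.

SOLAR_CONSTANT_W_M2 = 1361      # irradiance above the atmosphere
GROUND_PEAK_W_M2 = 1000         # clear-sky peak irradiance at the surface
GROUND_CAPACITY_FACTOR = 0.22   # typical utility-scale solar in a sunny region
ORBIT_SUN_FRACTION = 0.99       # near-continuous sunlight in a dawn-dusk orbit

avg_orbit = SOLAR_CONSTANT_W_M2 * ORBIT_SUN_FRACTION     # ~1,347 W/m^2 average
avg_ground = GROUND_PEAK_W_M2 * GROUND_CAPACITY_FACTOR   # ~220 W/m^2 average

print(f"Average orbital output: {avg_orbit:.0f} W/m^2")
print(f"Average ground output:  {avg_ground:.0f} W/m^2")
print(f"Ratio: {avg_orbit / avg_ground:.1f}x")           # roughly 5-6x
```

Under these assumptions the ratio lands in the five-to-six range, so the energy side of the argument is plausible; the unresolved questions are the ones Patel raises about cooling, latency, and hardware failure in orbit.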
