
Explained: What is Google’s new Project Suncatcher to build AI data centres in space?

The project envisions compact constellations of solar-powered satellites, carrying Google TPUs and connected by free-space optical links



Google has announced a new research initiative aimed at building AI data centres in space, a bold venture the tech giant describes as a new moonshot project that could transform how machine learning computation is scaled.

Project Suncatcher was conceived from Google’s belief that space could be the ideal environment to expand AI computing capacity. The company unveiled the initiative alongside a preprint research paper titled “Towards a Future Space-Based, Highly Scalable AI Infrastructure System Design”.


CEO Sundar Pichai announced the project on X, writing, “Inspired by our history of moonshots — from quantum computing to autonomous driving — Project Suncatcher explores how we might one day build scalable ML compute systems in space, harnessing more of the sun’s power (which emits more energy than 100 trillion times humanity’s total electricity production). Like any moonshot, it will require us to overcome many complex engineering challenges.”

Pichai added that extensive testing would be required to address issues such as “thermal management and on-orbit system reliability.” Google plans to launch a pair of prototype satellites by 2027 in partnership with Planet.

Why data centres in space?

In a blog post, Google outlined its vision for scalable, space-based data centres featuring tensor processing units (TPUs) orbiting on solar-powered satellites. The company said that solar panels in orbit could generate up to eight times more electricity than those on Earth.

Google’s idea centres on energy abundance and efficiency. The company noted that the Sun emits more than 100 trillion times humanity’s total electricity output, making it the ultimate energy source in our solar system.

In the right orbit, a solar panel can produce up to eight times more power than one on Earth, generating electricity almost continuously and minimising the need for large batteries.
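A rough back-of-envelope calculation, using assumed figures rather than numbers from Google’s paper, shows where an “eight times” advantage could come from: above the atmosphere the Sun delivers about 1,361 watts per square metre, and a dawn-dusk orbit is sunlit almost continuously, while a good terrestrial site averages only a small fraction of its peak output once night, weather and sun angle are accounted for.

```python
# Back-of-envelope check of the "up to eight times" claim.
# All figures are rough, illustrative assumptions, not values from Google's paper.

SOLAR_CONSTANT = 1361.0        # W/m^2, solar irradiance above the atmosphere
ORBIT_SUNLIT_FRACTION = 0.99   # dawn-dusk sun-synchronous orbits are almost always sunlit

GROUND_PEAK = 1000.0           # W/m^2, typical clear-sky irradiance at the surface
GROUND_CAPACITY_FACTOR = 0.17  # assumed average for a good terrestrial solar site
                               # (night, weather, atmosphere, sun angle)

orbital_avg = SOLAR_CONSTANT * ORBIT_SUNLIT_FRACTION  # ~1,350 W per m^2 of panel
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR     # ~170 W per m^2 of panel

print(f"Average orbital output: {orbital_avg:.0f} W/m^2")
print(f"Average ground output:  {ground_avg:.0f} W/m^2")
print(f"Orbit-to-ground ratio:  {orbital_avg / ground_avg:.1f}x")
```

On these assumptions the ratio works out to roughly 8x; a sunnier ground site or a higher capacity factor would shrink the gap, which is why the claim is phrased as “up to” eight times.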


“Artificial intelligence is a foundational technology that could reshape our world, driving new scientific discoveries and helping us tackle humanity’s greatest challenges,” Google stated. “Now, we are asking where we can go to unlock its fullest potential.”

The company argues that space could ultimately be the best environment to scale AI compute while minimising impact on terrestrial resources, an increasingly important consideration as AI’s energy demands rise.

How will the project function in space?

Project Suncatcher envisions compact constellations of solar-powered satellites carrying Google TPUs and connected through free-space optical links, the company said.

Google’s early research, shared in its preprint paper, highlights progress on some of the fundamental challenges of this ambitious venture, including high-bandwidth communication between satellites, orbital dynamics, and the effects of radiation on computing hardware.

By focusing on a modular design of smaller, interconnected satellites, Google aims to lay the foundation for a highly scalable, space-based AI infrastructure.

The proposed system would consist of a constellation of networked satellites, likely operating in a dawn-dusk sun-synchronous low-Earth orbit, where they would be exposed to near-constant sunlight. This orbit maximises solar energy collection while reducing the need for heavy onboard batteries.
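For a sense of the orbital mechanics, Kepler’s third law gives the period of a circular orbit from its altitude. The roughly 650 km altitude in the sketch below is an assumed, illustrative value, not one confirmed by Google.

```python
import math

# Orbital period of a circular low-Earth orbit from Kepler's third law:
# T = 2 * pi * sqrt(a^3 / mu). The altitude is an assumed, illustrative value.

MU_EARTH = 3.986004418e14  # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6371e3           # m, mean Earth radius
altitude = 650e3           # m, assumed LEO altitude

a = R_EARTH + altitude                        # semi-major axis of the circular orbit
T = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)  # orbital period in seconds

print(f"Orbital period: {T / 60:.1f} minutes")  # ~97.6 minutes
```

A satellite at this altitude circles the Earth in roughly 97 minutes; because a dawn-dusk sun-synchronous orbit rides the day-night terminator, it can stay in sunlight for essentially the entire orbit rather than spending a third of it in Earth’s shadow.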

Google said initial Earth-based tests have already demonstrated bidirectional speeds of up to 1.6 Tbps, a figure the company believes can be scaled significantly.

Technical challenges

To make such a system viable, Google must overcome several technical hurdles, including achieving data centre-scale inter-satellite links, controlling large clusters of closely positioned satellites, ensuring TPU radiation tolerance, and keeping launch costs economically feasible.

These challenges could take years to resolve. Large-scale machine-learning workloads require the distribution of tasks across numerous accelerators with high-bandwidth, low-latency connections.
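To see why the bandwidth bar is so high, consider data-parallel training, where each step synchronises gradients roughly the size of the model across all accelerators. The model size and link speeds in the sketch below are hypothetical, chosen only to illustrate how synchronisation time scales with bandwidth.

```python
# Illustrative estimate of per-step gradient-sync time in data-parallel training.
# The model size and link speeds are hypothetical, not from Google's paper.

params = 70e9        # assumed number of model parameters
bytes_per_param = 2  # bf16 gradients

# A ring all-reduce moves roughly 2x the gradient volume over each link.
traffic_bits = 2 * params * bytes_per_param * 8

# 1,600 Gbps matches the 1.6 Tbps bench demo; 10,000 Gbps is "tens of Tbps" territory.
for gbps in (100, 1_600, 10_000):
    seconds = traffic_bits / (gbps * 1e9)
    print(f"{gbps:>6} Gbps link -> ~{seconds * 1e3:8.0f} ms of gradient sync per step")
```

On these numbers, gradient synchronisation alone would take tens of seconds per step over a 100 Gbps link, while links in the tens of terabits per second bring it down towards the sub-second range a tightly coupled cluster needs.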


Delivering performance comparable to terrestrial data centres requires links between satellites that support tens of terabits per second. However, achieving this kind of bandwidth demands power levels thousands of times higher than those used in conventional long-range communications.

To achieve such high-bandwidth inter-satellite links, the satellites would need to fly in a much more compact formation than existing systems, positioned just hundreds of metres apart; only modest station-keeping manoeuvres would then be needed to keep the constellation stable in the desired sun-synchronous orbit.
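One way to see why close formation helps: once a laser beam has spread wider than the receiving aperture, received power falls off roughly with the square of the link distance, so flying thousands of times closer buys back an enormous power margin. Both distances in the sketch below are assumptions for illustration.

```python
# Inverse-square scaling of received optical power with link distance.
# Both distances are assumed, illustrative values.

def received_power_gain(far_m: float, near_m: float) -> float:
    """Relative gain in received power from shrinking the link distance."""
    return (far_m / near_m) ** 2

conventional_range = 1_000_000.0  # 1,000 km, an assumed long-range crosslink
formation_range = 500.0           # a few hundred metres, as in a compact formation

gain = received_power_gain(conventional_range, formation_range)
print(f"Flying {formation_range:.0f} m apart instead of {conventional_range / 1e3:.0f} km "
      f"boosts received power by ~{gain:,.0f}x")
```

On these assumed figures the gain is on the order of millions, comfortably covering the “thousands of times” power gap the links would otherwise face.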

For machine-learning accelerators to operate effectively in space, they must also withstand the harsh conditions of low-Earth orbit. Google tested its Trillium TPUs for radiation resilience and claims they can endure “a total ionising dose equivalent to a five-year mission life without permanent failures.”
