As space scientists collect more and more data, observatories around the world are finding new ways to apply supercomputing, cloud computing, and deep learning to make sense of it all. Here are some examples of how these technologies are changing the way astronomers study space.
What happens when black holes collide?
As a postdoctoral researcher in the US, astrophysicist Eliu Huerta began thinking about how technology might help more breakthroughs happen in his field. Then, in 2015, researchers detected gravitational waves for the first time with LIGO (the Laser Interferometer Gravitational-Wave Observatory).
Scientists have since charted these observations and scrambled to learn all they can about these elusive ripples in spacetime. They've detected dozens more gravitational-wave signals, and advances in computing are helping them keep up.
As a postdoc, Huerta searched for gravitational waves by tediously trying to match data collected by detectors to a catalogue of potential waveforms. He wanted to find a better way.
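The catalogue search described above is, at its core, a matched filter: slide each candidate waveform along the detector's time series and look for a strong correlation. The sketch below illustrates the idea only; the toy chirp templates, noise model, and numbers are invented for this example and bear no resemblance to LIGO's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_chirp(f0, f1, n=512, dt=1 / 4096):
    # Toy "chirp" template whose frequency sweeps from f0 to f1,
    # loosely mimicking an inspiral waveform.
    t = np.arange(n) * dt
    freq = np.linspace(f0, f1, n)
    return np.sin(2 * np.pi * np.cumsum(freq) * dt)

# A tiny catalogue of candidate templates (real searches use vastly more).
templates = {(f0, f1): make_chirp(f0, f1)
             for f0, f1 in [(30, 120), (50, 200), (80, 300)]}

# Synthetic detector stream: Gaussian noise with one template buried in it.
noise = rng.normal(0.0, 1.0, 16384)
signal = 0.5 * templates[(50, 200)]
data = noise.copy()
data[6000:6000 + signal.size] += signal

def matched_filter(data, template):
    # Correlate the data against the unit-norm template; return the
    # peak correlation and the offset where it occurs.
    t = template / np.linalg.norm(template)
    corr = np.correlate(data, t, mode="valid")
    i = int(np.argmax(np.abs(corr)))
    return corr[i], i

best = max(templates, key=lambda k: abs(matched_filter(data, templates[k])[0]))
print("best-matching template:", best)   # recovers the injected (50, 200) chirp
```

Checking every template at every offset is what makes this approach so tedious at scale, and it is exactly the cost that Huerta set out to cut.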
Earlier this year, Huerta, who is now a computational scientist at Argonne National Laboratory near Chicago, created an AI ensemble that can process a month's worth of LIGO data in just seven minutes.
His algorithms, which run on specialized processors called GPUs, combine advances in artificial intelligence with distributed computing, in which many separate computers are networked to act as a single system. With them, Huerta can identify the signals produced when dense objects such as black holes merge.
Huerta’s collection of AI models is open source, which means anyone can use them. “Not everybody has access to a supercomputer,” he says. “This is going to lower the barriers for researchers to adopt and to use AI.”
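The two ingredients of Huerta's approach, an ensemble of models voting on each stretch of data and many workers scoring stretches in parallel, can be caricatured in a few lines. The one-line "models" below are invented stand-ins for trained neural networks, and the thread pool is a small-scale stand-in for a GPU cluster; none of this reflects his actual code.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Invented placeholder "models": each assigns a segment a score for
# how signal-like it looks. Real ensembles use trained networks.
def model_a(seg):
    return float(np.abs(seg).max())

def model_b(seg):
    return float(np.sqrt(np.mean(seg ** 2)))

MODELS = (model_a, model_b)

def score_segment(seg):
    # Ensemble step: average the scores from every model.
    return sum(m(seg) for m in MODELS) / len(MODELS)

def scan(stream, seg_len=4096, workers=4):
    # Split the strain stream into segments and score them in
    # parallel, mimicking how a long data stream is spread across
    # many processors.
    segs = [stream[i:i + seg_len]
            for i in range(0, len(stream) - seg_len + 1, seg_len)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(score_segment, segs))

rng = np.random.default_rng(1)
stream = rng.normal(0.0, 1.0, 16 * 4096)   # simulated detector noise
stream[5 * 4096:6 * 4096] += 3.0           # inject a loud "event"
scores = scan(stream)
print("loudest segment:", int(np.argmax(scores)))   # segment 5 holds the injection
```

Because each segment is scored independently, the work parallelizes almost perfectly, which is how a month of data can be compressed into minutes of processing.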
How has the night sky changed?
For all its growth, astronomy has been slow to integrate cloud computing. The Vera C. Rubin Observatory, currently under construction in Chile, will become the first astronomical institution of its size to adopt a cloud-based data facility.
When the observatory starts up in 2024, the data its telescope captures will become available as part of the Legacy Survey of Space and Time (LSST) project, which will create a catalogue thousands of times larger than any previous survey of the night sky. Past surveys were almost always downloaded and stored locally, which made it hard for astronomers to access each other’s work.
“We are making a map of the full sky,” says Hsin-Fang Chiang, a member of the Rubin’s data management team. And in the process, they are building “a huge data set that’s going to be useful for many different kinds of science in astronomy.”
Although Chiang’s PhD is in astronomy, her initial research had nothing to do with the survey. Years later, she got a chance to be involved thanks to the sheer size of the project. She’s proud that her work could improve the way scientists collaborate.
The 10-year project will deliver a 500-petabyte set of data and images to the cloud, to help astronomers answer questions about the structure and evolution of the universe.
“For each position in the sky, we’ll have more than 800 images there,” says Chiang. “You could even see what happened in the past. So especially for supernovas or things that change a lot, then that’s very interesting.”
The Rubin Observatory will process and store 20 terabytes of data every night as it maps the Milky Way and places beyond. Astronomers affiliated with the project will be able to access and analyze that data from anywhere via a web browser. Eventually, the images the telescope takes every night will be converted into an online database of stars, galaxies, and other celestial bodies.
What did the early universe look like?
Advances in computing could help astronomers turn back the cosmic clock. Earlier this year, Japanese astronomers used ATERUI II, a supercomputer that specializes in astronomy simulations, to reconstruct what the universe may have looked like in its earliest moments, just after the Big Bang.
ATERUI II is helping the researchers investigate cosmic inflation, the theory that the universe expanded exponentially in the first fraction of a second after the Big Bang. Astronomers agree that such an expansion would have left variations in the density of matter that affected both the distribution of galaxies and the way they developed.
By comparing 4,000 simulations of the early universe—all with different density fluctuations—against the real thing, scientists could rewind time and ask why some places in the universe are rife with cosmic activity while others are barren.
Masato Shirasaki, an assistant professor at the National Astronomical Observatory of Japan, says that question would be almost impossible to answer without these simulations. The project requires a huge amount of data storage (about 10 terabytes, equivalent to 22,000 episodes of Game of Thrones).
Shirasaki’s team developed a model of how the universe is thought to have evolved and applied it to each of the simulations to see which result may be closest to how it looks today. This method made it easier to explore the physics of cosmic inflation.
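Shirasaki's pipeline is far more sophisticated, but the compare-many-simulations idea can be sketched simply: generate toy "universes" with different initial fluctuation amplitudes, evolve each with a model of growth, and keep the amplitude whose evolved statistics best match the observed field. Everything here, including the growth factor, the amplitude grid, and the field sizes, is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def evolve(initial, growth=5.0):
    # Toy stand-in for a structure-formation model: amplify the
    # initial density fluctuations by a fixed growth factor.
    return initial * growth

# Pretend "observation": today's density field, evolved from an
# initial amplitude we will try to recover.
true_amp = 0.8
observed = evolve(true_amp * rng.normal(0.0, 1.0, 1000))

def mismatch(amp, observed, trials=200):
    # Evolve many simulated universes at this amplitude and compare
    # their statistics (here just the variance) to the observation.
    sims = [evolve(amp * rng.normal(0.0, 1.0, observed.size))
            for _ in range(trials)]
    sim_var = np.mean([s.var() for s in sims])
    return (sim_var - observed.var()) ** 2

amps = np.linspace(0.2, 1.6, 15)
best = amps[int(np.argmin([mismatch(a, observed) for a in amps]))]
print(f"best-fitting initial amplitude: {best:.2f}")
```

Running many forward simulations and scoring each against the data is what drives the project's appetite for compute and storage, and it is why a shortcut through the model, like the one Shirasaki's team developed, pays off so quickly.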
In the next few years, Shirasaki’s methods could help shorten the observation time needed for future efforts like SPHEREx, a two-year mission slated for 2024 involving a spacecraft that will orbit Earth and gaze at nearly 300 million galaxies across the sky. With these leaps in computing, our understanding of the universe is expanding, bit by bit.