July 12, 2022
For the first time, NASA's James Webb Space Telescope sent back images of far-off galaxies, many of which had never been seen before.
The $10 billion telescope – the largest, most complex and most powerful space telescope ever built – returned an image of the galaxy cluster SMACS 0723, the deepest and sharpest infrared image of the distant universe to date, whose light traveled 4.6 billion years to reach Earth.
With thousands of galaxies and billions of stars in the telescope's view, humans alone cannot document its findings.
That’s where Morpheus comes in – an AI system tasked with analyzing the images.
Morpheus was trained on UC Santa Cruz’s Lux supercomputer – which includes 28 GPU nodes with two Nvidia V100 Tensor Core GPUs each. As data and images are sent from the telescope to scientists on Earth, that information will also be fed into the AI.
The system will help scientists get a better understanding of what the images show, but also a better idea of what the telescope is looking for.
UC Santa Cruz’s Computer Science and Astronomy departments created the deep learning framework, which classifies astronomical objects, such as galaxies, on a pixel-by-pixel basis from the raw data streaming out of telescopes.
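Morpheus's actual architecture isn't detailed here, but the idea of pixel-by-pixel classification can be illustrated with a minimal sketch: a model produces a per-pixel score for each object class, and each pixel is assigned the class with the highest score. The class names and function below are illustrative assumptions, not Morpheus's real API.

```python
import numpy as np

# Hypothetical class labels for illustration only.
CLASSES = ["background", "spheroid", "disk", "irregular", "point_source"]

def classify_pixels(score_maps: np.ndarray) -> np.ndarray:
    """score_maps: (num_classes, H, W) array of per-pixel class scores,
    e.g. the output of a convolutional network applied to telescope data.
    Returns an (H, W) array of class indices via a per-pixel argmax."""
    return np.argmax(score_maps, axis=0)

# Example: a toy 4x4 image with random scores for each of the 5 classes.
rng = np.random.default_rng(0)
scores = rng.random((len(CLASSES), 4, 4))
class_map = classify_pixels(scores)
print(class_map.shape)  # (4, 4)
```

Every pixel in the output map carries a label, so extended structures like galaxy disks and compact sources can be separated within the same image.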
“The JWST will really enable us to see the universe in a new way that we’ve never seen before,” said Prof. Brant Robertson. “So it’s really exciting.”
Morpheus was previously used to classify images from the Hubble Space Telescope.
The system, along with Robertson and around 50 other researchers, will attempt to use the data to map some of the earliest features of the universe.
Around half a million galaxies will be surveyed with multiband near-infrared imaging, and about 32,000 galaxies with mid-infrared imaging.