The Space Simulator – modeling the Universe on a budget
For the past several years, a team of University of California astrophysicists working at Los Alamos National Laboratory has been using a cluster of roughly 300 computer processors to model some of the most intriguing aspects of the Universe. Called the Space Simulator, this de facto supercomputer has not only proven itself to be one of the fastest supercomputers in the world, but has also demonstrated that modeling and simulation of complex phenomena, from supernovae to cosmology, can be done on a fairly economical basis.
According to Michael Warren, one of the Space Simulator’s three principal developers, “Our goal was to acquire a computer which would deliver the highest performance possible on the astrophysics simulations we wanted to run, while remaining within the modest budget that we were allotted. Building the Space Simulator turned out to be an excellent choice.”
The Space Simulator is a 294-node Beowulf cluster with a theoretical peak performance just below 1.5 teraflops (trillions of floating-point operations per second). Each Space Simulator processing node looks more like a computer you would find at home than one at a supercomputer center, consisting of a Pentium 4 processor, 1 gigabyte of 333 MHz SDRAM, an 80 gigabyte hard drive and a gigabit Ethernet card. Each individual node cost less than $1,000, and the entire system cost under $500,000. The cluster achieved a Linpack performance of 665.1 gigaflops (a gigaflop is a billion floating-point operations per second) on 288 processors in October 2002, making it the 85th fastest computer in the world according to the 20th TOP500 list (see www.top500.org). Since 2002, the Space Simulator has moved down to #344 on the most recent TOP500 list as faster computers have been built, but Warren and his colleagues are not worried. They built the Space Simulator to do specific astrophysics research, not to compete with other machines; it was never designed to rival the Laboratory’s massive supercomputers and, in fact, is not scalable enough to do so.
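As a rough sanity check, the quoted peak and Linpack figures can be reproduced with a few lines of arithmetic. The sketch below assumes a Pentium 4 clock speed of about 2.53 GHz and two double-precision floating-point operations per cycle via SSE2; neither figure is stated in the article, so treat both as illustrative assumptions rather than the machine's actual configuration.

# Back-of-the-envelope check of the performance figures quoted above.
# ASSUMPTIONS (not stated in the article): ~2.53 GHz Pentium 4 nodes,
# 2 double-precision flops per clock cycle via SSE2.

NODES = 294
CLOCK_GHZ = 2.53          # assumed per-node clock speed
FLOPS_PER_CYCLE = 2       # assumed SSE2 double-precision rate

peak_gflops = NODES * CLOCK_GHZ * FLOPS_PER_CYCLE
print(f"Theoretical peak: {peak_gflops / 1000:.2f} teraflops")  # ~1.49, i.e. just below 1.5

# Linpack result quoted in the article: 665.1 gigaflops on 288 of the nodes.
linpack_gflops = 665.1
peak_288_gflops = 288 * CLOCK_GHZ * FLOPS_PER_CYCLE
print(f"Linpack efficiency on 288 nodes: {linpack_gflops / peak_288_gflops:.0%}")  # roughly 46%

Under those assumptions the cluster delivers a bit under half of its theoretical peak on Linpack, a level of efficiency broadly typical of Ethernet-connected commodity clusters of that era.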
The Space Simulator has been used almost continuously for theoretical astrophysics simulations since it was built, and has spent much of the past year calculating the evolution of the Universe. The first results of that work were recently presented at a research conference in Italy by Los Alamos postdoctoral research associate Luis Teodoro. Further analysis of the simulations, in collaboration with Princeton University professor Uros Seljak, will soon be published in the prestigious journal Monthly Notices of the Royal Astronomical Society. In addition to simulating the structure and evolution of the Universe, the Space Simulator has been used to study the explosions of massive stars and to help understand the X-ray emission from the center of our galaxy.
The Space Simulator is actually the Laboratory’s third-generation Beowulf cluster. The first was Loki, constructed in 1996 from sixteen 200 MHz Pentium Pro processors. Loki was followed by the Avalon cluster, which consisted of 144 Alpha processors. The Space Simulator follows the same basic architecture as these earlier Beowulf machines, but it is the first to use gigabit Ethernet as the network fabric, and it requires significantly less space than a cluster built from typical desktop computers. The Space Simulator runs parallel N-body algorithms, which were originally designed for astrophysical applications involving gravitational interactions but have since been used to model more complex particle systems.
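To give a sense of what an N-body calculation involves, the sketch below shows the simplest possible version: a direct-summation gravitational step written in Python with NumPy. The production codes running on the Space Simulator are parallel tree-based algorithms that approximate these pairwise forces far more efficiently; the function names, softening length and time step here are purely illustrative assumptions, not the Laboratory's code.

# Minimal direct-summation gravitational N-body step (illustrative only).
# The Space Simulator runs far more sophisticated parallel tree codes; this
# O(N^2) sketch just shows the pairwise force law such codes approximate.
import numpy as np

G = 1.0           # gravitational constant in code units (assumed)
SOFTENING = 1e-3  # softening length to avoid singular forces at tiny separations

def accelerations(pos, mass):
    """Return the gravitational acceleration on each particle.

    pos  : (N, 3) array of particle positions
    mass : (N,) array of particle masses
    """
    # Pairwise separation vectors dx[i, j] = pos[j] - pos[i]
    dx = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    dist2 = np.sum(dx**2, axis=-1) + SOFTENING**2
    inv_r3 = dist2**-1.5
    np.fill_diagonal(inv_r3, 0.0)  # no self-interaction
    # a_i = G * sum_j m_j * (x_j - x_i) / (r_ij^2 + eps^2)^(3/2)
    return G * np.sum(dx * (mass[np.newaxis, :, None] * inv_r3[:, :, None]), axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """Advance the system one kick-drift-kick leapfrog step."""
    vel += 0.5 * dt * accelerations(pos, mass)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Example: 1,000 randomly placed unit-mass particles advanced one step.
rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(1000, 3))
vel = np.zeros_like(pos)
mass = np.ones(1000)
pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)

Each force evaluation in this direct approach costs O(N^2) operations; tree codes of the kind developed for these machines reduce that to roughly O(N log N), which is what makes large cosmological simulations feasible on a cluster of this size.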
In addition to Warren, the developers of the Space Simulator include Los Alamos staff members Chris Fryer and Patrick Goda. Los Alamos’ Laboratory-Directed Research and Development (LDRD) program provided funding for the Space Simulator research. LDRD funds basic and applied research and development focusing on employee-initiated creative proposals selected at the discretion of the Laboratory director.
More Information: http://www.lanl.gov/worldview