Published: 02.09.10
Science

Interdisciplinary look inside the Earth’s interior

The Swiss platform for High-Performance and High-Productivity Computing (HP2C) is the world’s first and only project with the aim of developing optimized scientific simulations for high-performance computers. Seismologists from ETH Zurich are also involved, namely in the «Petaquake» project.

Simone Ulmer
A snapshot of seismic wave propagation through the three-dimensional Earth 20 minutes after an earthquake. The hypothetical earthquake has been placed at the North Pole. Such simulations are numerically feasible, but for realistic 3D models at high resolution (more than 100 million grid points) they are possible only on supercomputers. To devise new Earth models, thousands of such simulations need to be run. (Image: Inst. of Geophysics/ETH Zürich)

What is the exact structure of the Earth’s interior? What are the processes that take place there? Where and how do earthquakes originate? These are some of the central questions concerning our planet that we have not yet been able to answer with certainty. A view into the Earth’s interior similar to computed tomography for a human being could provide these answers, thus helping to improve seismic risk maps. This would be an important basis for assessing the risk of the locations of nuclear power plants or hospitals in Switzerland for example.

Tomographic recording

Although today’s supercomputers can already perform a quadrillion calculation steps per second, the computers and the modelling programs are not yet sufficiently coordinated to compute high-resolution, complex images of the Earth’s interior in a reasonable period of time. Compared to the human body, the Earth is a difficult object to screen because earthquakes occur irregularly and seismographs are spread unevenly across the Earth’s surface. Also, unlike the X-rays used in computed tomography, seismic waves do not penetrate large portions of the Earth’s interior. Someday, however, the «Petaquake» project could provide an insight into the Earth similar to that into the human body. Scientists from the Institute of Geophysics at ETH Zurich under the supervision of Domenico Giardini, a professor at ETH, have joined forces with mathematicians and computer scientists from the University of Basel under the supervision of Professors Helmar Burkhart and Marcus Grote as well as PD Olaf Schenk to carry out this project.

«Petaquake» is one of the promising projects launched by the Swiss platform for High-Performance and High-Productivity Computing (HP2C) in 2009. The objective of the HP2C platform is to use interdisciplinary cooperation between hardware manufacturers, computer scientists, mathematicians and end users to develop, by as early as 2013, special methods and algorithms for high-performance computers that will allow complex simulations to be completed in just a few hours instead of months or even years.

Visualizing the Earth’s interior

The seismic waves triggered by an earthquake travel through the Earth at different speeds depending on the material they penetrate. Our knowledge of the structure of the Earth’s interior is based primarily on the modelling of these seismic waves, which is carried out on the basis of certain assumptions about the Earth’s interior. «We have gained many insights by assuming that seismic waves behave in the same way as optical waves», says Lapo Boschi, a senior assistant lecturer to Giardini. However, he says, this is only an approximation of how things work in practice. The aim of «Petaquake» is to compute high-resolution tomographic images of processes at scales of tens of kilometres. For Dr. Boschi, the most exciting aspect of the project is obtaining such images through improved algorithms that take adequate account of factors such as wave physics.
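The principle that wave speed depends on the material can be illustrated with a drastically simplified sketch: along a vertical ray through a layered Earth, the travel time is the sum of each layer’s thickness divided by its wave speed. The layer values below are toy placeholders, not a real Earth model and not the project’s method:

```python
# Illustrative sketch: travel time of a seismic wave through layered media.
# Layer thicknesses and velocities are hypothetical placeholders.

def travel_time(layers):
    """Sum thickness/velocity over the layers a vertical ray passes through."""
    return sum(thickness / velocity for thickness, velocity in layers)

# (thickness in km, P-wave velocity in km/s) -- toy values only
layers = [(35.0, 6.5), (630.0, 9.0), (2226.0, 12.0)]
t = travel_time(layers)
print(round(t, 1))  # total travel time in seconds for this toy model
```

Tomography inverts this relationship: from many observed travel times, it infers the velocities of the material the waves passed through.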

Together with the mathematicians and computer engineers, Professor Giardini’s group – which also includes the two senior assistant lecturers Tarje Nissen-Meyer and Luis Dalguer – developed numerical methods that describe both the propagation of the seismic waves (solution to the forward problem) and new tomographic models using real seismic data (solution to the inversion problem). «Taking into account the full wave character, resolving relevant geophysical processes with the help of supercomputers as well as the flood of high-quality new data all contribute to an exciting phase in modern seismology», says Nissen-Meyer. Among other things, he would like to use the new models to examine the dynamics of the Earth's mantle and magnetic field.
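The forward problem mentioned above, propagating waves through a given Earth model, can be sketched in drastically simplified form as a one-dimensional acoustic wave equation solved with an explicit finite-difference scheme. This is not the project’s actual 3D method; all parameters below are toy values:

```python
import numpy as np

# Minimal sketch of a "forward problem": the 1D wave equation
# u_tt = c^2 * u_xx, advanced with an explicit finite-difference scheme.
# Grid size, wave speed and source are illustrative placeholders.

nx, nt = 200, 400
dx, c = 1.0, 1.0
dt = 0.5 * dx / c            # respect the CFL stability condition
u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0             # point "source" in the middle of the domain

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]   # discrete second derivative
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next

print(u.shape)  # the simulated wavefield after nt time steps
```

The inverse problem then adjusts the model (here, the wave speed `c`) until such simulated wavefields match recorded seismograms.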

Safety of nuclear power plant locations

The new algorithms are also used in earthquake research, in particular for risk assessment. Luis Dalguer is using the new models to study the dynamics and behaviour of the rupture zones where earthquakes occur. For example, «Petaquake» is intended to provide a local, small-scale, three-dimensional resolution that shows in detail how rupture zones develop, where they are initiated, and why and when they end. The results should contribute to a model of the ground motion resulting from the rupture process. According to the researchers, this is relevant for assessing the safety of locations for nuclear power plants.

Since the evaluations of the PEGASOS project (probabilistic seismic hazard analysis project for locations for nuclear power plants in Switzerland, «Probabilistische Erdbebengefährdungsanalyse für KKW-Standorte in der Schweiz») were presented to the public in 2007, it has become clear that the seismic risk to nuclear power plants in Switzerland may have been underestimated. The assessment is based on extensive series of measurements carried out at locations around the world since the 1980s. These show that earthquake ground shaking could be stronger than previously assumed, even in areas in which only moderately large earthquakes are expected. This is why new methods for risk assessment are now focusing on the composition of the subsurface through which the seismic waves propagate: strong earthquakes near the rupture zone are simulated in order to reduce uncertainties in the assessments.

Supercomputers already opened up a whole new generation of earthquake research to seismologists around twenty years ago. «With the 3D-models, we are even able to visualize the mechanics and physics behind the propagation of the rupture zones», says Dalguer.

Key components for success

Marcus Grote and Olaf Schenk deliver the hardware-oriented algorithms for the project, which tell the computer how to carry out a simulation that is as close to reality as possible. Mathematical modelling is the «language» used for formal communication between all those involved. This is a language that captures the entire physics of the elastic wave equation. In order to obtain a realistic image, the scientists divide up the Earth using a flexible grid. Tetrahedra of different sizes make it possible to portray the space- and time-dependent wave propagation at defined intervals. They also had to develop a procedure that allows neighbouring tetrahedra of different sizes to use different time intervals as the waves travel through them. For Marcus Grote, both of these procedures are key to the success of the project. «The aim is now to optimize these components in such a way that they simulate the wave within a short period of time and satisfy the seismic wave equation exactly», says Schenk. If the simulated waves match the data recorded by the seismometers, the penetrated medium can be described and the Earth’s interior can be portrayed in a new three-dimensional model.
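The idea behind different time intervals for different element sizes (local time stepping) can be sketched with a toy update rule standing in for one explicit solver step: small elements take several sub-steps for every step of the large elements, so the global step is not dictated by the smallest tetrahedron. Everything below, the update rule, element sizes and sub-step counts, is a made-up illustration, not the project’s scheme:

```python
# Toy sketch of local time stepping. The "advance" function stands in
# for one explicit wave-equation update; rates and sizes are invented.

def advance(value, dt, rate=-0.1):
    """One explicit Euler step of a simple decay stand-in: v' = rate * v."""
    return value + rate * value * dt

coarse_dt = 0.1
elements = [
    {"size": "large", "value": 1.0, "substeps": 1},   # one step per coarse step
    {"size": "small", "value": 1.0, "substeps": 4},   # four sub-steps per coarse step
]

for _ in range(10):                       # ten coarse time steps
    for elem in elements:
        local_dt = coarse_dt / elem["substeps"]
        for _ in range(elem["substeps"]):
            elem["value"] = advance(elem["value"], local_dt)

# Both elements reach (nearly) the same state, but the small element
# took smaller, more frequent steps to stay stable.
print([round(e["value"], 4) for e in elements])
```

The design point is that stability constraints apply per element: only the small elements pay the cost of a small time step, instead of the whole mesh.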

Marcus Grote sees the project as a catalyst: «Normally it takes up to ten years from the time a new algorithm is presented at a conference until it is published and ultimately used by seismologists for example.» Grote and Schenk are specialized in numerical procedures for the simulation of wave phenomena on high-performance computers. In addition to their use in seismology, these can also be used in imaging procedures in medicine for example.

HP2C is already being imitated

The driving force behind HP2C is Thomas Schulthess, who is a professor at ETH and the Director of the Swiss National Supercomputing Centre (CSCS). Schulthess initiated HP2C together with Piero Martinoli, the President of Università della Svizzera italiana (University of Lugano, USI). Schulthess maintains that the success of the HP2C platform will be evident in three years’ time at the latest. He feels that the project offers many opportunities and poses many challenges. The main challenge, he says, is getting the programs to run on the computers. For the scientists, it is essential to work together with industry, which in turn has a great deal of interest in the HP2C project. According to Schulthess, if the scientific and mathematical questions are well formulated, the computers can be optimized accordingly. For Professor Schulthess, having the fastest computer in Switzerland is not the main priority. Instead he realizes: «Through the network of the HP2C platform, the CSCS and Switzerland will develop know-how that is not available anywhere else in the world.» The project has already attracted interest from the world’s largest industrial nations, the G8 states, and has inspired them to initiate a similar project.

About the project

In 2008, Thomas Schulthess outlined a project for the Rectors’ Conference of the Swiss Universities (CRUS) that aimed to step up cooperation between the CSCS and USI and to develop a structured approach to fundamental problems in supercomputing. The HP2C project was originally planned on a smaller scale with a budget of around 4 million Swiss francs. The CRUS suggested expanding the platform across Switzerland, and today it contributes roughly 14.5 million Swiss francs, around 80% of the project’s financing. HP2C is a central component of the Swiss National HPCN Strategy, financed by the Swiss Federal Government since 2009, and the world’s only network for interdisciplinary collaboration between these different interest groups.

Despite the tough requirements placed on the project by the Call for Proposals, 17 draft projects were submitted. Eight projects from the fields of fluid and molecular dynamics, astronomy, mathematics, computer engineering, medicine, biology and seismology made it straight through the review process carried out at USI by international experts. Three other projects followed after their applications had been revised.
The CSCS provides the computer science expertise needed for the project, while USI has the know-how in the field of applied mathematics. In addition, the CSCS is working together with industry to develop new computer architectures and the software environment on massive parallel computer architectures.
