
When the last massive earthquake shook the San Andreas Fault in 1906, causing fires that burned down most of San Francisco and leaving half the city's population homeless, no one would hear about "plate tectonics" for another 50 years, and the Richter scale was still a generation away. Needless to say, by today's standards, only primitive data survive to help engineers prepare southern California for an earthquake of similar magnitude.
"We haven't had a really big rupture since the city of Los Angeles existed," said Thomas Jordan, Southern California Earthquake Center (SCEC) director.
Scientists predict this is just the quiet before the storm for cities like San Francisco and Los Angeles, among other regions lining the San Andreas.
"We think the San Andreas Fault is locked and loaded, and we could face an earthquake of 7.5-magnitude or bigger in the future," Jordan said. "But the data accumulated from smaller earthquakes in southern California over the course of the last century is insufficient to predict the shaking associated with such large events."
To prepare California for the next "big one," SCEC researchers, including computational scientist Yifeng Cui of the University of California, San Diego and geophysicist Kim Olsen of San Diego State University, are using Titan, the world's most powerful supercomputer for open science research, to simulate earthquakes at high frequencies, producing the more detailed predictions that structural engineers need.
Titan, which is managed by the Oak Ridge Leadership Computing Facility (OLCF) at Oak Ridge National Laboratory (ORNL), is a 27-petaflop Cray XK7 machine with a hybrid CPU/GPU architecture. GPUs, or graphics processing units, are accelerators that can rapidly perform calculation-intensive work while CPUs carry out more complex commands. Titan's computational power enables users to produce simulations (comprising millions of interacting molecules, atoms, galaxies, or other systems difficult to manipulate in the lab) that are often the largest and most complex of their kind.
The SCEC's high-frequency earthquake simulations are no exception.
"It's a pioneering study," Olsen said, "because nobody has really managed to get to these higher frequencies using fully physics-based models."
Many earthquake studies hinge largely on historical and observational data, an approach that assumes future earthquakes will behave as they did in the past (even if the rupture site, the geological features, or the built environment is different).
"For example, there have been lots of earthquakes in Japan, so we have all this data from Japan, but analyzing this data is a difficult task because scientists and engineers preparing for earthquakes in California have to ask 'Is Japan the same as California?' The answer is in some ways yes, and in some ways no," Jordan said.
The physics-based model calculates wave propagation and ground motion radiating from the San Andreas Fault through a 3-D model approximating the Earth's crust. Essentially, the simulations unleash the laws of physics on the region's specific geological features to improve predictive accuracy.
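AWP-ODC, the production code discussed later in this article, solves the full 3-D anelastic wave equations on a staggered grid. As a toy illustration of the same idea, here is a minimal 1-D finite-difference wave-propagation sketch; every name and number in it is hypothetical (a wave speed of 3,000 m/s is merely a plausible crustal value, and the 20 m spacing echoes the resolution cited below).

```python
import numpy as np

def propagate(nx=400, nt=800, c=3000.0, dx=20.0):
    """Second-order finite-difference solve of the 1-D wave
    equation u_tt = c^2 u_xx (illustrative only)."""
    dt = 0.5 * dx / c                      # time step obeying the CFL stability limit
    u_prev = np.zeros(nx)                  # wavefield at the previous time step
    u = np.zeros(nx)
    u[nx // 2] = 1.0                       # impulsive "rupture" at the domain center
    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]   # discrete second derivative
        u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
        u_prev, u = u, u_next              # advance one time step
    return u

ground_motion = propagate()
```

The real simulations extend this stencil idea to three dimensions, add anelastic attenuation, and distribute the grid across thousands of GPUs, but the per-point update is conceptually this simple.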
Seismic wave frequency, which is measured in Hertz (cycles per second), is important to engineers who are designing buildings, bridges, and other infrastructure to withstand earthquake damage. Low-frequency waves, which cycle less than once per second (1 Hertz), are easier to model, and engineers have largely been able to build in preparation for the damage caused by this kind of shaking.
"Building structures are sensitive to different frequencies," Olsen said. "It's mostly the big structures like highway overpasses and high-rises that are sensitive to low-frequency shaking, but smaller structures like single-family homes are sensitive to higher frequencies, even up to 10 Hertz."
But high-frequency waves (in the 2–10 Hertz range) are more difficult to simulate than low-frequency waves, and there has been little information to give engineers on shaking up to 10 Hertz.
"The engineers have hit a wall as they try to reduce their uncertainty about how to prevent structural damage," Jordan said. "There are more concerns than just building damage there, too. If you have a lot of high-frequency shaking it can rip apart the pipes, electrical systems, and other infrastructure in hospitals, for example. Also, very rigid structures like nuclear power plants can be sensitive to higher frequencies."
A better understanding of the effects of high-frequency waves on critical facilities could inform disaster response in addition to structural engineering.
High-frequency waves are computationally more daunting because they move much faster through the ground. And in the case of the SCEC's simulations on Titan, the ground model is extremely detailed: it represents a chunk of terrain one-fifth the size of California (including a depth of 41 kilometers) at a spatial resolution of 20 meters. The ground models include detailed 3-D structural variations, from larger features such as sedimentary basins down to small-scale variations on the order of tens of meters, through which seismic waves must travel.
Along the San Andreas, the Earthâs surface is a mix of hard bedrock and pockets of clay and silt sands.
"The Los Angeles region, for example, sits on a big sedimentary basin that was formed over millions of years as rock eroded out of mountains and rivers, giving rise to a complex layered structure," Jordan said.
Soft ground like Los Angeles's sedimentary basin amplifies incoming waves, causing these areas to shake more, and for a longer period of time, than rocky ground, which means some areas farther from the rupture site could actually experience more infrastructure damage.
The entire simulation totaled 443 billion grid points. At every point, 28 variables, including different wave velocities, stress, and anelastic wave attenuation (how waves lose energy to heat as they move through the crust), were calculated.
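A back-of-the-envelope check shows where those 443 billion points come from. The 20-meter spacing and 41-kilometer depth are stated above; California's land area (roughly 424,000 km², so one-fifth is about 84,800 km²) is an outside figure used here only for illustration, and the 4-byte floating-point storage is an assumption:

```python
dx = 20.0                  # grid spacing in meters (from the article)
area_m2 = 84_800 * 1e6     # ~1/5 of California's area, in square meters (assumed)
depth_m = 41_000           # 41 km depth (from the article)

# Points per horizontal layer times number of layers in depth
points = (area_m2 / dx**2) * (depth_m / dx)
print(f"{points:.2e} grid points")      # on the order of 4.3e11, i.e. ~443 billion

# With 28 variables per point in assumed 4-byte floats, the state
# alone occupies tens of terabytes
state_bytes = points * 28 * 4
print(f"~{state_bytes / 1e12:.0f} TB of simulation state")
```

The estimate lands within a few percent of the published 443 billion figure, which is a useful sanity check on the stated domain size and resolution.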
"High-frequency ground motion modeling is a complex problem that requires a much larger scale of computation," Jordan said. "With the capabilities that we have on Titan, we can approach those higher frequencies."
Back in 2010, the SCEC team used the OLCF's 1.75-petaflop Cray XT5 Jaguar supercomputer to simulate a magnitude-8 earthquake along the San Andreas Fault. Those simulations peaked at 2 Hertz. At the time the Jaguar simulations were conducted, doubling the wave frequency would have required a 16-fold increase in computational power.
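That 16-fold figure follows from how grid resolution tracks frequency. Resolving waves of twice the frequency requires halving the grid spacing in each of the three spatial dimensions (2³ = 8 times more points), and the stability condition of explicit finite-difference schemes then also halves the time step (another factor of 2). A sketch of this scaling argument:

```python
def cost_factor(freq_ratio):
    """Rough compute-cost multiplier for raising the maximum
    resolved frequency by freq_ratio in a 3-D explicit
    finite-difference simulation."""
    spatial = freq_ratio ** 3   # finer grid in x, y, and z
    temporal = freq_ratio       # proportionally smaller time steps
    return spatial * temporal

print(cost_factor(2))           # 16: doubling the frequency
print(cost_factor(5))           # 625: going from 2 Hz to 10 Hz
```

By this fourth-power rule, the jump from the 2-Hertz Jaguar runs to 10 Hertz implies hundreds of times more computation, which is why a machine of Titan's scale was needed.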
But on Titan in 2013, the team was able to run simulations of a magnitude-7.2 earthquake up to their goal of 10 Hertz, which can better inform performance-based building design. By adapting their CPU code, the Anelastic Wave Propagation code by Olsen, Steven Day, and Cui (AWP-ODC), to run on GPUs, they achieved a significant speedup: the simulations ran 5.2 times faster than they would have on a comparable CPU machine without GPU accelerators.
"We redesigned the code to exploit high performance and throughput," Cui said. "We made some changes in the communications schema and reduced the communication required between the GPUs and CPUs, and that helped speed up the code."
The SCEC team anticipates that simulations on Titan will help improve its CyberShake platform, an ongoing sweep of millions of earthquake simulations that model many rupture sites across California.
"Our plan is to develop the GPU codes so the codes can be migrated to the CyberShake platform," Jordan said. "Overcoming the computational barriers associated with high frequencies is one way Titan is preparing for this progression."
Utilizing hybrid CPU/GPU machines in the future promises to substantially reduce the computational time required for each simulation, which would enable faster analyses and hazard assessments. And it is not only processor-hours that matter but real time as well. The 2010 San Andreas Fault simulations took 24 hours to run on Jaguar, but the higher frequency, higher resolution simulations took only five and a half hours on Titan.
And considering the "big one" could shake California anytime from the next few years to the next few decades, accelerating our understanding of the potential damage is crucial to SCEC researchers.
"We don't really know what happens in California during these massive events, since we haven't had one for more than 100 years," Jordan said. "And simulation is the best technique we have for learning and preparing."