How much can one trust the predictive capabilities of a computational simulation? If a politician or executive were going to use the results of a computer model for some critical decision, how well should we sleep at night?
To properly answer these questions, one must characterize and quantify the uncertainties inherent in the model and its predictions. The last decade has seen the emergence and growth of methods and procedures for uncertainty quantification in computer simulations. Techniques for calibrating models to available measurement data, propagating uncertainties from model inputs to model outputs, and accurately interpolating model predictions to untried input parameters are meant to instill confidence that a given simulation has some predictive capability.
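To make the propagation step concrete, the following is a minimal sketch of Monte Carlo uncertainty propagation. The model function, input distribution, and sample size are illustrative assumptions standing in for an expensive simulation, not specifics from the work described here.

```python
import numpy as np

# Hypothetical model: maps an uncertain scalar input (e.g., a material
# property) to a scalar quantity of interest. A stand-in for the
# expensive simulation discussed below.
def model(k):
    return np.exp(-k) * np.sin(k)

rng = np.random.default_rng(0)
# Assume the input is known only up to a distribution (illustrative).
k_samples = rng.normal(loc=1.0, scale=0.1, size=10_000)

# Propagate the input uncertainty through the model by sampling,
# then summarize the induced distribution of the output.
outputs = model(k_samples)
print(f"output mean = {outputs.mean():.4f}, std = {outputs.std():.4f}")
```

For a real simulation, each call to `model` would be a full run of the code, which is exactly what makes naive sampling so costly.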
Such characterizations of uncertainties typically require an ensemble of runs from the computer model, where each run is a variation of an incompletely characterized model input, e.g., material properties or boundary conditions. However, a single simulation can be extremely expensive computationally, often requiring extensive time on massive supercomputers. The cost of the simulations makes it difficult to study the effects of the parameters on model outputs; thorough sensitivity analysis, uncertainty quantification, and design optimization studies are often infeasible.
This presentation will examine methods for constructing cheap reduced-order models (ROMs) with input/output relationships that mimic the full physical simulation. These ROMs can be used in place of the expensive simulation to study the effects of the input parameters on the model outputs.
In particular, I will discuss our recent work on a non-intrusive, interpolating ROM. The essential idea behind the construction of the ROM is to run a few expensive simulations and to use their outputs to tune the parameters of the ROM. This tuning procedure involves a singular value decomposition (SVD) on the matrix of outputs from the expensive simulations, which is very large for highly resolved space and time discretizations. I will also discuss MapReduce-based methods for scaling the computation of the SVD to high-fidelity simulation codes.
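Here is a minimal sketch of one common non-intrusive construction along these lines: collect simulation outputs as columns of a snapshot matrix, take its SVD to obtain a low-dimensional output basis, and interpolate the basis coefficients across the input parameter. The toy model, training parameters, and cubic interpolation scheme are assumptions for illustration, not the exact method of the talk.

```python
import numpy as np
from scipy.interpolate import interp1d

# Hypothetical expensive simulation: returns a full solution field
# (here a 1-D spatial profile) for a scalar input parameter mu.
x = np.linspace(0.0, 1.0, 200)
def expensive_sim(mu):
    return np.sin(np.pi * mu * x) * np.exp(-mu * x)

# 1. Run a few expensive simulations at training parameters and
#    collect their outputs as columns of a snapshot matrix.
mus_train = np.linspace(0.5, 2.0, 8)
snapshots = np.column_stack([expensive_sim(mu) for mu in mus_train])

# 2. SVD of the snapshot matrix; the left singular vectors give a
#    low-dimensional basis for the outputs. For highly resolved
#    discretizations this matrix is tall and skinny, which is where
#    distributed (e.g., MapReduce/TSQR-style) SVD algorithms come in.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 4                      # number of basis vectors retained
Ur = U[:, :r]

# 3. Project each snapshot onto the basis and interpolate the
#    resulting coefficients as functions of the input parameter.
coeffs = Ur.T @ snapshots          # shape (r, n_train)
interp = interp1d(mus_train, coeffs, kind="cubic")

# 4. Cheap prediction at an untried parameter: interpolate the
#    coefficients and expand back onto the basis.
mu_new = 1.3
prediction = Ur @ interp(mu_new)
truth = expensive_sim(mu_new)
print("relative error:",
      np.linalg.norm(prediction - truth) / np.linalg.norm(truth))
```

Once the basis and interpolant are built, each prediction costs only a small matrix-vector product, so the ROM can be evaluated thousands of times for sensitivity or uncertainty studies at negligible cost.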
We will demonstrate the capabilities of the ROM using a two-dimensional dynamic rupture simulation of the magnitude 9.0 March 2011 Tohoku earthquake.