Roberts, Alan W. (University of Durham) | Hobbs, Richard W. (University of Durham) | Goldstein, Michael (University of Durham) | Moorkamp, Max (IFM-Geomar) | Heincke, Björn (IFM-Geomar) | Jegen, Marion (IFM-Geomar)
Where several different kinds of geophysical dataset have been acquired from a particular region, each can contain valuable information about the Earth that may not be present in the others. Jointly determining a common model therefore often gives a more thorough and better-constrained description of the Earth structure than considering each dataset individually. For example, a seismic velocity model is only weakly constrained by first-arrival seismic refraction data alone, but considering these data alongside magnetotelluric (MT) and gravity data can greatly improve the constraint (Jegen-Kulcsar et al., 2009). Strategies for joint inversion are therefore an active area of research. To date, most schemes for accomplishing this have been deterministic in nature. With a deterministic technique it is often conceptually difficult to include prior beliefs about the system being determined, uncertainties both in the measurements and in the relationships between the different physical quantities (velocity, resistivity, density), and the discrepancy between the model and the real Earth. Statistical strategies such as Markov chain Monte Carlo (MCMC) model searches exist for assessing this kind of problem, but the number of potentially computationally expensive forward model runs required to sample the whole model space effectively, and thus achieve a meaningful result, is normally prohibitively high (> 10^5), even for simple 1D models, so such schemes are not generally implemented. However, a technique known as emulation is used in various scientific fields, e.g. cosmology (Vernon and Goldstein, 2009), whereby computationally expensive forward modelling code (a simulator) is approximated by an uncertainty-calibrated, computationally cheap function. Here we apply emulation to the problem of stochastic joint model determination.
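To make the cost argument concrete, the sketch below counts forward-model calls in a plain Metropolis random walk over a single velocity parameter. The 1D travel-time "simulator", the Gaussian likelihood, and all names and numbers are hypothetical stand-ins, not the actual codes discussed here; the point is only that every proposal consumes one full simulator run, so even a short 1D chain already needs on the order of 10^4 runs.

```python
import numpy as np

# Hypothetical stand-in for an expensive forward model (a 3D simulator
# might take an hour per run); here, travel time (s) across a 5 km layer.
calls = 0
def forward(v):
    global calls
    calls += 1          # count every full simulator run
    return 5.0 / v

rng = np.random.default_rng(1)
z, sigma = 1.25, 0.05   # synthetic observation and noise level

def log_like(v):
    return -0.5 * ((z - forward(v)) / sigma) ** 2

# Plain Metropolis random walk: one simulator run per in-bounds proposal.
v = 4.5
ll = log_like(v)
n_steps = 10_000
for _ in range(n_steps):
    v_prop = v + rng.normal(0.0, 0.2)
    if not (2.0 < v_prop < 8.0):
        continue        # out-of-bounds proposal rejected without a run
    ll_prop = log_like(v_prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        v, ll = v_prop, ll_prop

print(calls)            # total forward runs consumed by one 1D chain
```

At an hour per forward run this chain alone would take over a year of compute, which is why the full MCMC route is rarely taken for 3D joint problems.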
The main obstacle to carrying out a full Bayesian search over all plausible model space is computational expense, because a single forward modelling step for a 3D model can take of order an hour or more. The purpose of an emulator is to approximate the output of the simulator for a given set of input parameters using a computationally cheap function. The strategy is to train the emulator on a number of full forward simulator runs, much as one would train a neural network. In training the emulator, however, we also calibrate how uncertain it is. This is important because we aim to use the emulator to reject implausible model space, and without knowledge of its uncertainty this task would be impossible. Our approach is to build an emulator for each of the datasets: travel time versus offset (t vs x) for refraction seismic, Re(Z) and Im(Z) as functions of frequency (w) for MT, and the gravity field (which in the 1D case is a single point). The aim is to predict these functions for a given set of input parameters, to within a calibrated uncertainty, over a range of model space that encompasses our target model. We use these emulators to identify and reject implausible models.
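The train-calibrate-reject strategy can be illustrated with a minimal sketch. This is not the authors' implementation: a quadratic fit stands in for the emulator, leave-one-out residuals stand in for its uncertainty calibration, and the 1/v "simulator", design points, and 3-sigma cutoff are all hypothetical choices for illustration.

```python
import numpy as np

# Toy "expensive simulator": travel time (s) across a 5 km layer as a
# function of velocity v (km/s). Hypothetical stand-in for illustration.
def simulator(v):
    return 5.0 / v

# 1. Train the emulator on a small number of full simulator runs.
v_train = np.linspace(2.0, 8.0, 8)     # design points spanning model space
t_train = simulator(v_train)           # the "expensive" runs

# Cheap surrogate: a quadratic fit (stands in for e.g. a Gaussian process).
coeffs = np.polyfit(v_train, t_train, 2)
def emulator_mean(v):
    return np.polyval(coeffs, v)

# 2. Calibrate the emulator's uncertainty via leave-one-out residuals:
#    refit without each design point and record the prediction error there.
loo_resid = []
for i in range(len(v_train)):
    mask = np.arange(len(v_train)) != i
    c = np.polyfit(v_train[mask], t_train[mask], 2)
    loo_resid.append(t_train[i] - np.polyval(c, v_train[i]))
em_var = np.var(loo_resid)             # emulator variance estimate

# 3. Reject implausible model space with the implausibility measure
#    I(v) = |z - E[f(v)]| / sqrt(em_var + obs_var), cutting at I > 3.
z, obs_var = simulator(4.0) + 0.01, 0.02 ** 2  # synthetic observation
v_cand = np.linspace(2.0, 8.0, 601)            # candidate models
I = np.abs(z - emulator_mean(v_cand)) / np.sqrt(em_var + obs_var)
plausible = v_cand[I < 3.0]
print(plausible.min(), plausible.max())  # surviving velocity range
```

Because the emulator's own variance appears in the denominator, a poorly trained emulator rejects less model space rather than rejecting the wrong space, which is why the uncertainty calibration in step 2 is essential.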