Collaborating Authors
Herrmann, Felix J.
The industry is experiencing significant changes due to artificial intelligence (AI) and the challenges of the energy transition. While some view these changes as threats, recent advances in AI offer unique opportunities, especially in the context of “digital twins” for subsurface monitoring and control.
- Geophysics > Seismic Surveying (1.00)
- Geophysics > Time-Lapse Surveying > Time-Lapse Seismic Surveying (0.49)
3D seismic survey design by maximizing the spectral gap
Zhang, Yijun (Georgia Institute of Technology) | Yin, Ziyi (Georgia Institute of Technology) | López, Oscar (Florida Atlantic University) | Siahkoohi, Ali (Rice University) | Louboutin, Mathias (Georgia Institute of Technology) | Herrmann, Felix J. (Georgia Institute of Technology)
The massive cost of 3D acquisition calls for methods that reduce the number of receivers by designing optimal receiver sampling masks. Recent studies on 2D seismic data showed that maximizing the spectral gap of the subsampling mask leads to better wavefield reconstruction. We extend this line of work with a simulation-free method that generates optimal 3D acquisition geometries by maximizing the spectral gap of the subsampling mask with a simulated annealing algorithm. Numerical experiments confirm that the proposed method improves on receiver sampling locations obtained by jittered sampling.
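The spectral-gap criterion and the simulated annealing search lend themselves to a compact sketch. Below is a minimal, illustrative version: the mask is simply reshaped into a matrix (the actual studies evaluate the gap in the midpoint-offset domain), the objective is the gap between the two leading singular values, and all sizes and schedule parameters are toy assumptions rather than the papers' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_gap(mask, shape):
    # Gap between the two leading singular values of the mask matrix
    # (a larger gap favors wavefield reconstruction). Assumption: a
    # plain reshape of the mask; the cited studies work in the
    # midpoint-offset domain instead.
    s = np.linalg.svd(mask.reshape(shape), compute_uv=False)
    return s[0] - s[1]

def anneal_mask(n_total=64, n_keep=16, shape=(8, 8), n_iter=2000,
                t0=1.0, cooling=0.995):
    # Random initial mask with exactly n_keep active receivers.
    mask = np.zeros(n_total)
    mask[rng.choice(n_total, n_keep, replace=False)] = 1.0
    cur = best = spectral_gap(mask, shape)
    best_mask = mask.copy()
    t = t0
    for _ in range(n_iter):
        # Propose swapping one active receiver with an inactive slot,
        # which keeps the receiver count fixed.
        on, off = np.flatnonzero(mask == 1), np.flatnonzero(mask == 0)
        i, j = rng.choice(on), rng.choice(off)
        mask[i], mask[j] = 0.0, 1.0
        new = spectral_gap(mask, shape)
        # Metropolis rule: keep improvements, occasionally keep
        # downhill moves early on to escape local optima.
        if new > cur or rng.random() < np.exp((new - cur) / t):
            cur = new
            if cur > best:
                best, best_mask = cur, mask.copy()
        else:
            mask[i], mask[j] = 1.0, 0.0  # revert rejected swap
        t *= cooling
    return best_mask, best

mask, gap = anneal_mask()
```

Because each proposal swaps exactly one active and one inactive location, the acquisition budget is preserved by construction and the only tuning left is the cooling schedule.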
Optimized time-lapse acquisition design via spectral gap ratio minimization
Zhang, Yijun (Georgia Institute of Technology) | Yin, Ziyi (Georgia Institute of Technology) | López, Oscar (Harbor Branch Oceanographic Institute) | Siahkoohi, Ali (Rice University) | Louboutin, Mathias (Georgia Institute of Technology) | Kumar, Rajiv (Schlumberger) | Herrmann, Felix J. (Georgia Institute of Technology)
ABSTRACT Modern-day reservoir management and monitoring of geologic carbon storage increasingly call for costly time-lapse seismic data collection. We demonstrate how techniques from graph theory can be used to optimize acquisition geometries for low-cost sparse 4D seismic data. Based on midpoint-offset-domain connectivity arguments, our algorithm automatically produces sparse nonreplicated time-lapse acquisition geometries that favor wavefield recovery.
- Geophysics > Time-Lapse Surveying > Time-Lapse Seismic Surveying (1.00)
- Geophysics > Seismic Surveying (1.00)
Abstract We present the Seismic Laboratory for Imaging and Modeling/Monitoring open-source software framework for computational geophysics and, more generally, inverse problems involving the wave equation (e.g., seismic and medical ultrasound), regularization with learned priors, and learned neural surrogates for multiphase flow simulations. By integrating multiple layers of abstraction, the software is designed to be both readable and scalable, allowing researchers to easily formulate problems in an abstract fashion while exploiting the latest developments in high-performance computing. The design principles and their benefits are illustrated and demonstrated by means of building a scalable prototype for permeability inversion from time-lapse crosswell seismic data, which, aside from coupling of wave physics and multiphase flow, involves machine learning.
- Europe (0.68)
- North America > United States (0.46)
- Reservoir Description and Dynamics > Reservoir Fluid Dynamics > Flow in porous media (1.00)
- Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)
- Data Science & Engineering Analytics > Information Management and Systems > Artificial intelligence (1.00)
ABSTRACT Bayesian inference for high-dimensional inverse problems is computationally costly and requires selecting a suitable prior distribution. Amortized variational inference addresses these challenges by pretraining a neural network that acts as a surrogate conditional distribution that approximates the posterior distribution not only for one instance of the observed data but also for the distribution of the data pertaining to a specific inverse problem. When fed previously unseen data, the neural network — in our case, a conditional normalizing flow — provides the posterior samples at virtually no cost. However, the accuracy of amortized variational inference relies on the availability of high-fidelity training data, which seldom exist in geophysical inverse problems because of the earth’s heterogeneous subsurface. In addition, the network is prone to errors if evaluated over data that are not drawn from the training data distribution. As such, we have aimed to increase the resilience of amortized variational inference in the presence of moderate data distribution shifts. We achieve this via a correction to the conditional normalizing flow’s latent distribution that improves the approximation to the posterior distribution for the data at hand. The correction involves relaxing the standard Gaussian assumption on the latent distribution and parameterizing it via a Gaussian distribution with an unknown mean and (diagonal) covariance. These unknowns are then estimated by minimizing the Kullback-Leibler divergence between the corrected and the (physics-based) true posterior distributions. Although our method is generic and applicable to other inverse problems, by means of a linearized seismic imaging example we find that our correction step improves the robustness of amortized variational inference with respect to changes in the number of seismic sources, noise variance, and shifts in the prior distribution.
This approach, given noisy seismic data simulated via the linearized Born modeling, provides a seismic image with limited artifacts and an assessment of its uncertainty at approximately the same cost as five reverse time migrations.
- Geophysics > Seismic Surveying > Seismic Processing > Seismic Migration (0.87)
- Geophysics > Seismic Surveying > Seismic Modeling > Velocity Modeling > Seismic Inversion (0.46)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
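The correction step in the abstract above, re-estimating the mean and diagonal covariance of the latent distribution by minimizing a Kullback-Leibler divergence, can be illustrated on a toy Gaussian problem. In this sketch the pretrained conditional normalizing flow is replaced by a fixed affine map, the target posterior is chosen so that an exact correction exists, and gradients are taken by finite differences; all of these are simplifying assumptions, not the paper's setup.

```python
import numpy as np

# Stand-in for a pretrained conditional normalizing flow: a fixed
# invertible affine map x = W z + b (conditioning on data suppressed).
W = np.array([[2.0, 0.5], [0.0, 1.0]])
b = np.array([1.0, -1.0])

# "True" posterior for the data at hand, chosen so an exact latent
# correction exists: N(m_true, W diag(lam) W^T).
lam = np.array([0.5, 2.0])
m_true = np.array([3.0, 0.0])
S_true = W @ np.diag(lam) @ W.T

def kl_to_target(mu, log_sig):
    # Pushforward of the corrected latent N(mu, diag(sig^2)) through
    # the affine flow, then the closed-form Gaussian KL to the target.
    m1 = W @ mu + b
    S1 = W @ np.diag(np.exp(2 * log_sig)) @ W.T
    S0inv = np.linalg.inv(S_true)
    diff = m_true - m1
    return 0.5 * (np.trace(S0inv @ S1) + diff @ S0inv @ diff - len(mu)
                  + np.log(np.linalg.det(S_true) / np.linalg.det(S1)))

def correct_latent(n_iter=500, lr=0.1, eps=1e-5):
    # Gradient descent on (mu, log_sig) via central finite differences;
    # a real implementation would use automatic differentiation and a
    # Monte Carlo estimate of the physics-based KL objective.
    theta = np.zeros(4)  # [mu_1, mu_2, log_sig_1, log_sig_2]
    obj = lambda th: kl_to_target(th[:2], th[2:])
    for _ in range(n_iter):
        g = np.zeros(4)
        for i in range(4):
            e = np.zeros(4)
            e[i] = eps
            g[i] = (obj(theta + e) - obj(theta - e)) / (2 * eps)
        theta -= lr * g
    return theta, obj(theta)

theta, kl_final = correct_latent()
kl_start = kl_to_target(np.zeros(2), np.zeros(2))  # standard Gaussian latent
```

The drop from `kl_start` to `kl_final` is the whole point of the correction: the amortized network stays frozen while a handful of latent parameters absorb the distribution shift.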
ABSTRACT Surface consistency forms the basis for short-wavelength statics estimation. When raypaths in the near surface diverge from a normal incidence or when the normal moveout (NMO) velocity is inaccurate, surface-consistent methods may fail to estimate accurate statics. Existing nonsurface-consistent techniques can be prone to errors due to the need to construct pilot traces or pick horizons while imposing additional computational costs. To overcome these limitations and correct for the surface- and nonsurface-consistent statics, we have developed a low-rank-based residual statics (LR-ReS) estimation and correction framework. The method makes use of the redundant nature of seismic data by using its low-rank structure in the midpoint-offset-frequency domain. Due to the near-surface effect, the low-rank structure is destroyed. Therefore, we estimate the statics by means of low-rank approximation and crosscorrelation. To alleviate the need for accurate rank selection for low-rank approximation and improved statics estimation, we implement the method in an iterative and multiscale fashion. Because the low-rank approximation deteriorates at high frequencies, we use its better performance at low frequencies and exploit the common statics among the different frequency bands. The LR-ReS estimation and correction can be applied to data without an NMO correction, which makes statics estimation independent of the NMO velocity errors. Consequently, it can reduce the multiple iterations of the NMO velocity estimation and short-wavelength statics correction commonly needed for conventional methods to improve their performance. Moreover, the LR-ReS estimation does not require windowing of a noise-free area containing aligned primaries or mute to avoid the NMO stretch effect, which enables statics correction of the wavefield of all offsets. 
To evaluate the performance of our method, we apply it to simulated data and a challenging field data set affected by complex weathering layers and noise; the results indicate a substantial improvement compared with conventional short-wavelength statics correction.
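The core of the LR-ReS idea, a low-rank approximation serving as a surrogate for pilot traces followed by crosscorrelation, can be sketched in a few lines. The example below is a deliberately simple stand-in: noise-free Gaussian "wavelets" with integer static shifts, a rank-1 approximation of the trace matrix, and a single time-domain pass rather than the iterative, multiscale, midpoint-offset-frequency-domain scheme of the abstract.

```python
import numpy as np

def lowrank_residual_statics(data, rank=1):
    # A rank-k approximation of the trace matrix (columns = traces)
    # acts as a set of pilot traces; crosscorrelating each original
    # trace with its low-rank counterpart yields a per-trace static.
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    pilot = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    nt = data.shape[0]
    shifts = []
    for j in range(data.shape[1]):
        xc = np.correlate(data[:, j], pilot[:, j], mode="full")
        shifts.append(np.argmax(xc) - (nt - 1))  # lag of the peak
    return np.array(shifts)

# Toy data: one Gaussian "wavelet" per trace, statically shifted.
nt, t0, sigma = 100, 50, 5.0
true_shifts = np.array([-3, -2, -1, 0, 1, 2, 3])
t = np.arange(nt)
data = np.stack([np.exp(-0.5 * ((t - t0 - s) / sigma) ** 2)
                 for s in true_shifts], axis=1)

est = lowrank_residual_statics(data)
```

Because the shifts destroy the low-rank structure, the rank-1 reconstruction reverts toward a common, unshifted waveform, so the crosscorrelation lag with each original trace recovers that trace's static.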
ABSTRACT We use techniques from Bayesian inference and deep neural networks to translate uncertainty in seismic imaging to uncertainty in tasks performed on the image, such as horizon tracking. Seismic imaging is an ill-posed inverse problem because of bandwidth and aperture limitations, which are hampered by the presence of noise and linearization errors. Many regularization methods, such as transform-domain sparsity promotion, have been designed to deal with the adverse effects of these errors; however, these methods run the risk of biasing the solution and do not provide information on uncertainty in the image space and how this uncertainty impacts certain tasks on the image. A systematic approach is developed to translate uncertainty due to noise in the data to the confidence intervals of automatically tracked horizons in the image. The uncertainty in the seismic image is characterized by a convolutional neural network (CNN) that is used to reparameterize the image. To assess these uncertainties, samples are drawn from the posterior distribution of the CNN weights. Compared with traditional priors, it is argued in the literature that these CNNs introduce a flexible inductive bias that is a surprisingly good fit for a diverse set of problems, including medical imaging, compressive sensing, and diffraction tomography. The method of stochastic gradient Langevin dynamics is used to sample from the posterior distribution. This method is designed to handle large-scale Bayesian inference problems with computationally expensive forward operators as in seismic imaging. Aside from offering a robust alternative to the maximum a posteriori estimate that is prone to overfitting, access to these samples allows us to translate uncertainty in the image, due to noise in the data, to uncertainty on the tracked horizons. For instance, it admits estimates for the pointwise standard deviation on the image and for confidence intervals on its automatically tracked horizons.
- Geophysics > Seismic Surveying > Seismic Modeling > Velocity Modeling > Seismic Inversion (0.46)
- Geophysics > Seismic Surveying > Seismic Processing > Seismic Migration (0.46)
- Health & Medicine (1.00)
- Energy > Oil & Gas > Upstream (1.00)
- Europe > Norway > North Sea > Central North Sea > South Viking Graben > PL 046 > Block 15/9 > Sleipner Field > Draupne Formation (0.99)
- Europe > Norway > North Sea > Central North Sea > South Viking Graben > PL 046 > Block 15/8 > Sleipner Field > Draupne Formation (0.99)
- Europe > Norway > North Sea > Central North Sea > South Viking Graben > PL 046 > Block 15/6 > Sleipner Field > Draupne Formation (0.99)
- Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)
- Data Science & Engineering Analytics > Information Management and Systems > Neural networks (1.00)
- Data Science & Engineering Analytics > Information Management and Systems > Artificial intelligence (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (1.00)
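The stochastic gradient Langevin dynamics sampler invoked in the abstract above can be sketched on a toy problem where the posterior is known in closed form. The Gaussian model, step size, batch size, and iteration counts below are illustrative assumptions, not the paper's imaging setup; the point is only the update rule, a minibatch gradient step on the log-posterior plus injected Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: posterior of the mean mu of N(mu, 1) data under a
# N(0, 10) prior -- the posterior is Gaussian and known in closed form.
n, prior_var = 200, 10.0
y = rng.normal(1.5, 1.0, size=n)
post_var = 1.0 / (n + 1.0 / prior_var)
post_mean = y.sum() * post_var

def sgld(n_steps=20000, batch=20, eps=1e-3):
    # Stochastic gradient Langevin dynamics: each step is half a
    # gradient step on a minibatch estimate of the log-posterior
    # gradient plus N(0, eps) noise, so the iterates sample the
    # posterior rather than collapse onto the MAP point.
    theta, samples = 0.0, []
    for k in range(n_steps):
        idx = rng.choice(n, batch, replace=False)
        grad = -theta / prior_var + (n / batch) * np.sum(y[idx] - theta)
        theta += 0.5 * eps * grad + np.sqrt(eps) * rng.normal()
        if k >= n_steps // 2:          # discard burn-in
            samples.append(theta)
    return np.array(samples)

samples = sgld()
```

The retained iterates approximate posterior samples, so pointwise statistics such as the standard deviation, the quantity propagated to horizon confidence intervals in the abstract, come directly from `samples`.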
ABSTRACT Many currently proposed seismic inversion techniques that focus on robustness with respect to the background model choice are not appropriate for large-scale 3D applications, and the methods that are computationally feasible for industrial problems, such as full-waveform inversion, are notoriously limited by convergence stagnation and require adequate starting models. We have developed a novel solution that is scalable and less sensitive to starting models or inaccurate parameters (such as anisotropy) that are typically kept fixed during inversion. It is based on a dual reformulation of the classic wavefield reconstruction inversion, whose empirical robustness with respect to these issues is well documented in the literature. Although the classic version is not suited to three dimensions because it leverages expensive frequency-domain solvers for the wave equation, our proposal allows the deployment of state-of-the-art time-domain finite-difference methods, and it is potentially sufficiently mature for industrial-scale problems.
- Geophysics > Seismic Surveying > Seismic Processing (1.00)
- Geophysics > Seismic Surveying > Seismic Modeling > Velocity Modeling > Seismic Inversion (1.00)
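For context, the classic wavefield reconstruction inversion referred to above is usually stated as a penalty formulation; the notation below is the one common in the literature, not necessarily the authors' exact discretization:

```latex
\min_{\mathbf{m},\,\mathbf{u}}\;\|P\mathbf{u}-\mathbf{d}\|_2^2
+\lambda^2\,\|A(\mathbf{m})\mathbf{u}-\mathbf{q}\|_2^2,
```

where \(A(\mathbf{m})\) is the discretized (frequency-domain) wave equation for model \(\mathbf{m}\), \(P\) restricts the wavefield \(\mathbf{u}\) to the receiver locations, \(\mathbf{q}\) is the source, and \(\mathbf{d}\) the observed data. For fixed \(\mathbf{m}\), solving for \(\mathbf{u}\) is a least-squares problem over an augmented Helmholtz system, the factorization-heavy step that makes the classic method impractical in 3D and that the dual reformulation sidesteps so time-domain finite differences can be used.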
Uncertainty quantification provides quantitative measures of the reliability of candidate solutions of ill-posed inverse problems. Due to their sequential nature, Monte Carlo sampling methods require large numbers of sampling steps for accurate Bayesian inference and are often computationally infeasible for large-scale inverse problems, such as seismic imaging. Our main contribution is a data-driven variational inference approach in which we train a normalizing flow (NF), a type of invertible neural network, capable of cheaply sampling the posterior distribution given previously unseen seismic data from neighboring surveys. To arrive at this result, we train the NF on pairs of low- and high-fidelity migrated images. In our numerical example, high-fidelity images are obtained from the Parihaka dataset, and low-fidelity images are derived from them by demigration, followed by the addition of noise and migration. During inference, given shot records from a new neighboring seismic survey, we first compute the reverse-time migration image. Next, by feeding this low-fidelity migrated image to the NF, we gain access to samples from the posterior distribution virtually for free. We use these samples to compute a high-fidelity image, including a first assessment of the image’s reliability. To our knowledge, this is the first attempt to train a conditional network on what we know from neighboring images to improve the current image and assess its reliability.
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.96)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.89)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.46)
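The amortized workflow in the abstract above, train once on low-/high-fidelity pairs and then sample the posterior for unseen data at negligible cost, can be caricatured in one dimension. A single affine conditional transform is the simplest possible normalizing flow, for which maximum likelihood reduces to linear regression; the linear model, the synthetic pairs, and all numbers below are illustrative assumptions, not the paper's network or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training pairs: y is a "low-fidelity" observation, x the matching
# "high-fidelity" quantity. Here x | y ~ N(2y + 1, 0.5^2) by design.
n = 5000
y_train = rng.normal(0.0, 1.0, n)
x_train = 2.0 * y_train + 1.0 + 0.5 * rng.normal(size=n)

# A single affine conditional transform x = mu(y) + sigma * z (with
# z ~ N(0, 1)) is the simplest normalizing flow; maximum likelihood
# reduces to linear regression for mu and the residual std for sigma.
A = np.stack([y_train, np.ones(n)], axis=1)
coef, *_ = np.linalg.lstsq(A, x_train, rcond=None)
sigma = np.std(x_train - A @ coef)

def sample_posterior(y_new, n_samples=10000):
    # Amortized inference: for previously unseen data y_new, each
    # posterior sample costs one affine map per latent draw.
    z = rng.normal(size=n_samples)
    return coef[0] * y_new + coef[1] + sigma * z

samples = sample_posterior(0.7)
```

All the expensive work happens once, at training time; conditioning on a new low-fidelity image afterward is just a forward pass, which is what makes the posterior samples "virtually free."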
Time-lapse seismic monitoring of carbon storage and sequestration is often challenging because the time-lapse signature of the growth of CO2 plumes is weak in amplitude and therefore difficult to detect seismically. This situation is compounded by the fact that the surveys are often coarsely sampled and not replicated to reduce costs. As a result, images obtained for different vintages (baseline and monitor surveys) often contain artifacts that may be attributed wrongly to time-lapse changes. To address these issues, we propose to invert the baseline and monitor surveys jointly. By using the joint recovery model, we exploit information shared between multiple time-lapse surveys. Contrary to other time-lapse methods, our approach does not rely on replicating the surveys to detect time-lapse changes. To illustrate this advantage, we present a numerical sensitivity study where CO2 is injected in a realistic synthetic model. This model is representative of the geology in the southeast of the North Sea, an area currently considered for carbon sequestration. Our example demonstrates that the joint recovery model improves the quality of time-lapse images allowing us to monitor the CO2 plume seismically.
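The joint recovery model referred to above is commonly written in the form popularized in the distributed-compressive-sensing literature; one common version (the shared component is sometimes weighted by a scaling factor) is:

```latex
\begin{bmatrix}\mathbf{b}_1\\ \mathbf{b}_2\end{bmatrix}
=
\begin{bmatrix}A_1 & A_1 & 0\\ A_2 & 0 & A_2\end{bmatrix}
\begin{bmatrix}\mathbf{z}_0\\ \mathbf{z}_1\\ \mathbf{z}_2\end{bmatrix},
\qquad
\mathbf{x}_j=\mathbf{z}_0+\mathbf{z}_j,\quad j=1,2,
```

where \(\mathbf{b}_1,\mathbf{b}_2\) are the baseline and monitor observations with their (nonreplicated) measurement operators \(A_1,A_2\), \(\mathbf{z}_0\) is the component shared between the vintages, and \(\mathbf{z}_1,\mathbf{z}_2\) are the innovations. Sparsity-promoting inversion of this single stacked system is what lets the surveys inform each other without being replicated: the shared component is observed by both operators, while the weak time-lapse signal is isolated in the innovations.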