Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, differences in seismic properties across vintages enable the detection of production-induced spatio-temporal changes in saturation properties and structure. The main objectives are to (1) identify bypassed pay zones in time-lapse seismic data for the deepwater Amberjack field, located in the Gulf of Mexico, (2) confirm the identified bypassed pay zones against reservoir simulation results, and (3) recommend well planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow incorporating seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). A petro-elastic model (PEM) embedded in the reservoir simulator is then used to combine spatial dry rock properties with saturation properties and compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. In the absence of the requisite dry rock properties for the PEM, a small data engine determines these missing properties using metaheuristic optimization. Specifically, two particle swarm optimization (PSO) applications, including an exterior penalty function (EPF), are modified, resulting in the development of nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry rock bulk modulus, shear modulus, and density) necessary for dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation. A graphics-based method was then devised to select the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference; its results are compared to unsupervised learning (k-means clustering and hierarchical clustering). From seismic data, bypassed pay locations can be identified and then confirmed in reservoir simulation after conducting a seismic-driven history match. Finally, infill wells are planned and modeled in the reservoir simulator.
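The unsupervised-learning comparison can be sketched with a plain 1-D k-means (NumPy only; not the paper's implementation). Two clusters are fit to P-impedance differences, and the midpoint between the cluster centers serves as the discrimination threshold; the magnitudes and signs below are invented purely for illustration.

```python
import numpy as np

def kmeans_1d(x, k=2, iters=50, seed=0):
    """Plain 1-D k-means; returns sorted cluster centers and labels."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    order = np.argsort(centers)          # relabel so 0 = lowest center
    return centers[order], np.argsort(order)[labels]

# Synthetic P-impedance differences: near-zero where nothing changed
# (non-reservoir, including bypassed pay), strongly negative where
# production altered the saturations.
rng = np.random.default_rng(1)
dip = np.concatenate([rng.normal(0.0, 0.05, 200),     # unswept / non-reservoir
                      rng.normal(-0.6, 0.05, 200)])   # produced reservoir
centers, labels = kmeans_1d(dip, k=2)
threshold = centers.mean()   # midpoint between the two cluster centers
```

With well-separated clusters the threshold is insensitive to initialization; in practice this clustering result would be cross-checked against the graphics-based threshold.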
An accurate estimation of viscosity is imperative for optimal production and transport design of hydrocarbon fluids. Precise and robust empirical correlation models are therefore in high demand. While numerous correlation models exist in the literature, most fail to predict oil viscosity accurately on unbiased data. This study aims to develop new and improved empirical viscosity correlations from available field measurements on the Norwegian Continental Shelf (NCS). The performance of the proposed models is then studied through a comparative analysis with published correlations from the literature.
New correlation models are developed for dead, gas-saturated, and undersaturated oils using Particle Swarm Optimization (PSO) and a Radial Basis Function Network (RBFN). The former is a population-based optimization algorithm that iteratively improves candidate solutions with respect to a specified objective function, while the latter is an artificial neural network that uses radial basis functions as activation functions.
The optimization algorithm was used to re-calculate the coefficients of established viscosity correlation expressions while maintaining their functional form. The results show that, for all three oil types, the modified correlation models agree better with the test data than both the established empirical correlations and the RBFN. The new correlations provide mean absolute percentage errors of 15.08%, 17.41%, and 3.35% for dead, saturated, and undersaturated oil viscosity, respectively. The high accuracy of the last correlation is linked to its input variables, as the undersaturated viscosity is a function of the saturated viscosity, which is presumed known.
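As a sketch of the coefficient re-fitting step, assuming a hypothetical two-parameter power-law form mu = a*T^b rather than the paper's actual correlation expressions: a minimal global-best PSO searches the coefficient box for the values minimizing the mean absolute percentage error against synthetic "measurements".

```python
import numpy as np

def pso(obj, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO minimizing obj over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)          # keep particles inside the box
        f = np.array([obj(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

# Hypothetical functional form and synthetic data, for illustration only.
T = np.linspace(40.0, 120.0, 20)       # temperature
mu_meas = 2.0e4 * T**-1.8              # synthetic dead-oil viscosity "data"
mape = lambda c: float(np.mean(np.abs(c[0] * T**c[1] - mu_meas) / mu_meas) * 100)
coeffs, err = pso(mape, bounds=[(1e3, 1e5), (-3.0, -1.0)])
```

The functional form stays fixed; only the coefficients move, mirroring the approach described above.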
These results support the conclusion that the proposed correlations are more consistent with measured viscosities on the NCS than the correlation models discussed from the literature.
We use the mixed-integer nonlinear optimization algorithms Particle Swarm Optimization and Mesh Adaptive Direct Search to optimize the design of seismic surveys. Because good subsurface illumination and low cost are conflicting goals, we apply a bi-objective optimization strategy that searches for the best options in both the illumination and cost senses while building a Pareto front; the front shows the trade-off between illumination and cost and allows the survey designer to choose the preferred balance between them. The Particle Swarm Optimization component is used to escape local minima, and the mixed-integer component handles the integer aspects of a seismic survey design, such as the numbers of receivers and sources.
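The mixed-integer aspect can be sketched by rounding the integer dimensions before each objective evaluation (MADS and the bi-objective machinery are not reproduced here; the objective below scalarizes a hypothetical illumination misfit and cost purely to show the integer handling).

```python
import numpy as np

def mi_pso(obj, bounds, int_dims, n=30, iters=150, w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO where the listed dimensions are rounded to integers before
    evaluation -- a common way to handle mixed-integer design variables."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T

    def repair(p):
        q = p.copy()
        q[int_dims] = np.round(q[int_dims])
        return q

    x = rng.uniform(lo, hi, (n, len(bounds)))
    v = np.zeros_like(x)
    pbest, pf = x.copy(), np.array([obj(repair(p)) for p in x])
    g = pbest[np.argmin(pf)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([obj(repair(p)) for p in x])
        imp = f < pf
        pbest[imp], pf[imp] = x[imp], f[imp]
        g = pbest[np.argmin(pf)].copy()
    return repair(g)

# Toy stand-in: integer receiver count plus a real source spacing (metres);
# illumination misfit falls with receivers while cost rises.
def design_obj(p):
    n_rec, dx = p
    return 100.0 / n_rec + 0.02 * n_rec + (dx - 25.0) ** 2

best = mi_pso(design_obj, bounds=[(10, 200), (5.0, 50.0)], int_dims=[0])
```

The particles themselves stay continuous; only the evaluated (and returned) designs are snapped to integers.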
Presentation Date: Monday, September 25, 2017
Start Time: 3:55 PM
Location: Exhibit Hall C/D
Presentation Type: POSTER
Olalekan, Fayemi (Key Laboratory of Shale Gas and Geoengineering, Institute of Geology and Geophysics, Beijing 100029, China) | Di, Qingyun (Key Laboratory of Shale Gas and Geoengineering, Institute of Geology and Geophysics, Beijing 100029, China)
This study introduces an improved implementation workflow of the centered-centered progressive PSO (IRCCPSO) inversion technique for multi-transient electromagnetic method (MTEM) full waveform inversion. A stabilizing functional introduces the constraint into the inversion algorithm; thus, the global best position is updated using a multi-objective functional. First, a 1D study using the conventional IRCCPSO technique is presented. A 2D inversion study over a buried resistive body model is then carried out using a limited search space. The inversion results represented the earth model well, confirming the effectiveness of the IRCCPSO technique as a geophysical tool for MTEM full waveform inversion.
Presentation Date: Wednesday, September 27, 2017
Start Time: 2:40 PM
Location: Exhibit Hall C, E-P Station 2
Presentation Type: EPOSTER
Seismic traveltime tomography is an optimization problem that requires large computational effort. Linearized techniques are therefore commonly used for their low computational cost, but these local optimization methods are likely to get trapped in a local minimum because they critically depend on the initial model. Common global optimization techniques such as the Genetic Algorithm (GA) or Simulated Annealing (SA), on the other hand, are insensitive to the initial model but are computationally expensive and require many controlling parameters. Particle Swarm Optimization (PSO) is a relatively new global optimization approach with few parameters that has shown excellent convergence rates and is straightforward to parallelize, allowing good distribution of the workload. However, while it can traverse several local minima of the evaluated misfit function, classical implementations of PSO can still get trapped in a local minimum at later iterations as particle inertia diminishes.
We propose a Competitive PSO (CPSO) that allows the "worst" particles to keep exploring the model parameter space and eventually find a better minimum. A tomography algorithm based on CPSO is successfully applied to a 3D synthetic case corresponding to a typical calibration-shot geometry in a hydraulic fracturing context.
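The paper's exact competition rule is not given above, but one simple reading of the idea, letting the worst particles keep exploring rather than collapse onto the swarm, is to relaunch a fraction of them at random positions every few iterations. A sketch on a standard multimodal test function:

```python
import numpy as np

def competitive_pso(obj, bounds, n=40, iters=300, w=0.7, c1=1.5, c2=1.5,
                    frac_worst=0.25, period=20, seed=0):
    """PSO in which the worst particles are periodically relaunched at
    random positions so they keep exploring late in the run."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, (n, len(bounds)))
    v = np.zeros_like(x)
    pbest, pf = x.copy(), np.array([obj(p) for p in x])
    g = pbest[np.argmin(pf)].copy()
    for t in range(iters):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([obj(p) for p in x])
        imp = f < pf
        pbest[imp], pf[imp] = x[imp], f[imp]
        g = pbest[np.argmin(pf)].copy()
        if t % period == period - 1:            # relaunch the worst particles
            worst = np.argsort(f)[-int(frac_worst * n):]
            x[worst] = rng.uniform(lo, hi, (len(worst), x.shape[1]))
            v[worst] = 0.0
    return g

# Rastrigin: many local minima, global minimum 0 at the origin.
rastrigin = lambda p: 10 * len(p) + np.sum(p**2 - 10 * np.cos(2 * np.pi * p))
best = competitive_pso(rastrigin, bounds=[(-5.12, 5.12)] * 2)
```

The relaunch counteracts the late-iteration stagnation described above, while the personal and global bests preserve the progress already made.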
Presentation Date: Tuesday, October 18, 2016
Start Time: 1:25 PM
Location: Lobby D/C
Presentation Type: POSTER
A novel particle swarm optimization (PSO) method for discrete parameters and its hybridization with multi-point geostatistics are presented. This stochastic algorithm is designed for complex geological models, which often require discrete facies modeling before continuous reservoir properties are simulated. We first develop a new PSO method for discrete parameters (Pro-DPSO) in which particles move in the probability mass function (pmf) space instead of the parameter space. Pro-DPSO is then hybridized with the single normal equation simulation algorithm (SNESIM), one of the popular multi-point geostatistics algorithms, to honor the prior geological features. This hybridized algorithm (Pro-DPSO-SNESIM) is evaluated on a synthetic seismic inversion example and compared with a Markov chain Monte Carlo (McMC) method. The results show that the new algorithm generates multiple optimized models with a convergence rate much faster than that of the McMC method.
In this paper, we combine a fast wave equation solver based on boundary integral methods with a global optimization method, Particle Swarm Optimization (PSO), to estimate an initial velocity model. Unlike finite difference methods that discretize the model space into pixels or voxels, our forward solver achieves significant computational savings by constraining the model space to a layered model with perturbations. The speed and reduced model space of the forward solver allow us to use global optimization methods, which typically require numerous function evaluations and are practical only with few unknown variables. Our technique does not require an initial guess of a velocity model and is robust to local minima, unlike the gradient descent frequently used for both initial velocity model estimation and full waveform inversion. We apply our inversion algorithm to several synthetic data sets and demonstrate how prior information can be used to greatly improve the inversion.
Most seismic processing techniques rely on an accurate velocity model to obtain meaningful results. Incorrect velocity models can hamper processing and lead to erroneous interpretation of seismic data. In particular, inversion techniques such as Full Waveform Inversion (FWI) are highly sensitive to the initial velocity. Without a good initial velocity estimate, FWI will converge to local minima with artifacts (Virieux and Operto, 2009).
Unlike global seismology, exploration- and regional-scale velocity models can be poorly constrained. Initial velocity models can be built using a variety of techniques such as travel-time tomography, NMO semblance analysis, and even full waveform inversion at very low frequencies (Woodward et al., 2008). Most methods for constructing an initial velocity model rely on expensive travel-time or wave equation solvers and use gradient-based approaches that are susceptible to local minima. Furthermore, travel-time tomography does not use the full wavefield and requires the picking of arrivals as well as long offsets. Frequently, noise levels are too high for full waveform inversion to be performed at frequencies where the problem is sufficiently convex, and thus it can be plagued by local minima (Pratt, 1999; Sirgue, 2006).
To combat the problem of local minima in geophysical inverse problems, researchers have applied global optimization techniques that are less susceptible to local minima (Sambridge and Mosegaard, 2002; Sen and Stoffa, 2013). Researchers also make use of different objective functions and regularization techniques (Burstedde and Ghattas, 2009; van Leeuwen and Herrmann, 2013) to make the inversion more convex, mitigating the need for global solvers.
In this paper we integrate a fast Helmholtz solver with the global optimization method Particle Swarm Optimization (PSO) to invert a velocity model without an initial model. We first give an overview of the field expansion method we use for quickly solving the Helmholtz equation and modify it to reduce artifacts and accurately simulate exploration scale data. We then describe two variations of the PSO algorithm for inverting a velocity model from an observed data set. Finally, we present the results of our inversion algorithm on synthetic data and demonstrate how prior information about the velocity model can be used to improve the inversion.
History matching of the numerical model was conducted in a heterogeneous sandstone reservoir where a hydrocarbon gas injection inter-well test had been implemented to confirm injectivity and the incremental oil effect prior to expansion to full-field operation.
To decrease the misfit between simulated and measured values, some of the model parameters were calibrated using a stochastic sampling algorithm, Particle Swarm Optimization (PSO). However, it was difficult to bring all the misfit components of a pair of production wells to good matching quality simultaneously. Both wells observed gas breakthrough, but they showed a trade-off in matching accuracy between one component and the others.
To overcome this difficulty with the conventional single-objective optimization method, PSO with the principle of multi-objective optimization (MOPSO) was applied. Multi-objective optimization algorithms are useful for finding many "Pareto solutions": models that are not dominated by any other model across the set of objective functions. Engineers can therefore compare multiple history-matched models among the "Pareto solutions". The aim of this study is to demonstrate the MOPSO method for history matching of a reservoir model of a real field.
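The definition of "Pareto solutions" translates directly into a non-dominated filter over per-model misfit vectors; the numbers below are hypothetical (e.g., GOR and water-cut misfits for five candidate models).

```python
import numpy as np

def pareto_solutions(misfits):
    """Indices of non-dominated models (minimization): a model is dominated
    if some other model is at least as good in every objective and strictly
    better in at least one."""
    m = np.asarray(misfits, float)
    front = []
    for i, row in enumerate(m):
        dominated = np.any(np.all(m <= row, axis=1) & np.any(m < row, axis=1))
        if not dominated:
            front.append(i)
    return front

# Hypothetical (GOR misfit, water-cut misfit) pairs for five models.
misfits = [(3.0, 1.0),   # good water cut, poor GOR
           (1.0, 3.0),   # good GOR, poor water cut
           (2.0, 2.0),   # balanced
           (4.0, 4.0),   # dominated by every other model
           (2.5, 1.5)]
front = pareto_solutions(misfits)   # model 3 is excluded
```

The surviving models are exactly the candidates an engineer would compare when choosing among history-matched alternatives.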
MOPSO found many good models among the "Pareto solutions", including some in which the GOR matching accuracy of a well was improved dramatically, which had been difficult with single-objective PSO. A key decision was which objective functions to group into a set and which to handle individually. Analysis of each model in the "Pareto solutions" also provided understanding of the relationship between the matching accuracy of individual wells and the model parameters.
Throughout this study, MOPSO was found to be an effective methodology, especially for difficult history matching problems. Multi-objective optimization algorithms have high potential for application to various types of optimization studies, including the selection of reservoir development scenarios.
A variety of optimization algorithms have been applied to parameter sampling in automated history matching of numerical reservoir models to make that process easier and more efficient. Parameter values should be sampled so as to find the global optimum in the parameter space quickly while avoiding convergence to local optima. This paper introduces an application case of one stochastic sampling algorithm, the Particle Swarm Optimization (PSO) algorithm (e.g., Mohamed et al., 2009; Okano, 2013).
Several applications of the PSO algorithm have been reported in the petroleum industry. For example, Mohamed et al. (2009) investigated the efficiency of three stochastic sampling algorithms, the Hamiltonian Monte Carlo (HMC) algorithm, the PSO algorithm, and the Neighbourhood Algorithm (NA), and found that PSO tended to concentrate sampling in the low-misfit regions and obtained a good history match in fewer iterations than NA for their example case, although the behavior of these algorithms was likely to depend on the algorithm parameter settings.
Effective reservoir management relies heavily on the accurate prediction of reservoir flow performance over the entire life of the reservoir. Accurate prediction of flow performance is possible only if reservoir flow parameters are known with reasonable accuracy. Consequently, a major challenge in reservoir management is the estimation of reservoir parameters that affect the flow and distribution of reservoir fluids the most.
Estimation of distributed reservoir parameters has mainly been done using gradient-based optimization algorithms because they are faster than global (stochastic) optimization algorithms. However, gradient-based algorithms are local in nature and thus limited in their search ability. In this paper, we present a local-global optimization method that generates multiple realizations of reservoir parameters at a coarse scale. The method uses a local search optimization algorithm to parameterize the model space at a coarse scale, followed by a stochastic search for better estimates in the vicinity of the local estimate. At the end of the search, the method produces a distribution of estimates that can be used for uncertainty quantification. To test its effectiveness, the local-global optimization algorithm was applied to a sample reservoir with a known distributed permeability field. The results indicate that the method produces multiple history-matched realizations of the permeability field, some of which are closer to the true permeability distribution than the estimate obtained from an exhaustive local search.
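A minimal sketch of the local-global idea under a toy linear forward model (a stand-in for the reservoir simulator): a gradient-free local search produces a coarse estimate, then stochastic sampling around it retains the realizations whose misfit stays small, giving a distribution of estimates for uncertainty quantification.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((6, 3))                 # toy linear forward model
k_true = np.array([2.0, 0.5, 1.5])     # "true" coarse-scale parameters
d_obs = A @ k_true                     # observed production data
misfit = lambda k: float(np.sum((A @ k - d_obs) ** 2))

# Stage 1: gradient-free local (compass) search from a fixed start;
# the step shrinks each sweep so the search settles near the optimum.
k, step = np.ones(3), 0.5
for _ in range(100):
    for i in range(3):
        for cand in (k[i] - step, k[i] + step):
            trial = k.copy()
            trial[i] = cand
            if misfit(trial) < misfit(k):
                k = trial
    step *= 0.95

# Stage 2: stochastic search around the local estimate; the accepted
# samples form a distribution of history-matched realizations.
samples = k + rng.normal(0.0, 0.2, size=(500, 3))
accepted = samples[[misfit(s) < 5e-2 for s in samples]]
```

The spread of `accepted` around the local estimate is what supports the uncertainty quantification described above.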