In the complete paper, the authors present a novel methodology to model interwell connectivity in mature waterfloods and to achieve an improved reservoir-energy distribution and sweep pattern that maximize production performance by adjusting the injection and production strategy at the well-control level. A Drilling Advisory System (DAS) is a rig-based drilling-surveillance and -optimization platform that encourages regular drilloff tests, carefully monitors drilling performance, and provides recommendations for controllable drilling parameters to help improve the overall drilling process. This paper proposes a framework based on proxies and rejection sampling (filtering) to perform multiple history-matching runs with a manageable number of reservoir simulations.
A challenging problem in automated history-matching work flows is ensuring that, after updates are applied to previous models, the resulting history-matched models remain geologically consistent. To enhance the applicability of localization to various history-matching problems, the authors adopt an adaptive localization scheme that exploits the correlations between model variables and observations. This paper presents a novel approach to generating approximate conditional realizations using the distributed Gauss-Newton (DGN) method together with a multiple-local-Gaussian-approximation technique. This work presents a systematic and rigorous approach of reservoir decomposition combined with the ensemble Kalman smoother to overcome the complexity and computational burden associated with history matching field-scale reservoirs in the Middle East. This paper presents a comparison of existing work flows and introduces a practically driven approach, referred to as “drill and learn,” that uses elements and concepts from existing work flows to quantify the value of learning (VOL).
Proper characterization of heterogeneous rock properties and hydraulic-fracture parameters is essential for optimizing well spacing and for reliable estimation of EUR in unconventional reservoirs. High-resolution characterization of matrix properties and complex fracture parameters requires efficient history matching of well production and pressure response. We propose a novel reservoir-model parameterization method to reduce the number of unknowns, regularize the ill-posed problem, and enhance the efficiency of history matching of unconventional reservoirs.
Our proposed method makes a low-rank approximation of the spatial distribution of reservoir properties, taking into account the differing model resolution of the matrix and the hydraulic fractures. Typically, hydraulic fractures are represented at much higher resolution than the matrix through local grid refinements. In our approach, the spatial property distribution of both matrix and fractures is represented by a few parameters via a linear transformation with multiresolution basis functions. The parameters in the transform domain are then updated during model calibration, substantially reducing the number of unknowns. The multiresolution basis functions are constructed by eigendecomposition of an adaptively coarsened grid Laplacian corresponding to the data resolution. Maintaining high property resolution in the area of interest through adaptive resolution control, while keeping the original grid structure, improves the quality of the match, reduces simulation runtime, and increases the overall efficiency of history matching.
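The parameterization idea can be sketched in one dimension, where the eigenvectors of the grid Laplacian are known in closed form (the discrete cosine basis). The sketch below is illustrative only, assuming a uniform 1D grid rather than the adaptively coarsened grid described above; all function names are hypothetical:

```python
import math

def laplacian_eigvec(n, k):
    """k-th eigenvector of the 1D path-graph Laplacian (a DCT basis vector);
    small k gives smooth, large-scale basis functions."""
    if k == 0:
        return [1.0 / math.sqrt(n)] * n
    c = math.sqrt(2.0 / n)
    return [c * math.cos(math.pi * k * (i + 0.5) / n) for i in range(n)]

def reconstruct(theta, n):
    """Linear transform: map a few transform-domain parameters theta
    back to an n-cell property field, m = sum_k theta_k * v_k."""
    field = [0.0] * n
    for k, t in enumerate(theta):
        v = laplacian_eigvec(n, k)
        for i in range(n):
            field[i] += t * v[i]
    return field

# A 50-cell grid described by only 4 coefficients: these 4 numbers are the
# unknowns updated during history matching instead of 50 cell values.
n = 50
theta = [2.0, 0.8, -0.3, 0.1]
m = reconstruct(theta, n)
```

Because the basis vectors are orthonormal and ordered from smooth to oscillatory, truncating to the leading few acts as both a dimension reduction and a regularization of the ill-posed inverse problem.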
We demonstrate the power and efficacy of our method using synthetic and field examples. First, we illustrate the effectiveness of the proposed multiresolution parameterization by comparing it with a traditional method. For the field application, an unconventional tight oil reservoir model with a multistage hydraulically fractured well is calibrated using bottomhole-pressure and water-cut history data. The hydraulic fractures, as well as the stimulated reservoir volume (SRV) near the well, are represented with higher grid resolution. In addition to matrix and fracture properties, the extents of the SRV and the hydraulic fractures are also adjusted through history matching using a multiobjective genetic algorithm. The calibrated ensemble of models is used to obtain bounds on the production forecast.
Our proposed method is designed to calibrate reservoir and fracture properties at higher resolution in regions that have better data resolution and higher sensitivity to the well-performance data, such as the SRV region and the hydraulic fractures. This leads to a fast and efficient history-matching workflow and enables optimal development/completion planning within a reasonable time frame.
Tripoppoom, Sutthaporn (The University of Texas at Austin / PTT Exploration and Production PLC) | Yu, Wei (The University of Texas at Austin) | Sepehrnoori, Kamy (The University of Texas at Austin) | Miao, Jijun (The University of Texas at Austin / Sim Tech LLC)
Production data, which are always available at no additional cost, can provide invaluable information about fracture geometry and reservoir properties. In unconventional reservoirs, however, a single realization is insufficient to characterize hydraulic-fracture geometry and reservoir properties because it cannot capture the nonuniqueness of the history-matching problem and the subsurface uncertainties. Therefore, the objective of this study is to obtain multiple realizations in shale reservoirs by adopting Assisted History Matching (AHM).
We used a Neural Network-Markov Chain Monte Carlo (NN-MCMC) algorithm in the proposed AHM workflow for shale reservoirs. MCMC, a Bayesian AHM method, has the benefit of quantifying uncertainty without bias and without becoming trapped in local minima. Coupling MCMC with a neural network (NN) as a proxy model removes the limitation of the infeasible number of simulation runs required by a traditional MCMC algorithm. The proposed AHM workflow also exploits the Embedded Discrete Fracture Model (EDFM), which models fractures with higher computational efficiency than the traditional local grid refinement (LGR) method and with more accuracy than the continuum approach.
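A minimal sketch of the proxy-accelerated MCMC idea, assuming a single uncertain parameter and a stand-in analytic misfit in place of the trained neural-network proxy (the names and the misfit function are hypothetical, not the authors' implementation):

```python
import math
import random

def proxy_misfit(x):
    """Stand-in for the trained NN proxy: predicts the history-match misfit
    for a parameter value x without running a reservoir simulation.
    (Illustrative double-well shape with two equally good matches at x = +/-1.)"""
    return (x ** 2 - 1.0) ** 2

def mcmc(n_steps, step=0.3, seed=7):
    """Metropolis-Hastings chain over one uncertain parameter.
    Likelihood ~ exp(-misfit); every evaluation calls the cheap proxy,
    which is what makes long chains affordable."""
    rng = random.Random(seed)
    x = 0.0
    chain = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step)
        log_alpha = proxy_misfit(x) - proxy_misfit(x_new)
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = x_new                      # accept the proposed move
        chain.append(x)
    return chain

chain = mcmc(5000)
```

The chain visits both misfit basins rather than settling into one, which is the property that yields multiple realizations and posterior distributions instead of a single best match.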
We applied the proposed AHM workflow to an actual shale-gas well. We performed history matching on two cases: hydraulic fractures only, and hydraulic fractures with natural fractures. The uncertain parameters for history matching consist of fracture geometry, fracture conductivity, matrix permeability, matrix and fracture water saturation, and relative permeability curves. For the case with natural fractures, we included the number, length, and conductivity of natural fractures as additional uncertain parameters.
We found that, in this case, the NN-MCMC algorithm located the history-matching solutions after around 30% of the total number of simulation runs. We also obtained the posterior distribution of each fracture parameter and reservoir property for both cases. Moreover, we found that the presence of natural fractures affects the posterior distributions: we observed significantly lower fracture height, lower fracture conductivity, and higher fracture water saturation than in the case without natural fractures, because natural fractures enhance fluid flow. Lastly, the proposed AHM workflow using the NN-MCMC algorithm can characterize fracture geometry, reservoir properties, and natural fractures in a probabilistic manner. These multiple realizations can be further used for probabilistic production forecasting, improvement of future fracturing designs, and infill-well-placement decisions.
It is observed across the global oil and gas industry that all projects are based on specific production forecasts to provide volumes and ensure continuity in the exploration and production business. The majority of these forecasts fall short or underperform relative to promises made to the investment community and various stakeholders. There is evidence that industry expectations are usually higher than historical delivery. The onus lies with the exploration, development, and production groups tasked with providing this information to use all tools and methods at their disposal and to make realistic corrections during the forecasting process. It is also important to note that a single model will not perfectly represent reality, but multiple paths can be relied on to represent a realistic range of outcomes.
Integration of time-lapse seismic data into a dynamic reservoir model is an efficient way to calibrate reservoir-parameter updates. The choice of the metric that measures the misfit between the observed data and the simulated model has a considerable effect on the history-matching process, and hence on the optimal ensemble of models obtained. History matching using 4D seismic and production data simultaneously remains a challenge because of the different natures of the two data types (time series vs. maps or volumes).
Conventionally, the misfit is formulated as a least-squares function, which is widely used for production-data matching. Distance-based objective functions designed for 4D image comparison have been explored in recent years and have proven reliable. This study explores the history-matching process by introducing a merged objective function that combines the production and 4D seismic data. The approach proposed in this paper makes these two types of data (well and seismic) comparable within a single objective function to be optimised, thereby avoiding the question of weights. An adaptive evolutionary optimisation algorithm is used for the history-matching loop. Local and global reservoir parameters, including porosity, permeability, net-to-gross, and fault transmissibility, are perturbed in this process.
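One simple way to make heterogeneous data types comparable without hand-tuned weights is to normalize each least-squares term by a reference magnitude, so that both contribute on a dimensionless, O(1) scale. The sketch below illustrates that normalization idea only; the paper's actual distance metric for 4D maps and its normalization choice may differ, and all names are hypothetical:

```python
def normalized_misfit(simulated, observed, reference):
    """Least-squares misfit divided by a reference magnitude so that
    data with different units and counts become dimensionless."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed)) / reference

def merged_objective(sim_rates, obs_rates, sim_maps, obs_maps):
    """Single objective combining well (time-series) and seismic (map) misfits.
    Reference here is the observed-data energy of each type, so each term is
    already scaled and no explicit weighting factors are needed."""
    ref_prod = sum(o ** 2 for o in obs_rates) or 1.0
    ref_seis = sum(o ** 2 for o in obs_maps) or 1.0
    f_prod = normalized_misfit(sim_rates, obs_rates, ref_prod)
    f_seis = normalized_misfit(sim_maps, obs_maps, ref_seis)
    return f_prod + f_seis          # one scalar for the optimiser to minimise

# Hypothetical example: two rate samples and three map pixels.
obj = merged_objective([98, 105], [100, 102], [0.21, 0.18, 0.30], [0.20, 0.20, 0.28])
```

With both terms dimensionless, the evolutionary optimiser can treat the merged value as a single fitness without deciding how many "barrels" a seismic-amplitude error is worth.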
This combined production and seismic history matching has been applied to a UKCS field; the results show that an acceptable production-data match is achieved while honouring the saturation information obtained from 4D seismic surveys.
Ahmadinia, Masoud (Centre for Fluid and Complex Systems, Coventry University) | Shariatipour, Seyed (Centre for Fluid and Complex Systems, Coventry University) | Andersen, Odd (SINTEF Digital, Mathematics and Cybernetics) | Sadri, Mahdi (Centre for Fluid and Complex Systems, Coventry University)
To improve a reservoir simulation model, uncertain parameters such as porosity and permeability in the reservoir rock strata need to be adjusted to match the simulated production data with the actual production data. This process is known as History Matching (HM). In geological CO2 storage, which is being promoted for use in depleted hydrocarbon reservoirs and saline aquifers, CO2 tends to migrate upwards and accumulate as a separate plume in the zone immediately beneath the reservoir caprock. Caprock morphology is therefore of considerable importance with respect to storage safety and migration prediction for long-term CO2 storage. Moreover, small-scale caprock irregularities, which are not captured by seismic surveys, can be one source of error when matching the observed CO2 plume migration with numerical modelling results (e.g., Sleipner). We therefore study the impact of uncertainties in slope and rugosity (small-scale caprock irregularities not captured by seismic surveys) on plume migration, using a history-matching process. We defined 10 cases with different initial guesses to reproduce the caprock properties representing an observed plume shape. The results showed a reasonable match between the horizontal plume shapes of the calibrated and observed models, with an average error of 2.95%.
Liu, Guoxiang (Baker Hughes a GE Company) | Stephenson, Hayley (Baker Hughes a GE Company) | Shahkarami, Alireza (Baker Hughes a GE Company) | Murrell, Glen (Baker Hughes a GE Company) | Klenner, Robert (Energy & Environmental Research Center, University of North Dakota) | Iyer, Naresh (GE Global Research) | Barr, Brian (GE Global Research) | Virani, Nurali (GE Global Research)
Optimization problems, such as optimal well spacing or completion design, can be resolved rapidly via surrogate proxy models, and these models can be built using either data-based or physics-based methods. Each approach has its strengths and weaknesses with respect to management of uncertainty, data quality, and validation. This paper explores how data- and physics-based proxy models can be used together in a workflow that combines the strengths of each approach and delivers an improved representation of the overall system. The paper presents use cases that display reduced simulation computational costs and/or reduced uncertainty in model outcomes. A Bayesian calibration technique is used to improve predictability by combining numerical simulations with data regressions. Discrepancies between observations and surrogate outcomes are then used to calibrate the model, improve prediction quality, and further reduce uncertainty. Furthermore, Gaussian process regression is used to locate global minima/maxima with a minimal number of samples. To demonstrate the methodology, a reservoir model involving two wells in a drill space unit (DSU) in the Bakken Formation was constructed using publicly available data. This reservoir model was tuned by history matching the production data for the two wells. A data-based regression model was constructed with machine-learning techniques using the same dataset. Both models were coupled into a hybrid model to test the proposed process of data and physics coupling for completion optimization and uncertainty reduction. Subsequently, a Gaussian process model was used to explore optimization scenarios outside the data region of confidence and to exploit the hybrid model to further reduce uncertainty and improve prediction. Overall, both the computation time to identify optimal completion scenarios and the uncertainty were reduced.
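The discrepancy-calibration step can be illustrated in its simplest form: fit a data-driven model to the residuals between observations and the physics-based surrogate, then add the learned correction back for hybrid predictions. This is a deliberately reduced sketch, assuming a linear discrepancy and a hypothetical one-parameter surrogate response, not the paper's actual Bayesian machinery:

```python
def physics_surrogate(x):
    """Stand-in for the history-matched physics-based proxy
    (hypothetical square-root production response)."""
    return 10.0 * x ** 0.5

def fit_linear_discrepancy(xs, residuals):
    """Ordinary least-squares line through (x, observed - surrogate) pairs:
    the data-driven part of the hybrid model."""
    n = len(xs)
    mx = sum(xs) / n
    mr = sum(residuals) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (r - mr) for x, r in zip(xs, residuals)) / sxx
    return slope, mr - slope * mx

# Hypothetical observed well performance deviating from the physics proxy.
xs = [1.0, 2.0, 3.0, 4.0]
obs = [10.6, 15.3, 18.4, 21.2]
res = [o - physics_surrogate(x) for o, x in zip(obs, xs)]
slope, intercept = fit_linear_discrepancy(xs, res)

def hybrid_predict(x):
    """Physics prediction plus the learned discrepancy correction."""
    return physics_surrogate(x) + slope * x + intercept
```

The hybrid keeps the physics model's extrapolation structure while the regression absorbs the systematic bias the physics model cannot explain, which is the intuition behind coupling the two proxy types.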
This technique creates a robust framework to improve operational efficiency and drive completion optimization in an optimal timeframe. The hybrid modeling workflow has also been piloted in other applications such as completion design, well placement and optimization, parent-child well interference analysis, and well performance analysis.
Gao, Guohua (Shell Global Solutions (US)) | Vink, Jeroen C. (Shell Global Solutions International) | Chen, Chaohui (Shell International Exploration and Production) | Araujo, Mariela (Shell Global Solutions (US)) | Ramirez, Benjamin A. (Shell International Exploration and Production) | Jennings, James W. (Shell International Exploration and Production) | El Khamra, Yaakoub (Shell Global Solutions (US)) | Ita, Joel (Shell Global Solutions (US))
Uncertainty quantification of production forecasts is crucially important for business planning of hydrocarbon-field developments. This is still a very challenging task, especially when subsurface uncertainties must be conditioned to production data. Many different approaches have been proposed, each with their strengths and weaknesses. In this work, we develop a robust uncertainty-quantification work flow by seamless integration of a distributed Gauss-Newton (DGN) optimization method with a Gaussian mixture model (GMM) and parallelized sampling algorithms. Results are compared with those obtained from other approaches.
Multiple local maximum-a-posteriori (MAP) estimates are determined with the local-search DGN optimization method. A GMM is constructed to approximate the posterior probability-density function (PDF) by reusing simulation results generated during the DGN minimization process. The traditional acceptance/rejection (AR) algorithm is parallelized and applied to improve the quality of GMM samples by rejecting unqualified samples. AR-GMM samples are independent, identically distributed samples that can be directly used for uncertainty quantification of model parameters and production forecasts.
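The AR step can be sketched in one dimension: draw candidates from a GMM proposal built around the MAP points and keep each with probability p(x)/(M·q(x)). This is an illustrative serial version (the paper parallelizes it), and the bimodal target and all names are hypothetical stand-ins:

```python
import math
import random

def gmm_pdf(x, comps):
    """Gaussian-mixture proposal density; comps = [(weight, mean, std), ...]."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in comps)

def target_pdf(x):
    """Stand-in unnormalized posterior with two MAP modes at x = +/-1."""
    return math.exp(-2.0 * (x ** 2 - 1.0) ** 2)

def ar_gmm_samples(n, comps, big_m, seed=1):
    """Acceptance/rejection: draw x from the GMM, keep it with probability
    target(x) / (big_m * gmm(x)); big_m must bound the density ratio."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n):
        comp = rng.choices(comps, weights=[c[0] for c in comps])[0]
        x = rng.gauss(comp[1], comp[2])          # sample from the proposal
        if rng.random() < target_pdf(x) / (big_m * gmm_pdf(x, comps)):
            accepted.append(x)                   # i.i.d. posterior sample
    return accepted

# GMM proposal centered at the two (hypothetical) MAP points.
comps = [(0.5, -1.0, 0.5), (0.5, 1.0, 0.5)]
samples = ar_gmm_samples(1000, comps, big_m=3.0)
```

Because a GMM built at the MAP points already resembles the posterior, the density ratio stays bounded by a small M and the acceptance rate remains usable, which is why the accepted samples can feed uncertainty quantification directly. Each candidate draw is independent, so the loop parallelizes trivially, as the work flow described above exploits.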
The proposed method is first validated with 1D nonlinear synthetic problems with multiple MAP points. The AR-GMM samples are better than the original GMM samples. The method is then tested with a synthetic history-matching problem using the SPE01 reservoir model (Odeh 1981; Islam and Sepehrnoori 2013) with eight uncertain parameters. The proposed method generates conditional samples that are better than or equivalent to those generated by other methods, such as Markov-chain Monte Carlo (MCMC) and global-search DGN combined with the randomized-maximum-likelihood (RML) approach, but at a much lower computational cost (by a factor of five to 100). Finally, it is applied to a real-field reservoir model with synthetic data and 235 uncertain parameters. A GMM with 27 Gaussian components is constructed to approximate the actual posterior PDF. There are 105 AR-GMM samples accepted from the 1,000 original GMM samples, and they are used to quantify the uncertainty of production forecasts. The proposed method is further validated by the fact that production forecasts for all AR-GMM samples are quite consistent with the production data observed after the history-matching period.
The newly proposed approach for history matching and uncertainty quantification is quite efficient and robust. The DGN optimization method can efficiently identify multiple local MAP points in parallel. The GMM yields proposal candidates with sufficiently high acceptance ratios for the AR algorithm. Parallelization makes the AR algorithm much more efficient, which further enhances the efficiency of the integrated work flow.