Reliability of subsurface assessment for different field development scenarios depends on how effectively the uncertainty in production forecasts is quantified. A substantial body of literature exists on methods to quantify the uncertainty in production forecasts. The objective of this paper is to revisit and compare these probabilistic uncertainty quantification techniques through their application to assisted history matching of a deep-water offshore waterflood field. The paper addresses the benefits, limitations, and best criteria for applicability of each technique.
Three probabilistic history matching techniques commonly practiced in the industry are discussed: Design of Experiments (DoE) with rejection sampling from a proxy, the Ensemble Smoother (ES), and the Genetic Algorithm (GA). The model used for this study is an offshore waterflood field in the Gulf of Mexico. Posterior distributions of global subsurface uncertainties (e.g., regional pore volume and oil-water contact) were estimated with each technique, conditioned to the injection and production data.
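The rejection-sampling step of the DoE-with-proxy workflow can be sketched as follows. The quadratic proxy misfit, the uniform prior bounds, and the single pore-volume multiplier are illustrative assumptions, not values from the study:

```python
import math
import random

random.seed(42)

# Hypothetical quadratic proxy of the history-match misfit as a function of a
# single global uncertainty parameter x (e.g. a regional pore-volume multiplier).
def proxy_misfit(x):
    return 40.0 * (x - 1.1) ** 2  # best match (zero misfit) near x = 1.1

def rejection_sample(n_samples, prior_lo=0.8, prior_hi=1.4):
    """Accept prior draws with probability proportional to the proxy likelihood."""
    samples = []
    while len(samples) < n_samples:
        x = random.uniform(prior_lo, prior_hi)   # draw from the uniform prior
        likelihood = math.exp(-proxy_misfit(x))  # proxy-based likelihood, max 1 here
        if random.random() < likelihood:
            samples.append(x)
    return samples

posterior = rejection_sample(2000)
mean = sum(posterior) / len(posterior)  # sits near the proxy minimum at 1.1
```

Because the proxy is cheap to evaluate, millions of prior samples can be screened this way without additional reservoir-simulation runs.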
The three probabilistic history matching techniques were applied to a deep-water field with 13 years of production history. The first 8 years of production data were used for history matching and estimation of the posterior distributions of the geologic uncertainty parameters. While the convergence behavior and shapes of the posterior distributions differed, consistent posterior means were obtained from the Bayesian workflows (DoE and ES). In contrast, GA yielded different posterior distributions of the geological uncertainty parameters, especially for those with small sensitivity to the production data. We then conducted production forecasts that included infill wells and evaluated production performance using the sample means of the posterior geologic uncertainty parameters. The robustness of the solution was examined by performing history matching multiple times with different initial sample points (i.e., different random seeds). This confirmed that heuristic optimization techniques such as GA are unstable, since the parameter setup of the optimizer had a large impact on uncertainty characterization and production performance.
This study provides guidelines for obtaining stable solutions from these history matching techniques under different conditions, such as the number of simulation model realizations, the number of uncertainty parameters, and the number of data points (e.g., the maturity of the reservoir development). These guidelines will greatly help the decision-making process in selecting the best development options.
Recent studies have indicated that Huff-n-Puff (HNP) gas injection has the potential to recover an additional 30-70% oil from multi-fractured horizontal wells in shale reservoirs. Nonetheless, this technique is very sensitive to production constraints and is impacted by uncertainty related to measurement quality (particularly frequency and resolution) and by a lack of constraining data. In this paper, a Bayesian workflow is provided to optimize the HNP process under uncertainty using a Duvernay shale well as an example.
Compositional simulations are conducted which incorporate a tuned PVT model and a set of measured cyclic injection/compaction pressure-sensitive permeability data. Markov chain Monte Carlo (MCMC) is used to estimate the posterior distributions of the uncertain model variables by matching the primary production data. The MCMC process is accelerated by employing an accurate proxy model (kriging) that is updated with a highly adaptive sampling algorithm. Gaussian processes are then used to optimize the HNP control variables by maximizing the lower confidence interval (μ − σ) of cumulative oil production (after 10 years) across a fixed ensemble of uncertain variables sampled from the posterior distributions.
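A minimal sketch of proxy-accelerated MCMC along these lines, with a hypothetical one-parameter analytic proxy standing in for both the kriging model and the compositional simulator (the forecast function, observed value, prior bounds, and noise level are all invented for illustration):

```python
import math
import random

random.seed(0)

# Stand-in for the expensive simulator: the "proxy" maps a fracture-permeability
# multiplier k to a predicted cumulative oil volume (illustrative only).
def proxy_forecast(k):
    return 100.0 * math.log(1.0 + k)

OBSERVED, SIGMA = 100.0 * math.log(2.0), 5.0  # synthetic datum; true k = 1.0

def log_posterior(k):
    if not (0.1 < k < 5.0):                    # uniform prior bounds
        return -math.inf
    resid = proxy_forecast(k) - OBSERVED
    return -0.5 * (resid / SIGMA) ** 2         # Gaussian log-likelihood

def metropolis(n_steps, k0=2.0, step=0.3):
    """Random-walk Metropolis sampling of the posterior of k."""
    chain, k, lp = [], k0, log_posterior(k0)
    for _ in range(n_steps):
        k_new = k + random.gauss(0.0, step)    # symmetric proposal
        lp_new = log_posterior(k_new)
        if math.log(random.random()) < lp_new - lp:
            k, lp = k_new, lp_new              # accept the move
        chain.append(k)
    return chain

chain = metropolis(5000)
post_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

In the workflow described above, each call to the proxy replaces a full compositional simulation, which is what makes the many thousands of MCMC likelihood evaluations affordable.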
The uncertain variable space includes several parameters representing reservoir and fracture properties. The posterior distributions for some parameters, such as primary fracture permeability and effective half-length, are narrower, while wider distributions are obtained for other parameters. The results indicate that the impact of uncertain variables on HNP performance is nonlinear. Some uncertain variables (such as molecular diffusion) that do not show strong sensitivity during the primary production strongly impact gas injection HNP performance. The results of optimization under uncertainty confirm that the lower confidence interval of cumulative oil production can be maximized by an injection time of around 1.5 months, a production time of around 2.5 months, and very short soaking times. In addition, a maximum injection rate and a flowing bottomhole pressure around the bubble point are required to ensure maximum incremental recovery. Analysis of the objective function surface highlights some other sets of production constraints with competitive results. Finally, the optimal set of production constraints, in combination with an ensemble of uncertain variables, results in a median HNP cumulative oil production that is 30% greater than that for primary production.
The application of a Bayesian framework for optimizing the HNP performance in a real shale reservoir is introduced for the first time. This work provides practical guidelines for the efficient application of advanced machine learning techniques for optimization under uncertainty, resulting in better decision making.
Da Silva Moreira, Paulo Henrique (LRAP / Universidade Federal do Rio de Janeiro) | Gomes da Silveira, Thaís Márcia (LRAP / Universidade Federal do Rio de Janeiro) | Drexler, Santiago (LRAP / Universidade Federal do Rio de Janeiro) | Couto, Paulo (LRAP / Universidade Federal do Rio de Janeiro)
Cost-effective implementation of oil recovery systems requires reliable prediction of reservoir performance, which makes it necessary to simulate the fluid flow processes in the reservoir and to measure the rock and fluid properties that determine reservoir behaviour. However, a good prediction relies on accurate values of the reservoir's physical properties. Carbonate rocks in the Brazilian Pre-salt are known for their heterogeneity. Characterizing their physical properties represents a great challenge, and the combination of experimental and computational techniques leads to a more comprehensive understanding of reservoir behavior.
In the present work, the relative permeability curves of a carbonate core sample with respect to oil and water are calculated by matching the data obtained in a lab-scale unsteady-state core flood experiment carried out at the high-pressure, high-temperature conditions characteristic of Brazilian Pre-salt reservoirs. Corey-type equations were used to model the relative permeability because of their simplicity and small number of parameters. The Markov chain Monte Carlo (MCMC) method was used as the optimization tool, taking the fluid production and pressure drop measurements collected during the core flood experiment as input data. An analysis of the sensitivity coefficients was carried out to address possible linear dependences among the parameters to be estimated. The Markov chain was generated and its convergence verified. The posterior distributions of the constants in the Corey equations were calculated, and their mean values were used to compute the relative permeability curves for the oil and water phases. The range of water saturation over which the relative permeability curves describe the core conditions after breakthrough, limited by the occurrence of capillary end effects, was calculated. The history match of fluid production and pressure drop showed a good fit between the pressure curves. A gap was observed between the production curves because the experimental measurements accounted for the cumulative volume of oil and water, while the theoretical curve accounted for the oil volume only.
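For reference, a Corey-type relative permeability model of the kind fitted above can be written as a short routine. The endpoint saturations, endpoint permeabilities, and exponents below are illustrative placeholders, not the values estimated in the study:

```python
def corey_relperm(sw, swc=0.2, sor=0.25, krw_max=0.3, kro_max=0.8, nw=2.5, no=2.0):
    """Corey-type water/oil relative permeabilities at water saturation sw.

    swc: connate water saturation; sor: residual oil saturation;
    krw_max, kro_max: endpoint relative permeabilities; nw, no: Corey exponents.
    """
    s = (sw - swc) / (1.0 - swc - sor)   # normalized (mobile) water saturation
    s = min(max(s, 0.0), 1.0)            # clamp outside the mobile range
    krw = krw_max * s ** nw
    kro = kro_max * (1.0 - s) ** no
    return krw, kro

# Endpoint behavior: only oil flows at connate water, only water at residual oil.
krw_c, kro_c = corey_relperm(0.2)    # connate water: krw = 0, kro = kro_max
krw_m, kro_m = corey_relperm(0.75)   # residual oil: krw ≈ krw_max, kro ≈ 0
```

In an MCMC history match, the six constants would be the sampled parameters, and each sample would be scored against the measured production and pressure-drop data.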
Araujo, Mariela (Shell International Exploration and Production Inc.) | Chen, Chaohui (Shell International Exploration and Production Inc.) | Gao, Guohua (Shell International Exploration and Production Inc.) | Jennings, Jim (Shell International Exploration and Production Inc.) | Ramirez, Benjamin (Shell International Exploration and Production Inc.) | Xu, Zhihua (ExxonMobil) | Yeh, Tzu-hao (Shell International Exploration and Production Inc.) | Alpak, Faruk Omer (Shell International Exploration and Production Inc.) | Gelderblom, Paul (Shell International Exploration and Production Inc.)
Increased access to computational resources has allowed reservoir engineers to include assisted history matching (AHM) and uncertainty quantification (UQ) techniques as standard steps of reservoir management workflows. Several advanced methods have become available and are being used in routine activities without a proper understanding of their performance and quality. This paper provides recommendations on the efficiency and quality of different methods for applications to production forecasting, supporting the reservoir-management decision-making process.
Results from five advanced methods and two traditional methods were benchmarked in the study. The advanced methods include a nested sampling method MultiNest, the integrated global search Distributed Gauss-Newton (DGN) optimizer with Randomized Maximum Likelihood (RML), the integrated local search DGN optimizer with a Gaussian Mixture Model (GMM), and two advanced Bayesian inference-based methods from commercial simulation packages. Two traditional methods were also included for some test problems: the Markov-Chain Monte Carlo method (MCMC) is known to produce accurate results although it is too expensive for most practical problems, and a DoE-proxy based method widely used and available in some form in most commercial simulation packages.
The methods were tested on three different cases of increasing complexity: a 1D simple model based on an analytical function with one uncertain parameter, a simple injector-producer well pair in the SPE01 model with eight uncertain parameters, and an unconventional reservoir model with one well and 24 uncertain parameters. A collection of benchmark metrics was considered to compare the results, but the most useful included the total number of simulation runs, sample size, objective function distributions, cumulative oil production forecast distributions, and marginal posterior parameter distributions.
MultiNest and MCMC were found to produce the most accurate results, but MCMC is too costly for practical problems. MultiNest is also costly, but it is much more efficient than MCMC and it may be affordable for some practical applications. The proxy-based method is the lowest-cost solution. However, its accuracy is unacceptably poor.
DGN-RML and DGN-GMM seem to have the best compromise between accuracy and efficiency, and the best of these two is DGN-GMM. These two methods may produce some poor-quality samples that should be rejected for the final uncertainty quantification.
The results from the benchmark study are somewhat surprising and provide awareness to the reservoir engineering community on the quality and efficiency of the advanced and most traditional methods used for AHM and UQ. Our recommendation is to use DGN-GMM instead of the traditional proxy-based methods for most practical problems, and to consider using the more expensive MultiNest when the cost of running the reservoir models is moderate and high-quality solutions are desired.
Bayesian inference provides a convenient framework for history matching and prediction. In this framework, prior knowledge, system nonlinearity, and measurement errors can be directly incorporated into the posterior distribution of the parameters. The Markov-chain Monte Carlo (MCMC) method is a powerful tool to generate samples from the posterior distribution. However, the MCMC method usually requires a large number of forward simulations. Hence, it can be a computationally intensive task, particularly when dealing with large-scale flow and transport models. To address this issue, we construct a surrogate system for the model outputs in the form of polynomials using the stochastic collocation method (SCM). In addition, we use interpolation with the nested sparse grids and adaptively take into account the different importance of parameters for high-dimensional problems. Furthermore, we introduce an additional transform process to improve the accuracy of the surrogate model in case of strong nonlinearities, such as a discontinuous or unsmooth relation between the input parameters and the output responses. Once the surrogate system is built, we can evaluate the likelihood with little computational cost. Numerical results demonstrate that the proposed method can efficiently estimate the posterior statistics of input parameters and provide accurate results for history matching and prediction of the observed data with a moderate number of parameters.
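The idea of replacing the expensive forward model with a polynomial surrogate built at collocation points can be illustrated in one dimension. Here a Lagrange interpolant at Chebyshev nodes stands in for the sparse-grid stochastic collocation, and the forward model is a made-up analytic stand-in:

```python
import math

# Stand-in for an expensive forward model on [-1, 1] (e.g. a pressure response
# as a function of one scaled input parameter).
def forward_model(x):
    return math.exp(0.8 * x) + 0.3 * x

# Collocation at Chebyshev nodes keeps polynomial interpolation well behaved.
N = 7
nodes = [math.cos((2 * i + 1) * math.pi / (2 * N)) for i in range(N)]
values = [forward_model(x) for x in nodes]   # N expensive runs, done once

def surrogate(x):
    """Lagrange interpolation through the collocation points."""
    total = 0.0
    for i, xi in enumerate(nodes):
        w = 1.0
        for j, xj in enumerate(nodes):
            if i != j:
                w *= (x - xj) / (xi - xj)
        total += values[i] * w
    return total

# Inside MCMC, the likelihood would call surrogate() instead of the expensive
# model; here we just check the approximation error over the parameter range.
err = max(abs(surrogate(k / 50.0) - forward_model(k / 50.0))
          for k in range(-50, 51))
```

With a smooth response, a handful of collocation runs gives a surrogate accurate to many digits, so the millions of likelihood evaluations in an MCMC chain cost essentially nothing.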
Gao, Guohua (Shell Global Solutions, US Inc.) | Vink, Jeroen C. (Shell Global Solutions International B.V.) | Chen, Chaohui (Shell International Exploration & Production Inc.) | Araujo, Mariela (Shell International Exploration & Production Inc.) | Ramirez, Benjamin (Shell International Exploration & Production Inc.) | Jennings, Jim W. (Shell International Exploration & Production Inc.) | Khamra, Yaakoub El (Shell Global Solutions, US Inc.) | Ita, Joel (Shell Global Solutions, US Inc.)
Uncertainty quantification of production forecasts is crucially important for business planning of hydrocarbon field developments. This is still a very challenging task, especially when subsurface uncertainties must be conditioned to production data. Many different approaches have been proposed, each with their strengths and weaknesses. In this work, we develop a robust uncertainty quantification workflow by seamless integration of a distributed Gauss-Newton (DGN) optimization method with Gaussian Mixture Model (GMM) and parallelized sampling algorithms. Results are compared with those obtained from other approaches.
Multiple local maximum-a-posteriori (MAP) estimates are located with the local-search DGN optimization method. A GMM is constructed to approximate the posterior probability density function, by fitting simulation results generated during the DGN minimization process. The traditional acceptance-rejection (AR) algorithm is parallelized and applied to improve the quality of GMM samples by rejecting unqualified samples. AR-GMM samples are independent, identically-distributed (i.i.d.) samples that can be directly used for uncertainty quantification of model parameters and production forecasts.
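The acceptance-rejection step with a GMM proposal can be sketched in one dimension. The bimodal target density, the two-component proposal, and the envelope constant below are illustrative assumptions, not the field problem:

```python
import math
import random

random.seed(7)

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Bimodal "posterior" with two MAP points, as located by the local-search optimizer.
def target_pdf(x):
    return 0.5 * gauss_pdf(x, -2.0, 0.5) + 0.5 * gauss_pdf(x, 2.0, 0.5)

# GMM proposal fitted around the two MAP points, deliberately a little wider.
weights, mus, sds = [0.5, 0.5], [-2.0, 2.0], [0.8, 0.8]

def gmm_pdf(x):
    return sum(w * gauss_pdf(x, m, s) for w, m, s in zip(weights, mus, sds))

def gmm_draw():
    i = 0 if random.random() < weights[0] else 1
    return random.gauss(mus[i], sds[i])

M = 2.0  # envelope constant: M * gmm_pdf(x) >= target_pdf(x) for this setup

def ar_sample(n):
    """Acceptance-rejection: keep GMM draws with probability p(x)/(M q(x))."""
    out = []
    while len(out) < n:
        x = gmm_draw()
        if random.random() < target_pdf(x) / (M * gmm_pdf(x)):
            out.append(x)
    return out

samples = ar_sample(2000)
frac_right = sum(1 for x in samples if x > 0) / len(samples)  # ~0.5 by symmetry
```

The accepted draws are i.i.d. samples from the target, which is why they can be used directly for uncertainty quantification, and why each candidate can be evaluated in parallel.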
The proposed method is first validated with 1-D nonlinear synthetic problems having multiple MAP points. The AR-GMM samples are better than the original GMM samples. Then, it is tested with a synthetic history-matching problem using the SPE-1 reservoir model with 8 uncertain parameters. The proposed method generates conditional samples that are better than or equivalent to those generated by other methods, e.g., Markov chain Monte Carlo (MCMC) and global search DGN combined with the Randomized Maximum Likelihood (RML) approach, but have a much lower computational cost (by a factor of 5 to 100). Finally, it is applied to a real field reservoir model with synthetic data, having 235 uncertain parameters. A GMM with 27 Gaussian components is constructed to approximate the actual posterior PDF. 105 AR-GMM samples are accepted from the 1000 original GMM samples, and are used to quantify uncertainty of production forecasts. The proposed method is further validated by the fact that production forecasts for all AR-GMM samples are quite consistent with the production data observed after the history matching period.
The newly proposed approach for history matching and uncertainty quantification is quite efficient and robust. The DGN optimization method can efficiently identify multiple local MAP points in parallel. The GMM yields proposal candidates with sufficiently high acceptance ratios for the AR algorithm. Parallelization makes the AR algorithm much more efficient, which further enhances the efficiency of the integrated workflow.
Monthly production data can be easily obtained from public and commercial databases and are essential for assessing the performance of operators across different fields. Additionally, mapping well production is helpful for visualizing general aspects of the field geology and identifying potential sweet spots. In this context, decline curve models are a feasible choice for processing the available production data: production rates are the only data they require, their few parameters can be promptly history matched and analyzed, and they provide production forecasts and estimated ultimate recoveries (EURs). As summarized in the work of Arps (1945), decline curve analysis has been applied to predict oil production since the beginning of the last century. Arps (1945) presented differential equations for the rate-time relationship, which resulted in the exponential, harmonic, and hyperbolic models. Although these models were initially empirical, subsequent works attempted to explain the equations from a fluid flow perspective (Fetkovich, 1980; Camacho-Velázquez, 1987; Camacho-Velázquez and Raghavan, 1989). In unconventional reservoirs, however, the Arps equations are not a reasonable extrapolation for the extended transient flow period, because they do not account for the transition to boundary-dominated flow and result in an infinite EUR (Lee and Sidle, 2010).
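For reference, the Arps rate-time and cumulative relations mentioned above can be written compactly (the parameter values in the example are illustrative):

```python
import math

def arps_rate(t, qi, di, b):
    """Arps rate-time relations: b = 0 exponential, b = 1 harmonic, else hyperbolic."""
    if b == 0.0:
        return qi * math.exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def arps_cum(t, qi, di, b):
    """Cumulative production for the three Arps cases."""
    if b == 0.0:
        return qi / di * (1.0 - math.exp(-di * t))
    if b == 1.0:
        return qi / di * math.log(1.0 + di * t)   # harmonic: grows without bound
    return qi / (di * (1.0 - b)) * (1.0 - (1.0 + b * di * t) ** (1.0 - 1.0 / b))

# Illustrative values: qi = 1000 (rate units), di = 0.8 per year, b = 0.5.
q5 = arps_rate(5.0, 1000.0, 0.8, 0.5)   # rate after 5 years = 1000 / 3^2
np5 = arps_cum(5.0, 1000.0, 0.8, 0.5)   # cumulative after 5 years
```

Note that for b ≥ 1 the cumulative term does not converge as t grows, which is the infinite-EUR problem cited above for extended transient flow.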
Hong, Aojie (University of Stavanger and The National IOR Centre of Norway) | Bratvold, Reidar B. (University of Stavanger and The National IOR Centre of Norway) | Lake, Larry W. (The University of Texas at Austin)
This paper was prepared for presentation at the Unconventional Resources Technology Conference held in Houston, Texas, USA, 23-25 July 2018.
Decline curve analysis (DCA) for unconventional plays requires a model that can capture the characteristics of different flow regimes. Thus, various models have been proposed. Traditionally, in probabilistic DCA, an analyst chooses a single model that he/she believes best fits the data. However, several models might fit the data almost equally well, and the one that best fits the data might not best represent the flow characteristics. Therefore, uncertainty about which model is "best" remains. This work aims to integrate model uncertainty in probabilistic DCA for unconventional plays. Instead of identifying a single "best" model, we propose to regard any model as potentially good, where goodness is characterized by a probability.
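The idea of characterizing model goodness by a probability can be sketched as normalized posterior model weights under equal priors. The misfit values and noise variance below are invented for illustration:

```python
import math

# Hypothetical sum-of-squares misfits of three decline models fitted to the
# same rate data, plus an assumed measurement-noise variance (all illustrative).
sse = {"exponential": 52.0, "hyperbolic": 41.0, "harmonic": 44.5}
sigma2 = 4.0

# Equal prior model probabilities; posterior weight ∝ exp(-SSE / (2 sigma^2)).
log_like = {m: -s / (2.0 * sigma2) for m, s in sse.items()}
z = sum(math.exp(v) for v in log_like.values())
weights = {m: math.exp(v) / z for m, v in log_like.items()}
```

Forecasts can then be mixed across models with these weights rather than committing to the single best-fitting model, so model uncertainty propagates into the EUR distribution.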
ABSTRACT: Predicting rockburst intensity is an important task in mining, since rockburst occurs as a violent expulsion of rock under high geo-stress conditions that causes considerable damage to underground structures and equipment and, most importantly, poses serious threats to workers’ safety. It has been responsible for numerous deaths and injuries in underground mines across the world. Given this importance, the current study aims at predicting rockburst intensity on the basis of 174 compiled rockburst events. Several existing criteria were considered to model the rockburst intensity. The input parameters included the maximum tangential stress, the uniaxial compressive strength, the uniaxial tensile strength of the surrounding rock, and the elastic strain energy index. A Bayesian inference approach was implemented to identify the most appropriate models for estimating the rockburst intensity category among three rockburst criteria. The WinBUGS software was used to compute the posterior predictive distributions of the model parameters and the deviance information criterion (DIC) corresponding to the models. The DIC and the percentage of correctly predicted rockburst categories were employed to assess model performance. Overall, the results indicate that Bayesian inference achieves satisfactory predictive performance in modelling rockburst intensity. Moreover, the associated predictive uncertainty can be reduced as new data become available. The results suggest that the implemented Bayesian models can be helpful in managing rockburst events in mines using site-specific data, thereby reducing the casualties induced by rockburst.
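The DIC used here for model comparison combines the posterior mean deviance with an effective-parameter penalty, as in the short sketch below; the deviance numbers are illustrative, not results from the study:

```python
# DIC from a chain of deviance values D(theta) = -2 log L(theta), plus the
# deviance evaluated at the posterior mean of the parameters.
deviances = [212.4, 208.9, 215.1, 210.6, 209.8]  # sampled along the MCMC chain
d_at_mean = 207.3                                 # deviance at the posterior mean

d_bar = sum(deviances) / len(deviances)  # posterior mean deviance
p_d = d_bar - d_at_mean                  # effective number of parameters
dic = d_bar + p_d                        # lower DIC indicates a better model
```

Comparing DIC values across the three rockburst criteria then favors the model with the best fit-complexity trade-off rather than simply the best raw fit.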
Rockburst is regarded as one of the deadliest hazards in underground mining. It is initiated under high geo-stress and brittle rock conditions with extremely complex mechanical features and results in a violent expulsion of rock from the surrounding rock mass. Usually, it occurs with a sudden release of elastic strain energy and other phenomena, such as slabbing, spalling, ejection, or throwing, which cause considerable damage to underground structures and equipment and, most importantly, pose serious threats to workers’ safety and production. Rockbursts have been associated with thousands of accidents and casualties every year in South Africa, Central Europe, North and South America, and China. For instance, numerous rockburst-related hazards were reported in Chinese metal mines from 2001 to 2007 (Zhang and Fu, 2008).
Although extensive research has been carried out over the past few years, rockburst is still not properly understood and continues to threaten deep mining operations around the world. Explanations of the phenomenon rest on hypotheses and empirical results and, although the phenomenon is well described, still lack a sound theoretical, physical, or mechanical foundation. Basically, two mechanisms are involved: instantaneous slip on an existing geological feature and instantaneous fracturing of a volume of over-stressed rock. This leads to two approaches (dynamics and continuum mechanics) for describing the complex rock spalling and ejection that occur in such events (Wagner, 1982; Tang et al., 2010; Adoko et al., 2013).
Tian, Yao (Texas A&M University (now with University of Houston)) | Ayers, Walter B. (Texas A&M University (retired)) | Sang, Huiyan (Texas A&M University) | McCain, William D. (Texas A&M University) | Ehlig-Economides, Christine (University of Houston)
Recent progress has increased our understanding of key controls on the productivity of shale reservoirs. The quantitative relations between regional Eagle Ford Shale production trends and geologic parameters were investigated to clarify which geologic parameters exercise dominant control on well-production rates.
Previously, qualitative correlations for the Eagle Ford Shale were demonstrated between regional production and depth, thickness, total organic carbon (TOC), distribution of limestone beds, and average bed thickness. Eagle Ford production wells are horizontal, but it was necessary to use vertical wells that penetrated the Eagle Ford to map reservoir properties. No wells in the database had both production and geological parameters, so geological parameters could not be directly related to individual-well production. Therefore, spatial-interpolation methods based on kriging and Bayesian methods with Markov chain Monte Carlo (MCMC) sampling were used to integrate the data sets and predict geological properties at production-well locations. Spatial Gaussian process regression modeling was conducted to investigate the primary controls on production.
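A minimal 1-D sketch of the kriging interpolation used to predict properties at production-well locations from nearby vertical wells; the Gaussian covariance model, well coordinates, and TOC values below are illustrative assumptions:

```python
import math

# Simple kriging (zero-mean residual) of a property, e.g. TOC, along a transect.
xs = [0.0, 3.0, 7.0]   # vertical-well locations (arbitrary distance units)
zs = [5.2, 6.8, 4.1]   # measured TOC (%) residuals at those wells

def cov(h, sill=1.0, rng=4.0):
    """Gaussian covariance model as a function of separation distance h."""
    return sill * math.exp(-(h / rng) ** 2)

def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination (no pivoting)."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for i in range(3):
        p = m[i][i]
        m[i] = [v / p for v in m[i]]
        for j in range(3):
            if j != i:
                f = m[j][i]
                m[j] = [vj - f * vi for vj, vi in zip(m[j], m[i])]
    return [m[i][3] for i in range(3)]

def krige(x0):
    """Predict the property at x0 as a covariance-weighted combination of data."""
    a = [[cov(abs(xi - xj)) for xj in xs] for xi in xs]  # data-data covariances
    b = [cov(abs(xi - x0)) for xi in xs]                 # data-target covariances
    lam = solve3(a, b)                                   # kriging weights
    return sum(l * z for l, z in zip(lam, zs))

est = krige(3.0)  # at a data location, kriging reproduces the datum exactly
```

The same weighting scheme, extended to 2-D coordinates and combined with MCMC over the covariance parameters, is what allows geological properties mapped from vertical wells to be carried over to the horizontal production wells.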
Results suggest that the 6-month cumulative production from the Eagle Ford Shale, in barrels of oil equivalent (BOE), increases consistently with depth, with Eagle Ford thickness (up to 180-ft thickness), and with TOC (up to 7%). Also, when the number of limestone beds exceeds 12, production increases with the number of limestone beds. The corresponding significance code indicates that the parameters most significant to production are TOC and depth (which relates to pressure and thermal maturation).
Concepts and models developed in this study may assist operators in making critical Eagle Ford Shale development decisions and should be transferable to other shale plays.