Recent studies have indicated that Huff-n-Puff (HNP) gas injection has the potential to recover an additional 30-70% of oil from multi-fractured horizontal wells in shale reservoirs. Nonetheless, the technique is very sensitive to production constraints and is affected by uncertainty arising from measurement quality (particularly frequency and resolution) and a lack of constraining data. In this paper, a Bayesian workflow for optimizing the HNP process under uncertainty is presented, using a Duvernay shale well as an example.
Compositional simulations are conducted that incorporate a tuned PVT model and a set of measured cyclic injection/compaction pressure-sensitive permeability data. Markov chain Monte Carlo (MCMC) is used to estimate the posterior distributions of the model's uncertain variables by matching the primary-production data. The MCMC process is accelerated by employing an accurate proxy model (kriging) that is updated using a highly adaptive sampling algorithm. Gaussian processes are then used to optimize the HNP control variables by maximizing the lower confidence bound (μ-σ) of cumulative oil production after 10 years across a fixed ensemble of uncertain variables sampled from the posterior distributions.
The uncertain-variable space includes several parameters representing reservoir and fracture properties. The posterior distributions for some parameters, such as primary-fracture permeability and effective half-length, are narrow, while wider distributions are obtained for other parameters. The results indicate that the impact of the uncertain variables on HNP performance is nonlinear: some variables (such as molecular diffusion) that show little sensitivity during primary production strongly impact gas-injection HNP performance. The results of optimization under uncertainty confirm that the lower confidence bound of cumulative oil production is maximized by an injection time of around 1.5 months, a production time of around 2.5 months, and very short soaking times. In addition, the maximum injection rate and a flowing bottomhole pressure around the bubble point are required to ensure maximum incremental recovery. Analysis of the objective-function surface highlights other sets of production constraints with competitive results. Finally, the optimal set of production constraints, combined with the ensemble of uncertain variables, results in a median HNP cumulative oil production 30% greater than that for primary production.
The application of a Bayesian framework for optimizing HNP performance in a real shale reservoir is introduced for the first time. This work provides practical guidelines for the efficient application of advanced machine-learning techniques to optimization under uncertainty, resulting in better decision making.
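The robust objective described above, maximizing the lower confidence interval (μ-σ) of cumulative oil production over a posterior ensemble, can be sketched in a few lines. The toy response surface, control ranges, and diffusion multiplier below are illustrative assumptions; the paper uses a compositional simulator with a Gaussian-process surrogate, whereas this sketch simply grid-searches the controls.

```python
import random, math

random.seed(0)

# Hypothetical stand-in for the compositional simulator: cumulative oil after
# 10 years as a function of HNP controls (injection time t_inj, production
# time t_prd, soak time t_soak, in months) and one uncertain variable
# (a molecular-diffusion multiplier). The response shape is an assumption.
def cum_oil(t_inj, t_prd, t_soak, diffusion):
    base = 100.0 * diffusion
    return (base
            - 8.0 * (t_inj - 1.5) ** 2     # best near 1.5 months injection
            - 5.0 * (t_prd - 2.5) ** 2     # best near 2.5 months production
            - 10.0 * t_soak)               # soaking only costs time here

# Fixed ensemble of uncertain-variable samples (as if drawn from the
# MCMC posterior distributions).
ensemble = [random.gauss(1.0, 0.1) for _ in range(50)]

def lower_confidence(controls):
    """mu - sigma of cumulative oil across the posterior ensemble."""
    vals = [cum_oil(*controls, d) for d in ensemble]
    mu = sum(vals) / len(vals)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in vals) / len(vals))
    return mu - sigma

# Exhaustive search over a coarse control grid (the paper uses a
# Gaussian-process surrogate; a grid keeps the sketch self-contained).
candidates = [(ti / 2, tp / 2, ts / 2)
              for ti in range(1, 7) for tp in range(1, 9) for ts in range(0, 4)]
best = max(candidates, key=lower_confidence)
```

With this toy surface, the search recovers an injection time of 1.5 months, a production time of 2.5 months, and zero soaking, mirroring the structure of the paper's reported optimum.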
Da Silva Moreira, Paulo Henrique (LRAP / Universidade Federal do Rio de Janeiro) | Gomes da Silveira, Thaís Márcia (LRAP / Universidade Federal do Rio de Janeiro) | Drexler, Santiago (LRAP / Universidade Federal do Rio de Janeiro) | Couto, Paulo (LRAP / Universidade Federal do Rio de Janeiro)
Cost-effective implementation of oil recovery systems requires reliable prediction of reservoir performance, which in turn requires simulating the fluid-flow processes in the reservoir and measuring the rock and fluid properties that determine reservoir behavior. A good prediction therefore relies on accurate values of the reservoir's physical properties. Carbonate rocks in the Brazilian Pre-salt are known for their heterogeneity. Characterizing their physical properties represents a great challenge, and the combination of experimental and computational techniques leads to a more comprehensive understanding of reservoir behavior.
In the present work, the relative permeability curves of a carbonate core sample with respect to oil and water are calculated by matching the data obtained in a lab-scale unsteady-state core-flood experiment carried out at the high-pressure, high-temperature conditions characteristic of Brazilian Pre-salt reservoirs. Corey-type equations were used to model the relative permeability because of their simplicity and the small number of parameters involved. The Markov chain Monte Carlo (MCMC) method was used as the optimization tool, taking the fluid-production and pressure-drop measurements collected during the core-flood experiment as input data. An analysis of the sensitivity coefficients was carried out to deal with possible linear dependences among the terms to be estimated. The Markov chain was generated and its convergence verified. The posterior distributions of the constant terms in the Corey equations were calculated, and their mean values were applied to calculate the relative permeability curves for the oil and water phases. The range of water saturation over which the relative permeability curves describe the core conditions after breakthrough, given the occurrence of capillary end effects, was calculated. The history match of fluid production and pressure drop showed a good fit between the pressure curves. A gap was observed between the production curves because the experimental measurements accounted for the cumulative volume of oil and water, while the theoretical curve accounted for the oil volume only.
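The Corey-type model referred to above is compact enough to state directly. A minimal sketch follows, with illustrative endpoint saturations, endpoint permeabilities, and exponents (in the study, these constants are the quantities estimated by MCMC):

```python
# Corey-type relative permeability curves. All parameter values below are
# illustrative assumptions, not the constants estimated in the paper.
def corey_krw(sw, swr=0.2, sor=0.25, krw_end=0.4, nw=3.0):
    """Water relative permeability (zero below irreducible water saturation)."""
    s = (sw - swr) / (1.0 - swr - sor)   # normalized water saturation
    s = min(max(s, 0.0), 1.0)            # clamp to the mobile range
    return krw_end * s ** nw

def corey_kro(sw, swr=0.2, sor=0.25, kro_end=0.9, no=2.0):
    """Oil relative permeability (zero at residual oil saturation)."""
    s = (1.0 - sw - sor) / (1.0 - swr - sor)
    s = min(max(s, 0.0), 1.0)
    return kro_end * s ** no

# Sample both curves over the mobile water-saturation range.
curve = [(sw / 100, corey_krw(sw / 100), corey_kro(sw / 100))
         for sw in range(20, 76, 5)]
```

Each posterior sample of the constants (endpoints and exponents) yields one such pair of curves, so the posterior distributions translate directly into uncertainty bands on relative permeability.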
Hong, Aojie (National IOR Centre of Norway and University of Stavanger) | Bratvold, Reidar B. (National IOR Centre of Norway and University of Stavanger) | Lake, Larry W. (University of Texas at Austin) | Ruiz Maraggi, Leopoldo M. (University of Texas at Austin)
Decline-curve analysis (DCA) for unconventional plays requires a model that can capture the characteristics of different flow regimes. Thus, various models have been proposed. Traditionally, in probabilistic DCA, an analyst chooses a single model that is believed to best fit the data. However, several models might fit the data almost equally well, and the one that best fits the data might not best represent the flow characteristics. Therefore, uncertainty remains regarding which is the "best" model. This work aims to integrate model uncertainty in probabilistic DCA for unconventional plays. Instead of identifying a single "best" model, we propose to regard any model as potentially good, with goodness characterized by a probability. The probability of a model being good is interpreted as a measure of the relative truthfulness of this model compared with the other models. This probability is subsequently used to weight the model forecast. Bayes' law is used to assess the model probabilities for given data. Multiple samples of the model-parameter values are obtained using maximum likelihood estimation (MLE) with Monte Carlo simulation. Thus, the unique probabilistic forecasts of each individual model are aggregated into a single probabilistic forecast, which incorporates model uncertainty along with the intrinsic uncertainty (i.e., the measurement errors) in the given data. We demonstrate and conclude that using the proposed approach can mitigate over/underestimates resulting from using a single decline-curve model for forecasting. The proposed approach performs well in propagating model uncertainty to uncertainty in production forecasting; that is, we determine a forecast that represents uncertainty given multiple possible models conditioned to the data.
The field data show that no one model is the most probable to be good for all wells. The novelties of this work are that probability is used to describe the goodness of a model; a Bayesian approach is used to integrate the model uncertainty in probabilistic DCA; the approach is applied to actual field data to identify the most-probable model given the data; and we demonstrate the value of using this approach to consider multiple models in probabilistic DCA for unconventional plays. Although numerical techniques for forecasting hydrocarbon production have developed rapidly over the past decades, DCA remains an industry-accepted method and is used extensively in the oil and gas industry. Decline-curve models are very computationally attractive because only production data, which can be easily acquired, are required for determining a few parameter values through history matching.
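The core of the approach, weighting competing decline-curve models by their Bayes'-law posterior probabilities and averaging their forecasts, can be illustrated with a minimal sketch. The two models, their parameter values, the rate data, and the error standard deviation below are all hypothetical stand-ins:

```python
import math

# Hypothetical rate data and two candidate decline models (Arps hyperbolic
# and exponential) with parameters assumed already fitted (e.g., by MLE).
# Only the Bayes'-law model weighting and the averaged forecast are the point.
t_obs = [1, 2, 3, 4, 5, 6]
q_obs = [88.0, 69.0, 56.0, 47.0, 40.0, 35.2]

def hyperbolic(t, qi=120.0, di=0.35, b=0.8):
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def exponential(t, qi=110.0, di=0.2):
    return qi * math.exp(-di * t)

models = {"hyperbolic": hyperbolic, "exponential": exponential}
sigma = 2.0                        # assumed measurement-error std deviation
prior = {m: 0.5 for m in models}   # equal prior model probabilities

def log_like(f):
    """Gaussian log-likelihood of the observed rates under model f."""
    return sum(-0.5 * ((q - f(t)) / sigma) ** 2 for t, q in zip(t_obs, q_obs))

# Bayes' law, normalized via log-sum-exp for numerical stability.
logs = {m: math.log(prior[m]) + log_like(f) for m, f in models.items()}
mx = max(logs.values())
unnorm = {m: math.exp(v - mx) for m, v in logs.items()}
z = sum(unnorm.values())
post = {m: w / z for m, w in unnorm.items()}

# Model-averaged forecast: weight each model's prediction by its posterior.
forecast = sum(post[m] * models[m](12) for m in models)
```

Because the synthetic data track the hyperbolic curve, its posterior probability dominates here; with real field data, the weights quantify how much each flow-regime model deserves to contribute to the forecast.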
A challenge in oil-reservoir studies is evaluating the ability of geomechanical, statistical, and geophysical methods to predict discrete geological features. This problem arises frequently with fracture corridors, which are discrete, tabular subvertical fracture clusters. Fracture corridors can be inferred from well data such as horizontal-borehole-image logs. Unfortunately, well data, and especially borehole image logs, are sparse, and predictive methods are needed to fill in the gap between wells. One way to evaluate such methods is to compare predicted and inferred fracture corridors statistically, using chi-squared and contingency tables.
In this article, we propose a modified contingency table to validate fracture-corridor-prediction techniques. We introduce two important modifications to capture special aspects of fracture corridors. The first modification is the incorporation of exclusion zones where no fracture corridors can exist, and the second modification is taking into consideration the fuzzy nature of fracture-corridor indicators from wells such as circulation losses. An indicator is fuzzy when it has more than one possible interpretation. The reliability of an indicator is the probability that it correctly suggests a fracture corridor. The indicators with reliability of unity are hard indicators, and “soft” and “fuzzy” indicators are those with reliability that is less than unity.
A structural grid is overlaid on the reservoir top in an oil field. Each cell of the grid is examined for the presence and reliability of inferred fracture corridors and exclusion zones and for the confidence level of predicted fracture corridors. The results are summarized in a contingency table and are used to calculate the chi-squared statistic and the conditional probability of having an actual fracture corridor given a predicted fracture corridor.
Three actual case studies are included to demonstrate how single or joint predictive methods can be statistically evaluated and how conditional probabilities are calculated using the modified contingency tables. The first example tests seismic faults as indicators of fracture corridors. The other examples test fracture corridors predicted by a simple geomechanical method.
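For orientation, the standard (unmodified) contingency-table calculation that the proposed method extends can be sketched as follows, with hypothetical cell counts:

```python
# 2x2 contingency table over grid cells: predicted (yes/no) vs. inferred
# (yes/no) fracture corridor. All counts below are hypothetical.
observed = {("pred_yes", "inf_yes"): 30, ("pred_yes", "inf_no"): 10,
            ("pred_no", "inf_yes"): 15, ("pred_no", "inf_no"): 145}

n = sum(observed.values())
row = {r: sum(o for (rr, _), o in observed.items() if rr == r)
       for r in ("pred_yes", "pred_no")}
col = {c: sum(o for (_, cc), o in observed.items() if cc == c)
       for c in ("inf_yes", "inf_no")}

# Pearson chi-squared: sum of (O - E)^2 / E over the four cells, where the
# expected count E assumes independence of prediction and inference.
chi2 = sum((o - row[r] * col[c] / n) ** 2 / (row[r] * col[c] / n)
           for (r, c), o in observed.items())

# Conditional probability of an actual (inferred) corridor given a
# predicted one, i.e., the predictive value of the method.
p_actual_given_pred = observed[("pred_yes", "inf_yes")] / row["pred_yes"]
```

The paper's modifications then enter by removing exclusion-zone cells from the table and by down-weighting cells whose inferred corridors rest on fuzzy indicators with reliability below unity.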
Wheeler, Mary F. (The University of Texas at Austin, USA) | Srinivasan, Sanjay (Pennsylvania State University, USA) | Lee, Sanghyun (Florida State University, USA) | Singh, Manik (Pennsylvania State University, USA)
Optimal design of hydraulic fractures is controlled by the distribution of natural fractures in the reservoir. Due to sparse information, there is uncertainty associated with the prediction of the natural fracture system. Our objective here is to: i) Quantify uncertainty associated with prediction of natural fractures using micro-seismic data and a Bayesian model selection approach, and ii) Use fracture probability maps to implement a finite element phase-field approach for modeling interactions of propagating fractures with natural fractures.
The proposed approach employs state-of-the-art numerical modeling of natural and hydraulic fractures using a diffusive adaptive finite element phase-field approach. The diffusive phase field is defined using the probability map describing the uncertainty in the spatial distribution of natural fractures. That probability map is computed using a model selection procedure that utilizes a suite of prior models for the natural fracture network and a fast proxy to quickly evaluate the forward seismic response corresponding to slip events along fractures. Employing indicator functions, diffusive fracture networks are generated utilizing an accurate computational adaptive mesh scheme based on a posteriori error estimators.
The coupled algorithm was validated against existing benchmark problems, which include prototype computations with fracture propagation and reservoir flow in a highly heterogeneous reservoir with natural fractures. Implementation of an algorithm for computing the fracture probability map from synthetic micro-seismic data mimicking a Fort Worth Basin data set reveals consistency between the interpreted fracture sets and those observed in the reference. Convergence of the iterative solvers and the numerical efficiency of the methods were tested against different examples, including field-scale problems. The results reveal that interpreting the uncertainty pertaining to the presence of fractures, and utilizing that uncertainty within the phase-field approach to simulate the interactions between induced and natural fractures, yields complex structures that include fracture branching, fracture hooking, etc.
The novelty of this work lies in the efficient integration of phase-field fracture-propagation models with diffusive natural fracture networks and a stochastic representation of the uncertainty associated with the prediction of natural fractures in a reservoir. The presented method enables practicing engineers to design hydraulic fracturing treatments accounting for the uncertainty associated with the location and spatial variation of natural fractures. Together with an efficient parallel implementation, our approach provides a cost-efficient means of optimizing production processes in the field.
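As a rough 1D illustration of the diffusive representation of natural fractures, consider the sketch below. The exponential profile is the classical optimal damage profile of the AT2 phase-field model; scaling it by a fracture probability is an illustrative stand-in for the paper's probability-map-driven construction, and all locations and probabilities are hypothetical:

```python
import math

def diffuse_indicator(x, fractures, ell=0.05):
    """Diffusive phase field at x for (location, probability) fracture pairs.

    Each sharp fracture at x0 is smeared into exp(-|x - x0| / ell), the 1D
    optimal profile of the AT2 phase-field model with length scale ell;
    weighting by probability p is an illustrative assumption.
    """
    d = 0.0
    for x0, p in fractures:
        d = max(d, p * math.exp(-abs(x - x0) / ell))
    return d

# Two inferred natural fractures: one certain, one with 60% probability.
fracs = [(0.3, 1.0), (0.7, 0.6)]
field = [diffuse_indicator(i / 100, fracs) for i in range(101)]
```

In the full method the analogous field lives on an adaptively refined mesh, and a propagating hydraulic fracture interacts with these diffuse zones rather than with sharp discontinuities.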
In this work, we investigate different approaches to history matching of imperfect reservoir models while accounting for model error. The first approach (the base-case scenario) relies on direct Bayesian inversion using iterative ensemble smoothing with annealing schedules, without accounting for model error. In the second approach, the residual obtained after calibration is used to iteratively update the covariance matrix of the total error, which is a combination of model error and data error. In the third approach, a PCA-based error model is used to represent the model discrepancy during history matching. However, the prior for the PCA weights is quite subjective and is generally hard to define. Here, the prior statistics of the model-error parameters are estimated using pairs of accurate and inaccurate models. The fourth approach, inspired from
We present a novel sampling algorithm for characterization and uncertainty quantification of heterogeneous multiple facies reservoirs. The method implements a Bayesian inversion framework to estimate physically plausible porosity distributions. This inversion process incorporates data matching at the well locations and constrains the model space by adding
The proposed workflow uses an ensemble-based Markov chain Monte Carlo approach combined with sampling from probability distributions that are physically meaningful. Moreover, the method targets geostatistical modeling to specific zones in the reservoir and thus better satisfies the stationarity assumption inherent in geostatistical simulation techniques. Parameter sampling and geostatistical simulations are calculated through an inversion process. In other words, the models fit the known porosity field at the well locations and are structurally consistent with the main reservoir compartments, zones, and layers obtained from the seismic impedance volume. The new sampling algorithm ensures that the automated history-matching algorithm maintains diversity among ensemble members, avoiding underestimation of the uncertainty in the posterior probability distribution.
We evaluate the efficiency of the sampling methodology on a synthetic model of a waterflooded field. The predictive capability of the assimilated ensemble is assessed using production data and dynamic measurements. The quality of the results is also examined by comparing the geological realism of the assimilated ensemble with the reference probability distribution of the model parameters and by computing the predicted dynamic-data mismatch. Our numerical examples show that incorporating the seismically constrained models as prior information results in an efficient model-update scheme and a favorable history match.
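The Bayesian inversion idea underlying such workflows can be illustrated with a deliberately minimal Metropolis-Hastings sketch that samples a single porosity multiplier against a toy forward model. The actual workflow samples full porosity fields with ensemble-based MCMC and geostatistical simulation; everything below is an illustrative stand-in:

```python
import random, math

random.seed(1)

def forward(phi):
    """Toy forward model: 'production data' proportional to porosity."""
    return [phi * t for t in (1.0, 2.0, 3.0)]

data = [0.82, 1.58, 2.41]   # synthetic noisy observations from phi near 0.8
sigma = 0.05                # assumed data-error standard deviation

def log_post(phi):
    """Log posterior: uniform prior on (0, 1) times Gaussian likelihood."""
    if not 0.0 < phi < 1.0:
        return -math.inf
    return sum(-0.5 * ((d - m) / sigma) ** 2
               for d, m in zip(data, forward(phi)))

# Metropolis-Hastings with a Gaussian random-walk proposal.
chain, phi = [], 0.5
for _ in range(5000):
    prop = phi + random.gauss(0.0, 0.05)
    if math.log(random.random()) < log_post(prop) - log_post(phi):
        phi = prop               # accept the proposed porosity multiplier
    chain.append(phi)            # otherwise keep the current state

posterior_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The chain concentrates near the value most consistent with the data; in the full workflow, diversity-preserving ensemble moves play the role of the single random-walk proposal used here.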
Araujo, Mariela (Shell International Exploration and Production Inc.) | Chen, Chaohui (Shell International Exploration and Production Inc.) | Gao, Guohua (Shell International Exploration and Production Inc.) | Jennings, Jim (Shell International Exploration and Production Inc.) | Ramirez, Benjamin (Shell International Exploration and Production Inc.) | Xu, Zhihua (ExxonMobil) | Yeh, Tzu-hao (Shell International Exploration and Production Inc.) | Alpak, Faruk Omer (Shell International Exploration and Production Inc.) | Gelderblom, Paul (Shell International Exploration and Production Inc.)
Increased access to computational resources has allowed reservoir engineers to include assisted history matching (AHM) and uncertainty quantification (UQ) techniques as standard steps of reservoir management workflows. Several advanced methods have become available and are being used in routine activities without a proper understanding of their performance and quality. This paper provides recommendations on the efficiency and quality of different methods for applications to production forecasting, supporting the reservoir-management decision-making process.
Results from five advanced methods and two traditional methods were benchmarked in the study. The advanced methods include the nested-sampling method MultiNest, the integrated global-search Distributed Gauss-Newton (DGN) optimizer with Randomized Maximum Likelihood (RML), the integrated local-search DGN optimizer with a Gaussian Mixture Model (GMM), and two advanced Bayesian-inference-based methods from commercial simulation packages. Two traditional methods were also included for some test problems: the Markov chain Monte Carlo (MCMC) method, which is known to produce accurate results although it is too expensive for most practical problems, and a DoE/proxy-based method that is widely used and available in some form in most commercial simulation packages.
The methods were tested on three different cases of increasing complexity: a 1D simple model based on an analytical function with one uncertain parameter, a simple injector-producer well pair in the SPE01 model with eight uncertain parameters, and an unconventional reservoir model with one well and 24 uncertain parameters. A collection of benchmark metrics was considered to compare the results, but the most useful included the total number of simulation runs, sample size, objective function distributions, cumulative oil production forecast distributions, and marginal posterior parameter distributions.
MultiNest and MCMC were found to produce the most accurate results, but MCMC is too costly for practical problems. MultiNest is also costly, but it is much more efficient than MCMC and it may be affordable for some practical applications. The proxy-based method is the lowest-cost solution. However, its accuracy is unacceptably poor.
DGN-RML and DGN-GMM seem to have the best compromise between accuracy and efficiency, and the best of these two is DGN-GMM. These two methods may produce some poor-quality samples that should be rejected for the final uncertainty quantification.
The results from the benchmark study are somewhat surprising and raise awareness in the reservoir-engineering community of the quality and efficiency of the advanced and traditional methods used for AHM and UQ. Our recommendation is to use DGN-GMM instead of the traditional proxy-based methods for most practical problems, and to consider the more expensive MultiNest when the cost of running the reservoir models is moderate and high-quality solutions are desired.
We present a new digital solution based on a novel technique to predict the remaining performance of acid-gas membranes from field data. Gas membranes are widely used onshore and offshore for acid-gas removal from natural gas because of their efficiency and compactness. These systems are proven and well accepted; however, their performance is highly dependent on field operating practices and on the conditions of the natural gas stream that feeds the system. If operating conditions are not controlled, system performance can deteriorate. Weakened performance can lead to off-specification product gas, contractual penalties, unexpected downtime, and ultimately the risk of environmental impact. On the other hand, maintenance anxiety and uncertainty can lead to overspending on membrane-element replacement, increasing overall operating expenditure. We developed the new technique over the past two years to allow the system operator to anticipate performance upsets through predictive monitoring and active machine learning using field operations data from gas membrane systems. The technique adopts a recursive Bayesian estimation method, linear Kalman filtering, and allows operators to predict and proactively manage remaining membrane performance in the field, thereby optimizing membrane-replacement expenditure.
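As an illustration of the kind of recursive estimate involved, a minimal scalar linear Kalman filter tracking a degrading performance index is sketched below. The state model (a known, constant degradation per step), all variances, and the measurement series are illustrative assumptions, not the product's actual model:

```python
# Scalar linear Kalman filter: track a slowly degrading membrane-performance
# index (e.g., a normalized acid-gas removal efficiency) from noisy field
# measurements. All numbers below are illustrative assumptions.
degradation = 0.01   # assumed performance loss per time step
q_var = 1e-5         # process-noise variance
r_var = 4e-4         # measurement-noise variance

x, p = 1.0, 1e-2     # initial state estimate and its variance
measurements = [0.985, 0.978, 0.969, 0.961, 0.950, 0.942]

estimates = []
for z in measurements:
    # Predict: performance declines by a known amount each step.
    x_pred = x - degradation
    p_pred = p + q_var
    # Update: blend the prediction with the new field measurement.
    k = p_pred / (p_pred + r_var)      # Kalman gain
    x = x_pred + k * (z - x_pred)
    p = (1.0 - k) * p_pred
    estimates.append(x)

# Extrapolating the filtered state forward gives an estimate of remaining
# time until performance crosses a replacement threshold.
threshold = 0.90
steps_remaining = (x - threshold) / degradation
```

Because the filter carries a variance alongside the estimate, the same machinery yields confidence bounds on the remaining-performance forecast, which is what turns the raw field data into a proactive replacement-planning signal.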