A two-part theorem relating conditional probability to unconditional (prior) probability, used in value-of-information problems but also important to acknowledge when estimating probabilities for geologically dependent prospects.
Several frac design models have been developed for unconventional and tight gas reservoirs to predict hydraulic fracture geometries. However, despite the considerable uncertainty associated with estimating the stimulated reservoir volume, these models do not quantify the probabilistic effects of that uncertainty on their estimates. Recent field tests show a gross overestimation of modeled fracture heights (sometimes by a factor of 2 to 3), in many cases due to the presence of laminations. In this work we introduce a Bayesian methodology for probabilistic hydraulic fracture design that quantifies the uncertainty in predicting the fracture geometry in the presence of geologic factors such as laminations.
To apply Bayesian inference to the deterministic frac design models, the design parameters are linked to Bayes' theorem by taking the prior distribution to be the distribution of frac design parameters before any treating pressure data have been observed; this can be a uniform distribution over practical ranges of the parameters. The likelihood function is the conditional probability of the observed treating pressures given the frac design parameters. The posterior distribution is the distribution of the frac design parameters after all the available treating pressure data have been taken into account. In a Bayesian model, the goal is to compute the posterior distribution once data have been observed, which in the developed methodology are the observed treating pressure data. The effects of laminations are accounted for in a geomechanical model capable of representing the weak interfaces and their shearing during hydraulic fracturing. The resulting estimated fracture height is used as a prior in the Bayesian hydraulic fracture model with proppant transport.
The estimated treating pressures are a function of several important variables, such as the pressure-dependent leak-off, perforation friction coefficient, pipe friction coefficient, reservoir pressure, stresses, and other parameters related to the geologic variability. In the Bayesian methodology, these parameters are either treated as random variables or their distributions are derived from geologic models, instead of assuming deterministic values in the frac design model. A Markov chain of the history-matching parameters is then constructed using Markov chain Monte Carlo (MCMC) with the Metropolis algorithm. By comparing the simulated treating pressures with the observed treating pressures, the Markov chain converges towards the most probable parameters, which are then used to quantify the uncertainty in predicting the fracture geometry. Application to the impact of laminations on fracture height shows that the Bayesian approach provides a more realistic range of fracture heights, much lower than those derived using current models and in line with measured field data.
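The Metropolis loop described above can be sketched in a few lines. The forward model, parameter range, and noise level below are hypothetical stand-ins for the actual frac design simulator and treating pressure data; only the structure of the prior/likelihood/posterior and the random-walk acceptance rule carries over.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy forward model: treating pressure as a linear function of
# a single frac-design parameter theta (a stand-in for the real simulator).
def simulate_pressure(theta, t):
    return 5000.0 + theta * t  # psi, illustrative only

t_obs = np.linspace(0.0, 1.0, 20)
theta_true = 150.0
sigma = 25.0  # assumed measurement noise, psi
p_obs = simulate_pressure(theta_true, t_obs) + rng.normal(0.0, sigma, t_obs.size)

def log_prior(theta):
    # Uniform prior over a practical range of the design parameter
    return 0.0 if 0.0 <= theta <= 500.0 else -np.inf

def log_likelihood(theta):
    resid = p_obs - simulate_pressure(theta, t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

def log_posterior(theta):
    lp = log_prior(theta)
    return lp + log_likelihood(theta) if np.isfinite(lp) else -np.inf

# Metropolis random-walk chain over the design parameter
theta, chain = 250.0, []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 10.0)
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
        theta = prop
    chain.append(theta)

posterior = np.array(chain[1000:])  # discard burn-in
print(posterior.mean(), posterior.std())
```

The posterior mean recovers the parameter that best matches the observed pressures, and the posterior spread quantifies the remaining uncertainty in the fracture-geometry prediction.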
The proposed Bayesian methodology provides a means to generate probabilistic estimates of complex geologic factors, such as laminations, that have a major impact on the fracture geometry. The resulting fracture geometry accounts for all the geologic uncertainties. Bayesian inference, reservoir engineering, and hydraulic fracturing are integrated with geomechanics to address the important issue of fracture height growth.
Recent studies have indicated that Huff-n-Puff (HNP) gas injection has the potential to recover an additional 30-70% oil from multi-fractured horizontal wells in shale reservoirs. Nonetheless, this technique is very sensitive to production constraints and is impacted by uncertainty related to measurement quality (particularly frequency and resolution) and a lack of constraining data. In this paper, a Bayesian workflow is provided to optimize the HNP process under uncertainty using a Duvernay shale well as an example.
Compositional simulations are conducted that incorporate a tuned PVT model and a set of measured cyclic injection/compaction pressure-sensitive permeability data. Markov chain Monte Carlo (MCMC) is used to estimate the posterior distributions of the model's uncertain variables by matching the primary production data. The MCMC process is accelerated by employing an accurate proxy model (kriging), which is updated using a highly adaptive sampling algorithm. Gaussian processes are then used to optimize the HNP control variables by maximizing the lower confidence interval (μ − σ) of cumulative oil production (after 10 years) across a fixed ensemble of uncertain variables sampled from the posterior distributions.
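As a rough illustration of the last step, the sketch below fits a minimal kriging (Gaussian process) proxy to a handful of simulated responses of one control variable and selects the control that maximizes the lower confidence interval μ − σ. The objective function, kernel, and ranges are invented for illustration and are not the paper's Duvernay model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in objective: cumulative oil after 10 years as a
# function of a single control variable (e.g. injection time, months).
def cum_oil(x):
    return -(x - 1.5) ** 2 + 10.0 + rng.normal(0.0, 0.1)

# A few expensive "simulation" runs to train the kriging proxy
X = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
y = np.array([cum_oil(x) for x in X])

def rbf(a, b, ls=1.0):
    # Squared-exponential covariance between two sets of points
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

noise = 1e-2
K = rbf(X, X) + noise * np.eye(X.size)
K_inv = np.linalg.inv(K)

# GP posterior mean and standard deviation on a dense grid of controls
Xs = np.linspace(0.0, 5.0, 200)
Ks = rbf(Xs, X)
mu = Ks @ K_inv @ y
var = 1.0 + noise - np.sum((Ks @ K_inv) * Ks, axis=1)
sd = np.sqrt(np.maximum(var, 0.0))

# Robust choice: maximize the lower confidence interval mu - sigma
best = Xs[np.argmax(mu - sd)]
print(f"control maximizing mu - sigma: {best:.2f}")
```

Maximizing μ − σ rather than μ alone penalizes control settings whose predicted performance is uncertain, which is what makes the resulting operating strategy robust.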
The uncertain variable space includes several parameters representing reservoir and fracture properties. The posterior distributions for some parameters, such as primary fracture permeability and effective half-length, are narrower, while wider distributions are obtained for other parameters. The results indicate that the impact of uncertain variables on HNP performance is nonlinear. Some uncertain variables (such as molecular diffusion) that do not show strong sensitivity during the primary production strongly impact gas injection HNP performance. The results of optimization under uncertainty confirm that the lower confidence interval of cumulative oil production can be maximized by an injection time of around 1.5 months, a production time of around 2.5 months, and very short soaking times. In addition, a maximum injection rate and a flowing bottomhole pressure around the bubble point are required to ensure maximum incremental recovery. Analysis of the objective function surface highlights some other sets of production constraints with competitive results. Finally, the optimal set of production constraints, in combination with an ensemble of uncertain variables, results in a median HNP cumulative oil production that is 30% greater than that for primary production.
The application of a Bayesian framework for optimizing the HNP performance in a real shale reservoir is introduced for the first time. This work provides practical guidelines for the efficient application of advanced machine learning techniques for optimization under uncertainty, resulting in better decision making.
Da Silva Moreira, Paulo Henrique (LRAP / Universidade Federal do Rio de Janeiro) | Gomes da Silveira, Thaís Márcia (LRAP / Universidade Federal do Rio de Janeiro) | Drexler, Santiago (LRAP / Universidade Federal do Rio de Janeiro) | Couto, Paulo (LRAP / Universidade Federal do Rio de Janeiro)
Cost-effective implementation of oil recovery systems requires reliable prediction of reservoir performance, which in turn requires simulating the fluid flow processes in the reservoir and measuring the rock and fluid properties that determine reservoir behavior. A good prediction relies on accurate values of reservoir physical properties. Carbonate rocks in the Brazilian Pre-salt are known for their heterogeneity; characterizing their physical properties represents a great challenge, and the combination of experimental and computational techniques leads to a more comprehensive understanding of reservoir behavior.
In the present work, the relative permeability curves of a carbonate core sample with respect to oil and water are calculated by matching the data obtained in a lab-scale unsteady-state core flood experiment carried out at the high-pressure, high-temperature conditions characteristic of Brazilian Pre-salt reservoirs. Corey-type equations were used to model the relative permeability because of their simplicity and the small number of parameters involved. The Markov chain Monte Carlo (MCMC) method was used as the optimization tool, taking the fluid production and pressure drop measurements collected during the core flood experiment as input data. An analysis of the sensitivity coefficients was carried out in order to deal with possible linear dependences among the terms to be estimated. The Markov chain was generated and its convergence verified. The posterior distributions of the constant terms in the Corey equations were calculated, and their mean values were applied to calculate the relative permeability curves for the oil and water phases. The range of water saturation in which the relative permeability curves describe the core conditions after the breakthrough time, given the occurrence of capillary end-effects, was calculated. The history match of fluid production and pressure drop was carried out, showing a good fit between the pressure curves. A gap was observed between the production curves because the experimental measurements accounted for the cumulative volume of oil and water, while the theoretical curve accounted for the oil volume only.
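For reference, Corey-type relative permeability curves of the kind fitted here can be evaluated as follows; the endpoint saturations, endpoint permeabilities, and exponents below are illustrative placeholders, not the posterior means estimated in this work.

```python
import numpy as np

# Corey-type relative permeability model. All parameter values below
# (connate water saturation swc, residual oil saturation sor, endpoints,
# exponents) are illustrative placeholders.
def corey_krw(sw, swc=0.2, sor=0.25, krw_max=0.3, nw=3.0):
    swn = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)  # normalized Sw
    return krw_max * swn ** nw

def corey_kro(sw, swc=0.2, sor=0.25, kro_max=0.8, no=2.0):
    son = np.clip((1.0 - sw - sor) / (1.0 - swc - sor), 0.0, 1.0)
    return kro_max * son ** no

# Tabulate both curves over the mobile saturation range
sw = np.linspace(0.2, 0.75, 12)
for s, krw, kro in zip(sw, corey_krw(sw), corey_kro(sw)):
    print(f"Sw={s:.2f}  krw={krw:.4f}  kro={kro:.4f}")
```

The MCMC step in the paper amounts to estimating the constant terms of these two equations (endpoints and exponents) from the production and pressure-drop data.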
Gao, Guohua (Shell Global Solutions (US)) | Vink, Jeroen C. (Shell Global Solutions International) | Chen, Chaohui (Shell International Exploration and Production) | Araujo, Mariela (Shell Global Solutions (US)) | Ramirez, Benjamin A. (Shell International Exploration and Production) | Jennings, James W. (Shell International Exploration and Production) | El Khamra, Yaakoub (Shell Global Solutions (US)) | Ita, Joel (Shell Global Solutions (US))
Uncertainty quantification of production forecasts is crucially important for business planning of hydrocarbon-field developments. This is still a very challenging task, especially when subsurface uncertainties must be conditioned to production data. Many different approaches have been proposed, each with their strengths and weaknesses. In this work, we develop a robust uncertainty-quantification work flow by seamless integration of a distributed-Gauss-Newton (GN) (DGN) optimization method with a Gaussian mixture model (GMM) and parallelized sampling algorithms. Results are compared with those obtained from other approaches.
Multiple local maximum-a-posteriori (MAP) estimates are determined with the local-search DGN optimization method. A GMM is constructed to approximate the posterior probability-density function (PDF) by reusing simulation results generated during the DGN minimization process. The traditional acceptance/rejection (AR) algorithm is parallelized and applied to improve the quality of GMM samples by rejecting unqualified samples. AR-GMM samples are independent, identically distributed samples that can be directly used for uncertainty quantification of model parameters and production forecasts.
The proposed method is first validated with 1D nonlinear synthetic problems with multiple MAP points. The AR-GMM samples are better than the original GMM samples. The method is then tested with a synthetic history-matching problem using the SPE01 reservoir model (Odeh 1981; Islam and Sepehrnoori 2013) with eight uncertain parameters. The proposed method generates conditional samples that are better than or equivalent to those generated by other methods, such as Markov-chain Monte Carlo (MCMC) and global-search DGN combined with the randomized-maximum-likelihood (RML) approach, but have a much lower computational cost (by a factor of five to 100). Finally, it is applied to a real-field reservoir model with synthetic data, with 235 uncertain parameters. A GMM with 27 Gaussian components is constructed to approximate the actual posterior PDF. There are 105 AR-GMM samples accepted from the 1,000 original GMM samples, and they are used to quantify the uncertainty of production forecasts. The proposed method is further validated by the fact that production forecasts for all AR-GMM samples are quite consistent with the production data observed after the history-matching period.
The newly proposed approach for history matching and uncertainty quantification is quite efficient and robust. The DGN optimization method can efficiently identify multiple local MAP points in parallel. The GMM yields proposal candidates with sufficiently high acceptance ratios for the AR algorithm. Parallelization makes the AR algorithm much more efficient, which further enhances the efficiency of the integrated work flow.
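The acceptance/rejection step over GMM proposals can be sketched on a toy bimodal posterior. The target density, the two-component proposal, and the envelope constant below are invented for illustration; the structure (accept a GMM sample x with probability p(x)/(M q(x))) is what carries over to the high-dimensional setting.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy bimodal "posterior" with two MAP points, standing in for the PDF
# conditioned on production data (illustrative, not the papers' model).
def posterior_pdf(x):
    return 0.5 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x - 2.0) ** 2)

# GMM proposal built around the two local MAP estimates
means = np.array([-2.0, 2.0])
sds = np.array([1.2, 1.2])
weights = np.array([0.5, 0.5])

def gmm_pdf(x):
    x = np.atleast_1d(x)
    comps = weights / (sds * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x[:, None] - means) / sds) ** 2)
    return comps.sum(axis=1)

def gmm_sample(n):
    idx = rng.choice(means.size, size=n, p=weights)
    return rng.normal(means[idx], sds[idx])

# Envelope constant M so that posterior_pdf(x) <= M * gmm_pdf(x) on the grid
grid = np.linspace(-8.0, 8.0, 1601)
M = np.max(posterior_pdf(grid) / gmm_pdf(grid))

# Acceptance/rejection: keep a GMM sample x with probability p(x) / (M q(x))
proposals = gmm_sample(1000)
u = rng.uniform(size=proposals.size)
accepted = proposals[u < posterior_pdf(proposals) / (M * gmm_pdf(proposals))]
print(f"accepted {accepted.size} of 1000 GMM samples")
```

Because the GMM closely tracks the posterior, the acceptance ratio stays high, and the accepted samples are independent, identically distributed draws from the target, exactly the property the workflow exploits for forecasting.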
Hong, Aojie (National IOR Centre of Norway and University of Stavanger) | Bratvold, Reidar B. (National IOR Centre of Norway and University of Stavanger) | Lake, Larry W. (University of Texas at Austin) | Ruiz Maraggi, Leopoldo M. (University of Texas at Austin)
Summary: Decline-curve analysis (DCA) for unconventional plays requires a model that can capture the characteristics of different flow regimes. Thus, various models have been proposed. Traditionally, in probabilistic DCA, an analyst chooses a single model that is believed to best fit the data. However, several models might fit the data almost equally well, and the one that best fits the data might not best represent the flow characteristics. Therefore, uncertainty remains regarding which is the "best" model. This work aims to integrate model uncertainty in probabilistic DCA for unconventional plays. Instead of identifying a single "best" model, we propose to regard any model as potentially good, with goodness characterized by a probability. The probability of a model being good is interpreted as a measure of the relative truthfulness of this model compared with the other models. This probability is subsequently used to weight the model forecast. Bayes' law is used to assess the model probabilities for given data. Multiple samples of the model-parameter values are obtained using maximum likelihood estimation (MLE) with Monte Carlo simulation. Thus, the unique probabilistic forecasts of each individual model are aggregated into a single probabilistic forecast, which incorporates model uncertainty along with the intrinsic uncertainty (i.e., the measurement errors) in the given data. We demonstrate and conclude that using the proposed approach can mitigate over/underestimates resulting from using a single decline-curve model for forecasting. The proposed approach performs well in propagating model uncertainty to uncertainty in production forecasting; that is, we determine a forecast that represents uncertainty given multiple possible models conditioned to the data.
The field data show that no one model is the most probable to be good for all wells. The novelties of this work are that probability is used to describe the goodness of a model; a Bayesian approach is used to integrate the model uncertainty in probabilistic DCA; the approach is applied to actual field data to identify the most-probable model given the data; and we demonstrate the value of using this approach to consider multiple models in probabilistic DCA for unconventional plays. Introduction Although numerical techniques for forecasting hydrocarbon production have developed rapidly over the past decades, DCA remains an industry-accepted method and is used extensively in the oil and gas industry. Decline-curve models are very computationally attractive because only production data, which can be easily acquired, are required for determining a few parameter values through history matching.
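The model-probability weighting described in the summary can be sketched with two candidate decline-curve models. The models, parameter values, and noise level below are assumptions for illustration; in the paper the parameter values come from MLE with Monte Carlo sampling rather than being fixed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two candidate decline-curve models (parameter values assumed known here
# for brevity; in practice they come from MLE with Monte Carlo sampling).
def exponential(t, qi=1000.0, d=0.08):
    return qi * np.exp(-d * t)

def hyperbolic(t, qi=1000.0, di=0.12, b=0.9):
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(0, 24)           # months of observed production
sigma = 20.0                   # assumed measurement error, rate units
q_obs = hyperbolic(t) + rng.normal(0.0, sigma, t.size)  # synthetic "truth"

models = {"exponential": exponential, "hyperbolic": hyperbolic}

# Bayes' law with equal prior model probabilities:
# P(model | data) is proportional to P(data | model)
log_like = {name: -0.5 * np.sum(((q_obs - f(t)) / sigma) ** 2)
            for name, f in models.items()}
shift = max(log_like.values())                 # avoid numerical underflow
weights = {name: np.exp(ll - shift) for name, ll in log_like.items()}
total = sum(weights.values())
probs = {name: w / total for name, w in weights.items()}

# Probability-weighted aggregate forecast at t = 60 months
t_fc = 60.0
forecast = sum(probs[name] * models[name](t_fc) for name in models)
print(probs)
print(f"aggregated forecast at {t_fc:.0f} months: {forecast:.1f}")
```

Rather than discarding the weaker model, its forecast simply receives a small weight, which is how the approach mitigates over/underestimates from committing to a single decline-curve model.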
A challenge in oil-reservoir studies is evaluating the ability of geomechanical, statistical, and geophysical methods to predict discrete geological features. This problem arises frequently with fracture corridors, which are discrete, tabular subvertical fracture clusters. Fracture corridors can be inferred from well data such as horizontal-borehole-image logs. Unfortunately, well data, and especially borehole image logs, are sparse, and predictive methods are needed to fill in the gap between wells. One way to evaluate such methods is to compare predicted and inferred fracture corridors statistically, using chi-squared statistics and contingency tables.
In this article, we propose a modified contingency table to validate fracture-corridor-prediction techniques. We introduce two important modifications to capture special aspects of fracture corridors. The first modification is the incorporation of exclusion zones where no fracture corridors can exist, and the second modification is taking into consideration the fuzzy nature of fracture-corridor indicators from wells such as circulation losses. An indicator is fuzzy when it has more than one possible interpretation. The reliability of an indicator is the probability that it correctly suggests a fracture corridor. The indicators with reliability of unity are hard indicators, and “soft” and “fuzzy” indicators are those with reliability that is less than unity.
A structural grid is overlaid on the reservoir top in an oil field. Each cell of the grid is examined for the presence and reliability of inferred fracture corridors and exclusion zones, and for the confidence level of predicted fracture corridors. The results are summarized in a contingency table and are used to calculate the chi-squared statistic and the conditional probability of having an actual fracture corridor given a predicted one.
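The cell-count bookkeeping reduces to a standard 2×2 contingency-table calculation, sketched below with invented counts; the exclusion-zone and fuzzy-reliability modifications proposed in the article are omitted for brevity.

```python
import numpy as np

# Hypothetical 2x2 contingency table of grid cells (counts are invented):
# rows: fracture corridor predicted / not predicted
# cols: fracture corridor inferred from wells / not inferred
observed = np.array([[30, 20],
                     [15, 135]])

# Chi-squared statistic from expected counts under independence
row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row @ col / observed.sum()
chi2 = np.sum((observed - expected) ** 2 / expected)

# Conditional probability of an actual corridor given a predicted corridor
p_actual_given_predicted = observed[0, 0] / observed[0].sum()

print(f"chi-squared = {chi2:.1f}")
print(f"P(actual | predicted) = {p_actual_given_predicted:.2f}")
```

A large chi-squared value indicates that predictions and well-inferred corridors are statistically associated, while the conditional probability measures how often a prediction actually corresponds to a corridor.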
Three actual case studies are included to demonstrate how single or joint predictive methods can be statistically evaluated and how conditional probabilities are calculated using the modified contingency tables. The first example tests seismic faults as indicators of fracture corridors. The other examples test fracture corridors predicted by a simple geomechanical method.
Wheeler, Mary F. (The University of Texas at Austin, USA) | Srinivasan, Sanjay (Pennsylvania State University, USA) | Lee, Sanghyun (Florida State University, USA) | Singh, Manik (Pennsylvania State University, USA)
Optimal design of hydraulic fractures is controlled by the distribution of natural fractures in the reservoir. Due to sparse information, there is uncertainty associated with the prediction of the natural fracture system. Our objective here is to: i) Quantify uncertainty associated with prediction of natural fractures using micro-seismic data and a Bayesian model selection approach, and ii) Use fracture probability maps to implement a finite element phase-field approach for modeling interactions of propagating fractures with natural fractures.
The proposed approach employs state-of-the-art numerical modeling of natural and hydraulic fractures using a diffusive adaptive finite element phase-field approach. The diffusive phase field is defined using the probability map describing the uncertainty in the spatial distribution of natural fractures. That probability map is computed using a model selection procedure that utilizes a suite of prior models for the natural fracture network and a fast proxy to quickly evaluate the forward seismic response corresponding to slip events along fractures. Employing indicator functions, diffusive fracture networks are generated utilizing an accurate computational adaptive mesh scheme based on a posteriori error estimators.
The coupled algorithm was validated with existing benchmark problems, which include prototype computations with fracture propagation and reservoir flows in a highly heterogeneous reservoir with natural fractures. Implementation of an algorithm for computing the fracture probability map, based on synthetic micro-seismic data mimicking a Fort Worth basin data set, reveals consistency between the interpreted fracture sets and those observed in the reference. Convergence of iterative solvers and the numerical efficiency of the methods were tested against different examples, including field-scale problems. The results reveal that interpreting the uncertainty pertaining to the presence of fractures, and utilizing that uncertainty within the phase-field approach to simulate the interactions between induced and natural fractures, yields complex structures that include fracture branching, fracture hooking, etc.
The novelty of this work lies in the efficient integration of phase-field fracture propagation models with diffusive natural fracture networks and a stochastic representation of the uncertainty associated with the prediction of natural fractures in a reservoir. The presented method enables practicing engineers to design hydraulic fracturing treatments while accounting for the uncertainty associated with the location and spatial variation of natural fractures. Together with an efficient parallel implementation, our approach allows for a cost-efficient way to optimize production processes in the field.
In this work, we investigate different approaches for history matching of imperfect reservoir models while accounting for model error. The first approach (the base case scenario) relies on direct Bayesian inversion using iterative ensemble smoothing with annealing schedules, without accounting for model error. In the second approach, the residual obtained after calibration is used to iteratively update the covariance matrix of the total error, which is a combination of model error and data error. In the third approach, a PCA-based error model is used to represent the model discrepancy during history matching. However, the prior for the PCA weights is quite subjective and is generally hard to define; here, the prior statistics of the model error parameters are estimated using pairs of accurate and inaccurate models. The fourth approach, inspired from
We present a novel sampling algorithm for characterization and uncertainty quantification of heterogeneous multiple facies reservoirs. The method implements a Bayesian inversion framework to estimate physically plausible porosity distributions. This inversion process incorporates data matching at the well locations and constrains the model space by adding
The proposed workflow uses an ensemble-based Markov chain Monte Carlo approach combined with sampling from probability distributions that are physically meaningful. Moreover, the method targets geostatistical modeling to specific zones in the reservoir and thereby better honors the stationarity assumption inherent in geostatistical simulation techniques. Parameter sampling and geostatistical simulations are calculated through an inversion process; in other words, the models fit the known porosity field at the well locations and are structurally consistent within the main reservoir compartments, zones, and layers obtained from the seismic impedance volume. The new sampling algorithm ensures that the automated history matching algorithm maintains diversity among ensemble members, avoiding underestimation of the uncertainty in the posterior probability distribution.
We evaluate the efficiency of the sampling methodology on a synthetic model of a waterflooded field. The predictive capability of the assimilated ensemble is assessed using production data and dynamic measurements. The quality of the results is also examined by comparing the geological realism of the assimilated ensemble with the reference probability distribution of the model parameters and by computing the predicted dynamic data mismatch. Our numerical examples show that incorporating the seismically constrained models as prior information results in an efficient model update scheme and favorable history matching.