Summary In this work, a probabilistic methodology for decline curve analysis (DCA) in unconventional reservoirs is presented using several Bayesian model-fitting algorithms and deterministic models. The deterministic models considered are the power law exponential (PLE) model, the stretched exponential production decline (SEPD) model, Duong's model, and the logistic growth analysis (LGA) model. Accurate production forecasting and uncertainty quantification were the primary objectives of this study. The Bayesian inferencing techniques described in this work use three sampling vehicles, namely Gibbs sampling (implemented in OpenBUGS, an open-source package), the Metropolis-Hastings (MH) algorithm, and approximate Bayesian computation (ABC), to sample parameter values from their posterior distributions. These sampling algorithms are applied in conjunction with the DCA models to estimate prediction intervals for the DCA parameters. Using these intervals, production is forecast and uncertainty bounds are established. To examine its reliability, the methodology was tested on more than 74 oil and gas wells located in the three main subplays of the Permian Basin, namely, the Delaware play, the Central Basin Platform, and the Midland play. Results show that the examined DCA-Bayesian models are well calibrated, yielding low production errors and narrow uncertainty bounds for the production history data sets. The LGA model gives the lowest prediction errors for all algorithms except MH, and the Gibbs algorithm gives nearly the lowest prediction errors for all DCA models except the Arps model. Prediction errors are highest in the Central Basin Platform. The methodology was also successfully applied to unconventional reservoirs with as little as six months of available production history. Depending on the amount of production history available, the probabilistic model that provides the best fit can vary. It is therefore recommended that all possible combinations of the deterministic models and Bayesian model-fitting algorithms be applied to the available production history, to gain more confidence in the conclusions related to production forecasts, reserves estimates, and uncertainty bounds. The novelty of this methodology lies in using multiple combinations of DCA-Bayesian models to achieve accurate reserves estimates and narrow uncertainty bounds. This paper can help assess shale plays, many of which are in the early stages of development when productivity estimates are carried out.
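As one concrete illustration of the MH sampling vehicle named above, the following Python sketch draws posterior samples of decline-curve parameters from rate data. The Arps hyperbolic model, the lognormal likelihood, the bounded flat priors, and the proposal scales are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

# Minimal sketch of random-walk Metropolis-Hastings for an Arps hyperbolic
# decline q(t) = qi / (1 + b*Di*t)^(1/b). Assumed: lognormal noise,
# flat priors with physical bounds, hand-tuned proposal scales.

def arps(t, qi, Di, b):
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

def log_posterior(theta, t, q_obs, sigma=0.2):
    qi, Di, b = theta
    if qi <= 0.0 or Di <= 0.0 or not 0.0 < b < 2.0:   # physical bounds as priors
        return -np.inf
    resid = np.log(q_obs) - np.log(arps(t, qi, Di, b))
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis_hastings(t, q_obs, theta0, steps=20000,
                        scale=(5.0, 1e-3, 0.02), seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logp = log_posterior(theta, t, q_obs)
    chain = np.empty((steps, theta.size))
    for i in range(steps):
        proposal = theta + rng.normal(0.0, scale)      # symmetric random walk
        logp_new = log_posterior(proposal, t, q_obs)
        if np.log(rng.uniform()) < logp_new - logp:    # MH acceptance rule
            theta, logp = proposal, logp_new
        chain[i] = theta
    return chain[steps // 2:]                          # drop burn-in
```

The retained half of the chain approximates the posterior; percentiles of rates simulated from it give the prediction intervals the abstract describes.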
Properly quantifying the uncertainty of model parameters and production forecasts after conditioning to production data is critically important for decision making, and it is an extremely challenging task. This paper presents a novel approach to generate approximate conditional realizations using the distributed Gauss-Newton (DGN) method together with a multiple local Gaussian approximation technique. Results are compared with those obtained from other approaches, such as randomized maximum likelihood (RML), the ensemble Kalman filter (EnKF), and Markov-chain Monte Carlo (MCMC). It is well known that oil- and gasfield development is a high-risk venture. Uncertainties originating from geological models (e.g., structure, stratigraphy, channels, and geobodies) are coupled with uncertainties of reservoir models (e.g., the distribution of permeability and porosity in the reservoir) and uncertainties of economic parameters (e.g., oil and gas prices and the costs associated with drilling and other operations).
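For orientation, the sketch below shows the classical Gauss-Newton update for a single RML-style conditional realization. It is only a schematic: the paper's DGN method approximates sensitivities from distributed simulation runs and builds multiple local Gaussian approximations, rather than using an explicit Jacobian as assumed here.

```python
import numpy as np

# One Gauss-Newton step on the RML-type objective
#   O(m) = 1/2 (m - m_pr)' Cm^-1 (m - m_pr) + 1/2 (g(m) - d)' Cd^-1 (g(m) - d),
# where g is the forward model and G its Jacobian at m (user-supplied here;
# DGN would estimate it from distributed runs instead).

def gauss_newton_step(m, m_pr, d_obs, g, G, Cm_inv, Cd_inv):
    r = g(m) - d_obs                                  # data residual
    grad = Cm_inv @ (m - m_pr) + G.T @ Cd_inv @ r     # gradient of O(m)
    H = Cm_inv + G.T @ Cd_inv @ G                     # Gauss-Newton Hessian
    return m - np.linalg.solve(H, grad)               # updated model estimate
```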
The dynamic nature of unconventional-reservoir developments calls for fast and reliable history-matching methods for simulation models. In this paper, the authors apply an assisted-history-matching (AHM) approach to a pair of wells in the Wolfcamp B and C formations of the Midland Basin, for which production history is recorded for two periods: primary production and gas injection (huff 'n' puff, or HNP). The recorded history of gas injection revealed severe interwell interactions, underscoring the importance of fracture-interference modeling. Fracture segments are modeled with an embedded discrete fracture model (EDFM), and interwell communication is modeled using long fractures that become active only during gas injection.
Summary In this work, we evaluate different algorithms to account for model errors while estimating the model parameters, especially when the model discrepancy (used interchangeably with "model error") is large. In addition, we introduce two new algorithms that are closely related to some of the published approaches under consideration. Considering all these algorithms, the first calibration approach (base-case scenario) relies on Bayesian inversion using iterative ensemble smoothing with annealing schedules, without any special treatment of the model error. In the second approach, the residual obtained after calibration is used to iteratively update the total error covariance, combining the effects of both model errors and measurement errors. In the third approach, a principal component analysis (PCA)-based error model is used to represent the model discrepancy during history matching. This leads to a joint inverse problem in which both the model parameters and the parameters of the PCA-based error model are estimated. For joint inversion within the Bayesian framework, prior distributions have to be defined for all the estimated parameters, and the prior distributions for the PCA-based error model parameters are generally hard to define. In this study, the prior statistics of the model discrepancy parameters are estimated using the outputs from pairs of high-fidelity and low-fidelity models generated from the prior realizations. The fourth approach is similar to the third approach; however, an additional covariance matrix of the difference between the PCA-based error model and the corresponding actual realizations of the prior error is added to the covariance matrix of the measurement error. The first newly introduced algorithm (fifth approach) relies on building an orthonormal basis for the misfit component of the error model, which is obtained from the difference between the PCA-based error model and the corresponding actual realizations of the prior error. The misfit component of the error model is subtracted from the data residual (the difference between observations and model outputs) to eliminate the incorrect relative contribution to the prediction from the physical model and the error model. In the second newly introduced algorithm (sixth approach), we use the PCA-based error model as a physically motivated bias-correction term together with an iterative update of the covariance matrix of the total error during history matching. All the algorithms are evaluated using three forecasting measures, and the results show that a good parameterization of the error model is needed to obtain a good estimate of the physical model parameters and to provide better predictions. In this study, the last three approaches (i.e., fourth, fifth, and sixth) outperform the other methods in terms of the quality of the estimated model parameters and the prediction capability of the calibrated imperfect models.
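A minimal sketch of the PCA-based error parameterization described above: pairs of high- and low-fidelity outputs from prior realizations yield error samples whose leading principal components parameterize the discrepancy as err(theta) ≈ mean + basis @ theta. Array names and the energy cutoff are illustrative.

```python
import numpy as np

# Build a PCA basis for model error from prior high-/low-fidelity pairs.
# d_highfid, d_lowfid: (n_realizations, n_data) arrays of model outputs.

def build_pca_error_model(d_highfid, d_lowfid, energy=0.95):
    errors = d_highfid - d_lowfid                     # prior model-error samples
    mean = errors.mean(axis=0)
    U, s, Vt = np.linalg.svd(errors - mean, full_matrices=False)
    var = s**2 / (len(errors) - 1)
    k = int(np.searchsorted(np.cumsum(var) / var.sum(), energy)) + 1
    basis = Vt[:k].T * (s[:k] / np.sqrt(len(errors) - 1))  # scaled components
    return mean, basis                 # err(theta) = mean + basis @ theta

# The scaling above gives theta a standard-normal prior, so theta can be
# estimated jointly with the physical parameters during history matching.
```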
Abstract Objectives/Scope A stable, single-well deconvolution algorithm introduced for well test analysis in the early 2000s makes it possible to obtain information about the reservoir system that is not always available from individual flow periods, for example the presence of heterogeneities and boundaries. One issue, recognised but largely ignored, is that of uncertainty in well test analysis results and non-uniqueness of the interpretation model. In a previous paper (SPE 164870), we assessed these with a Monte Carlo approach, in which multiple deconvolutions were performed over the ranges of expected uncertainties affecting the data (Monte Carlo deconvolution). Methods, Procedures, Process In this paper, we use a non-linear Bayesian regression model based on models of reservoir behaviour in order to make inferences about the interpretation model. This allows us to include uncertainty for the measurements, which are usually contaminated with large observational errors. We combine the likelihood with flexible probability distributions for the inputs (priors), and we use Markov Chain Monte Carlo algorithms to approximate the probability distribution of the result (posterior). Results, Observations, Conclusions We validate and illustrate the use of the algorithm by applying it to the same synthetic and field data sets as in SPE 164870, using a variety of tools to summarise and visualise the posterior distribution and to carry out model selection. Novel/Additive Information The approach used in this paper has several advantages over Monte Carlo deconvolution: (1) it gives access to meaningful system parameters associated with the flow behaviour in the reservoir; (2) it makes it possible to incorporate prior knowledge in order to exclude non-physical results; and (3) it allows parameter uncertainty to be quantified in a principled way by exploiting the advantages of the Bayesian approach.
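Advantage (2) above, excluding non-physical results through priors, can be illustrated with a few lines of Python. The parameter names and ranges below are hypothetical, not taken from the paper.

```python
import numpy as np

# Bounded uniform log-prior: proposals outside physical ranges get -inf,
# so an MCMC sampler never accepts them. Names and bounds are illustrative.

def log_prior(params, bounds):
    lp = 0.0
    for name, value in params.items():
        lo, hi = bounds[name]
        if not lo <= value <= hi:
            return -np.inf                   # reject non-physical proposals
        lp -= np.log(hi - lo)                # uniform prior density
    return lp

bounds = {"permeability_md": (0.01, 1000.0),         # hypothetical ranges
          "skin": (-5.0, 50.0),
          "distance_to_boundary_ft": (10.0, 1e5)}
print(log_prior({"permeability_md": 50.0, "skin": 2.0,
                 "distance_to_boundary_ft": 400.0}, bounds))
```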
Abstract Various physico-chemical processes affect Alkali Polymer (AP) Flooding. Core floods can be performed to determine ranges for the parameters used in the numerical models describing these processes. Because the parameters are uncertain, prior parameter ranges are introduced and then conditioned to observed data. It is challenging to determine posterior distributions of the various parameters because they need to be consistent with the different sets of data that are observed (e.g., pressures, oil and water production, and chemical concentration at the outlet). Here, we apply Machine Learning in a Bayesian framework to condition parameter ranges to a multitude of observed data. To generate the response of the parameters, we used a numerical model and applied Latin Hypercube Sampling (2,000 simulation runs) from the prior parameter ranges. To ensure that sufficient parameter combinations of the model comply with the various observed data, Machine Learning can be applied. After defining multiple Objective Functions (OFs) covering the different observed data (here, six Objective Functions), we used the Random Forest algorithm to generate a statistical model for each Objective Function. Next, parameter combinations that lead to results outside the acceptance limit of the first Objective Function are rejected. Then resampling is performed and the next Objective Function is applied, until the last Objective Function is reached. To account for parameter interactions, the resulting parameter distributions are tested against the limits of all the Objective Functions. The results show that posterior parameter distributions can be efficiently conditioned to the various sets of observed data. Insensitive parameter ranges are not modified because they are not influenced by the information from the observed data. This is crucial because parameters that are insensitive in the history period could become sensitive in the forecast if the production mechanism is changed. The workflow introduced here can also be applied to condition parameter ranges of field (re-)development projects to various observed data.
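A hedged sketch of this workflow: Latin Hypercube samples from the prior ranges, a Random Forest surrogate per Objective Function trained on simulator output, then sequential rejection against acceptance limits. The dimensions, limits, and simulator stub are illustrative assumptions; for brevity, a single dense prior sample is filtered sequentially rather than resampling between Objective Functions as the abstract describes.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_params, n_train = 8, 2000
sampler = qmc.LatinHypercube(d=n_params, seed=1)
X_train = sampler.random(n=n_train)            # scaled prior samples in [0,1]^d

# Placeholder for the numerical model: one misfit value per Objective Function.
def run_simulator(X):
    return {f"OF{i}": rng.random(len(X)) for i in range(1, 7)}

of_values = run_simulator(X_train)
surrogates = {name: RandomForestRegressor(n_estimators=200, random_state=1)
                      .fit(X_train, y) for name, y in of_values.items()}

accepted = sampler.random(n=100000)            # dense resampling of the prior
limits = {f"OF{i}": 0.5 for i in range(1, 7)}  # illustrative acceptance limits
for name, limit in limits.items():             # sequential rejection per OF
    accepted = accepted[surrogates[name].predict(accepted) <= limit]
# 'accepted' approximates the posterior sample; re-check all OF limits on it
# afterward to honor parameter interactions, as the abstract notes.
```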
Maraggi, Leopoldo M. Ruiz (The University of Texas at Austin) | Lake, Larry W. (The University of Texas at Austin) | Walsh, Mark P. (The University of Texas at Austin)
Extrapolation of history-matched single-phase flow solutions is a common practice to forecast production from tight oil reservoirs. Nonetheless, this approach (a) omits multiphase flow effects that exist below bubble point conditions and (b) does not quantify the uncertainty in the estimated ultimate recovery (EUR). This work combines a new two-phase (oil and gas) flow solution with a Bayesian framework to address the uncertainty in the EUR. We illustrate the application of the procedure to wells of the Permian Basin of West Texas.
First, we combine the oil and gas flow equations into a single dimensionless two-phase flow equation. The solution is a dimensionless flow rate model that can be easily scaled using two parameters: hydrocarbon pore volume and characteristic time. Second, this study generates probabilistic production forecasts using a Bayesian approach in which the parameters of the model are treated as random variables. We construct parallel Markov chains of the parameters using an adaptive Metropolis-Hastings (M-H) Markov chain Monte Carlo (MCMC) scheme for this purpose. Third, we evaluate the robustness of our inferences using posterior predictive checks (PPCs). Finally, we quantify the uncertainty in the EUR percentiles using the bootstrap method.
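One hedged reading of the two-parameter scaling in the first step, with notation (V_hc, tau) and normalization assumed rather than taken from the paper: a single dimensionless rate solution q_D(t_D) is stretched in time by the characteristic time and scaled in magnitude by the hydrocarbon pore volume, so uncertainty in the two parameters maps directly onto the EUR.

```python
import numpy as np

# Assumed scaling: q(t) = (V_hc / tau) * q_D(t / tau), so cumulative
# production tends to V_hc times the integral of q_D over dimensionless time.

def scaled_rate(t, V_hc, tau, q_D):
    return (V_hc / tau) * q_D(t / tau)

def cumulative(t, V_hc, tau, q_D):
    q = scaled_rate(t, V_hc, tau, q_D)
    return np.sum(0.5 * (q[1:] + q[:-1]) * np.diff(t))   # trapezoidal rule

t = np.linspace(0.0, 3650.0, 2000)                 # days
q_D = lambda x: np.exp(-x)                         # assumed dimensionless solution
print(cumulative(t, V_hc=5.0e6, tau=500.0, q_D=q_D))   # approaches V_hc at late time
```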
The results of this research are as follows. First, this work shows that EUR estimates based on single-phase flow solutions will consistently underestimate the ultimate oil recovery factors in solution-gas drives where the reservoir pressure is less than the bubble point. The degree of underestimation will depend on the reservoir and flowing conditions as well as the fluid properties. Second, the application of parallel Markov chains with an adaptive M-H MCMC scheme addresses the issues of convergence, mixing, and autocorrelation while speeding up the convergence of the chains. Third, we generate replicated data from our posterior distributions to assess the robustness of our inferences (PPCs). Finally, we use hindcasting to calibrate and strengthen our inferences. To our knowledge, all these approaches are novel in EUR forecasting.
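The replicated-data idea behind the PPCs can be sketched as follows; the multiplicative lognormal noise model and the summary statistic are illustrative assumptions.

```python
import numpy as np

# Posterior predictive check: for each posterior draw, simulate a replicated
# data set under the assumed noise model and compare a summary statistic
# with the observed one. Extreme p-values flag model inadequacy.

def ppc_pvalue(chain, t, q_obs, rate_model, sigma=0.2, stat=np.mean, seed=0):
    """chain: (n_draws, 2) posterior samples of (V_hc, tau)."""
    rng = np.random.default_rng(seed)
    exceed = 0
    for V_hc, tau in chain:
        q_pred = rate_model(t, V_hc, tau)
        q_rep = q_pred * np.exp(rng.normal(0.0, sigma, size=t.size))
        exceed += stat(q_rep) >= stat(q_obs)
    return exceed / len(chain)     # p-values near 0 or 1 signal misfit
```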
Using a Bayesian framework with a low-dimensional (two-parameter) physics-based model provides a fast and reliable technique to quantify the uncertainty in production forecasts. In addition, the use of parallel chains with an adaptive M-H MCMC scheme accelerates the rate of convergence and increases the robustness of the method.
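The bootstrap step named in the methods paragraph, quantifying uncertainty in the EUR percentiles themselves, can be sketched as below; the percentile choices and interval level are illustrative.

```python
import numpy as np

# Resample posterior EUR draws with replacement to put confidence bounds
# on the EUR percentiles (e.g., P10/P50/P90).

def bootstrap_eur_percentiles(eur_samples, q=(10, 50, 90),
                              n_boot=5000, seed=0):
    rng = np.random.default_rng(seed)
    boot = np.empty((n_boot, len(q)))
    for i in range(n_boot):
        resample = rng.choice(eur_samples, size=len(eur_samples), replace=True)
        boot[i] = np.percentile(resample, q)
    lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)  # 95% interval per percentile
    return {p: (l, h) for p, l, h in zip(q, lo, hi)}
```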
Forecasting production of tight oil wells poses two main challenges. First, the increasing number of tight oil wells requires forecasting their future production and estimating their reserves with a limited amount of data, in a fast and reliable way, while honoring the underlying physics, such as fluid flow and fracture configuration. Second, most of these wells behave as semi-infinite-acting systems (transient flow regime) in which the drainage volume grows continuously over time because the boundaries of the reservoir have not yet been observed. The uncertainty in the magnitude of the drainage volume of these reservoirs translates directly into uncertainty in the EUR estimates (because the EUR is proportional to the drainage volume of the system).