Decline curve analysis is widely used in industry to perform production forecasting and to estimate reserves volumes. A useful technique for verifying the validity of a decline model is to estimate the Arps decline parameters, the loss ratio and the b-factor, as functions of time. These estimates are used to check the model fit and to determine the flow regimes under which the reservoir produces. Existing methods to estimate the b-factor are heavily impacted by noise in production data. In this work, we introduce a new method to estimate the Arps decline parameters.
We treat the loss ratio and the b-factor over time as parameters to be estimated in a Bayesian framework. We include prior information on the parameters in the model. This serves to regularize the solution and prevent noise in the data from being amplified. We then fit the parameters to the model using Markov chain Monte Carlo methods to obtain probability distributions of the parameters. These distributions characterize the uncertainty in the parameters being estimated. We then compare our method with existing methods using simulated and field data.
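The estimation step described above can be illustrated with a minimal sketch, not the authors' exact formulation: a random-walk Metropolis sampler with uniform priors, fitted to synthetic Arps hyperbolic production data. All parameter values, step sizes, and noise levels below are hypothetical.

```python
import math
import random

random.seed(0)

# Arps hyperbolic rate decline: q(t) = qi / (1 + b*Di*t)^(1/b)
def arps_hyperbolic(t, qi, Di, b):
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

# Synthetic monthly production data with 5% multiplicative noise
true_qi, true_Di, true_b = 1000.0, 0.1, 0.8
times = [float(m) for m in range(1, 61)]
data = [arps_hyperbolic(t, true_qi, true_Di, true_b) * (1.0 + random.gauss(0.0, 0.05))
        for t in times]

def log_posterior(qi, Di, b, sigma=0.05):
    # Uniform priors act as hard bounds; this is where prior information
    # regularizes the solution against noise amplification
    if not (100.0 < qi < 5000.0 and 0.001 < Di < 1.0 and 0.01 < b < 2.0):
        return -math.inf
    # Gaussian likelihood on the relative error
    return sum(-0.5 * ((q - arps_hyperbolic(t, qi, Di, b)) / (sigma * q)) ** 2
               for t, q in zip(times, data))

# Random-walk Metropolis sampler over (qi, Di, b)
theta = [800.0, 0.2, 0.5]
lp = log_posterior(*theta)
samples = []
for i in range(20000):
    prop = [theta[0] + random.gauss(0.0, 20.0),
            theta[1] + random.gauss(0.0, 0.01),
            theta[2] + random.gauss(0.0, 0.05)]
    lp_prop = log_posterior(*prop)
    if math.log(random.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i > 5000:                      # discard burn-in
        samples.append(theta[2])      # keep the b-factor samples

b_mean = sum(samples) / len(samples)  # posterior mean of the b-factor
```

The retained samples characterize the posterior distribution of the b-factor, so the uncertainty statement comes for free from the same chain that produces the point estimate.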
We show that our method produces smooth loss ratio and b-factor estimates over time. Estimates from the three-point derivative method are not calibrated to the data and result in biased estimates of the Arps parameters. This can lead to misleading fits in decline curve analysis and unreliable estimates of reserves. We show that our technique helps identify the end of linear flow and the start of boundary-dominated flow. We use our method on simulated data, with and without noise. Finally, we demonstrate the validity of our method on field cases.
Fitting a decline curve using the loss ratio and b-factor plots is a powerful technique that can highlight important features in the data and the possible points of failure of a model. Calculating these plots using the Bourdet three-point derivative induces bias and magnifies noise. Our analysis ensures that this estimation is robust and repeatable by adding prior information on the parameters to the model and by calibrating the estimates to the data.
The reliability of subsurface assessment for different field development scenarios depends on how effectively the uncertainty in production forecasts is quantified. There is a substantial body of work in the literature on methods to quantify this uncertainty. The objective of this paper is to revisit and compare these probabilistic uncertainty quantification techniques through their application to assisted history matching of a deep-water offshore waterflood field. The paper addresses the benefits, limitations, and best criteria for applicability of each technique.
Three probabilistic history matching techniques commonly practiced in the industry are discussed: Design of Experiments (DoE) with rejection sampling from a proxy, Ensemble Smoother (ES), and Genetic Algorithm (GA). The model used for this study is an offshore waterflood field in the Gulf of Mexico. Posterior distributions of global subsurface uncertainties (e.g., regional pore volume and oil-water contact) were estimated with each technique conditioned to the injection and production data.
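The rejection-sampling step of the DoE workflow can be sketched in a few lines, assuming a one-parameter problem; the quadratic proxy mismatch function, parameter range, and noise scaling below are hypothetical stand-ins for a proxy fitted to actual DoE simulation runs.

```python
import math
import random

random.seed(1)

# Hypothetical proxy fitted from DoE runs: data mismatch as a cheap quadratic
# function of one uncertainty parameter (e.g., a regional pore-volume multiplier)
def proxy_mismatch(pv_mult):
    return (pv_mult - 1.2) ** 2 / 0.02

def likelihood(pv_mult):
    return math.exp(-0.5 * proxy_mismatch(pv_mult))

# Rejection sampling: draw from the uniform prior, accept with probability
# L(x) / L_max (here L_max = 1 at the proxy's best match)
posterior = []
while len(posterior) < 2000:
    x = random.uniform(0.8, 1.6)
    if random.random() < likelihood(x):
        posterior.append(x)

post_mean = sum(posterior) / len(posterior)
```

Because acceptance is weighted by the proxy likelihood, the accepted samples approximate the posterior distribution of the uncertainty parameter without any further simulation runs.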
The three probabilistic history matching techniques were applied to a deep-water field with 13 years of production history. The first 8 years of production data were used for history matching and for estimating the posterior distributions of the geologic uncertainty parameters. While the convergence behavior and the shapes of the posterior distributions differed, consistent posterior means were obtained from the Bayesian workflows (DoE and ES). In contrast, the application of GA showed differences in the posterior distributions of the geological uncertainty parameters, especially those with small sensitivity to the production data. We then conducted production forecasts that included infill wells and evaluated production performance using the sample means of the posterior geologic uncertainty parameters. The robustness of the solution was examined by performing history matching multiple times with different initial sample points (e.g., random seeds). This confirmed that heuristic optimization techniques such as GA were unstable, since the parameter setup for the optimizer had a large impact on uncertainty characterization and production performance.
This study provides guidelines for obtaining stable solutions from these history matching techniques under different conditions, such as the number of simulation model realizations, the number of uncertainty parameters, and the number of data points (e.g., the maturity of the reservoir development). These guidelines will greatly help the decision-making process when selecting the best development options.
We develop a novel ensemble model-maturation method that is based on the Randomized Maximum Likelihood (RML) technique and adjoint-based computation of objective-function gradients. The new approach is especially relevant for rich data sets with time-lapse information content. The inversion method that solves the model-maturation problem takes advantage of the adjoint-based computation of objective-function gradients for a very large number of model parameters at the cost of one forward and one backward (adjoint) simulation. The inversion algorithm calibrates model parameters to arbitrary types of production data, including time-lapse reservoir-pressure traces, by use of a weighted and regularized objective function. We have also developed a new and effective multigrid preconditioning protocol for accelerated iterative linear solution of the adjoint-simulation step for models with multiple levels of local grid refinement. The protocol is based on a geometric multigrid (GMG) preconditioning technique. Within the model-maturation workflow, a machine-learning technique is applied to establish links between the mesh-based inversion results (e.g., permeability-multiplier fields) and geologic modeling parameters inside a static model (e.g., object dimensions). Our workflow integrates the learnings from inversion back into the static model and thereby ensures the geologic consistency of the static model while improving the quality of the ensuing dynamic model in terms of honoring production and time-lapse data and reducing forecast uncertainty. This use of machine learning to post-process the model-maturation outcome effectively converts the conventional continuous-parameter history-matching result into a discrete tomographic inversion result constrained to geological rules encoded in training images.
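For reference, RML typically calibrates each ensemble member j by minimizing a weighted, regularized objective of the following general form; this is the standard textbook formulation, and the specific weighting used in this work may differ:

```latex
J_j(\mathbf{m}) =
  \tfrac{1}{2}\,\big(g(\mathbf{m}) - \mathbf{d}_{\mathrm{obs},j}\big)^{\top}
    \mathbf{C}_D^{-1}\,\big(g(\mathbf{m}) - \mathbf{d}_{\mathrm{obs},j}\big)
  + \tfrac{1}{2}\,\big(\mathbf{m} - \mathbf{m}_{\mathrm{pr},j}\big)^{\top}
    \mathbf{C}_M^{-1}\,\big(\mathbf{m} - \mathbf{m}_{\mathrm{pr},j}\big)
```

where $g(\mathbf{m})$ is the forward simulator, $\mathbf{d}_{\mathrm{obs},j} \sim N(\mathbf{d}_{\mathrm{obs}}, \mathbf{C}_D)$ and $\mathbf{m}_{\mathrm{pr},j} \sim N(\mathbf{m}_{\mathrm{pr}}, \mathbf{C}_M)$ are perturbed data and prior samples, and the adjoint method supplies $\nabla_{\mathbf{m}} J_j$ for all model parameters at the cost of one forward plus one adjoint simulation.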
We demonstrate the practical utilization of the adjoint-based model-maturation method on a large time-lapse reservoir-pressure data set using an ensemble of full-field models from a reservoir case study. The model-maturation technique effectively identifies the permeability modification zones that are consistent with alternative geological interpretations and proposes updates to the static model. Upon these updates, the model not only agrees better with the time-lapse reservoir-pressure data but also better honors the tubing-head pressure as well as production logging data. We also provide computational performance indicators that demonstrate the accelerated convergence characteristics of the new iterative linear solver for adjoint equations.
Over 20 percent of major oil and gas (O&G) incidents reported within the European Union (EU) since 1984 have been associated with corrosion under insulation (CUI).
Using Bayesian networks (BNs), Oceaneering has developed a decision support system for effective CUI risk management. The Bayesian model can be incorporated into existing risk-based assessment (RBA) systems. A key feature of the model is the ability to predict corrosion hotspots while quantifying uncertainties. The model uses probabilities based on objective data as well as subject matter expertise, which makes these analytical techniques accessible to a wide range of users.
With a case study, we illustrate how BNs can be used to assess the risk of a fuel gas line on a live asset in the North Sea. The most likely estimated remaining life (ERL) is forecast in the range of 13 to 24 years, with a worst case of 6.7 years and a best case of 40 years. By comparison, the customer's CUI tracker reported an ERL of 9.7 years. BNs increase flexibility for scheduling inspection intervals, enabling more targeted inspection planning. This is a significant advancement over current RBA methodologies.
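The inference pattern behind such a decision-support BN can be sketched with a toy two-parent example; the network structure and all conditional probability tables below are hypothetical illustrations, not Oceaneering's model.

```python
# Coating condition (C) and moisture ingress (M) are parents of a CUI
# hotspot node (H); all probabilities are illustrative only
p_C = {"good": 0.7, "poor": 0.3}
p_M = {"dry": 0.6, "wet": 0.4}
p_H = {("good", "dry"): 0.01, ("good", "wet"): 0.10,
       ("poor", "dry"): 0.05, ("poor", "wet"): 0.60}  # P(hotspot | C, M)

# Marginal (prior) probability of a hotspot: sum over both parents
p_hotspot = sum(p_C[c] * p_M[m] * p_H[(c, m)] for c in p_C for m in p_M)

# Diagnostic update: given an observed hotspot, infer P(coating = poor)
joint_poor = sum(p_C["poor"] * p_M[m] * p_H[("poor", m)] for m in p_M)
p_poor_given_hotspot = joint_poor / p_hotspot  # ≈ 0.72
```

The same enumeration runs in both directions: predicting hotspot probability from condition data, and updating beliefs about hidden causes once an inspection finding is observed, which is what makes BNs useful for targeting inspections.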
Bayes' theorem: a two-part theorem relating conditional probability to unconditional (prior) probability, used in value-of-information problems but also important to acknowledge when estimating probabilities for geologically dependent prospects.
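A worked numeric example of the two parts, using hypothetical success probabilities for two geologically dependent prospects:

```python
# Part 1 (law of total probability): P(B) = P(B|A) P(A) + P(B|~A) P(~A)
# Part 2 (Bayes' rule):              P(A|B) = P(B|A) P(A) / P(B)
# A = prospect 1 succeeds, B = prospect 2 succeeds; the prospects share
# charge/seal risk, so B is more likely if A works (numbers hypothetical)
p_A = 0.30
p_B_given_A = 0.50
p_B_given_notA = 0.10

p_B = p_B_given_A * p_A + p_B_given_notA * (1.0 - p_A)  # 0.22
p_A_given_B = p_B_given_A * p_A / p_B                   # ≈ 0.68
```

Note the asymmetry the theorem captures: prospect 2's unconditional chance is 22 percent, but learning that it succeeded raises prospect 1's chance from 30 to roughly 68 percent, which is exactly the dependence that must not be ignored when risking a portfolio of related prospects.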
Several frac design models have been developed for unconventional and tight gas reservoirs to predict hydraulic fracture geometries. However, despite the considerable uncertainty associated with estimating the stimulated reservoir volume, these models do not quantify the probabilistic effects on these estimates. Recent field tests show gross overestimation of modeled fracture heights (sometimes by a factor of 2 to 3), in many cases due to the presence of laminations. In this work, we introduce a Bayesian methodology for probabilistic hydraulic fracture design that quantifies the uncertainty in predicting the fracture geometry in the presence of geologic factors such as laminations.
To apply Bayesian inference to deterministic frac design models, the design parameters are linked through Bayes' theorem: the prior distribution is the distribution of the frac design parameters before any treating pressure data have been observed, which can be a uniform distribution over practical ranges of the parameters; the likelihood function is the conditional probability of the observed treating pressures given the frac design parameters; and the posterior distribution is the distribution of the frac design parameters after all the available treating pressure data have been taken into account. In a Bayesian model, the goal is to compute the posterior distribution once data have been observed, which in the developed methodology are the observed treating pressure data. The effects of laminations are accounted for in a geomechanical model able to represent the weak interfaces and their shearing during hydraulic fracturing. The resulting estimated fracture height is used as a prior in the Bayesian hydraulic fracture model with proppant transport.
The estimated treating pressures are a function of several important variables, such as the pressure-dependent leak-off, the perforation friction coefficient, the pipe friction coefficient, the reservoir pressure, the stresses, and other parameters related to the geologic variability. In the Bayesian methodology, these parameters are either treated as random variables or their distributions are derived from geologic models, instead of assuming deterministic values in the frac design model. A Markov chain of the history matching parameters is then constructed using Markov chain Monte Carlo with the Metropolis algorithm. Based on the comparison of the simulated treating pressures with the observed treating pressures, the Markov chain converges toward the most probable parameters, which are then used to quantify the uncertainty in predicting the fracture geometry. Application to the impact of laminations on fracture height shows that the Bayesian approach provides a more realistic range of fracture heights, much lower than those derived using current models and in line with measured field data.
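The Metropolis step at the heart of this procedure can be sketched as follows, with a one-parameter toy forward model standing in for the full frac simulator; the pressure model, noise level, and parameter range below are hypothetical.

```python
import math
import random

random.seed(2)

# Hypothetical forward model: treating pressure as a function of a single
# frac design parameter (a perforation-friction-style coefficient)
def treating_pressure(cf, rate=50.0):
    return 5000.0 + cf * rate ** 2

# Synthetic "observed" treating pressures generated at cf = 0.9
observed = [treating_pressure(0.9) + random.gauss(0.0, 25.0) for _ in range(30)]

def log_like(cf, sigma=25.0):
    # Gaussian mismatch between simulated and observed treating pressures
    pred = treating_pressure(cf)
    return sum(-0.5 * ((p - pred) / sigma) ** 2 for p in observed)

# Metropolis: accept a proposal with probability min(1, posterior ratio)
cf, ll = 0.5, log_like(0.5)
chain = []
for _ in range(5000):
    prop = cf + random.gauss(0.0, 0.05)
    if 0.0 < prop < 2.0:              # uniform prior over a practical range
        ll_prop = log_like(prop)
        if math.log(random.random()) < ll_prop - ll:
            cf, ll = prop, ll_prop
    chain.append(cf)

cf_mean = sum(chain[1000:]) / len(chain[1000:])
```

Proposals that improve the pressure match are always accepted and worse ones occasionally, so the chain drifts toward, and then samples around, the most probable parameter values; the spread of the chain is the uncertainty estimate.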
The proposed Bayesian methodology provides a means to generate probabilistic estimates of complex geologic factors, such as laminations, that have a major impact on the fracture geometry. The resulting fracture geometry accounts for all the geologic uncertainties. Interdisciplinary components of Bayesian inference, reservoir engineering, and hydraulic fracturing are integrated with geomechanics to address the important issue of fracture height growth.
Recent studies have indicated that Huff-n-Puff (HNP) gas injection has the potential to recover an additional 30-70% oil from multi-fractured horizontal wells in shale reservoirs. Nonetheless, this technique is very sensitive to production constraints and is impacted by uncertainty related to measurement quality (particularly frequency and resolution), and lack of constraining data. In this paper, a Bayesian workflow is provided to optimize the HNP process under uncertainty using a Duvernay shale well as an example.
Compositional simulations are conducted that incorporate a tuned PVT model and a set of measured cyclic injection/compaction pressure-sensitive permeability data. Markov chain Monte Carlo (MCMC) is used to estimate the posterior distributions of the uncertain model variables by matching the primary production data. The MCMC process is accelerated by employing an accurate proxy model (kriging) that is updated using a highly adaptive sampling algorithm. Gaussian processes are then used to optimize the HNP control variables by maximizing the lower confidence interval (μ − σ) of cumulative oil production (after 10 years) across a fixed ensemble of uncertain variables sampled from the posterior distributions.
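The optimization-under-uncertainty step can be sketched as follows, replacing both the compositional simulator and the Gaussian-process search with toy stand-ins; the response function, ensemble statistics, and grid search below are hypothetical.

```python
import random

random.seed(3)

# Toy stand-in for the simulator: 10-year cumulative oil as a function of the
# injection time per cycle (months) and one uncertain parameter (hypothetical)
def cum_oil(t_inj, k_frac):
    return 100.0 * k_frac * t_inj * (3.0 - t_inj)

# Fixed ensemble of the uncertain parameter, as if drawn from MCMC posteriors
ensemble = [random.gauss(1.0, 0.2) for _ in range(200)]

def lcb(t_inj):
    # Lower confidence interval mu - sigma across the ensemble
    vals = [cum_oil(t_inj, k) for k in ensemble]
    mu = sum(vals) / len(vals)
    var = sum((v - mu) ** 2 for v in vals) / len(vals)
    return mu - var ** 0.5

# Exhaustive grid search over the control variable; a Gaussian-process
# optimizer would instead propose evaluation points adaptively
candidates = [0.5 + 0.1 * i for i in range(26)]  # 0.5 to 3.0 months
best_t = max(candidates, key=lcb)
```

Maximizing μ − σ rather than the mean alone builds risk aversion into the choice of controls: a setting with a slightly lower average but a much tighter spread across the posterior ensemble can win.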
The uncertain variable space includes several parameters representing reservoir and fracture properties. The posterior distributions for some parameters, such as primary fracture permeability and effective half-length, are narrower, while wider distributions are obtained for other parameters. The results indicate that the impact of uncertain variables on HNP performance is nonlinear. Some uncertain variables (such as molecular diffusion) that do not show strong sensitivity during the primary production strongly impact gas injection HNP performance. The results of optimization under uncertainty confirm that the lower confidence interval of cumulative oil production can be maximized by an injection time of around 1.5 months, a production time of around 2.5 months, and very short soaking times. In addition, a maximum injection rate and a flowing bottomhole pressure around the bubble point are required to ensure maximum incremental recovery. Analysis of the objective function surface highlights some other sets of production constraints with competitive results. Finally, the optimal set of production constraints, in combination with an ensemble of uncertain variables, results in a median HNP cumulative oil production that is 30% greater than that for primary production.
The application of a Bayesian framework for optimizing the HNP performance in a real shale reservoir is introduced for the first time. This work provides practical guidelines for the efficient application of advanced machine learning techniques for optimization under uncertainty, resulting in better decision making.
Da Silva Moreira, Paulo Henrique (LRAP / Universidade Federal do Rio de Janeiro) | Gomes da Silveira, Thaís Márcia (LRAP / Universidade Federal do Rio de Janeiro) | Drexler, Santiago (LRAP / Universidade Federal do Rio de Janeiro) | Couto, Paulo (LRAP / Universidade Federal do Rio de Janeiro)
Reliable prediction of reservoir performance is required for the cost-effective implementation of oil recovery systems; it is necessary to simulate the fluid flow processes in the reservoir and to measure the rock and fluid properties that determine reservoir behaviour. A good prediction, however, relies on accurate values of the reservoir's physical properties. Carbonate rocks in the Brazilian Pre-salt are known for their heterogeneity. Characterizing their physical properties represents a great challenge, and the combination of experimental and computational techniques leads to a more comprehensive understanding of reservoir behaviour.
In the present work, the relative permeability curves of a carbonate core sample with respect to oil and water are calculated by matching the data obtained in a lab-scale unsteady-state core flood experiment carried out at the high-pressure, high-temperature conditions characteristic of Brazilian Pre-salt reservoirs. Corey-type equations were used to model the relative permeability because of their simplicity and the small number of parameters involved. The Markov chain Monte Carlo (MCMC) method was used as the optimization tool, taking the fluid production and pressure drop measurements collected during the core flood experiment as input data. An analysis of the sensitivity coefficients was carried out in order to deal with possible linear dependences among the terms to be estimated. The Markov chain was generated and its convergence verified. The posterior distributions of the constant terms in the Corey equations were calculated, and their mean values were used to calculate the relative permeability curves for the oil and water phases. The range of water saturation in which the relative permeability curves describe the core conditions after the breakthrough time, given the occurrence of capillary end-effects, was calculated. The history match of fluid production and pressure drop showed a good fit between the pressure curves. A gap was observed between the production curves because the experimental measurements accounted for the cumulative volume of oil and water, while the theoretical curve accounted for the oil volume only.
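The Corey-type model referred to above has the standard form sketched below; the endpoint saturations, endpoint permeabilities, and exponents shown are hypothetical placeholders for the constants the MCMC procedure would estimate.

```python
# Corey-type relative permeability (standard form; parameter values hypothetical)
def corey_krw(Sw, Swi=0.2, Sor=0.25, krw_max=0.3, nw=2.5):
    # Normalized (movable) water saturation, clamped to [0, 1]
    Swn = (Sw - Swi) / (1.0 - Swi - Sor)
    Swn = min(max(Swn, 0.0), 1.0)
    return krw_max * Swn ** nw

def corey_kro(Sw, Swi=0.2, Sor=0.25, kro_max=0.8, no=2.0):
    Swn = (Sw - Swi) / (1.0 - Swi - Sor)
    Swn = min(max(Swn, 0.0), 1.0)
    return kro_max * (1.0 - Swn) ** no

# Endpoints behave as expected
krw_at_Swi = corey_krw(0.20)   # 0.0 at irreducible water saturation
kro_at_Swi = corey_kro(0.20)   # 0.8 (oil endpoint)
krw_at_Sor = corey_krw(0.75)   # ≈ 0.3 (water endpoint at residual oil)
```

With only five constants per pair of curves, the posterior sampling problem stays low-dimensional, which is why the abstract notes the model's simplicity as a motivation.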
Hong, Aojie (National IOR Centre of Norway and University of Stavanger) | Bratvold, Reidar B. (National IOR Centre of Norway and University of Stavanger) | Lake, Larry W. (University of Texas at Austin) | Ruiz Maraggi, Leopoldo M. (University of Texas at Austin)
Decline-curve analysis (DCA) for unconventional plays requires a model that can capture the characteristics of different flow regimes. Thus, various models have been proposed. Traditionally, in probabilistic DCA, an analyst chooses a single model that is believed to best fit the data. However, several models might fit the data almost equally well, and the one that best fits the data might not best represent the flow characteristics. Therefore, uncertainty remains regarding which is the "best" model. This work aims to integrate model uncertainty in probabilistic DCA for unconventional plays. Instead of identifying a single "best" model, we propose to regard any model as potentially good, with goodness characterized by a probability. The probability of a model being good is interpreted as a measure of the relative truthfulness of this model compared with the other models. This probability is subsequently used to weight the model forecast. Bayes' law is used to assess the model probabilities for given data. Multiple samples of the model-parameter values are obtained using maximum likelihood estimation (MLE) with Monte Carlo simulation. Thus, the unique probabilistic forecasts of each individual model are aggregated into a single probabilistic forecast, which incorporates model uncertainty along with the intrinsic uncertainty (i.e., the measurement errors) in the given data. We demonstrate and conclude that using the proposed approach can mitigate over/underestimates resulting from using a single decline-curve model for forecasting. The proposed approach performs well in propagating model uncertainty to uncertainty in production forecasting; that is, we determine a forecast that represents uncertainty given multiple possible models conditioned to the data.
The field data show that no one model is the most probable to be good for all wells. The novelties of this work are that probability is used to describe the goodness of a model; a Bayesian approach is used to integrate the model uncertainty in probabilistic DCA; the approach is applied to actual field data to identify the most probable model given the data; and we demonstrate the value of using this approach to consider multiple models in probabilistic DCA for unconventional plays. Although numerical techniques for forecasting hydrocarbon production have developed rapidly over the past decades, DCA remains an industry-accepted method and is used extensively in the oil and gas industry. Decline-curve models are computationally attractive because only production data, which can be easily acquired, are required for determining a few parameter values through history matching.
A challenge in oil-reservoir studies is evaluating the ability of geomechanical, statistical, and geophysical methods to predict discrete geological features. This problem arises frequently with fracture corridors, which are discrete, tabular subvertical fracture clusters. Fracture corridors can be inferred from well data such as horizontal-borehole-image logs. Unfortunately, well data, and especially borehole image logs, are sparse, and predictive methods are needed to fill in the gap between wells. One way to evaluate such methods is to compare predicted and inferred fracture corridors statistically, using chi-squared and contingency tables.
In this article, we propose a modified contingency table to validate fracture-corridor-prediction techniques. We introduce two important modifications to capture special aspects of fracture corridors. The first is the incorporation of exclusion zones where no fracture corridors can exist; the second is taking into consideration the fuzzy nature of fracture-corridor indicators from wells, such as circulation losses. An indicator is fuzzy when it has more than one possible interpretation. The reliability of an indicator is the probability that it correctly suggests a fracture corridor. Indicators with a reliability of unity are “hard” indicators; “soft” and “fuzzy” indicators are those with reliability less than unity.
A structural grid is overlaid on the reservoir top in an oil field. Each cell of the grid is examined for the presence and reliability of inferred fracture corridors and exclusion zones, and for the confidence level of predicted fracture corridors. The results are summarized in a contingency table and are used to calculate the chi-squared statistic and the conditional probability of an actual fracture corridor given a predicted fracture corridor.
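The contingency-table computation can be sketched for a single unmodified 2×2 case; the cell counts below are hypothetical, exclusion-zone cells are assumed already removed, and all indicators are treated as hard (reliability of unity), so the fuzzy weighting the article introduces is not shown.

```python
# 2x2 contingency table of predicted vs. inferred fracture corridors per cell
observed = {("pred", "fc"): 30, ("pred", "no"): 20,
            ("nopred", "fc"): 10, ("nopred", "no"): 140}

n = sum(observed.values())
row = {r: sum(v for (rr, c), v in observed.items() if rr == r)
       for r in ("pred", "nopred")}
col = {c: sum(v for (r, cc), v in observed.items() if cc == c)
       for c in ("fc", "no")}

# Pearson chi-squared statistic against the independence hypothesis
chi2 = 0.0
for (r, c), o in observed.items():
    e = row[r] * col[c] / n  # expected count under independence
    chi2 += (o - e) ** 2 / e

# P(actual fracture corridor | predicted fracture corridor)
p_fc_given_pred = observed[("pred", "fc")] / row["pred"]  # 0.6
```

A large chi-squared value rejects independence between predictions and inferred corridors, while the conditional probability quantifies how much a prediction should raise confidence that a corridor is actually present in a cell.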
Three actual case studies are included to demonstrate how single or joint predictive methods can be statistically evaluated and how conditional probabilities are calculated using the modified contingency tables. The first example tests seismic faults as indicators of fracture corridors. The other examples test fracture corridors predicted by a simple geomechanical method.