The 9th International Conference on Gas Hydrates featured discussions of key advancements in flow assurance, including risk management and anti-agglomerants as applicable strategies in transient operations. By using a Bayesian network, a probabilistic graphical model, in their shale operations, companies might eliminate some of the hurdles they face in these areas, an expert said at the 2016 Unconventional Resources Technology Conference.
A growing sector of water midstream companies in the Permian Basin is looking to take advantage of a business opportunity born of rising produced-water volumes. Billion-dollar valuations for these companies could become a reality in the near future. The shale revolution is putting the US in a unique position: a bigger exporter of crude and petroleum liquids than Saudi Arabia. How many export terminals does the US need? The new bottleneck for Texas' crude production is not onshore but offshore.
We present a framework that enables estimation of low-dimensional reservoir properties directly from seismic data, without requiring the solution of a high-dimensional seismic inverse problem. Our workflow is based on the Bayesian evidential analysis approach and exploits learning the direct relation between seismic data and reservoir properties to efficiently estimate those properties, as well as to generate samples from the posterior distribution. We discuss methods of learning highly informative summary statistics from seismic data, which help minimize the computational cost of the approach. We demonstrate the efficacy of our approach by estimating the posterior distribution of reservoir net-to-gross for a sub-resolution thin-sand synthetic reservoir.
Presentation Date: Tuesday, October 16, 2018
Start Time: 8:30 AM
Location: 209A (Anaheim Convention Center)
Presentation Type: Oral
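The core idea of the abstract above, estimating a low-dimensional property from seismic data without solving the full high-dimensional inverse problem, can be sketched with a toy rejection-sampling (ABC-style) scheme in summary-statistic space. Everything here is assumed for illustration: the scalar forward model, the noise level, and the tolerance are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model: a learned summary statistic of the seismic
# data as a noisy nonlinear function of net-to-gross (NTG).
def forward(ntg, rng):
    return 0.8 * ntg ** 1.5 + 0.05 * rng.standard_normal(ntg.shape)

# 1. Draw prior samples of the reservoir property and simulate data.
ntg_prior = rng.uniform(0.0, 1.0, 5000)
d_prior = forward(ntg_prior, rng)

# 2. "Observed" summary statistic, here synthesized from an assumed NTG of 0.6.
d_obs = forward(np.array([0.6]), rng)[0]

# 3. ABC-style posterior: keep prior samples whose simulated statistic
#    falls within a tolerance of the observation; no high-dimensional
#    inverse problem is ever solved.
eps = 0.02
posterior = ntg_prior[np.abs(d_prior - d_obs) < eps]

print(posterior.mean(), posterior.std())
```

The accepted samples approximate the posterior over NTG; in the paper's workflow this step is replaced by learning the direct statistical relation between data and property, which serves the same purpose more efficiently.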
Modern prestack inversions provide a one-step inversion to elastic properties and, simultaneously, a probabilistic classification into litho-fluid classes. As with any inversion, prior information is required to constrain the low-frequency content of the elastic property estimates. In addition, the prior supplies spatial information on the expected distribution of the litho-fluid classes. In this paper, it is shown that the amount of prior information used in a particular case affects the attainable resolution, both in the sense of vertical resolution in time and in the sense of discrimination between different litho-fluid classes. In the exploration setting, where little or no prior information is available, the inversion delivers a result that should be considered a relative inversion for elastic properties. At the other extreme, such as the production setting where many wells are available, a highly detailed inversion may supply results where the constraints derived from prior information produce sub-seismic resolution for the formations of interest. These conclusions are supported by the use of wedge models and blind well tests using data from the Odin field in the Norwegian North Sea.
Presentation Date: Thursday, October 18, 2018
Start Time: 8:30 AM
Location: 206A (Anaheim Convention Center)
Presentation Type: Oral
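The abstract's point that the amount of prior information controls the attainable resolution can be illustrated with a one-parameter conjugate-Gaussian sketch; the forward operator, noise level, prior widths, and observed value below are assumed, not taken from the paper.

```python
import numpy as np

# Hypothetical scalar illustration: estimating one elastic property
# (e.g., acoustic impedance contrast) from one noisy observation d = G*m + e.
G, sigma_e = 1.0, 0.5          # forward operator and data-noise std (assumed)
d = 2.3                        # observed value (made-up number)

def gaussian_posterior(mu0, sigma0):
    """Conjugate update: prior N(mu0, sigma0^2), likelihood N(G*m, sigma_e^2)."""
    var = 1.0 / (1.0 / sigma0**2 + G**2 / sigma_e**2)
    mu = var * (mu0 / sigma0**2 + G * d / sigma_e**2)
    return mu, np.sqrt(var)

# Weak prior (exploration setting): posterior driven almost entirely by data.
mu_w, sd_w = gaussian_posterior(mu0=0.0, sigma0=10.0)
# Strong prior (production setting, many wells): posterior sharpened by prior.
mu_s, sd_s = gaussian_posterior(mu0=2.0, sigma0=0.2)

print(mu_w, sd_w, mu_s, sd_s)
```

With the weak prior the estimate tracks the data (a relative result); with the strong prior the posterior uncertainty falls below the data-only level, the scalar analogue of prior-driven sub-seismic resolution.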
The development of an economically efficient drilling program in shale-gas plays is a challenging task, requiring a large number of wells; even with many wells, the average well production and the variation of well performance (economics) remain highly uncertain. The ability to delineate a shale play with the fewest number of wells and to focus drilling in the most productive areas is an important driver of commercial success. Here, we present a new methodology for improving the economic returns of shale-gas plays. The methodology proposed in this study informs the development of drilling policies for shale-gas opportunities by use of a probabilistic model that accounts for the uncertainty in the chance of success (COS) and its spatial dependency. The model developed in this study shows how a prior view of the COS across a shale-gas play is updated as more wells are drilled.
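A minimal sketch of the kind of updating the abstract describes, ignoring the spatial dependency for clarity, is a beta-binomial model in which a prior view of the chance of success (COS) for one area is revised as well outcomes arrive; the prior parameters and the outcome sequence below are made up.

```python
# Hypothetical beta-binomial update of chance of success (COS) for one
# area of a play as wells are drilled. Spatial dependency between areas,
# central to the paper's model, is omitted here for clarity.
alpha, beta = 2.0, 2.0          # prior view: COS centered at 0.5, weakly held
outcomes = [1, 1, 0, 1, 1, 1]   # 1 = commercial well, 0 = uneconomic (assumed)

for w in outcomes:
    alpha += w
    beta += 1 - w

cos_mean = alpha / (alpha + beta)   # posterior mean COS for the area
print(cos_mean)
```

After five successes in six wells, the posterior mean COS rises from 0.5 to 0.7, which is the basic mechanism by which early drilling results would refocus a drilling policy.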
Balaji, Karthik (University of North Dakota) | Rabiei, Minou (University of North Dakota) | Canbaz, Hakan (Schlumberger) | Agharzeyva, Zinyat (Texas A & M University) | Tek, Suleyman (University of the Incarnate Word) | Bulut, Ummugul (Texas A&M University-San Antonio) | Temizel, Cenk (Aera Energy LLC)
Data-driven methods serve as a robust tool for turning data into knowledge. Historical data generally has not been used effectively in analyzing processes because of the lack of well-organized data, even though there is huge potential in turning terabytes of data into knowledge. With the advances in and implementation of data-driven methods, data-driven models have become more widely used in the analysis, predictive modeling, control, and optimization of many processes. Yet the industry overall remains skeptical of data-driven methods because they are data-based solutions rather than traditional physics-based solutions, even though physics and geology are sometimes part of the methodology. This study comprehensively evaluates the status of data-driven methods in the oil and gas industry along with recent advances and applications.
This study outlines the development of these methods, from their fundamentals and theory to their applications, along with their historical acceptance and use in the industry. Major challenges in implementing these methods are reviewed for different areas of application. The major advantages and drawbacks of data-driven methods relative to traditional methods are discussed. Limitations and areas of opportunity are outlined. The latest advances and applications, with the associated results and value of the methods, are provided.
It is observed that the successful use of data-driven methods requires a strong understanding of petroleum engineering processes and physics-based conventional methods, together with a good grasp of traditional statistics, data mining, artificial intelligence, and machine learning. Data-driven methods start with a data-based approach to identifying issues and their solutions. Even though data-driven methods provide good solutions for some challenging and complex processes that are new and/or hard to define with existing conventional methods, there is still skepticism in the industry about their use. This is strongly tied to the delicate and sensitive nature of the processes and to how the data are used. Organization and refinement of the data turn out to be important components of an efficient data-driven process.
Data-driven methods offer great advantages over conventional methods under certain conditions. However, the image of these methods remains fuzzy for many industry professionals. This study serves to bridge the gap between the successful implementation, wider use, and acceptance of data-driven methods on the one hand, and the fuzziness of and reservations about these methods in the industry on the other. Significant components of these methods, along with clarification of definitions, theory, applications, and concerns, are also outlined in this study.
In this paper we describe a data classification method that is designed to estimate pore fluid type. In practice, this method will use Vp, Vs, density, porosity and pore fluid type information from well logs to estimate pore fluid type in an earth model where Vp, Vs and density have already been estimated by seismic inversion. Our method is based on the probabilistic nearest neighbor (k-nn) method, but also incorporates feature selection and nonlinear regression. To improve upon the standard k-nn method, we incorporate a regression approach based on the Bayesian MARS model to estimate porosity in the earth model. Additionally, we utilize a feature selection method that automatically generates various data combinations and evaluates their usefulness for classification. We test our method on synthetic data generated using multiple realizations of an inclusion-based rock physics model where pores are either gas, oil or brine saturated. In this synthetic test, our method reduces the misclassification rate by ~25% relative to the standard probabilistic nearest neighbor approach.
Presentation Date: Monday, September 25, 2017
Start Time: 3:30 PM
Presentation Type: ORAL
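The probabilistic k-nn step described above can be sketched on synthetic clusters standing in for (Vp, Vs, density, porosity); the cluster centers and noise level are assumed, and the feature-selection and Bayesian MARS regression components of the paper's method are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: 4 features (Vp, Vs, density, porosity) drawn
# from one Gaussian cluster per fluid class (0 = brine, 1 = oil, 2 = gas).
# The centers and spread are made-up numbers, not rock-physics outputs.
centers = np.array([[3.0, 1.5, 2.3, 0.15],
                    [2.8, 1.4, 2.2, 0.20],
                    [2.4, 1.3, 2.0, 0.25]])
y_train = np.repeat([0, 1, 2], 100)
X_train = centers[y_train] + 0.05 * rng.standard_normal((300, 4))

def knn_proba(x, k=15):
    """Probabilistic k-nn: class probabilities from the k nearest neighbors."""
    dist = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(dist)[:k]]
    return np.bincount(votes, minlength=3) / k

# Query a point near the gas cluster.
p = knn_proba(centers[2])
print(p)
```

In the paper's workflow, the query features would come from a seismically inverted earth model, with porosity supplied by the Bayesian MARS regression rather than assumed known.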
The goal of seismic reservoir characterization is to estimate rock and fluid properties from seismic data. The solution of the seismic inverse problem is based on forward models that describe the physical relationships between lithology and fluid parameters and their seismic response. If the forward models indicate nonlinear physical relations between the variables, advanced inversion methods, such as stochastic optimization algorithms, should be adopted to predict the reservoir properties. However, these methods are generally time consuming. In this paper, we propose an inversion approach that combines linearized AVO modeling and linearized rock physics relations. The rock physics model is based on Nur's critical porosity model and Gassmann's equations, and its linearization is based on a first-order Taylor series approximation. The linearization is computed with respect to the solid elastic moduli, solid density, fluid bulk modulus, fluid density, and porosity. Indeed, the rock physics model is almost linear in these properties, and the linearization provides a good approximation. The combined forward model is then used in a Bayesian inversion workflow for the estimation of the above-mentioned rock and fluid properties from pre-stack seismic data and well logs. In the Bayesian inversion, we assume the prior model and the error term to be Gaussian distributed, allowing an analytical solution of the Bayesian inverse problem. The reservoir properties of interest, i.e., porosity, mineral volumes, and fluid saturations, are then computed from the inversion results. The proposed method was first validated on a synthetic dataset and then applied to a field dataset, with satisfactory inversion results.
Presentation Date: Tuesday, September 26, 2017
Start Time: 8:55 AM
Presentation Type: ORAL
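The analytical Gaussian solution the abstract relies on can be sketched for a generic linear forward model d = Gm + e; the operator, covariances, and "true" model below are made-up numbers, not the paper's linearized AVO and rock-physics operator.

```python
import numpy as np

# Minimal Gaussian linear-inverse sketch: with a linear forward operator
# d = G m + e, Gaussian prior m ~ N(mu0, C_m) and Gaussian noise
# e ~ N(0, C_e), the posterior is Gaussian with a closed-form solution.
G = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.7, 0.3]])          # assumed 3x2 forward operator
C_m = np.eye(2)                     # prior covariance of model parameters
C_e = 0.1 * np.eye(3)               # data-noise covariance
mu0 = np.zeros(2)                   # prior mean
m_true = np.array([0.8, -0.3])      # assumed "true" model
d = G @ m_true                      # noise-free synthetic data

# Posterior covariance and mean (standard conjugate-Gaussian formulas).
C_post = np.linalg.inv(np.linalg.inv(C_m) + G.T @ np.linalg.inv(C_e) @ G)
mu_post = C_post @ (np.linalg.inv(C_m) @ mu0 + G.T @ np.linalg.inv(C_e) @ d)

print(mu_post)
```

The posterior mean is pulled slightly toward the prior mean relative to the true model, and C_post quantifies the remaining uncertainty; this closed form is what makes the linearized workflow fast compared with stochastic optimization.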
ABSTRACT: This paper investigates the effect that small sample sizes have on the uncertainties associated with intact rock strength parameter estimation, using various combinations of strength data (namely, uniaxial and triaxial compressive; uniaxial compressive and indirect tensile; and uniaxial and triaxial compressive together with indirect tensile). The analysis applies recently developed Bayesian nonlinear regression models to small sample sizes (n = 9, 12 and 15) drawn from an extensive published data set to obtain Hoek-Brown strength parameters. The results show how the uncertainty in parameter estimation, particularly the correlation between the estimated m and σc, is affected by the data combination. The paper further discusses how using a combination of three data types can significantly reduce the uncertainties associated with strength parameter estimation, and concludes with suggestions for allocating resources to different types of laboratory testing.
The introduction of geotechnical limit states design (LSD) standards such as the Canadian Highway Bridge Design Code (CHBDC) (CSA, 2006) and EN-1997 (CEN, 2004) requires the basic philosophy of rock engineering design to change from that of traditional safety factors to the more rational reliability-based design (RBD) (e.g. Lemaire, 2006). A key element of RBD is robust statistical characterization of design parameters (e.g. MacGregor, 1976; CEN, 2002; Gulvanessian et al, 2002, Lemaire, 2006).
In rock engineering, the strength of intact rock is a fundamental design parameter and is often presented in terms of the well-known empirical Hoek-Brown (HB) criterion. However, for the most part, the uncertainties associated with both the fitted criterion and the strength predictions based on it have historically been ignored. One aspect of fitting the HB criterion to strength data that has received much attention is that of combining compressive and tensile strength data (e.g. Zuo et al., 2008; Sari, 2010; Cai, 2010; Read and Richards, 2011; Sari, 2012; Langford and Diederichs, 2015). Only one recent report addresses both of these issues (Langford & Diederichs, 2015).
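As a rough illustration of fitting the HB criterion for intact rock (s = 1, a = 0.5) to a small sample (n = 9), the sketch below uses a plain least-squares grid search rather than the Bayesian nonlinear regression of the paper; the "true" parameters, confinement range, and noise level are all assumed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hoek-Brown criterion for intact rock (s = 1, a = 0.5):
# sigma1 = sigma3 + sigma_c * sqrt(m * sigma3 / sigma_c + 1)
def hoek_brown(sigma3, m, sigma_c):
    return sigma3 + sigma_c * np.sqrt(m * sigma3 / sigma_c + 1.0)

# Small synthetic sample (n = 9) with assumed "true" parameters, mimicking
# the paper's small-sample setting (stresses in MPa).
m_true, sc_true = 10.0, 100.0
sigma3 = np.linspace(0.0, 20.0, 9)
sigma1 = hoek_brown(sigma3, m_true, sc_true) + 5.0 * rng.standard_normal(9)

# Least-squares fit over a coarse parameter grid, a simple stand-in for
# Bayesian nonlinear regression (which would return full posteriors and
# the m-sigma_c correlation instead of a point estimate).
ms = np.linspace(5.0, 15.0, 101)
scs = np.linspace(80.0, 120.0, 81)
sse = [(mi, sci, np.sum((sigma1 - hoek_brown(sigma3, mi, sci)) ** 2))
       for mi in ms for sci in scs]
m_fit, sc_fit, _ = min(sse, key=lambda t: t[2])

print(m_fit, sc_fit)
```

With only nine noisy points, the fitted m and σc trade off against each other along a near-equivalent-SSE valley, which is exactly the correlation structure the Bayesian treatment quantifies.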
Moridis, Nefeli (Texas A&M University) | Soltanpour, Yasser (Texas A&M University) | Medina-Cetina, Zenon (Texas A&M University) | Lee, W. John (Texas A&M University) | Blasingame, Thomas A. (Texas A&M University)
We began this research by asking, "Can we use Bayes' theorem to supplement available decline models and improve the accuracy of our estimates of ultimate recovery?" This study focuses on the Eagle Ford Shale and, in particular, on oil wells in the Greater Core Eagle Ford Area. Our goal was to develop a method based on a probabilistic approach to identify, characterize, and better model well production based on standard decline models.
To attempt to answer this question, we first obtained data for 68 wells in the Greater Core of the Eagle Ford Shale, Texas. As part of this process, we eliminated wells that did not have enough production data, wells that did not show a production decline, and wells that had too much noise in the data, leaving eight wells. We then performed decline curve analysis (DCA) using the Modified Hyperbolic (MH) and Power-Law Exponential (PLE) models (the two most common DCA models), implemented in user-guided analysis software. The Bayesian paradigm was then implemented to calibrate the same two models on the same set of wells.
The primary focus of the research was the implementation of the Bayesian paradigm on the eight-well data set. We first performed a "best fit" parameter estimation using least squares optimization, which provided an optimized set of parameters for the two decline models. This was followed by using the Markov Chain Monte Carlo (MCMC) integration of the Bayesian posterior function for each model, which provided a full probabilistic description of its parameters. This allowed for the simulation of a number of likely realizations of the decline curves, from which first order statistics were computed to provide a confidence metric on the calibration of each model as applied to the production data of each well.
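The workflow described above, a least-squares starting point followed by MCMC sampling of the Bayesian posterior over decline parameters, can be sketched with a random-walk Metropolis sampler on a plain Arps hyperbolic model (a simplified stand-in for the MH and PLE models); the rates, noise level, flat priors, and step sizes are all assumed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Arps hyperbolic decline, q(t) = qi / (1 + b*Di*t)^(1/b), as a simplified
# stand-in for the Modified Hyperbolic and Power-Law Exponential models.
def decline(t, qi, Di, b):
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

# Synthetic monthly rates with assumed "true" parameters and lognormal noise.
t = np.arange(1.0, 37.0)
q_obs = decline(t, qi=1000.0, Di=0.15, b=1.2) * np.exp(0.05 * rng.standard_normal(t.size))

def log_post(theta):
    qi, Di, b = theta
    if qi <= 0 or Di <= 0 or not 0 < b < 2:        # flat priors with bounds
        return -np.inf
    resid = np.log(q_obs) - np.log(decline(t, qi, Di, b))
    return -0.5 * np.sum(resid ** 2) / 0.05 ** 2   # Gaussian log-likelihood

# Random-walk Metropolis sampling of the posterior, started from a rough
# (least-squares-like) initial guess.
theta = np.array([800.0, 0.1, 1.0])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + np.array([10.0, 0.005, 0.02]) * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])      # discard burn-in

print(samples.mean(axis=0))             # posterior means of (qi, Di, b)
```

Each retained sample defines one likely decline curve; extrapolating the ensemble forward yields a distribution of ultimate recovery rather than a single deterministic estimate, which is the confidence metric the study computes per well.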
Results showed variation in the calibration of the MH and PLE models. The forward models (MH and PLE) overestimated ultimate recovery in the majority of the wells compared with the Bayesian calibrations, indicating that the Bayesian paradigm captured the trend of the data more accurately and thus determined more accurate estimates of reserves.
In industry, the same decline models are used for unconventional wells as for conventional wells, even though the same models may not apply. Based on these results, we believe that Bayesian inference yields more accurate estimates of ultimate recovery for unconventional reservoirs than deterministic DCA methods do. Moreover, it provides a measure of confidence in the prediction of production as a function of varying data and varying decline models.