The 9th International Conference on Gas Hydrates featured discussions of key advances in flow assurance, including risk management and anti-agglomerants as applicable strategies in transient operations. By using a Bayesian network, a probabilistic graphical model, in their shale operations, companies could overcome some of the hurdles they face in these areas, an expert said at the 2016 Unconventional Resources Technology Conference.
A growing sector of water midstream companies in the Permian Basin is looking to take advantage of a business opportunity born of rising produced water volumes; billion-dollar valuations for these companies could become a reality in the near future. Meanwhile, the shale revolution is pushing the US into a unique position: a bigger exporter of crude and petroleum liquids than Saudi Arabia. How Many Export Terminals Does the US Need? The new bottleneck for Texas' crude production is not onshore; it is offshore.
Nandi Formentin, Helena (Durham University and University of Campinas) | Vernon, Ian (Durham University) | Avansi, Guilherme Daniel (University of Campinas) | Caiado, Camila (Durham University) | Maschio, Célio (University of Campinas) | Goldstein, Michael (Durham University) | Schiozer, Denis José (University of Campinas)
Reservoir simulation models incorporate physical laws and reservoir characteristics; they represent our understanding of subsurface structures based on the available information. Emulators are statistical representations of simulation models, offering evaluations of a large number of reservoir scenarios fast enough to enable full uncertainty analysis. Bayesian History Matching (BHM) aims to find the range of reservoir scenarios that are consistent with the historical data, in order to provide a comprehensive evaluation of reservoir performance and consistent, unbiased predictions that incorporate realistic levels of uncertainty, as required for full asset management. We describe a systematic approach that combines reservoir simulation and emulation techniques within a coherent Bayesian framework for uncertainty quantification.
Our systematic procedure is an alternative, more rigorous tool for reservoir studies dealing with probabilistic uncertainty reduction. It comprises the design of sets of simulation scenarios that facilitate the construction of emulators capable of accurately mimicking the simulator with known levels of uncertainty. Emulators then accelerate the steps that require large numbers of evaluations across the input space for statistical validity. Via implausibility measures, we compare emulated outputs with historical data while incorporating the major process uncertainties. We then iteratively identify regions of input parameter space unlikely to provide acceptable matches, performing more runs and constructing more accurate emulators at each wave, an approach that benefits from several efficiency improvements. We provide a workflow covering each stage of this procedure.
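As a concrete illustration of the implausibility cutoff described above, here is a minimal Python sketch. It assumes a generic emulator object exposing a `predict` method that returns a mean and variance for an input point; the function names and the three-sigma cutoff are illustrative choices, not details taken from the paper.

```python
import numpy as np

def implausibility(x, z_obs, emulator, var_obs, var_disc):
    """Implausibility I(x) = |z - E[f(x)]| / sqrt(total variance).

    z_obs    : observed historical value for one output
    emulator : object with .predict(x) -> (mean, variance)
    var_obs  : observational (measurement) error variance
    var_disc : model (structural) discrepancy variance
    """
    mu, var_emul = emulator.predict(x)
    return np.abs(z_obs - mu) / np.sqrt(var_emul + var_obs + var_disc)

def non_implausible(candidates, z_obs, emulator, var_obs, var_disc, cutoff=3.0):
    """Keep points for the next wave; I(x) > 3 is a common cutoff,
    motivated by Pukelsheim's three-sigma rule."""
    return [x for x in candidates
            if implausibility(x, z_obs, emulator, var_obs, var_disc) <= cutoff]
```

At each wave, the surviving candidates seed new simulation runs, from which a more accurate emulator is built over the reduced region.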
The procedure was applied to reduce uncertainty in a complex reservoir case study with 25 injection and production wells. The case study contains 26 uncertain attributes representing petrophysical, rock-fluid, and fluid properties. We selected phases of evaluation based on specific events during reservoir management, improving the efficiency of simulation resource use. We identified and addressed data patterns untracked in previous studies: simulator targets,
We advance the applicability of Bayesian History Matching for reservoir studies with four deliverables: (a) a general workflow for systematic BHM; (b) the use of phases to progressively evaluate the historical data; (c) the integration of two-class emulators in the BHM formulation; and (d) a demonstration of internal discrepancy as a source of error in the reservoir model.
We present a framework that enables estimation of low-dimensional reservoir properties directly from seismic data, without requiring the solution of a high-dimensional seismic inverse problem. Our workflow is based on the Bayesian evidential analysis approach and exploits learning the direct relation between seismic data and reservoir properties, both to efficiently estimate those properties and to generate samples from the posterior distribution. We discuss methods for learning highly informative summary statistics from seismic data, which help minimize the computational cost of the approach. We demonstrate the efficacy of our approach by estimating the posterior distribution of reservoir net-to-gross for a sub-resolution, thin-sand synthetic reservoir.
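The following schematic Python sketch illustrates the direct-relation idea under strong simplifying assumptions of our own: PCA summary statistics, a linear regression, and a Gaussian posterior approximation are stand-ins, and the prior and forward-model lines are placeholders rather than the paper's actual models.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Prior Monte Carlo: reservoir property h (e.g., net-to-gross) and the
# corresponding forward-modeled seismic d (one row per prior sample).
# The "forward model" line is a placeholder for a real seismic simulator.
h_prior = rng.uniform(0.1, 0.9, size=500)
d_prior = h_prior[:, None] * rng.standard_normal((500, 200))

# Summary statistics: project the seismic onto a few principal components.
pca = PCA(n_components=5).fit(d_prior)
s_prior = pca.transform(d_prior)

# Learn the direct relation from summaries to the property.
reg = LinearRegression().fit(s_prior, h_prior)
resid_var = np.var(h_prior - reg.predict(s_prior))

# For observed seismic d_obs, a Gaussian-approximated posterior is centered
# on the regression prediction, with spread given by the residual variance.
d_obs = d_prior[0]                      # placeholder for field data
s_obs = pca.transform(d_obs[None, :])
post_mean = reg.predict(s_obs)[0]
post_samples = rng.normal(post_mean, np.sqrt(resid_var), size=1000)
```

The key point is that no high-dimensional inverse problem is ever solved: the regression learned on prior samples maps observed data straight to a posterior over the low-dimensional property.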
Presentation Date: Tuesday, October 16, 2018
Start Time: 8:30 AM
Location: 209A (Anaheim Convention Center)
Presentation Type: Oral
Modern prestack inversions provide a one-step inversion to elastic properties and, simultaneously, a probabilistic classification into litho-fluid classes. As with any inversion, prior information is required to constrain the low-frequency content of the elastic property estimates; it also supplies spatial information on the expected distribution of the litho-fluid classes. In this paper, it is shown that the amount of prior information used in a particular case affects the attainable resolution, both in terms of vertical resolution in time and in terms of discrimination between different litho-fluid classes. In the exploration setting, where little or no prior information is available, the inversion delivers a result that should be considered a relative inversion for elastic properties. At the other extreme, such as a production setting where many wells are available, a highly detailed inversion may supply results in which the constraints derived from prior information produce sub-seismic resolution for formations of interest. These conclusions are supported by wedge models and blind well tests using data from the Odin field in the Norwegian North Sea.
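A toy conjugate-Gaussian update, under our own simplifying assumptions rather than anything from the paper, illustrates the trade-off: a diffuse prior yields a data-driven (relative) estimate, while a tight well-derived prior dominates the posterior and contributes detail beyond what the data alone resolve.

```python
import numpy as np

def posterior(mu_prior, var_prior, d_obs, var_noise):
    """Conjugate Gaussian update for a single elastic parameter."""
    w = var_prior / (var_prior + var_noise)       # weight placed on the data
    mu_post = mu_prior + w * (d_obs - mu_prior)
    var_post = 1.0 / (1.0 / var_prior + 1.0 / var_noise)
    return mu_post, var_post

# Exploration setting: diffuse prior -> estimate follows the data (relative result).
print(posterior(mu_prior=0.0, var_prior=1e3, d_obs=2.0, var_noise=1.0))
# Production setting: tight well-derived prior -> prior dominates, adding
# detail below what the seismic data alone can resolve.
print(posterior(mu_prior=1.5, var_prior=0.01, d_obs=2.0, var_noise=1.0))
```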
Presentation Date: Thursday, October 18, 2018
Start Time: 8:30 AM
Location: 206A (Anaheim Convention Center)
Presentation Type: Oral
The development of an economically efficient drilling program in shale-gas plays is a challenging task, requiring a large number of wells; even with many wells, the average well production and the variation of well performance (economics) remain highly uncertain. The ability to delineate a shale play with the fewest number of wells and to focus drilling in the most productive areas is an important driver of commercial success. Here, we present a new methodology for improving the economic returns of shale-gas plays. The methodology proposed in this study informs the development of drilling policies for shale-gas opportunities by use of a probabilistic model that accounts for the uncertainty in the chance of success (COS) and its spatial dependency. The model developed in this study shows how a prior view of the COS across a shale-gas play is updated as more wells are drilled.
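A minimal sketch of the Bayesian updating idea, assuming a simple Beta-Binomial model for a single area of the play; the spatial dependency between areas, which is central to the paper's model, is omitted here, and the prior pseudo-counts and well outcomes are invented for illustration.

```python
# Beta-Binomial updating of the chance of success (COS) for one area.
a, b = 2.0, 2.0                     # Beta prior pseudo-counts (assumed)
outcomes = [1, 0, 1, 1, 0, 1]       # 1 = commercial well, 0 = uneconomic

for k in outcomes:
    a += k                          # count successes
    b += 1 - k                      # count failures
    print(f"posterior mean COS = {a / (a + b):.2f}")
```

In a spatial version, the outcome of each well would also shift the COS of neighboring areas, for example through correlated priors, so that early wells inform where to drill next.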
Balaji, Karthik (University of North Dakota) | Rabiei, Minou (University of North Dakota) | Canbaz, Hakan (Schlumberger) | Agharzeyva, Zinyat (Texas A&M University) | Tek, Suleyman (University of the Incarnate Word) | Bulut, Ummugul (Texas A&M University-San Antonio) | Temizel, Cenk (Aera Energy LLC)
Data-driven methods serve as a robust tool for turning data into knowledge. Historically, data have generally not been used effectively in analyzing processes because of a lack of well-organized data, yet there is huge potential in turning terabytes of data into knowledge. With the advances in and implementation of data-driven methods, data-driven models have become more widely used in the analysis, predictive modeling, control, and optimization of several processes. Yet the industry overall remains skeptical of data-driven methods, since they are data-based solutions rather than traditional physics-based solutions, even though physics and geology are sometimes part of the methodology. This study comprehensively evaluates the status of data-driven methods in the oil and gas industry, along with recent advances and applications.
This study outlines the development of these methods from the fundamentals, covering their theory and applications along with their historical acceptance and use in the industry. Major challenges in implementing these methods are reviewed for different areas of application. The major advantages and drawbacks of data-driven methods relative to traditional methods are discussed, and limitations and areas of opportunity are outlined. The latest advances and applications, together with the associated results and value of the methods, are provided.
It is observed that successful use of data-driven methods requires a strong understanding of petroleum engineering processes and physics-based conventional methods, together with a good grasp of traditional statistics, data mining, artificial intelligence, and machine learning. Data-driven methods start with a data-based approach to identify issues and their solutions. Even though data-driven methods provide strong solutions to some challenging and complex processes that are new and/or hard to define with existing conventional methods, there is still skepticism in the industry about their use. This is strongly tied to the delicate and sensitive nature of the processes and of the data involved. Organization and refinement of the data turn out to be important components of an efficient data-driven process.
Data-driven methods offer great advantages over conventional methods under certain conditions. However, for most industry professionals the picture of these methods is still fuzzy. This study serves to bridge the gap between the successful implementation, wider use, and acceptance of data-driven methods on the one hand, and the industry's reservations about and incomplete understanding of these methods on the other. Significant components of these methods, along with clarifying definitions, theory, applications, and concerns, are also outlined.
In this paper we describe a data classification method designed to estimate pore fluid type. In practice, the method uses Vp, Vs, density, porosity, and pore-fluid-type information from well logs to estimate pore fluid type in an earth model in which Vp, Vs, and density have already been estimated by seismic inversion. Our method is based on the probabilistic nearest neighbor (k-nn) method but also incorporates feature selection and nonlinear regression. To improve upon the standard k-nn method, we incorporate a regression approach based on the Bayesian MARS model to estimate porosity in the earth model. Additionally, we utilize a feature selection method that automatically generates various data combinations and evaluates their usefulness for classification. We test our method on synthetic data generated using multiple realizations of an inclusion-based rock physics model in which pores are gas-, oil-, or brine-saturated. In this synthetic test, our method reduces the misclassification rate by ~25% relative to the standard probabilistic nearest neighbor approach.
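A minimal sketch of such a classification workflow, using placeholder data; scikit-learn's vote-fraction probabilities and an exhaustive subset search stand in for the paper's probabilistic k-nn model and feature selection method, and the Bayesian MARS porosity regression is replaced by treating porosity as a given feature.

```python
import numpy as np
from itertools import combinations
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: features are Vp, Vs, density, and porosity; labels are
# pore-fluid classes (0 = gas, 1 = oil, 2 = brine).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 4))
y_train = rng.integers(0, 3, size=300)
X_test = rng.normal(size=(50, 4))

# Exhaustive feature selection: score every feature subset by
# cross-validated accuracy and keep the best-performing combination.
best_subset, best_score = None, -np.inf
for r in range(1, 5):
    for subset in combinations(range(4), r):
        knn = KNeighborsClassifier(n_neighbors=7)
        score = cross_val_score(knn, X_train[:, list(subset)], y_train, cv=5).mean()
        if score > best_score:
            best_subset, best_score = list(subset), score

# Class probabilities from neighbor vote fractions (a simplification of a
# fully probabilistic k-nn model).
knn = KNeighborsClassifier(n_neighbors=7).fit(X_train[:, best_subset], y_train)
fluid_probs = knn.predict_proba(X_test[:, best_subset])
```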
Presentation Date: Monday, September 25, 2017
Start Time: 3:30 PM
Presentation Type: ORAL
The goal of seismic reservoir characterization is to estimate rock and fluid properties from seismic data. The solution of the seismic inverse problem is based on forward models that describe the physical relationships between lithology and fluid parameters and their seismic response. If the forward models involve nonlinear relations between the variables, advanced inversion methods, such as stochastic optimization algorithms, should be adopted to predict the reservoir properties; however, these methods are generally time consuming. In this paper, we propose an inversion approach that combines linearized AVO modeling with linearized rock physics relations. The rock physics model is based on Nur's critical porosity model and Gassmann's equations, and its linearization is based on a first-order Taylor series approximation. The linearization is computed with respect to the solid elastic moduli, solid density, fluid bulk modulus, fluid density, and porosity. Indeed, the rock physics model is almost linear in these properties, so the linearization provides a good approximation. The combined forward model is then used in a Bayesian inversion workflow for the estimation of the above-mentioned rock and fluid properties from pre-stack seismic data and well logs. In the Bayesian inversion, we assume the prior model and the error term to be Gaussian, which yields the analytical solution of the Bayesian inverse problem. The reservoir properties of interest, i.e., porosity, mineral volumes, and fluid saturations, are then computed from the inversion results. The proposed method was first validated on a synthetic dataset and then applied to a field dataset, with satisfactory inversion results.
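For reference, the standard linear-Gaussian result that such an analytical Bayesian solution draws on, written in our own notation rather than the paper's (d: seismic data, G: combined linearized AVO/rock-physics operator, m: the property vector):

```latex
% Linear forward model with Gaussian prior and Gaussian errors:
d = Gm + e, \qquad m \sim \mathcal{N}(\mu_m, \Sigma_m), \qquad e \sim \mathcal{N}(0, \Sigma_e)
% Posterior mean and covariance (closed form, no iterative optimization):
\mu_{m \mid d} = \mu_m + \Sigma_m G^{\top} \left( G \Sigma_m G^{\top} + \Sigma_e \right)^{-1} (d - G \mu_m)
\Sigma_{m \mid d} = \Sigma_m - \Sigma_m G^{\top} \left( G \Sigma_m G^{\top} + \Sigma_e \right)^{-1} G \Sigma_m
```

The closed form is what makes the linearized approach fast relative to stochastic optimization: the posterior follows from a single matrix solve rather than repeated forward simulations.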
Presentation Date: Tuesday, September 26, 2017
Start Time: 8:55 AM
Presentation Type: ORAL
ABSTRACT: This paper investigates the effect that small sample sizes have on the uncertainties associated with intact rock strength parameter estimation, using various combinations of strength data (namely, uniaxial and triaxial compressive; uniaxial compressive and indirect tensile; and uniaxial and triaxial compressive together with indirect tensile). The analysis applies recently developed Bayesian nonlinear regression models to small sample sizes (n = 9, 12, and 15) drawn from an extensive published data set to obtain Hoek-Brown (HB) strength parameters. The results show how the uncertainty in parameter estimation, particularly the correlation between the estimated m and σc, is affected by the data combination. The paper further discusses how using a combination of three data types can significantly reduce the uncertainties associated with strength parameter estimation, and concludes with suggestions for allocating resources to different types of laboratory testing.
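A minimal sketch of Bayesian nonlinear regression for the Hoek-Brown criterion via a random-walk Metropolis sampler; the strength data are invented placeholders, flat priors on positive parameters are assumed, and only compressive data are used, whereas the paper's models, data set, and treatment of tensile data are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder intact-rock strength data (MPa): confining stress sigma_3 and
# peak axial stress sigma_1 (zeros in s3 correspond to uniaxial tests).
s3 = np.array([0.0, 0.0, 0.0, 5.0, 10.0, 20.0, 30.0, 40.0, 50.0])
s1 = np.array([98., 105., 110., 135., 160., 205., 240., 272., 300.])

def hb(s3, sci, mi):
    """Hoek-Brown criterion for intact rock (s = 1)."""
    return s3 + sci * np.sqrt(mi * s3 / sci + 1.0)

def log_post(theta):
    """Gaussian likelihood, flat priors on the positive parameters."""
    sci, mi, sd = theta
    if sci <= 0 or mi <= 0 or sd <= 0:
        return -np.inf
    resid = s1 - hb(s3, sci, mi)
    return -0.5 * np.sum((resid / sd) ** 2) - len(s1) * np.log(sd)

theta = np.array([100.0, 10.0, 10.0])     # initial (sigma_ci, m_i, noise sd)
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[2.0, 0.5, 0.5])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])            # discard burn-in

# Posterior correlation between the estimated m_i and sigma_ci,
# the quantity discussed in the abstract above.
print(np.corrcoef(samples[:, 1], samples[:, 0])[0, 1])
```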
The introduction of geotechnical limit states design (LSD) standards such as the Canadian Highway Bridge Design Code (CHBDC) (CSA, 2006) and EN-1997 (CEN, 2004) requires the basic philosophy of rock engineering design to change from traditional safety factors to the more rational reliability-based design (RBD) (e.g., Lemaire, 2006). A key element of RBD is robust statistical characterization of design parameters (e.g., MacGregor, 1976; CEN, 2002; Gulvanessian et al., 2002; Lemaire, 2006).
In rock engineering, the strength of intact rock is a fundamental design parameter and is often presented in terms of the well-known empirical HB criterion. However, the uncertainties associated with both the fitted criterion and the strength predictions based on it have, for the most part, historically been ignored. One aspect of fitting the HB criterion to strength data that has received much attention is the combination of compressive and tensile strength data (e.g., Zuo et al., 2008; Sari, 2010; Cai, 2010; Read and Richards, 2011; Sari, 2012; Langford and Diederichs, 2015). One recent report addresses both of these issues, the fitting uncertainty and the data combination (Langford & Diederichs, 2015).