There is a growing trend toward the use of statistical modeling and data analytics for oil and gas (and related subsurface) applications. This session will focus on state-of-the-art applications of converting data into information, particularly the actionable kind that leads to better decisions.
To mark the opening of the SPE Annual Caspian Technical Conference in Azerbaijan, industry luminaries will address delegates and journalists, providing insight and vision on the oil and gas industry in the Caspian region. In recent years the upstream O&G industry has sought ways to improve its overall performance through operational execution excellence and by reviewing and improving workflows to eliminate waste, add value, and improve financial performance. Many companies are already learning from the Past, gathering and analyzing data and adopting digital technologies such as the cloud and artificial intelligence in the Present, in order to enable step changes in the Future. When considering Past, Present, and Future, nothing is omitted: mature field development is already part of the Present and, as data analytics evolve, part of the Future too.
More recently, there is an increasing trend toward purely data-driven models based on big data and machine learning techniques. The goal is to exploit the multitude of data sources to extract intelligence, improve operational efficiency, and optimize reservoir performance. In this workshop, we explore the opportunities presented by combining data-driven models (data scientists) with physics-based models (domain experts) to provide a balanced and informed view of reservoir insights and to create predictive and generalizable models while enforcing known physical constraints and addressing gaps in the data.
Martini, Brigette (Corescan Inc.) | Bellian, Jerome (Whiting Petroleum Corporation) | Katz, David (Encana Corporation) | Fonteneau, Lionel (Corescan Pty Ltd) | Carey, Ronell (Corescan Pty Ltd) | Guisinger, Mary (Whiting Petroleum Corporation) | Nordeng, Stephan H. (University of North Dakota)
Hyperspectral core imaging studies of the Bakken-Three Forks formations over the past four years have revealed non-destructive, high-resolution, spatially relevant insight into mineralogy, both primary and diagenetically altered, that can be applied to reservoir characterization. While 'big' data such as co-acquired hyperspectral imagery, digital photography, and laser profiles can be challenging to analyze, synthesize, scale, visualize, and store, their value in providing mineralogical information, structural variables, and visual context at scales that lie between (and ultimately link) nano- and reservoir-scale measurements of the Bakken-Three Forks system is unique.
Simultaneous, co-acquired hyperspectral core imaging data (at 500 μm spatial resolution), digital color photography (at 50 μm spatial resolution), and laser profiles (at 20 μm spatial and 7 μm vertical resolution) were acquired over 24 wells for a total of 2,870 ft of core, seven wells of which targeted the Bakken-Three Forks formations. These Bakken-Three Forks data (~5.5 TB) represent roughly 175,000,000 pixels of spatially referenced mineralogical data. Measurements were performed at a mobile Corescan HCI-3 laboratory based in Denver, CO, while spectral and spatial analysis of the data was completed using proprietary in-house spectral software offsite in Perth, WA, Australia. Synthesis of the spectral-based mineral maps and laser-based structural data with ancillary data (including Qemscan, XRD, and various downhole geophysical surveys) was completed in several software and modelling platforms.
The resulting spatial context of this hyperspectral imaging-based mineralogy and its assemblages is particularly compelling, both in small-scale micro-distribution and in borehole-scale mineralogical distributions related to both primary lithology and secondary alteration. These studies also present some of the first successful measurement and derivation of lithology from hyperspectral data. Relationships between hyperspectral-derived mineralogy and oil concentrations are presented, as are separately derived structural variables. The relationship of hyperspectral-based mineralogy to micro-scale reservoir characteristics (including those derived from Qemscan) was studied, as were relationships to larger-scale downhole geophysical data (resulting in compelling correlations between resistivity and hyperspectral-mineralogy variables). Finally, basic net-to-gross calculations were completed using the hyperspectral imaging data, thereby extending the use of such data from geological characterization through to resource estimation.
The high-fidelity mineralogical maps afforded by hyperspectral core imaging have not only provided new geological insight into the Bakken-Three Forks formations but also informed improved well completion designs in those formations, as well as a framework for applying the technology to other important unconventional reservoir formations in exploration and development. The semi-automated nature of the technology also makes it possible to consistently and accurately log mineralogy from multiple wells and fields globally, allowing for advanced comparative analysis.
Proper characterization of heterogeneous rock properties and hydraulic fracture parameters is essential for optimizing well spacing and for reliable estimation of EUR in unconventional reservoirs. High-resolution characterization of matrix properties and complex fracture parameters requires efficient history matching of well production and pressure response. We propose a novel reservoir model parameterization method that reduces the number of unknowns, regularizes the ill-posed problem, and enhances the efficiency of history matching of unconventional reservoirs.
Our proposed method makes a low-rank approximation of the spatial distribution of reservoir properties, taking into account the differing model resolution of the matrix and the hydraulic fractures. Typically, hydraulic fractures are represented at much higher resolution, through local grid refinements, than the matrix properties. In our approach, the spatial property distribution of both matrix and fractures is represented by a few parameters via a linear transformation with multiresolution basis functions. The parameters in the transform domain are then updated during model calibration, substantially reducing the number of unknowns. The multiresolution basis functions are constructed by eigen-decomposition of the Laplacian of a grid adaptively coarsened to match the data resolution. Maintaining high property resolution in the area of interest through adaptive resolution control, while keeping the original grid structure, improves the quality of the history match and reduces simulation runtime.
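The core idea of parameterizing a property field with Laplacian eigenvectors can be sketched as follows. This is a minimal NumPy illustration on a uniform 2D grid with a synthetic smooth "permeability" field; the adaptive coarsening and local grid refinement of the actual method are omitted, and all sizes are illustrative assumptions:

```python
import numpy as np

def grid_laplacian(nx, ny):
    """Graph Laplacian of an nx-by-ny grid with 4-neighbour connectivity."""
    n = nx * ny
    L = np.zeros((n, n))
    for i in range(nx):
        for j in range(ny):
            k = i * ny + j
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < nx and 0 <= jj < ny:
                    L[k, k] += 1
                    L[k, ii * ny + jj] -= 1
    return L

# Basis = eigenvectors with the smallest eigenvalues: the smooth,
# large-scale modes come first (analogous to low-frequency DCT modes).
nx = ny = 16
eigvals, eigvecs = np.linalg.eigh(grid_laplacian(nx, ny))
k = 20                      # number of retained basis functions (unknowns)
Phi = eigvecs[:, :k]        # n_cells x k linear transform

# Represent a smooth synthetic property field with only k coefficients.
x, y = np.meshgrid(np.linspace(0, 1, ny), np.linspace(0, 1, nx))
field = np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.1).ravel()
coeffs = Phi.T @ field      # transform-domain parameters (updated during HM)
recon = Phi @ coeffs        # map back to the full grid

rel_err = np.linalg.norm(recon - field) / np.linalg.norm(field)
print(f"{field.size} cells represented by {k} parameters, rel. error {rel_err:.3f}")
```

History matching then perturbs the `k` coefficients instead of the 256 cell values, which is the source of the reduction in unknowns described above.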
We demonstrate the power and efficacy of our method using synthetic and field examples. First, we illustrate the effectiveness of the proposed multiresolution parameterization by comparing it with a traditional method. For the field application, an unconventional tight oil reservoir model with a multistage hydraulically fractured well is calibrated using bottomhole pressure and water cut history data. The hydraulic fractures, as well as the stimulated reservoir volume (SRV) near the well, are represented with higher grid resolution. In addition to matrix and fracture properties, the extents of the SRV and the hydraulic fractures are also adjusted through history matching using a multiobjective genetic algorithm. The calibrated ensemble of models is used to obtain bounds on the production forecast.
Our proposed method is designed to calibrate reservoir and fracture properties at higher resolution in regions that have better data resolution and higher sensitivity to the well performance data, for example the SRV region and the hydraulic fractures. This leads to a fast and efficient history matching workflow and enables optimal development and completion plans to be made in a reasonable time frame.
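The selection step of a multiobjective genetic algorithm of the kind mentioned above rests on Pareto dominance between misfit vectors. A minimal sketch, using hypothetical two-objective misfit values (BHP misfit, water-cut misfit) rather than anything from the actual study:

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def non_dominated(objs):
    """Indices of the non-dominated (Pareto-front) members of a population."""
    return [i for i, oi in enumerate(objs)
            if not any(dominates(oj, oi)
                       for j, oj in enumerate(objs) if j != i)]

# Each candidate model has two misfits: (BHP misfit, water-cut misfit).
population = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (5.0, 5.0)]
print(non_dominated(population))   # -> [0, 1, 2]
```

The surviving non-dominated models form the calibrated ensemble from which forecast bounds can be drawn.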
Several frac design models have been developed for unconventional and tight gas reservoirs to predict hydraulic fracture geometries. However, despite the considerable uncertainty associated with estimating the stimulated reservoir volume, these models do not quantify or demonstrate the probabilistic effects on these estimates. Recent field tests show a gross overestimation of modeled fracture heights (sometimes by a factor of 2-3), in many cases due to the presence of laminations. In this work we introduce a Bayesian methodology for probabilistic hydraulic fracture design that quantifies the uncertainty in predicting the fracture geometry in the presence of geologic factors such as laminations.
To apply Bayesian inference to the deterministic frac design models, the design parameters are linked to Bayes' theorem by taking the prior to be the distribution of frac design parameters before any treating pressure data have been observed; this can be a uniform distribution over practical ranges of the parameters. The likelihood function is the conditional probability of the observed treating pressures given the frac design parameters. The posterior distribution is the distribution of the frac design parameters after all available treating pressure data have been taken into account. In a Bayesian model the goal is to compute the posterior distribution once data have been observed, which in the developed methodology are the observed treating pressure data. The effects of laminations are accounted for in a geomechanical model capable of representing weak interfaces and their shearing during hydraulic fracturing. The resulting estimated fracture height is used as a prior in the Bayesian hydraulic fracture model with proppant transport.
The estimated treating pressures are a function of several important variables, such as pressure-dependent leak-off, the perforation friction coefficient, the pipe friction coefficient, reservoir pressure, stresses, and other parameters related to geologic variability. In the Bayesian methodology, these parameters are either treated as random variables or have their distributions derived from geologic models, rather than being assigned deterministic values in the frac design model. A Markov chain of the history matching parameters is then constructed using Markov chain Monte Carlo with the Metropolis algorithm. By comparing simulated treating pressures with the observed treating pressures, the Markov chain converges toward the most probable parameters, which are then used to quantify the uncertainty in predicting the fracture geometry. Application to the impact of laminations on fracture height shows that the Bayesian approach provides a more realistic range of fracture heights, much lower than those derived using current models and in line with measured field data.
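The Metropolis sampling loop described above can be sketched on a toy problem. The forward model, noise level, and prior bounds below are illustrative assumptions standing in for the actual frac simulator and treating pressure data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: treating pressure vs. time for a single "leak-off"
# parameter theta (purely illustrative, not a real frac design model).
t = np.linspace(0.0, 1.0, 50)
def forward(theta):
    return 5000.0 + 800.0 * np.exp(-theta * t)

theta_true = 2.5
sigma = 20.0                                 # observation noise, psi
p_obs = forward(theta_true) + rng.normal(0.0, sigma, t.size)

lo, hi = 0.1, 10.0                           # uniform prior bounds

def log_post(theta):
    """Log posterior = log(uniform prior) + Gaussian log-likelihood."""
    if not (lo <= theta <= hi):
        return -np.inf                       # zero prior probability
    r = p_obs - forward(theta)
    return -0.5 * np.sum(r * r) / sigma**2

# Metropolis random-walk sampler.
n_steps, step = 5000, 0.2
theta, lp = 5.0, log_post(5.0)
chain = []
for _ in range(n_steps):
    prop = theta + step * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

burned = np.array(chain[1000:])              # discard burn-in
print(f"posterior mean {burned.mean():.2f}, 95% interval "
      f"[{np.percentile(burned, 2.5):.2f}, {np.percentile(burned, 97.5):.2f}]")
```

The posterior samples concentrate around the parameter value consistent with the observed pressures, and their spread is the uncertainty carried forward into the fracture geometry prediction.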
The proposed Bayesian methodology provides a means to generate probabilistic estimates of complex geologic factors, such as laminations, that have a major impact on the fracture geometry. The resulting fracture geometry accounts for all the geologic uncertainties. Interdisciplinary components of Bayesian inference, reservoir engineering, and hydraulic fracturing are integrated with geomechanics to address the important issue of fracture height growth.
The generic term "intelligent well" is used to signify that some degree of direct monitoring and/or remote control equipment is installed within the well completion. The first computer-assisted operations optimized gas lifted production by remote control near the tree and assisted with pumping well monitoring and control. Permanent downhole pressure and temperature gauges are commonly run as part of the completion system and combined with data transmission infrastructure. With the development, successful implementation, and improving reliability of a variety of permanently installed sensors, it was perceived that the potential to exercise direct control of inflow to the wellbore would provide significant and increased economic benefit.
The petro-elastic model (PEM) is an integral component in the closed-loop calibration of integrated four-dimensional (4D) solutions incorporating time-lapse seismic, elastic and petrophysical rock property modeling, and reservoir simulation. Calibration of the reservoir simulation model is needed so that it is consistent not only with production history but also with the contemporaneous subsurface description characterized by time-lapse seismic. The PEM requires dry rock properties in its description, which are typically derived from mechanical rock tests. In the absence of those tests, a small-data challenge is posed: not all necessary data are available, yet the value of reconciling seismic attributes with simulated production remains. A seismic-inversion-constrained n-dimensional metaheuristic optimization technique is employed directly on three-dimensional (3D) geocellular arrays to determine elastic and density properties for the PEM embedded in the commercial reservoir simulator.
Ill-posed dry elastic and density property models are considered in a field case where a seismic inversion, and a petrophysical property model constrained by it, exist. An n-dimensional design optimization technique is implemented to determine the optimal solution of a multidimensional pseudo-objective function composed of multidimensional design variables. This study investigates a modified particle swarm optimization (PSO) method combined with an exterior penalty function (EPF) under varied constraints. The proposed technique uses n-dimensional design optimization to solve the pseudo-objective function comprising the PSO and EPF given limited availability of constraints. Heavily penalized and reduced-order penalized metaheuristic optimization processes, where the design variables and optimal solution are derived from 3D arrays, are examined so that constraint applicability is quantified. While the process is examined specifically for the PEM, it can be applied to other data-limited modeling techniques.
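The combination of PSO with an exterior penalty function can be sketched on a toy problem. The quadratic objective, the single inequality constraint, and all swarm parameters below are illustrative assumptions, not the study's actual PEM calibration:

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Quadratic stand-in for the multidimensional calibration objective."""
    return np.sum(x ** 2, axis=-1)

def constraint(x):
    """g(x) <= 0 required; here: the first design variable must be >= 1."""
    return 1.0 - x[..., 0]

def pseudo_objective(x, r_p):
    """Exterior penalty function: feasible points incur no penalty."""
    g = np.maximum(constraint(x), 0.0)
    return objective(x) + r_p * g ** 2

# Minimal particle swarm over the penalized pseudo-objective.
n_particles, n_dim, n_iter, r_p = 30, 3, 200, 1e4
w, c1, c2 = 0.7, 1.5, 1.5                 # inertia, cognitive, social weights
pos = rng.uniform(-5.0, 5.0, (n_particles, n_dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = pseudo_objective(pos, r_p)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = pseudo_objective(pos, r_p)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("optimum:", np.round(gbest, 3))     # expected near [1, 0, 0]
```

Increasing the penalty weight `r_p` pushes the converged solution harder against the constraint boundary, which is the trade-off the "heavily penalized" versus "reduced-order penalized" comparison above examines.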
Integration of time-lapse seismic data into the dynamic reservoir model is an efficient way to calibrate reservoir parameter updates. The choice of metric used to measure the misfit between observed data and the simulated model has a considerable effect on the history matching process, and hence on the optimal ensemble of models obtained. History matching using 4D seismic and production data simultaneously remains a challenge due to the differing nature of the two data types (time series versus map- or volume-based).
Conventionally, the misfit is formulated as a least-squares measure, which is widely used for production data matching. Distance-based objective functions designed for 4D image comparison have been explored in recent years and have proven reliable. This study explores the history matching process by introducing a merged objective function combining the production and 4D seismic data. The proposed approach makes these two types of data (well and seismic) comparable within a single objective function to be optimised, thereby avoiding the question of weighting. An adaptive evolutionary optimisation algorithm is used for the history matching loop. Local and global reservoir parameters are perturbed in this process, including porosity, permeability, net-to-gross, and fault transmissibility.
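One simple way to make the two data types comparable in a single objective is to normalize each least-squares misfit by its own noise level, so both terms become dimensionless and of order one when a model fits to within the noise. The sketch below uses toy data and is an assumed construction, not the paper's exact formulation:

```python
import numpy as np

def normalized_misfit(obs, sim, noise):
    """Dimensionless least-squares misfit: ~1 when errors match the noise."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.mean(((obs - sim) / noise) ** 2)

def merged_objective(prod_obs, prod_sim, prod_noise,
                     seis_obs, seis_sim, seis_noise):
    """Production time series + 4D seismic map in one objective; each term
    is scaled by its own noise level, so no explicit weights are needed."""
    f_prod = normalized_misfit(prod_obs, prod_sim, prod_noise)
    f_seis = normalized_misfit(seis_obs.ravel(), seis_sim.ravel(), seis_noise)
    return f_prod + f_seis

# Toy data: a 100-step water-cut series and a 20x20 4D amplitude-change map.
rng = np.random.default_rng(2)
wct_obs = np.linspace(0.1, 0.6, 100) + rng.normal(0, 0.02, 100)
wct_sim = np.linspace(0.1, 0.6, 100)
dmap_obs = rng.normal(0.0, 1.0, (20, 20))
dmap_sim = np.zeros((20, 20))

f = merged_objective(wct_obs, wct_sim, 0.02, dmap_obs, dmap_sim, 1.0)
print(f"merged misfit: {f:.2f}")   # ~2 when both fits are at the noise level
```

Because both terms share the same dimensionless scale, the evolutionary optimiser can minimise their sum directly without hand-tuned weights between well and seismic data.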
This combined production and seismic history matching has been applied to a UKCS field; it shows that an acceptable production data match is achieved while honouring saturation information obtained from 4D seismic surveys.