Almost simultaneously, advances were made both in understanding the processes within source rock organic matter that accompany the generation and expulsion of hydrocarbons and in the acquisition, processing, and quantitative interpretation of 3D seismic data. In particular, as the organic matter in the shales of unconventional plays generates and expels hydrocarbons, porosity forms within the organic matter, which becomes denser and more brittle. Although these changes occur at the micro-scale, the extraction of hundreds of different attributes from a well-imaged 3D seismic volume has made it possible to observe the corresponding changes at the macro-scale in seismic lines and horizons within that volume. Seismic attributes derived from pre-stack inversions yielding rock mechanical properties from shear (Vs) and compressional (Vp) velocities and density, when calibrated with well log and/or core measurements, can be combined to calculate TOC, pore pressure, rigidity, and compressibility, because these properties cause fundamental changes in how seismic waves travel through the rock.
Equally important, the escalation in computing power, coupled with methods such as machine learning, neural networks, and multivariate statistics, has made it possible to interpret large amounts of data. All of these innovations have contributed to better identification of sweet spots within unconventional plays. Such sweet spots include areas with elevated TOC values, enhanced porosity, and zones that can be targeted for fracking.
One of the primary advantages of seismic data is that it provides information in the areas between control points (wells). This information in turn helps operators to better select well targets and landing zones. Carefully tied 3D seismic inversion, integrated with petrophysical and rock data, further allows detailed characterization of unconventional reservoirs. The enhanced ability to identify the best potential drilling targets has significant economic implications in terms of risk reduction and improved chances of finding economic prospects.
While 3D seismic data are routinely used by numerous companies to predict the mechanical properties, density, and associated TOC of many formations, a direct link has yet to be made between TOC loss, kerogen conversion, and the associated changes in rock properties. This work documents the importance of TOC loss during maturation and its effects on rock properties such as porosity, density, and brittleness, and shows how those advances, coupled with advances in the quantitative interpretation of 3D seismic data, are enabling unconventional operators to predict location, thickness, landing zones, and sweet spots from appropriately acquired, processed, and interpreted 3D seismic data. Meticulously calibrated 3D seismic inversion, integrated with petrophysical and rock data, permits detailed characterization of unconventional reservoirs.
Updated methods for the back-calculation of original TOC have been developed using well logs, rock measurements, and 3D basin modeling to assist in locating and developing unconventional reservoirs. In addition, petrophysical measurements that reflect TOC and porosity, and that are related to the fundamental properties controlling the seismic response, can be extracted from the seismic reflection data. In turn, seismic attributes derived from pre-stack inversions yielding rock mechanical properties from shear (Vs) and compressional (Vp) velocities and density, when calibrated with well log and/or core measurements, can be combined to estimate TOC, pore pressure, rigidity, and compressibility, because these properties cause fundamental changes in how seismic waves travel through the rock.
This study shows advancements in studies of: 1) TOC loss with increased thermal maturation, 2) how this loss affects the development of organic porosity, 3) how kerogen becomes denser, harder, and more brittle with increasing maturity, and 4) how recent developments in quantitative interpretation workflows for 3D seismic data facilitate estimation of TOC and determination of rock mechanical properties from shear (Vs) and compressional (Vp) velocities and density. Further integration of geochemical, geomechanical, and geophysical technologies and measurements will provide improved estimates of present-day TOC that can in turn be extended to relative maturity and percent conversion.
Examples provided in this work illustrate prediction of present-day TOC, porosity, density, and mechanical properties extracted from high fidelity pre-stack inversion. Pre-stack inversion along with machine learning can be used to predict rock properties such as porosity, TOC, organic matter quality, rigidity, and pressure and to correlate those properties back to well productivity for improved execution. Relating present TOC estimated from seismic to TOC loss and kerogen property changes with increasing maturity is possible by combining the results of these technologies.
Though analysis and inversion of painstakingly acquired modern 3D seismic data can estimate porosity, TOC, matrix strength, and pore pressure, the latest work on how rock properties change as hydrocarbons mature and are expelled is not typically addressed in most studies. Increasing communication between disciplines might improve estimation of these properties and extend the capability to assess the extent of TOC loss during maturation and the porosity increases that accompany it. This ability is especially important in the inter-well regions, where the potential of 3D seismic to extend data between control points enables better reserve estimates and high-grading of acreage. After carefully calibrating a quantitative 3D seismic interpretation with a 3D basin modeling analysis of the source rock potential and maturity, an operator is better prepared to high-grade acreage and attain the most economic development of unconventional resources.
The escalation in computing power means there are hundreds of different attributes that can be extracted or calculated from a well-imaged 3D seismic volume. Using quantitative calibration of fundamental geochemical measurements such as TOC, pyrolysis, and petrographic measurements of vitrinite reflectance that yield the quantity, quality, and maturity of organic matter in combination with well log and seismic data creates a model for identifying sweet spots and the areas in the target formation that exhibit high TOC, high porosity, and elevated brittleness. Further integration and calibration of changes occurring at the micro-level in organic matter in unconventional plays with their impact on the signatures of data at the macro-level can provide information on the types of hydrocarbons most likely to be found in these sweet spots as well as identifying which zone(s) in the target formation are most likely to be amenable to fracking. Used together, the advances outlined here result in a technological evolution that could have a substantial impact on: 1) the approach to and 2) the economics of the exploration and production of unconventional plays.
The size of the individual seismic surveys has increased over the last decade, along with the generation of megamerge and even larger, what some operators call “gigamerge” surveys. The number of useful attribute volumes has also increased, such that interpreters may need to integrate terabytes of data. During the past several years, various machine learning methods including unsupervised, supervised and deep learning have been developed to better cope with such large amounts of information. In this study we apply several unsupervised machine learning methods to a seismic data volume from the Barents Sea, on which we had previously interpreted shallow high-amplitude anomalies using traditional interactive interpretation workflows. Specifically, we apply k-means, principal component analysis, self-organizing mapping and generative topographic mapping to a suite of attributes and compare them to previously generated P-impedance, porosity and Vclay displays, and find that self-organized mapping and the generative topographic mapping provide additional information of interpretation interest.
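The k-means step described above can be illustrated with a minimal numpy-only sketch. The attribute matrix here is a synthetic stand-in for flattened attribute volumes (in practice attributes such as envelope, coherence, curvature, or relative impedance); the voxel count, attribute count, and cluster count are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_attributes, k = 5000, 4, 8

# Synthetic stand-in for co-rendered attribute volumes flattened to
# (n_voxels, n_attributes); real workflows read these from the survey.
attributes = rng.normal(size=(n_voxels, n_attributes))

# Standardize each attribute so no single one dominates the Euclidean distance.
scaled = (attributes - attributes.mean(axis=0)) / attributes.std(axis=0)

# Plain Lloyd's algorithm: assign each voxel to its nearest centroid,
# then recompute the centroids, for a fixed number of iterations.
centroids = scaled[rng.choice(n_voxels, size=k, replace=False)]
for _ in range(20):
    d = np.linalg.norm(scaled[:, None, :] - centroids[None, :, :], axis=2)
    facies = d.argmin(axis=1)
    for j in range(k):
        if np.any(facies == j):
            centroids[j] = scaled[facies == j].mean(axis=0)

print(facies.shape)  # (5000,): one facies label per voxel
```

Each voxel ends up with one of k facies labels, which can then be co-rendered with the P-impedance, porosity, and Vclay displays for comparison.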
In the late 1980s, seismic facies analysis was carried out on 2D seismic data by visually examining the seismic waveforms, which can be characterized by their amplitude, frequency, and phase expression. Such information would be posted on maps and contoured to generate facies maps. As seismic data volumes increased in size with the adoption of 3D seismic data in the early 1990s, interpreters found that 3D seismic attributes highlighted patterns that facilitated the human recognition of geologic features on time and horizon slices, thereby both accelerating and further quantifying the interpretation. More recently, computer-assisted seismic facies classification techniques have evolved. Such methods or workflows examine seismic data, or their derived geometric, spectral, or geomechanical attributes, and assign each voxel to one of a finite number of classes, each of which is assumed to represent a seismic facies. Such seismic facies may or may not represent geologic facies or petrophysical rock types. In this workflow, well log data, completion data, or production data are then used to determine whether a given seismic facies is unique or should be lumped (or “clustered”) with other facies that have a similar attribute expression.
Seismic attributes can be both powerful and challenging to incorporate into interpretation and analysis. Recent developments with machine learning have added new capabilities to multi-attribute seismic analysis. In 2018, Geophysical Insights conducted a proof of concept on 100 square miles of multi-client 3D data jointly owned by Geophysical Pursuit, Inc. (GPI) and Fairfield Geotechnologies (FFG) in the Denver-Julesburg Basin (DJ). The purpose of the study was to evaluate the effectiveness of a machine learning workflow to improve resolution within the reservoir intervals of the Niobrara and Codell formations, the primary targets for development in this portion of the basin.
The seismic data are from Phase 5 of the GPI/Fairfield Niobrara program in northern Colorado. A preliminary workflow that included synthetics, horizon picking, and correlation of 28 wells was completed. The seismic volume was re-sampled from 2 ms to 1 ms. Detailed well time-depth charts were created for the Top Niobrara, the Niobrara A, B, and C benches, the Fort Hays, and the Codell intervals. The interpretations, along with the seismic volume, were loaded into the Paradise® machine learning application, and two suites of attributes, instantaneous and geometric, were generated. The first step in the machine learning workflow is Principal Component Analysis (PCA), a method of identifying the attributes that contribute most to the data and quantifying the relative contribution of each. PCA aids in selecting which attributes are appropriate for use in a Self-Organizing Map (SOM). In this case, 15 instantaneous attribute volumes, plus the parent amplitude volume, were used in the PCA, and eight were selected for use in SOMs. The SOM is a neural network-based machine learning process applied to multiple attribute volumes simultaneously. The SOM produces a non-linear classification of the data in a designated time or depth window.
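The PCA-based attribute screening can be sketched as follows. The attribute matrix is synthetic, and the selection rule (ranking attributes by their absolute loadings on the leading eigenvectors) is one plausible implementation, not necessarily the one used in Paradise®:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for 16 flattened attribute volumes
# (rows = voxels, columns = attributes, including the parent amplitude).
n_voxels, n_attr = 8000, 16
X = rng.normal(size=(n_voxels, n_attr))

# Standardize, then eigen-decompose the covariance matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xs, rowvar=False))
order = eigvals.argsort()[::-1]  # eigh returns ascending order; reverse it
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total variance explained by each principal component.
explained = eigvals / eigvals.sum()

# Illustrative selection rule: rank attributes by absolute loading on the
# first three eigenvectors and keep the top eight as SOM inputs,
# mirroring the 16-in / 8-out selection described above.
loading = np.abs(eigvecs[:, :3]).sum(axis=1)
selected = np.sort(np.argsort(loading)[::-1][:8])
print(selected.size)  # 8 attributes retained for the SOM
```

The retained attribute volumes would then be passed together to the SOM classification over the chosen time window.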
For this study, a 60-ms interval that encompasses the Niobrara and Codell formations was evaluated using several SOM topologies. One of the main drilling targets, the B chalk, is approximately 30 feet thick, making horizontal well planning and execution a challenge for operators. An 8 × 8 SOM applied to the 1 ms seismic data improves the stratigraphic resolution of the B bench. The neuron classification also images small but significant structural variations within the chalk bench; these variations correlate visually with the geometric curvature attributes. This improved resolution allows precise planning of horizontal wells within the bench. The 25-foot-thick C bench and the 17- to 25-foot-thick Codell are also seismically resolved via SOM analysis. Petrophysical analyses from wireline logs run in seven wells within the survey by Digital Formation, together with additional results from the SOMs, show the capability to differentiate a high-TOC upper unit within the A marl, which presents an additional exploration target. Utilizing 2D color maps and geobodies extracted from the SOMs, combined with the petrophysical results, allows calculation of reserves for the individual reservoir units as well as for the recently identified high-TOC target within the A marl.
The results show that a multi-attribute machine learning workflow improves the seismic resolution within the Niobrara reservoirs of the DJ Basin and results can be utilized in both exploration and development.
Identification and quantification of parasequences remains a key aspect of unconventional reservoir development; earlier work demonstrated the importance of gamma ray parasequences (GRP) in unconventional play development. Currently, most drilling plans in unconventional plays are executed using a “factory-made” drilling and completion program. With thousands of wells in an unconventional play, it is very difficult for operators to incorporate fine-scale reservoir characterization in time for the drilling plan.
Currently, the upward-dirtying and upward-cleaning parasequences in shale plays are interpreted qualitatively and manually by a human interpreter on individual well logs. We believe these parasequences hold key information about the underlying geology, and their quantification can provide key insights into the depositional environment and hence reservoir quality. Incorporating this information in time for drilling and completion can aid decision making on well placement and hydraulic fracturing design.
In this work, we address the reservoir characterization challenge on two fronts: we first provide a statistical filtering approach to interpret the parasequences in a well log and then apply it in machine-assisted fashion to other wells in the area of interest. We then use a least-squares fit to obtain the slopes of these parasequences. Furthermore, we map these slopes and compare them to the conventional parasequence thickness map to provide quantitative well log attributes that aid the geologic interpretation.
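The slope extraction can be illustrated with a hedged sketch: a simple moving average stands in for the statistical filter, and `np.polyfit` supplies the least-squares slope. The depth and gamma-ray values are invented for illustration:

```python
import numpy as np

# Hypothetical gamma-ray samples across one interpreted parasequence
# (depth in ft, GR in API units); the values are illustrative only.
depth = np.array([7000.0, 7002.0, 7004.0, 7006.0, 7008.0, 7010.0])
gr    = np.array([  60.0,   68.0,   75.0,   81.0,   90.0,   97.0])

# A three-point moving average stands in for the paper's statistical filter,
# suppressing sample-to-sample noise before the fit.
gr_smooth = np.convolve(gr, np.ones(3) / 3.0, mode="same")

# Least-squares slope of GR versus depth over the filtered interior samples;
# its sign and magnitude characterize the parasequence trend (here GR
# increasing with depth, i.e. an upward-cleaning pattern).
slope, intercept = np.polyfit(depth[1:-1], gr_smooth[1:-1], 1)
print(round(slope, 2))  # → 3.62 (API units per ft)
```

Repeating this per parasequence and per well yields the slope maps that the study compares against conventional parasequence thickness maps.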
Unconventional play development differs in key ways from conventional play development. In a conventional play, porosity, permeability, and related properties are the key drivers of production. In unconventional plays, well spacing, landing in the best zone, and hydraulic fracturing guide production performance in horizontal wells. Because the well is completed with a hydraulic fracturing operation, the geomechanical properties of the layer become of utmost importance. It has been proposed that the layered properties of shale reservoirs are highly complex and are composed of alternating brittle and ductile geological sequences, also known as brittle-ductile couplets. The optimal landing zone depends on a tradeoff between brittleness and rock properties such as total organic carbon (TOC). Rock that is good from the reservoir perspective, being high in TOC, is generally more ductile and not a suitable candidate for hydraulic fracturing, and vice versa.
Integration of time-lapse seismic data into dynamic reservoir models is an efficient process for calibrating reservoir parameter updates. The choice of metric used to measure the misfit between observed data and the simulated model has a considerable effect on the history matching process, and therefore on the optimal ensemble of models obtained. History matching using 4D seismic and production data simultaneously remains a challenge because of the different natures of the two data types (time series versus maps or volumes).
Conventionally, the misfit is formulated as a least-squares term, which is widely used for production data matching. Distance-based objective functions designed for 4D image comparison have been explored in recent years and have proven reliable. This study explores the history matching process by introducing a merged objective function combining the production and 4D seismic data. The approach proposed in this paper makes the two data types (well and seismic) comparable within a single objective function to be optimised, thereby avoiding the question of weights. An adaptive evolutionary optimisation algorithm is used for the history matching loop. Local and global reservoir parameters, including porosity, permeability, net-to-gross, and fault transmissibility, are perturbed in this process.
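One simple way to realize such a merged, weight-free objective is to normalize each misfit term by the energy of its own observed data, so the well (time-series) and seismic (map) terms become dimensionless and directly comparable. This is a sketch under that assumption, not the paper's exact formulation:

```python
import numpy as np

def merged_misfit(prod_obs, prod_sim, seis_obs, seis_sim):
    """Weight-free merged objective: each least-squares term is normalized
    by the energy of its observed data, making the production (time-series)
    and 4D seismic (map) contributions dimensionless."""
    prod = np.sum((prod_obs - prod_sim) ** 2) / np.sum(prod_obs ** 2)
    seis = np.sum((seis_obs - seis_sim) ** 2) / np.sum(seis_obs ** 2)
    return prod + seis

# Synthetic example: a production-rate time series and a 4D attribute map.
rng = np.random.default_rng(4)
rates_obs = rng.uniform(100.0, 200.0, size=50)
map_obs = rng.random((32, 32))
perfect = merged_misfit(rates_obs, rates_obs, map_obs, map_obs)
print(perfect)  # 0.0 for a perfect match
```

In the history matching loop, each candidate reservoir model would be simulated, its rates and synthetic 4D maps passed to this function, and the optimiser would minimize the returned scalar.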
This production and seismic history matching has been applied to a UKCS field; it shows that acceptable production data matching is achieved while honouring the saturation information obtained from the 4D seismic surveys.
Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, the difference in seismic properties across vintages enables the detection of spatio-temporal changes in saturation properties and structure induced by production. The main objectives are to (1) identify bypassed pay zones in time-lapse seismic data for the deepwater Amberjack field, located in the Gulf of Mexico, (2) confirm the identified bypassed pay zones in the results of reservoir simulation, and (3) recommend well planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow that incorporates seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). An embedded petro-elastic model (PEM) in the reservoir simulator is then used to link spatial dry rock properties with saturation properties and compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. In the absence of the requisite dry rock properties for the PEM, a small data engine is used to determine these missing properties using metaheuristic optimization techniques. Specifically, two particle swarm optimization (PSO) applications, including an exterior penalty function (EPF), are modified, resulting in the nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry rock bulk modulus, shear modulus, and density) necessary for the dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation. Afterward, a graphics-enabled method was devised to select the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference; its results are compared to unsupervised learning (k-means clustering and hierarchical clustering). From the seismic data, one can identify bypassed pay locations, which are confirmed by reservoir simulation after conducting a seismic-driven history match. Finally, infill wells are planned and then modeled in the reservoir simulator.
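A minimal particle swarm optimizer of the kind described can be sketched for the three unknowns (dry rock bulk modulus, shear modulus, density). The forward model, bounds, and target values below are illustrative stand-ins for the embedded PEM misfit, and simple box clipping stands in for the exterior penalty function:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy misfit: squared distance between candidate and "observed" parameters.
# The target triple (GPa, GPa, g/cc) is an assumed value for illustration.
target = np.array([12.0, 9.0, 2.4])

def misfit(x):
    return np.sum((x - target) ** 2)

# Minimal PSO: positions, velocities, personal bests, and a global best.
n_particles, n_dims, n_iters = 30, 3, 200
lo, hi = np.array([1.0, 1.0, 1.5]), np.array([40.0, 30.0, 3.0])
pos = rng.uniform(lo, hi, size=(n_particles, n_dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([misfit(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and cognitive/social coefficients
for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)  # box bounds stand in for the EPF
    vals = np.array([misfit(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(np.round(gbest, 2))  # should approach the target triple
```

In the actual workflow the misfit would compare PEM-computed P-impedance against the inverted multi-vintage P-impedance rather than known parameter values.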
Saluja, Vikas (Oil & Natural Gas Corporation LTD.) | Singh, Uday (Oil & Natural Gas Corporation LTD.) | Ghosh, Aninda (Oil & Natural Gas Corporation LTD.) | Prakash, Puja (Oil & Natural Gas Corporation LTD.) | Kumar, Ravendra (Oil & Natural Gas Corporation LTD.) | Verma, Rajeev (Oil & Natural Gas Corporation LTD.)
The case study presented here is an innovative workflow for fault delineation on a 3D seismic volume in the B-173A field of the Heera Panna Bassein (HPB) Sector, Western Offshore Basin, India. B-173A is located 50 km west of Mumbai at an average water depth of about 50 m. The field was discovered in 1992 and put on production in August 1998. It contains two hydrocarbon-bearing zones: the gas-bearing Mukta (Lower Oligocene carbonates) Formation and the oil-bearing Bassein (Middle to Upper Eocene carbonates) Formation.
The present study extends a workflow for advanced seismic interpretation using spectral decomposition and RGB blending for fault delineation. Iso-frequency volumes are extracted from relative acoustic impedance data rather than from the seismic data itself.
The workflow, designed for effective fault delineation, consists of spectral decomposition of the relative acoustic impedance data followed by RGB blending of discontinuity attributes computed from the different iso-frequency volumes.
It is observed that the RGB blend volume of discontinuity attributes provides more convincing fault delineation than traditional discontinuity attributes.
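The RGB blending step itself can be sketched simply: three iso-frequency discontinuity slices (for example low, mid, and high frequency) are each rescaled to 0..1 and stacked as the red, green, and blue channels. The input arrays here are synthetic placeholders for spectrally decomposed relative-acoustic-impedance discontinuity attributes:

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder slices for three iso-frequency discontinuity attributes.
low, mid, high = (rng.random((64, 64)) for _ in range(3))

def normalize(a):
    """Rescale an attribute slice to the 0..1 range for display."""
    return (a - a.min()) / (a.max() - a.min())

# Stack the normalized slices as R, G, B channels of one color image.
rgb = np.stack([normalize(low), normalize(mid), normalize(high)], axis=-1)
print(rgb.shape)  # (64, 64, 3): ready for display as a color image
```

Faults expressed at different frequencies then appear in distinct colors, while features common to all three frequencies blend toward grey or white.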
Wheeler, Mary F. (The University of Texas at Austin, USA) | Srinivasan, Sanjay (Pennsylvania State University, USA) | Lee, Sanghyun (Florida State University, USA) | Singh, Manik (Pennsylvania State University, USA)
Optimal design of hydraulic fractures is controlled by the distribution of natural fractures in the reservoir. Due to sparse information, there is uncertainty associated with the prediction of the natural fracture system. Our objective here is to: i) Quantify uncertainty associated with prediction of natural fractures using micro-seismic data and a Bayesian model selection approach, and ii) Use fracture probability maps to implement a finite element phase-field approach for modeling interactions of propagating fractures with natural fractures.
The proposed approach employs state-of-the-art numerical modeling of natural and hydraulic fractures using a diffusive adaptive finite element phase-field approach. The diffusive phase field is defined using the probability map describing the uncertainty in the spatial distribution of natural fractures. That probability map is computed using a model selection procedure that utilizes a suite of prior models for the natural fracture network and a fast proxy to quickly evaluate the forward seismic response corresponding to slip events along fractures. Employing indicator functions, diffusive fracture networks are generated utilizing an accurate computational adaptive mesh scheme based on a posteriori error estimators.
The coupled algorithm was validated against existing benchmark problems, including prototype computations with fracture propagation and reservoir flow in a highly heterogeneous reservoir with natural fractures. Implementation of an algorithm for computing the fracture probability map from synthetic micro-seismic data mimicking a Fort Worth Basin dataset reveals consistency between the interpreted fracture sets and those observed in the reference. Convergence of the iterative solvers and the numerical efficiency of the methods were tested against different examples, including field-scale problems. The results reveal that interpreting the uncertainty pertaining to the presence of fractures, and utilizing that uncertainty within the phase-field approach to simulate the interactions between induced and natural fractures, yields complex structures that include fracture branching, fracture hooking, etc.
The novelty of this work lies in the efficient integration of phase-field fracture propagation models with diffusive natural fracture networks and a stochastic representation of the uncertainty associated with the prediction of natural fractures in a reservoir. The presented method enables practicing engineers to design hydraulic fracturing treatments while accounting for the uncertainty in the location and spatial variation of natural fractures. Together with an efficient parallel implementation, our approach offers a cost-efficient means of optimizing production processes in the field.
A software system based on deep neural network (DNN) technology was designed and trained to recognize fault lines in 2D seismic vertical sections and fault surfaces in 3D seismic cubes. The system was successfully tested on public-domain data from several basins in New Zealand. The paper describes the key components of the system and explains how they were designed. A relatively small window is used to scan the 2D seismic sections. Two DNNs determine whether the window contains a fault and output the vector corresponding to the fault segment in the window. After the entire section is scanned, a clustering algorithm groups these vectors into separate clusters corresponding to the faults in the section. Finally, a linear regression algorithm calculates the fault lines, which are not always straight, in the section. Fault lines in successive 2D vertical sections of a 3D cube are associated to form fault surfaces. The training data set created to train the DNNs contains 120,000 examples; the validation test set has 30,000 examples. A special workflow and software were developed to generate these 150,000 labelled examples from a mix of synthetic and real data. The results obtained are quite satisfactory: the success rate exceeds 95% on the validation test set, and the vector clustering algorithm properly handles crossing faults. The system is designed to quickly learn to correct mistakes highlighted by geophysicists, such as patterns wrongly identified as faults or missed faults. The system was trained to identify fault patterns in seismic data purely from examples, and the concept can easily be expanded to recognize many other types of patterns, such as structural or stratigraphic traps, horizons, and types of seismic facies. This fault detection software is the first step toward a machine learning-based system for automated structural interpretation of seismic data.
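The scanning stage can be sketched as below, with a trivial statistic standing in for the trained DNN's fault probability and the window center recorded in place of the second network's segment vector. The window size, stride, threshold, and scoring rule are all assumptions for illustration:

```python
import numpy as np

def scan_section(section, win=16, stride=8, threshold=0.3):
    """Slide a small window over a 2D section and collect candidate fault
    locations; the patch standard deviation is a stub for the DNN score."""
    candidates = []
    n_t, n_x = section.shape
    for i in range(0, n_t - win + 1, stride):
        for j in range(0, n_x - win + 1, stride):
            patch = section[i:i + win, j:j + win]
            score = patch.std()  # stub: the DNN's fault probability goes here
            if score > threshold:
                # A second network would output the fault-segment vector;
                # here we just record the window center as a placeholder.
                candidates.append((i + win // 2, j + win // 2))
    return candidates

# Crude stand-in for a section containing one vertical discontinuity.
section = np.zeros((64, 64))
section[:, 30:34] = 1.0

hits = scan_section(section)
print(len(hits) > 0)  # candidate windows cluster around the discontinuity
```

In the full system, the collected segment vectors would then be clustered into per-fault groups and fit with regression to trace each fault line.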