Zhou, Xu (Louisiana State University) | Tyagi, Mayank (Louisiana State University) | Zhang, Guoyin (China University of Petroleum - Beijing) | Yu, Hao (Southwest Petroleum University) | Chen, Yangkang (Zhejiang University)
With recent developments in data acquisition and storage techniques, a huge amount of data is now available for data-driven decision making in the Oil & Gas industry. This study explores the application of Big Data Analytics to establish statistical relationships between seismic attribute values from a 3D seismic survey and petrophysical properties from well logs. Such relationships and models can then be used to optimize exploration and production operations.
3D seismic data can be used to extract various seismic attribute values at all locations within the seismic survey. Well logs provide accurate constraints on the petrophysical values along the wellbore. Big Data Analytics methods are utilized to establish the statistical relationships between seismic attributes and petrophysical data. Since seismic data are at the reservoir scale and are available at every sample cell of the seismic survey, these relationships can be used to estimate the petrophysical properties at all locations inside the seismic survey.
In this study, the Teapot Dome 3D seismic survey is selected for extracting seismic attribute values. A set of instantaneous seismic attributes, including curvature, instantaneous phase, and trace envelope, is extracted from the 3D seismic volume. Deep Learning Neural Network models are created to establish the relationships between the input seismic attribute values from the seismic survey and petrophysical properties from well logs. Results show that a Deep Learning Neural Network model with multiple hidden layers is capable of predicting porosity values using seismic attribute values extracted from 3D seismic volumes. Utilizing a subset of the seismic attributes improves the model's performance in predicting porosity values from seismic data.
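The kind of attribute-to-porosity regression described above can be illustrated with a minimal numpy sketch. This is not the authors' model: the synthetic data, the single hidden layer, and all sizes and learning rates are illustrative assumptions, with the understanding that a deep model would simply stack more hidden layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: rows are seismic sample cells, columns are
# attribute values (e.g. curvature, instantaneous phase, trace envelope).
X = rng.normal(size=(500, 3))
# Hypothetical "true" porosity: a smooth nonlinear function of the attributes.
y = 0.15 + 0.05 * np.tanh(X @ np.array([0.8, -0.5, 0.3]))

# One-hidden-layer network trained by plain gradient descent.
W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(2000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    pred = (H @ W2 + b2).ravel()      # predicted porosity per cell
    err = pred - y
    # Backpropagate mean-squared-error gradients.
    gW2 = H.T @ err[:, None] / len(y)
    gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(y); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((pred - y) ** 2))
```

In practice a framework such as TensorFlow or PyTorch would replace the hand-written backpropagation, but the structure (attribute vector in, porosity out) is the same.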
It is well known that geophysics, particularly the
Maleki, Masoud (University of Campinas UNICAMP) | Danaei, Shahram (University of Campinas UNICAMP) | Davolio, Alessandra (University of Campinas UNICAMP) | José Schiozer, Denis (University of Campinas UNICAMP)
Permanent Reservoir Monitoring (PRM) systems in deep-water settings provide on-demand snapshots of hydrocarbon reservoirs at different times during their production history. Delays in the interpretation turnaround of 4D seismic data reduce some of the benefits of PRM; despite information being obtained on demand, these delays can adversely impact decision-making processes. Using fast-track approaches in 4D seismic interpretation can provide timely information for reservoir management. This work focuses on fast-track 4D seismic qualitative interpretation in a PRM environment, with the aim of choosing the best 4D seismic amplitude attribute to use. Different seismic attributes are extracted, and the one with the highest signal-to-noise ratio is selected to carry out the 4D qualitative interpretation. All 4D signals are juxtaposed with well production history data to increase confidence in the interpretation. The selected attribute can then be interpreted and used for the foreseeable life of the field. This workflow has been developed and applied to a post-salt Brazilian offshore field to choose the best seismic attribute for conducting the 4D seismic qualitative interpretation.
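The attribute-selection step described above can be sketched as a simple signal-to-noise screening. This is an illustrative assumption, not the paper's workflow: the attribute names, the Gaussian 4D anomaly, and the signal/noise zone definitions are all synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 4D difference maps (monitor minus base) for three
# candidate amplitude attributes over a 50 x 50 survey grid.
nx = ny = 50
attributes = {
    "rms_amplitude": rng.normal(scale=0.2, size=(ny, nx)),
    "sum_of_negative_amplitudes": rng.normal(scale=0.5, size=(ny, nx)),
    "envelope": rng.normal(scale=0.8, size=(ny, nx)),
}
# Inject a common 4D "signal" (e.g. waterflood softening) around a producer.
yy, xx = np.mgrid[0:ny, 0:nx]
anomaly = np.exp(-((xx - 25) ** 2 + (yy - 25) ** 2) / 40.0)
for name in attributes:
    attributes[name] += 2.0 * anomaly

# Signal zone: near the well, where production history predicts a 4D effect.
# Noise zone: far from any activity, where differences should be pure noise.
signal_mask = anomaly > 0.5
noise_mask = anomaly < 0.01

def snr(diff_map):
    """RMS of the expected-signal zone over RMS of the quiet zone."""
    return (np.sqrt(np.mean(diff_map[signal_mask] ** 2))
            / np.sqrt(np.mean(diff_map[noise_mask] ** 2)))

best = max(attributes, key=lambda name: snr(attributes[name]))
```

Here the attribute with the least background noise scores highest; juxtaposing the winning map with production history, as the abstract describes, would be the qualitative confirmation step.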
The seismic inversion method using seismic onset times has shown great promise for integrating continuous seismic surveys to update geologic models. However, due to the high cost of seismic surveys, such frequent acquisitions are not commonly available. In this study, we analyze the impact of seismic survey frequency on the onset time approach, aiming to extend its advantages to cases where only infrequent seismic surveys are available.
To analyze the impact of seismic survey frequency on the onset time approach, first, we conduct a sensitivity analysis based on the frequent seismic survey data (over 175 surveys) of steam injection in a heavy oil reservoir (Peace River Unit) in Canada. The calculated onset time maps based on seismic survey data sampled at various time intervals from the frequent data sets are compared to examine the need and effectiveness of interpolation between surveys. Additionally, we compare the onset time inversion with the traditional seismic amplitude inversion and quantitatively investigate the nonlinearity and robustness for these two inversion methods.
The sensitivity analysis shows that, by interpolating between seismic surveys, an adequate onset time map can be extracted even from infrequent seismic surveys. This holds as long as there are no changes in the underlying physical mechanisms during the interpolation period. A 2D waterflooding case demonstrates that interpolation is necessary when the time span between seismic surveys is large, yielding a more accurate model update and more efficient data misfit reduction during the inversion. The SPE Brugge benchmark case shows that the onset time inversion method obtains a permeability update comparable to the traditional seismic amplitude inversion method while being much more efficient. This results from the significant data reduction achieved by integrating a single onset time map rather than multiple sets of amplitude maps. The onset time approach also achieves superior convergence performance owing to its quasi-linear properties: the nonlinearity of the onset time method can be several orders of magnitude smaller than that of the amplitude inversion method.
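The interpolation idea above can be illustrated with a small sketch: per cell, the onset time is the first time the 4D amplitude change crosses a threshold, with linear interpolation between sparse surveys to refine the crossing. The threshold, grid size, and survey times are illustrative assumptions, and the first survey is assumed to be pre-onset.

```python
import numpy as np

def onset_time_map(survey_times, amplitude_changes, threshold):
    """First time the 4D amplitude change exceeds `threshold` in each cell,
    linearly interpolating between sparse surveys.
    survey_times: (nt,) acquisition times.
    amplitude_changes: (nt, ny, nx) change maps relative to the base survey."""
    nt, ny, nx = amplitude_changes.shape
    onset = np.full((ny, nx), np.nan)   # NaN where no onset is observed
    for j in range(ny):
        for i in range(nx):
            trace = amplitude_changes[:, j, i]
            for k in range(1, nt):
                if trace[k] >= threshold:
                    # Interpolate the crossing time between surveys k-1 and k.
                    f = (threshold - trace[k - 1]) / (trace[k] - trace[k - 1])
                    onset[j, i] = (survey_times[k - 1]
                                   + f * (survey_times[k] - survey_times[k - 1]))
                    break
    return onset

# Illustrative example: 4 infrequent surveys of a 2 x 2 region.
times = np.array([0.0, 100.0, 200.0, 300.0])
changes = np.zeros((4, 2, 2))
changes[:, 0, 0] = [0.0, 0.2, 0.8, 1.0]   # crosses 0.5 between days 100 and 200
changes[:, 1, 1] = [0.0, 0.0, 0.1, 0.3]   # never reaches the threshold
onset = onset_time_map(times, changes, threshold=0.5)
```

With only the two bracketing surveys available, the interpolated crossing (day 150 in the first cell) stands in for the many intermediate surveys a truly continuous acquisition would provide, which is exactly where the assumption of unchanged physical mechanisms matters.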
Within a single field, geophysical survey results always comprise a significant amount of data with considerable variability and heterogeneity. This allows geophysical data to be classified as Big Data. Data scientists and software developers increasingly recommend the use of machine learning techniques for data processing and interpretation. ML algorithms make it possible to extract the most complete amount of useful information, reduce time costs, and minimize the subjective factor in the decision-making process. Early testing of these approaches began in the 1960s, and active practical implementation took place in the 1990s due to the large-scale adoption of seismic studies in the 3D CDP modification [1]. The emergence of new algorithms, modifications of the original data, and the development of computational resources keep this topic relevant at the present time. In seismic data interpretation, machine learning approaches provide high performance in automatic horizon picking, fault tracing, seismic facies analysis, seismic inversion, reservoir prediction, etc. Application of ML algorithms is especially effective at the seismic facies analysis stage, since in multiattribute classification the initial dataset increases severalfold in accordance with the number of calculated attributes [5-7, 9, 10].
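Multiattribute seismic facies classification of the kind mentioned above is often done with unsupervised clustering. A minimal sketch with a hand-written k-means on synthetic data follows; the two well-separated "facies", the three attributes, and all sizes are illustrative assumptions, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in multiattribute dataset: each row is a map cell, each column one
# computed seismic attribute (a real stack may hold dozens of attributes).
facies_a = rng.normal(loc=[0, 0, 0], scale=0.3, size=(200, 3))
facies_b = rng.normal(loc=[2, 2, 2], scale=0.3, size=(200, 3))
X = np.vstack([facies_a, facies_b])

def kmeans(X, k, n_iter=50, seed=0):
    """Plain k-means: alternate nearest-centroid assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Keep the old centroid if a cluster temporarily empties out.
        centroids = np.array([X[labels == i].mean(axis=0)
                              if np.any(labels == i) else centroids[i]
                              for i in range(k)])
    return labels, centroids

labels, centroids = kmeans(X, k=2)
```

Each attribute multiplies the feature dimension, which is why the dataset grows severalfold with the number of calculated attributes; the clustering step itself is unchanged.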
Gupta, Harshit (Indian Institute of Technology Delhi) | Pradhan, Siddhant (Indian Institute of Technology Delhi) | Gogia, Rahul (Indian Institute of Technology Delhi) | Srirangarajan, Seshan (Indian Institute of Technology Delhi) | Phirani, Jyoti (Indian Institute of Technology Delhi) | Ranu, Sayan (Indian Institute of Technology Delhi)
Horizons in a seismic image are geologically significant surfaces that can be used for understanding geological structures and stratigraphic models. However, horizon tracking in seismic data is a time-consuming and challenging task. Saving geologists' time on this seismic interpretation task is essential given the time constraints on decision making in the oil & gas industry. We take advantage of deep convolutional neural networks (CNNs) to track horizons directly from seismic images. We propose a novel automatic seismic horizon tracking method that can reduce the time needed for interpretation as well as increase accuracy for geologists. We show the performance of the proposed CNN model for different training data set sizes and different methods of balancing the classes.
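Class balancing matters here because horizon pixels are rare compared to background, so a naive training set is heavily imbalanced. One common balancing method, minority-class oversampling, can be sketched as follows; the patch sizes, label counts, and helper name are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative imbalanced training set: 1 = horizon patch, 0 = non-horizon.
labels = np.array([1] * 50 + [0] * 950)
patches = rng.normal(size=(1000, 16, 16))   # stand-in seismic image patches

def oversample_minority(patches, labels, seed=0):
    """Balance classes by resampling each class up to the majority count,
    drawing minority-class patches with replacement."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(labels, return_counts=True)
    target = counts.max()
    idx = []
    for c, n in zip(classes, counts):
        members = np.flatnonzero(labels == c)
        idx.append(rng.choice(members, size=target, replace=(n < target)))
    idx = np.concatenate(idx)
    rng.shuffle(idx)                         # mix classes before batching
    return patches[idx], labels[idx]

X_bal, y_bal = oversample_minority(patches, labels)
```

Alternatives with the same goal include undersampling the majority class or weighting the loss per class; comparing such choices is precisely the kind of experiment the abstract reports.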
Taha, Taha (Emerson Automation Solutions) | Ward, Paul (Emerson Automation Solutions) | Peacock, Gavin (Emerson Automation Solutions) | Heritage, John (Emerson Automation Solutions) | Bordas, Rafel (Emerson Automation Solutions) | Aslam, Usman (Emerson Automation Solutions) | Walsh, Steve (Emerson Automation Solutions) | Hammersley, Richard (Emerson Automation Solutions) | Gringarten, Emmanuel (Emerson Automation Solutions)
This paper presents a case study in 4D seismic history matching using an automated, ensemble-based workflow that tightly integrates the static and dynamic domains. Subsurface uncertainties, captured at every stage of the interpretative and modelling process, are used as inputs within a repeatable workflow. By adjusting these inputs, an ensemble of models is created, and their likelihoods constrained by observations within an iterative loop. The result is multiple realizations of calibrated models that are consistent with the underlying geology, the observed production data, the seismic signature of the reservoir and its fluids. It is effectively a digital twin of the reservoir with an improved predictive ability that provides a realistic assessment of uncertainty associated with production forecasts.
The example used in this study is a synthetic 3D model mimicking a real North Sea field. Data assimilation is conducted using an Ensemble Smoother with Multiple Data Assimilation (ES-MDA). This paper has a significant focus on seismic data, with the corresponding result vector generated via a petro-elastic model. 4D seismic data proves to be a key additional source of measurement data with a unique volumetric distribution, creating a coherent predictive model. This allows recovery of the underlying geological features and more accurately models the uncertainty in predicted production than was possible by matching production data alone.
A significant advantage of this approach is the ability to simultaneously utilize multiple types of measurement data, including production, RFT, PLT, and 4D seismic. Newly acquired observations can be rapidly accommodated, which is often critical because the value of most interventions is reduced by delay.
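The core of the ES-MDA scheme mentioned above is a Kalman-like ensemble update repeated with inflated measurement noise. A minimal sketch on a toy linear forward model follows; the ensemble size, inflation factor, and the scalar forward model d = 2m are illustrative assumptions, and a real application would replace the forward model with a reservoir simulator plus petro-elastic model.

```python
import numpy as np

def esmda_step(M, D, d_obs, Cd_diag, alpha, seed=0):
    """One ES-MDA analysis: update parameter ensemble M (Ne, Nm) given
    predicted data D (Ne, Nd), observations d_obs (Nd,), diagonal
    measurement-error variances Cd_diag (Nd,), and inflation factor alpha."""
    rng = np.random.default_rng(seed)
    Ne = M.shape[0]
    # Ensemble cross- and auto-covariances.
    Am = M - M.mean(axis=0)
    Ad = D - D.mean(axis=0)
    Cmd = Am.T @ Ad / (Ne - 1)
    Cdd = Ad.T @ Ad / (Ne - 1)
    # Perturb observations with inflated noise; across all ES-MDA
    # iterations the sum of 1/alpha must equal 1.
    noise = rng.normal(size=D.shape) * np.sqrt(alpha * Cd_diag)
    D_obs = d_obs + noise
    K = Cmd @ np.linalg.inv(Cdd + alpha * np.diag(Cd_diag))
    return M + (D_obs - D) @ K.T

# Toy problem: forward model d = 2 m, true m = 3, observed d = 6.
rng = np.random.default_rng(4)
M = rng.normal(loc=0.0, scale=1.0, size=(100, 1))   # prior ensemble
Cd = np.array([0.01])
alpha = 4.0                                          # 4 iterations, alpha = 4 each
for i in range(4):
    D = 2.0 * M                                      # "simulate" the ensemble
    M = esmda_step(M, D, np.array([6.0]), Cd, alpha, seed=i)
post_mean = float(M.mean())
```

In the field-scale workflow the data vector would concatenate production, RFT, PLT, and 4D seismic observations, which is what makes the simultaneous use of all measurement types natural in this framework.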
Is the Cloud Mature Enough for High-Performance Computing? Data volumes are growing at an exponential rate. How can high-performance computing solutions help operators manage these volumes? Will faster, stronger processors and cloud computing solutions be the answer? Last year saw continued contraction in the seismic data-acquisition industry.
This paper introduces a new core-analysis work flow for determining resistivity index (RI), formation factor (FF), and other petrophysical properties directly from an as-received (AR) set of core samples. This paper discusses a study undertaken to gain better understanding of nuclear magnetic resonance (NMR) characteristics of volcanic reservoirs with different lithologies.
In some reservoir applications, seismic data are acquired with downhole sources and receivers. If the receiver is stationed at various depth levels in a well and the source remains on the surface, the measurement is called vertical seismic profiling (VSP). This technique produces a high-resolution, 2D image that begins at the receiver well and extends a short distance (a few tens of meters or a few hundred meters, depending on the source offset distance) toward the source station. This image, a 2D profile restricted to the vertical plane passing through the source and receiver coordinates, is useful in tying seismic responses to subsurface geologic and engineering control. If the source is deployed at various depth levels in one well and the receiver is placed at several depth stations in a second well, the measurement is called crosswell seismic profiling (CSP). Images made from CSP data have the best spatial resolution of any seismic measurement used in reservoir characterization because a wide range of frequencies is recorded.