Copyright 2019 held jointly by the Society of Petrophysicists and Well Log Analysts (SPWLA) and the submitting authors. ABSTRACT Today, many machine learning techniques are regularly employed in petrophysical modelling, such as cluster analysis, neural networks, fuzzy logic, self-organising maps, genetic algorithms, and principal component analysis. While each of these methods has its strengths and weaknesses, a challenge common to most existing techniques is how best to handle the variety of dynamic ranges present in petrophysical input data. Mixing input data with logarithmic variation (such as resistivity) and linear variation (such as gamma ray) while effectively balancing the weight of each variable can be particularly difficult to manage. DTA was conceived based on extensive research conducted in the field of Computational Fluid Dynamics (CFD). This paper focuses on the application of DTA to petrophysics and its fundamental distinction from various other statistical methods adopted in the industry. Case studies are shown in which porosity and permeability are predicted for a variety of scenarios using the DTA method and other techniques. The results from the various methods are compared, and the robustness of DTA is illustrated. The example datasets are drawn from public databases within the Norwegian and Dutch sectors of the North Sea, and Western Australia. Some have a rich set of input data, including logs, core, and reservoir characterisation, from which to build a model, while others have relatively sparse data available, allowing the effectiveness of the method to be assessed with both rich and poor training data. The paper concludes with recommendations on the best way to use DTA in real time to predict porosity and permeability.
INTRODUCTION The seismic shift in the data analytics landscape after the Macondo disaster has brought intense focus on the accuracy and precision of pore pressure and petrophysical parameter predictions.
Oil price forecasting has been shown to be challenging, if not impossible, over the long term. However, the oil price has a major impact on Exploration and Production projects.
The historical Project Realized Oil Price (PROP) can be calculated for example projects by summing the total project revenue at actual oil prices and dividing by the total amount of oil produced. For different starting dates of the example projects, the PROP changes. By determining the PROP for different starting times, a Cumulative Distribution Function (CDF) can be derived. Adjusting this CDF for expected "half cycle breakeven costs" at the low limit and for demand considerations at the high case leads to a PROP range that can be used for future project evaluation.
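The PROP calculation described above can be sketched in a few lines. The price deck and production profile below are hypothetical illustration data, not values from the paper:

```python
# Sketch of the Project Realized Oil Price (PROP) calculation:
# PROP = total revenue at actual annual prices / total oil produced.
# All numbers below are hypothetical placeholders for illustration only.

def prop(prices_by_year, production_by_year):
    """Revenue-weighted average realized price over the project life."""
    revenue = sum(prices_by_year[y] * production_by_year[y] for y in production_by_year)
    volume = sum(production_by_year.values())
    return revenue / volume

# Hypothetical price deck ($/bbl) and a 3-year production profile (MMbbl/yr)
prices = {2014: 99, 2015: 52, 2016: 44, 2017: 54, 2018: 71}
profile = [10, 8, 6]

# PROP for each possible project start year -> empirical CDF
props = []
for start in range(2014, 2017):
    production = {start + i: q for i, q in enumerate(profile)}
    props.append(prop({y: prices[y] for y in production}, production))

props.sort()
cdf = [(p, (i + 1) / len(props)) for i, p in enumerate(props)]
```

Shifting the start date through a volatile price history is what spreads the PROP values out into a distribution; the low and high tails of this empirical CDF are then trimmed by breakeven-cost and demand considerations as described above.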
Including PROP ranges in project evaluation allows for the selection of the most attractive development option, Value of Information analysis, and calculation of the project Probability of Economic Success (PES) including oil price uncertainty.
Furthermore, using PROP ranges rather than oil price scenarios enables a distinction between short-term budget planning and long-term project development. For budget planning, a scenario approach is suggested, while for long-term planning PROP ranges should be used. Basing long-term planning on PROP ranges leads to less fluctuation in staff planning and to small annual adjustments in the PROP range forecast. Also, using PROP ranges results in higher PES project hurdles at low oil prices and lower PES hurdles at high oil prices. Hence, at low oil prices the risk aversion of the company is increased. Another effect of using PROP ranges is that at high oil prices the robustness of projects to low oil prices is included in the assessment.
To investigate the effect of PROP ranges on portfolio and project PES hurdles, a simplified linear-fit model was developed. The results showed that the project PES hurdles in a Value at Risk assessment can be determined by applying the linear-fit model to quantify the oil price dependency. The required individual project PES hurdles can then be adjusted using the linear-fit model to account for oil price uncertainty.
A high rate of penetration (ROP) is considered one of the most sought-after targets when drilling a well. While physics-based models determine the importance of drilling parameters, they fail to capture the extent to which the interplay of the different dynamic drilling features influences ROP. Parameters such as WOB, RPM, flowrate, MSE, bit run distance, and gamma ray were examined for each rock formation in the Volve field in the North Sea. Ensuring an adequate ROP while controlling the tool face orientation is quite challenging; nevertheless, doing so helps follow the planned well trajectory and eliminates excessive doglegs that lead to wellbore deviations. Five different machine learning algorithms were initially implemented to optimize ROP and create a less tortuous borehole. The collected data was cleaned, preprocessed, and used to train Random Forest, Support Vector Regression, Ridge Regression, LASSO, Gradient Boosting, and XGBoost models, among others, and the appropriate hyperparameters were selected. A successful model was chosen based on maximized ROP, minimized deviation from the planned trajectory, and lower CPF. An MAEP of 15% was achieved using Gradient Boosting, followed by AdaBoost. The algorithms have demonstrated competence on the historical dataset; accordingly, they will be further tested on blind data to serve as a real-time drilling optimization system, enabling a fully automated workflow.
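The fit-and-score loop described above can be illustrated with a deliberately minimal baseline: a single-feature least-squares model of ROP versus WOB, evaluated with a mean absolute percentage error. The numbers are synthetic placeholders; the paper used Volve field records and ensemble methods (Random Forest, Gradient Boosting, XGBoost), not this toy regression:

```python
# Minimal illustration of the ROP-modelling workflow: fit a baseline model of
# ROP from one drilling parameter, then score it with a percentage error.
# Data values are synthetic placeholders, not Volve field measurements.

def fit_linear(x, y):
    """Ordinary least squares for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

wob = [5, 10, 15, 20, 25]    # weight on bit, klbf (synthetic)
rop = [12, 21, 33, 39, 52]   # rate of penetration, m/hr (synthetic)

a, b = fit_linear(wob, rop)
pred = [a * x + b for x in wob]
err = mape(rop, pred)        # same style of metric as the MAEP quoted above
```

In practice each candidate algorithm would be trained on held-out splits with tuned hyperparameters and compared on exactly this kind of error metric before blind testing.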
Maintaining a stable borehole and optimizing drilling are still considered vital practices for the success of any hydrocarbon field development plan. The present work is a case study on the estimation of pore pressure and fracture gradient for the recently decommissioned Volve oil field in the North Sea. High-resolution geophysical logs acquired through the reservoir formation of the studied field have been used to estimate the overburden, pore pressure, and fracture pressure. The well-known Eaton's method and the Matthews-Kelly method were used to estimate the pore pressure and fracture gradient, respectively. Estimated outputs were calibrated and validated against the available direct downhole measurements (formation pressure measurements, LOT/FIT). Further, the shear failure gradient has been calculated using the Mohr-Coulomb rock failure criterion to understand the wellbore stability issues in the studied field. Largely, the pore pressure in the reservoir formation is hydrostatic in nature, except in the lower Cretaceous to upper Jurassic shales, which were found to be associated with mild overpressure regimes. This study is an attempt to assess the in-situ stress system of the Volve field should CO2 be injected for geological storage in the near future.
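The two estimation steps named above can be sketched with their standard textbook forms. The Eaton exponent (3.0), the normal-compaction trend value, and the matrix stress coefficient Ki below are illustrative assumptions, not calibrated Volve values; in practice all of them are tuned against direct measurements such as LOT/FIT:

```python
# Hedged sketch of Eaton's sonic method for pore pressure and the
# Matthews-Kelly relation for fracture gradient. All parameter values are
# illustrative assumptions and must be calibrated to local data.

def eaton_pore_pressure(obg, pn, dt_normal, dt_observed, exponent=3.0):
    """Pore pressure gradient from sonic transit time.

    obg: overburden gradient; pn: normal (hydrostatic) pore pressure gradient;
    dt_normal: transit time on the normal-compaction trend; dt_observed:
    measured transit time. All gradients share the same units.
    """
    return obg - (obg - pn) * (dt_normal / dt_observed) ** exponent

def matthews_kelly_frac_gradient(obg, pp, ki):
    """Fracture gradient = Ki * effective overburden + pore pressure gradient."""
    return ki * (obg - pp) + pp

obg = 1.0     # overburden gradient, psi/ft (illustrative)
pn = 0.465    # hydrostatic pore pressure gradient, psi/ft

# Observed transit time slower than the trend -> undercompaction -> overpressure
pp = eaton_pore_pressure(obg, pn, dt_normal=90.0, dt_observed=110.0)
fg = matthews_kelly_frac_gradient(obg, pp, ki=0.8)
```

The mild overpressure reported for the lower Cretaceous to upper Jurassic shales corresponds to exactly this situation: observed transit times falling above the normal-compaction trend push the Eaton estimate above the hydrostatic gradient, which in turn raises the fracture gradient toward the overburden.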
Petrophysics is a pivotal discipline that bridges engineering and geosciences for reservoir characterization and development. New sensor technologies have enabled real-time streaming of large-volume, multi-scale, and high-dimensional petrophysical data into our databases. Petrophysical data types are extremely diverse, and include numeric curves, arrays, waveforms, images, maps, 3-D volumes, and texts. All data can be indexed with depth (continuous or discrete) or time. Petrophysical data exhibits all the "7V" characteristics of big data, i.e., volume, velocity, variety, variability, veracity, visualization, and value. This paper will give an overview of both theories and applications of machine learning methods as applicable to petrophysical big data analysis.
Recent publications indicate that petrophysical data-driven analytics (PDDA) has been emerging as an active sub-discipline of petrophysics. Field examples from the petrophysics literature will be used to illustrate the advantages of machine learning in the following technical areas: (1) Geological facies classification or petrophysical rock typing; (2) Seismic rock properties or rock physics modeling; (3) Petrophysical/geochemical/geomechanical properties prediction; (4) Fast physical modeling of logging tools; (5) Well and reservoir surveillance; (6) Automated data quality control; (7) Pseudo data generation; and (8) Logging or coring operation guidance.
The paper will also review the major challenges that need to be overcome before the potentially game-changing value of machine learning for the petrophysics discipline can be realized. First, a robust theoretical foundation to support the application of machine learning to petrophysical interpretation should be established; second, the utility of existing machine learning algorithms must be evaluated and tested on different petrophysical tasks with different data scenarios; third, procedures to control the quality of data used in machine learning algorithms need to be implemented, and the associated uncertainties need to be appropriately addressed. The paper will outline future opportunities for enabling advanced data analytics to solve challenging oilfield problems in the era of the 4th industrial revolution (IR4.0).
Pastusek, Paul (ExxonMobil Development Co.) | Payette, Greg (ExxonMobil Upstream Research Co.) | Shor, Roman (University of Calgary) | Cayeux, Eric (Norce) | Aarsnes, Ulf Jakob (Norce) | Hedengren, John (Brigham Young University) | Menand, Stéphane (DrillScan) | Macpherson, John (Baker Hughes GE) | Gandikota, Raju (MindMesh Inc.) | Behounek, Michael (Apache Corp.) | Harmer, Richard (Schlumberger) | Detournay, Emmanuel (University of Minnesota) | Illerhaus, Roland (Integrity Directional) | Liu, Yu (Shell Development Co.)
The drilling industry has substantially improved performance based on knowledge from physics-based, statistical, and empirical models of components and systems. However, most models and source code have been recreated multiple times, which requires significant effort and energy with little additional benefit or stepwise improvement. The authors propose that it is time to form a coalition of industry and academic leaders to support an open source effort for drilling, to encourage the reuse of continuously improving models and coding efforts. The vision for this guiding coalition is to 1) set up a repository for source code, data, benchmarks, and documentation, 2) encourage good coding practices, 3) review and comment on the models and data submitted, 4) test, use, and improve the code, 5) propose and collect anonymized real data, 6) attract talent and support to the effort, and 7) mentor those getting started. Those interested in adding their time and talent to the cause may publish their results through peer-reviewed literature.
SUMMARY We apply an acoustic multiple attenuation method to a marine OBC dataset from the Volve field in the North Sea. The method consists of estimating surface-to-subsurface virtual responses using the Marchenko method, and using interferometric integrals to compute prestack multiple estimates from these virtual responses. The prestack multiples are adaptively subtracted from the original reflection data, and then images are created. We provide detailed comparisons between the multiples-attenuated image, the image from migrating the original dataset, and an image obtained by migrating only the multiples. We show that several structures are attenuated in our new images, evidencing that our multiple attenuation method is effective on field data.
Elastic full waveform inversion (EFWI) embodies the original intention of waveform inversion at its inception, as it is a better representation of the mostly solid Earth. However, compared with the acoustic P-wave assumption, EFWI for P- and S-wave velocities using multi-component data has yielded mixed results. Full waveform inversion (FWI) is a highly nonlinear problem, and this nonlinearity only increases under the elastic assumption. Reflection waveform inversion (RWI) can mitigate the nonlinearity by relying on transmissions from reflections, focusing on inverting the low-wavenumber components of the model. In our elastic endeavor, we split the P- and S-wave velocities into low-wavenumber and perturbation components and propose a nonlinear approach to invert for both of them. The new optimization problem is built on an objective function that depends on both background and perturbation models. We utilize an equivalent stress source based on the model perturbation to generate reflections, instead of demigrating from an image as is done in conventional RWI. Application to a slice of an ocean-bottom dataset shows that our method can efficiently update the low-wavenumber parts of the model and, moreover, obtain perturbations that can be added to the low wavenumbers for a high-resolution output.
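The splitting described above can be written schematically; the symbols below are notation assumed here for illustration, not taken from the paper:

```latex
% Background/perturbation splitting of the elastic velocities (schematic)
v_p = v_{p0} + \delta v_p, \qquad v_s = v_{s0} + \delta v_s
% Nonlinear objective depending on both background and perturbation (schematic)
J(v_0, \delta v) = \tfrac{1}{2}\,\bigl\lVert d_{\mathrm{syn}}(v_0, \delta v) - d_{\mathrm{obs}} \bigr\rVert_2^2
```

Because $J$ depends jointly on $v_0$ and $\delta v$, both components are updated within one nonlinear problem, rather than fixing an image and demigrating as in conventional RWI.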
Presentation Date: Monday, October 17, 2016
Start Time: 4:35:00 PM
Presentation Type: ORAL
For the purpose of extracting higher resolution information from a 3D field data set, we apply a 3D elastic orthorhombic (ORT) anisotropic full waveform inversion (FWI) to better represent the physics of the Earth. We utilize what we consider the optimal parameterization for surface-acquired seismic data over a potentially orthorhombic medium. This parameterization admits the possibility of a hierarchical implementation moving from higher anisotropy symmetries to lower ones. From the analysis of the radiation pattern of this new parameterization, we focus the inversion of the 3D data on the parameters that may have an imprint on the data with minimal tradeoff, and as a result we invert for the horizontal P-wave velocity model.
Presentation Date: Tuesday, October 18, 2016
Start Time: 3:45:00 PM
Presentation Type: ORAL