Abstract A serious issue facing many oil and gas companies is the uneasiness among traditional engineering talent about learning and adapting to the changes brought about by digital transformation. The transformation has been expected, as human beings are limited in analyzing multidimensional problems and face difficulties in performing analysis at large scale. But many companies face human-factor issues in preparing traditional staff to realize the potential of AI (artificial intelligence)-based decision making. As decision making in the oil and gas industry grows in complexity, acceptance of digital solutions remains low. One reason can be the lack of adequate interpretability. Data scientists and end users should be able to verify that a prediction is based on a correct set of assumptions and conforms to accepted domain expertise. A proper set of questions to the experts can include where the information comes from, why certain information is pertinent, what the relationship between components is, and whether several experts would agree on such an assignment. Among many concerns, one of the main ones is the trustworthiness of applying AI technologies. There are limitations to current continuing-education approaches, and we suggest improvements that can help in such a transformation. It takes an intersection of human judgment and the power of computer technology to make a step change in accepting predictions by machine learning (ML). A deep understanding of the problem, coupled with an awareness of the key data, is always the starting point. The best solution strategy for the adoption of digital technologies in petroleum engineering requires effective participation of the domain experts in algorithmic preprocessing of data. Application of various digital solutions and technologies can then be tested to select the best solution strategies.
For illustration purposes, we examine a few examples where digital technologies have significant potential. Yet in all of them, domain expertise and data preprocessing are essential for quality control.
- North America > United States > Texas > West Gulf Coast Tertiary Basin > Eagle Ford Shale Formation (0.99)
- North America > United States > Texas > Sabinas - Rio Grande Basin > Eagle Ford Shale Formation (0.99)
- North America > United States > Texas > Maverick Basin > Eagle Ford Shale Formation (0.99)
- (2 more...)
Rock Type Classification Models Interpretability Using Shapley Values
Voskresenskiy, Anton Georgievich (Gazpromneft, ITMO University) | Bukhanov, Nikita Vladimirovich (Aramco Innovations) | Kuntsevich, Maria Alexandrovna (Gazpromneft) | Popova, Oksana Anatolievna (Gazpromneft) | Goncharov, Alexey Sergeevich (Gazpromneft)
Abstract We propose a methodology to improve rock type classification using machine learning (ML) techniques and to reveal causal inferences between reservoir quality and well log measurements. Rock type classification is an essential step in accurate reservoir modeling and forecasting. Machine learning approaches make it possible to automate rock type classification based on different well logs and core data. In order to choose the best model, one that does not propagate uncertainty further into the workflow, it is important to interpret machine learning results. Feature importance and feature selection methods are usually employed for that. We propose an extension to existing approaches: a model-agnostic sensitivity algorithm based on Shapley values. The paper describes a full workflow for rock type prediction using well log data, from data preparation and model building through feature selection to causal inference analysis. We built ML models that classify rock types using well logs (sonic, gamma, density, photoelectric, and resistivity) from 21 wells as predictors and conducted a causal inference analysis between reservoir quality and well log responses using Shapley values (a concept from game theory). As a result of feature selection, we obtained predictors that are statistically significant and at the same time relevant in a causal-relation context. The macro F1-scores of the best obtained models for the two cases are 0.79 and 0.85, respectively. It was found that the ML models can infer domain knowledge, which allows us to confirm the adequacy of the built ML model for rock type prediction. Our insight was to recognize the need to properly account for the underlying causal structure between the features and rock types in order to derive meaningful and relevant predictors that carry a significant amount of information contributing to the final outcome.
Also, we demonstrate the robustness of the revealed patterns by applying the Shapley values methodology to a number of ML models and show consistency in the ordering of the most important predictors. Our analysis shows that machine learning classifiers that achieve high accuracy tend to mimic the physical principles behind different logging tools; in particular, the longer the travel time of an acoustic wave, the higher the probability that the medium is reservoir rock, and vice versa. Conversely, lower values of natural radioactivity and rock density indicate the presence of a reservoir. The article presents a causal inference analysis of ML classification models using Shapley values on two real-world reservoirs. The rock class labels from core data are used to train a supervised machine learning algorithm to predict classes from well log responses. The aim of supervised learning is to label a small portion of a dataset and let the algorithm automate the rest. Such data-driven analysis may optimize well logging, coring, and core analysis programs. The algorithm can be extended to any other reservoir to improve rock type prediction. The novelty of the paper is that such analysis reveals the nature of the decisions made by the ML model and allows truly robust, reliable, petrophysics-consistent ML models to be applied for rock type classification.
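The Shapley attribution the abstract relies on can be illustrated with a minimal, dependency-free sketch. The helper `shapley_values`, the toy `reservoir_score` model, and its weights are hypothetical stand-ins, not the authors' classifier; the point is only the game-theoretic averaging over feature coalitions that underlies tools such as SHAP.

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, baseline, instance):
    """Exact Shapley values for one instance.

    predict  -- model function taking a full feature vector
    baseline -- reference values used when a feature is "absent"
    instance -- the feature vector being explained
    """
    n = len(instance)

    def coalition_value(S):
        # features in S take the instance's values; the rest use the baseline
        x = [instance[j] if j in S else baseline[j] for j in range(n)]
        return predict(x)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            # weight of a coalition of this size in the Shapley average
            weight = factorial(size) * factorial(n - size - 1) / factorial(n)
            for subset in combinations(others, size):
                S = set(subset)
                phi[i] += weight * (coalition_value(S | {i}) - coalition_value(S))
    return phi

# Toy stand-in for a rock-type classifier score built from three well logs
# (sonic travel time, gamma ray, density) -- purely illustrative weights.
def reservoir_score(x):
    sonic, gamma, density = x
    return 0.5 * sonic - 0.3 * gamma - 0.2 * density

baseline = [0.0, 0.0, 0.0]
instance = [1.0, 1.0, 1.0]
print(shapley_values(reservoir_score, baseline, instance))  # ≈ [0.5, -0.3, -0.2]
```

For a linear model the attributions collapse to weight times feature deviation, which makes the sketch easy to check by hand; the exact computation is exponential in the number of features, which is why practical libraries approximate it by sampling.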
- Geology > Mineral > Silicate > Phyllosilicate (0.31)
- Geology > Rock Type > Sedimentary Rock > Clastic Rock (0.30)
- Information Technology > Artificial Intelligence > Natural Language > Text Classification (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (0.95)
The effective collaboration between data scientists and domain experts is perhaps the most important factor, and it is discussed here. Based on practical experience, the principal theses put forward here are that (1) data science projects require domain expertise, (2) domain expertise and data science expertise generally cannot be provided by the same individual, (3) effective communication between the various experts is essential, for which everyone requires some limited understanding of the others' expertise and real-world experience, and (4) management must acknowledge these aspects by reserving sufficient project time and budget for communication and change management.
We are witnessing a structural change in the energy landscape. The days of "if-you-build-it, they-will-come" are gone. Exploration and production (E&P) companies have stopped earning economic rent from merely "finding" new oil and gas resources. The market is now rewarding them for efficient monetization of their hydrocarbon reserves. Therefore, E&P companies will need to be hyper-focused on producing the highest-margin barrels.
Exploration and production companies have long seen integrated operations (IO) as a key innovative strategy for maintaining high production levels. Automation, optimization of processes, and overall data management are the keywords for operators, information technology (IT) companies, and service companies looking to guarantee the maximum exploitation of their available resources. Some of these companies have taken steps recently to make IO system architecture and facilities a more central aspect of their operations. In 2018, Statoil will establish a new onshore integrated operations center (IOC) that it says will help increase safety, add value, and reduce emissions from its installations on the Norwegian continental shelf (NCS). The company called the IOC an "important step" in its digital roadmap that will enable increased production efficiency and production potential on the NCS.
- Europe > Norway (0.72)
- Oceania > Australia > Queensland (0.29)
- Oceania > Australia > Queensland > Surat Basin (0.99)
- Oceania > Australia > New South Wales > Surat Basin (0.99)
- Europe > Norway > Norwegian Sea > Halten Terrace > Block 6507/11 > Åsgard Field > Åre Formation (0.99)
- (41 more...)
- Information Technology > Cloud Computing (0.70)
- Information Technology > Communications > Collaboration (0.70)
Abstract Industry reports confirm that while drilling and completion operations for unconventional wells have achieved significant efficiency gains, which translated into steep cost reductions, production operations costs have remained largely flat. The authors provide a quick review of the above and discuss technologies aimed at addressing this situation, including machine learning, AI, and physics-based modeling. The latter has been the cornerstone of production optimization since the early days of oil and gas exploitation; however, it has been uneconomical for systematic use in unconventional wells and is applied to only a small fraction of the large number of active wells, mainly because of their complex completions, which make modeling a tedious and resource-intensive task. Data science promises faster and nimbler solutions; however, while advanced techniques such as deep learning have achieved impressive advances in areas like face and speech recognition, they remain mostly rooted in pattern recognition, with no grasp of cause and effect, which limits the types of problems they can successfully resolve. This paper discusses these solutions and limitations and presents a novel physics-based approach that would be economical for widespread deployment.
Unlocking Completion Design Optimization Using an Augmented AI Approach
Ma, Zheren (Quantum Reservoir Impact LLC) | Davani, Ehsan (Quantum Reservoir Impact LLC) | Ma, Xiaodan (Quantum Reservoir Impact LLC) | Lee, Hanna (Quantum Reservoir Impact LLC) | Arslan, Izzet (Quantum Reservoir Impact LLC) | Zhai, Xiang (Quantum Reservoir Impact LLC) | Darabi, Hamed (Quantum Reservoir Impact LLC) | Castineira, David (Quantum Reservoir Impact LLC)
Abstract An Augmented AI approach has been developed to optimize completion design parameters and unlock the full potential of unconventional assets by leveraging big data sculpting, domain-induced feature engineering, and robust, explainable machine learning models with quantified uncertainty. This method unlocks the full potential of a well through completion design parameter optimization that considers all the factors that impact well performance: geological characteristics, well trajectory, spacing, etc. By leveraging basin-level knowledge captured by big data sculpting together with uncertainty quantification, Augmented AI can provide quick, science-based answers for completion optimization and also assess the full potential of an asset in unconventional reservoirs. By leveraging computer vision and natural language processing techniques, unstructured data from various sources were deciphered, combined, and organized into a structured database. Imputation techniques were used to fill the gaps of missing data. With the Augmented AI approach, the median accuracies of IP and EUR predictions for new drills are around 90%, which often outperforms industry-standard type-curve methods. With the explainable machine learning (ML) model, the direct impact of completion design parameters on well performance is deconvoluted from other parameters, such as engineering and geological attributes. Each prediction also comes with an 80% confidence interval to quantify prediction uncertainty, which allows for better risk management and confident business decision making. With the ML model and given economic inputs and metrics, sensitivity analyses are performed to evaluate optimized completion design parameters. The proposed Augmented AI approach has been deployed to Eagle Ford wells.
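The imputation and 80%-interval steps mentioned in the abstract can be sketched in a few lines of standard-library Python. The functions `impute_median` and `interval_80`, the column meanings, and every number below are illustrative assumptions, not the paper's actual pipeline, which combines far richer imputation and uncertainty quantification.

```python
import statistics

def impute_median(rows):
    """Fill missing entries (None) in each column with that column's median."""
    n_cols = len(rows[0])
    medians = []
    for c in range(n_cols):
        observed = [r[c] for r in rows if r[c] is not None]
        medians.append(statistics.median(observed))
    return [[r[c] if r[c] is not None else medians[c] for c in range(n_cols)]
            for r in rows]

def interval_80(samples):
    """Empirical 80% interval (10th to 90th percentile) of prediction samples."""
    deciles = statistics.quantiles(samples, n=10)  # 9 cut points
    return deciles[0], deciles[-1]

# Hypothetical completion records [proppant_mass, fluid_volume, stage_count];
# None marks a gap; units and values are illustrative only.
wells = [
    [1500.0, 300.0, None],
    [None,   280.0, 40.0],
    [1700.0, None,  55.0],
    [1600.0, 320.0, 50.0],
]
print(impute_median(wells))
print(interval_80(list(range(1, 101))))
```

Median imputation is only a baseline; the attraction of an empirical interval like `interval_80` is that it makes no distributional assumption about the prediction errors.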
- Geology > Geological Subdiscipline (1.00)
- Geology > Petroleum Play Type > Unconventional Play > Shale Play (0.70)
- North America > United States > West Virginia > Appalachian Basin > Marcellus Shale Formation (0.99)
- North America > United States > Virginia > Appalachian Basin > Marcellus Shale Formation (0.99)
- North America > United States > Texas > Permian Basin > Yeso Formation (0.99)
- (36 more...)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
- Information Technology > Data Science > Data Mining > Big Data (0.55)
Abstract The rod pump is by far the most widely used artificial lift method in the oil industry for improving oil production from reservoirs whose energy levels are too low to lift fluids to the surface; it accounts for roughly two-thirds of artificial-lift producing oil wells. Well failures, including surface, reservoir, and down-hole failures, commonly occur in rod pump artificial lift systems, leading to substantial downtime, production loss, and operating expense. Correct identification of well failures and scheduling of appropriate maintenance will reduce inappropriate repairs, minimize downtime, and subsequently improve operating efficiency. This paper presents a systematic pattern recognition approach to detecting well failures. We start with the pattern recognition workflow, explain how we choose the right data and define good features based on domain knowledge, and show how we extract features using signal processing techniques. Finally, we describe how we establish the classifiers via heuristics instead of training. Through this practical application of pattern recognition, we describe some fundamental concepts and steps for building a successful application using pattern recognition technologies. The method was first tested on one year of history data from 100 rod pump production wells in an active oilfield, with the results compared against the state-of-the-art method of Liu et al. (2010), and then on six months of history data from 100 rod pump production wells. The pilot test results showed that our method could correctly identify well failures with a success rate of 82–86% and a false-alarm rate of only 11–15%. Compared to the state-of-the-art method of Liu et al. (2010), our method has a much lower false-alarm rate while achieving a similar failure-detection success rate, confirming that our systematic method is more capable of detecting well failures.
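The training-free, heuristic classification the abstract describes can be sketched as a feature-extraction step plus threshold rules. The features, thresholds, messages, and function names below are invented placeholders that a domain expert would replace with field-specific signal features and tuned limits; they are not the paper's method.

```python
import math

def extract_features(load_signal):
    """Simple time-domain features of a pump load signal (illustrative)."""
    n = len(load_signal)
    mean = sum(load_signal) / n
    var = sum((x - mean) ** 2 for x in load_signal) / n
    amplitude = max(load_signal) - min(load_signal)
    return {"mean": mean, "std": math.sqrt(var), "amplitude": amplitude}

def classify(features, min_amplitude=5.0, max_std=8.0):
    """Heuristic rule-based classifier; thresholds are made-up placeholders."""
    if features["amplitude"] < min_amplitude:
        return "suspected failure: pump barely loading/unloading"
    if features["std"] > max_std:
        return "suspected failure: erratic load"
    return "normal"

# A healthy pump cycles periodically; a stuck pump produces a flat load trace.
normal = [10 + 5 * math.sin(2 * math.pi * t / 50) for t in range(200)]
flat = [10.0] * 200
print(classify(extract_features(normal)))  # "normal"
print(classify(extract_features(flat)))    # flagged as a suspected failure
```

The appeal of this style of classifier, as the abstract notes, is that it needs no labeled training set: the rules encode domain knowledge directly and can be audited by the engineers who wrote them.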
Abstract The burgeoning world of unconventional resource play exploration and exploitation has forced the petroleum E&P industry to confront issues of uncertainty and risk on an unprecedented scale. Additionally, the digital oilfield revolution continues to drive ever-vaster streams of data from every aspect of our operations—but the information contained within this data is not so simple to extract. As industry practice outpaces theoretical understanding, operators are turning to quantitative techniques honed on "big data" in far-flung industries in order to make evidence-based decisions. These techniques fly the various flags of "statistics," "data mining," and "machine learning" but can be usefully united under the banner of analytics: the discovery and communication of meaningful patterns in data. Our first steps into this arena revealed that analytics could be a boon to more than E&P; indeed, the approach could benefit the entire business. Deriving maximum value from analytics would require a vision and a strategy for analytics at the enterprise scale—as well as appropriate action to bring these to fruition. In this paper, we present our experience over the first year of the initiative to build an internal "center of excellence" for analytics at Devon. We describe our initial work defining "analytics" at Devon and crafting a strategy to realize this vision. This includes our design of a process for analytics, definition of the R criteria for analytics projects, and our experiences with our initial "pilot" projects and their results. We share key lessons learned from these projects: the importance of answering the right question, accessing the right data, applying the right tools, and—critically—assembling the right people. Finally, we present our vision for the future of analytics at Devon.
- Information Technology > Software (1.00)
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Data Science > Data Mining > Big Data (0.34)