Temizel, Cenk (Aera Energy) | Balaji, Karthik (University of North Dakota) | Canbaz, Celal Hakan (Ege University) | Palabiyik, Yildiray (Istanbul Technical University) | Moreno, Raul (Smart Recovery) | Rabiei, Minou (University of North Dakota) | Zhou, Zifu (University of North Dakota) | Ranjith, Rahul (Far Technologies)
Due to the complex characteristics of shale reservoirs, data-driven techniques offer fast and practical solutions for the optimization and better management of shale assets. Developments in data-driven techniques enable robust analysis not only of the primary depletion mechanisms but also of enhanced oil recovery in unconventionals, such as natural gas injection. This study provides a comprehensive background on the application of data-driven methods in the oil and gas industry, covering the process, methodology, and learnings, along with examples of data-driven analysis of natural gas injection in shale oil reservoirs using publicly available data.
Data are obtained and organized. Patterns in production data are analyzed using data-driven methods to understand the key parameters in the recovery process as well as the optimum operational strategies to improve recovery. The complete process is illustrated step by step for clarity and to serve as a practical guide for readers. This study also describes what alternative physics-based evaluation methods can offer given the current data availability and understanding of recovery physics in shale oil assets, and compares their outcomes with those of the data-driven methods. Thereby, a thorough comparison of physics-based and data-driven methods, and of their advantages, drawbacks, and challenges, is provided.
It has been observed that data organization and filtering take significant time before the actual data-driven method is applied; nevertheless, data-driven methods serve as a practical solution in fields that are mature enough to yield data for analysis, as long as the methodology is carefully applied. The advantages, challenges, and associated risks of using data-driven methods are also included. The comparison between physics-based and data-driven methods illustrates the advantages and disadvantages of each, shows the differences in evaluation and outcome, and provides a guideline for when to use each kind of strategy and evaluation in an asset.
A comprehensive understanding of the interactions between the key components of the formation, and of the way various elements of an EOR process affect these interactions, is of paramount importance. The few existing data-driven studies of natural gas injection in shale oil include comparisons with physics-based methods, but they do not examine the interrelationship between physics-based and data-driven methods as both complements and competitors during the rise of unconventionals. This study closes that gap and serves as an up-to-date reference for industry professionals.
Saluja, Vikas (Oil & Natural Gas Corporation LTD.) | Singh, Uday (Oil & Natural Gas Corporation LTD.) | Ghosh, Aninda (Oil & Natural Gas Corporation LTD.) | Prakash, Puja (Oil & Natural Gas Corporation LTD.) | Kumar, Ravendra (Oil & Natural Gas Corporation LTD.) | Verma, Rajeev (Oil & Natural Gas Corporation LTD.)
The case study demonstrated here is an innovative workflow for fault delineation on a 3D seismic volume in the B-173A Field of the Heera Panna Bassein (HPB) Sector, Western Offshore Basin, India. B-173A is located 50 km west of Mumbai at an average water depth of about 50 m. The field was discovered in 1992 and put on production in August 1998. It contains two hydrocarbon-bearing zones: the gas-bearing Mukta Formation (Lower Oligocene carbonates) and the oil-bearing Bassein Formation (Middle to Upper Eocene carbonates).
The present study extends a workflow for advanced seismic interpretation that uses spectral decomposition and RGB blending for fault delineation. Iso-frequency volumes are extracted from relative acoustic impedance data instead of from the seismic data itself.
The workflow targets effective fault delineation and consists of spectral decomposition of the relative acoustic impedance data followed by RGB blending of discontinuity attributes computed on different iso-frequency volumes.
It is observed that the RGB blend volume of discontinuity attributes provides more convincing fault delineation than traditional discontinuity attributes alone.
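As a toy illustration of the spectral-decomposition-plus-RGB-blending idea described above (not the authors' implementation), the sketch below decomposes a single synthetic trace into iso-frequency amplitude slices with a sliding-window FFT and stacks three of them into R, G, and B channels. All signal parameters and frequency choices are invented for the example.

```python
import numpy as np

def iso_frequency_amplitudes(trace, fs, freqs, win=64, hop=16):
    # Sliding-window FFT ("spectral decomposition"): amplitude of each
    # target frequency as a function of time along the trace.
    window = np.hanning(win)
    starts = np.arange(0, len(trace) - win + 1, hop)
    fft_freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    bins = [int(np.argmin(np.abs(fft_freqs - f))) for f in freqs]
    amps = np.empty((len(freqs), len(starts)))
    for j, s in enumerate(starts):
        spectrum = np.abs(np.fft.rfft(trace[s:s + win] * window))
        for i, b in enumerate(bins):
            amps[i, j] = spectrum[b]
    return amps

def rgb_blend(amps):
    # Normalize each iso-frequency slice to [0, 1] and stack as R, G, B.
    lo = amps.min(axis=1, keepdims=True)
    hi = amps.max(axis=1, keepdims=True)
    scaled = (amps - lo) / np.where(hi - lo > 0, hi - lo, 1.0)
    return scaled.T  # shape: (time windows, 3)

# Synthetic trace: a 10 Hz signal early, a 40 Hz signal late.
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 10 * t) * (t < 2) + np.sin(2 * np.pi * 40 * t) * (t >= 2)
rgb = rgb_blend(iso_frequency_amplitudes(trace, fs, freqs=[10, 25, 40]))
```

In a real volume the same decomposition would run trace by trace, with discontinuity attributes computed on each iso-frequency volume before blending; the RGB stack then highlights where the frequency content, and hence the discontinuity response, changes laterally.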
This paper presents a data-driven approach to answer the question of whether premium, high-strength white sand proppant, while more expensive than regional (brown) sand, is justified by its alleged ability to make better-producing wells. For this study, 739 horizontal wells with production and stimulation data were used in a robust statistical approach to conclude that, for the most common set of well characteristics, white sand produces a superior NPV-weighted economic outcome compared with lower-cost regional (brown) sand alternatives. While there are wells in this analysis that did not support the robust conclusion that "white sand is better", none of them produced an outcome of "brown sand is better"; rather, several of the wells simply had results that were statistically inconclusive. This paper serves as a good example of what data are needed to perform such an analysis and of the challenge of normalizing the 'first-order effects' that dominate well productivity (TVD, lateral length, and proppant intensity) while attempting to ascertain the influence of 'second-order' factors such as sand type. Becoming familiar and adept with these analysis methods should facilitate the statistical verification of other second-order effects when searching for the optimal stimulation treatment.
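The normalize-then-test pattern described above can be sketched in a few lines: regress production on the first-order drivers, then test whether the residuals differ by sand type. The data below are entirely synthetic (including the planted sand-type effect and all coefficients), used only to show the mechanics, not to reproduce the paper's result.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 739  # same well count as the study, purely for illustration
# Hypothetical first-order drivers:
tvd = rng.normal(9000, 500, n)      # true vertical depth, ft
lat = rng.normal(8000, 1500, n)     # lateral length, ft
ppi = rng.normal(2000, 400, n)      # proppant intensity, lb/ft
white = rng.integers(0, 2, n)       # 1 = white sand, 0 = brown sand
# Synthetic 12-month cumulative production with a planted sand-type effect:
prod = (0.01 * lat + 0.05 * ppi - 0.002 * tvd
        + 10.0 * white + rng.normal(0, 25, n))

# Step 1: regress out the first-order effects with ordinary least squares.
X = np.column_stack([np.ones(n), tvd, lat, ppi])
beta, *_ = np.linalg.lstsq(X, prod, rcond=None)
resid = prod - X @ beta

# Step 2: Welch t-statistic on the residuals, grouped by sand type.
a, b = resid[white == 1], resid[white == 0]
t_stat = (a.mean() - b.mean()) / np.sqrt(
    a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
```

A t-statistic well above ~2 would indicate a statistically significant second-order sand-type effect after the dominant drivers are removed; an inconclusive well group, as in the paper, would simply fail to clear that bar.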
This paper discusses the various approaches that have been taken to reduce downtime in coal bed methane (CBM) wells by predicting string integrity failure in advance. Approaches such as the principal component analysis (PCA) based T-statistics approach and the bag-of-features approach have been applied to find a solution. These approaches fall under the classical classification framework of supervised learning. Their power and usefulness are fuelled, and limited, by the number and richness of annotations and sensors at the well site. Applied to the CBM wells, these approaches gave very encouraging results, demonstrating their value in enhancing well efficiency and decreasing well downtime through planning based on the model's failure predictions.
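The PCA-based T-statistics approach mentioned above is commonly implemented as Hotelling's T² monitoring: fit PCA on healthy sensor data, then flag readings whose T² exceeds a control limit. The sketch below uses invented synthetic "sensor" data and an invented 99th-percentile limit; it illustrates the technique, not the paper's exact model.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Hypothetical healthy readings from four correlated downhole channels,
# driven by two latent factors (so two principal components suffice).
latent = rng.normal(size=(500, 2))
normal = latent @ rng.normal(size=(2, 4)) + rng.normal(scale=0.1, size=(500, 4))

pca = PCA(n_components=2).fit(normal)

def t_squared(x):
    # Hotelling's T^2: squared PCA scores scaled by component variances.
    scores = pca.transform(np.atleast_2d(x))
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

# Control limit taken empirically from the training data.
limit = np.quantile(t_squared(normal), 0.99)

# A reading far along the first principal direction should trip the alarm.
anomaly = pca.mean_ + 20 * np.sqrt(pca.explained_variance_[0]) * pca.components_[0]
```

In practice the limit would come from an F-distribution-based formula and the flagged excursions would feed the failure-prediction planning described in the abstract.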
This paper evaluates the impact of decision making and the uncertainty associated with production forecasts for 2,000+ wells completed in the Permian Basin. Existing studies show that unconventional reservoirs have complex characteristics that make traditional methods for estimating ultimate recovery insufficient. These limitations increase the uncertainty in estimating reservoir properties, quantifying reserves, and evaluating economic viability. Thus, it is necessary to determine and recommend favorable conditions under which these reservoirs are developed.
In this study, cumulative production is predicted using four different decline curve analysis (DCA) models: power-law exponential, stretched exponential, extended exponential, and Duong. A comparison between the cumulative production predicted by each model from a subset of historical data (0-3 months) and the actual production observed over the same period determines the accuracy of each DCA model; repeating the evaluation for subsequent intervals (0-6 months, 0-9 months, and so on) provides a basis for monitoring the performance of each model over time. Moreover, the best predictive model, formed as a combination of the DCA predictions, is determined via multivariate regression. Afterwards, the uncertainty due to prediction errors, excluding any bias, is estimated, and the expected disappointment (ED) is calculated by applying a probability density function to the results.
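Three of the decline models named above have standard closed-form rate equations (power-law exponential, stretched exponential, Duong); a minimal numpy sketch with hypothetical, unfitted parameter values might look like the following. The parameter choices and units are invented for illustration only.

```python
import numpy as np

def q_ple(t, qi, d_inf, d1, n):
    # Power-law exponential decline: q(t) = qi * exp(-d_inf*t - d1*t^n)
    return qi * np.exp(-d_inf * t - d1 * t**n)

def q_sepd(t, qi, tau, n):
    # Stretched exponential decline: q(t) = qi * exp(-(t/tau)^n)
    return qi * np.exp(-(t / tau)**n)

def q_duong(t, q1, a, m):
    # Duong decline: q(t) = q1 * t^-m * exp((a/(1-m)) * (t^(1-m) - 1))
    return q1 * t**(-m) * np.exp((a / (1 - m)) * (t**(1 - m) - 1))

def cumulative(q_func, t_end, *params, steps=10_000):
    # Trapezoidal integration of the rate from (near) zero to t_end.
    t = np.linspace(1e-6, t_end, steps)
    q = q_func(t, *params)
    return float(np.sum(0.5 * (q[1:] + q[:-1]) * np.diff(t)))

# Hypothetical parameters (illustrative only, not fitted to real wells);
# time in days, rate in STB/d.
cum_3mo = {
    "PLE":   cumulative(q_ple, 90.0, 800.0, 1e-4, 0.35, 0.5),
    "SEPD":  cumulative(q_sepd, 90.0, 800.0, 30.0, 0.5),
    "Duong": cumulative(q_duong, 90.0, 800.0, 1.2, 1.1),
}
```

The study's workflow would fit each model's parameters to the 0-3 month subset, compute the cumulative forecast as above, and compare it to the actual 0-3 month production before rolling the window forward.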
Uncertainty is estimated from the plot of ED versus time for all wells considered. ED drops for wells with longer production histories, as more data are available for estimation. The surprise or disappointment an operator experiences when using the various DCA methods is also estimated for each scenario. It appears that whilst the Duong (DNG) method always overpredicts and the power-law exponential (PLE) decline mostly underpredicts, the stretched exponential lies between the DNG and PLE estimates, and the extended exponential model behaves erratically, crossing the actual trend multiple times. In conclusion, profitability zones for producing oil in the Permian Basin are defined implicitly based on drilling and completion practices, which paves the way to determining the "sweet spot" via optimization of fracture spacing and horizontal length in the wells.
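One simple reading of "expected disappointment" (a simplification of the paper's PDF-based calculation, assumed here for illustration) is the average amount by which a forecast overshoots the actual outcome. The sketch below uses entirely synthetic forecasts with the biases described above, a Duong-like overprediction and a PLE-like underprediction.

```python
import numpy as np

def expected_disappointment(predicted, actual):
    # Average overshoot of the forecast, counted only where it overpredicts.
    return np.maximum(predicted - actual, 0.0).mean()

rng = np.random.default_rng(2)
actual = rng.lognormal(mean=11.0, sigma=0.4, size=200)   # hypothetical EURs, STB
forecast_dng = actual * rng.normal(1.25, 0.10, 200)      # tends to overpredict
forecast_ple = actual * rng.normal(0.90, 0.10, 200)      # tends to underpredict

ed_dng = expected_disappointment(forecast_dng, actual)
ed_ple = expected_disappointment(forecast_ple, actual)
```

Under this toy setup the consistently overpredicting method carries the larger expected disappointment, matching the qualitative DNG-versus-PLE behavior reported above.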
The outcome of the paper helps improve the industry's take on uncertainty analysis in production forecasting, especially the concept of expected disappointment/surprise.
The effects of frac-hits among stimulated horizontal wells located in the northwest of the State of New Mexico are identified by addressing two questions: whether a planned well will cause a frac-hit in nearby older wells and, if a frac-hit occurs, what its degree of impact will be. Machine learning is used to find the relationship between well parameters, such as distance and age difference, and both frac-hit occurrence and degree of impact. Determining the probability of frac-hit occurrence is treated as a classification problem, and the random forest method is used to predict occurrence. Predicting the impact of a frac-hit is treated as a regression problem, and two machine learning methods, gradient boosting and adaptive boosting (AdaBoost), are used to solve it. The data are randomly split into training and test sets for unbiased machine learning.
The training data are fed into the random forest classifier to determine whether distance, age, age difference, and bearing have any impact on frac-hit occurrence. Among these four factors, bearing has the most significant impact, meaning its weight in the classification process is higher than that of the other parameters, followed by distance as the second most important factor. Applying the trained random forest classifier to the test set reproduces the actual frac-hit outcomes 78% of the time.
Using the change in oil production due to a frac-hit as the indicator of the degree of impact in the gradient boosting and AdaBoost algorithms shows that the bearing between wells is not an influential parameter in the regression problem, in contrast to the classification problem. In other words, once a well has experienced a frac-hit, the importance of bearing decreases, and distance, age difference, and well age become the more prominent factors. The analysis shows that the average error between the actual data and the results predicted by gradient boosting and AdaBoost is about 40%.
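The two-stage setup described above, a random forest classifier for occurrence followed by boosted regressors for impact, can be sketched with scikit-learn on synthetic well-pair data. Everything below (feature ranges, the planted dependence on bearing and distance, the impact formula) is invented for illustration and is not the paper's dataset.

```python
import numpy as np
from sklearn.ensemble import (AdaBoostRegressor, GradientBoostingRegressor,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 600
# Hypothetical well-pair features:
distance = rng.uniform(200, 2000, n)   # ft between planned and older well
age = rng.uniform(0, 15, n)            # age of the older well, years
age_diff = rng.uniform(0, 10, n)       # completion-date difference, years
bearing = rng.uniform(0, 360, n)       # degrees
X = np.column_stack([distance, age, age_diff, bearing])

# Synthetic ground truth in which bearing and distance drive occurrence,
# loosely mimicking the feature ranking reported above:
score = 2.5 * np.cos(np.radians(bearing)) - 0.002 * distance
hit = (score + rng.normal(0, 1.0, n) > np.median(score)).astype(int)
# Synthetic degree of impact, driven by distance and age difference:
impact = np.where(hit == 1, (2000 - distance) / 2000 + 0.05 * age_diff, 0.0)

# Stage 1: classify frac-hit occurrence.
X_tr, X_te, y_tr, y_te = train_test_split(X, hit, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)

# Stage 2: regress the degree of impact on the hit wells only.
gbr = GradientBoostingRegressor(random_state=0).fit(X[hit == 1], impact[hit == 1])
ada = AdaBoostRegressor(random_state=0).fit(X[hit == 1], impact[hit == 1])
```

The `feature_importances_` attribute of each fitted model is what supports the kind of ranking discussed above (bearing dominant for classification, distance and age difference dominant for regression).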
The results of this paper can be used by the hydraulic fracturing operators to pre-determine the frac-hit probability and its impact on existing offset wells. It can also help to refine well design strategies to minimize the risk of potential well interferences.
Liu, Guoxiang (Baker Hughes a GE Company) | Stephenson, Hayley (Baker Hughes a GE Company) | Shahkarami, Alireza (Baker Hughes a GE Company) | Murrell, Glen (Baker Hughes a GE Company) | Klenner, Robert (Energy & Environmental Research Center, University of North Dakota) | Iyer, Naresh (GE Global Research) | Barr, Brian (GE Global Research) | Virani, Nurali (GE Global Research)
Optimization problems, such as optimal well-spacing or completion design, can be resolved rapidly via surrogate proxy models, and these models can be built using either data-based or physics-based methods. Each approach has its strengths and weaknesses with respect to management of uncertainty, data quality, or validation. This paper explores how data- and physics-based proxy models can be used together to create a workflow that combines the strengths of each approach and delivers an improved representation of the overall system. This paper presents use cases that display reduced simulation computational costs and/or reduced uncertainty in the outcomes of the models. A Bayesian calibration technique is used to improve predictability by combining numerical simulations with data regressions. Discrepancies between observations and surrogate outcomes are then used to calibrate the model, improve prediction quality, and further reduce uncertainty. Furthermore, Gaussian process regression is used to locate global minima/maxima with a minimal number of samples. To demonstrate the methodology, a reservoir model involving two wells in a drill space unit (DSU) in the Bakken Formation was constructed using publicly available data. This reservoir model was tuned by history matching the production data for the two wells. A data-based regression model was constructed based on machine learning technologies using the same dataset. Both models were coupled to build a hybrid model and test the proposed process of data and physics coupling for completion optimization and uncertainty reduction. Subsequently, a Gaussian process model was used to explore optimization scenarios outside the data region of confidence and to exploit the hybrid model to further reduce uncertainty and improve prediction. Overall, both the computation time to identify optimal completion scenarios and the uncertainty were reduced.
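The role Gaussian process regression plays above, finding an optimum from a handful of expensive simulator runs, can be shown with a toy 1-D stand-in (this is not the paper's Bakken model; the objective, sample count, and kernel settings are all assumptions for illustration):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical 1-D objective standing in for an expensive simulator run
# (e.g., normalized cluster spacing -> a cost to minimize); minimum at x = 0.3.
def simulate(x):
    return (x - 0.3) ** 2

# A handful of "simulator" samples -- the expensive evaluations.
X_obs = np.linspace(0.0, 1.0, 7).reshape(-1, 1)
y_obs = simulate(X_obs).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                              normalize_y=True).fit(X_obs, y_obs)

# Search the cheap posterior mean on a dense grid instead of the simulator;
# the posterior std shows where new simulation runs would reduce uncertainty.
grid = np.linspace(0.0, 1.0, 501).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)
x_best = float(grid[np.argmin(mean), 0])
```

In the hybrid workflow above, the points with the largest posterior standard deviation are exactly the scenarios "outside the data region of confidence" where an extra physics-based simulation run buys the most uncertainty reduction.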
This technique creates a robust framework to improve operational efficiency and drive completion optimization in an optimal timeframe. The hybrid modeling workflow has also been piloted in other applications such as completion design, well placement and optimization, parent-child well interference analysis, and well performance analysis.