Monte Carlo simulation is the process of running a model many times, drawing a random sample from the input distribution of each variable on every pass. The results of these scenarios yield a "most likely" case, along with a statistical distribution for understanding the risk or uncertainty involved. Computer programs make it easy to run thousands of random samplings quickly. Monte Carlo simulation begins with a model, often built in a spreadsheet, that has input distributions and output functions of those inputs. The following description is drawn largely from Murtha.
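The workflow above can be sketched in a few lines of code. This is a minimal illustration using a simple volumetric model; the triangular input distributions and their parameters are assumptions chosen for the example, not field data.

```python
import numpy as np

# Minimal Monte Carlo sketch: sample every input distribution, evaluate the
# output function for each trial, then summarize the resulting distribution.
rng = np.random.default_rng(seed=42)
n_trials = 10_000

# Triangular distributions (min, most likely, max) -- illustrative assumptions
area = rng.triangular(500, 800, 1200, n_trials)        # drainage area, acres
thickness = rng.triangular(20, 35, 60, n_trials)       # net pay, ft
porosity = rng.triangular(0.10, 0.18, 0.25, n_trials)  # fraction
sw = rng.triangular(0.20, 0.30, 0.45, n_trials)        # water saturation
bo = rng.triangular(1.10, 1.20, 1.35, n_trials)        # oil FVF, RB/STB

# Output function: volumetric OOIP in STB (7,758 RB per acre-ft)
ooip = 7758.0 * area * thickness * porosity * (1.0 - sw) / bo

p10, p50, p90 = np.percentile(ooip, [10, 50, 90])
print(f"P10 {p10/1e6:.1f}  P50 {p50/1e6:.1f}  P90 {p90/1e6:.1f}  MMSTB")
```

The percentile spread of the output, rather than any single trial, is what conveys the uncertainty in the estimate.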
An understanding of statistical concepts is important to many aspects of petroleum engineering, especially reservoir modeling and simulation. The discussion below covers a range of statistical concepts that engineers may find valuable to understand. The focus here is classical statistics, but differences that arise in geostatistical applications are noted. A quantitative approach requires more than a headlong rush into the data, armed with a computer. Because conclusions from a quantitative study rest at least in part on inferences drawn from measurements, the geoscientist and reservoir engineer must be aware of the nature of the measurement systems with which the data are collected. Measurements are commonly classified into four scales: nominal, ordinal, interval, and ratio. Each of these scales is more rigorously defined than the one before it. The nominal and ordinal scales classify observations into mutually exclusive categories.
The accurate calculation of porosity at the wellbore is essential for an accurate calculation of original oil in place (OOIP) or original gas in place (OGIP) throughout the reservoir. The porosity and its distribution also need to be calculated as accurately as possible because they are almost always directly used in the water saturation (Sw) and permeability calculations and, possibly, in the net pay calculations. In most OOIP and OGIP studies, only the gross-rock-volume uncertainties have a greater influence on the result than porosity does. Occasionally, where porosity estimates are difficult, porosity is the leading uncertainty. Fractured and clay-mineral-rich reservoirs remain a challenge. For this discussion, it is assumed that the core data have been properly adjusted to reservoir conditions, that the data from various logs have been reviewed and validated as needed, and that all of the required depth-alignment work has been completed.
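To illustrate why porosity accuracy matters beyond OOIP itself, the sketch below propagates a porosity error through the Archie water saturation equation, one of the standard Sw calculations the text refers to. The Archie constants, water resistivity, and deep resistivity values are assumed purely for illustration.

```python
# Illustrative sketch: sensitivity of Archie water saturation to porosity.
# All parameter values below are assumptions for the example, not field data.
a, m, n = 1.0, 2.0, 2.0   # Archie constants (assumed)
rw = 0.03                 # formation water resistivity, ohm-m (assumed)
rt = 20.0                 # deep resistivity reading, ohm-m (assumed)

def archie_sw(phi):
    """Archie water saturation: Sw = ((a*Rw) / (phi**m * Rt))**(1/n)."""
    return ((a * rw) / (phi**m * rt)) ** (1.0 / n)

phi_true = 0.20
for err in (-0.02, 0.0, +0.02):   # a +/- 2 porosity-unit error
    phi = phi_true + err
    print(f"phi = {phi:.2f} -> Sw = {archie_sw(phi):.3f}")
```

Because porosity enters with exponent m, an understated porosity inflates the computed water saturation, compounding the error already made in the pore-volume term of OOIP.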
This page discusses single-phase permeability models that are based specifically on the pore dimensions of the reservoir. Pore dimensions are a critical factor in determining key reservoir characteristics, including porosity, permeability, and capillary pressure. The size of the interconnected pores plays a major role in determining permeability. Most methods of estimating permeability are indirect. A viable direct method requires both an adequate theoretical basis relating pore-throat dimension to permeability and experimental determination of the critical pore-dimension parameters.
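One widely used empirical link between pore-throat size and single-phase permeability is the Winland r35 correlation, sketched below. The coefficients are the commonly quoted Winland values; treat the functions as an illustration of the pore-throat approach rather than a universal model, since such correlations are calibrated to specific rock populations.

```python
import math

# Winland r35 correlation (commonly quoted form):
#   log10(r35) = 0.732 + 0.588*log10(k_air) - 0.864*log10(phi)
# with r35 in micrometers, k_air in millidarcies, and phi in percent.
def winland_r35(k_md, phi_pct):
    """Pore-throat radius at 35% mercury saturation, micrometers."""
    return 10 ** (0.732 + 0.588 * math.log10(k_md) - 0.864 * math.log10(phi_pct))

def winland_k(r35_um, phi_pct):
    """Invert the correlation to estimate permeability from r35 and porosity."""
    log_k = (math.log10(r35_um) - 0.732 + 0.864 * math.log10(phi_pct)) / 0.588
    return 10 ** log_k

r35 = winland_r35(100.0, 20.0)   # e.g., a 100-md rock at 20% porosity
print(f"r35 = {r35:.2f} um; k back-calculated = {winland_k(r35, 20.0):.1f} md")
```

The inversion shows the practical appeal of the method: if the critical pore-throat radius can be measured (for example from mercury injection), permeability follows directly.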
A Monte Carlo model is, in principle, just a worksheet in which some cells contain probability distributions rather than values. Thus, one can build a Monte Carlo model by converting a deterministic worksheet with the help of commercial add-in software. Practitioners, however, soon find that some of their deterministic models were constructed in a way that makes this transition difficult. Redundancy, hidden formulas, and contorted logic are common features of deterministic models that encumber the resulting Monte Carlo model. The presentation of results from a probabilistic analysis, by contrast, follows the same general outline as any other engineering presentation: problem statement, summary and conclusions, key results, method, and details.
Figure 1.2 – Sketch of the impact of primary depositional features (such as quartz content and sorting, in italics) and diagenetic processes (such as compaction and cementation) on permeability/porosity trends in sandstones. The sedimentological (phi) scale is logarithmic in powers of two. Grain diameter (d) (crystal size in the case of dolostones) is given in micrometers. Limestone and dolostone classes are from Lucia; chalk data are from Mortensen et al. See the Discussion for an explanation of the vertical bars showing grain and pore-throat sizes in a consolidated sandstone. Symbols linked with solid lines denote size ranges; dotted lines distinguish sorting classes.
A high rate of penetration (ROP) is one of the most sought-after targets when drilling a well. While physics-based models establish the importance of individual drilling parameters, they fail to capture the degree of influence of the interplay among the different dynamic drilling features. Parameters such as weight on bit (WOB), RPM, flowrate, mechanical specific energy (MSE), bit run distance, and gamma ray were examined for each rock formation in the Volve field in the North Sea. Ensuring an adequate ROP while controlling the toolface orientation is quite challenging; nevertheless, it helps follow the planned well trajectory and eliminates the excessive doglegs that lead to wellbore deviations. Several machine learning algorithms were initially implemented to optimize ROP and create a less tortuous borehole. The collected data were cleaned and preprocessed, then used to structure and train Random Forest, Support Vector Regression, Ridge Regression, LASSO, Gradient Boosting, and XGBoost models, among others, with appropriate hyperparameters selected for each. The successful model was chosen based on maximized ROP, minimized deviation from the planned trajectory, and lower cost per foot (CPF). A mean absolute error percentage (MAEP) of 15% was achieved using gradient boosting, followed by AdaBoost. The algorithms have demonstrated competence on the historical dataset and will be further tested on blind data to serve as a real-time drilling-optimization system, enabling a fully automated workflow.
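The clean-preprocess-train workflow described in the abstract can be sketched as follows. The data here are synthetic stand-ins generated for illustration (not the Volve dataset), and the least-squares fit is only a baseline, not the authors' tuned ensemble models.

```python
import numpy as np

# Synthetic stand-in drilling data (assumed ranges, not Volve field data)
rng = np.random.default_rng(0)
n = 500
wob = rng.uniform(5, 25, n)       # weight on bit, klbf
rpm = rng.uniform(60, 180, n)     # rotary speed, rev/min
flow = rng.uniform(400, 900, n)   # flowrate, gpm
rop = 2.0 * wob + 0.3 * rpm + 0.05 * flow + rng.normal(0, 5, n)  # synthetic ROP

# Preprocessing: drop gross outliers, then standardize each feature
mask = np.abs(rop - rop.mean()) < 3 * rop.std()
X = np.column_stack([wob, rpm, flow])[mask]
y = rop[mask]
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Baseline linear model fit by least squares (intercept via a column of ones)
A = np.column_stack([np.ones(len(y)), X_std])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
mae = np.mean(np.abs(pred - y))
print(f"baseline MAE: {mae:.2f} ft/hr")
```

In practice the ensemble methods named in the study would replace the linear baseline, but the cleaning, standardization, and error-metric steps are the same.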
Hadi, Farqad (Petroleum Engineering Department, Baghdad University) | Albehadili, Ali (Iraqi Drilling Company) | Jassim, Abduihussein (Najaf Oil Fields) | Almahdawi, Faleh (Petroleum Engineering Department, Baghdad University)
Formulating a prediction tool that can estimate formation permeability in uncored wells is of particular importance for many applications in reservoir simulation and production management. Although formation permeability can be obtained from laboratory core analysis or from well tests, such data are limited because of cost and time constraints. A major shortcoming of previous methods is that they require other parameters, such as porosity and water saturation, to be computed beforehand. In addition, they are affected by the uncertainty introduced by the cementation factor and the saturation exponent. This study presents two prediction methods, multiple regression analysis (MRA) and artificial neural networks (ANNs), to estimate formation permeability using conventional well log data.
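As an illustration of the ANN approach, the sketch below trains a minimal one-hidden-layer network on synthetic stand-in "log" data. The data-generating rule, the choice of inputs, and the network size are all assumptions made for the example; the study's actual inputs and architecture may differ.

```python
import numpy as np

# Synthetic stand-in well-log data (assumed ranges and relationship)
rng = np.random.default_rng(1)
n = 400
rhob = rng.uniform(2.2, 2.7, n)   # bulk density, g/cc
dt = rng.uniform(55, 100, n)      # sonic travel time, us/ft
rt = rng.uniform(1, 100, n)       # deep resistivity, ohm-m
logk = 8 - 4 * rhob + 0.03 * dt + 0.2 * np.log10(rt) + rng.normal(0, 0.1, n)

X = np.column_stack([rhob, dt, np.log10(rt)])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize inputs
y = (logk - logk.mean()) / logk.std()      # standardize target (log-permeability)

# Network: 3 inputs -> 8 tanh hidden units -> 1 linear output
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)               # forward pass
    out = (h @ W2 + b2).ravel()
    err = out - y
    g_out = (2.0 / n) * err[:, None]       # gradient of mean squared error
    gW2 = h.T @ g_out; gb2 = g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1 - h**2)      # backpropagate through tanh
    gW1 = X.T @ g_h; gb1 = g_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean(err**2))
print(f"training MSE (standardized target): {mse:.3f}")
```

The MRA alternative amounts to fitting the same standardized inputs with a single linear layer, which is why the two methods can be compared directly through their correlation coefficients.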
The prediction methods were demonstrated by means of a field case in SE Iraq. The study uses core and well log data from the Mishrif reservoir, which is composed mainly of carbonate (limestone) formations. Two traditional methods for permeability determination were also reviewed and presented: the classical method and the flow zone indicator (FZI) method.
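The FZI method mentioned above rests on the standard reservoir quality index relations; a minimal sketch on assumed core values is shown below (the porosity and permeability numbers are illustrative, not Mishrif data).

```python
import numpy as np

# FZI workflow on assumed core data:
#   RQI  = 0.0314 * sqrt(k/phi)      (reservoir quality index, um)
#   phiz = phi / (1 - phi)           (normalized porosity)
#   FZI  = RQI / phiz
# which inverts to k ~ 1014 * FZI^2 * phi^3 / (1 - phi)^2  (k in md, phi a fraction)
phi = np.array([0.12, 0.18, 0.24])   # core porosity, fraction (assumed)
k = np.array([5.0, 60.0, 400.0])     # core permeability, md (assumed)

rqi = 0.0314 * np.sqrt(k / phi)
phiz = phi / (1 - phi)
fzi = rqi / phiz

k_back = 1014.0 * fzi**2 * phi**3 / (1 - phi) ** 2   # round-trip check
print("FZI:", np.round(fzi, 2), "k round-trip:", np.round(k_back, 1))
```

Samples sharing a common FZI are grouped into a hydraulic flow unit, within which the inverted relation above predicts permeability from porosity alone; this is why the method needs core data up front, as the abstract notes.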
At the same porosity, the results showed a wide range of predicted formation permeability. This result calls into question the assumption that the relationship between permeability and porosity is generally unique in carbonate environments. The deep lateral log resistivity appears to be more influential in the permeability function than the other parameters, followed in decreasing order by bulk density, sonic travel time, micro and shallow resistivities, and shale volume. Although the MRA- and ANN-based models give closely similar permeability determinations, the correlation coefficient of the ANNs was higher than that of the MRA, indicating that the ANNs are more precise. A comparison of the previous methods shows the superiority of the FZI method over the classical method; however, core porosity and permeability must be determined beforehand to apply the FZI method. This study presents efficient and cost-effective models for predicting permeability in uncored wells by incorporating conventional well logs.
Temizel, Cenk (Aera Energy) | Balaji, Karthik (University of North Dakota) | Canbaz, Celal Hakan (Ege University) | Palabiyik, Yildiray (Istanbul Technical University) | Moreno, Raul (Smart Recovery) | Rabiei, Minou (University of North Dakota) | Zhou, Zifu (University of North Dakota) | Ranjith, Rahul (Far Technologies)
Due to the complex characteristics of shale reservoirs, data-driven techniques offer fast and practical solutions for optimization and better management of shale assets. Developments in data-driven techniques enable robust analysis not only of the primary depletion mechanisms, but also of enhanced oil recovery in unconventionals, such as natural gas injection. This study provides a comprehensive background on the application of data-driven methods in the oil and gas industry, including the process, methodology, and learnings, along with examples of data-driven analysis of natural gas injection in shale oil reservoirs using publicly available data.
Data are obtained and organized. Patterns in production data are analyzed using data-driven methods to identify the key parameters in the recovery process as well as the optimum operational strategies to improve recovery. The complete process is illustrated step by step for clarity and to serve as a practical guide for readers. This study also discusses what alternative physics-based evaluation methods can offer given current data availability and the current understanding of the physics of recovery in shale oil assets, and compares the outcomes of those methods with the data-driven methods. A thorough comparison of physics-based and data-driven methods, including their advantages, drawbacks, and challenges, is thereby provided.
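As a small illustration of the physics-based side of such a comparison, the sketch below fits an Arps exponential decline model to synthetic monthly production. The rates, decline constant, and noise level are assumed for the example, not field data.

```python
import numpy as np

# Physics-based baseline: Arps exponential decline, q(t) = qi * exp(-D * t)
t = np.arange(36)                     # months on production
qi_true, d_true = 1200.0, 0.06        # assumed initial rate (STB/d) and decline
rng = np.random.default_rng(7)
q = qi_true * np.exp(-d_true * t) * rng.lognormal(0, 0.05, t.size)  # noisy rates

# Fit by linear regression on log(q):  ln q = ln qi - D * t
A = np.column_stack([np.ones(t.size), t])
coef, *_ = np.linalg.lstsq(A, np.log(q), rcond=None)
qi_fit, d_fit = float(np.exp(coef[0])), float(-coef[1])
print(f"fitted qi = {qi_fit:.0f} STB/d, D = {d_fit:.3f} 1/month")
```

A data-driven model trained on many wells would instead learn decline behavior from the production histories themselves; comparing its forecasts against a simple physics-based fit like this one is the kind of evaluation the study describes.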
It has been observed that data organization and filtering take significant time before the actual data-driven method is applied; nevertheless, data-driven methods serve as a practical solution in fields that are mature enough to yield data for analysis, as long as the methodology is carefully applied. The advantages, challenges, and associated risks of using data-driven methods are also included. The comparison between physics-based and data-driven methods illustrates the advantages and disadvantages of each while showing the differences in evaluation and outcome, along with a guideline for when to use which kind of strategy and evaluation in an asset.
A comprehensive understanding of the interactions between key components of the formation, and of the way the various elements of an EOR process affect those interactions, is of paramount importance. The few existing studies on natural gas injection in shale oil that use data-driven methods take a comparative approach that includes physics-based methods, but they do not address the interrelationship between physics-based and data-driven methods as both complements and competitors amid the rise of unconventionals. This study closes that gap and serves as an up-to-date reference for industry professionals.