An understanding of statistical concepts is important to many aspects of petroleum engineering, but especially reservoir modeling and simulation. The discussion below covers a range of statistical concepts that engineers may find valuable to understand. The focus here is classical statistics, but differences in their application to geostatistics are noted. A quantitative approach requires more than a headlong rush into the data, armed with a computer. Because conclusions from a quantitative study are based at least in part on inferences drawn from measurements, the geoscientist and reservoir engineer must be aware of the nature of the measurement systems with which the data are collected. Measurements are made on one of four scales: nominal, ordinal, interval, and ratio. Each of these scales is more rigorously defined than the one before it. The nominal and ordinal scales classify observations into exclusive categories.
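To make the distinction concrete, here is a minimal Python sketch (the rock-type and grain-size labels are hypothetical) contrasting the two scales: nominal categories support only equality tests and counting, while ordinal categories also support meaningful ranking.

```python
# Nominal scale: lithology labels have no inherent order, so only
# equality checks and frequency counts are meaningful.
lithologies = ["sandstone", "shale", "limestone", "sandstone"]
counts = {lith: lithologies.count(lith) for lith in set(lithologies)}

# Ordinal scale: grain-size classes can be ranked, so sorting by rank
# is physically meaningful (sorting lithology labels alphabetically is not).
GRAIN_SIZE_RANK = {"clay": 0, "silt": 1, "fine sand": 2, "coarse sand": 3}
samples = ["silt", "coarse sand", "clay"]
ordered = sorted(samples, key=GRAIN_SIZE_RANK.get)
```

Interval and ratio scales go further, permitting meaningful differences and, for the ratio scale, meaningful quotients (a 20% porosity is twice a 10% porosity; a nominal label has no such arithmetic).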
Reserves may be estimated with deterministic or probabilistic procedures; each is discussed briefly in the next two sections. Thereafter, except for another section on probabilistic procedures near the end, the chapter focuses on deterministic procedures because they are still more widely used. Both procedures require the same basic data and equations. Reserves calculated using deterministic procedures are classified subjectively, on the basis of professional judgment of the uncertainty in each reserve estimate and/or of pertinent regulatory or corporate guidelines. Probabilistic procedures, by contrast, recognize that uncertainties in the input data and in the equations used to calculate reserves may be significant.
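The contrast between the two procedures can be sketched with a volumetric original-oil-in-place calculation. The sketch below is illustrative only: the input ranges are invented, and a uniform distribution is assumed for each uncertain input purely for simplicity.

```python
import random

def ooip_stb(area_acres, thickness_ft, porosity, sw, bo):
    """Volumetric OOIP in stock-tank barrels (7,758 bbl per acre-ft)."""
    return 7758.0 * area_acres * thickness_ft * porosity * (1.0 - sw) / bo

# Deterministic estimate: one "best" value per input, one answer.
best = ooip_stb(640.0, 50.0, 0.20, 0.30, 1.2)

# Probabilistic estimate: sample each uncertain input, report percentiles.
random.seed(42)
trials = sorted(
    ooip_stb(
        random.uniform(500.0, 780.0),   # area, acres
        random.uniform(40.0, 60.0),     # net pay, ft
        random.uniform(0.15, 0.25),     # porosity, fraction
        random.uniform(0.25, 0.35),     # water saturation, fraction
        random.uniform(1.1, 1.3),       # oil FVF, RB/STB
    )
    for _ in range(10_000)
)
p90 = trials[int(0.10 * len(trials))]   # conservative (exceeded 90% of the time)
p50 = trials[len(trials) // 2]
p10 = trials[int(0.90 * len(trials))]   # optimistic
```

The deterministic route yields a single number whose uncertainty must be judged subjectively; the probabilistic route carries the input uncertainty through to a distribution of outcomes.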
A reservoir characterization study is a part of the development of a reservoir model. This article describes each of the basic elements involved in a reservoir characterization study. The result of reservoir characterization is the creation of the shared-earth model. The shared-earth model provides for efficient updating of the critical information necessary for 3D modeling. At the basic interpretation stage, the discipline expert interprets the primary data, whereas the geologist and geophysicist collaborate on the structure model and sequence definition.
Water saturation (Sw) determination is the most challenging of petrophysical calculations; it is used to quantify the more important complementary quantity, the hydrocarbon saturation (1 – Sw). Complexities arise because a number of independent approaches can be used to calculate Sw. The complication is that often, if not typically, these approaches yield somewhat different Sw values, which may translate into considerable differences in the original oil in place (OOIP) or original gas in place (OGIP) volumes. The challenge to the technical team is to understand and resolve the differences among the Sw values obtained with the different procedures, and to arrive at the best calculation of Sw and its distribution throughout the reservoir, both vertically and areally. A 10% pore volume (PV) change in Sw has the same impact as a 2% bulk volume (BV) change in porosity (in a 20% BV porosity reservoir). The approaches are typically considered in the chronological order in which their data become available, not in an order ranked by accuracy. Indeed, the choice of Sw-calculation approach is often controlled by the availability of the various types of data. If no oil-based-mud (OBM) cores have been cut, that technique cannot be used unless funds are spent to acquire such data from one or more newly drilled wells; the incremental cost is not high when OBM use is already planned for other purposes. Resistivity logs are run in all wells, so those data are always available for standard-log-analysis Sw calculations. A key consideration when making calibrated Sw calculations is the availability of special-core-analysis (SCAL) data on core samples from the particular reservoir; that is, the number of laboratory electrical-property and capillary-pressure (Pc/Sw) core-plug measurements that have been made. The technique chosen to calculate Sw is often a hybrid that combines two of these basic data sources.
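The porosity/saturation trade-off quoted above can be verified with one line of arithmetic. The sketch below assumes the hydrocarbon fraction of bulk rock volume is phi*(1 - Sw) and, so that the comparison comes out exact, takes Sw near zero; at higher Sw the two effects differ slightly.

```python
def hc_bulk_volume(porosity, sw):
    """Hydrocarbon volume as a fraction of bulk rock volume."""
    return porosity * (1.0 - sw)

phi, sw = 0.20, 0.0  # 20% BV porosity; Sw ~ 0 for an exact comparison

# Effect of a 10-saturation-unit (0.10 PV) error in Sw:
d_sw = hc_bulk_volume(phi, sw) - hc_bulk_volume(phi, sw + 0.10)

# Effect of a 2-porosity-unit (0.02 BV) error in porosity:
d_phi = hc_bulk_volume(phi, sw) - hc_bulk_volume(phi - 0.02, sw)

# Both perturbations remove 0.02 BV of hydrocarbon: the errors are equivalent.
```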
For example, the OBM-core Sw data can be used in combination with the resistivity logs to expand the data set to include all wells and the whole of the hydrocarbon column. Alternatively, the OBM-core Sw data can be used in combination with the Pc/Sw data. In this way, the OBM-core Sw data define the Sw values for the majority of the reservoir, whereas the Pc/Sw data define the Sw values in the interval just above the fluid contact and perhaps in areas of the field where Pc data are available but OBM-core data are not. This section has discussed the input-data-availability and data-quality issues for each Sw technique.
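As a concrete example of the standard-log-analysis route mentioned above, the classical Archie equation converts a deep-resistivity reading into Sw; in practice the parameters a, m, and n would be calibrated against the SCAL electrical-property measurements. The input values below are illustrative, not from any particular reservoir.

```python
def archie_sw(rt, rw, porosity, a=1.0, m=2.0, n=2.0):
    """Archie water saturation: Sw = ((a*Rw) / (phi**m * Rt)) ** (1/n).

    rt: true formation resistivity from the deep-resistivity log (ohm-m)
    rw: formation-brine resistivity (ohm-m)
    a, m, n: tortuosity factor, cementation exponent, saturation exponent
             (defaults are common clean-sand values; calibrate from SCAL data)
    """
    return ((a * rw) / (porosity ** m * rt)) ** (1.0 / n)

# Illustrative inputs: Rt = 20 ohm-m, Rw = 0.05 ohm-m, porosity = 20%
sw = archie_sw(rt=20.0, rw=0.05, porosity=0.20)
```

Note that Archie's relation assumes a clean (shale-free) formation; shaly-sand models modify it, which is one source of the method-to-method Sw differences discussed above.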
The most comprehensive data-acquisition systems present at the rigsite are provided by service companies such as mud-logging, measurement-while-drilling (MWD)/logging-while-drilling (LWD), and wireline vendors. Real-time data-acquisition systems typically are connected to a suite of surface and downhole sensors that enable live monitoring of the rig-equipment operation and the well-construction process. Service-company systems are typically capable of accepting Wellsite Information Transfer Specification (WITS) inputs from other vendors, so that sensor readings from all data-acquisition systems may be collated into a single real-time data set provided to the operator at the end of the well. The combination of surface and downhole sensors with networked graphical data logs and text outputs enables the operator's supervisory staff, service company, and rig contractor to maintain an accurate picture of the drilling or well-services operation and to track well progress, ensuring that the new-wellbore placement or completion meets the operator's safety, geologic, and production requirements. Rig-contractor personnel may use any number of commercially available electronic tour-sheet applications that enable them to complete their Intl. Assn. of Drilling Contractors (IADC) tour sheets electronically.
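As an illustration of the kind of data exchange WITS enables, here is a minimal sketch of a parser for WITS level-0 framing (records delimited by `&&` and `!!`, each line carrying a four-digit record/item code followed by a value). The codes and values in the sample are invented, and a production parser would also map codes to curve mnemonics and handle malformed lines.

```python
def parse_wits0(record: str) -> dict:
    """Parse one WITS level-0 record into {code: raw value string}.

    Lines consist of a 4-digit code (2-digit record number + 2-digit item
    number) followed immediately by the value; '&&' opens and '!!' closes
    the record.
    """
    values = {}
    for line in record.strip().splitlines():
        line = line.strip()
        if line in ("&&", "!!") or len(line) < 5:
            continue  # skip framing lines and anything too short to hold a code
        values[line[:4]] = line[4:]
    return values

# Invented sample record with two data items.
sample = "&&\n01083650.5\n011323.8\n!!"
parsed = parse_wits0(sample)
```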
Many data are interpreted to evaluate a petroleum-bearing formation, and we discuss the interpretations of data acquired through surface data logging in terms of the rock and fluid properties they help determine. The logging engineer or geologist gets information about the formation fluids directly from fluids that are released into the wellbore while drilling and circulate out, either suspended immiscibly in the drilling fluid or retained in the pores of larger cuttings that were not flushed. Information comes indirectly from remnants of the fluid that remain in the pores of rock cuttings, as stains on grain surfaces, or in solution in the drilling fluid. Oil may be identified as a sheen on the surface of a water-based drilling fluid. If the circulating-fluid density is low enough to create an underbalanced drilling condition, oil may be produced in large enough quantities that a sample can be skimmed off a whole-mud sample.
As drilling automation has transformed from a blue-sky vision to a reality, so has the industry's understanding of the depth and breadth of automation needs and opportunities. The focus is no longer solely on on-bottom drilling performance but now considers the entire lifecycle of well construction. Session topics include:

- Sensors for automation
- Pragmatic implementations of digital twins
- Big data analytics that are proven
- Streaming analytics and edge solutions
- Systems approaches to automation, including cybersecurity
- Discrete systems that deliver non-discrete benefits
- Human-centered design: Industry 5.0 thinking (placing the human back in the loop)
- Oilfield logistics of the future

Individual case studies within each session will provide a foundation for what is available today and what will be available in the near term. This workshop will bring together operators, drilling-data solution experts, and service-company and drilling-contractor personnel to identify and discuss challenges to accelerating adoption of these solutions in the industry. An SPE workshop is a multi-day event that fosters knowledge sharing in an intensive learning experience.
EBN is the Dutch state energy company, a large non-operating partner to more than 10 operators that produce from more than 200 onshore and offshore assets, with more than 850 projects defined on them. Estimating budget production and medium- and long-term forecasts, with their associated operating and capital expenditures, is of vital importance to EBN. Larger companies with many assets, and even more projects at varying degrees of maturity, have great difficulty reliably predicting an aggregated forecast.
Historically, EBN would copy and risk operator data, which led to continuous overestimation of both budget production and long-term forecasts. A straightforward correction method was developed that consists of two parts. First, budget production is set for all producing assets and projects by assessing technical, subsurface, infrastructural, and human factors affecting the performance of the operator's fields and projects. Second, the medium- and long-term forecast is delayed by 1 to 4 years for the respective SPE PRMS resource classes, from "justified for development" to "project unviable," and the associated project forecasts are risked with a chance of development according to their contingent-resource subclasses.
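The two-part correction can be sketched as follows. The chance-of-development factors, delay years, subclass names, and project profiles below are placeholders chosen for illustration, not EBN's calibrated values.

```python
# Placeholder risking factors and delays per PRMS-style maturity subclass
# (from "justified for development" down to "development unviable").
CHANCE_OF_DEVELOPMENT = {"justified": 0.9, "pending": 0.6, "on_hold": 0.3, "unviable": 0.0}
DELAY_YEARS = {"justified": 1, "pending": 2, "on_hold": 3, "unviable": 4}

def corrected_forecast(projects, horizon_years):
    """Aggregate project forecasts after risking and delaying each one.

    projects: list of (yearly_volume_profile, maturity_subclass) tuples.
    Each profile is shifted later by the subclass delay and scaled by its
    chance of development, then all projects are summed per year.
    """
    total = [0.0] * horizon_years
    for profile, subclass in projects:
        cod = CHANCE_OF_DEVELOPMENT[subclass]
        delay = DELAY_YEARS[subclass]
        for year, volume in enumerate(profile):
            if year + delay < horizon_years:
                total[year + delay] += cod * volume
    return total

# Two invented projects: volumes per year from first production.
projects = [([10.0, 8.0, 6.0], "justified"), ([5.0, 5.0], "on_hold")]
forecast = corrected_forecast(projects, horizon_years=6)
```

Simply summing the unrisked, undelayed profiles would put 15.0 units in year 0; the corrected profile starts later and lower, which is exactly the direction of the historical overestimation.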
Data analytics on almost 10 years of reserves reporting according to SPE PRMS standards led to a straightforward solution that reduces short- and medium-term forecasting error. The short-term absolute average error used to be 8%; since the new method was implemented 7 years ago, it has dropped to 4%. The long-term aggregated forecast obtained by simply copying operator data resulted in an overestimation of up to 50% at 5 years ahead. An earlier correction method, which applied risking factors to contingent projects but no time delay, reduced the overestimation to an absolute average error of 23%. This paper presents a new method that uses both risking factors and time delays on the realization of projects, reducing the error in the long-term forecast to an uncertainty band of a few percent.
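The "absolute average error" statistic quoted here can be read as a mean absolute percentage deviation of forecast from actual over the reporting periods; a minimal sketch with invented numbers:

```python
def absolute_average_error(forecast, actual):
    """Mean of |forecast - actual| / actual across reporting periods."""
    errors = [abs(f - a) / a for f, a in zip(forecast, actual)]
    return sum(errors) / len(errors)

# Invented example: three periods over- and under-forecast against actuals.
err = absolute_average_error([104.0, 98.0, 110.0], [100.0, 100.0, 100.0])
# err is about 0.053, i.e., roughly a 5% absolute average error
```

Taking absolute values before averaging matters: over- and under-predictions do not cancel, so the statistic measures forecast quality rather than bias alone.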
Various causes of the overestimation were identified. Budget-production errors were attributed primarily to inaccurate uptime predictions. Long-term forecast errors are driven by overestimation of the number of projects actually executed, while the timing and performance of new projects affect both the short- and medium-term forecasts.
The solution presented is the first methodology at EBN able to predict aggregated forecasts of hundreds of projects across several operators with an accuracy within a 5% margin over a lengthy period. The risking factors and delay times described are dependent on portfolio maturity and investment climate; historical data must be used to determine these factors for a given portfolio.