Commercial valuation of a technology in the proof-of-concept stage is often based on limited case-study data, extrapolated to a hypothesized total market demand for that technology. The methodology presented in this paper instead performs a bottom-up, data-driven, well-by-well valuation using a 60,000+ well industry benchmarking data set. The methodology was developed to support the valuation of a new technology concept using hypersonic impact drilling, then at API-17N Technology Readiness Level 1.
Any new technology, by virtue of being in the concept stage, has poorly defined operational performance and technical capability. The well dataset used for the valuation analysis is relatively high-level, resulting in a significant number of assumptions and limitations. Nonetheless, the combination of a granular technology model with a large actual dataset provides insights into sensitivities and uncertainties that are unobtainable with a broad-brush, high-level approach.
Based on the information available in the database, the methodology constructs a synthetic time-depth curve for drill-and-case operations after removal of non-productive time. Synthetic time and cost for each section are calculated for both the actual well and the technology model, allowing section-by-section benchmarking of the technology. The combined savings from technology-positive sections give the size of the prize, i.e. the commercial margin available to be shared between Operator and Supplier.
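The section-by-section benchmarking logic can be sketched as below. This is a minimal illustration, not the paper's actual model: the rate-of-return inputs, section parameters, and the single flat-time term are all assumed placeholders.

```python
# Illustrative sketch of section-by-section benchmarking: synthetic time per
# section for the actual well vs. the technology model, summing savings only
# over technology-positive sections. All rates and lengths are made-up values.

def section_time_hours(length_ft: float, rop_ft_per_hr: float, flat_time_hr: float) -> float:
    """Synthetic drill-and-case time for one hole section, NPT removed."""
    return length_ft / rop_ft_per_hr + flat_time_hr

def size_of_prize(sections, spread_rate_usd_per_hr: float) -> float:
    """Combined savings over technology-positive sections only."""
    prize = 0.0
    for s in sections:
        base = section_time_hours(s["length"], s["base_rop"], s["flat"])
        tech = section_time_hours(s["length"], s["tech_rop"], s["flat"])
        saving = (base - tech) * spread_rate_usd_per_hr
        if saving > 0:  # technology-negative sections do not contribute
            prize += saving
    return prize

sections_example = [
    {"length": 5000.0, "base_rop": 50.0, "tech_rop": 120.0, "flat": 24.0},
    {"length": 3000.0, "base_rop": 30.0, "tech_rop": 25.0, "flat": 18.0},  # technology-negative
]
print(round(size_of_prize(sections_example, 2000.0)), "USD saved on this synthetic well")
```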
We present a case study in which we modelled the initial new-technology deployment concept, showing this concept to have an operational sweet spot with value rapidly decreasing away from it. An alternative, downhole deployment concept resulted in severalfold wider applicability and a multi-billion-dollar unrisked value proposition, indicative of a potentially game-changing technology. Based on this new insight, the technology developer was able to pivot early, probably avoiding costly dead-end development and market disappointment, and to increase industry and investor confidence and investment.
The methodology can be used to gain actionable insights at multiple levels. To obtain a mature market valuation, mature technology parameters such as reliability, directional drilling capability, and all applicable hole sizes and depths are invoked. To aid technology development and design decisions, sensitivity analyses can be performed on design parameters. To guide development requirements, a Minimum Viable Product analysis provides insight into the minimum technical requirements and the de-risking work required before a technology can gain acceptance in the marketplace. To explore early applications and potential sponsoring projects, clusters of potential high-value and/or early applications can be identified. The results from this valuation model also provide insights into the potential of wells at or beyond the fringes of the database, i.e. complex wells that require extraordinarily long net times to drill.
Innovations in the field of fiber optics have resulted in the application of distributed acoustic sensing (DAS) for borehole seismic applications.
A hybrid optical-electrical cable has been developed to enable conveyance of all logging suites required during the entire well construction process. From intermediate drilling-section evaluations through well integrity and production, the optical fiber enables seismic acquisition during any wireline intervention.
Because the optical fiber inside the logging cable is the seismic sensing element, velocity information can be measured along the entire wellbore very efficiently. The acquisition of a borehole seismic program, such as a zero-offset vertical seismic profile (VSP), is reduced to minutes instead of hours or days. Also, no additional rig time is needed if the seismic information is acquired during logging runs that involve remaining stationary in the borehole, such as fluid sampling and calibration of nuclear magnetic resonance tools.
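Since a zero-offset survey of this kind reduces to depth/one-way-time pairs along the fiber, interval velocities follow directly as Δz/Δt between sensing points. A minimal sketch, with made-up depth and time values:

```python
# Interval velocities from zero-offset checkshot depth / one-way-time pairs.
# Depths and times below are illustrative, not survey data from the text.

def interval_velocities(depths_m, times_s):
    """Interval velocity between successive sensing depths: dz / dt."""
    return [
        (z2 - z1) / (t2 - t1)
        for (z1, t1), (z2, t2) in zip(
            zip(depths_m, times_s), zip(depths_m[1:], times_s[1:])
        )
    ]

depths = [500.0, 1000.0, 1500.0]   # measured depth of sensing points, m
owt = [0.250, 0.450, 0.600]        # one-way travel time, s
print(interval_velocities(depths, owt))  # m/s per interval
```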
Improvements in drilling efficiency from using this technology are achieved in several ways: 1) elimination of a dedicated seismic run, which renders substantial rig time savings; 2) lowering risk of operations (i.e., avoiding fishing); 3) virtual elimination of nonproductive time due to downhole tool failure and the provision of many opportunities to gather data (logging runs in open or cased hole); 4) obtaining seismic information, however basic, in every well in a cost-effective manner, which is important for future well placement and derisking drilling operations; 5) optimization of seismic acquisition programs in combination with conventional downhole three-component (3C) seismic tools; and 6) seismic data quality assurance resulting from measurement during the drilling process with minimum disruption (intermediate sections and TD), avoiding sections with multiple casing strings or noncemented tubulars.
An example of a zero-offset VSP acquired on hybrid optical heptacable is discussed in the context of increasing drilling efficiency. The checkshot and VSP surveys required only minimal rig time, with the hybrid fiber-optic/conventional seismic program acquired at intermediate and final sections, obtaining VSP information for the entire well with significant rig-time savings and improved data quality.
Well integrity is critical to well operation in the oil and gas industry, and advanced technologies and analyses have been developed to ensure it. One of the most important analyses is the evaluation of the cement bond, which measures the presence and bonding of cement between the casing and the formation at a particular depth or interval. This evaluation is critical for the hydraulic isolation needed to withstand subsequent completion and production operations.
Cement bond evaluation is usually performed by wireline logging, with a tool that transmits acoustic waves and computes the acoustic energy propagating through the casing, the cement, and the formation. Because of the difference in acoustic impedance between casing and cement, the amplitude and attenuation of the returned wave can be correlated with a Bond Index (BI).
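A common form of the attenuation-to-BI correlation scales the measured attenuation between free-pipe and fully cemented calibration values. The sketch below assumes that normalization; the calibration constants are tool- and casing-specific, and the numbers used are illustrative only.

```python
# Hedged sketch of a common Bond Index normalization from CBL attenuation:
# BI = (att_measured - att_free_pipe) / (att_full_bond - att_free_pipe).
# Calibration attenuations below are illustrative, not from the case study.

def bond_index(att_measured_db_ft, att_free_pipe_db_ft, att_full_bond_db_ft):
    bi = (att_measured_db_ft - att_free_pipe_db_ft) / (
        att_full_bond_db_ft - att_free_pipe_db_ft
    )
    return max(0.0, min(1.0, bi))  # clamp to the physical range [0, 1]

print(bond_index(5.0, 0.5, 9.5))  # partial bond on this synthetic calibration
```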
Over the years, different wireline technologies have been developed and matured for cement bond evaluation. Nowadays, however, there are scenarios where wireline operation is not feasible, such as highly deviated wells or when a quick turnaround is needed.
Acoustic technology in Logging While Drilling (LWD) has been developed to measure the compressional and shear slowness of the formation. These devices are able to transmit acoustic waves at different frequencies and receive them after propagation through a medium, and are applicable in cased-hole conditions. This paper presents a case study in which LWD acoustic technology was successfully used to evaluate the cement bond in a highly deviated well in Brazil, and the results show the capability of the technology to generate qualitative cement bond data. These results were successfully validated using sonic and ultrasonic wireline technologies. The comparison between the LWD acoustic and wireline results is shown in this paper, including details of the methodology applied to the test, the parameters considered, and the data processing.
Although reserves estimates for known accumulations historically have used deterministic calculation procedures, the 1997 SPE/WPC definitions allow either deterministic or probabilistic procedures. Each of these is discussed briefly in the next two sections. Thereafter, except for another section on probabilistic procedures near the end, the chapter focuses on deterministic procedures because they are still more widely used. Both procedures need the same basic data and equations. Deterministic calculations of oil and/or gas initially in place (O/GIP) and reserves are based on best estimates of the true values of the pertinent parameters, although it is recognized that there may be considerable uncertainty in those values.
Water saturation (Sw) determination is the most challenging of petrophysical calculations and is used to quantify its more important complement, the hydrocarbon saturation (1 - Sw). Complexities arise because there are a number of independent approaches that can be used to calculate Sw. The complication is that often, if not typically, these different approaches lead to somewhat different Sw values that may equate to considerable differences in the original oil in place (OOIP) or original gas in place (OGIP) volumes. The challenge to the technical team is to resolve and understand the differences among the Sw values obtained using the different procedures, and to arrive at the best calculation of Sw and its distribution throughout the reservoir, vertically and areally. In OOIP and OGIP calculations, it is important to remember the relative importance of porosity and Sw: a 10% pore-volume (PV) change in Sw has the same impact as a 2% bulk-volume (BV) change in porosity (in a 20% BV porosity reservoir).

This listing is the chronological order in which data are likely to become available, not a ranking based on the accuracy of the various methods. The choice of which Sw-calculation approach to use is often controlled by the availability of the various types of data. If no oil-based-mud (OBM) cores have been cut, then this technique cannot be used unless funds are spent to acquire such data from one or more newly drilled wells; this is not a high incremental cost when OBM use is planned for other purposes. Resistivity logs are run in all wells, so these data are available for making standard-log-analysis Sw calculations. A key consideration when making calibrated Sw calculations is the availability of special-core-analysis (SCAL) data on core samples from the particular reservoir; that is, the number of laboratory electrical-property and Pc/Sw core-plug measurements that have been made.
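The porosity/Sw sensitivity quoted above follows from the hydrocarbon bulk volume, HCBV = φ(1 - Sw). A short check with illustrative values:

```python
# Worked check of the sensitivity rule of thumb: in a 20% BV porosity
# reservoir, a 10-saturation-unit shift in Sw moves the hydrocarbon bulk
# volume by phi * dSw = 0.02 BV -- the same order as a 2% BV porosity error.
# phi and Sw values are illustrative.

phi = 0.20      # porosity, bulk-volume fraction
d_sw = 0.10     # 10% pore-volume change in Sw
d_hcbv_from_sw = phi * d_sw   # resulting change in hydrocarbon bulk volume
print(round(d_hcbv_from_sw, 3), "bulk-volume fraction")
```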
The technique chosen to calculate Sw is often a hybrid that combines the use of two of these basic data sources. For example, the OBM-core Sw data can be used in combination with the resistivity logs to expand the data set to include all wells and the whole of the hydrocarbon column. Alternatively, the OBM-core Sw data can be used in combination with the Pc/Sw data. In this way, the OBM-core Sw data define the Sw values for the majority of the reservoir, whereas the Pc/Sw data define the Sw values in the interval just above the fluid contact and perhaps in areas of the field where Pc data are available but OBM-core data are not. This section discusses the input-data availability and data-quality issues for each Sw technique.
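As a sketch of the standard-log-analysis route mentioned above, Sw is commonly computed from resistivity logs with the Archie equation. The a, m, n values below are common clean-formation defaults that calibrated SCAL data would replace, and the log readings are made-up illustrative numbers.

```python
# Archie-equation water saturation from resistivity-log inputs:
# Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n).
# a=1, m=2, n=2 are common clean-formation defaults; SCAL calibration
# would replace them. Log values below are illustrative.

def archie_sw(rt_ohmm, rw_ohmm, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation (fraction) from true resistivity Rt, water
    resistivity Rw, and porosity phi."""
    return ((a * rw_ohmm) / (phi ** m * rt_ohmm)) ** (1.0 / n)

print(round(archie_sw(rt_ohmm=20.0, rw_ohmm=0.05, phi=0.25), 3))
```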
The demand for ensuring the reliability and accuracy of drilling sensor data is rapidly increasing with the rise of data analytics and automation. Proven systems engineering methodologies used in the aerospace industry offer the drilling industry a solution for assuring the reliability of sensor data and of the software processing of those data. This paper describes an approach for verification and validation (V&V) of sensors and systems leveraging experience from the aerospace industry. Both oil and gas (O&G) and aerospace have high-value assets where failures can lead to significant loss of life and huge financial impacts. Commercial aviation, in particular, has achieved a remarkable improvement in safety over many decades through design standards requiring that functions be precisely defined and their implementation be verified against those requirements. Aviation industry standards, such as DO-178C for Software Considerations in Airborne Systems and Equipment Certification, acknowledge that not all aircraft systems require the same attention. The standards define a range of classifications from the most critical, such as autopilots, to the least critical, such as entertainment systems. This paper describes how mapping levels and requirements from DO-178C to oil and gas can assist in improving data quality to the higher standards now required in drilling operations. A range of criticality provides the ability to seek a balance of costs vs. benefits for application to drilling systems to achieve the needed data quality.
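The mapping idea can be illustrated as below. The DO-178C software levels A through E and their associated failure-condition categories are as defined in the standard, but the drilling-system analogues assigned to each level here are purely hypothetical examples, not an assignment made by the paper or by any standard.

```python
# DO-178C defines software levels A (catastrophic failure condition)
# through E (no safety effect). The drilling examples mapped below are
# hypothetical illustrations only.

DAL_DESCRIPTIONS = {
    "A": "Catastrophic failure condition",
    "B": "Hazardous failure condition",
    "C": "Major failure condition",
    "D": "Minor failure condition",
    "E": "No safety effect",
}

# Hypothetical drilling analogues, by decreasing criticality:
DRILLING_EXAMPLES = {
    "A": "automated well-control / kick-detection response",
    "B": "closed-loop directional-drilling control",
    "C": "real-time drilling-dynamics advisory",
    "D": "performance benchmarking dashboards",
    "E": "non-operational reporting displays",
}

for level in "ABCDE":
    print(f"{level}: {DAL_DESCRIPTIONS[level]} -> e.g. {DRILLING_EXAMPLES[level]}")
```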
Summary

In this study, shale samples were heated under inert and noninert environments to increase the permeability of the shale. The nanoscale pore-structure changes under combustion and pyrolysis [air, nitrogen (N2), carbon dioxide (CO2), and argon (Ar)] conditions were investigated. It was found that pore diameters increased under all gas environments, and most significantly under air. However, the diameters of the shale particles remained almost constant during combustion. Moreover, gases emitted from the shale during the combustion and pyrolysis process were investigated using thermogravimetric analysis coupled with Fourier-transform infrared spectroscopy (TGA-FTIR). Finally, scanning electron microscopy (SEM) images showed larger pores on the surfaces of the combusted and pyrolyzed shale samples.

Introduction

Shale gas produced from finely laminated, organic-rich mudstones and siltstones (i.e., shale) has attracted increasing attention around the world because of its massive reserves (Hartman et al. 2011). Hydraulic-fracturing technology has been widely applied for shale oil and gas exploration in the United States and keeps natural-gas prices low.
Automatic updating of simulation models with historical field performance and events is a challenging and time-consuming task that reservoir engineers need to tackle, whether to maintain history-matched reservoir models (evergreen assets), undertake a new calibration exercise, or update forecasting studies. The challenge takes on another dimension with the increasing complexity of field operations (production/injection/drilling/workover), well designs, and downhole equipment configurations. This paper presents an efficient workflow capitalizing on IR4.0 Digital Twin principles to automate the process of seamlessly integrating and updating historical well information in reservoir simulation models. The objective of this workflow is to drive reservoir simulation towards capitalizing on digital transformation and the Live Earth models concept to revolutionize model calibration and history matching for superior prediction quality with high confidence.
Well data digitization in this workflow was achieved by automating well data acquisition, enforcement of well data quality checks, and well modeling in interconnected simulation applications. The workflow minimizes manual interaction with data, giving engineers the chance to focus on the reservoir engineering aspects of their tasks. The workflow consists of four steps. The first step is data acquisition, in which various types of well data are fetched. The second step is a data quality check, in which data from different sources are subjected to engineering and scientific measures (i.e., Quality Indices) that translate engineering knowledge and experience into the detection of possible data inconsistencies. The third and fourth steps cover exporting and importing relevant data within the reservoir simulation applications' portfolio, where various data types are handled and managed seamlessly. Data and event acquisition workflows were automated to provide seamless well data transfer between different data sources and reservoir simulation pre- and post-processing applications. The different types of well data were obtained through automatic fetching from data repositories (databases, petrophysical models, etc.). The Quality Check (QC) procedures were automatically performed against deviation surveys, perforations, casing/tubing, flowmeter, cores, formation tops, and productivity/injectivity indices. This helped in identifying data discrepancies, if any, including missing data entries and contradicting well events.
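A QC step of the kind described above can be sketched as a set of rule-based checks over a well record. Everything here, the record layout, the check names, and the example data, is an illustrative assumption, not the paper's actual Quality Index implementation.

```python
# Hypothetical sketch of rule-based well-data QC: flag missing entries and
# contradicting events, as the second workflow step describes. Record layout
# and checks are illustrative assumptions.

def qc_well_record(rec: dict) -> list:
    """Return a list of detected inconsistencies for one well record."""
    issues = []
    if not rec.get("deviation_survey"):
        issues.append("missing deviation survey")
    td = rec.get("total_depth_ft")
    for top, base in rec.get("perforations", []):
        if top >= base:
            issues.append(f"inverted perforation interval {top}-{base} ft")
        if td is not None and base > td:
            issues.append(f"perforation base {base} ft below TD {td} ft")
    return issues

rec = {
    "deviation_survey": [(0.0, 0.0)],
    "perforations": [(9500.0, 9600.0), (10200.0, 10150.0)],  # second is bad
    "total_depth_ft": 10000.0,
}
for issue in qc_well_record(rec):
    print(issue)
```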
The automation of these workflows significantly reduced the time needed for well data transmission/update to the reservoir models, eliminated human errors associated with data entry or corrections, and helped keep the models up to date (evergreen). Incorporating digital twin concepts enabled advanced automatic digitization of well information. It provided a data exchange solution that meets E&P requirements, and more effective and efficient methods of connecting diverse applications and data repositories.
The Cessation of Production (COP) from mature fields, together with the backlog of idle wells, has compelled operators to ramp up decommissioning projects across the industry. Following the recent downturn, with the market becoming financially "leaner", the economic viability of any project, especially a D&A project, is deemed paramount. Current industry trends indicate a paradigm shift from conventional rig-based operations towards rigless well decommissioning. Though rigorously planned, quite a number of rigless P&A campaigns end up in cost overruns due to "unknown unknowns", the uncertainties of working on old wells, and mid-project contingencies, thus requiring techno-management reinforcement. However, this can be mitigated by efficient candidate selection from the set of wells expected to undergo P&A. Therefore, this paper describes a framework that encompasses various parameters, serving as an optimum guide for selecting candidate wells for a rigless P&A campaign, either partially or wholly.
Of the operations involved in well abandonment, irrespective of the phase of the campaign, a majority will be significantly impacted by the well plan and parameters, the cement integrity behind casing, and the availability or quality of data regarding these parameters. The paper describes these parameters categorically, viz. 1) well parameters, which include the well plan, casing policy and seats, and directional parameters such as well profile, inclination, horizontal drift, and dogleg of kick-off (short-radius/long-radius kick-off); and 2) cement bond integrity between casing and cement (annular integrity) and the integrity between the cement and the formation.
The paper discusses the impact of these parameters on technology selection for well P&A, considering the uncertainties involved in P&A operations: for instance, the need to access multiple annuli to place barriers in phase 1 of decommissioning due to poor cement integrity or sustained casing pressure in the B and C annuli, or the contingencies in phase 3 associated with pulling heavier tubulars such as 13 3/8-in. casing and the conductor pile, pulling pack-offs in the wellhead, and tag-testing surface plugs. Consequently, combining these with the technical limits of rigless options, such as the maximum load-handling capacity of platform cranes, the pulling capacity of the jacking and pulling unit, the surface tubular-handling capacity of the rigless package, and the limits on drift accessibility by e-line/slickline in high-angle and ERD wells, the paper presents the candidate wells for the selected P&A strategy.
The paper underscores the need for adequate planning in selecting candidate wells for a proposed well decommissioning strategy, whether partially or wholly rigless, and presents a structured framework as a solution for candidate well selection. Further, taking operator- and regulator-specific details into account, the framework can be customized and detailed for particular regions and fields.
Nikitin, Anton (Shell International Exploration and Production) | Durand, Melanie (Shell Exploration and Production) | McMullen, Adam (Shell Exploration and Production) | Blount, Aidan (Shell Exploration and Production) | Driskill, Brian (Shell Exploration and Production) | Hows, Amie (Shell International Exploration and Production)
Sustained E&P activity levels and slim margins on highly valued Permian Basin acreage drive operators to leverage information as much as possible and in ways not seen in the recent past. Data accuracy is strongly desired, especially in this fast-paced, competitive environment. Core analyses provide subsurface static calibration, but the thick stratigraphic section, composed largely of sublog-scale facies, challenges a cost-effective approach to collecting sufficient calibration data.
Saturation determination is a key petrophysical deliverable that has multiple uses, including landing-zone assessment. Calibration of saturation models may originate in several ways: proprietary or joint-venture core, industry consortia databases, data trades with other operators, government databases, or publications. Internal and external reviews of subsurface model inputs have repeatedly shown that Permian Basin saturations, in particular, have a wide distribution and large uncertainty. Accurately measuring core fluid saturations in tight rock continues to pose significant challenges originating from the currently accepted laboratory methods, the assumptions used to interpret those data, and, more broadly, the increased relative uncertainty associated with tight, low-porosity formations.
For example, crushing core samples, which enhances fluid extraction in tight rocks, causes systematic fluid losses in liquid-rich mudstone formations that are not typically quantified. Instead, as-received air-filled porosity is commonly assumed to represent hydrocarbons that were forced from the core during acquisition/retrieval by gas expansion. Additionally, fluid extraction from commercially available retorting systems has widely variable fluid-collection efficiencies (<100%), resulting in significant inconsistencies between the weight of collected fluids and the sample weight loss during retorting experiments. The Dean-Stark technique removes not only water and oil but also an unknown volume of solvent-extractable organic matter, and it only allows direct quantification of the extracted water volume. Finally, fluid and solid losses during handling in the laboratory are unassessed in current commercial laboratory procedures. Reconciling fluid volumes with the fluid and sample-weight data delivered by either technique, i.e., retort or Dean-Stark, requires numerous assumptions about pore-fluid properties that are typically not verified through direct measurements. We demonstrate that such assumptions can lead to extreme uncertainty in estimates of water saturation.
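The sensitivity to these assumptions can be illustrated with a simple volume balance: core water saturation is the extracted water volume over the total pore volume, so any assumed correction for uncollected fluid shifts the answer directly. All volumes below are made-up values for illustration, not laboratory data from the study.

```python
# Illustrative core water saturation from extracted fluid volumes:
# Sw = Vw / (Vw + Vo + Vgas). The same measured water volume yields
# different Sw depending on the assumed (unmeasured) oil loss.
# All volumes are made-up illustrative values.

def core_sw(vol_water_cc, vol_oil_cc, vol_gas_cc):
    """Water saturation (fraction) from extracted fluid volumes."""
    pore_vol = vol_water_cc + vol_oil_cc + vol_gas_cc
    return vol_water_cc / pore_vol

# Same extracted water, two assumptions about uncollected oil:
print(round(core_sw(1.2, 0.8, 0.5), 3))  # oil assumed fully collected
print(round(core_sw(1.2, 1.3, 0.5), 3))  # 0.5 cc oil assumed lost in crushing
```

An eight-saturation-unit swing from a single assumed loss term, as in this toy case, shows why the workflow's direct quantification of crushing losses matters.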
To address such critical uncertainties, a new retort-based core analysis workflow using improved core characterization and fluid-extraction techniques was developed. In one advancement, this workflow employs NMR measurements systematically performed on all as-received and crushed samples to quantify fluid losses during crushing. This approach also uses a specially developed fluid collection apparatus with close to 100% fluid collection efficiency. In addition to these advances in measurements, the workflow is optimized to avoid fluid losses during sample handling and includes repeated grain density and geochemical measurements at different stages for quality control (QC). As a result, the new workflow reduces the uncertainties in acquired data and better addresses the assumptions, i.e., parameter corrections for fluid losses, in interpreting measured data into core total porosity and core fluid saturations. The workflow is demonstrated for a set of Delaware Basin Wolfcamp A formation samples and the results suggest that previous crushed-rock core analysis protocols underestimate water saturation by at least 30% or ~15 saturation units (s.u.) for this liquid-rich mudstone formation.