Christian Bos, Senior E&P Reserves and Decision and Risk Expert with TNO-NITG, gave a presentation in December on "Why the E&P Industry Is Waiting for a New Generation of Petroleum Business Engineers" to the Netherlands young professionals group. He said there is a conflict between models that describe the Earth as accurately as possible and models tailored to E&P business decisions: the first category tends to be highly detailed, whereas the latter is generally more probabilistic, built on integrated data, and less detailed. The presentation focused on the best approach to building models that support E&P business decisions, and on how to prevent the decision bias that has characterized the industry in the past. Judging by stock market returns, he said, the E&P industry has clearly underperformed other industries in North America.
Accurate and frequent mud checking is critical to optimal well construction. Proper assessment and management of drilling fluid properties such as density and rheology maintains the primary well-control barrier and optimizes fluid hydraulics and hole-cleaning ability. However, a full mud check while drilling is typically done only once or twice a day. Moreover, the measurements are performed with antiquated equipment, and data quality and reliability depend heavily on the practicing mud engineer. Automated, continuous, and practical drilling fluid monitoring is therefore needed.
In this paper, we introduce an automated mud skid unit (MSU) that performs continuous drilling fluid sampling and measurement at variable temperatures. Using a pipe-viscometer measurement approach, the unit provides the non-Newtonian rheological constants characterizing a Yield-Power Law (YPL) fluid as well as the real-time friction factor and critical Reynolds number. Other important fluid properties, such as pressurized density, oil/water ratio, and temperature, are provided by high-quality in-line sensors. The unit is controlled by a programmable logic controller (PLC) coupled with a Linux operating system for data analysis. The system can send real-time data to WITSML data servers and provide detailed mud reports to engineers working either on site or remotely.
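As a rough illustration of the rheological workflow described above, the sketch below fits the YPL (Herschel-Bulkley) model tau = tau_y + K * gamma^n to shear-stress/shear-rate pairs such as a pipe viscometer might produce; the sample data, initial guesses, and bounds are illustrative, not the MSU's actual implementation.

```python
# Minimal sketch: regress Yield-Power-Law (Herschel-Bulkley) constants
# tau = tau_y + K * gamma^n from shear-stress/shear-rate pairs.
# All measurement values below are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def ypl(gamma, tau_y, K, n):
    """Yield-Power-Law model: shear stress as a function of shear rate."""
    return tau_y + K * gamma**n

# Hypothetical measurements: shear rate (1/s) and shear stress (Pa)
gamma = np.array([5.1, 10.2, 170.0, 340.0, 511.0, 1022.0])
tau = np.array([4.8, 6.1, 19.5, 28.7, 36.2, 54.9])

# Initial guesses and physical bounds (tau_y, K >= 0; n near shear-thinning range)
popt, _ = curve_fit(ypl, gamma, tau, p0=[3.0, 0.5, 0.7],
                    bounds=([0, 0, 0.01], [np.inf, np.inf, 1.5]))
tau_y, K, n = popt
print(f"tau_y = {tau_y:.2f} Pa, K = {K:.3f} Pa*s^n, n = {n:.3f}")
```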
The MSU was deployed in the Permian Basin by an independent operator for automated mud monitoring during unconventional shale drilling operations. Rheology, density, and phase-content measurements were compared with conventional mud reports provided by the on-site mud engineer. Excellent accuracy was observed in the mud rheology tests. The pressurized mud-density measurements provided by the MSU proved more accurate than non-pressurized mud balance measurements, which were affected by mud aeration. Moreover, the MSU provided mud check data 25 times more frequently than the mud engineer, at temperatures of 50°C and 65.5°C. Drilling-fluid issues such as chemical over-treatment, as well as sudden changes in mud density, rheology, and oil/water ratio, were reported immediately to the drilling crew. This paper details the measurement technology and the results of the MSU's field deployment.
Digital transformation across the geosciences and engineering domains is driven by the need to share data of increasing complexity and quantity as part of a distributed workload involving operators, partners, service companies, and consultants. Unconventional plays pose an additional challenge, since a large number of wells must be planned and then drilled within days. Companies operating multiple rigs, fracking crews, and workover teams have a significant, time-constrained workload that requires rigorous and efficient execution.
Ideally, broad collaboration among all the actors involved would have every user operating the same software suite, with the ability to periodically synchronize data repositories or to work concurrently in a common cloud environment. In reality, however, different companies, and even different departments within a company, use different software systems. The push to apply more analytics, new modeling technologies, or broader multi-disciplinary workflows to unconventional reservoirs, with software tools often sourced from new third-party providers, adds to the burden of more complex data flows and the concomitant data verification required of data consumers.
File formats exist to transfer specific data objects, such as well data (e.g., LAS, DLIS) or grid data, between different software platforms. These formats package the information for one data instance, such as one well. With large unconventional resource play projects drilling hundreds of wells a year while managing production from hundreds more, individually writing and reading files for all new or updated wells and sharing this data across multiple systems is labor-intensive and prone to errors or omissions. This in turn puts the burden on the recipient of the data to verify its correctness and completeness.
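To make the per-file overhead concrete, here is a minimal, dependency-free sketch of pulling well-header mnemonics out of LAS files one well at a time; the folder name is hypothetical, and a real workflow would use a library such as lasio (and would also separate the units token that LAS places right after the dot).

```python
# Minimal sketch of the per-file, per-well nature of LAS transfer: each
# file carries one well, so bulk sharing means looping over many files.
from pathlib import Path

def read_las_well_section(path):
    """Return {mnemonic: raw value field} for lines in the ~Well section."""
    info, in_well = {}, False
    for line in Path(path).read_text(errors="replace").splitlines():
        if line.startswith("~"):
            in_well = line[:2].upper() == "~W"
        elif in_well and "." in line and ":" in line and not line.startswith("#"):
            mnem, rest = line.split(".", 1)
            info[mnem.strip()] = rest.split(":", 1)[0].strip()
    return info

# One read per well: hundreds of wells means hundreds of files to track
for las_file in sorted(Path("exports").glob("*.las")):  # hypothetical folder
    print(las_file.name, read_las_well_section(las_file).get("WELL"))
```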
An industry consortium has developed and published a data exchange standard that packages all the data pertaining to a hydrocarbon reservoir system, including subsurface measurements, interpretations, drilling, completion, and production data, such that a complete project can be exported from one system and read into another in a single pass. The process supports a complete inventory of data items and their metadata. This drastically reduces the workload at both ends of the transaction while offering significantly better guarantees that the data set is complete and correctly referenced.
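The published standard is not named above; assuming a zip-based package of XML parts (in the style of Energistics' packaging conventions), a single-pass inventory on the consumer side might look like the following sketch, with a hypothetical file name and folder layout.

```python
# Sketch: inventorying a single-pass exchange package, assumed here to be
# a zip archive of XML parts. File name and folder layout are hypothetical.
import zipfile
from collections import Counter

with zipfile.ZipFile("project_export.epc") as pkg:
    parts = [n for n in pkg.namelist() if n.endswith(".xml")]
    # Count parts by top-level folder as a rough completeness check
    by_folder = Counter(p.split("/", 1)[0] for p in parts)
    for folder, count in sorted(by_folder.items()):
        print(f"{folder}: {count} XML part(s)")
```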
This has been proven in a full-scale live demonstration involving six different software systems, each executing a step in a reservoir model enrichment workflow and writing out the data ready for the next system to ingest. The total time, including the workflow steps and read/write operations, was 45 minutes: significantly less than such a workflow would need using traditional I/O processes and file formats, and at considerably lower risk of errors or omissions.
The most comprehensive data-acquisition systems at the rigsite are provided by service companies such as mud-logging, measurement-while-drilling (MWD)/logging-while-drilling (LWD), and wireline vendors. Real-time data-acquisition systems are typically connected to a suite of surface and downhole sensors that enable live monitoring of rig-equipment operation and the well-construction process. Service-company systems can typically accept Wellsite Information Transfer Specification (WITS) inputs from other vendors, so that sensor readings from all data-acquisition systems may be collated into a single real-time data set and provided to the operator at the end of the well. The combination of surface and downhole sensors with networked graphical data logs and text outputs enables the operator's supervisory staff, service company, and rig contractor to maintain an accurate picture of the drilling or well-services operation and to track well progress, ensuring that the new-wellbore placement or completion meets the operator's safety, geologic, and production requirements. Rig-contractor personnel may use any of a number of commercially available electronic tour-sheet applications to complete their Intl. Association of Drilling Contractors (IADC) daily reports.
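For illustration, WITS level 0 frames records between "&&" and "!!" markers, each data line carrying a two-digit record code, a two-digit item code, and a value. The sketch below collates one such block into named readings; the item-code labels and sample values are hypothetical.

```python
# Sketch: parsing one WITS level-0 record block so readings from multiple
# vendors can be collated into a single real-time data set.
SAMPLE = "&&\n01083650.40\n011323.1\n0121154.2\n!!"

LABELS = {("01", "08"): "bit_depth_m", ("01", "13"): "rop_m_per_hr",
          ("01", "21"): "hookload_klb"}  # hypothetical code-to-name mapping

def parse_wits0(block):
    readings = {}
    for line in block.splitlines():
        if line in ("&&", "!!") or len(line) < 5:
            continue  # skip frame markers and malformed lines
        rec, item, value = line[:2], line[2:4], line[4:]
        readings[LABELS.get((rec, item), f"wits_{rec}{item}")] = float(value)
    return readings

print(parse_wits0(SAMPLE))
```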
Exploration and production (E&P) project data-management systems are database-management systems designed to support integrated suites of exploration and production applications. Applications share data through a single common data store. Data are administered with a single set of tools for importing, exporting, viewing, editing, and performing database administration. By centralizing all data available for an E&P asset in a single integrated data store, project data-management systems greatly reduce the amount of time spent moving data between applications. Data-management systems serve as readily accessible repositories for the knowledge about an asset.
Multiple-point statistical (MPS) simulation is a modern pattern-based geostatistical approach for describing and stochastically simulating geologic formations with complex connectivity patterns. In MPS simulation, a template containing the data patterns around each simulation cell is used to extract and store local conditional probabilities from a training image (TI). To generate a simulated sample, a random path is generated to sequentially visit all unsampled grid cells and draw conditional samples from the corresponding stored conditional probabilities. The grid-based implementation of MPS simulation offers several advantages for the integration of hard and soft data.

In the Single Normal Equation SIMulation (SNESIM) implementation of MPS for facies simulation, it has been observed that integrating soft data can produce many facies realizations whose patterns are inconsistent with the incorporated probability map. This is partly explained by the Markov property, which considers only probabilities co-located with the simulation node and hence ignores spatial information from neighboring cells. We show that another important mechanism is also at play in the SNESIM algorithm. At the early stage of the simulation, when the first few percent of the simulation nodes on the random path are visited, the local conditioning data are limited, so the conditional probabilities obtained from the TI are loosely constrained and cover a wide range of values in [0, 1]. After this initial stage, however, as simulated data populate more cells in the model grid, they tend to severely constrain the conditional probabilities toward the extreme values of 0 or 1. With these extreme values at the later stages of the simulation, the probabilities carried by the soft data (as a secondary source of information) tend to be disregarded, and the facies types are predominantly determined by the TI.

We demonstrate and discuss this behavior of the SNESIM algorithm through several examples and present strategies that can be adopted to compensate for this effect. The presented examples relate to indirect integration of flow data by first inferring probabilistic information about facies types and then using the results as soft data in the SNESIM algorithm.
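The hardening effect described above can be reproduced with a toy sequential simulation (this is not the SNESIM implementation itself): with a small four-neighbor template and a striped training image, the TI-derived conditional probabilities sit well away from 0 and 1 early on the random path and collapse toward the extremes once the grid fills.

```python
# Toy sketch of the probability-hardening effect in template-based MPS.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)

# Training image: diagonal stripes of facies 1 in a facies-0 background
ti = np.zeros((40, 40), dtype=int)
for d in range(-40, 40, 8):
    ti[np.eye(40, k=d, dtype=bool)] = 1

offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-neighbor template

# Scan the TI once: count the central facies for every neighbor pattern
counts = defaultdict(lambda: np.zeros(2))
for i in range(1, 39):
    for j in range(1, 39):
        pat = tuple(int(ti[i + di, j + dj]) for di, dj in offsets)
        counts[pat][ti[i, j]] += 1

def ti_prob(grid, i, j):
    """P(facies = 1) from the TI given the currently informed neighbors."""
    n, m = grid.shape
    pat = tuple(int(grid[i + di, j + dj]) if 0 <= i + di < n and 0 <= j + dj < m
                else -1 for di, dj in offsets)
    c = np.zeros(2)
    for p, cnt in counts.items():  # pool TI patterns matching informed slots
        if all(a == -1 or a == b for a, b in zip(pat, p)):
            c += cnt
    return 0.5 if c.sum() == 0 else c[1] / c.sum()

# Sequential simulation along a random path; track how far the conditional
# probabilities sit from the extremes 0/1, early vs. late on the path
grid = np.full((40, 40), -1)
early, late = [], []
for step, cell in enumerate(rng.permutation(40 * 40)):
    i, j = divmod(int(cell), 40)
    p1 = ti_prob(grid, i, j)
    (early if step < 160 else late).append(min(p1, 1 - p1))
    grid[i, j] = int(rng.random() < p1)

print(f"mean distance of P(facies=1) from 0/1: "
      f"early {np.mean(early):.3f}, late {np.mean(late):.3f}")
```

The late-stage mean sits much closer to zero than the early-stage mean, which is exactly the regime in which a co-located soft probability loses influence.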
Jamaludin, Izzuddin (PETRONAS Carigali Sdn Bhd) | Mandal, Dipak (PETRONAS Carigali Sdn Bhd) | Arsanti, Dian (PETRONAS Carigali Sdn Bhd) | Dzulkifli, Izyan Nadirah (PETRONAS Carigali Sdn Bhd) | Zakaria, Nurul Azami (PETRONAS Carigali Sdn Bhd) | Mohamad Salleh, Salhizan (PETRONAS Carigali Sdn Bhd) | Ahmad Hawari, Saiful Adli (PETRONAS Carigali Sdn Bhd) | Mohd Azkah, Mohd Zubair (PETRONAS Carigali Sdn Bhd)
Data acquisition remains one of the crucial activities to be executed consistently throughout field life for any oilfield development. Significant operating expenditure (OPEX) is allocated each year to understanding reservoir performance, thereby reducing uncertainties and enabling optimization. This paper highlights the issues faced during the simulation-model history matching (HM) of a waterflood reservoir, including the understanding of the depositional environment and production data integrity. The output is used to improve the recovery factor (RF) via infill opportunities and water injection optimization.
A second 3D seismic survey was acquired over Field A in 2006 (the first was shot in 1995) and processed into a time-lapse (4D) seismic volume. In 2014, a cased-hole logging campaign using the high-precision temperature and spectral noise logging (HPT-SNL) tool was completed to check the integrity and flow contribution of 12 wells in Reservoir-X. Within the same period, pulse pressure testing (PPT) was carried out to verify communication between wells, in addition to acquiring regular surveillance data, all of which helped to improve the reservoir simulation study.
The 4D seismic helped in understanding the areal waterflood front movement and explained the water-cut trend anomaly in an updip well that experienced earlier water breakthrough than nearby downdip producers. Moreover, it helped to identify a bypassed oil zone, a potential infill location. As most of the wells are on dual-string completions, the HPT-SNL campaign helped to improve production allocation across multi-stacked reservoirs and to identify problematic wells requiring rectification jobs. The PPT assisted in identifying a baffle zone, explaining the poor pressure support that some southern producers received from the nearby water injectors. All data interpretations were incorporated into the final HM model, which subsequently identified infill locations, and the reservoir management plan (RMP) was successfully revised. An infill program executed in 2015 successfully secured additional EUR of ~9 MMstb. Based on the studies and the outcome of the infill campaign, water injection optimization improved production and added ~2 MMstb of reserves through voidage replacement ratio (VRR) optimization and oil producer (OP) to water injector (WI) conversions. With these efforts, the team could project an RF of >55%.
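For reference, the VRR being optimized above is the ratio of injected to produced reservoir volumes. The sketch below computes it from surface rates and formation volume factors; all numbers are illustrative, not Field A data.

```python
# Sketch: voidage replacement ratio (VRR) from surface rates.
# VRR = injected reservoir volume / produced reservoir volume.
def vrr(q_wi, b_w_inj, q_o, b_o, q_wp, b_w, gor, rs, b_g):
    """Rates in stb/d; GOR and Rs in scf/stb; B_g in rb/scf, other B's in rb/stb."""
    injected = q_wi * b_w_inj
    produced = q_o * b_o + q_wp * b_w + q_o * max(gor - rs, 0.0) * b_g
    return injected / produced

# Hypothetical pattern: 12,000 stb/d water injected against 8,000 stb/d oil
ratio = vrr(q_wi=12_000, b_w_inj=1.01, q_o=8_000, b_o=1.25,
            q_wp=2_500, b_w=1.02, gor=600, rs=450, b_g=0.0009)
print(f"VRR = {ratio:.2f}")  # < 1 under-injecting, > 1 over-injecting
```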
This case study demonstrates how acquiring focused surveillance data, and integrating it effectively into performance analysis and simulation studies, reduces uncertainties, unveils infill opportunities, improves production and injection optimization, and thus improves the recovery factor in brownfields.
Microseismic monitoring of multi-stage hydraulic fracturing has emerged as a key technology for unconventional stimulation wells, showing the 3D propagation of the induced fractures. Monitoring the dynamic growth of these fractures gives insight into the quality of the completion design as well as a better understanding of reservoir characteristics and rock properties.
The lack of an industry-standard format for microseismic data has created major challenges in transmission, interpretation, and archiving. This has raised the need to establish a standard format for both raw and basic processed microseismic data.
Basic processed data for each microseismic event are generated in the field and transmitted in near real time to central offices, where they are mapped and integrated with other data types using specialized technologies to enable timely decision making. The data should contain the minimum information sufficient to characterize the location and magnitude of the induced fractures.
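As an illustration of such a minimum event record, a hypothetical schema might look like the following; the field names are ours, not the published standard's.

```python
# Hypothetical record for one basic processed microseismic event, carrying
# the minimum fields needed to locate and size an induced fracture.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MicroseismicEvent:
    event_id: str
    origin_time: datetime     # event origin time (UTC)
    x_m: float                # easting, project CRS
    y_m: float                # northing, project CRS
    z_m_tvdss: float          # depth, true vertical depth subsea
    moment_magnitude: float   # event size (Mw)
    frac_stage: int           # treatment stage generating the event
    location_error_m: float   # positional uncertainty

event = MicroseismicEvent("evt-0001", datetime(2020, 5, 4, 11, 32, 7),
                          512340.5, 2876114.2, 2450.0, -1.8, 7, 15.0)
print(event)
```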
Raw microseismic files are large and are typically delivered to the data proponent on external storage devices at the end of the hydraulic fracturing operation. These files should contain additional attributes collected in the field for all microseismic events and stages, such as trace header descriptions and geophone orientation and specification. Both basic processed and raw microseismic data must be captured in a standard format and stored in a corporate database to safeguard company assets, ensure data integrity, and maintain availability for future operations.
Data standardization for both basic processed and raw microseismic data was developed by compiling minimum data requirements from subject matter experts. The new standard for basic processed data is a vendor-neutral format suitable for all service companies and compatible with major interpretation and visualization technologies; it focuses mainly on guidelines for microseismic event location and magnitude. Raw microseismic files remain in SEG-Y format, extended with additional metadata attributes such as frac stage number and treatment and monitoring well identifiers.
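As a sanity-check example, the sketch below reads two standard SEG-Y binary-header fields (sample interval and samples per trace) from a raw file. The file name is hypothetical, big-endian encoding is assumed, and where a vendor stores extra attributes such as frac stage number varies by implementation, so that part is omitted.

```python
# Sketch: reading two standard SEG-Y binary-header fields. Offsets follow
# the SEG-Y convention: a 3200-byte textual header precedes the 400-byte
# binary header, so byte positions below are 0-based file offsets.
import struct

with open("raw_events.sgy", "rb") as f:   # hypothetical file name
    f.seek(3216)                          # bytes 3217-3218: sample interval (us)
    sample_interval_us = struct.unpack(">h", f.read(2))[0]
    f.seek(3220)                          # bytes 3221-3222: samples per trace
    samples_per_trace = struct.unpack(">h", f.read(2))[0]

print(f"sample interval: {sample_interval_us} us, "
      f"samples/trace: {samples_per_trace}")
```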
Solutions to current microseismic data exchange challenges went beyond data standardization to include modifications to the existing real-time data transmission system, interpretation technologies, and database schema, so that microseismic data can be accommodated and visualized at different stages.
Future microseismic challenges will include the need for technology and data transformation to adopt new-generation processing technologies and algorithms that enable on-site data analytics and high-volume data transmission. This new approach will automate the separation of signal from noise to streamline real-time signal transmission and ensure integrity. This session will focus on highlighting current and future microseismic data challenges, including discussion of some of the proposed solutions.
Djamaluddin, Basirudin (Petrolink International) | Prabhakar, Prajitha (Petrolink International) | James, Baburaj (Petrolink Data Service Pvt. Ltd.) | Muzakir, Anas (Petrolink International) | AlMayad, Hussain (Petrolink International)
A real-time data stream in WITSML format, which can arrive as frequently as once per second (1 Hz), is one of the best candidates for producing KPIs for drilling operation activity. The KPIs generated from this calculation are related to information from other data sources, known here as metadata. The first question is how this KPI information can be used for further, wider, and more complex analysis that must be combined with that metadata. An OLTP model is not the recommended model for data analytics; an OLAP model is. The second question is how the data should be physically stored. We argue for column-oriented physical storage, which can execute analytical queries 10 to 30 times faster than row-oriented storage. Implementing an OLAP model for storing KPI data is shown to improve analytical query performance significantly, and combining it with column-oriented physical storage improves performance further. We conclude that an OLAP model with column-oriented storage can serve as a solid foundation for storing KPI data.
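As a small illustration of the column-oriented argument, the sketch below stores KPI rows in Parquet via pyarrow and reads back only the columns an analytical query needs; the KPI schema is illustrative, not the paper's actual data model.

```python
# Sketch: storing drilling-KPI rows in a column-oriented format (Parquet)
# so analytical scans touch only the columns they need.
import pyarrow as pa
import pyarrow.compute as pc
import pyarrow.parquet as pq

table = pa.table({
    "well_id":        ["W-01", "W-01", "W-02", "W-02"],
    "timestamp_s":    [0, 1, 0, 1],               # 1 Hz samples
    "rop_m_per_hr":   [21.4, 22.0, 18.3, 18.9],
    "connection_min": [3.2, 2.9, 4.1, 3.8],
})
pq.write_table(table, "kpi.parquet")

# An analytical query reads just two columns, not whole rows
cols = pq.read_table("kpi.parquet", columns=["well_id", "rop_m_per_hr"])
print(pc.mean(cols["rop_m_per_hr"]))
```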