Dhote, Prashant (Kuwait Oil Company) | Al-Adwani, Talal (Kuwait Oil Company) | Al-Bahar, Mohammad (Kuwait Oil Company) | Al-Otaibi, Ahmad (Kuwait Oil Company) | Chakraborty, Subrata (Schlumberger) | Stojic, Slobodan (Schlumberger)
The subsurface petroleum industry is burdened with uncertainties in every aspect, from exploration to production, owing to limited access to the reservoir and to limitations of technology. The most important tools used to understand, quantify, and mitigate these uncertainties are geostatistical static models and numerical dynamic simulation models. Geomodels are widely used in the industry for characterizing the reservoir and planning a favorable development strategy, and are a vital instrument for maximizing asset value and optimizing project economics.
Static geomodels are the foundation for all the advanced numerical and analytical solutions used to unravel the intricacies of reservoir performance. At the same time, they are where all static and dynamic geological and engineering observations are integrated to develop a common understanding of the reservoir for future studies. An individual's understanding of these observations and mental image of the reservoir framework is the basis for building static geomodels. Hence, at times, the process is highly subjective, and proper QC of the models against both general and specific modeling objectives becomes imperative. Simple questionnaire-based QC and ranking methodologies are likewise controlled by subjectivity and individual preference.
In the present endeavor, quantitative Key Performance Indicator (KPI)-based standard static geomodeling practices and QC methodologies were developed at the corporate level within a specially designed "Process Implementation Project (PIP) – Hydrocarbon Resource and Uncertainty Management" under the aegis of the Kuwait Oil Company (KOC) Reservoir Management Best Practices Steering Committee.
The main objectives are to establish practical modeling processes, workflows, and criteria that standardize modeling practice. A structured, self-guiding modeling document has been developed, complete with self-assessment guidelines and a questionnaire. Finally, for each individual process a set of KPIs is specified as the minimum standard to be met before a static model is approved.
The present effort is relevant to any geologist, geomodeler, or reservoir engineer dealing with geostatistical and numerical reservoir modeling, and provides KPI-based general practices for quality assurance (QA) and quality control (QC) of models.
Han, Rubing (Research Institute of Petroleum Exploration and Development, Petrochina) | Li, Shunming (Research Institute of Petroleum Exploration and Development, Petrochina) | Song, Benbiao (Research Institute of Petroleum Exploration and Development, Petrochina) | Tian, Changbing (Research Institute of Petroleum Exploration and Development, Petrochina)
The giant H oil field was discovered in 1976 and put into development in 2008. The Mishrif reservoir holds half of the field's geological reserves, and its current development challenges are considerable. First, the sedimentary evolution is very complicated (inner ramp, lagoon, shallow sea, etc.); the existing facies model, generally derived from rock-type interpolation with a kriging algorithm, does not reflect the sedimentary understanding and differs markedly from what the seismic data display. Furthermore, the variogram of rock types is difficult to estimate accurately, resulting in poor history matching in reservoir simulation. Second, the seismic data quality is generally low, and its correlation with rock types is even worse; if used directly to constrain rock-type interpolation, the accuracy is generally unsatisfactory.
To solve these problems, this study first examined core, thin sections, scanning electron microscopy, X-ray diffraction, electron-probe, and other data for sequence-stratigraphic and sedimentological research. Second, based on single-well facies interpretation combined with the inversion data, the dimensions, spatial relationships, and variograms of each depositional element were studied. Third, the facies model was built using the Sequential Indicator Simulation (SIS) algorithm and was manually modified zone by zone according to the sequence-stratigraphic and sedimentological understanding. Finally, the relationship between depositional elements and rock types was studied, and probability volumes of the rock types associated with each depositional element were obtained. On this basis, the SIS algorithm was used to simulate rock types, which could then be used as a constraint for building petrophysical models.
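The abstract does not give the implementation details of the SIS runs. As a rough, greatly simplified illustration of sequential-style indicator simulation — substituting inverse-distance-weighted indicator probabilities for full indicator kriging, with hypothetical facies codes and well locations — one could sketch:

```python
import numpy as np

def simple_sis(grid_xy, well_xy, well_facies, n_facies, seed=0, n_neighbors=4):
    """Greatly simplified sequential indicator simulation on 2D points.

    Visits nodes along a random path; at each node, facies probabilities
    are estimated from the nearest already-known points by inverse-distance
    weighting of indicator values (a stand-in for indicator kriging), and a
    facies code is drawn from that distribution. Simulated nodes then
    condition later nodes, as in true sequential simulation.
    """
    rng = np.random.default_rng(seed)
    known_xy = list(map(tuple, well_xy))
    known_f = list(well_facies)
    out = np.empty(len(grid_xy), dtype=int)
    for idx in rng.permutation(len(grid_xy)):
        x, y = grid_xy[idx]
        d = np.array([np.hypot(x - kx, y - ky) for kx, ky in known_xy])
        near = np.argsort(d)[:n_neighbors]
        w = 1.0 / (d[near] + 1e-9)            # inverse-distance weights
        probs = np.zeros(n_facies)
        for j, wj in zip(near, w):
            probs[known_f[j]] += wj
        probs /= probs.sum()
        f = int(rng.choice(n_facies, p=probs))
        out[idx] = f
        known_xy.append((x, y))               # condition subsequent nodes
        known_f.append(f)
    return out

# Tiny demo: two hypothetical wells with facies 0 and 1, on a 5x5 grid
grid = [(i, j) for i in range(5) for j in range(5)]
facies = simple_sis(grid, [(0.0, 0.0), (4.0, 4.0)], [0, 1], n_facies=2)
```

A production SIS run would use indicator variograms per facies and kriging weights; the inverse-distance shortcut here only preserves the sequential, probability-drawing structure of the algorithm.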
The results showed that the correlation between depositional elements and rock type was clearly better than that with the inverted acoustic impedance. The manually modified facies model better reflected the sedimentary pattern and honored the various static and dynamic data. The rock-type model obtained under the constraint of the facies model also bypassed the problem that the seismic data quality was not good enough for direct rock-type interpolation.
It was verified that, with the new modeling workflow, the preliminary numerical-simulation match rate reached over 80%. The facies model was then verified against a water-flooding test in the H oil field, with good results. This workflow could provide a useful reference for similar oilfields in the Middle East.
Saleh, Ahmed (Safer Exploration & Production Company) | Al Maktari, Amin (Safer Exploration & Production Company) | Alogily, Abdulkhalek (Safer Exploration & Production Company) | Haygana, Adel (Safer Exploration & Production Company) | Al Adhashi, Jaber (Safer Exploration & Production Company) | Ahmad, Ammar (Schlumberger) | Khan, Waqar A. (Schlumberger) | Baig, Muhammad Zeeshan (Schlumberger) | Jibran, Ali (Schlumberger)
The objectives of this work were to achieve realistic reservoir modeling using seismic inversion volumes and advanced petrophysical and rock-typing analysis for predictive modeling of the reservoir-quality sands of the Lam Formation; to generate predictive scenarios for the constructed reservoir model to reduce uncertainty; to identify new infill well locations based on the predicted reservoir sand distribution; and to re-evaluate in-place volumetrics for the Lam Formation in the Prospect D Field.
The methodology comprised the following steps. First, seismic inversion was performed on the Prospect D Field 3D seismic dataset to obtain volume-of-clay and facies volumes. Petrophysical rock typing combined with core data analysis was used to calibrate the inverted volumes through identification of clay types. The seismic inversion volumes were then integrated with the depositional setting of the Lam Formation to interpret the inverted volumes qualitatively. Reservoir modeling was performed using the seismic inversion facies volumes and a petrophysical rock-type model to predict and distribute the different depositional facies, controlled by the inversion trends. Finally, in-place volumes and predictive scenarios were generated to reduce uncertainty and give the reservoir model greater predictive strength.
On the basis of the reservoir modeling, two prospective areas were observed to contain good-quality, non-tight sand bodies with good reservoir properties. The seismic inversion volumes captured the depositional trends within the Lam Formation, showing the variation from channel complexes to reservoir-quality deltaic sand bodies. Two to four infill well locations were identified, with forecasts showing positive results in the delineated prospective areas. The seismic inversion results and petrophysical rock typing, combined with core data, substantially changed the field development plan by identifying prospective areas that had not previously been identified or interpreted.
Conceptual limitations of existing gridding technologies often lead to undesirable simplifications in the modeling of structurally complex areas, and consequently to poor predictions. We present a structural modeling and gridding workflow that limits these modeling compromises.
A volume-based 3D structural model based on fault and horizon surfaces is constructed from input data that has undergone basic quality checking using a variety of techniques. The critical step in the grid creation is the definition of a flattened (‘depositional’) space that deforms the structural model mesh under mechanical constraints. A 3D ‘unstructured’ grid is created in the depositional space, based on ‘cutting’ a property-populated, regular cuboidal grid by the geological discontinuities. The tectonic consistency and better preservation of geodetic distance make the flattened space ideal for a range of property modeling approaches. The forward-deformation of the grid into true geological space tends to preserve the layer-orthogonality of the grid columns and makes the grid more suited to numerical simulation approximations. The final grid is unstructured, high quality and an accurate representation of the input structural model.
The 3D structural model, depositional space transform and grid geometries all provide valuable information on the structural quality of the input data. The stretching and deforming of the orthogonal local axes in the transformation from depositional space to geological space are used to focus further effort on structural model quality assurance (QA). The key step in generating accurate property population and simulation models is the application of QA metrics on the grid geometry; the transformation from depositional space to geological space is used to generate a set of grid properties that highlight potential structural inconsistencies or data quality issues back in the structural model. We present several examples based on a range of structurally complex models, and demonstrate the downstream impact of applying this QA workflow throughout the stages of input data validation, structural model creation and grid creation.
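The QA metrics themselves are not specified in the abstract. One natural numeric form of the "stretching and deforming of the orthogonal local axes" is the set of principal stretch factors of the depositional-to-geological mapping, i.e. the singular values of its local Jacobian; a minimal sketch (function name and thresholds hypothetical):

```python
import numpy as np

def principal_stretches(jacobian):
    """Principal stretch factors of a local depositional->geological
    transform: the singular values of its 3x3 Jacobian. Values far
    from 1 flag cells where the mapping strongly stretches or
    squeezes the originally orthogonal local axes, which can be
    mapped back onto the grid as a QA property."""
    return np.linalg.svd(np.asarray(jacobian, dtype=float), compute_uv=False)

# Identity mapping: no deformation, all stretches equal 1
s_id = principal_stretches(np.eye(3))        # -> [1. 1. 1.]

# A mapping that doubles one axis and halves another
s_aniso = principal_stretches(np.diag([2.0, 1.0, 0.5]))   # -> [2. 1. 0.5]
```

Cells whose stretch ratios exceed some tolerance would then be highlighted back in the structural model for further QA effort, in the spirit of the workflow described above.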
Ampomah, W. (Petroleum Recovery Research Center) | Balch, R. S. (Petroleum Recovery Research Center) | Ross-Coss, D. (Petroleum Recovery Research Center) | Hutton, A. (Petroleum Recovery Research Center) | Cather, M. (Petroleum Recovery Research Center) | Will, R. A. (Schlumberger Carbon Services)
This paper presents field scale reservoir characterization of the Farnsworth field unit undertaken as part of a Phase III project conducted by the Southwest Regional Partnership on Carbon Sequestration. Farnsworth Unit (FWU) is located on the northwest shelf of the Anadarko Basin. The target unit for CO2 injection, storage, and enhanced oil recovery (EOR) is an upper Morrow sandstone informally named the Morrow B Sand.
The Morrow B reservoir was deposited during early Pennsylvanian time as incised-valley-fill fluvial sands. Core and thin sections were examined to determine lithology, mineralogy, porosity types, depositional environment, and diagenetic history. Data from XRD analysis, optical and scanning electron microscopy, and microprobe analysis were compared with results from an elemental log analysis (ELAN). This information, together with additional core, well log, borehole image log, vertical seismic profile, and 3D surface seismic data, was used to characterize and subsequently create a fine-scale lithofacies-based geological model of the field. Structural modeling was based on integration of 3D seismic and compressional sonic well log data to create a velocity model converting the seismic z-axis into the depth domain. Converting domains allowed the 3D seismic data to be correlated with other depth-domain datasets, such as new and legacy well log data and core sections. Seismic attributes were able to illuminate previously unknown faults and structural elements within the field.
During the petrophysical modeling, several deterministic and stochastic techniques were compared and analyzed to ascertain which method best populates the geological properties into the model. These techniques included kriging, sequential Gaussian simulation (SGS), and Gaussian random function simulation (GRFS). A data-analysis approach was used as a quality check to reduce uncertainty in the modeling. Studies of the depositional environment provided data used in constructing porosity–permeability crossplots; this proved a useful approach to assigning permeability to the lithofacies-based geological model. The accepted static model was upscaled for dynamic reservoir simulation.
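The deterministic end-member of the techniques compared above is kriging. As a generic illustration — the study's actual covariance models and data are not given, so the exponential model and porosity values below are purely illustrative — a 1D simple-kriging estimator can be sketched as:

```python
import numpy as np

def exp_cov(h, sill=1.0, range_=10.0):
    """Exponential covariance model: C(h) = sill * exp(-3|h|/range)."""
    return sill * np.exp(-3.0 * np.abs(h) / range_)

def simple_kriging(x_data, z_data, x_est, mean=0.0):
    """Simple kriging estimate at x_est from data (x_data, z_data),
    assuming a known stationary mean and an exponential covariance.
    Solves C w = c0 for the kriging weights w."""
    x = np.asarray(x_data, dtype=float)
    C = exp_cov(np.subtract.outer(x, x))     # data-to-data covariances
    c0 = exp_cov(x - x_est)                  # data-to-estimate covariances
    w = np.linalg.solve(C, c0)
    return mean + w @ (np.asarray(z_data) - mean)

# Illustrative porosity data along a 1D transect (fractions)
x = [0.0, 5.0, 12.0]
phi = [0.15, 0.22, 0.18]
# Kriging is an exact interpolator: estimating at a data location
# returns that location's value.
est = simple_kriging(x, phi, 5.0, mean=0.2)   # -> 0.22
```

SGS and GRFS build on the same kriging system but draw stochastic realizations rather than the single smooth estimate shown here, which is why the study compared them for property population.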
The approach illustrated in this study presents an improved methodology in characterizing heterogeneous and complex reservoirs that can be applied to reservoirs with similar geological features.
Reservoir simulation engineers must go through an intense and time-consuming process to manually quality-check large amounts of data, consisting of wells, logs, and completion events spanning more than 50 years of history. It is fundamental that the data used in building reservoir simulation models be quality-controlled and cross-checked to eliminate inconsistencies during the history-matching process. This newly implemented method gives an overview of the simulation data quality workflow design and outlines the development of a state-of-the-art software package that guides engineers through quality checking as the first stage of simulation model construction. We have developed a workflow software package that covers most of the data required for building simulation models. The preparation and validation of data are conducted using tailored automated workflows that guide engineers through the quality-control steps in a streamlined manner. At every step, specific data are reviewed and validated against well-defined quality-check (QC) criteria. In case of any inconsistency, the software issues relevant error and warning messages and recommends possible remedial solutions or actions. The tool eliminates tedious, informal manual effort and results in high-quality simulation models checked and verified through formal, automated workflows. The new workflow and software reduce the checking process from weeks to days or hours and hence minimize the overall time required for history matching.
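The abstract does not disclose the package's internal QC rules. As a toy illustration of the rule-based checks it describes — the specific rules, class names, and severities below are hypothetical — error/warning-style validation of log-run data could look like:

```python
from dataclasses import dataclass

@dataclass
class LogRun:
    """A single well-log run (depths in measured depth, metres)."""
    well: str
    top_depth: float
    base_depth: float

def qc_log_runs(runs, well_td):
    """Toy rule-based QC: flag log runs with inverted depth intervals
    (error) or depths exceeding the well's total depth, TD (warning).
    Returns a list of (severity, message) tuples, mirroring the
    error/warning messages described in the text."""
    issues = []
    for r in runs:
        if r.base_depth <= r.top_depth:
            issues.append(("ERROR", f"{r.well}: inverted log interval "
                                    f"({r.top_depth} >= {r.base_depth})"))
        if r.base_depth > well_td.get(r.well, float("inf")):
            issues.append(("WARNING", f"{r.well}: log run deeper than TD"))
    return issues

runs = [LogRun("W-1", 1500.0, 1200.0),   # inverted interval -> error
        LogRun("W-2", 1000.0, 2100.0)]   # deeper than TD -> warning
issues = qc_log_runs(runs, well_td={"W-1": 2000.0, "W-2": 2000.0})
```

A real package would chain many such checks (completions vs. perforation history, rate vs. pressure consistency, etc.) into the guided workflow the abstract describes.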
Understanding and management of subsurface uncertainties has become increasingly important for oil and gas companies seeking to optimize reserve portfolios, make better field development decisions, and improve day-to-day technical operations such as well planning. In this paper, the authors developed models, workflows, and methodologies for multiple development scenarios that span the range of uncertainties for all stages of reservoir description and simulation for two fields (Field A and Field B) in Saudi Arabia. These processes are based on responses to uncertainty parameters, so as to capture and quantify the uncertainties. Uncertainty quantification for the two reservoirs was performed using an experimental design (ED) approach, taking into account seismic interpretation, geological uncertainty, and dynamic data. ED is an intelligent way of sampling parameter space by selecting combinations of predefined variables to minimize the number of "experiments" needed to characterize the behaviour of a system, and to limit the number of models to be run. As a result, robust and responsive reservoir models were constructed to predict field performance and to evaluate various future development strategies alongside better risk assessment.
Understanding and management of subsurface uncertainties has become increasingly important for oil and gas companies seeking to optimize reserve portfolios, make better field development decisions, and improve day-to-day technical operations such as well planning. Uncertainty refers to a parameter whose value is unknown; in other words, uncertainty is a lack of certainty — a state of limited knowledge in which it is impossible to describe exactly the existing state or value of a specific parameter.
In reservoir studies, uncertainties exist at every step: geophysical study followed by structural modelling, geological modelling, and dynamic flow simulation (Adepoju et al. 2009, Rivera et al. 2007). Geophysical uncertainties relate to acquisition and processing, interpretation (migration, picking, time-to-depth inversion, and seismic-to-well tie), the velocity law, etc. Geological uncertainties arise from the sedimentary concept, the geological scheme, and rock properties (heterogeneity, spatial variability, extent and orientation of sedimentary bodies, petrophysical characteristics, and phase contacts), among others. A further set of uncertainties relates to dynamic flow simulation, including fault transmissibility, extent of barriers, absolute permeability, relative permeability, Kv/Kh ratio, viscosity, PVT, well PI, etc. These uncertainties affect the ability to understand reservoir behaviour and to make reliable production forecasts and risk-free decisions (Adepoju et al. 2009, Dejean and Blanc 1999, Rivera et al. 2007).
Therefore, it is crucial to develop models for multiple development scenarios that span the range of uncertainties across all steps of reservoir description and simulation. The method able to do this is a process-based response to uncertainties. Uncertainty quantification for Field A and Field B was done using the experimental design approach. Experimental Design is a "smart way" of sampling parameter space by selecting combinations of variables; it is an established method for minimizing the number of "experiments" needed to characterize the behaviour of a system described by many parameters. The traditional statistical method requires modelling of all possible subsurface realizations and development scenarios, which is impractical owing to the large number of models to be considered. For instance, a 19-parameter case with three levels per parameter requires 3^19 = 1,162,261,467 models. Experimental Design (ED) offers a technique to limit the number of models to be run by "intelligent selection" of models that represent the full uncertainty space ("sampling"). The detailed workflow is presented below in Figure 1.
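The contrast between the full factorial and an ED-style sample can be made concrete. The paper's specific designs are not stated, so the snippet below uses Latin hypercube sampling as one illustrative space-filling design; the run count of 50 is arbitrary:

```python
import numpy as np

n_params, n_levels = 19, 3
full_factorial = n_levels ** n_params     # every level combination
# -> 1,162,261,467 runs, matching the figure quoted in the text

def latin_hypercube(n_samples, n_params, seed=0):
    """Latin hypercube sample in [0, 1]^n_params: each parameter's
    range is split into n_samples strata, and each stratum is hit
    exactly once per parameter (rows are then decorrelated by
    shuffling each column independently)."""
    rng = np.random.default_rng(seed)
    strata = np.arange(n_samples)[:, None]                # 0..n-1 per row
    u = (rng.random((n_samples, n_params)) + strata) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

design = latin_hypercube(50, n_params)    # 50 runs instead of ~1.2e9
```

Each row of `design` would then be mapped onto the physical ranges of the 19 uncertainty parameters to define one reservoir model to run.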
Huang, Jixin (Research Inst. of Petroleum Exploration and Development, PetroChina) | Fan, Zheng (Telon Hi-Tech Co. Ltd.) | Xiao, Kun (CNOOC Ltd.) | Guo, Songwei (Research Inst. of Petroleum Exploration and Development, PetroChina)
A new internal-architecture modeling method is proposed for the lateral-accretion (LA) shale beddings of point-bar sand bodies in meandering channelized reservoirs. Based on a 3D grid model of the point bar and the results of internal-architecture analysis, three ordered modeling steps were combined into a complete embedded-modeling process and its algorithmic implementation: 3D vector-field LA pattern-fitting modeling, interpolation of the LA thickness distribution under LA-surface trend control, and a local-grid-subdivision model-embedding process. The method was then validated on a typical point-bar internal-architecture model from a study block of an East China oilfield. The results show, first, that the method can establish a fine 3D model of LA shale beddings conditioned to wells; second, that the LA pattern-fitting technique copes effectively with sparse well-point information and can predict the distribution of LA between wells; and finally, that the local-grid-subdivision embedding technique provides an optimized grid description of architecture at different levels and scales.
Introduction
Point bars are the most important type of sand body in meandering channelized reservoirs. Within the point-bar sand, lateral accretion develops in the form of LA bodies, LA shale beddings, and LA surfaces. Among these, the LA shale beddings are a very important kind of barrier: they largely control waterflooding development and hence influence the distribution of remaining oil. Therefore, establishing a quantitative three-dimensional model of the internal architecture of point-bar sand bodies has practical value.
The internal-architecture model of a point bar comprises three elements: LA bodies, LA shale beddings, and LA surfaces. The LA surface is the erosional surface bounding each accretion body; the LA shale bedding is the shale layer deposited on that surface, appearing as an obliquely inserted mud wedge in cross-section and as an arc in plan view, with thickness between 0.2 m and 2 m. Point-bar LA patterns can generally be summarized into three kinds: horizontal echelon (Fig. 1A), stepped echelon (Fig. 1B), and wavy (Fig. 1C). The algorithms studied in this paper take the horizontal-echelon and stepped-echelon LA patterns as modeling objects.
The goal of internal-architecture modeling of a point bar is to embed a model of the LA shale beddings into the grid of each single point-bar model (Fig. 2). To this end, an automatic surface-fitting modeling technique is first used to characterize the spatial distribution of the LA surfaces; second, LA laminate thickness data are extracted at each well point and interpolated within the extent of the LA surfaces to establish an LA laminate-thickness model; finally, on the basis of the LA surface model and the laminate-thickness model, a local-grid-refinement approach embeds the LA shale beddings into the point-bar solid model established in the previous steps.
The paper presents examples of integrating production, geological, and geophysical data at different stages of geological modeling: input-data systematization, well-log normalization, lithology interpretation, and calibration to dynamic data. The goal of such integration is to reduce the labor required for constructing geological models and for the subsequent history matching of hydrodynamic models, compared with the traditional approach to modeling.
Geological models of oilfields are widely used not only as a tool for reserves calculation, but also as a basis for constructing reservoir simulation models. Meanwhile, for various reasons, dynamic data (well tests, PLT, liquid rates, bottom-hole pressure, watercut) are not fully accounted for in geological models. Moreover, geological modeling is a complex multistage process involving professionals from different disciplines. In the current approach, every stage of geological modeling is performed independently of the others. Such an approach allows neither control of the consistency of different kinds of data nor removal of inconsistencies at early stages of modeling. As a result, construction of geological models becomes more time-consuming, and history matching of hydrodynamic models becomes more complicated.
Thus, to improve the quality of geological models it is necessary to integrate production, geological, and geophysical data at all stages of geological modeling. By this we mean using all kinds of data in a consistent manner and cross-checking the results of every stage of modeling. Such an approach also reduces the labor cost of constructing geological models and of the subsequent history matching of hydrodynamic models. This is especially important for service companies facing the necessity of performing a large amount of work on a tight schedule.
Integration of production, geological, and geophysical data at different stages of geological modeling is an extensive subject, and the present paper does not cover it fully. Instead, it discusses practical experience of integrating different kinds of data. Note also that most of the approaches presented are automated to varying degrees, which reduces the number of errors due to the human factor.
Examples of integration of production, geological and geophysical data at different stages of geological modeling
The present paper considers examples of integrating production, geological, and geophysical data while building geological models stage by stage. Approaches applied at different stages of geological modeling of West Siberian oil fields are discussed.
Input data systematization
Service companies often face the necessity of analyzing and generalizing dissimilar input data provided by clients. For example, geophysical well logging or core analyses are usually performed by several different companies; their data formats are therefore incompatible with one another, and during data transfer duplication or even loss of information may occur. For input-data handling, an automated system was developed that systematizes input data by type of analysis, brings the data to a common format, and controls the internal consistency of the data. This makes it possible to identify and eliminate data inconsistencies in time, and to plan work on qualitative and quantitative reservoir characterization more efficiently.
The typical industry process for evaluating the economic recovery of hydrocarbon resources using geologic modeling and flow simulation is sequential, involving several disciplines. Each discipline in the chain has a desire to do an excellent job, but may spend more time than is warranted given the quality of the data available at the time.
To speed up the process while maintaining high quality, RETR is an iterative, parallel work process using a common-scale modeling approach that provides optimized reservoir models based on the available data. RETR allows an integrated team of geoscientists and engineers to extract information more efficiently from the primary data sets available. The RETR approach is a more parallel process: instead of waiting until we've built a large, detailed model before we run flow simulations, we build simpler models early to get feedback from flow simulation to guide construction of more detailed subsequent passes.
Fine-scale well data and geologic interpretations are accurately included using Effective Property Simulation (EPSIM). Effective property simulation is a fast way to integrate fine-scale geologic information directly into a flow simulation model. Its application has helped improve history matches for several carbonate reservoirs.
The RETR process begins by modeling large-scale features. We complete an initial pass through flow simulation, then add detail with each subsequent pass until we have achieved the highest-quality result possible with the data available. The idea is to learn from differences between passes, identify critical factors, incorporate additional data as they become available, and better judge when to stop working on a particular issue. Finally, because the RETR process is designed to accommodate change, we can generate multiple models quickly, thereby permitting us to look at multiple scenarios.
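EPSIM itself is proprietary and not described in detail here. For a flavor of what "effective properties" means at the common scale, the classical single-phase bounds on the effective permeability of a stack of fine-scale layers (a textbook result, not the EPSIM algorithm) can be sketched as:

```python
import numpy as np

def effective_perm_bounds(k_fine):
    """Classical bounds on the effective permeability of a stack of
    fine-scale layers: the harmonic mean governs flow perpendicular
    to the layers, the arithmetic mean governs flow parallel to them.
    Any coarse-cell effective permeability lies between the two."""
    k = np.asarray(k_fine, dtype=float)
    k_arith = k.mean()                      # parallel (along-layer) flow
    k_harm = len(k) / np.sum(1.0 / k)       # perpendicular (cross-layer) flow
    return k_harm, k_arith

# Illustrative fine-scale layer permeabilities in millidarcies
k_layers = [100.0, 10.0, 1.0]
k_perp, k_par = effective_perm_bounds(k_layers)
# The thin low-permeability layer dominates cross-layer flow,
# so k_perp is far smaller than k_par.
```

Replacing many fine layers with one coarse cell carrying such effective values is what lets fine-scale geologic detail influence a flow model without being gridded explicitly.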
The RETR work process significantly reduces cycle time by:
· focusing an integrated team on a common objective,
· using common-scale modeling to provide early feedback from flow simulation, and
· accommodating rapid model updates as additional data become available.
RETR is a process that significantly reduces cycle time for reservoir evaluation. The process was developed by a team of geoscientists and engineers at ExxonMobil. Our team's challenge was to produce quality reservoir evaluations that incorporate all available data, but significantly faster; we were looking for a step-change reduction in cycle time. The business impact of meeting this challenge was improved economics from discovery to development and fast response to change over the life cycle of a resource.