The petro-elastic model (PEM) is an integral component in the closed-loop calibration of integrated four-dimensional (4D) solutions incorporating time-lapse seismic, elastic and petrophysical rock property modeling, and reservoir simulation. Calibration of the reservoir simulation model is needed so that it is consistent not only with production history but also with the contemporaneous subsurface description characterized by time-lapse seismic. The PEM requires dry rock properties in its description, which are typically derived from mechanical rock tests. In the absence of those mechanical tests, a small-data challenge is posed: not all of the necessary data are available, yet the value of reconciling seismic attributes with simulated production remains. A seismic-inversion-constrained n-dimensional metaheuristic optimization technique is employed directly on three-dimensional (3D) geocellular arrays to determine the elastic and density properties for the PEM embedded in the commercial reservoir simulator.
Ill-posed dry elastic and density property models are considered in a field case where both the seismic inversion and the petrophysical property model constrained by that inversion exist. An n-dimensional design optimization technique is implemented to determine the optimal solution of a multidimensional pseudo-objective function composed of multidimensional design variables. This study investigates a modified particle swarm optimization (PSO) method combined with an exterior penalty function (EPF) under varied constraints, given the limited availability of constraint data. In this work, heavily penalized and reduced-order penalized metaheuristic optimization processes, in which the design variables and optimal solution are derived from 3D arrays, are examined so that constraint applicability can be quantified. While the process is examined specifically for the PEM, it can be applied to other data-limited modeling techniques.
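As a concrete illustration of the optimization machinery described above, the following minimal Python sketch combines a standard PSO update with a quadratic exterior penalty. The objective function, the constraint (x >= 1), and all parameter values are illustrative assumptions, not the paper's actual PEM misfit or constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical stand-in for the misfit between simulated and
    # inversion-derived properties; minimum at x = [3, 3, 3].
    return np.sum((x - 3.0) ** 2)

def penalty(x, r=100.0):
    # Exterior penalty: zero inside the feasible region, quadratic outside.
    # Illustrative constraint g(x) = 1 - x <= 0 (i.e., every component >= 1).
    g = 1.0 - x
    return r * np.sum(np.maximum(g, 0.0) ** 2)

def pso(n_dim=3, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    # Standard global-best PSO minimizing the penalized pseudo-objective.
    pos = rng.uniform(0.0, 10.0, (n_particles, n_dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([objective(p) + penalty(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1 = rng.random(pos.shape)
        r2 = rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([objective(p) + penalty(p) for p in pos])
        improved = f < pbest_f
        pbest[improved] = pos[improved]
        pbest_f[improved] = f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest

best = pso()
```

In the field application each "particle" position would instead hold the unknown dry rock parameters sampled over the 3D geocellular arrays, but the penalized-PSO update itself is the same.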
Integration of time-lapse seismic data into the dynamic reservoir model is an efficient way to calibrate reservoir-parameter updates. The choice of the metric that measures the misfit between observed data and the simulated model has a considerable effect on the history matching process, and hence on the optimal ensemble of models obtained. History matching using 4D seismic and production data simultaneously is still a challenge because of the different nature of the two types of data (time series versus maps or volumes).
Conventionally, the misfit formulation is least squares, which is widely used for production data matching. Distance-based objective functions designed for 4D image comparison have been explored in recent years and have proven reliable. This study explores the history-matching process by introducing a merged objective function combining the production and 4D seismic data. The proposed approach is to make these two types of data (well and seismic) comparable within a single objective function to be optimised, thereby avoiding the question of weights. An adaptive evolutionary optimisation algorithm has been used for the history matching loop. Local and global reservoir parameters are perturbed in this process, including porosity, permeability, net-to-gross, and fault transmissibility.
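The weight-free merging idea can be sketched as follows: if each least-squares misfit term is normalized by the spread of its own observations, the production and seismic terms become dimensionless and can simply be summed. The normalization choice and the toy data below are assumptions for illustration; the paper's actual distance measure for 4D maps is not reproduced here.

```python
import numpy as np

def normalized_misfit(observed, simulated):
    # Dimensionless least-squares misfit: scaled by the variance of the
    # observations themselves, so terms from different data types are
    # directly comparable and no explicit weighting is required.
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative production time series (e.g., rates) ...
prod_obs = np.array([100.0, 95.0, 80.0, 60.0])
prod_sim = np.array([98.0, 90.0, 82.0, 65.0])
# ... and an illustrative 4D seismic attribute map
map_obs = np.array([[0.1, 0.3], [0.2, 0.6]])
map_sim = np.array([[0.1, 0.2], [0.3, 0.5]])

# Merged objective: a weight-free sum of the two dimensionless terms.
merged = normalized_misfit(prod_obs, prod_sim) + normalized_misfit(map_obs, map_sim)
```

A perfect match drives each term to zero, and neither data type can dominate simply because of its physical units or magnitude.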
This combined production and seismic history matching has been applied to a UKCS field; it shows that an acceptable production data match is achieved while honouring saturation information obtained from the 4D seismic surveys.
The objective of this paper is to propose an alternative data analysis approach for working with microseismic data. Modern machine learning techniques such as MWCA (Multiway Component Analysis) and TD (Tucker Decomposition) provide the capability to work efficiently with complex, high-dimensional microseismic data structures. Using this method, it was possible to recover hidden information about the signal, compress the data, and gain insight into fractures without resorting to conventional, time-consuming simulations. It is therefore an important addition to hydraulic fracturing quality assessment, offering a cost-effective technique with a greater degree of automation than conventional methods.
The approach was tested on synthetic data and on relevant real microseismic data provided by a service company. The data were arranged as a third-order tensor whose modes are seismic event times, receiver locations, and event locations. The tensor was then decomposed into a core tensor and three factor matrices by means of a special form of TD called HOSVD (Higher-Order Singular Value Decomposition), a multidimensional decomposition used to extract low-rank approximations of tensors. The MWCA technique was utilized to impose constraints on the TD. HOSVD showed potential as a tool for rapid fracture analysis through inspection of the decomposed tensor structure. Additionally, the technique reduced the original model by 73% (supercompression).
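The decomposition step can be sketched with a minimal numpy implementation of truncated HOSVD on a synthetic third-order tensor. The tensor dimensions and truncation ranks below are illustrative assumptions; the 73% figure in the study refers to its own data, but the compression ratio is computed the same way, as one minus the ratio of stored entries to original entries.

```python
import numpy as np

rng = np.random.default_rng(1)

def unfold(T, mode):
    # Mode-n unfolding: the chosen mode becomes the rows,
    # all remaining modes are flattened into the columns.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # Truncated HOSVD: each factor matrix holds the leading left singular
    # vectors of the corresponding mode unfolding; the core tensor is T
    # contracted with the transposed factors along every mode.
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode
        )
    return core, factors

# Synthetic 3rd-order tensor with modes
# (event times, receiver locations, event locations) — sizes are assumptions.
T = rng.standard_normal((40, 30, 20))
core, factors = hosvd(T, ranks=(10, 8, 6))

original_size = T.size
compressed_size = core.size + sum(U.size for U in factors)
compression = 1.0 - compressed_size / original_size
```

Storing only the core and the three factor matrices replaces the full tensor, which is where the "supercompression" reported in the study comes from.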
The proposed workflow is general and highly applicable to various plays. Since applications of MWCA and TD are still emerging, future enhancements to this methodology are expected. In turn, these will reveal further insights from microseismic data, supporting optimal fracturing design and improved field management.
Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, the difference in seismic properties across vintages enables the detection of production-induced spatio-temporal changes in saturation properties and structure. The main objectives are (1) to identify bypassed pay zones in the time-lapse seismic data for the deepwater Amberjack field, located in the Gulf of Mexico, (2) to confirm the identified bypassed pay zones in the results of reservoir simulation, and (3) to recommend well planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow incorporating seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). An embedded petro-elastic model (PEM) in the reservoir simulator is then used to combine spatial dry rock properties with saturation properties to compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. In the absence of the requisite dry rock properties for the PEM, a small data engine determines these absent properties using metaheuristic optimization techniques. Specifically, two particle swarm optimization (PSO) applications, each including an exterior penalty function (EPF), are modified, resulting in the development of nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry rock bulk modulus, shear modulus, and density) necessary for dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation results. Afterward, a graphic-enabled method was devised to determine the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference. Its results are compared to unsupervised learning (k-means clustering and hierarchical clustering). From the seismic data, bypassed pay locations can be identified, and they are confirmed by reservoir simulation after conducting a seismic-driven history match. Finally, infill wells are planned and then modeled in the reservoir simulator.
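The unsupervised alternative to the graphic-enabled thresholding can be sketched with a plain one-dimensional k-means (Lloyd's algorithm) applied to P-impedance-difference samples. The two-population synthetic data and the midpoint-of-means threshold below are illustrative assumptions, not the study's method or values.

```python
import numpy as np

rng = np.random.default_rng(2)

def kmeans_1d(values, k=2, iters=50):
    # Plain Lloyd's algorithm on a 1-D attribute. With k = 2 it splits the
    # P-impedance-difference population into two classes; the boundary
    # between the cluster means then serves as a discrimination threshold.
    centers = np.quantile(values, np.linspace(0.25, 0.75, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return centers, labels

# Synthetic delta-Ip samples: a low-change (non-reservoir / bypassed) population
# and a high-change (produced reservoir) population — values are illustrative.
delta_ip = np.concatenate([
    rng.normal(0.0, 0.05, 500),   # little 4D response
    rng.normal(0.6, 0.10, 200),   # strong 4D response
])
centers, labels = kmeans_1d(delta_ip)
threshold = centers.mean()
```

Cells below the threshold show little impedance change between vintages and are candidates for the non-reservoir/bypassed-pay class that the simulation results are then checked against.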
Houston-based Surge Energy drilled the Medusa Unit C 28-09 3AH well in the Midland Basin to a TMD of 24,592 ft, with a total horizontal displacement of 17,935 ft, or 3.4 miles. This paper discusses a project with the objective of leveraging prestack and poststack seismic data in order to reconstruct 3D images of thin, discontinuous, oil-filled packstone pay facies of the Upper and Lower Wolfcamp formation. A new technique that analyzes scanning electron microscope (SEM) images of formation samples has been used to measure porosity and total organic carbon (TOC) in the Wolfcamp Shale of the Delaware Basin in west Texas. The integration of microseismic data with 3D seismic attributes, and well log and completions data is used to understand geomechanical rock properties. Data mining for production optimization in unconventional reservoirs brings together data from multiple sources with varying levels of aggregation, detail, and quality.
A challenging problem of automated history-matching work flows is ensuring that, after applying updates to previous models, the resulting history-matched models remain consistent geologically. The Oklahoma City independent has a new-look portfolio and new operational and financial priorities. And now it has enlisted an energy research firm to leverage advanced analytics and machine learning to help get the most out of its assets. The objective of this case study is to describe a specific approach to establishing an exploration strategy at the initial stage on the basis of not only uncertainty reduction, but also early business-case development and maximization of future economic value.
Saluja, Vikas | Singh, Uday | Ghosh, Aninda | Prakash, Puja | Kumar, Ravendra | Verma, Rajeev (Oil & Natural Gas Corporation Ltd.)
The case study presented here demonstrates an innovative fault delineation workflow on a 3D seismic volume in the B-173A Field of the Heera Panna Bassein (HPB) Sector, Western Offshore Basin, India. B-173A is located 50 km west of Mumbai at an average water depth of about 50 m. The field was discovered in 1992 and put on production in August 1998. It contains two hydrocarbon-bearing zones: the gas-bearing Mukta Formation (Lower Oligocene carbonates) and the oil-bearing Bassein Formation (Middle to Upper Eocene carbonates).
The present study is an extended workflow on advanced seismic interpretation using spectral decomposition and RGB blending for fault delineation. Iso-frequency volumes are extracted from the relative acoustic impedance data instead of from the seismic data itself.
The workflow, designed for effective fault delineation, consists of spectral decomposition of the relative acoustic impedance data followed by RGB blending of discontinuity attributes computed from the different iso-frequency volumes.
It is observed that the RGB-blended volume of discontinuity attributes provides more convincing fault delineation than traditional discontinuity attributes alone.
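The decomposition-and-blend sequence can be sketched in numpy: a windowed single-frequency DFT stands in for spectral decomposition of a relative-impedance trace, and three iso-frequency amplitude attributes are normalized and stacked as red, green, and blue channels. The frequencies, window length, and synthetic trace are assumptions for illustration only.

```python
import numpy as np

def iso_frequency_amplitude(trace, freq, dt, win=32):
    # Sliding-window DFT magnitude at a single frequency — a minimal
    # stand-in for extracting one iso-frequency volume from the
    # relative acoustic impedance data.
    t = np.arange(win) * dt
    kernel = np.exp(-2j * np.pi * freq * t) * np.hanning(win)
    return np.abs(np.convolve(trace, kernel[::-1], mode="same"))

def rgb_blend(a, b, c):
    # Normalize three iso-frequency attributes to [0, 1] and stack them
    # as R, G, B channels for co-rendering.
    channels = []
    for x in (a, b, c):
        span = x.max() - x.min()
        channels.append((x - x.min()) / span if span > 0 else np.zeros_like(x))
    return np.stack(channels, axis=-1)

dt = 0.004                                   # 4 ms sample interval (assumed)
t = np.arange(0, 1.0, dt)
trace = np.sin(2 * np.pi * 15 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

rgb = rgb_blend(
    iso_frequency_amplitude(trace, 15.0, dt),
    iso_frequency_amplitude(trace, 30.0, dt),
    iso_frequency_amplitude(trace, 40.0, dt),
)
```

In the field workflow the same blending is applied to discontinuity attributes computed from each iso-frequency volume, so that faults expressed at different frequencies light up in different colors of the composite image.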
Until recently, reservoir characterization methods in the industry were limited to the use of seismic technologies in oil and gas exploration and had a very constrained role in production and development. In the past, using characterization for field development was considered a very perilous task. Technological advancements, together with a retreat from this risk-averse mindset, have significantly expanded the application of reservoir characterization. Today, reservoir characterization is the basis of any development plan made for a commercial field.
Development of 3D reservoir modeling techniques to generate field development plans (FDPs) marked a step-change in reservoir characterization methods. The introduction of geostatistics and numerical simulation made it possible to build detailed models for generating realistic field development scenarios. This is the state-of-the-art seismic-to-simulation method of reservoir characterization used in FDPs today. However, the struggle to estimate reservoir properties spatially away from the wells continues.
Surface seismic data provide excellent areal coverage but do not provide the vertical resolution required for a fine-scale reservoir model. Geostatistical methods reduce the uncertainty in spatial distribution of petrophysical properties from pseudo-point supports (wells) but are not calibrated spatially between the wells. Correspondingly, the fluid saturation distribution and the parameters used in dynamically calculating the same during numerical simulation are not calibrated in the interwell space.
This paper details necessary data acquisitions and methods of calibration of 3D reservoir model to reduce uncertainty in the interwell space. The data acquisition methods have been available for some time, but have rarely been electronically incorporated in the 3D reservoir model and have been largely used to analytically guide the modeling and its inferences. A logical way of interpreting the results of acquisitions and calibrating the 3D reservoir model cell-by-cell is detailed in this paper.