When key geological scenario uncertainties, captured in multiple conceptual models, are combined with continuous parameters, the evaluation of a representative sample set quickly becomes unmanageable, laborious, and too time-consuming to execute. A workflow is presented that enables users to model the conceptual as well as the parametric uncertainties of the reservoir easily, without any complex scripting. The chain of models for all concepts is presented in one view, providing an overview of the key differences between the concepts used. An ensemble of geologically sound samples can be created that takes into account parameter dependencies and the probabilities of concepts, and the chain of models per concept can easily be (re)executed.
A case study is presented that consists of multiple concepts based on different hierarchical stratigraphic models in combination with different fault models, each with its own fluid model (defined contacts per compartment), grid model (sub-layering and areal resolution) and rock property model. Volumetric calculations are run on an ensemble to obtain static model observables such as GRV, pore volume, oil-in-place, etc., reported for multiple sub-regions of the model in combination with a lease boundary. (When coupled with dynamic simulation, observables such as ultimate recovery, break-through timing, etc. could also be obtained.) As thousands of realizations were run concurrently, run time was reduced from weeks to hours. Results reveal the distribution and dependency of observables such as GRV on top-structure-depth uncertainty and contact-level uncertainty. For in-place volumes, the full suite of concepts and other parametric uncertainties, including the stochastic uncertainties (i.e. seed), is analyzed. This also enables the identification of the key uncertainties that impact equity the most, which can be of great commercial value during equity negotiations. This workflow demonstrates how, with the power of cloud computing, rigorous evaluation of multiple concepts combined with many parametric uncertainties can be achieved within practical turn-around times. As such, it overcomes the prohibitive hurdles of the past, which often led to simplifications made to save time and effort. The result is better decision quality in resource development decisions.
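The per-realization volumetric step can be illustrated with a minimal sketch. The parameter ranges below are purely illustrative, not the case study's actual inputs: each realization draws uncertain inputs and reports a stock-tank oil-in-place value, from which ensemble percentiles follow.

```python
import random

def stoiip(grv_m3, ntg, porosity, sw, bo):
    """Stock-tank oil initially in place for one realization (m3)."""
    return grv_m3 * ntg * porosity * (1.0 - sw) / bo

random.seed(0)
ensemble = []
for _ in range(1000):
    ensemble.append(stoiip(
        grv_m3=random.uniform(4e8, 6e8),      # gross rock volume (illustrative)
        ntg=random.uniform(0.6, 0.8),         # net-to-gross
        porosity=random.uniform(0.18, 0.24),  # average porosity
        sw=random.uniform(0.2, 0.35),         # water saturation
        bo=random.uniform(1.1, 1.3),          # oil formation volume factor
    ))

# Sort once and read off low/mid/high cases from the empirical distribution.
ensemble.sort()
p10, p50, p90 = (ensemble[int(len(ensemble) * q)] for q in (0.1, 0.5, 0.9))
```

In the paper's workflow each sample additionally carries a concept label, so the same percentile read-off can be reported per concept and per sub-region.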
A 3D model is a valuable tool in reservoir management, provided it is representative of reservoir dynamics. Traditional history matching mainly focuses on reproducing reservoir behavior at the well scale, and a good match is not always representative of fluid movements in the reservoir. The proposed approach for 3D model validation combines and compares the results of integrated production analysis, in particular flow path identification, with history matching by using streamline technology. Streamlines speed up the comparison process, especially in complex 3D models.
The workflow is based on a massive Production Data Analysis (PDA) where geological and dynamic data are integrated to identify preferential paths followed by the different fluid phases during the producing life of the field. The main result is the Fluid Path Conceptual Model (FPCM) where aquifer and injected water movements are clearly identified. Once the flooded areas are detected, streamlines are traced on the history matched model in order to easily compare the simulated connections with hard information from PDA. Actions to improve the model representativeness are suggested and integrated in an iterative tuning process.
This paper presents the results of the methodology applied to two complex fields with different injection strategies. FPCMs resulting from PDA provided a powerful boost to drive the history match and speed up the whole process. Priority was given to reproducing the identified preferential paths rather than to perfectly matching well production data (which can also be affected by allocation uncertainties) by means of locally unrealistic adjustments.
Streamlines were run on Intersect simulations, proving to be a fast and powerful tool for the visualization and understanding of fluid movements in the 3D model. Since streamlines are used as a visualization tool and are traced on a corner-point geometry grid using fluxes provided by the reservoir simulation, the reliability of the simulation output is preserved.
Once the model is representative of the real field behavior, it can be used as a predictive tool in reservoir management to optimize the current injection strategy, promoting the most efficient injectors.
This paper presents a diagnostic workflow to understand and implement rock and fluid modeling in a diagenetically heterogeneous and hydrodynamically pressured Middle East carbonate field. The workflow allows interactive field data integration and provides guidance for reservoir property distribution and fluid contact generation in order to improve reserves estimation and forecasting. The workflow is useful to a reservoir modeler in a QA/QC role, and in this case it proved particularly applicable in an organization with constrained resources during the farm-in process. The workflow runs on numerical methods within the static model to avoid database discrepancies during the diagnostic process. Using the core (CCAL, SCAL), log and pressure database, the geoscientist can assess subsurface modeling outputs from the simplest to the more complex deterministic scenarios. The process aims to minimize the discrepancy between data input and model output while continuously honoring the data, maintaining realistic correlations (e.g. between static permeability and water saturation) and respecting inherent uncertainty.
Using a data-rich Middle East carbonate reservoir, the pre- and post-diagnostic comparison of 3D modeled reservoir properties to the input data is demonstrated. The diagnostic steps have helped to understand potential subsurface scenarios and thus minimize the post-exercise discrepancy. The value of the workflow is its ability to pinpoint the key uncertainties in rock and fluid modeling from the field's vast dataset in a shorter diagnostic time. The application of the workflow in this carbonate reservoir case study underscores the importance of geological and property-driven rock type classification and its 3D distribution in matching the water saturation profile. This proved particularly challenging in this case study due to the field's compartmentalization and fluid-contact scenario.
In this work we discuss the successful application of our previously developed automated scenario reduction approach to life-cycle optimization of a real field case. The inherent uncertainty in the description of reservoir properties motivates the use of an ensemble of model scenarios to achieve an optimized, robust reservoir development strategy. To accurately span the range of uncertainties it is imperative to build a relatively large ensemble of model scenarios, but the size of the ensemble is directly proportional to the computational effort required for robust optimization. For high-dimensional, complex field case models this implies that a large ensemble of model scenarios which accurately captures the inherent uncertainties would nevertheless be computationally infeasible for robust optimization. One way to circumvent this problem is to work with a reduced subset of model scenarios. Methods based on heuristics and ad-hoc rules exist to select this reduced subset; however, in most cases the optimal number of model realizations must be known upfront. An excessively small number of realizations may yield a subset that does not capture the span of the uncertainties present, leading to sub-optimal optimization results. This raises the question of how to effectively select a subset containing an optimal number of realizations that both captures the uncertainties present and allows for computationally efficient robust optimization. To answer this question we have developed an automated framework for selecting the reduced ensemble, which has been applied to an original ensemble of 300 equiprobable model scenarios of a real field case. The methodology relies on the fact that, ideally, the distance between the cumulative distribution functions (CDF) of the objective function (OF) of the full and reduced ensembles should be minimal.
This allows the method to determine the smallest subset of realizations that both spans the range of uncertainties and provides an OF CDF that is representative of the full ensemble, based on a statistical metric. In this real field case application we optimize the injection rates throughout the asset's life cycle, with expected cumulative oil production as the OF. The newly developed framework selected a small subset of 17 model scenarios out of the original ensemble, which was used for robust optimization. The optimal injection strategy achieved an average increase of 6% in cumulative oil production with a significant reduction, approximately 90%, in computational effort. Validation of this optimal strategy over the original ensemble led to very similar improvements in cumulative oil production, highlighting the reliability and accuracy of our framework.
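The CDF-distance criterion behind the reduction can be sketched as follows. This is a small synthetic stand-in for the field ensemble, and the greedy loop with a Kolmogorov-Smirnov distance is an illustrative choice, not necessarily the authors' exact metric or selection scheme:

```python
import random

def ks_distance(sample_a, sample_b):
    """Max vertical distance between the empirical CDFs of two samples."""
    grid = sorted(set(sample_a) | set(sample_b))
    cdf = lambda s, x: sum(v <= x for v in s) / len(s)
    return max(abs(cdf(sample_a, x) - cdf(sample_b, x)) for x in grid)

random.seed(1)
# Synthetic objective-function value (e.g. cumulative oil) per realization.
full = [random.gauss(100.0, 15.0) for _ in range(60)]

# Greedy reduction: grow the subset with whichever member most lowers the
# CDF distance, until the distance falls below a chosen tolerance.
subset, tol = [], 0.08
while not subset or ks_distance(full, subset) > tol:
    best = min((m for m in full if m not in subset),
               key=lambda m: ks_distance(full, subset + [m]))
    subset.append(best)
```

Robust optimization is then run over `subset` only, and the optimized strategy is validated against `full`, mirroring the validation step reported above.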
Yonebayashi, Hideharu (INPEX CORPORATION) | Iwama, Hiroki (INPEX CORPORATION) | Takabayashi, Katsumo (INPEX CORPORATION) | Miyagawa, Yoshihiro (INPEX CORPORATION) | Watanabe, Takumi (INPEX CORPORATION)
CO2 injection is one of the most widely applied enhanced oil recovery (EOR) techniques; moreover, it is expected to contribute to climate change mitigation by storing CO2 in the reservoir. However, CO2 is well known to accelerate asphaltene precipitation, which often deteriorates production. To understand them, the in-situ asphaltene deposits unevenly distributed in a composite carbonate core during a CO2 flood test under reservoir conditions were investigated through a numerical modelling study.
Tertiary-mode CO2 core flood tests were performed. The core holder was placed vertically in an oven to maintain reservoir temperature and to avoid vertical segregation. A composite core consisting of four Ø1.5" × L2.75" plug cores, which had a similar porosity range but slightly varied air permeabilities, was retrieved from the core holder after the flooding test. The remaining hydrocarbon was extracted by the Dean-Stark method, and heptane-insoluble materials were extracted from each plug core via the IP-143 method to observe the distribution of asphaltene deposits. The variation of asphaltene mass among the plug cores was investigated to explain its mechanism thermodynamically.
The core flood test achieved a certain additional oil recovery after 15 pore volumes of CO2 injection without any unfavorable differential pressure. The remaining asphaltene mass in each plug core revealed a trend of more asphaltene collected from the inlet-side core. We assumed a scenario to explain the uneven asphaltene distribution by incorporating the vaporized-gas-drive and CO2-condensing mechanisms. Namely, asphaltenes deposited immediately when pure CO2 contacted the oil, and contact between purer CO2 and oil occurred more frequently in the inlet-side core. To reproduce the scenario, a cubic-plus-association (CPA) model was generated to estimate asphaltene precipitation behavior as the injected gas composition varied. In the first plug core, purer CO2 gas was considered to contact fresh reservoir oil compared with the downstream cores, which might see less pure CO2 because of its condensation. The light-intermediate hydrocarbon gas vaporized by CO2 was also considered to emphasize the trend of more asphaltene deposits in the upstream-side cores. The CPA model revealed phenomena consistent with, and supporting, the scenario.
Al-Jenaibi, Faisal (ADNOC - Upstream) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Bogachev, Kirill (Rock Flow Dynamics)
The application of intelligent algorithms that use clever simplifications and methods to solve computationally complex problems is rapidly displacing traditional methods in the petroleum industry. The latest forward-thinking approaches in history matching and uncertainty quantification were applied to a dynamic model with an unknown permeability model. The original perm-poro profile was constructed from synthetic data so that the Assisted History Matching (AHM) approach could be compared to the exact solution. It is assumed that relative permeabilities, endpoints, and all parameters other than absolute permeability cannot be modified to match oil/water/gas rates, gas-oil ratio, water injection rate, watercut and bottomhole pressure.
The standard approach to matching a model via permeability variation is to split the grid into several regions. However, this process amounts to guesswork, as it is unclear in advance how to select the regions, and geological prerequisites for such a splitting usually do not exist. Moreover, the values of permeability and porosity in different grid blocks are correlated; changing these values independently for each region distorts the correlations or makes the model unphysical.
The proposed alternative involves the decomposition of the permeability model into spectrum amplitudes using the Discrete Cosine Transformation (DCT), a form of Fourier transform. The sum of all DCT basis functions, weighted by their amplitudes, reproduces the original property distribution. Constructing an uncertain permeability model typically involves subjective judgment and several optimization runs to build the uncertainty matrix. The proposed multi-objective Particle Swarm Optimization (PSO), however, helps to reduce randomness and to find, in fewer runs, optimal solutions not dominated in any objective. Further tuning of the Flexi-PSO algorithm is performed on its constituent components, such as swarm size, inertia, nostalgia, sociality, damping factor, neighbor count, neighborliness, the proportion of explorers, egoism, community and relative critical distance, to increase the speed of convergence. Additionally, a clustering technique such as Principal Component Analysis (PCA) is suggested as a means to reduce the dimensionality of the space of resulting solutions while ensuring the diversity of the selected cluster centers.
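A toy 1-D version of the DCT parameterization idea (not the authors' implementation, which works on the full 3D grid) shows how a permeability profile collapses to a handful of amplitudes that an optimizer can perturb, with high frequencies truncated:

```python
import math

def dct(x):
    """Orthonormal DCT-II: profile -> spectrum amplitudes."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n) for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def idct(c):
    """Inverse of the orthonormal DCT-II: amplitudes -> profile."""
    n = len(c)
    return [
        c[0] / math.sqrt(n)
        + sum(math.sqrt(2.0 / n) * c[k] * math.cos(math.pi * (i + 0.5) * k / n)
              for k in range(1, n))
        for i in range(n)
    ]

perm = [120, 135, 150, 140, 90, 60, 55, 70, 110, 130, 145, 150]  # mD, toy profile
amps = dct(perm)
kept = amps[:4] + [0.0] * (len(amps) - 4)  # optimizer tunes only 4 amplitudes
smooth = idct(kept)                        # smooth, low-dimensional reconstruction
```

Because the first (DC) amplitude carries the mean, the truncated reconstruction preserves average permeability while discarding grid-block-scale detail, which is what keeps perm-poro correlations physically plausible during matching.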
The presented set of methods helps to achieve a qualitative and quantitative match with respect to any property, reduce the number of uncertainty parameters, and set up a generic and efficient approach towards assisted history matching.
This work presents a new open-access carbonate reservoir case study that uniquely considers the major uncertainties inherent to carbonate reservoirs, using one of the most prolific aggradational parasequence carbonate formations in the U.A.E., the Late Barremian Upper Kharaib Mb., as an analogue. The ensemble considers a range of interpretational scenarios and geomodelling techniques to capture the main components of its reservoir architectures, stratal geometries, facies, pore systems, diagenetic overprints and wettability variations across its platform-to-basin profile.
Fully anonymized data from 43 wells across 22 fields in the Bab Basin, U.A.E., from different geo-depositional positions and heights above the free water level (FWL), specified to capture multiple structural positions, within an area of 36,000 km2 were used. The data comprise a full suite of open-hole logs and core data which have been anonymized, rescaled, repositioned and structurally deformed; FWLs were normalized and the entire model was placed in a unique coordinate system. Our petrophysical model captures the geological setting and reservoir heterogeneities of the selected fields, but now at a manageable scale.
The novelty of this work has been to create a semi-synthetic, open-access carbonate reservoir case study.
Valencia, Juan D. (Universidad Nacional de Colombia, Exergy - Modeling and Analytics) | Mejía, Juan M. (Universidad Nacional de Colombia) | Ocampo, Alonso (GaStimTechnologies) | Restrepo, Alejandro (Equion Energía)
This paper addresses the numerical simulation of the chemically enhanced gas injection technology (ChEGas-EOR) at core and reservoir scales. In this technique, a liquid chemical solution with engineered properties is sprayed along with the gas stream; the mist travels through the wellbore and is then introduced into the reservoir. Previous lab tests and pilot studies in light and intermediate oil reservoirs indicate that the application of ChEGas-EOR reduces operational costs, increases the chemical penetration radius and decreases the retention rate in the rock. However, the associated uncertainty is still too high to deploy this process at a productive scale. In this work we use a previously developed phenomenological model to build a tool that assists in the design and evaluation of chemical gas EOR operations, aiming to reduce the uncertainties and optimize oil recovery.
We developed a mathematical model based on the most important transport and surface phenomena, including non-equilibrium mass transfer between phases during the interception of the chemical solution droplets by the liquid phases. The active chemical concentration in the miscible liquid phases is much lower than in liquid-based chemical injection operations; as a consequence, the dissolution and adsorption rates of active chemicals on the reservoir rock are slow. The model is based on the extended black-oil formulation coupled to local mass balance equations for the active chemicals. Non-equilibrium mass transfer processes are represented with interception, dissolution and first-order kinetic sorption models.
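The first-order kinetic sorption term can be sketched with an explicit time-stepping toy model. The rate constants here are hypothetical, not the paper's fitted values: dissolved chemical of concentration C exchanges with an adsorbed mass q at rate k(Kd·C − q), so the system relaxes slowly toward the linear sorption equilibrium q = Kd·C.

```python
def simulate_sorption(c0, kd=0.5, k=0.2, dt=0.1, steps=200):
    """Explicit-Euler integration of dq/dt = k*(kd*c - q), with mass
    moving between solution (c) and rock (q). All values illustrative."""
    c, q = c0, 0.0
    history = []
    for _ in range(steps):
        rate = k * (kd * c - q)  # first-order approach to equilibrium
        q += rate * dt           # adsorbed on rock
        c -= rate * dt           # removed from solution (mass conserved)
        history.append((c, q))
    return history

hist = simulate_sorption(c0=1.0)
c_end, q_end = hist[-1]
```

The small rate constant k reflects the paper's observation that slow dissolution and adsorption justify a non-equilibrium (kinetic) rather than instantaneous-equilibrium treatment.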
The model was adjusted and then validated using experimental data from core-flooding tests, and good agreement of the simulation results with experimental observations was obtained. The model can predict the relevant behavior of the chemical dispersed in the gas phase in porous media. Well injection simulations at reservoir scale, using the parameters matched from laboratory data, also reproduced pilot field results. Simulation experiments predict that the ChEGas-EOR process can substantially increase the oil recovery factor.
For the first time, a model for dispersed chemical injection for EOR applications has been developed and validated at core and reservoir scales. The simulation model allows the evaluation of this technology at different scales; therefore, it can be used to optimize operating conditions and perform sensitivity analyses for field applications.
In this paper, an approach to multivariate static and dynamic modeling is considered using the example of an offshore field discovered in 2017. Based on the limited volume of information, a quantitative and qualitative description of the uncertainties, included further in the 3D modeling, is made. This model is proposed as a tool for prompt decision making when implementing a fast-track project with limited time between the exploration and pre-FEED stages.
The traditional trial-and-error approach to history matching, in which engineers control each uncertain parameter to obtain an accurate model, can be quite time consuming and inefficient. Automatic history matching (AHM), assisted by computers, is an efficient process that controls a large number of parameters simultaneously with an algorithm integrating a static model with dynamic data to minimize a misfit and improve reliability. It helps to reduce simulation run time as well.
Particle Swarm Optimization (PSO) is a population-based stochastic algorithm that can explore the parameter space combined with a least-squares single objective function. The AHM process can adopt parameterization and realization methods to reduce the complexity of the inverse problem. In this study, realizations of various reservoir properties such as porosity, net-to-gross, relative permeability, horizontal and vertical permeability, and aquifer size were chosen as controls throughout the AHM. History matching was conducted to validate the efficiency of each method. Guidelines for optimized AHM with a stochastic algorithm are also discussed.
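The PSO update at the heart of such an AHM loop can be sketched as below, with a two-parameter quadratic misfit standing in for the reservoir-simulation call; the inertia and acceleration coefficients are illustrative, not the study's settings.

```python
import random

def misfit(params, target=(0.3, 0.7)):
    """Least-squares stand-in for the simulated-vs-history mismatch."""
    return sum((p - t) ** 2 for p, t in zip(params, target))

random.seed(42)
n_particles, n_iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = [[random.random(), random.random()] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]              # each particle's best-known position
gbest = min(pos, key=misfit)[:]          # swarm's best-known position

for _ in range(n_iters):
    for i in range(n_particles):
        for d in range(2):
            # Velocity blends inertia, pull toward personal best, and
            # pull toward the global best (the standard PSO update).
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if misfit(pos[i]) < misfit(pbest[i]):
            pbest[i] = pos[i][:]
            if misfit(pos[i]) < misfit(gbest):
                gbest = pos[i][:]
```

In the actual AHM, `misfit` would launch a simulation run per particle, which is why reducing the number of control parameters matters so much for turnaround time.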
The realization and parameterization methods improved matching results in a full-field application, resulting in a reduced misfit in less time. A stochastic algorithm generates multiple models to deduce control parameters that reduce the misfit. In this study we found that PSO converged effectively with updated control parameters. The optimized AHM improved the accuracy of the full-field model, although some misfit remained in the match to bottomhole pressure.
We found that updating too many parameters makes the problem difficult to solve, while using too few leads to false convergence. In addition, since simulation run time is critical, a full-field simulation model with reduced computational overhead is beneficial.
In this study, we observed that PSO was an efficient algorithm for updating control parameters to reduce the misfit. Using parameterization and realization as assisting methods helped to find better results. Overall, this study can be used as a guideline for optimizing the history matching process.