Sun, Qian (Petroleum Engineering, Texas A&M University at Qatar) | Zhang, Na (Division of Sustainable Development, College of Science and Engineering, Hamad Bin Khalifa University) | Alyafei, Nayef (Petroleum Engineering, Texas A&M University at Qatar) | Wang, Yuhe (Petroleum Engineering, Texas A&M University at Qatar) | Fadlelmula, Mohamed (Petroleum Engineering, Texas A&M University at Qatar)
Reservoir simulation is commonly performed on upscaled versions of complex geological models. The upscaling process introduces a principal challenge in accurately simulating two-phase fluid dynamics in porous media; to tackle it, relative permeability must be upscaled accurately. In this paper, a numerical method based on the mimetic finite difference method (MFD) and digital rock analysis (DRA) is proposed for relative permeability upscaling. The MFD is first validated against two cases with exact pressure solutions. Then, the relative permeability of the digital rock (small element) is calculated using pore network modeling. The small elements are combined into larger models of different sizes (4×4×4, 6×6×6, 8×8×8, and 10×10×10 elements). Finally, the accuracy of the proposed method is verified by comparing the simulated results for the different sizes with those of the original model. The results show that the MFD can solve multi-phase flow scenarios with high accuracy.
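The flow-based idea underlying such upscaling workflows can be illustrated with a minimal single-phase sketch (a toy example of our own, not the paper's MFD implementation): a block of fine-scale permeabilities is collapsed to one effective value by harmonic averaging along the flow direction and arithmetic averaging across it.

```python
# Minimal sketch (toy example, not the paper's MFD method):
# flow-based single-phase upscaling of a block of fine-scale permeabilities.
import numpy as np

def upscale_kx(k):
    """Effective kx: harmonic mean along x (cells in series) for each
    (y, z) flow tube, then arithmetic mean over the tubes (in parallel)."""
    k_tube = k.shape[0] / np.sum(1.0 / k, axis=0)
    return float(np.mean(k_tube))

# a 4x4x4 "small element" of fine-scale permeabilities in mD
fine = np.random.default_rng(0).uniform(10.0, 500.0, size=(4, 4, 4))
print(upscale_kx(fine))
```

By the Wiener bounds, the result always lies between the overall harmonic and arithmetic means of the block, which is a convenient sanity check for any upscaling routine.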
When key geological scenario uncertainties, captured in multiple conceptual models, are combined with continuous parameters, the evaluation of a representative sample set quickly becomes unmanageable, laborious, and too time-consuming to execute. A workflow is presented that enables users to easily model conceptual as well as parametric uncertainties of the reservoir without the need for any complex scripting. The chain of models for all concepts is presented in one view to provide an overview of the key differences between the concepts used. An ensemble of geologically sound samples can be created that takes into account parameter dependencies and concept probabilities. The chain of models per concept can easily be (re)executed.
A case study is presented that consists of multiple concepts based on different hierarchical stratigraphic models in combination with different fault models, each with its own fluid (defined contacts per compartment), grid (sub-layering and areal resolution), and rock property models. Volumetric calculations are run on an ensemble to obtain static model observables such as GRV, pore volume, and oil in place, reported for multiple sub-regions of the model in combination with a lease boundary. (When coupled with dynamic simulation, observables such as ultimate recovery and breakthrough timing could also be obtained.) As thousands of realizations were run concurrently, run time was reduced from weeks to hours. Results reveal the distribution and dependency of observables such as GRV on top-structure-depth uncertainty and contact-level uncertainty. For in-place volumes, the full suite of concepts and other parametric uncertainties, including the stochastic uncertainties (i.e., seed), is analyzed. This also enables the identification of the key uncertainties that impact equity the most, which can be of great commercial value during equity negotiations. This workflow demonstrates how, with the power of cloud computing, rigorous evaluation of multiple concepts combined with many parametric uncertainties can be achieved within practical turn-around times. As such, it overcomes the prohibitive hurdles of the past that often led to simplifications necessary to save time and effort. The result is better decision quality in resource development decisions.
A particular challenge inherent to carbonate reservoirs is reservoir rock typing, which impacts model initialisation and saturation distributions and hence STOIIP, phase mobilities, and flow behaviour. We explore how flow diagnostics can best be used to detect subtle differences in reservoir dynamics arising from different model initialisations by comparing flow diagnostics simulations with full-physics simulations.
Flow diagnostics are applied to two reservoirs: a synthetic but realistic model representing an analogue for the Arab-D formation, and a giant carbonate reservoir from the Middle East. Saturation modelling and reservoir rock typing are based on uniform and heterogeneous Pc and kr distributions, and further employ state-of-the-art software that integrates SCAL data and log-derived saturations. Sweep efficiency and dynamic Lorenz coefficients are then derived from the flow diagnostics results to quantify and compare the dynamic behaviour of the reservoir models. The full-physics simulations, which are used to validate the flow diagnostics results, are carried out with a commercial black-oil simulator.
The flow diagnostics results can clearly distinguish between different homogeneous and heterogeneous rock-type distributions, wettability trends, and novel saturation modelling approaches that use dedicated software tools. Flow diagnostics capture the same trends in recovery predictions as the full-physics simulations. Importantly, the total CPU time for a single flow diagnostics calculation, including model loading, is on the order of seconds, compared to minutes or hours for a single full-physics simulation. These observations give confidence that flow diagnostics can be used effectively to compare and contrast the impact of reservoir rock typing, saturation modelling, and model initialisation on reservoir performance before running full-physics simulations. Flow diagnostics hence allow us to reduce the number of reservoir models from a model ensemble and select a small number of diverse yet realistic reservoir models that capture the full range of geological uncertainties, which are then subjected to more detailed reservoir simulation studies.
Flow diagnostics are particularly well suited for complex carbonate reservoirs which are geologically more complex than clastic reservoirs and often exhibit significant uncertainties. Giant carbonate reservoirs are also challenging to simulate using full-physics simulators due to their size, so the impact of geological uncertainty on the predicted reservoir performance is often underexplored. Flow diagnostics are hence an effective complement to quantify uncertainty in state-of-the-art reservoir modelling, history matching and optimisation workflows, particularly for giant carbonate reservoirs.
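As a rough illustration of one quantity mentioned above, a dynamic Lorenz coefficient can be computed from per-cell storage (pore volume) and flow allocations; the sketch below is our own generic construction, not the authors' software.

```python
# Illustrative sketch (not the authors' tool): dynamic Lorenz coefficient
# from per-cell storage (pore volume) and flow allocations.
import numpy as np

def lorenz_coefficient(storage, flow):
    """Sort cells by decreasing flow/storage ratio, build the flow-capacity
    vs storage-capacity (F-Phi) curve, and return twice the area between
    that curve and the unit diagonal (0 = homogeneous, -> 1 = heterogeneous)."""
    order = np.argsort(-(flow / storage))
    phi = np.concatenate(([0.0], np.cumsum(storage[order]) / storage.sum()))
    F = np.concatenate(([0.0], np.cumsum(flow[order]) / flow.sum()))
    area = np.sum(0.5 * (F[1:] + F[:-1]) * np.diff(phi))  # trapezoid rule
    return 2.0 * (area - 0.5)

print(lorenz_coefficient(np.ones(10), np.ones(10)))  # ~0 for uniform flow
```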
A 3D model is a valuable tool in reservoir management, provided it is representative of reservoir dynamics. Traditional history matching mainly focuses on reproducing reservoir behavior at the well scale, and a good match is not always representative of fluid movements in the reservoir. The proposed approach for 3D model validation combines and compares the results of integrated production analysis, in particular flow path identification, with history matching by using streamline technology. Streamlines speed up the comparison process, especially in complex 3D models.
The workflow is based on a massive Production Data Analysis (PDA) where geological and dynamic data are integrated to identify preferential paths followed by the different fluid phases during the producing life of the field. The main result is the Fluid Path Conceptual Model (FPCM) where aquifer and injected water movements are clearly identified. Once the flooded areas are detected, streamlines are traced on the history matched model in order to easily compare the simulated connections with hard information from PDA. Actions to improve the model representativeness are suggested and integrated in an iterative tuning process.
This paper presents the results of the methodology applied to two complex fields with different injection strategies. The FPCMs resulting from PDA provided a powerful boost to drive the history match and speed up the whole process. Priority was given to reproducing the identified preferential paths rather than to perfectly matching well production data (which can also be affected by allocation uncertainties) by means of unrealistic local adjustments.
Streamlines were run on the Intersect simulation, proving to be a fast and powerful tool for visualizing and understanding fluid movements in the 3D model. Since streamlines are used as a visualization tool and are traced on a corner-point geometry grid using fluxes provided by the reservoir simulation, the reliability of the simulation output is preserved.
Once the model is representative of the real field behavior, it can be used as a predictive tool in reservoir management to optimize the current injection strategy, promoting the most efficient injectors.
In this work we discuss the successful application of our previously developed automated scenario reduction approach to life-cycle optimization of a real field case. The inherent uncertainty in the description of reservoir properties motivates the use of an ensemble of model scenarios to achieve an optimized, robust reservoir development strategy. To accurately span the range of uncertainties, it is imperative to build a relatively large ensemble of model scenarios. The size of the ensemble is directly proportional to the computational effort required in robust optimization. For high-dimensional, complex field-case models, this implies that a large ensemble of model scenarios, although it accurately captures the inherent uncertainties, would be computationally infeasible to use for robust optimization. One way to circumvent this problem is to work with a reduced subset of model scenarios. Methods based on heuristics and ad hoc rules exist to select this reduced subset; however, in most cases the optimal number of model realizations must be known upfront. An excessively small number of realizations may result in a subset that does not capture the span of uncertainties present, leading to sub-optimal optimization results. This raises the question of how to effectively select a subset containing an optimal number of realizations that both captures the uncertainties present and allows for a computationally efficient robust optimization. To answer this question, we have developed an automated framework to select the reduced ensemble, which has been applied to an original ensemble of 300 equiprobable model scenarios of a real field case. The methodology relies on the fact that, ideally, the distance between the cumulative distribution functions (CDF) of the objective function (OF) of the full and reduced ensembles should be minimal.
This allows the method to determine the smallest subset of realizations that both spans the range of uncertainties and provides an OF CDF that is representative of the full ensemble according to a statistical metric. In this real field case application, we optimize the injection rates throughout the asset's life cycle with expected cumulative oil production as the OF. The newly developed framework selected a small subset of 17 model scenarios out of the original ensemble, which was used for robust optimization. The optimal injection strategy achieved an average increase of 6% in cumulative oil production with a significant reduction, approximately 90%, in computational effort. Validation of this optimal strategy over the original ensemble led to very similar improvements in cumulative oil production, highlighting the reliability and accuracy of our framework.
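The CDF-matching idea can be sketched as a greedy selection (the function names, tolerance, and greedy strategy below are our own illustrative choices, not the authors' exact algorithm): grow the subset until the Kolmogorov-Smirnov distance between the subset's objective-function CDF and the full ensemble's CDF falls below a tolerance.

```python
# Conceptual sketch of CDF-based scenario reduction (our own construction):
# greedily add realizations until the subset's OF CDF matches the full
# ensemble's CDF to within a Kolmogorov-Smirnov tolerance.
import numpy as np

def ks_distance(a, b):
    """Maximum vertical distance between the empirical CDFs of a and b."""
    grid = np.sort(np.concatenate([a, b]))
    cdf = lambda s: np.searchsorted(np.sort(s), grid, side="right") / len(s)
    return float(np.max(np.abs(cdf(a) - cdf(b))))

def select_subset(of_values, tol=0.05):
    remaining = list(range(len(of_values)))
    chosen = []
    while remaining:
        # add the realization whose inclusion minimizes the KS distance
        best = min(remaining, key=lambda i: ks_distance(
            of_values[chosen + [i]], of_values))
        chosen.append(best)
        remaining.remove(best)
        if ks_distance(of_values[chosen], of_values) < tol:
            break
    return chosen

rng = np.random.default_rng(1)
full = rng.normal(100.0, 15.0, size=300)   # e.g. cumulative oil per realization
subset = select_subset(full, tol=0.05)
print(len(subset), "of", len(full))
```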
Hydrocarbon in-place volumes are often inaccurate as a result of poor representation of the reservoir structure (by means of a 3D grid), which, in combination with the use of traditional saturation calculation methods, leads to erroneous hydrocarbon volumes and poor investment decisions.
Traditionally, a reservoir model is represented with a 3D grid. In complex settings such as fault intersections and stacked reservoirs, a corner-point grid is often used, which has limitations in representing this complexity. The hydrocarbon saturations are then derived on a cell-by-cell basis on that 3D grid using simple averaging techniques applied to saturation height functions. The poor structural representation on the pillar grid, in addition to the simplistic averaging methods, leads to inaccuracies in the in-place volumes, especially where a prominent transition zone is present.
This paper presents new advanced saturation averaging methods (volume and height weighted) using saturation height functions on 3D grids. The new advanced saturation averaging methods are used on different reservoir models to compare the saturation distribution and volumetric differences against the traditional saturation calculation methods. A 4-way dip closure reservoir model with a tilted free water level (typical example of a carbonate reservoir in the Middle East), and a faulted S-grid model of the F3-FA field (North Sea) are used.
For the 4-way dip closure reservoir model, when comparing the advanced ‘volume weighted’ and traditional ‘by center of the part of the cell’ saturation averaging methods, a significant difference in the water saturations is observed which leads to about 5% difference in the calculation of in place hydrocarbon volumes. Further, it is observed that changing the thickness and orientation of the 3D grid cells can result in even larger differences of 5-10%.
The faulted F3 model shows that the difference between the hydrocarbon saturation values is largest where it matters most, that is, around the fluid contacts and in the transition zone. The new advanced saturation averaging methods give accurate hydrocarbon saturations irrespective of the size or complexity of the 3D grid and without any discretization effects.
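The effect described above can be reproduced with a toy saturation-height function (our own construction, not the paper's method or data): evaluating the function at the cell centre versus averaging it over the cell height gives visibly different water saturations for a thick cell straddling the transition zone.

```python
# Toy illustration (assumed power-law saturation-height function, our own):
# cell-centre vs height-weighted averaging over a grid cell.
import numpy as np

SWC, H_E, LAM = 0.2, 1.0, 1.5   # connate Sw, entry height (m), shape exponent

def sw(h):
    """Toy saturation-height function above the free water level."""
    h = np.asarray(h, dtype=float)
    return np.where(h <= H_E, 1.0, SWC + (1.0 - SWC) * (H_E / h) ** LAM)

def sw_center(h_bot, h_top):
    """Traditional method: evaluate Sw at the cell-centre height."""
    return float(sw(0.5 * (h_bot + h_top)))

def sw_height_weighted(h_bot, h_top, n=1000):
    """Height-weighted method: average Sw over the cell's height interval."""
    return float(np.mean(sw(np.linspace(h_bot, h_top, n))))

# a thick cell straddling the transition zone just above the contact
print(sw_center(0.5, 4.5), sw_height_weighted(0.5, 4.5))
```

The height-weighted value is noticeably higher here because the cell-centre evaluation ignores the fully water-saturated interval near the contact, mirroring the paper's observation that differences are largest around fluid contacts.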
Waterflooding is the main technique used to recover hydrocarbons from reservoirs. For a given set of wells (injectors and producers), the choice of injection/production parameters such as pressures, flow rates, and the locations of these boundary conditions has a significant impact on the operating life of the wells. As a large number of combinations of these parameters is possible, one of the critical decisions is to identify an optimal set. Because using the reservoir simulator directly to evaluate the impact of these sets is unrealistic given the required number of simulations, a common approach consists of using response surfaces to approximate the reservoir simulator outputs. Several techniques involving proxy models (e.g., kriging, polynomials, and artificial neural networks) have been suggested to replace the reservoir simulations. This paper focuses on the application of artificial neural networks (ANNs), as it is commonly accepted that ANNs are among the most efficient proxies due to their universal approximation capacity, i.e., their capacity to reproduce any continuous function. This paper presents a complete workflow to optimize well parameters under waterflooding using an artificial neural network as a proxy model. The proposed methodology allows evaluating different production configurations that maximize the NPV for a given risk. The optimized solutions can be analyzed with the efficient frontier plot and Sharpe ratios. An application of the workflow to the Brugge field is presented in order to optimize the waterflooding strategy.
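The risk-return analysis part of such a workflow can be sketched independently of the ANN proxy (the numbers, helper names, and domination rule below are our own illustration, not the paper's implementation): given per-strategy NPV samples across geological realizations, one can compute Sharpe ratios and identify the efficient (non-dominated) strategies.

```python
# Hedged sketch (our own construction): Sharpe ratios and efficient frontier
# over candidate waterflood strategies. In practice the NPV samples would
# come from an ANN proxy evaluated across many realizations.
import numpy as np

def sharpe(npv, npv_base=0.0):
    """Mean excess NPV divided by its standard deviation."""
    return float((np.mean(npv) - npv_base) / np.std(npv))

def efficient_frontier(means, stds):
    """Indices of non-dominated strategies: no other strategy offers both a
    higher-or-equal mean NPV and a lower-or-equal risk, with at least one
    strict improvement."""
    keep = []
    for i in range(len(means)):
        dominated = any(means[j] >= means[i] and stds[j] <= stds[i]
                        and (means[j] > means[i] or stds[j] < stds[i])
                        for j in range(len(means)) if j != i)
        if not dominated:
            keep.append(i)
    return keep

rng = np.random.default_rng(7)
# 500 realizations x 4 candidate strategies, with toy mean/risk levels
npv = rng.normal([50.0, 60.0, 55.0, 48.0], [5.0, 20.0, 6.0, 9.0], size=(500, 4))
print([round(sharpe(npv[:, k]), 2) for k in range(4)])
print(efficient_frontier(npv.mean(axis=0), npv.std(axis=0)))
```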
Al-Jenaibi, Faisal (ADNOC - Upstream) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Bogachev, Kirill (Rock Flow Dynamics)
The application of intelligent algorithms that use clever simplifications and methods to solve computationally complex problems is rapidly displacing traditional methods in the petroleum industry. The latest forward-thinking approaches in history matching and uncertainty quantification were applied to a dynamic model with an unknown permeability model. The original perm-poro profile was constructed from synthetic data so that the Assisted History Matching (AHM) approach could be compared to the exact solution. It is assumed that only absolute permeability may be modified to match oil/water/gas rates, gas-oil ratio, water injection rate, watercut, and bottomhole pressure; relative permeabilities, endpoints, and all other parameters are fixed.
The standard approach to matching a model via permeability variation is to split the grid into several regions. However, this process is essentially guesswork, as it is unclear in advance how to select the regions, and geological prerequisites for such a split usually do not exist. Moreover, the values of permeability and porosity in different grid blocks are correlated; changing these values independently for each region distorts the correlations or makes the model unphysical.
The proposed alternative decomposes the permeability model into spectral amplitudes using the Discrete Cosine Transform (DCT), a form of Fourier transform: the inverse transform of the full set of amplitudes exactly reproduces the original property distribution. Building an uncertain permeability model typically involves subjective judgment and several optimization runs to construct the uncertainty matrix. The proposed multi-objective Particle Swarm Optimization (PSO), however, helps to reduce randomness and find solutions not dominated in any objective with fewer runs. Further tuning of the Flexi-PSO algorithm is performed on its constituent components, such as swarm size, inertia, nostalgia, sociality, damping factor, neighbor count, neighborliness, the proportion of explorers, egoism, community, and relative critical distance, to increase the speed of convergence. Additionally, Principal Component Analysis (PCA) combined with clustering is suggested as a means to reduce the dimensionality of the space of resulting solutions while ensuring the diversity of the selected cluster centers.
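The DCT parameterization can be illustrated with SciPy (the toy field and the 5% truncation threshold are our own choices, not the authors'): a smooth permeability field is well represented by a small fraction of its spectral amplitudes, which is what makes those amplitudes attractive history-matching parameters.

```python
# Sketch of DCT parameterization of a permeability field (toy example):
# keep only the largest spectral amplitudes and reconstruct the field.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(3)
# toy log-permeability field: smooth large-scale structure plus noise
x, y = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
logk = 2.0 * np.sin(2 * np.pi * x) * np.cos(np.pi * y) \
       + 0.1 * rng.normal(size=x.shape)

amps = dctn(logk, norm="ortho")
# retain the ~5% largest-magnitude amplitudes as history-matching parameters
cutoff = np.quantile(np.abs(amps), 0.95)
amps_trunc = np.where(np.abs(amps) >= cutoff, amps, 0.0)
logk_rec = idctn(amps_trunc, norm="ortho")

rel_err = np.linalg.norm(logk_rec - logk) / np.linalg.norm(logk)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The retained amplitudes, rather than thousands of individual cell values, then become the variables that an optimizer such as PSO perturbs.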
The presented set of methods helps to achieve a qualitative and quantitative match with respect to any property, reduce the number of uncertainty parameters, and set up a generic and efficient approach to assisted history matching.
Valencia, Juan D. (Universidad Nacional de Colombia, Exergy - Modeling and Analytics) | Mejía, Juan M. (Universidad Nacional de Colombia) | Ocampo, Alonso (GaStimTechnologies) | Restrepo, Alejandro (Equion Energía)
This paper addresses the numerical simulation of the chemically enhanced gas injection technology (ChEGas-EOR) at core and reservoir scales. In this technique, a liquid chemical solution with engineered properties is sprayed along with the gas stream; the mist travels through the wellbore and is then introduced into the reservoir. Previous lab tests and pilot studies in light and intermediate oil reservoirs indicate that the application of ChEGas-EOR allows for a reduction in operational costs, increases the chemical penetration radius, and decreases the retention rate in the rock. However, the associated uncertainty is still too high to deploy this process at a productive scale. In this work we use a previously developed phenomenological model to build a tool that assists in the design and evaluation of chemical gas EOR operations, aiming to reduce the uncertainties and optimize oil recovery.
We developed a mathematical model based on the most important transport and surface phenomena. Non-equilibrium mass transfer between phases occurs during the interception of the chemical-solution droplets by the liquid phases. The active chemical concentration in the miscible liquid phases is much lower than in liquid-based chemical injection operations; as a consequence, the dissolution and adsorption rates of active chemicals on the reservoir rock are slow. The model is based on the extended black-oil formulation coupled to local mass-balance equations for the active chemicals. Non-equilibrium mass-transfer processes are represented with interception, dissolution, and first-order kinetic sorption models.
The model was adjusted and then validated using experimental data from core-flooding tests. Good agreement between the simulation results and the experimental observations was obtained. The model can predict the relevant behavior of the disperse chemical injection in the gas phase in porous media. In addition, well injection simulations at reservoir scale using the parameters matched in the laboratory reproduced pilot field results. Simulation experiments predict that the ChEGas-EOR process can substantially increase the oil recovery factor.
For the first time, a model for disperse chemical injection for EOR applications is developed and validated at core and reservoir scale. The simulation model allows the evaluation of this technology at different scales. Therefore, it is possible to use it to optimize operating conditions and perform sensitivity analysis for field applications.
The traditional trial-and-error approach to history matching, in which engineers control each uncertain parameter to obtain an accurate model, can be quite time-consuming and inefficient. Automatic history matching (AHM), assisted by computers, is an efficient process in which an algorithm controls a large number of parameters simultaneously, integrating a static model with dynamic data to minimize the misfit and improve reliability. It helps to reduce simulation run time as well.
Particle Swarm Optimization (PSO) is a population-based stochastic algorithm that can explore the parameter space combined with a least-squares single-objective function. The AHM process can adopt parameterization and realization methods to reduce the inverse problem. In this study, realizations of various reservoir properties, such as porosity, net-to-gross, relative permeability, horizontal and vertical permeability, and aquifer size, were chosen as controls throughout the AHM. History matching was conducted to validate the efficiency of each method. Guidelines for optimized AHM with a stochastic algorithm are also discussed.
The realization and parameterization methods improved matching results in a full-field application, resulting in a reduced misfit in less time. A stochastic algorithm generates multiple models to deduce control parameters that reduce the misfit. In this study we found that PSO converged effectively with updated control parameters. The optimized AHM improved the accuracy of a full-field model, although some misfit remained in the match to bottomhole pressure.
We found that updating too many parameters makes the problem difficult to solve, while using too few leads to false convergence. In addition, since simulation run time is critical, a full-field simulation model with reduced computational overhead is beneficial.
In this study, we observed that PSO was an efficient algorithm for updating control parameters to reduce the misfit, and that using parameterization and realization as assisted methods helped find better results. Overall, this study can serve as a guideline for optimizing the history-matching process.
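A minimal generic PSO minimizing a least-squares misfit on a toy two-parameter problem might look as follows (this is our own sketch, not the study's implementation; the inertia and acceleration constants are common textbook defaults):

```python
# Generic PSO sketch (our own, not the study's code) minimizing a
# least-squares misfit between a toy model response and synthetic history.
import numpy as np

def pso(misfit, lo, hi, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5,
        seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))  # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()                  # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, *x.shape))
        # inertia + cognitive (personal best) + social (global best) terms
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([misfit(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(np.min(pbest_f))

# toy misfit: recover "true" multipliers (1.2, 0.8) from synthetic history
true = np.array([1.2, 0.8])
obs = true[0] * np.arange(10) + true[1]
misfit = lambda p: float(np.sum((p[0] * np.arange(10) + p[1] - obs) ** 2))
best, f_best = pso(misfit, lo=np.array([0.0, 0.0]), hi=np.array([3.0, 3.0]))
print(best, f_best)
```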