A 3D model is a valuable tool in reservoir management, provided it is representative of reservoir dynamics. Traditional history matching mainly focuses on reproducing reservoir behavior at well scale, and a good match is not always representative of fluid movements in the reservoir. The proposed approach for 3D model validation combines and compares the results of integrated production analysis, in particular flow path identification, with history matching by using streamline technology. Streamlines speed up the comparison process, especially in complex 3D models.
The workflow is based on a massive Production Data Analysis (PDA) where geological and dynamic data are integrated to identify preferential paths followed by the different fluid phases during the producing life of the field. The main result is the Fluid Path Conceptual Model (FPCM) where aquifer and injected water movements are clearly identified. Once the flooded areas are detected, streamlines are traced on the history matched model in order to easily compare the simulated connections with hard information from PDA. Actions to improve the model representativeness are suggested and integrated in an iterative tuning process.
This paper presents the results of the methodology applied on two complex fields with different injection strategies. FPCMs resulting from PDA provided a powerful boost to drive the history match and speed up the whole process. Priority was given to reproducing the identified preferential paths rather than to perfectly matching well production data (which can also be affected by allocation uncertainties) by means of locally unrealistic adjustments.
Streamlines were run on Intersect simulations, proving to be a fast and powerful tool for the visualization and understanding of fluid movements in the 3D model. Since streamlines are used as a visualization tool and are traced on a corner-point geometry grid using fluxes provided by the reservoir simulation, the reliability of the simulation output is preserved.
Once the model is representative of the real field behavior, it can be used as a predictive tool in reservoir management to optimize the current injection strategy, promoting the most efficient injectors.
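The streamline tracing that underpins this comparison is typically done with Pollock's semi-analytical method, which assumes velocity varies linearly between opposing cell faces. A minimal one-dimensional sketch (illustrative only, not the tool described above; face velocities and cell size are assumed inputs taken from simulator fluxes) is:

```python
import math

def pollock_exit_time(v_in, v_out, dx, x_rel):
    """Time for a particle at relative position x_rel (0..1) inside a cell
    to reach the outflow face, assuming velocity varies linearly between
    the inflow face (v_in) and outflow face (v_out) -- Pollock's method."""
    a = (v_out - v_in) / dx               # velocity gradient inside the cell
    v_here = v_in + a * x_rel * dx        # interpolated particle velocity
    if abs(a) < 1e-12:                    # uniform velocity: linear travel
        return (1.0 - x_rel) * dx / v_here
    return math.log(v_out / v_here) / a   # analytic solution of dx/dt = v(x)

def trace_1d(face_velocities, dx):
    """Total time of flight across a row of cells; face_velocities holds
    one entry per cell face (length = number of cells + 1)."""
    tof = 0.0
    for i in range(len(face_velocities) - 1):
        tof += pollock_exit_time(face_velocities[i],
                                 face_velocities[i + 1], dx, 0.0)
    return tof
```

In a full 3D corner-point implementation the same exit-time calculation is done per coordinate direction and the earliest exit face selected, which is what makes streamline tracing cheap enough for repeated comparison against PDA-derived flow paths.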
Al-Jenaibi, Faisal (ADNOC - Upstream) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Bogachev, Kirill (Rock Flow Dynamics)
The application of intelligent algorithms that use clever simplifications and methods to solve computationally complex problems is rapidly displacing traditional methods in the petroleum industry. The latest forward-thinking approaches in history matching and uncertainty quantification were applied on a dynamic model with an unknown permeability model. The original perm-poro profile was constructed based on synthetic data in order to compare the Assisted History Matching (AHM) approach to the exact solution. It is assumed that relative permeabilities, endpoints, and any parameter other than absolute permeability cannot be modified to match oil/water/gas rates, gas-oil ratio, water injection rate, water cut and bottomhole pressure.
The standard approach to matching a model via permeability variation is to split the grid into several regions. However, this process is a complete guess, as it is unclear in advance how to select regions. The geological prerequisites for such splitting usually do not exist. Moreover, the values of permeability and porosity in different grid blocks are correlated; independent change of these values for each region distorts correlations or makes the model unphysical.
The proposed alternative involves the decomposition of the permeability model into spectrum amplitudes using the Discrete Cosine Transformation (DCT), a form of Fourier transform. The inverse transform of the full set of amplitudes reproduces the original property distribution. Constructing an uncertain permeability model typically involves subjective judgment and several optimization runs to build the uncertainty matrix. However, the proposed multi-objective Particle Swarm Optimization (PSO) helps to reduce randomness and find a solution not dominated in any objective with fewer runs. Further tuning of the Flexi-PSO algorithm is performed on its constituent components, such as swarm size, inertia, nostalgia, sociality, damping factor, neighbor count, neighborliness, the proportion of explorers, egoism, community and relative critical distance, to increase the speed of convergence. Additionally, a dimensionality-reduction technique such as Principal Component Analysis (PCA) is suggested as a means to reduce the dimensionality of the space of resulting solutions while ensuring the diversity of selected cluster centers.
The presented set of methods helps to achieve a qualitative and quantitative match with respect to any property, reduce the number of uncertainty parameters, and set up a generic and efficient approach towards assisted history matching.
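The DCT parameterization above can be sketched in a few lines: transform the permeability field, retain only the largest amplitudes as history-matching parameters, and invert. This is an illustrative sketch (the keep-fraction, field size and function names are assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.fft import dctn, idctn

def compress_permeability(perm, keep_fraction=0.05):
    """Decompose a permeability field into DCT amplitudes and keep only the
    largest ones; the retained amplitudes become the history-matching
    parameters instead of per-cell or per-region multipliers."""
    amps = dctn(perm, norm="ortho")                  # forward transform
    n_keep = max(1, int(amps.size * keep_fraction))  # number of retained modes
    thresh = np.sort(np.abs(amps).ravel())[-n_keep]  # magnitude cut-off
    truncated = np.where(np.abs(amps) >= thresh, amps, 0.0)
    return truncated, idctn(truncated, norm="ortho") # amplitudes + smoothed field

# With all amplitudes retained, the round trip reproduces the field exactly.
field = np.random.default_rng(0).lognormal(size=(20, 20))
_, rebuilt = compress_permeability(field, keep_fraction=1.0)
```

Because the low-frequency amplitudes carry the large-scale trends, truncation preserves the spatial correlation structure that independent region multipliers would destroy.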
Standard approaches to optimization under uncertainty in reservoir simulation require use of multiple realizations, with variable parameters representing operational constraints and actions as well as uncertain scenarios. We will show how appropriate use of local optimization within the simulation model, using customized logic for field management strategies, can bring improved workflow flexibility and efficiency, by reducing the effort needed for uncertainty iterations.
To achieve meaningful forecasts for an ensemble of uncertain scenarios, it is important to distinguish between different types of decision. Investment decisions, such as facilities sizing, depend on global unknowns and must be optimized for the complete ensemble. Operational actions, such as closing a valve, can be optimized instantaneously for individual scenarios, using measurable information, although subject to constraints determined at a global level. In this study, we implement local optimization procedures within simulation cases, combining customized objective criteria to rank reactive or proactive actions, with the ability to query reservoir flow entities at appropriate frequencies.
The methods presented in the paper can be used for reactive response modeling for smart downhole control; optimization of ESP/PCP pump performance; and implementation of production plans subject to defined downstream limits. For selected cases, we compare the advantages and disadvantages of the local optimization approach with standardized "big-loop" uncertainty workflows. The methodology can significantly reduce optimization costs, particularly for high-frequency actions, achieving similar objective function values in a fraction of the time needed for post-processing optimizers. Use of tailored scripting provides the capability to modernize the logic framework for field management decisions, with realistic representation of smart field equipment and flow entities at any level of complexity.
Use of efficient workflows as described in this paper can reduce the cost of multiple realization studies significantly, or enable engineers to consider a wider range of possible scenarios, for deeper understanding and better risk mitigation.
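The reactive local-optimization idea described above (rank flow entities by an objective criterion at each report step and act on the worst offender using only locally measurable information) can be sketched as follows. This is a hedged illustration: the dictionary-based stand-in for simulator flow entities, the field names and the 0.95 threshold are all assumptions, not the paper's field-management scripting API:

```python
WATER_CUT_LIMIT = 0.95  # illustrative operational threshold

def reactive_step(completions):
    """One local-optimization step: rank open completions by a customized
    objective criterion (here, water cut queried at the current report step)
    and shut the worst offender if it breaches the limit. `completions` is a
    list of dicts with 'name', 'water_cut' and 'open' keys standing in for
    real simulator flow entities. Returns the name of the closed completion,
    or None if no action was taken."""
    open_comps = [c for c in completions if c["open"]]
    if not open_comps:
        return None
    worst = max(open_comps, key=lambda c: c["water_cut"])
    if worst["water_cut"] > WATER_CUT_LIMIT:
        worst["open"] = False          # the local, instantaneous action
        return worst["name"]
    return None
```

Because such a rule uses only information measurable within one scenario, it can run inside every ensemble member without a global optimization loop, which is exactly what makes it cheaper than post-processing "big-loop" optimizers for high-frequency actions.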
Development of oil rim reservoirs is challenging and can lead to low oil recovery if the multiple factors that influence a successful field development concept are not well understood. It requires detailed analysis and the development of specific procedures to optimize oil production from a thin oil rim underlying a gas cap. Few IOR/EOR applications for oil rim development have been reported in the literature so far. This study presents a concept for the optimization of oil production from an oil rim reservoir by numerical simulation.
As a starting point, a representative sector of the field was selected for the initial analysis. It was decided to perform IOR/EOR methods including water/gas flooding/injection and surfactant flooding using an inverted five-spot horizontal well pattern, for application in the selected sector. Upon execution of a detailed sensitivity analysis, the pattern was optimized over its characteristic geometric variables, including the length of the vertical/horizontal section of the well, the location of the wells, lateral well distances and the orientation of the pattern. The optimization was performed by setting an objective function to improve the recovery factor and reduce water/gas cut, using the differential evolution algorithm. The latter was run until convergence, and the optimal solution was used to perform further IOR/EOR studies.
Finally, after selection of a base-case scenario and the best well pattern, IOR/EOR options were evaluated and the comparative results were reported. The generated results show that the application of the five-spot horizontal well pattern in the oil rim reservoir could increase oil recovery by water flooding, but with low sweep efficiency. The losses of injected water into the underlying aquifer and the overlying gas cap are large. Immiscible gas injection into the gas cap can support the pressure but massively increases the gas cut. In addition, displacement efficiency by gas flooding is poor.
Simulation results of the surfactant flooding case show better displacement efficiency compared to water flooding. Also, the possibility of reducing residual oil saturation could increase the ultimate oil recovery, but only at a very late time.
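The pattern-geometry optimization described above can be sketched with SciPy's differential evolution driver. The objective below is a made-up smooth surrogate for recovery with a water-cut penalty (the coefficients, the optimum at 900 m / 400 m / 45° and the variable set are assumptions for illustration; in the study the objective would call the reservoir simulator):

```python
import numpy as np
from scipy.optimize import differential_evolution

def negative_recovery(x):
    """Stand-in objective over (lateral length, well spacing, azimuth):
    a quadratic recovery surrogate minus a water-cut penalty. DE minimizes,
    so we return the negated value."""
    length, spacing, azimuth = x
    recovery = (0.55
                - (length - 900.0) ** 2 / 1e6
                - (spacing - 400.0) ** 2 / 1e5
                - (azimuth - 45.0) ** 2 / 1e4)
    water_cut_penalty = 0.02 * max(0.0, spacing - 500.0) / 100.0
    return -(recovery - water_cut_penalty)

bounds = [(300, 1500), (100, 800), (0, 180)]  # geometric variable ranges
result = differential_evolution(negative_recovery, bounds, seed=1, tol=1e-8)
```

Differential evolution needs no gradients, which is why it suits objectives evaluated by full simulation runs; the population-based search is run until convergence, as in the study.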
Production optimization is the practice of seeking the best possible well control and schedule plans in order to enhance reservoir performance under given state and economic constraints. Determining the optimal injection and production control strategies through adjoint gradient-based optimization is a well-known practice in today's modern reservoir management. However, apt handling of nonlinear control inputs and of state and output constraints can be quite tedious, with effects on the computational efficiency of the optimization algorithms used in practical production optimal control problems. In this paper, we develop an adjoint-based interior-point inexact trust filter sequential quadratic programming (IITRF-SQP) method for solving constrained production optimization problems. The inexact trust region is an extension of a filter trust-region approach, used when the control input constraint Jacobians are of high dimension and expensive to compute. The output constraints are handled using an interior-point method called modified barrier-augmented Lagrangian, in which inequality constraints are treated by a modified barrier term and equality constraints by augmented Lagrangian terms. The algorithm we present uses approximate Jacobian information obtained through a composite-step computation, which eliminates the cost of directly calculating Jacobians and Hessians. The gradient information that provides a criticality measure of the objective function is calculated using the adjoint method. Two numerical experiments on optimal water flooding are presented. Performance comparisons of the proposed IITRF-SQP method with the Lagrangian barrier method and sequential linear quadratic programming (SLQP) for solving the production optimization problem are carried out.
Results indicate that the gradient-based adjoint coupled with IITRF-SQP was able to improve net present value (NPV) through optimal production profiles with better computational efficacy via reduced convergence time and number of gradient and objective function evaluations.
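To illustrate the barrier idea used for the output constraints, the classical log-barrier scheme on a toy problem can be written as follows. This is not the authors' modified barrier-augmented Lagrangian or IITRF-SQP; it is a minimal sketch of how an inequality constraint is folded into the objective and the barrier parameter driven toward zero:

```python
import math
from scipy.optimize import minimize_scalar

def barrier_solve(mu_values=(1.0, 0.1, 0.01, 1e-4)):
    """Log-barrier scheme for: minimize (x - 2)^2 subject to x <= 1.
    The inequality enters the objective as -mu*log(1 - x); solving the
    barrier subproblem for a decreasing sequence of mu drives the
    minimizer toward the constrained optimum x = 1."""
    x = 0.0
    for mu in mu_values:
        res = minimize_scalar(
            lambda x: (x - 2.0) ** 2 - mu * math.log(1.0 - x),
            bounds=(-5.0, 1.0 - 1e-9), method="bounded")
        x = res.x                     # barrier minimizer for this mu
    return x
```

In the production setting the subproblem is high-dimensional and its gradient comes from the adjoint method rather than a scalar minimizer, but the mechanism of keeping iterates strictly feasible is the same.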
We demonstrate how key geological uncertainties in a giant onshore carbonate reservoir in the Middle East, most notably fracture permeability and saturation distribution, impact the quality of the history match and change the performance forecasts of a planned Miscible Water Alternating Gas (MWAG) injection process. To achieve this, we used a history matching and multi-objective optimisation (MOO) workflow that was tightly integrated with an innovative reservoir modelling workflow that paid particular attention to the fracture and saturation modelling.
Different geological models for the reservoir were designed by integrating static and dynamic data. These data indicated the need to consider fault-related fractures and to update the saturation distribution in the reservoir model. The effective medium theory was therefore used to estimate effective permeability in order to capture the presence of low-intensity fault-controlled fractures in the reservoir. The integration of Special Core Analysis (SCAL) and log-derived J-functions allowed us to build alternative saturation models that honoured well data with great accuracy. The resulting history matched models therefore accounted for the key geological uncertainties present in the reservoir. Afterwards, MOO was applied for each history matched model to identify well controls that optimally balanced the need to maximise the time on the plateau rate while adhering to the field's gas production constraints.
Our results clearly show that including low-intensity fault-controlled fractures in the reservoir model improved the quality of the history match for the gas oil ratio (GOR), bottom hole pressure (BHP) and water cut. This is especially true for wells located near faults, which were difficult to match in the past. Moreover, our results further show that the updated saturation model improved the quality of the history match for the water cut, particularly for wells located in the transition zone. These different history matched models yielded different production forecasts, where the time at which the reservoir can be produced at the plateau rate varied by up to ten years.
Applying MOO for each history matched model then allowed us to identify well controls for the MWAG injection that could extend the plateau production time by up to nine years and reduce the risk of losing the production plateau to two years, while always adhering to the current field operational constraints.
We demonstrate how the integration of MOO with an innovative workflow for fracture and saturation modelling impacts the prediction of a planned MWAG injection in a giant onshore carbonate reservoir. Our work clearly illustrates the potential of integrating MOO with new reservoir characterisation methods to improve the quantification of uncertainties in reservoir performance predictions in carbonate reservoirs.
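At the heart of the MOO step is Pareto dominance: a control schedule is retained only if no other schedule is at least as good in every objective and strictly better in one. A minimal sketch (the objective pairs are illustrative placeholders, e.g. plateau duration versus negated gas production per candidate schedule):

```python
def pareto_front(points):
    """Return the non-dominated subset of `points` for a maximization
    problem. A point p is dominated if some other point q is >= p in every
    objective (and differs from p somewhere)."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] >= p[i] for i in range(len(p))) and q != p
            for q in points)
        if not dominated:
            front.append(p)
    return front
```

The optimiser then balances the competing objectives by searching along this front rather than collapsing them into a single weighted score, which is what lets it trade plateau duration against the gas production constraint explicitly.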
Reservoir simulation optimization under uncertainty typically invokes a sense of anxiety, mainly because of the lack of a systematic criterion to choose between different development scenarios under uncertainty; the difficulty of well placement and well-control optimization in the face of a large uncertainty ensemble of static realisations; and, most of all, the large number of simulation runs that potentially need to be conducted. This is exacerbated when the models are large and require many hours to run. Moreover, even with the prevalence of distributed and parallel computing clusters, there is still a limited amount of computing resources available when spread out over the number of reservoir engineers within a company. Time and budget constraints also contribute to complicating this process. Furthermore, with the requirement of an inordinately large number of simulation runs comes the dilemma of which optimizer to choose to help speed up the process.
This paper starts with a brief background on historical attempts at tackling this problem by delving into the literature. Then it discusses a rigorous criterion for optimization under uncertainty, viz. stochastic dominance, hitherto little known or used in the industry. A commonly used greenfield case study, an ensemble set of uncertainty realisations, is then introduced, on which the rest of the paper is based. The ensemble is a pre-generated set of fifty realisations designed specifically for this problem. Two challenging areas are then addressed, viz. well placement optimisation under uncertainty and well controls optimization under uncertainty.
Finally, a comparison between the simplex, proxy response surface, differential evolution and particle swarm optimization methods is made in the optimization of well controls. The paper thus aims to give a complete picture of how to go about reservoir simulation optimization under uncertainty, with a drastically reduced number of computational runs. Practical and sensible formulation of the optimization problem can go a long way toward making this process more understandable and easier to implement.
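The stochastic dominance criterion mentioned above can be made concrete with a first-order check on two NPV ensembles: scenario A dominates B if A's empirical CDF lies at or below B's at every outcome level, i.e. A gives at least as high an NPV with at least the same probability everywhere. A minimal sketch (the sample NPV arrays are illustrative, not the paper's fifty-realisation ensemble):

```python
def dominates_fosd(npv_a, npv_b):
    """First-order stochastic dominance of development scenario A over B,
    judged from sampled NPV outcomes: True if the empirical CDF of A is
    everywhere <= that of B on the union of observed outcome levels."""
    grid = sorted(set(npv_a) | set(npv_b))
    cdf = lambda sample, t: sum(v <= t for v in sample) / len(sample)
    return all(cdf(npv_a, t) <= cdf(npv_b, t) for t in grid)
```

Unlike a simple comparison of expected NPVs, this criterion is preference-free for any risk-averse decision maker, which is what makes it a systematic basis for choosing between development scenarios under uncertainty.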
Improved reservoir knowledge is key to extracting additional value from existing oil and gas assets. However, given the uncertainty in the subsurface, it is always a question whether our current development strategy is the most robust choice, or whether there are alternatives that can further increase the value of the field. This paper presents a novel solution that enables the asset team to answer these questions in a new way. Furthermore, the solution helps teams quickly identify and screen new opportunities that ultimately increase both subsurface understanding and the value of the field. The solution combines a quasi-Newton gradient-based numerical optimization scheme with a stochastic simplex approximate gradient (StoSAG) algorithm. Because the algorithm is non-intrusive with respect to the fluid flow simulator, we can directly apply the solution to any flow optimization problem without the need to access the simulator source code. The solution is implemented using a microservice architecture that allows for efficient scaling and deployment either on cloud-based or internal systems. We demonstrate the proposed solution on a field containing 11 oil producers and 7 water injectors by optimizing the water injection and oil production rates. The machine learning algorithm allows us to quickly explore different drainage strategies, given the current understanding and associated uncertainties of the reservoir.
Specifically, the software solution suggests that 6 of the 18 pre-defined well targets are high risk and/or of little value. Running a second development scenario where we do not drill these six wells reduces the investment cost of this field by 163 MUSD and increases the expected net present value per well of the field by 48 percent. Compared with the reactive control drainage strategy approach, we increase the expected net present value of the field by 9.0 %, while simultaneously lowering the associated risk.
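The non-intrusive StoSAG gradient at the core of the solution can be sketched as follows: perturb the control vector with Gaussian noise, evaluate the (simulator) objective at each perturbed point, and regress the objective changes on the control changes. The quadratic "NPV" surrogate below is an assumed stand-in for a simulator call:

```python
import numpy as np

def stosag_gradient(objective, u, sigma=0.1, n_perturb=50, seed=0):
    """Stochastic simplex approximate gradient of `objective` at controls u.
    Only objective values are needed -- no simulator source access -- which
    is what makes the approach non-intrusive."""
    rng = np.random.default_rng(seed)
    dU = sigma * rng.standard_normal((n_perturb, u.size))  # control perturbations
    dJ = np.array([objective(u + d) - objective(u) for d in dU])
    g, *_ = np.linalg.lstsq(dU, dJ, rcond=None)            # least-squares gradient
    return g

# Toy check against the analytic gradient of a quadratic NPV surrogate:
# J(u) = -sum((u - 1)^2), so grad J at u = 0 is 2 in every component.
quad = lambda u: -np.sum((u - 1.0) ** 2)
grad = stosag_gradient(quad, np.zeros(4), sigma=1e-3)
```

In an ensemble setting the same regression is done per realization (or on ensemble-averaged objectives), and the resulting gradient feeds the quasi-Newton update of the injection and production rates.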
Recent advances in data acquisition systems have helped in monitoring wells performance and recording their production parameters like pressure, temperature and valve opening in real time with high frequency. A cost-effective technology to estimate well production rates is Virtual Metering, which integrates real time data and analytical models. This paper presents the methodology of an innovative virtual metering tool and the promising results obtained in real case applications on gas, gas condensate and oil fields.
A Virtual Metering tool has been developed by integrating a commercial software platform and mathematical models (algorithms). The algorithms simultaneously solve dynamic pressure and temperature gradients (VLP) along with the choke equation to find the optimal solution rates that match physical sensor readings. Moreover, the tool manages the communication between real time data and the models, enabling safe storage of the results. Models require manual calibration at reference dates based on well separator tests or MPFM readings, so as to match total field production. After calibration, the algorithm is able to run automatically in real time.
Three implementations are presented, for gas, gas-condensate and oil fields, showing the benefits and limitations of virtual meter application. The virtual meter proved to be a valid technology with the potential of even replacing MPFM results, especially in dry gas fields. Where MPFMs are installed on each wellhead, the virtual meter worked as a redundant system and allowed early detection of flow meter malfunctions. The allocation workflow has been modified to replace MPFM estimations with virtual meter ones. For oil fields with variable production parameters, the tool has provided reliable independent rate estimation by combining VLP and choke calculators in a single optimization tool. The real time flow rate can be used as a basis for pro-rata allocation of fiscal production in the framework of a Production Data Management System software. Additional features of the tool are the following: a real-time input for pressure and rate transient analysis, and a workflow for real-time well drawdown estimation of gas wells, which makes use of automatic p/z reservoir model updates to estimate reservoir pressure. Moreover, this tool had a significant impact on production monitoring, improved the effectiveness of production optimization actions and improved the quality of the history match of the reservoir 3D model.
This paper presents a novel, reliable and robust virtual metering tool that can be flexibly applied to gas and oil fields through a unique optimization algorithm able to combine information from the production network and from the reservoir side. It benefits company workflows by feeding external reservoir analysis applications that would not be possible without virtual meter results, and uses the results of external applications for validation purposes.
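The core inversion in a virtual meter can be sketched as a root find: choose the rate at which the modelled wellhead pressure (inflow performance minus a tubing pressure drop standing in for the VLP) matches the measured one. All coefficients and the simple hydraulic forms below are assumptions for illustration, not the tool's calibrated models:

```python
from scipy.optimize import brentq

# Illustrative hydraulics: a linear inflow-performance relation (IPR) and a
# friction/hydrostatic tubing pressure drop, both functions of rate q.
P_RES, PI = 250.0, 2.0                    # reservoir pressure [bar], PI [(sm3/d)/bar]

def p_wellhead_model(q):
    """Modelled wellhead pressure at surface rate q (monotone decreasing)."""
    p_bhp = P_RES - q / PI                # bottomhole pressure from the IPR
    dp_tubing = 40.0 + 0.002 * q ** 1.8   # stand-in VLP pressure drop
    return p_bhp - dp_tubing

def virtual_rate(p_wh_measured):
    """Virtual-meter rate estimate: the rate at which modelled and measured
    wellhead pressures coincide, found by bracketed root finding."""
    return brentq(lambda q: p_wellhead_model(q) - p_wh_measured, 1.0, 500.0)
```

The real tool solves VLP and choke equations jointly and recalibrates at separator-test dates, but the structure (sensor reading in, rate out of a physics-based residual) is the same.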
Ismail, Ahmed (ADNOC ONSHORE) | Hazem, Yasser (ADNOC ONSHORE) | Al Obaidi, Mazna Naji (ADNOC ONSHORE) | Bogachev, Kirill (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics)
This paper proposes an assisted history matching (AHM) and uncertainty analysis workflow that was applied to facilitate the history matching of a giant carbonate reservoir in the Middle East. The objective was to identify and quantify reservoir uncertainties and assess their impact on the field performance. In addition, a sufficient number of realizations was created so that combinations of all uncertainties would capture their combined effect.
A real field case is represented by a consistent workflow that iteratively updates the ranges and number of reservoir uncertainties constrained by the actual measurements. The process has the following steps: definition of global uncertainty, sensitivity analysis, exclusion of less influential parameters, experimental design, revision of the uncertainty matrix, and optimization-algorithm runs. The approach was first implemented at a global level and then continued to a regional level. The primary objective function consists of oil and water production mismatches, and the plan is to upgrade the objective function to include more parameters for further HM enhancement of the model.
Initially, the workflow was based on five uncertainty parameters. Ten sensitivity analysis cases were performed, and tornado chart analysis suggested excluding the parameters with less impact on match quality, hence on the objective function. Next, experimental design using Latin Hypercube sampling was performed, which reveals the combined effect of the uncertainty parameters. During several experimental design iterations, the uncertainty parameter matrix was revised and the total number of uncertainty parameters was increased from 5 to 17. Finally, a total of 260 experimental cases were completed; however, no good history match case was obtained. Therefore, a transition from the global level to a regional level was performed. The most sensitive uncertainties identified at the global level were absolute permeability, vertical permeability anisotropy, pore volume and fault transmissibility. At the regional level, additional permeability multipliers for well regions were added to the uncertainty matrix. After that, good-quality matched cases were obtained.
Field scale and complexity were the main drivers for implementing the AHM workflow. In a giant carbonate reservoir with a long history and complex geology, a classical history matching method with a unique solution cannot assure accurate model predictability. The key advantages of this approach were the facilitation of the HM process and the reduction of total calculation time.
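The Latin Hypercube design used in the experimental-design step can be sketched as follows: each parameter range is split into as many equal strata as there are cases, and every stratum is sampled exactly once, so even a small design spans each range evenly. The parameter names and ranges below are illustrative placeholders, not the study's actual uncertainty matrix:

```python
import numpy as np
from scipy.stats import qmc

def lhs_uncertainty_matrix(bounds, n_cases, seed=0):
    """Latin Hypercube design over uncertainty parameters.
    `bounds` maps parameter name -> (low, high); returns the parameter
    names and an (n_cases x n_params) design matrix in physical units."""
    names = list(bounds)
    sampler = qmc.LatinHypercube(d=len(names), seed=seed)
    unit = sampler.random(n=n_cases)                 # samples in [0, 1)^d
    lows = np.array([bounds[k][0] for k in names])
    highs = np.array([bounds[k][1] for k in names])
    return names, qmc.scale(unit, lows, highs)       # rescale to ranges

names, design = lhs_uncertainty_matrix(
    {"perm_mult": (0.5, 2.0), "kv_kh": (0.01, 0.3),
     "pore_volume_mult": (0.8, 1.2), "fault_transmissibility": (0.0, 1.0)},
    n_cases=10)
```

Each row of the design matrix defines one simulation case, which is how a few hundred runs can probe the combined effect of many parameters at once.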